Zap Project (Murgatroyd the Talking Moose)

The “Zap” Project                          

A “secret keeper” for kids age 5 and up

Concept:

A “simpler” interactive talking toy: one that doesn’t just spit back whatever it hears, try to teach a five-year-old how to count, strongly hint that “robots are always war machines”, or record whatever it hears for storage at some corporation’s head office.  The stuffed toy will be a friend and confidant to its companion child (of whatever age).  It will listen to and “understand” human speech, but it is incapable of making enough of the sounds human speech consists of and has developed its own “language”, called “Zap”.  The child (or older person) will have to put in the effort to learn to understand “Zap”.  What this gives a child is a toy/companion/friend he or she can tell anything, one that will “keep his or her secrets” because nobody else in the house understands what the toy is “saying”.

This is intended to be a child’s “best friend”: one who listens to what it’s told and doesn’t tell “Mom and Dad” about everything.  I believe this will stimulate “true” imagination (that which is triggered by nothing but itself) in the user by requiring a willing effort to imagine that “Zap” is actually speech and that there is an actual response from the toy.  A preliminary sketch of the concept is below.  The sketch is of a teddy bear, but the toy could be any stuffed animal of the necessary size.

preliminary-sketch-v-1

Project Log

Day 1:

General contemplation of what I want the project to do and what the difference is between this and other interactive toys.

Most of the “interactive” toys  I’ve seen on the market have several drawbacks in my opinion:

      1. Pre-recorded and predictable human phrases triggered by a squeeze or similar
      2. Pre-recorded “babble” triggered as above
      3. Random and pointless movement (waddle or roll or other)
      4. Overly “educational” (teach my toddler to count, etc.) for my target age group
      5. Some of them actually record ambient audio (i.e. your conversations, etc.) and store it at the manufacturer’s head office (privacy issues abound here; see the excerpt from the New York Times article below):

         “SAN FRANCISCO — My Friend Cayla, a doll with nearly waist-length golden hair that talks and responds to children’s questions, was designed to bring delight to households. But there’s something else that Cayla might bring into homes as well: hackers and identity thieves. Earlier this year, Germany’s Federal Network Agency, the country’s regulatory office, labeled Cayla “an illegal espionage apparatus” and recommended that parents destroy it. Retailers there were told they could sell the doll only if they disconnected its ability to connect to the internet, the feature that also allows in hackers. And the Norwegian Consumer Council called Cayla a “failed toy.”

         The doll is not alone. As the holiday shopping season enters its frantic last days, many manufacturers are promoting “connected” toys to keep children engaged. There’s also a smart watch for kids, a droid from the recent “Star Wars” movies and a furry little Furby. These gadgets can all connect with the internet to interact — a Cayla doll can whisper to children in several languages that she’s great at keeping secrets, while a plush Furby Connect doll can smile back and laugh when tickled.”

      6. They’re some variation of the “remote control robot” (usually some type of “armed combat units”)
      7. Tremendously expensive for the actual product offerings

Day 2:

Online research for the parts that may be required outside of the class kit (preliminary list below): 

      • Feather Wing with MP3 (or other format) storage (internal or on a micro-SD card) and playback capability
            • Found the Adafruit Music Maker FeatherWing.
                  • There is, however, no recording capability (detectable from the product description).  I may have to pre-record some speech, pre-“scramble” it into “Zap” and work with a variable timer (1 to 3 seconds at random).
            • Found the Electret Mic Amp – MAX9814.
                  • I’m going to use the microphone as a trigger for a response rather than a pickup for ambient voices, so there will be no privacy concerns; it will only trigger playback.  (A rough sketch of this trigger-and-shuffle idea follows this list.)
            • Found the LiPo 2000mAh #2011 battery.
                  • Should be enough to power the toy as there will be no continuous drain.
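
While the parts are in transit, here is a rough sketch of the intended trigger-and-shuffle behaviour. It assumes the pin assignments from Adafruit’s Feather + Music Maker example code, the MAX9814 output wired to A0, and thirty-seven pre-scrambled “Zap” tracks on the SD card; the mic threshold and delay range are guesses to be tuned on the real toy.

#include <SPI.h>
#include <SD.h>
#include <Adafruit_VS1053.h>

// Pin assignments as in Adafruit's Feather + Music Maker example (assumed)
#define VS1053_RESET  -1   // not used on the FeatherWing
#define VS1053_CS      6   // VS1053 chip select
#define VS1053_DCS    10   // VS1053 data select
#define VS1053_DREQ    9   // VS1053 data request interrupt
#define CARDCS         5   // SD card chip select
#define MIC_PIN       A0   // MAX9814 output
#define MIC_THRESHOLD 600  // guess; tune against ambient noise in the room

Adafruit_VS1053_FilePlayer player(VS1053_RESET, VS1053_CS, VS1053_DCS, VS1053_DREQ, CARDCS);

void setup() {
  player.begin();
  SD.begin(CARDCS);
  player.setVolume(20, 20);                        // lower numbers are louder
  player.useInterrupt(VS1053_FILEPLAYER_PIN_INT);  // play in the background
  randomSeed(analogRead(A1));                      // seed the shuffle from a floating pin
}

void loop() {
  // A loud enough sound "asks a question"; Murgatroyd waits 1-3 seconds,
  // then answers with a randomly chosen pre-recorded Zap syllable track.
  if (analogRead(MIC_PIN) > MIC_THRESHOLD && player.stopped()) {
    delay(random(1000, 3001));
    char name[16];
    snprintf(name, sizeof(name), "/track%03d.mp3", (int)random(1, 38));  // TRACK001-TRACK037
    player.startPlayingFile(name);
  }
}

Since the mic level is only compared against a threshold and never stored, the privacy point above still holds.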

Feather Wing, microphone and lithium ion battery on order from Adafruit (DHL delivery … hopefully fast).

Preliminary Fritzing Sketch:

screen-shot-2017-12-30-at-10-57-22-am

Preliminary Schematic Diagram:

screen-shot-2017-12-30-at-11-07-07-am

Day Three:

Found a toy at Value Village (see photo below). 

murgatroyd

This will become “Murgatroyd the Talking Moose”.  It’s suitable due to the zippered opening in the back (making parts insertion/repair simpler for the prototype).  Created a user survey form for testing period (link below). 

https://docs.google.com/forms/d/e/1FAIpQLSdo15jfwOgUsIRWE0q9PYcejVhlgml8NzPm89iNOOaNvSGrgQ/viewform

Day Four:

Parts have arrived (yay).  Unfortunately, I can’t get downtown to solder them today … will go tomorrow.  Spent over an hour making and recording silly noises in separate tracks to load onto the SD card.  This was simply taking a consonant and adding a vowel (Ba pronounced as “Bah” for example) or just a vowel alone.  I intend to simply use the “shuffle” feature to make Murgatroyd “speak” in “Zap”.

Day Five:

All soldered up and ready to go (hopefully).  Note from the photo of the circuit assembly below that I’ve deliberately installed the header pins so that the “working surfaces” of the Feather and FeatherWing face each other rather than being exposed.  This should (hopefully) help prevent any jerked or pulled contacts and impact damage inside the toy.  Music Maker library downloaded and installed in the Arduino IDE.  Files transferred to the SD card as TRACK001 – TRACK037.

circuit_assembly

moose_with_circuits

Day Six:

Adafruit’s basic test sketches for the Music Maker and microphone check (link below):

https://github.com/DFstop/Zap/tree/master

The code for the Music Maker verifies and uploads but I’m getting the message “SD failed, or not present”.  Tried re-seating the SD card with the same result.  I hope I don’t have what the Adafruit site defines as a “non brand knock-off”.  Visually checked all solder points and connection points on the SD card … everything seems OK but it will not recognize the card.  I have tried two brands of 32 GB micro-SD card (Kingston and Nextech).  Adafruit’s test sketch for the microphone seems to recognize the input (I’m getting a result in the serial monitor, anyway).
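
Before blaming the FeatherWing itself, it might help to isolate the card with a bare-bones sketch that uses only the SD library (no VS1053 involved). Note also that the stock SD library only handles FAT16/FAT32, so it’s worth confirming the 32 GB cards are formatted FAT32 rather than exFAT. This is just a sketch of that check, assuming the card-select pin is 5 as in Adafruit’s example.

#include <SPI.h>
#include <SD.h>

#define CARDCS 5  // SD chip select on the Music Maker FeatherWing (assumed)

void setup() {
  Serial.begin(115200);
  while (!Serial) { }   // wait for the serial monitor on native-USB Feathers

  if (!SD.begin(CARDCS)) {
    Serial.println("SD init failed - check formatting (FAT32), seating and solder joints");
    return;
  }
  Serial.println("SD init OK, listing root directory:");

  File root = SD.open("/");
  for (File entry = root.openNextFile(); entry; entry = root.openNextFile()) {
    Serial.print(entry.name());
    Serial.print("  ");
    Serial.println(entry.size());
    entry.close();
  }
}

void loop() { }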

Day Seven:

I hate this …  I absolutely hate it.  Arduino’s test code for the microphone gives me a result I can use as input for the trigger and the Music Maker feather wing test code verifies and uploads but will not find the SD card (error copied below from serial monitor output):

Adafruit VS1053 Feather Test

VS1053 found

SD failed, or not present

I’ve tried everything I can think of.  Without access to the sound files I can’t write and test the code for the full project.  The failure could be in the coding, the SD card, the wiring or something else.  I’ve simulated the final assembly (all the pieces fit in the toy) but it doesn’t work (photo below).

circuits_in_moose

Day Eight and Later (December 20 – 26):

Ran the user survey as best I could.  

https://docs.google.com/forms/d/e/1FAIpQLSdo15jfwOgUsIRWE0q9PYcejVhlgml8NzPm89iNOOaNvSGrgQ/viewanalytics

Eight survey responses (neighbours and relatives) about the toy (solely as a concept, since I can’t make the thing work).

Final Fritzing Sketch

screen-shot-2017-12-30-at-10-26-06-am

Speculative Repair Kit List (If it actually worked):

  1. Micro-screwdriver set
  2. Black electrician’s tape
  3. Needle & thread (for possible repairs to Murgatroyd)
  4. Soldering kit (for possible repairs to the circuitry)

B.O.M. for the project:

screen-shot-2017-12-30-at-11-21-20-am

Air DJ

Experiment #5: Air DJ

Emilia Mason,  Roxanne Henry, & Kristy Boyce

Project Description

Emilia, Kristy and Roxanne were inspired to create a fun and innovative way to make music and play with your friends. The Air DJ is a gesture-based musical device which allows you to load up your favourite sound effects and assign them to gestures that are intuitive to the user. It runs on the Adafruit Feather M0, combined with the Music Maker Wing. A SparkFun ZX gesture sensor serves as the input. Then, you take your device with you and jam wherever and whenever. Up to six sound effects can be loaded per device.


We want to bring fun music to everyone, everywhere.

screen-shot-2017-12-13-at-7-31-14-pm
PROMOTIONAL VIDEO: https://www.youtube.com/watch?v=WxZOndgZ8vM&t=6s

 

Code: https://github.com/rh11lp/rh11lp.github.io/blob/master/experiment5/experiment5.ino

Fritzing Diagram: experiment5_bb

 

 

_dsc0496

Parts used:

3 SparkFun ZX Distance and Gesture Sensor

3 Adafruit Feather M0

3 Adafruit Music Maker FeatherWing

3 Lithium-Ion Polymer (LiPo) Battery (3.7V 1000mAh)

3 Micro SD Cards

4 Sets of Speakers

Beforehand: Preparations and Testing

What needs to be done ahead of time?

To make this experiment possible it was necessary to get the SparkFun ZX Distance and Gesture Sensors on time, so we booked one full day to drive to Markham and buy all the necessary pieces.

During our conversations, we decided to 3D print all three enclosures; for this we would need to go to 100 McCaul ahead of time to print them, which meant having the design ready early.

To test the Air DJ we booked Thursday to work together before presenting the experiment on Friday. We also decided to meet during the weekend for our second testing session and to film a promotional video.

 

Do you need extra batteries?

No, we won’t need extra batteries. We will recharge the 3 Lithium-Ion batteries overnight after each time we test the devices.

What goes into your repair kit?

-Electrical tape

-Hot glue

-Roxanne 🙂

The assembly of our experiment was not complicated at all. The recurrent issue was having to reset the Music Maker FeatherWing, which was necessary every so many swipes (about every 20). Having to reset the FeatherWing meant we needed easy access to it, and for this reason we decided to add velcro to it.

 

The Plan

 

How will you document your own user experiences while using the device? Notes? Journaling? Photos? Video? Audio recording?

 

We will be using video and photos of ourselves testing the Air DJ. We also want other people to interact with this experiment, and we will be interviewing them as well.

 

What will you do to ensure that your data collection methods are consistent for each group member?

Data collection method: Working together to make sure we collect consistent data.

 

For each group member, what are the dates & times of testing?

Roxanne: One 4 hour session on Thursday

   One 4 hour session on Sunday  

 

Emilia: One 4 hour session on Thursday

           One 4 hour session on Sunday

 

Kristy: One 4 hour session on Thursday

          One 4 hour session on Sunday

 

The Air DJ is an interactive device that works better when played with others and with a backing track.

For this reason we have chosen to do two 4 hour sessions in which the three of us would be present, using and testing all three devices at the same time.

 

Will you be doing your day-to-day activities or doing a special activity?

 

We will be meeting in the Digital Futures Studio together. The idea is to have all three devices working at the same time and playing a backbeat for us to practice our music skills and test Air Dj.

 

Process Journal

Deciding what to do:

For experiment 5 we had so many ideas that it took us a few days to fully decide what our experiment was going to be.

Some of our ideas:

-King of the hill game

-Light up shoes

-Personal/wearable drum set = EL wire

-Wearable Bongo butt, you literally tap that ass and make beautiful music

-AR glove – force feedback, detect collisions w virtual objects -magnetorheological

-Haptic underpants (what?)

-Conductive pen greeting card /copper tape

-Laser tag phasers – IR sensors

-Virtual dj set, air dj

Emilia mentioned she wanted to make a wearable or something like Behnaz Farahi

http://behnazfarahi.com/breathing-wall-ii/


When we finally decided to develop the Air DJ, we made a list of the materials we would need:

2 more feather wings (Kristy already had 1 from the Mansplaining project)

3 micro sd cards

Buy 1 battery (Emilia already had 2 from the Love Corner project)

3 Proximity sensors / Gesture sensors

Since we were going to need more Music Maker FeatherWings, Kristy immediately ordered two more from the supplier she got hers from. That was very convenient!

Kristy mentioned it was very easy to use and sent us this tutorial:


https://www.adafruit.com/product/3357#tutorials

screen-shot-2017-12-13-at-7-38-34-pm

 

24259582_10159797633605096_338549185_o

 

 

Our package arrived!

Even though we had the Music Maker FeatherWings, we were still deciding what sensor to use. At first we thought the proximity sensors we had in our CandC boxes were OK, but then we decided to talk to Afaq and ask which sensor he would recommend for our idea.

zx_product_cropped

The SparkFun ZX Distance and Gesture Sensor is the winner! Afaq recommended other sensors as well but because of price and availability, we chose the SparkFun ZX.

The only “issue” we had is that they were only available at Canada Robotix, and because of time constraints we knew ordering online was not going to be our best option. We booked Saturday, December 2nd for a road trip to Markham!!

During Friday’s class Roxanne and Emilia had the chance to speak with Nick regarding the Air DJ idea. Nick recommended playing with the size of our device: instead of making it small, make it a little bit bigger, which would make the user’s experience more interesting because of the larger movements. We totally agreed!

That Friday (Dec 1st) we started to look for different enclosure ideas:

https://www.thingiverse.com/thing:571270/#files

 

 

https://www.amazon.ca/Fisher-Price-Classics-Record-Player/dp/B003CGVCXS/ref=sr_1_sc_1?ie=UTF8&qid=1512159502&sr=8-1-spell&keywords=Fisher%20Price%20Classics%20Record%20Playe

 

 

Early Sketches

 

 

After researching possible enclosure ideas we decided our enclosures should be 3D printed turntables. We really liked the idea of 3D printing something that looked like a record for our Air Dj device.

 

This project was full of adventures! On Saturday Kristy and Emilia went to Markham to get the 3 ZX gesture sensors and 3 micro SD cards.

img_0375 25400656_10159671932215057_195626453_o

 

We also tried to buy these speakers but they were not available anymore and the project was becoming waaaaaaay too expensive. We decided to use regular speakers for the Air Dj.

24282066_10159625411155057_1282000775_n

 

 

Saturday night we all met in school and Roxanne started working on the code and explaining how it works.

 

From the very beginning, we had problems with the pieces. One of the Music Maker Feather Wings was broken and we had to use tape for the micro SD to stay in place.

https://learn.sparkfun.com/tutorials/zx-distance-and-gesture-sensor-smd-hookup-guide?_ga=2.251769418.219913925.1512157062-1708196433.1505250634

 

https://vimeo.com/247238710

 

Programming and Assembly:

Programming for the SparkFun ZX gesture sensor was surprisingly easy. SparkFun was kind enough to provide a comprehensive and well-rounded library which contained useful public methods such as readGesture() and gestureSpeed(). It was fairly trivial to create a switch/case statement that handled each of the available gestures as it was executed (sort of like an if statement, but cleaner for comparing different possibilities of the same variable). Then, it was only a matter of determining the speed of the gesture to branch off into different sound effect options.
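
A rough sketch of that structure is below; it assumes the SparkFun ZX_Sensor library’s readGesture()/readGestureSpeed() calls and the same Adafruit VS1053 setup as the Music Maker tutorial. The pin numbers, speed threshold and track names are placeholders, not our final mapping.

#include <Wire.h>
#include <SPI.h>
#include <SD.h>
#include <ZX_Sensor.h>
#include <Adafruit_VS1053.h>

const uint8_t ZX_ADDR = 0x10;                        // default I2C address of the ZX sensor
ZX_Sensor zx(ZX_ADDR);
Adafruit_VS1053_FilePlayer player(-1, 6, 10, 9, 5);  // reset, CS, DCS, DREQ, card CS (assumed Feather pins)

void setup() {
  zx.init();
  player.begin();
  SD.begin(5);
  player.setVolume(20, 20);
  player.useInterrupt(VS1053_FILEPLAYER_PIN_INT);    // keep playing while we keep sensing
}

void play(const char *file) {
  if (!player.stopped()) player.stopPlaying();       // cut off the previous clip
  player.startPlayingFile(file);                     // returns immediately (vs. playFullFile)
}

void loop() {
  if (!zx.gestureAvailable()) return;

  GestureType gesture = zx.readGesture();
  uint8_t speed = zx.readGestureSpeed();             // slower/faster swipes pick different clips

  switch (gesture) {                                 // one case per gesture, cleaner than chained ifs
    case RIGHT_SWIPE:
      play(speed < 15 ? "/scratch1.mp3" : "/scratch2.mp3");
      break;
    case LEFT_SWIPE:
      play(speed < 15 ? "/horn1.mp3" : "/horn2.mp3");
      break;
    case UP_SWIPE:
      play(speed < 15 ? "/drop1.mp3" : "/drop2.mp3");
      break;
    default:
      break;
  }
}

Using startPlayingFile() here is what lets the loop keep reading gestures while a clip plays; playFullFile() would block until the clip ends, as discussed below.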

Following this, the music wing needed to be configured to play the sound clips. It was discovered by Sana, Kylie and Ramona that the music wing only accepts headphones with a TRS jack, which means that any speaker cable that also carries a microphone will not work. Knowing this saved us a lot of time on troubleshooting, since we all regularly use iPhone headphones, which are TRRS.

One of the first problems we encountered after that was that WAV files sounded really distorted. We didn’t have time to test whether file size impacted this, so we decided to simply use MP3 for everything. At this point, we realized that smaller file sizes were optimal, and tried to keep all of our files <= 50 KB. The next challenge revolved around the feel of the sound bites we wanted to use. Since the Air DJ is gesture based, which means no haptic feedback, we needed to make sure the sounds associated with each gesture were intuitive and felt good. Once we accomplished this, it was only a matter of implementing the branches based on speed for each gesture.

We had to decide whether or not we wanted to play the full file or allow the feather to continue to operate while the sound bite played. In the end, the sound files were so short and the sensor reactive enough, that we settled on using startPlayingFile() instead of playFullFile(). This allowed the user to play more quickly, but also put the wing at risk of overloading and crashing. Allowing the full file to play wasn’t significantly less at risk of crashing the wing, so the tradeoff was worth it in the end for increasing the experience quality.

Speaking of crashing, boy do those music wings crash a lot! This made troubleshooting especially challenging since the life of the wing before crashing was limited to 5-10 gestures, depending on the sound bites it was trying to load. Some sound bites reacted better than others. In one case, a tiny 40kb mp3 file was crashing the wing almost constantly. We shortened it and compressed it a bit and it was suddenly much more responsive. We have yet to figure out if there was something corrupted about the file before or if the size alone caused the problem (though we had many sound files which were around 50kb, so it’s not likely the size alone which caused the problem). Our theory is that the music wing wasn’t meant to rapidly load and unload as much as we demanded of it.

The problem was always the same: the gesture would register, the sound bite was beginning to load from the SD card, and then the wing would stop working entirely. The loop function would stop executing too, which made it clear that the entire feather crashed. A manual reset of the wing would then be necessary, where the sound bite that was loading off memory would finally play, and then the wing would be good to go for another 5 to 10 gestures. Once every 8 or so resets, though, the feather would need to be powered off and then powered back on. I suspect that at these moments, there is simply too much corruption for it to recover after a reset.

These were some of our worst problems concerning the hardware. A simple, though no less devastating problem was that one of our music wings arrived with a broken SD slot. The pin that holds the SD in place was absent, so the card would have to be taped into its chamber in order for it to read. This was unreliable and frustrating to work with since it seemed that even jostling the card could disconnect it, and the feather required a reset when this happened.

 

The Enclosure and Branding

After figuring out how to make the sensor and the Music Maker FeatherWing work, we decided to make the coolest enclosure ever made!

From the very beginning we knew we wanted to 3D print the enclosures, but the 3D printers at 205 Richmond were broken at the time, so Kristy decided to go to 100 McCaul to get them printed there.

The enclosures ended up taking from Monday to Thursday to print, due to a failed print on Monday in our lab and the size of the print job. In total we made 3 versions of the enclosures: the 3D printed ones (in grey below, later painted black), a wooden backup, and then, after testing, the melted record top Emilia can be seen working on below.

One of the problems with the casing (more on this later) is that we didn’t consider that a Feather mounted on a square protoboard wouldn’t be able to get close to a circular wall structure. This meant that plugging in headphones wasn’t as easy to do as we envisioned. We also didn’t think of putting in a hole for the charging cable.

New enclosure with stand
Old enclosure with stand

Otherwise, the physical design of the product looked and felt really great. The rippled look of the vinyls gave a kind of unique affordance to the device which we might not have been able to convey otherwise. A small adjustment we should have considered sooner in the production, however, was the placement of the feather.

Since we had to open the case so often to reset the feather, and putting the lid back on to align with the sensor wasn’t a trivial task, we should have placed the entire device itself on the lid so we could adhere the sensor to the lid. With the feather and sensor adhered to the bottom of the body, it was easy to jostle the solid core cables a bit too much, and several of the connections ended up snapping at certain points during the testing.

 

img_4719 img_3744 25035310_10159642687810057_910550177_o 24879404_10159826765625096_554412854_o

Wood backup prototype

img_0388 img_7373

 

 

Logo Designs:

24550312_10159817598595096_608801761_n 24337556_10159817599215096_1738569749_n 24331357_10159817599205096_1860321117_n

 

 

Survey and User Testing

Survey Link: https://www.surveymonkey.com/r/QKB5MWB

screen-shot-2017-12-13-at-9-05-27-pm screen-shot-2017-12-13-at-9-04-48-pm screen-shot-2017-12-13-at-9-04-31-pm screen-shot-2017-12-13-at-9-04-05-pm screen-shot-2017-12-13-at-9-03-47-pm

After Testing Session:

We tested the Air DJ amongst ourselves and students at OCAD. In general, everyone was very excited to interact with the device and seemed to enjoy using it. There is a steep learning curve in terms of actually making anything that resembles music. We found that it felt more or less intuitive and organic to use depending on the beat of the backing track that was playing.

The most frustrating part (touched on earlier) was that the microprocessor in the FeatherWing would stop functioning within approximately 20 swipes. This affected its portability and caused us to change the design of the top of the enclosure (mentioned earlier), as the 3D printed enclosure was incredibly tight. We thought that would be fine if you were just opening it occasionally to charge the battery, but it was not feasible if it needed to be opened after a minute or two of use.

Again, to resolve this, we switched to the melted record top option, which we then secured with velcro.

Kristy preferred the older design, while literally everyone else liked the look of the new melted record top. Regardless of design, functionality needed to prevail. Most people (ourselves included) really enjoyed interacting with the Air DJ; we just wish it would last longer each session. We were also testing using large external speakers, though we had originally envisioned the Air DJ as an all-in-one enclosed product. This was a strictly financial choice, as the project had become quite expensive. Our ideal iteration (especially after testing) would have an enclosed speaker along with the battery, with a small port added on the side to plug the battery in for charging.

 

Final Product and Photos From Class Show:

_dsc0479 img_1450

Project Context and References:

We wanted this product to be fun and portable, so we researched other musical devices, things DJs are currently doing and toys.

After already deciding on the Air DJ, we came across this great device that creates a MIDI interface out of a pizza box:

https://learn.adafruit.com/circuit-playground-pizza-box-dj-controller/overview

Other Tutorials:

https://learn.sparkfun.com/tutorials/zx-distance-and-gesture-sensor-smd-hookup-guide?_ga=2.251769418.219913925.1512157062-1708196433.1505250634

https://vimeo.com/247238710

https://www.adafruit.com/product/3357#tutorials

Keith McMillen is also an interesting musician that’s creating different sounds and instruments:

https://www.keithmcmillen.com/

A look at what’s currently on the market:

https://store.djtechtools.com/

Productogyser

capture

Dikla Sinai, Finlay Braithwaite, Karo Castro-Wunsch
1

Productogyser is a chaos management system for shared workspaces. It polls users’ desire to focus and creates a generative artistic centrepiece that indicates the office’s combined desire for focus. The device is a small control box with a clean and minimal design. It lets the user turn an encoder to express their state of mind. A red light indicates to your coworkers that you need to focus and would prefer not to be disturbed. A green light indicates that you are willing to socialize.

The Productogyser is a custom-made product, specially designed to answer the needs of Victory Social Club, an open-space shared workplace. Victory Social Club is a multi-disciplinary production and design collective based in Toronto, Canada. The people who are part of this group come from different backgrounds: filmmakers, directors, designers, sound editors, picture editors, animators, and educators. They all share one open space, which makes it challenging to maximize productivity in such a busy environment.

dsc_0728

2

Design Kit

List of electronic components – Here

3

4

Process

Since our product is custom made for a specific client, it was necessary for us to listen to our clients and get their feedback in order to come up with the best solution for them. We thought that this would give us the best insights about the functionality, group dynamic, etc. During the whole process of designing and testing, we worked together with our clients to make sure we were synced regarding the expected outcome. It was fascinating to learn about the process of creating a real product to serve other users. For some of us, it was the first interaction with such a process, and it was highly educational.

Day 1 – Interview

We got together with the people from the studio to hear about their experience of working in an open space. We pitched our idea and asked them some questions to understand whether there is a real need for our product, and to get their ideas and comments about it. After the interview, we sat together to define our product based on their feedback.

We came up with a list of requirements:

  • The product should allow each user to indicate his/her desire to focus or to socialize.
  • The individual should be able to express their state of mind, in a comfortable and not embarrassing way.
  • The sign for the individual’s state of mind should be visible to others.
  • There should be a shared area where everyone can be aware of the overall state of mind.
  • Since they have many guests and clients in the studio, they don’t want the codes for ‘please be quiet’ and ‘let’s socialize’ to be obvious to someone from the outside; they don’t want to make guests feel uncomfortable. The codes should be more abstract than a traffic light.
  • The design of the product should be minimalistic and suit the overall studio design.

Some other things people mentioned in the interview:

  • If they need quiet, they just put on their earphones.
  • They think that most of the people are just not aware of the fact they are causing so much noise.

Full interview link: Victory Social Club – Charette

6

7

Day 2 – Worked on our first version of product

Code Ideas

The individual stations hold and broadcast a value for each individual’s desire for focus. The value can be positive or negative, representing more or less desire for focus. The value for ‘focus desire’ is dialled in using a rotary encoder. The local focus desire is reflected in the red/green LEDs integrated in the rotary encoder, red signifying a desire for focus (negative value) and green signifying an openness to distraction. This gives the user live feedback that their input has been accepted by the system. It also allows visitors at close proximity to observe and respect the user’s desire for focus.

 

Either state, positive or negative, depreciates over time, eventually landing and resting at 0. This is a ‘dead man’s switch’: it gradually negates the user’s output, so that if they leave their desk their influence over the global office state diminishes.

 

void Depreciation() {
  if (millis() - DepreciationMarker >= DepreciationRate) {
    if (dialValue < 0) { ++dialValue; }
    if (dialValue > 0) { --dialValue; }
    DepreciationMarker = millis();
  }
}

 

Each device publishes its local state to feed the central generative display, each on its own PubNub channel. The data published is the desire for focus, ranging from -255 to 255.

Channel1: {“focusDesire”: -111}
Channel2: {“focusDesire”: +200}
Channel3: {“focusDesire”: -5}
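
Purely as an illustration (the real publishing code is in the Arduino sketch linked below), the per-station payload can be built with a plain snprintf before it is handed to the PubNub client; the helper name and buffer size here are ours, and the clamp matches the -255 to 255 range above.

// Illustrative only: how a station's payload for its PubNub channel can be built.
char message[32];

void buildFocusMessage(int dialValue) {
  int focusDesire = constrain(dialValue, -255, 255);   // keep within the agreed -255..255 range
  snprintf(message, sizeof(message), "{\"focusDesire\": %d}", focusDesire);
}

void setup() {
  Serial.begin(115200);
  buildFocusMessage(-111);
  Serial.println(message);   // prints {"focusDesire": -111}, ready to hand to the PubNub client
}

void loop() { }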

 

The generative display aggregates the data from all user stations and uses it to build its display. It is a JavaScript site that translates the incoming data into an engaging yet informative peripheral notification. A low-saturation red or green background informs the space of the overall desire for focus. The shape of the generative display is circular for each user until they dial in a desire for focus (negative value), at which point the shapes become more angular, representing the desire for focus. There can be multiple shapes, both circular and angular, that represent the mood of individuals in the studio. However, there can only be one background colour, representing the cumulative mood of the entire studio.

This cumulative mood is returned via PubNub to Channel0. Each local station translates this global value to a NeoPixel LED, turning red for negative values and green for positive. In this way, each user can know the state of the studio regardless of their ability to see the generative display.
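
A minimal sketch of that local indicator, assuming a single NeoPixel on pin 6 driven with the Adafruit_NeoPixel library; the pin and the brightness scaling are placeholder choices, and the demo values in loop() stand in for the real Channel0 subscription.

#include <Adafruit_NeoPixel.h>

#define PIXEL_PIN   6   // assumed data pin
#define PIXEL_COUNT 1

Adafruit_NeoPixel pixel(PIXEL_COUNT, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  pixel.begin();
  pixel.show();   // start with the LED off
}

// Map the studio-wide focusDesire (-255..255) from Channel0 onto the local NeoPixel:
// red for a negative (focus) value, green for a positive (social) value.
void showGlobalMood(int focusDesire) {
  uint8_t level = map(abs(focusDesire), 0, 255, 0, 150);  // cap brightness for a softer glow
  if (focusDesire < 0) {
    pixel.setPixelColor(0, pixel.Color(level, 0, 0));     // red: the studio wants to focus
  } else {
    pixel.setPixelColor(0, pixel.Color(0, level, 0));     // green: the studio is open to socializing
  }
  pixel.show();
}

void loop() {
  showGlobalMood(-120);   // demo values; the real value arrives over PubNub on Channel0
  delay(1000);
  showGlobalMood(200);
  delay(1000);
}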

8

9

User Station Code – Arduino

https://github.com/KaroAntonio/productogyser/blob/master/arduino/Productogyser.ino

Generative Display Code – Javascript

https://github.com/KaroAntonio/productogyser/blob/master/script.js

Diagram

10

Day 3 – Design, testing plan and code

  • We created a list of questions for the user testing process –  Here. The survey contains questions to gather feedback about our product. With our survey, we aim to understand whether the product works as expected and how intuitive the user experience is, to understand the need for each feature, get feedback about the product design, measure the overall satisfaction level, and understand what needs to be improved.
  • We started discussing ideas for the physical design, keeping in mind our clients’ demands. We did some research online for fancy plastic boxes and we found some ideal soap containers at MUJI. We thought that since they are semi-transparent, they could be perfect for our needs: the NeoPixel lights are too bright and this will help dim their output a bit. We also tried to optimize the configuration of all parts on a breadboard to find out the minimum size required for the box.
  • We went shopping for boxes at MUJI.

11

The making of

We planned to create one fully functioning prototype, test it, and multiply the successful prototype two more times.

  • Each soap container includes two pieces of foam to protect it from scratches. We decided to use the foam as part of our design for two reasons:
    1. Design element – it can diffuse the light and make it more interesting as it passes through the foam texture.
    2. Functionality – we can use it as a substitute for our breadboard, which helps stabilize the parts and hold them inside the box.
  • We drilled two holes for the USB cable and the encoder.
  • We added encoder covers (same colour as the box) to make the encoders easier to see and turn.
  • Since the wiring required three GND connections, we had to join four jumper wires – three going to the GND pins on the encoder and NeoPixel and one going into the GND pin on the Feather.

12

Day 4 – testing the code

We were trying to test the code with the prototypes before the user testing, but we had to deal with some code issues first:

  • Communication with PubNub – we got ‘client error’ messages in two of our three Feathers.
  • We managed to get green light output but not red light output on the encoder LED.
  • Finlay was out of town 🙁 which made the process and communication a bit harder, *BUT* we managed to figure out some of the issues over a group chat.

13

Day 5 – debugging

We needed to solve two main issues:

  • Showing red light for any values below zero.
  • Making the NeoPixel respond to our central art piece and show the same output – the average value across all the devices.

We did some testing and figured out that two of the three devices were just not responding. This gave us an indication that there might be a problem with the Feathers. We started by checking the wires to make sure everything was wired correctly. Once we verified that the wiring was OK, we were quite clueless about what the problem might be.

We then decided to test on another WiFi network to see if this might be the problem, and we found that everything worked fine. Apparently, two of our Feathers were not registered on the OCADU WiFi network. Our reaction to this discovery was a fine blend of elation, exaltation, frustration, and indignation.

14

Day 6- User testing

We set up 3 working stations at the studio and asked people to plug them into their computers for 3-5 hours and, at the end of each session, answer our online survey.

15

Conclusions

We had 7 people test our product and share their feedback with us.

things-we-learned-about

References

  • Music by Mild High Club, used with permission
  • victorysocialclub.com

Pocket Oracle

Pocket Oracle

Roxanne Baril-Bédard & Sean Harkin

description

A small wooden box housing a WiFi-connected derivative of the old “Magic 8 Ball” novelty item.  At the press of a button the unit randomly chooses a text-based bit of “advice” from a handmade API containing an ample vocabulary, for a possibility of 203,112,000 unique answers.  After downloading the JSON, the device can work offline.

It’s meant to be cryptic and mysterious, so that the user has to try to interpret what the oracle means, much like the seers of old. Users (pretty much anyone with an interest in the mysterious) can ask for advice during the day whenever they hesitate between options. The device needs to be portable so they can keep asking it questions throughout the day.

Due to the requirement of being portable, we wanted to design the device to be as small as possible. Although we wanted the final design to be small enough to fit on a key-chain, we had to compromise between time and component size. The device is still small enough to be easily portable and is very light-weight.

We want to test how useful the esoteric advice received from the  Oracle is in everyday settings, and understand the limitations of the device.

 

The Device

portable oracle

20171207_184950

Production 

20171206_125350 20171206_151155

20171207_134328

20171206_152851 20171206_160456 20171206_160505 20171206_154516 20171206_154523 20171206_125311 25035333_10159646484310057_1042479634_o 25075468_10159646484275057_1372687742_o

20171206_185059 20171206_184654 20171206_184703 print-lid print-box

20171207_193951

final Bill of Materials (spreadsheet including costs & suppliers)

 

final circuit diagram

screen-shot-2017-12-08-at-11-35-15-am

final code 

https://github.com/metanymie/portableguru

Journal

Day One (Monday, November 27)

Discussion in class regarding form and function of the project.  Two ideas seemed to fit the group’s vision:

  1. A derivative of the “Fitbit” wearable fitness/exercise monitor, but with included timers designed to help with HIIT type training and reps.
  2. A derivative of the novelty “Magic 8 Ball” which would pull “advice” or “answers” from random API sites and display them on a small (and wearable or “keychainable”) OLED screen.

After some back-and-forth, the simplicity of the “Magic 8 Ball” application appealed to all three of us and we decided to go with that.  Team to meet tomorrow approximately 16:00.

Day Two (Tuesday, November 28)

Roxanne went to Creatron to pick up the required OLED FeatherWings and extra stackable headers (in case of need) and to order the batteries. We then soldered the assemblies, downloaded and installed the required libraries (Adafruit_SSD1306 and Adafruit_GFX) and ran the example code provided on the Learn Adafruit site (https://learn.adafruit.com/adafruit-oled-featherwing/usage).  Sean then successfully tested the battery we’ve chosen to use (his being the only one with a battery so far).  We have also decided that there should be a “no internet available” subroutine that would draw from a preset array of “answers” (see below): if the “Guru” is connected to the web, it draws from a random selection of APIs for its answers; if no internet connection is available, it draws from its preset array.

Roxanne tried some of the available fonts to see which could work best. Most were too big for the screen. They also figured out how to clear the previous answer from the screen.

Dec 1st

Roxanne: Trying to get io.adafruit working with the board

First follow this tutorial to update ssl certificate

https://learn.adafruit.com/adafruit-feather-m0-wifi-atwinc1500/updating-ssl-certificates

Also install libraries: ArduinoHttpClient, Adafruit MQTT, Adafruit IO Arduino

While trying to make this tutorial work, Roxanne had to ask Nick for help, because it is not made explicit that you have to have the SSL certificate and both the ArduinoHttpClient and Adafruit MQTT libraries in place, so the sketch would not compile (rather frustratingly).

Tried to make this tutorial work so we’d be connected to the internet via WiFi: https://learn.adafruit.com/adafruit-io-basics-digital-input/overview

Success! Able to send data from the Arduino to io.adafruit. It’s only random numbers so far, but the bridge is working!

Got it working with a slider and a button, sending a value on press. Getting there!

Nick says Adafruit IO is not useful for this, so most of what we got working today won’t actually be used. Instead, we must look for a tutorial to make an HTTP client, point it at a web address, take the JSON message, and put it in a JSON object.
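
A bare-bones sketch of that HTTP-plus-JSON flow, assuming the WiFi101, ArduinoHttpClient and ArduinoJson (v6 syntax) libraries listed in the references; the network credentials, host, path and field name are placeholders, not the actual rawgit URL or grammar fields used in the final code.

#include <SPI.h>
#include <WiFi101.h>
#include <ArduinoHttpClient.h>
#include <ArduinoJson.h>

const char ssid[] = "network";        // placeholder credentials
const char pass[] = "password";
const char host[] = "example.com";    // placeholder host; the real code points at the JSON via rawgit.com

WiFiClient wifi;
HttpClient client(wifi, host, 80);

void setup() {
  Serial.begin(115200);
  while (WiFi.begin(ssid, pass) != WL_CONNECTED) {
    delay(2000);                      // keep retrying until the Feather M0 WiFi connects
  }

  client.get("/advice.json");         // fetch the grammar file
  int status = client.responseStatusCode();
  String body = client.responseBody();

  StaticJsonDocument<1024> doc;       // size depends on the real grammar file
  DeserializationError err = deserializeJson(doc, body);
  if (status == 200 && !err) {
    const char* phrase = doc["origin"][0];   // placeholder field name
    Serial.println(phrase);
  }
}

void loop() { }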

Sean: We also spent some time working on the back-up option of preloaded responses on the feather. We already had the basic functionality of a display screen asking if you need advice, being able to push 1 of the 3 buttons and getting a different response.

From here, we wanted to build a basic random function where any button push would pull from a list of responses

Dec 5

Day 1 Build: The original idea was to build the casing from wood. We enjoyed the idea of a little wooden box you carry around and ask for advice, as well as the contrast of the organic wood with the digital OLED screen. However when Sean went to build, he ran into some issues:

  • The wood we had available was mostly ply, and mostly low-grade (3-4 ply)
  • Even if we were to source better materials, the time constraints of the build would not allow for the quality we initially pictured for our product.

Tomorrow, we will either begin 3D printing the prototype cases or use acrylic. If we are unable to create the 3 casings in sufficient time tomorrow, Sean can quickly build another two ply cases for testing purposes; with some work they could be brought to a fair standard before the presentation.

At last week’s class, Kate suggested that we remove the last answer from the screen after a certain amount of time, which seemed like a great idea to encourage interactivity with the user. Despite some small hiccups, we got this functionality into the code. 
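
For reference, that timeout can be done non-blockingly with millis() alongside the Adafruit_SSD1306 calls already in use; the 10-second value, variable names and 128x32 constructor here are illustrative rather than lifted from our sketch.

#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>

Adafruit_SSD1306 display(128, 32, &Wire);    // OLED FeatherWing is 128x32 on I2C

const unsigned long ANSWER_TIMEOUT = 10000;  // ms to keep an answer on screen (illustrative)
unsigned long answerShownAt = 0;
bool answerVisible = false;

void showAnswer(const String &answer) {
  display.clearDisplay();
  display.setCursor(0, 0);
  display.print(answer);
  display.display();
  answerShownAt = millis();
  answerVisible = true;
}

void setup() {
  display.begin(SSD1306_SWITCHCAPVCC, 0x3C); // I2C address of the OLED FeatherWing
  display.setTextSize(1);
  display.setTextColor(SSD1306_WHITE);
  showAnswer("The oracle is listening...");
}

void loop() {
  // ... button handling and advice generation happen elsewhere ...
  if (answerVisible && millis() - answerShownAt >= ANSWER_TIMEOUT) {
    display.clearDisplay();                  // wipe the old answer to invite a new question
    display.display();
    answerVisible = false;
  }
}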

 

Dec 6

Day 2 of build went well. Sean had access to the Ultimaker, meaning he began printing a version. Even if we do not have time to print 3, he thought it might be interesting to see an example of how the MK1 case design might look printed.

Sean also experimented with some different materials. First of all he tried some acrylic. Although it was easy to work with and joints were easy to make using the adhesive bond, it shared the plywood’s weakness of being too brittle. Although there is little-to-no worry of it splintering or fracturing, Sean was cautious of components snapping during prolonged use. While waiting on the print, we decided to return to wood. We found some MDF in the workshop and began working. It was also around 1/4-inch thick but seemed to work well. With some help from the esteemed Reza, Sean was able to build a fairly well-crafted box. He will return in the morning to complete the lid, and then all we have to do is upload the final version of the code and begin testing.

Roxanne set out to figure out a way to get the Feather to make requests to the server. After many hours of coding, the Feather is able to connect to the raw JSON file on their GitHub. They used a JSON grammar written with the visual tool Tracery (http://www.brightspiral.com/tracery/).

We didn’t succeed in splitting one string array into many strings so that it would create a sentence structure from the origin array, as in a proper Tracery grammar. Instead, we hardcoded the sentence structure, telling the program which array to pull a random phrase from. As a result, the advice sentence is always created according to the same pattern.
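
In outline, the hardcoded approach looks something like this; the word lists and the three-slot pattern below are made-up placeholders standing in for the vocabulary arrays pulled out of our Tracery JSON.

// Illustrative stand-in vocabulary (the real arrays come from the downloaded JSON grammar).
const char* openers[]  = {"You have to", "Have you tried to", "You could always", "Just"};
const char* actions[]  = {"let go of", "fight", "implement", "not say"};
const char* subjects[] = {"the answer", "your mom", "her", "life"};

// Pick a random entry from any of the arrays above.
#define PICK(arr) (arr[random(0, sizeof(arr) / sizeof(arr[0]))])

String buildAdvice() {
  // Hardcoded sentence pattern: opener + action + subject, always in the same order.
  return String(PICK(openers)) + " " + PICK(actions) + " " + PICK(subjects) + ".";
}

void setup() {
  Serial.begin(115200);
  randomSeed(analogRead(A0));   // seed from a floating analog pin
  Serial.println(buildAdvice());
}

void loop() { }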

We also had a problem with the server address: because we put the JSON on GitHub, it was hard for the Feather to connect to a protected HTTPS server. We resolved it by using the website rawgit.com.

Dec 7

Sean finished the box today with some help from Reza. The 3D print, although functional, had a less pleasing aesthetic feel. He continued working with the MDF casings as these seemed to be the most stable and easy to work with. Originally we had planned to use a simple dowel lid which could be pulled off for maintenance and repair, but Reza suggested and helped build a magnetic lid – less stable, but much easier for getting access to the internal components. Similarly, we had thought of using rubber or foam for the button, but Reza suggested adding some contrast with the wood in the form of colourful acrylic. With the addition of some hot glue – which acts as suspension – the button works effectively and consistently.

Overall, we were fairly happy with the final product shape. Going forward, we would definitely want to print our own board in order to cut down to a more convenient size – namely something which could attach easily to a keychain and be unobtrusive in a user’s pocket or bag.

Roxanne was finishing up the code and making the JSON create more interesting sentences. With all of the vocabulary they added, there is a total of 203,112,000 possible unique pieces of advice. When writing their code, they were really inspired by oracles such as the Yi Ching, so their new sentences are more mystical than goofy.

Example sentences:

You have to not let go the answer vis-a-vis death . You’ll get hurt.

Have you tried not to see life and them . Who cares.

Have you tried to fight your mom vis-a-vis life … but it’s a waste of time.

I know! Just implement her with love . Alea jacta es!

You could always not say the advice concerning her … LOL

User Testing Materials

user testing plan

Prep

No extra supplies needed, possibly just a short time to recharge the battery during testing sessions. We also should not require any repair materials.

Plan

Testing will be recorded through a combination of photographs, video and journaling.

There may be some deviation in the recording methods, as they will be dependent on where the testers are while testing the product. However, we will aim to be as consistent as possible.

As of today, we’re planning on testing Wednesday, or possibly Thursday depending on build-time.

Since our devices are not in constant use, our plan is to test for 24 hours, using the device whenever we need to make a decision or need advice. The users (us) will make our own interpretations of the advice given and will record this process of interpretation and outcome.

The testing will be conducted during day-to-day activities. We will most likely be independent when testing, as the devices do not require interaction.

After

The debrief will be held as a discussion/write up session after testing, and before our blog-post is submitted.

The user photo/video journal will be added to the blog post, along with comments from the users.

link to end-of-session report forms

https://docs.google.com/forms/d/e/1FAIpQLScYSNOkyMNSGYy30nYR-r6JU89ptl06flzWJ_yRy-4hUoioBg/viewform?usp=sf_link

photos, video, or other media gathered in the field

20171207_18155220171207_215854 20171207_221113 20171207_224903 20171207_204359  20171207_21050520171207_184950 20171207_185006 20171207_185025 20171207_193652 20171207_193658

20171207_205057 20171207_205110 20171207_205117 20171207_205128 20171207_205807 20171207_170047 20171207_211515 20171207_213045

summary of the testing process 

We set out to use the device for a day, to see how useful it would be and to offer its advice to other people we would be interacting with.

For Sean’s testing period, he spent around 20 hours testing the device on any and all sorts of decisions he made during this time. He also conducted mini user testing in a social environment, asking friends and passers-by to try the device. Due to the short amount of testing time we had available, we were unable to hand the devices off to users for prolonged user testing, so we were only able to conduct these tests ourselves.

reflections on your findings

The results of the user testing were interesting, though they mostly proved what we already knew. Some users took to the philosophical and mysterious overtones of the experience very well; however, a lot of users did not seem to understand their role in interpreting the advice. This is not necessarily a weakness of the product so much as evidence that we have a specific target market.

EXHIBITION

20171211_152058 20171211_152114

References & Related Works

references

Adafruit Feather M0 WiFi with ATWINC1500: Updating SSL Certificates: https://learn.adafruit.com/adafruit-feather-m0-wifi-atwinc1500/updating-ssl-certificates

ArduinoJson: Manual: http://arduinojson.org/doc/

What and where are the stack and heap?: https://stackoverflow.com/questions/79923/what-and-where-are-the-stack-and-heap

WiFi101 Library: https://www.arduino.cc/en/Reference/WiFi101

Arduino HTTP Client library: https://github.com/arduino-libraries/ArduinoHttpClient

RawGit: https://rawgit.com

 

Excuse Me Accessories

Title: Excuse Me Accessories

Group: Savaya Shinkaruk, Tommy Ting, and Chris Luginbuhl


Project Description

Our product was created from thinking about a project for our assignment Portables / Multiples.

The goal of this experiment is to create robust interactive prototypes that we can make more than one of and that we want to travel with us. – Nick and Kate.

When coming up with ideas for this project, we came up with some really cool and interesting concepts, but none we personally wanted to test or use in our everyday lives. So, after lots of thinking over a few days, we came up with Excuse Me Accessories.

We go more into depth about the journey of our process towards making our product in our blog below, but the overall description of our project is:

To help people with their productivity on any task or activity we created the Excuse Me Accessories products. By using the Pomodoro Technique (work for 20 minutes – break for 5 minutes), people can follow this technique with their wearable or desktop accessory.

So, continue on to the rest of our page to read more about the Excuse Me Accessories team and our journey.


About team

Savaya Shinkaruk: Savaya Shinkaruk is a fashion stylist and journalist with a keen interest in blending components of the online fashion industry with design. She graduated with a BA in communications in 2017 and is completing her MDes in Digital Futures at OCAD University.

Chris Luginbuhl: Chris Luginbuhl is an engineertist who likes to create new words and new ways of using technology to help the world.

Tommy Ting: Tommy Ting is an artist and emerging video game designer, currently a first-year MFA student in Digital Futures at OCAD University studying game design and development.


BLOG: PROCESS JOURNAL

DAY ONE

DAY ONE OF OUR EXPERIMENT:

November 24, 2017

After everyone presented the projects for Experiment #4, Kate and Nick assigned us our group members and talked about the description for Experiment #5.

The three of us got paired together (Savaya, Chris, Tommy), and we decided to take the weekend to each come up with ideas for this project for Monday.

End of day one.

Thanks!


DAY TWO

DAY TWO OF OUR EXPERIMENT:

November 27, 2017

Today in class Kate went further into the project description for Experiment #5.

She went over the requirements and the deliverables for our project.

For this experiment we are to produce 3 copies of the prototype we are intending to code and design.

During class, each group was given roughly 45 minutes to come up with 5 ideas for their project. This is the first time we started to come up with ideas for this experiment.

Here are the ideas we came up with:

  • Night time to Daytime t-shirt wear – exploring clothing that changes in appearance in light and dark environments – perhaps becoming more or less revealing, or adding lighting effects.
  • Linking Maps to Instagram to source the best Instagram photo opportunities in Toronto
  • Dance Wearables – clothes and accessories that produce sound and light effects in response to movement. Used for performance dance, and just dancing for fun.
  • Real life Social Media – wearing a small Eddystone/Physical Web/Puck.js beacon to broadcast your key interests via bluetooth around OCADU. You are alerted when someone within range (~10m) has overlapping interests. You then have to find the person and have a conversation to figure out your common interests.
  • Uncomfortable fashion – exploring fashion that places demands on the wearer, and negotiates with them to adjust their posture, movement and behaviour.
  • Now you see me, now you don’t – exploring the theme of wanting to be looked at or wanted to not be looked at in public. In particular, clothing that can tighten or loosen, fasten or unfasten, illuminate the wearer, or conceal.

The common thread we had for all of these ideas is that we want to make a wearable fashion item. It could be a shirt or an accessory. We are more interested in creating an accessory for this experiment.

Before class ended we had to pitch an idea (that we could change later on, and we did) to Kate so she knew the direction we were headed.

We talked to her about going with the theme of “now you see me, now you don’t”, and we wanted to approach this concept by creating a device that unlocks your garments at a certain time of day (like sunset), or by making a hat with lights that turn on at sunset.

These are some of the ideas we are running with, and we are going to meet on Wednesday, November 29, 2017 to settle on an idea.

End of day two.

Thanks!


DAY THREE

DAY THREE OF OUR EXPERIMENT:

November 29, 2017

Today we met in the DF lab to start to brainstorm more ideas for this experiment.

Here is a timelapse of us meeting and chatting about what we want to do:

The new idea we came up with is:

We are going to create a wearable product that is a navigation system informing the user when they are travelling north. In addition, when daytime turns into night and people start to feel nervous walking around in the dark, our product will light up so the user feels safer when walking home.

We want to iterate this idea, but this is the initial concept we are going to start with and then build off of.

Some of the iterations we are looking to add are:

  • Have the navigation move in all directions: NORTH, SOUTH, EAST, WEST (like a compass)
  • Build the Puck.js on a magnet, so the user can easily move the item from their wrist, to neck, to finger.
  • For safety purposes, have the device work so that if the wearer finds themselves in an unsafe situation, clicking the button sets off an alarm and a blinking light so the attacker leaves.

We ended up having to leave for class after these notes and work on other assignments, so we are taking a break from this and will meet again tomorrow.

End of day three.

Thanks!


DAY FOUR

DAY FOUR OF OUR EXPERIMENT:

November 30, 2017

We met again today in the DF lab to go over the ideas we came up with yesterday.

We are still interested in our idea, so we are going to work on getting our proof of tech and proposal done for class tomorrow, as that is due.

We got to work on setting up the Puck.js (EMA) devices over Bluetooth.

Here is an image to show the pairing:

pairing-our-puck-js-device-to-bluetooth

To get our Puck.js (EMA) devices up and running, this is the link we followed to get them connected over Bluetooth and to start doing tutorials on how to code them: https://www.espruino.com/Puck.js+Quick+Start

Each Puck.js (EMA) device has a 4-digit number that will be linked to our individual MACs so we know which Puck.js (EMA) is linked to whose computer.

This is Savaya’s:

bluetooth-code

We are going to use this code from this link for our proof of tech code for class on Friday, Dec 1, 2017:  http://forum.espruino.com/conversations/296979/?offset=25

Just kidding: in the end the code from the link above is not working, because we need to subtract the ambient magnetism in the area we are using the device in for it to calibrate, and we need to set which axis the device is mounted on. The original code was not robust enough and isn’t working.

Here is an image of the first try with the original code from the link above:

firsttrycompass

So, we have sourced new code, and here is the link: http://forum.espruino.com/conversations/297915/

We are playing around with this new code because it looks like it will be easier for us to set an axis with a direction.

map

In the end, we are using the code in the link above as a template, but adding our own code to it. The code we are adding and changing assists with the calibration of the axis movement.

Here is a link to the GitHub that has the code we worked on and Chris implemented for our project: https://github.com/ChrisLuginbuhl/WalkWear/blob/master/WalkWear.js

Here is a first trial video of the Puck.js (EMA) working with the code:

This code is what we will show for our proof of tech, due tomorrow (Fri, Dec 1). We are going to iterate on it a bit as time goes on, but this is the starting point: it shows that when we move in a direction, the device detects it and shows which direction you are headed (NORTH, SOUTH, EAST, WEST), as shown and talked about in the video above.

From there we started to work on our design ideas and noted what materials we will need for tomorrow.

Some of the design ideas we came up with for the Puck.js (EMA) are:

  • Original Idea: Magnetise the back side of the Puck.js (EMA) and then have a magnet it will attach to on the bracelet/watch, ring, and necklace. We like this magnet idea because it will be easy for people to move the device from location to location without having to undo something and tie it up again. However, the issue with this is that the magnet will interfere with the compass too much, which will ruin the concept.
  • Prototype Idea: For Friday, Dec 1, we are going to do a prototype with velcro – which will showcase the same concept as the magnet, but won’t interfere with the calibration or compass.
  • Iteration Idea: We also like the idea of how Pop Sockets work. With Pop Sockets you can also purchase a car mount that you slide the Pop Socket into, and it will hold your phone. We have attached images below to show what we mean:

 

pop-socket-mount popsocket

We also started to think of some working titles:

  • Wear and go
  • Navi – Direct
  • Walk Wear – WINNER!

For Friday Dec 1, here is a list of the materials we will need for our prototype:

  • Velcro
  • Wrist band – bring these products from home
  • Puck.js (EMA)

What is due for Friday Dec 1, 2017:

Proposal:

Working Title: Walk Wear

Group Members: Savaya Shinkaruk, Chris Luginbuhl, and Tommy Ting

Concept: For experiment number 5, Chris, Tommy, and Savaya are creating a wearable product: a navigation system that informs the user if they are travelling north. In addition, when day turns into night and people start to feel nervous walking around in the dark, our product will light up so the user feels safer when walking home.

Form: We are going to be making a wearable product that can be worn either on your wrist, hand, or neck. We are designing it so the device can be removed and then placed between your ‘bracelet / watch band’, necklace band, or ring band. For our prototype we are going to use velcro.

Electronic Component: https://docs.google.com/spreadsheets/d/1me4clmdyE9FGIMsQC5aUXlfaip62lC03WpXSjYtY48Q/edit#gid=0

Proof of tech:  https://youtu.be/vgKwKgTtEaA

Materials and production process for enclosure: The Puck.js (EMA) already comes built as its own enclosure, so we are just building around it to make it into a device that can be worn on your wrist, neck, or hand.

We will work tomorrow in class on the next iteration steps of this product.

End of day four.

Thanks!


DAY FIVE

DAY FIVE OF OUR EXPERIMENT:

December 1, 2017

We had class today, where we handed in our proposal (shown above) and then had individual team meetings with Nick to go over our project.

When talking to Nick in class, he told us that we should try to focus on one thing, and use only what the Puck.js (EMA) provides rather than adding extra information and features on top.

The big thing for us to focus on is the product: what is our intention and goal, and how are we going to make the Puck.js (EMA) look good and look like something? Ultimately, we have to make it into something, because on the technical side – the input + output portion – it already does so much.

After talking to Nick, Savaya left because she was feeling under the weather, so Chris and Tommy met to chat about some new ideas for this project.

We planned to meet tomorrow to go over what Chris and Tommy talked about.

End of day five.

Thanks!


DAY SIX

DAY SIX OF OUR EXPERIMENT:

December 2, 2017

Today we met to go over what was talked about yesterday during class hours.

And in the end we decided to go with a new idea and concept for this experiment.

New concept:

We want to make a productivity assistant: a smart device that informs the user when it is time to take a break and when it is time to get back to work. Using green and red light, it will tell the user when break time starts and ends. Along with this, when it is time for the user to take a break they have the option to have a screen pop up showing the agenda of things they have to do, or to have their display sleep for the allotted time until break time ends. To turn the device off, hold down the button until it shuts off.

For the display options we are looking at using this software for that side of our product: https://www.boastr.net/downloads/

Here is a list of some of the brainstorming ideas we came up with for this new concept:

  • Productivity app
  • Hardware
  • Assistance
  • Lifestyle
  • Not connected to anything – it’s a smart device (the Puck.js (EMA))
  • Does not turn anything on but turns things off
  • The goal is to disconnect – and you as the user have the choice of what that might be
  • Then when to go back to work
  • Just have the colours of Green and Red – but before it shuts OFF gives you warning
  • 20 min on – 5 off and then 25 min off – the break / work schedule.
  • Keep it a wearable
  • Keep it all in the puck.js (EMA)

We are following the Pomodoro Technique for this system: you work for 20 minutes and then take a 5-minute break, and repeat this for an hour; after doing this for an hour, you work for 20 minutes and then take a 20-minute break.

What would the user use this device for:

We talked about the ways we would individually use this:

Savaya: I would use this not so much when I am working, but for when I am taking time off work to shop online.

Tommy: I would use this for when I am working and need a break to relax and then get back to work.

Chris: I would use this for focus computer time and remembering to take breaks.

In the end we all agreed people can use this productivity device for:  school / work / fun

For design options we would like to:

  • Have a wearable option – something like a bracelet or a watch
  • Have an option for people to put this device on their desktop so they can see it

We want to have two options for people because we believe this will help when we put this product on the market, so people can choose how they want to use it and wear it as well!

Here is a sketch of the wearable design we are thinking of:

watch-sketch

For the wearable part of this device we are looking to purchase watch bands like the ones below and to 3D print a housing for the Puck.js (EMA) to sit in, with the watch bands connecting to watch band connectors (shown below).

Watch bands:

ordered-watch-bands

 

Watch band connectors:

idea-watch-band-connectors

We shopped around Spadina and Queen West at multiple brick-and-mortar stores to find the materials to make our product and to figure out the size of what we need to 3D print for the Puck.js (EMA) to sit in / on, but we couldn't find the right materials.

So we are ordering the items from Amazon, which should be here Monday.

We are ordering the watch bands shown above.

We decided on these watch bands because they make it easier for people of all wrist sizes to wear this device without having to remove links.

We are not, however, ordering the watch band connectors, because we are going to 3D print a holder for the Puck.js (EMA) to sit in.

watchv3 watch-design watch-spring

New names/titles:

We had the title as Walk Wear, but are changing it now as we changed our concept.

  • Wear Away
  • Walk Away
  • Deskside
  • Deskside Breaktime
  • Deskside Wearable
  • Excuse Me Accessories = WINNER!

Now we are working on getting the first part of the code working: the red and green light running on a button press, and also having a click of the button turn the display off and on.

Here is a video of us turning Chris's computer display off by clicking the Puck.js (EMA):

We are going to finish here today. We need to order the watch parts for this wearable, and we are going to work on Monday to 3D print what we need, so that by Tuesday we have the wearable made.

End of day six.

Thanks!


DAY SEVEN

DAY SEVEN OF OUR EXPERIMENT:

December 3, 2017

Today all three of us worked separately, but kept a communication line open via Facebook Messenger.

Today we spoke more about the design of the watch, the theme colours, and branding.

We decided on these colours, this font, and the included stickers for our branding because we wanted to evoke a 'kid friendly' feeling. When people are on a break we want them to feel like they have the freedom to do whatever they want, kind of like how kids act when they are playing with their toys.

We decided on this theme colour for the brand / product:

exp5-colour-theme

We also started to think about the design of the desktop holder for this product. Here are a couple ideas we came up with: ADD IMAGES

And we started to work on our branding too!

Here is one concept we are working on for the title:

branding

This is all we worked on today, independently, and communicated over Facebook Messenger.

End of day seven.

Thanks!


DAY EIGHT

DAY EIGHT OF OUR EXPERIMENT:

December 4, 2017

Today we re-connected in class and sent in our Testing Plan Proposal:

Testing Plan Proposal: Savaya, Chris, Tommy:

  • Preparations
    • What needs to be done ahead of time? Each person in our group needs 1 wearable and 1 desktop piece to either take home, take to work, or leave at school – it is up to them where they want to test it. We just need to make sure each tester has both options, including the package given with Excuse Me Accessories.
    • Do you need extra batteries? There is the option to include extra batteries in the package each tester will receive.
    • What goes into your repair kit? During the trial period of testing there will be velcro and tape supplied in the package in case anything breaks; there is also a return slip so the product can be sent back to Excuse Me Accessories to get fixed.
    • Be sure to take “before” photos. ADD IMAGE HERE.
  • The Plan
    • How will you document your own user experiences while using the device? Notes? Journaling? Photos? Video? Audio recording? Each of us (3 group members) will document our own experiences with Excuse Me Accessories by: Journaling, Photos, Video, Audio recording and a Questionnaire that we have created.
    • What will you do to ensure that your data collection methods are consistent for each group member? To ensure our data collection methods are consistent, we will give a Start and Finish time of testing that each group member has to follow. Also with answering a standard Questionnaire about the product.
    • For each group member, what are the dates & times of testing?
    • Savaya: Wednesday DEC 6 @ 9 AM – 4 PM:    AND Thursday DEC 7 @ 12 PM – 6 PM
    • Chris: Wednesday DEC 6 @ 9 AM – 4 PM:    AND Thursday DEC 7 @ 12 PM – 6 PM
    • Tommy: Wednesday DEC 6 @ 9 AM – 4 PM:    AND Thursday DEC 7 @ 12 PM – 6 PM
    • If there is a reason that (2) 6-hour testing periods don’t make sense, include a proposal for a reasonable equivalent for your device and get sign off from Kate. Not needed.
    • Will you be together or having your own experiences? We will each be having our own experience using this product.
    • Will you be doing your day-to-day activities or doing a special activity? We will be doing our day-to-day activities because we want this product to make sense for everyone and their day-to-day activities.
    • Any other details? For this testing period, we have said we cannot work on this direct assignment.
  • End of Session Reports
    • You are required to create End of Session Reports. Create a survey / form using Google Forms for each group member to fill out at the end of their 6-hour testing periods. You will end up with 6 entries (3 users x (2) 8-hour testing periods.) Link to your form here. Each of us will complete the standard Questionnaire at the end of both of our testing periods, and take a video sharing our experience of what we liked and did not like.
  • After – Crunching the data & documentation
    • After the field testing, how will your team structure a debriefing conversation? Each of us will read each other’s Questionnaire and watch one another’s videos, and come up with solutions to the things they didn’t like. And then we will have a discussion group / thinking out loud session about what worked and did not, and what we should update.
    • What will you do with the data and media once you find it? We have decided to not work on this assignment during testing period because we want to make each of us use this product as though we had just bought it off the shelf. When it is a break time and there are important things to note we can write it down so we don’t forget, but that is it. The goal of this testing period is to also see and discover ways to use this device and product.
    • How will you display or present your observations & findings? We will present this by doing a blog post, and video presentation.
    • Be sure to visually document each prototype after testing is complete and make notes on what state they’re in. Done deal.

During class we also talked about our branding. We created stickers to go over the Puck.js (EMA) to dress it up. We will need to find clear adhesive stickers and change the opacity of some of the sticker colours so they don't cover the green and red light.

We also worked after class with the BetterTouchTool software, to see how it works, play around with it, and see which trigger makes the most sense for the user.

BetterTouchTool is a software application for Mac that allows you to create triggers and gestures that result in an action. We are playing around with it so that when your computer wakes up from sleep, your reminders app / page will pop up on your screen so you can see what you need to do.

better-touch-tool-commands

We are also going to meet after class in the DF lab around 6:00 PM to discuss what we need to do tomorrow and what we aim to finish.

End of day eight.

Thanks!


DAY NINE

DAY NINE OF OUR EXPERIMENT:

December 5, 2017

Today we worked on building our watches, finishing the code, and making our desktop accessories.

To start the day, we worked on ordering the proper stickers we need for our product design. We went to Staples and Michaels but they didn't have the right ones, so we ordered some from Amazon: https://www.amazon.ca/gp/product/B007Z7LQ54

And we then put our designs into the proper template, and we will print them off tomorrow!

We also purchased our desktop accessories and painted them!

painted-desktop-accessories desktop-accessories

From there, we put together our first watch!

 

The middle part is 3D printed, and the pink straps are the ones we purchased from Amazon.

The 3D-printed part of the watch is slightly too far apart where we need to attach the watch bands, but with a soldering iron we were able to push the sides together. We might reprint these again just so the bands do not fall off when someone is wearing them.

While working on the code we also learnt more about how we will be using BetterTouchTool with the code.

We are using BetterTouchTool to pull up people's note-taking and reminder apps on their computer after it wakes up from sleep mode.

We are also using BetterTouchTool as a command to put your display to sleep by hitting Control-D. This is the hotkey.

With the code, we have it working so that when you press the button on the Puck.js (EMA), the light turns GREEN and it starts a 20-minute timer (working time); after 20 minutes the light turns RED and it sends the hotkey (the BetterTouchTool command, Control-D) to put the screen display to sleep. A blinking green light = break time over, but it will not turn your computer display back on.

We are working on the code to add a 5-minute timer for break times, and right now there is no way to turn off the computer altogether.
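To make the behaviour above concrete, here is a rough sketch of the timer logic in Espruino JavaScript. It is not our final code (that lives in the GitHub repo linked earlier); the 20-minute / 5-minute values match the plan above, and the sendSleepHotkey() helper is a placeholder for the Bluetooth HID hotkey that BetterTouchTool picks up.

```javascript
// Rough sketch of the Pomodoro logic on the Puck.js (LED1 = red, LED2 = green).
var WORK_MS  = 20 * 60 * 1000;   // 20 minutes of work time
var BREAK_MS =  5 * 60 * 1000;   // 5 minute break (the part we are still adding)

function sendSleepHotkey() {
  // Placeholder: in our build this sends the hotkey over Bluetooth HID,
  // and BetterTouchTool turns it into "Sleep Display" on the computer.
  console.log("send hotkey: sleep display");
}

function startWork() {
  LED1.write(0);                 // red off
  LED2.write(1);                 // green on = work time
  setTimeout(startBreak, WORK_MS);
}

function startBreak() {
  LED2.write(0);                 // green off
  LED1.write(1);                 // red on = break time
  sendSleepHotkey();             // put the computer display to sleep
  setTimeout(endBreak, BREAK_MS);
}

function endBreak() {
  LED1.write(0);
  // Blink green to say the break is over; the display is not woken automatically,
  // and the user presses the button again to start the next work session.
  var count = 1;
  var blink = setInterval(function () {
    LED2.write(count++ % 2);
    if (count > 6) { clearInterval(blink); LED2.write(0); }
  }, 300);
}

// Pressing the Puck's button starts a work session
setWatch(startWork, BTN, { repeat: true, edge: "rising", debounce: 50 });
```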

Here is a video of wearing the device and turning your display on after a break and off to take a break:

End of day nine.

Thanks!


DAY TEN

DAY TEN OF OUR EXPERIMENT:

December 6, 2017

Today we met at 10 AM in the DF Lab to finish building the watch accessories and update the code too.

Here is a video of how you would pair your Puck.js (EMA) to bluetooth / the code we are using:

We also wrote an instruction manual for users to follow and read so they know how to use their Excuse Me Accessory.

Today is the first day for our testing period!

Testing Period #1:

Duration: 12 PM – 6 PM (for all of us).

  1. Take before videos / pictures of where you are testing and what accessory you are testing with and how you will be testing it.
  2. Write notes during testing period about things you like / don’t like.
  3. Take video during the 6 hour period of what you are doing and how you are using it. (We won’t be filming the full 6 hour period, just intervals of it).
  4. Test these accessories however you would like to!
  5. At the end of the 6 hour period take a video of your findings.
  6. At the end of the 6 hour period also complete the survey: https://docs.google.com/forms/d/e/1FAIpQLSf7FQ9jBOA8BOaoMN2CQpqUsHTAXv-PGnW7K97MhyYRyEu6Ww/viewform

The dates and times of testing period #1 are as follows:

All of us will be testing for 6 hours today, independently, from the following locations:

Savaya: Will be testing from home using both the wearable and desktop accessory

Tommy: Will be testing from school (in the DF lab) using both the wearable and desktop accessory

Chris: Will be testing from school (in the DF lab) and at home using both the wearable and desktop accessory

*Chris, Tommy, and Savaya will be using our Apple Computers.

We also decided that if we want to update any part of the code and test the new version, rather than waiting for tomorrow's testing period, we will check in with one another every 2 hours to see how things are going, update the code then, and test it during Testing Period #1.

We have individually written our own notes and taken video of our first testing period, and in our blog post we have added links to each of our observations. (Links are shown in the Final Project Blog Post.)

Before testing started we worked on the setup and plan for Friday’s presentation of our product.

We took some of the required photos we need for our presentation on Friday.

After we got the code up and running, chatted about Friday, went over our testing schedule, and took some images of our product and device, we went on our own paths to test our product for the first time.

Notes on how things went for all of us will come soon!

End of day ten.

Thanks!


DAY ELEVEN

DAY ELEVEN OF OUR EXPERIMENT:

December 7, 2017

Today we met in the DF lab to take video for our final presentation on Friday.

Today is also the second day of our testing period!

Testing Period #2:

Duration: 12 PM – 6 PM (for all of us).

  1. Take before videos / pictures of where you are testing and what accessory you are testing with and how you will be testing it.
  2. Write notes during testing period about things you like / don’t like.
  3. Take video during the 6 hour period of what you are doing and how you are using it. (We won’t be filming the full 6 hour period, just intervals of it).
  4. Test these accessories however you would like to!
  5. At the end of the 6 hour period take a video of your findings.
  6. At the end of the 6 hour period also complete the survey: https://docs.google.com/forms/d/e/1FAIpQLSf7FQ9jBOA8BOaoMN2CQpqUsHTAXv-PGnW7K97MhyYRyEu6Ww/viewform

The dates and times of testing period #2 are as follows:

All of us will be testing for 6 hours today, independently, from the following locations:

Savaya: Will be testing when walking around Toronto to run errands, using the wearable

Tommy: Will be testing at his gym using the wearable

Chris: Will be testing from home using both the wearable and desktop accessory

*Chris, Tommy, and Savaya will be using our Apple Computers.

We have individually written our own notes and taken video of our second testing period, and in our blog post we have added links to each of our observations. (Links are shown in the Final Project Blog Post.)

We also worked on our instruction manual:

  • To turn on, pair your device to Bluetooth by going to this website: https://www.espruino.com/ide/ – click the yellow square in the top left corner, choose a device number that hasn't been taken, then click PAIR and wait for it to connect.
  • When you have to work the light will be GREEN.
  • When you have to break the light will be RED.
  • When the GREEN light is blinking it means break time is almost over.
  • When it is break time your screen will sleep.
  • To make your computer sleep go to BetterTouchTool and go to the tab that says Keyboards – then click +Add New Shortcut or Key Sequence and in Shortcut put ‘control – option – D’ then go to Trigger Predefined Action and select Sleep Display.
  • Working with BetterTouchTool.. You can include commands to trigger an app to launch when your screen wakes up. Go to the previous action above and then click on it and choose +Add New Shortcut or Key Sequence and select Trigger Predefined Action and choose Open Application / File / Apple Script… in Controlling other Applications list.
  • To turn the device off, hold down the EMA button for a few seconds until the light turns off.
  • You are given an extra battery
  • You are given stickers
  • Desktop accessory
  • Watch Band
  • EMA 3D printed holder
  • To turn it on you click the EMA once quickly

Here is an image of the battery being used:

battery

Here is the final instruction manual, which users would get in their package when purchasing this product:

https://docs.google.com/a/ocadu.ca/document/d/1zW0JyQokGSjLWK2eN_BKDi7QTTDJu2hzPR6c5kjh5lM/edit?usp=sharing

End of day eleven.

Thanks!


DAY TWELVE – THE REVEAL OF EXCUSE ME ACCESSORIES

DAY TWELVE OF OUR EXPERIMENT:

December 8, 2017

Today we are presenting our product to our class, and we are so excited about it!

Here is our code: https://github.com/ChrisLuginbuhl/WalkWear

Here is an image of people trying on our product: yiyi

Here is our video / presentation: https://vimeo.com/246383252

This is our final project description:

We created a product that assists people with their productivity by using the Pomodoro Technique. We designed and made the Excuse Me Accessories, which are available as wearables and desktop accessories so you can choose how you want to wear or use our device. In the delivery package you will receive the EMA device, a 3D-printed EMA holder, a watch band, a desktop accessory, an extra battery, and stickers.

End of day twelve.

Thanks!


FINAL PROJECT BLOG POST

Project overview:

Project title: Excuse Me Accessories

Group: Savaya Shinkaruk, Chris Luginbuhl, Tommy Ting

Project description, including overview of object and intended context and users:

To help people with their productivity on any task or activity, we created the Excuse Me Accessories products. Using the Pomodoro Technique (work for 20 minutes, break for 5 minutes), people can follow this schedule with their wearable or desktop accessory.

We tested this product in multiple ways because we want it to be usable and available for people to use however they choose. It is great to use at school or during work hours to help you focus on the tasks at hand, but it is also a great product to use when out and about running errands, to know how much time you are spending in stores shopping.

2-minute video presenting the portable & summarizing the field testing plan & results:

https://vimeo.com/246383252

Image of device on its own:

img_7131

img_7146

Images with device being worn / used by each group member:

portraits-1

portraits-2

portraits-3

Production materials

Design files: more are in the blog above. 

stickers stickers3 stickers2

Final Bill of Materials (spreadsheet including costs & suppliers):

https://docs.google.com/spreadsheets/d/1clJ1SG9paU3YRE_ewiWXgeS4Wr1ZE-hx4iZrrl9CGBc/edit?usp=sharing

Final circuit diagram:

watchv3

ema

pugk-js-device

Code:  https://github.com/ChrisLuginbuhl/WalkWear

User testing materials

User testing plan:

For both of our testing periods we stuck to this consistent formula, but had the freedom to use the product as we each saw fit.

Duration: 12 PM – 6 PM (for all of us each day).

  1. Take before videos / pictures of where you are testing and what accessory you are testing with and how you will be testing it.
  2. Write notes during testing period about things you like / don’t like.
  3. Take video during the 6 hour period of what you are doing and how you are using it. (We won’t be filming the full 6 hour period, just intervals of it).
  4. Test these accessories however you would like to!
  5. At the end of the 6 hour period take a video of your findings.
  6. At the end of the 6 hour period also complete the survey: https://docs.google.com/forms/d/e/1FAIpQLSf7FQ9jBOA8BOaoMN2CQpqUsHTAXv-PGnW7K97MhyYRyEu6Ww/viewform

The dates and times of testing period #1 are as follows:

All of us will be testing for 6 hours today, independently, from the following locations:

Savaya: Will be testing…*location

Tommy: Will be testing…*location

Chris: Will be testing…*location

*Chris, Tommy, and Savaya will be using our Apple Computers.

We have individually written our own notes and taken video of our first testing period, and in our blog post we have added links to each of our observations. (Links are shown in the Final Project Blog Post.)

Link to end-of-session report forms:

Here is our feedback from the surveys we each completed after both of our trial runs:

Savaya:

Testing Period ONE: https://docs.google.com/forms/d/e/1FAIpQLSf7FQ9jBOA8BOaoMN2CQpqUsHTAXv-PGnW7K97MhyYRyEu6Ww/viewanalytics

Testing Period TWO: https://docs.google.com/forms/d/e/1FAIpQLSf7FQ9jBOA8BOaoMN2CQpqUsHTAXv-PGnW7K97MhyYRyEu6Ww/viewanalytics

Tommy:

Testing Period ONE: Shown in his Google Doc.

Testing Period TWO: Shown in his Google Doc.

Chris:

Testing Period ONE: https://docs.google.com/forms/d/1vjf0zSr_2iWvGgt04cNeQrbLx7yqAydsGli3gOBK2tU/edit#response=ACYDBNjmFDXXOQ6Hd-VSOB6JoJtaJ0wiu9POMmWGkvu1JetOWe62f7v11YIl_A

Testing Period TWO:

https://docs.google.com/forms/d/1vjf0zSr_2iWvGgt04cNeQrbLx7yqAydsGli3gOBK2tU/edit#response=ACYDBNheNu74TooUHnBuqjHPLqsVlysUGnVt5D8tffGapAItLMut3ALzoig1BQ

Link to data collected: NOT APPLICABLE – our data is our feedback shown in our notes and survey (above).

Photos, video, or other media gathered in the field:

Photos, video, or other media gathered in field is represented in our individual Google Docs.

Summary of the testing process: Information is included in our individual Google Doc Links.

Savaya: https://docs.google.com/document/d/1CRPUb4gkZqGOY5HT9sN97NA_C4V-HzqGcLCNI3hviHE/edit

Chris: https://docs.google.com/document/d/14ZC8tYcGWOSXoA-tu3VwBUTSjqD1SYjhLelLFA1RQII/edit

Tommy: https://docs.google.com/document/d/1r_Hzi_QUvadfG6nPVTHZBSKM4YTKxPf7xu96qmMWD58/edit

Reflections on your findings:

Information is included in our individual Google Doc Links and Survey links listed above or in our individual Google Docs.

Any summary materials presented at the critique: This would be our desktop accessories and wearable, and stickers available (images in above blog post).

Code development notes:

Part of the reason for using Puck.js as our platform was that we wanted a compact, battery operated package.

The Puck.js is based on the Espruino platform, an open source JavaScript interpreter for Arduino-like microcontrollers. It is programmed via a web IDE, which uses Web Bluetooth to connect to the Puck.

Using JavaScript means taking an event-based development approach, which is a good fit for battery-operated projects that spend a lot of time waiting for something (a timer, a button press, a certain pin changing state, etc.).

Using JavaScript does require a change in mindset and a change in libraries from what is normally used with Arduino, however! The web IDE does contain modules that function similarly to libraries in Arduino, which extend the functionality of the device and allow access to hardware features. They are also pre-compiled, and so run more efficiently than writing lots of JavaScript by hand.

We used a Bluetooth HID library (HID = Human Interface Device, i.e. mouse, keyboard, etc.), which allowed the Puck to connect to Windows and Mac computers as a Bluetooth keyboard and send hotkeys that would sleep the display or the computer.
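For reference, this is roughly what that HID setup looks like in Espruino JavaScript. Treat it as a sketch rather than our exact code: the choice of key (Ctrl-D here) is just the hotkey we mapped in BetterTouchTool, and the modifier constant name should be checked against the ble_hid_keyboard module source.

```javascript
// Advertise the Puck as a Bluetooth HID keyboard and tap a hotkey on demand.
var kb = require("ble_hid_keyboard");
NRF.setServices(undefined, { hid: kb.report });

function sendSleepHotkey() {
  // Tap 'D' with the Ctrl modifier; BetterTouchTool maps this shortcut to "Sleep Display"
  kb.tap(kb.KEY.D, kb.MODIFY.CTRL, function () {
    console.log("hotkey sent");
  });
}

// For testing, send the hotkey whenever the Puck's button is pressed
setWatch(sendSleepHotkey, BTN, { repeat: true, edge: "rising", debounce: 50 });
```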

Although JavaScript for Espruino is quite compact and concise, it was hard to develop for this environment because of:

  1. The relatively young age of this system. Espruino was first funded in 2013 and Puck.js was first funded in 2015.
  2. JavaScript is normally run in web browsers, so most of the examples and conventions available relate to screen-based devices.
  3. There are still some bugs in the web IDE. Once or twice we were able to fix a bug simply by removing code that was already commented out!

Here is an image showing the old / new firmware we used in our process:

old-vs-new-firmware

Further considerations:

Although it is quick and easy to make updates to the code and flash devices with small fixes, it quickly became clear that version tracking would be important. There is no way to check what version of the code is running on a device once it has been flashed. For testing and debugging, it was important to use GitHub and not make spur-of-the-moment changes to code that would be used in testing.

References & related works:

Any references / support materials used for the project:

References to related articles, papers, projects, or other work that provide context for your project. Write about the relationship between your project and these references:

Do more and have fun with time management. (n.d.). Retrieved December 2, 2017, from https://cirillocompany.de/pages/pomodoro-technique

Misfit. (n.d.). Retrieved December 3, 2017, from https://misfit.com/fitness-trackers/misfit-shine

 

RUDE

RUDE

By Maxwell Lander, Feng Yuan, & Emma Brito

Description:

Rude was intended to be a jacket that signals when the wearer exhibits anti-social behaviors. Thanks to a tilt sensor in the sleeve, whenever the wearer fidgets, checks their watch, crosses their arms, or looks at their phone, the red LED spikes on the shoulder light up. Everything is connected with conductive thread. The red spikes are intended to look aggressive and discourage the conversation partner from continuing. Alternatively, the jacket can also signal a third party to come and interrupt the interaction, or make the wearer aware of their poor social habits.

Final Presentation:

Video on Vimeo

dsc_0199

dsc_0186

dsc_0136

dsc_0099

dsc_0127

img_20171205_192225_763

Material Testing

We made the decision to build these devices into a pre-made garment because it would fit our bodies better than anything we could make ourselves. We chose denim jackets because denim is a strong material and offers more protection.

25129919_10159864467650422_1768757510_o-1

These were our initial design sketches of how to embed the LEDs and sensors into the article.

screen-shot-2017-12-10-at-7-01-38-pm

We initially tested with leather because of the strength it offers. We decided against leather though because of how difficult it was to sew the conductive thread through.

screen-shot-2017-12-10-at-7-02-40-pm

Since leather was too tough we switched to denim

Final Bill of Materials

screen-shot-2017-12-08-at-12-26-55-pm

Craft Process

  1. We began by testing our code with the sensor. Once we got this working and did our material testing we were able to  start the crafting process.
  2. We began by pushing the LEDs through the shoulder with all of the negative legs on one side and the positives on the other. We then stuck copper tape on either side to connect all of the LEDs to be placed on the same pin.

img_20171204_142649

img_20171204_155933

img_20171204_145013

  3. We then soldered the legs down, as well as a wire to each tape. This way they could be connected to the feather.
  4. We then got started on the tilt sensor. We sewed the tilt sensor to the sleeve by using conductive thread. We had to wrap the conductive thread around the positive and negative legs while making sure that the sides were not touching. In one case the thread touched beneath the fabric which did not allow the sensor to work. We ran the two lines along separate seams to ensure they did not cross.

screen-shot-2017-12-10-at-7-02-40-pm

img_20171204_185731

  5. Once we reached the feather board we connected wire to the thread through the same method as the sensor.
  6. We tested the circuit (and debugged it where it was not working).

img_20171204_144804

img_20171204_144814

  7. Once we were sure the circuit was working we were able to move the breadboard circuit to a protoboard and solder everything down. We made sure that the code was loaded and were then able to use the rechargeable battery instead of a USB connection.

img_20171204_170532

img_20171204_171433

8. We then added the glue tips to the LEDs to emphasize their appearance. We used hot glue to ensure they were attached.

img_20171204_172533-1

9. All of the connections we then secured with tape and hot glue because we were concerned with how wear might disrupt their placement.

img_5585

img_5584

img_5586

10. Final testing! = Success

img_20171204_194334

Final Circuit Diagram

exp05_sketch

Final Code is Here

Testing Plan

End-of-Session Form

Test Questions:

screen-shot-2017-12-10-at-5-26-22-pm

screen-shot-2017-12-10-at-5-26-56-pm

We focused less on the mechanical success of the jacket and more on the intended social and emotional reactions.

Data Collected

Results

Ultimately, through our testing we found that RUDE had the opposite of the intended effect. Rather than discouraging conversation it seemed to spark it, because people were interested in the wearable. There were times that we were approached by strangers who were curious about the jacket.

screen-shot-2017-12-10-at-7-24-40-pm

screen-shot-2017-12-10-at-7-24-47-pm

screen-shot-2017-12-10-at-7-24-56-pm

While these were not the outcomes and reactions that we were looking for, we were still pleased with our results. The jackets operated the way that they were supposed to on a mechanical level, and gave us ideas on how to move forward in potential future iterations. For instance, sound or a more aggressive output would probably be more successful in discouraging contact. LEDs are not clear enough to scare people away, but are interesting and different enough to arouse curiosity. These jackets would be better suited for a different purpose in their current iteration.

We would also be curious to pinpoint the best point to place the sensor so that it is not as easily triggered. We realized that the exact location affects the sensitivity of the sensor due to differences in our testing. Finally, we would also like to play around with different light patterns with the LEDs. Blinking and gradual lighting were some ideas. We would have to connect them to different pins if we were to pursue this.

 

Emolace

img_7743

Emolace

Yiyi Shao, Jad Rabbaa, Quinn Rockliff


Ever wonder what other people are feeling right now?

Emolace is a fashion accessory that reads emotional tweets from all over the world and changes colour depending on the hashtags used. By tracking shares of the hashtags #happy, #sad, #ew, #scared, and #mad, Emolace tells you the current emotion of the ever-updating internet.

The speed of the blinking is based on the strength of the WiFi signal, which tells you how well you, as an individual, are connected to the cyber world.


Code is available at Github

BOM List

Day 1

5 ideas in class

  1. Water bottle that displayed levels of other bottles
  2. Hat that measured tasks completed
  3. Christmas sweater that communicated with phone christmas tree
  4. Device that measured people around you on the subway and reacted to it.
  5. Twitter to wearable that displayed the emotion of the internet via LEDs
  6. (Decision) Use the Photon board for this project

 

Day 2

On day two, Jad and Yiyi met to work more on the concept for experiment five. We found that we need to develop the concept further, and we also want to add a little bit more challenge to the code. We did some research (http://richardhayler.blogspot.com.au/2016/04/getting-started-with-adafruit-feather.html) and found an art project that can detect how many WiFi routers are around us and change colour accordingly.

screen-shot-2017-12-13-at-15-12-10

We think it's pretty cool, especially as a wearable device, because the user can be more aware of the invisible internet environment. We considered what the best materials would be for the job and ordered some LED strips off Amazon. Unfortunately we later realized that they were 12V, not 5V, and had to cancel and reorder, putting our supplies order a day behind. Thankfully, Kristy offered to bring in some LED strip for us to use in the meantime, so we used the 12V LED strip with an external power supply to get the code working.

Here is the Fritzing diagram for connecting the 12V LED strip to the Photon board.

emolace-12v

IFTTT has a trigger applet for the Particle Photon: it can call a cloud function each time a specific hashtag appears on Twitter. The Photon is very convenient to use – code can be flashed over WiFi, which is very useful for debugging and adjusting. The small difference in the Particle IDE is that each cloud function handler returns an integer and is called by the event, not inside the loop.

screen-shot-2017-12-12-at-17-45-34

We did some thinking about:

Why do we want the emotions of the internet to be worn?

Who would want to wear this?

What shape and form will this device take on?

 

Day 3

– Compiling a table for materials, cost, and suppliers.
– Discussion:
Wearing the emotions of the internet translated into colors
  /vs/
Showing the strength of one’s connectivity to internet through intensity of color.

Interesting discussion between Jad and Quinn

> Conclusion: both concepts are quite similar, and combining them speaks to the same idea.
So we decided to use the colour scheme to translate the emotion, and its intensity to translate the strength of connectivity to these emotions.

 

Day 4

Quinn bought materials at a fabric store to start prototyping the design of the necklace. We purchased faux leather, crepe, and some black mesh. We began creating the fabric tubing to encase the LEDs and diffuse the light, and went through a few iterations of producing this tubing. White glue first – not strong enough. Sewing – too time-consuming and not in our skill set. Hot glue – this method worked and gave us a secure connection of the fabric in minimal time. We created three 4-layer white fabric tubes to hold the LED section of the necklace. We then began sewing the clasps onto the leather pieces, deciding whether we should place them at the back or connect them to either side of the battery pack / Photon. Quinn has to go home and use her sewing machine for this.

screen-shot-2017-12-13-at-15-21-48

screen-shot-2017-12-13-at-15-22-45

screen-shot-2017-12-13-at-15-24-11 screen-shot-2017-12-13-at-15-24-16

We also developed the code and the WiFi connectivity. We found example code for getting the RSSI (received signal strength indicator, a measure of signal strength in a wireless environment) from the Photon board, and successfully had a single connected LED dim at different WiFi strengths.

screen-shot-2017-12-13-at-15-26-48

Reference for the pic above:
https://support.kontakt.io/hc/en-gb/articles/201621521-Transmission-power-Range-and-RSSI

 

It's a bit awkward to walk around holding an open laptop, but we need to keep an eye on the screen, which prints out the RSSI value, so we can find the right range for our environment and get more familiar with those numbers. We even tried to use a metal plate to block the WiFi signal, but that didn't really work.

img_7496-2

After some testing, we found that -20 is about the strongest WiFi signal we see and -80 the weakest. We incorporated a new variable (an RSSI indicator) to handle this. We also changed and added the hashtags #mad #disgusting #happy #sad #scared to play around with which hashtags are more effective at gauging emotion.

Example code for wifi signal

https://community.particle.io/t/photon-wi-fi-signal-strength-indicator-led/22737

Day 5

 

On this day we began to move onto the protoboard and consider the best way to run our circuit. In class we soldered header pins onto the protoboard so that we could move our Photon on and off it. We were still waiting for our LED strips, which were late, so we had to wait until Monday to test the circuit, especially the battery power, with a 5V strip.

img_7544

We developed the design and sewed some initial prototypes with the faux leather. We realized that the battery pack weighs more than the Photon and anticipate this being an issue. We tried using a clasp, but think sewing or velcro may work better.

-moved onto protoboard

-added the hashtag #ew

screen-shot-2017-12-13-at-15-30-41

Day 8

Our LED strips came and we were able to test them using the battery, and they work great! Additionally, we no longer needed the original circuit that had 3 transistors. This really cut down our protoboard soldering time and allowed us to wire the RGB / ground pins directly into the Photon. Then we just needed to wire the battery packs: we opened up the plastic casing and soldered longer wires to the pack so that we could wire it to the Photon through the white fabric tubing.

emolace_5v

The old code works by just calling each function; however, we did find problems when we began to make 3 necklaces change colour at the same time. The old code needs 3 individual IFTTT accounts set up to work with 3 Particle accounts, which is quite silly and also caused those applets to crash in IFTTT (the functions only work when the Photon board is online). So we looked at IFTTT again and found that it can also publish a public event that any Photon board can subscribe to, and the applets can keep running all the time! So we publish a different value for each emotion, and we can just write a statement for each one to show a single colour.

screen-shot-2017-12-13-at-11-46-15

screen-shot-2017-12-12-at-18-03-25

screen-shot-2017-12-12-at-17-51-46 screen-shot-2017-12-12-at-17-51-40

We had another colour-changing problem today: the strip is not turning the right colour that we set for each emotion. So we started checking the soldering from the LED strip and debugging by commenting out everything else and keeping just one colour. And we found a very weird issue: the RGB values are totally reversed for some reason! But it worked perfectly with the 12V LED strip before!

We are not sure if it's because we soldered the LED strip from the other end, or maybe it's a badly manufactured LED strip. Red should be (255, 0, 0), but it comes out blue on the LED. We have to manually reverse each RGB value to get the right colour.

screen-shot-2017-12-12-at-23-08-01

Day 9

Today we continued to put together the necklace. Just like in the fashion design industry we used paper to sketch the size of the necklace on the neck and chest so we know how long each piece of fabric should be.
To make things symmetrical Jad folded a paper and sketched one side then unfolded it and then cut that piece of paper to trace it on the fabric before cutting it accordingly.
We know the electronics could be heavy for those fabrics, so we followed a design that sits on the chest like a Cleopatra necklace.

screen-shot-2017-12-13-at-15-36-57

We cut the silver leather in a U shape, and to balance the weight, Jad thought the batteries could be on one side and the microprocessor on the other side.
Quinn made the back of the necklace from three braided pieces of leather so the design would be simple from the back and highlight the front.

screen-shot-2017-12-13-at-15-37-57 screen-shot-2017-12-13-at-15-38-06

Day 10

We measured the estimated length of the necklace along the lower side, because this helps us know how long the LED strip should be as well as how long the cord from the battery case to the Photon should be. Through trial and error we succeeded in defining the right length, and went on joining the wires from one side to the other, leaving some space for the microprocessor as well as for the battery case, hiding them in a way that doesn't interfere with the design and makes it non-obvious where these parts are within the necklace.
screen-shot-2017-12-13-at-15-38-12

Jad realized before assembling the fabric and the electronics that too much weight wouldn't allow the necklace to sit in a round shape on the chest, and that some sort of support would need to be incorporated to keep it in shape and together. As we assembled the pieces the concern was validated: the necklace was droopy and not in a good shape. Jad then cut a piece of cardboard in the U shape of the necklace and glued the fabric to it, and that solved the whole issue. The necklace now sits flat and comfortably on the chest without problems. screen-shot-2017-12-13-at-15-38-19

Day 11

Before we glued all the pieces together, we thought ahead of the problems we would face when changing batteries or resetting the microprocessor. We needed to have some sort of mechanism to remove the batteries and replace them so we thought of velcro as the best solution.

screen-shot-2017-12-13-at-15-38-27

The battery case is now covered with a flap that opens and closes, which is great.

But the on/off switch is on the other side, facing the cardboard it will be glued to. Jad then cut a little square opening that allows the user to switch the batteries on and off very easily from the other side.

screen-shot-2017-12-13-at-15-38-32

 

The battery case is now secure on one side, and the same for the microprocessor on the other side, with the LED light strip in between, and of course the wire that goes from the batteries to the microprocessor.

Now we were able to stitch and glue all the fabric and pieces together, and cover the whole thing with the black mesh that Quinn found in the fabric store.

We wanted that net or mesh to create a cloud effect, so it diffuses the colour of the LEDs and makes the whole necklace light up as one entity. To do that we had to hot glue the mesh from the back of the necklace to the cardboard in a loosely folded, arbitrary way so it creates that intended cloudy effect.

screen-shot-2017-12-13-at-15-38-37

So on the other side it looked like so: screen-shot-2017-12-13-at-15-38-42

And to camouflage the squarish battery section on the right, Jad thought of adding a rosette made from the same mesh material on the right and two thin bands on the left.

 

 

Final Product:

screen-shot-2017-12-13-at-15-38-53 screen-shot-2017-12-13-at-15-38-58

We also worked on the RSSI part of the code today. It's quite frustrating to figure out how to define the analog pin from the example code, as the example defines an analog pin inside the RSSI function while we are using three pins for RGB, which makes that difficult. Beyond that, it's not very easy for people to see differences in the brightness of the LED strip. So we changed our plan: we visualize the WiFi strength with the blink rate of the LEDs – when the WiFi is strong it blinks fast, and when the WiFi is weak it blinks slowly. We remapped the RSSI value to a range of 1 to 5, which is easier to work with for the LED blink rate.
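The actual firmware runs on the Photon (in Wiring / C++), but the remapping itself is just arithmetic. As an illustration only, here is the idea in JavaScript, assuming the -80 to -20 range we measured; the blink intervals are made-up example values.

```javascript
// Illustration of the RSSI -> blink-level remap (not the Photon firmware itself).
var RSSI_MIN = -80;   // weakest signal we observed
var RSSI_MAX = -20;   // strongest signal we observed

function rssiToLevel(rssi) {
  // Clamp to the observed range so outliers don't break the mapping
  var clamped = Math.min(Math.max(rssi, RSSI_MIN), RSSI_MAX);
  // Linear map from [-80, -20] to [1, 5]
  return Math.round(1 + (clamped - RSSI_MIN) * 4 / (RSSI_MAX - RSSI_MIN));
}

function levelToBlinkInterval(level) {
  // Stronger signal (higher level) -> faster blink, e.g. 1000 ms down to 200 ms
  return 1200 - level * 200;
}

console.log(rssiToLevel(-25), levelToBlinkInterval(rssiToLevel(-25))); // 5, 200
console.log(rssiToLevel(-75), levelToBlinkInterval(rssiToLevel(-75))); // 1, 1000
```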

Day 12

It’s our last day to finalize the physical part and code part of the emolace.

 

Day 13

User Testing day!

Emolace – User Testing Plan

Repair Kit

  • extra batteries
  • needle and thread

Our user testing plan is to test the 3 devices separately and record our experiences through a form that will be answered every 3 hours. Participants are encouraged to document moments that stand out to them, but to try to do the daily tasks and activities they normally would.

Short vlogs documenting experience.

Testers will take short ten-fifteen second video logs recording their experiences and concerns. This is for moments that cannot be captured in the form.

Form to be answered every 3 hours. For a total of 4×3 responses.

  1. How aware are you of the necklace?
  2. Is it comfortable to wear?
  3. Do you feel a connection to the changing colours?
  4. Does this connection affect how you feel about other events in your life?
  5. How does the reaction of other people impact your experience with the device?
  6. What questions are they asking?
  7. Were you aware of the changing brightness of the LEDs, did this make you aware of the spaces you were in?
  8. Additional comments obstacles experienced.

 

Jad’s videos:
https://drive.google.com/a/ocadu.ca/file/d/1dokc8UpFjUJmEisaTwYfwDB-FE6CNtKS/view?usp=sharing

 

Yiyi’s video:

 

 

Quinn’s video:

 

Link for questionnaire

https://docs.google.com/forms/d/1d7pUKyHjxoZO0secHMKh4Na3i_G9J4R_2gQ20_nggYY/edit#responses

 

Evaluation and Future

For future iterations of this design we are interested in considering new means of data collection as well as improving aesthetic elements of the necklace. The current approach to collecting data uses hashtags on Twitter in English, which can limit the scope of the data collected. As suggested by colleagues during our critique and reflected on afterwards, adding location-specific data collection, or alternatively expanding the set of hashtags used, could create a more effective demonstration of emotion. Aesthetically speaking, we are interested in improving the balance of the design through more considered weight distribution. The weight currently sits on one side (the battery pack side), and investing in smaller batteries could create not only a more comfortable design but a sleeker one. Being able to size down some of the hardware aspects of the necklace would result in a design that is more portable and wearable on an everyday basis.

 

Conclusion

In conclusion, our group's desire to create a fashionable wearable that shares the emotion of the internet and demonstrates connectivity was fulfilled. We were able to learn about the production of multiples and about design considerations when dealing with hardware. Throughout user testing we were able to collect data on how we, as users, and others, as viewers, perceive our device. This valuable information allowed us as a group to consider future iterations of the wearable and think about what is next for EMOLACE. Thank you!

FlattEar Me

Project Name: FlattEar Me

Project Members: Ramona Caprariu, Kylie Caraway, Sana Shepko

Project Description: FlattEar Me is a wearable device that takes the form of earmuffs. FlattEar Me’s literally keep you warm and fuzzy at all times, from the large, wool earmuffs, to the sweet sounding compliments whispered in your ear. When feeling distressed or looking for reassurance, the user presses the button behind their ear and a compliment will play.

Intended Context and Users: FlattEar Me is made for users of all ages. They are best used in cold weather. FlattEar Me’s are most often used outside, during commutes, and in solitude. Compliments play as little or as often as desired; the user is in control of the admiration given.

2-minute video presenting the portable & summarizing the field testing plan & results:

https://vimeo.com/246377023

Image of Product:

dsc_2035

dsc_2038

Image of Product being worn by three group members:

dsc_2081

dsc_2052

dsc_2028-1

Final Design Files:

prelimflattearme

componentsprelim

BOM : https://docs.google.com/spreadsheets/d/1YJtyliPrk1No4qI-vu55MjSV5Vky_pARm23NyWC1cK8/edit#gid=0

Rough estimate for each earmuff: $ 167.21 Each

Github Code:

https://github.com/sanaalla/Exp5_Final

Fritzing Diagram: 

experiment_5_fritzing_kb_2

Process Journal

Day 1:

Today we brainstormed ideas for our final, portable project. This project needs to be off of the breadboard, battery powered, and fully enclosed.

We knew we wanted to make something in the wearable technology realm.

Five ideas:

  • Clothing that heats up when the temperature is cold outside. Winter-themed wearable technology
  • Sensor that sends you a notification on your phone when you have coffee breath, bad breath, breathalyzer
  • Mood sensor, mood ring, with colors
  • Wearable technology that uses animal defense mechanisms to combat sexual assault – perhaps spikes?
  • Whisper Earrings that whisper compliments to you when you’re distressed

After discussing our various ideas, we combined the mood sensor with the whisper earrings. We plan on using a heart rate sensor to measure someone’s anxiety, and based on their anxiety level, they will receive different compliments through a speaker near their ear. This could be earrings and a necklace, an earcuff, a brooch, a headband, a hat, a collar, etc.

We have decided to go ahead and order the music maker feather wing so it will arrive by Friday. We will also need a battery, a speaker, a button, wire, and decorative accessories.

DAY 2:

photo-on-11-30-17-at-2-37-pm-2

We have finalized (at least almost finalized) our project idea.

In our first iteration, we are set on a wearable device, but we have to figure out how to dock the feather and battery near the earrings.

We realized that we needed something to send the signal to play an audio file – it would be very annoying to have someone constantly saying compliments in your ear. We also looked for heart rate monitors at Creatron, but they were all sold out. The difficulty of finding heart rate sensors, as well as the logistical issues of placing a heart rate monitor on the user easily in conjunction with the earrings / accessories, made us abandon the mood sensor for now. While we would like to implement biometric data in further developments, the sensors available to us at this point make that too difficult to execute within our current time constraints.

Therefore, our first step in hardware was making a button trigger audio on the feather. We were first able to make the button mute the audio, but we wanted to reverse this, so we altered the code to only play the audio when the button was pressed.

Our next step was to upload audio onto a micro SD card and test it on the music maker feather wing. None of us had micro SD cards, so for testing purposes Kylie brought her GoPro micro SD card. After many attempts, we realized the GoPro SD card won't work: it shows up as read-only and won't allow us to write to it. When we try to make it writable, or format the card, it won't let us change anything. Tommy had similar experiences with his GoPro SD card; it seems that GoPro formats the card in a way that cannot easily be reverted. Therefore, we have to buy a new SD card.

We went to The Source and bought a 16 GB microSD card, originally $19.99 but purchased on sale for $11.99.

Once we had an SD card with audio uploaded, we worked through the Adafruit tutorial for the music maker feather wing. The first step is to solder the pins to the feather wing. Next, we downloaded the VS1053 library to get the music player example working in Arduino. After that we formatted the SD card. One important thing to mention is that the audio files have to use 8-character names (e.g. track001.mp3). We also have to make sure our tracks load easily and aren't too large. MP3s seem to be the best choice, as they are compressed to reasonable sizes and sound decent on the computer.

The example code also plays a beep to tell you when it starts, which is helpful but also annoying after a certain point, so we commented that out of the code. Unfortunately, when I plug in my Mac headphones, the quality sounds absolutely terrible: it is low volume and the distortion makes it almost unrecognizable. The audio also only plays when we open the serial monitor, so we had to comment that section out as well, so we didn't have to rely on our laptop to play the audio file.

disable_tone_serial_port

DAY 3:

Today we started by making our bill of materials and writing our discussion post detailing the concept and overall development plan for the execution of the three devices. So far, our project is quite expensive – $106.00 each. The working title right now is “CompMEment”, with the slogan: “None of us will ever accomplish anything excellent or commanding except when he listens to this whisper which is heard by him alone” – Ralph Waldo Emerson

We have decided against mini speakers and will go the route of using earbuds instead. They will be easier to integrate into the project, they are a more appropriate size, and we aren't sure that the music maker feather wing version we bought will even work with the speakers we purchased from Creatron.

Our working code so far:  https://github.com/sanaalla/exp5TESTING/blob/master/exp5_music.ino

We have also decided to switch from jewelry to some sort of headband with headphones. Could take the form of earmuffs, Yoda ears, flowers, a tiara, etc.

screen-shot-2017-12-03-at-4-11-55-pm

img_5317

We recorded Ramona saying compliments from a compliment generator online (www.complimentgenerator.co.uk) , such as:

  1. Everytime you smile, a kitten is born.
  2. Is it a bird? Is it a plane? No. It’s you, you massive legend.
  3. I once looked at your bum. I regret nothing.

We recorded them on our phone, and brought them into Adobe Audition to export as MP3s.

screenshot-24

We tested out the music maker. The sound quality is really awful. It is so distorted and quiet you can’t hear Ramona’s compliment, defeating the purpose of the project. We tried Emilia’s feather wing to see if our solder was the issue, but it sounds bad with her music maker feather wing as well. We tried changing the volume and other settings in the code, but it didn’t change the quality of the audio.

We talked over our project with Nick to fix any issues and get feedback on portions of our project we were uncertain about, such as wiring our project and button onto a headband, randomizing audio files, and sound quality issues. Sana worked with Nick on the code for randomizing what audio track is played, while Ramona and Kylie attempted to fix the sound quality issues.
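The randomization itself is simple selection logic. The real sketch runs on the Feather in Arduino, but as a quick illustration (with a made-up track count), the idea is to pick a random index and build the 8-character filename the music maker expects:

```javascript
// Illustration only: picking a random compliment track by filename.
var TRACK_COUNT = 10;  // assumed number of tracks on the SD card (track001.mp3 ... track010.mp3)

function randomTrackName() {
  // Pick 1..TRACK_COUNT and zero-pad it into an 8-character base name
  var n = Math.floor(Math.random() * TRACK_COUNT) + 1;
  var padded = ("00" + n).slice(-3);
  return "track" + padded + ".mp3";
}

console.log(randomTrackName()); // e.g. "track007.mp3"
```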

screen-shot-2017-12-03-at-4-12-04-pm

screen-shot-2017-12-03-at-4-12-12-pm

 

Ramona and Kylie got nowhere with the sound quality issues. They tried altering the audio files, using different headphones, and using different audio files, with no results. In desperation, they asked sound engineer Finlay for help.

FINLAY SAVED US: it was an issue with the headphones, of all things.

We need headphones with a TRS (tip-ring-sleeve) plug, NOT TRRS (tip-ring-ring-sleeve), which is what most headphones use today (VERY VERY IMPORTANT).

Therefore, in order for our project to work, we need either:

  • ⅛ in TRS male to ⅛ inch TRRS female adapters

OR

  • TRS headphones

We couldn't find any headphone listings online that clarified whether they were TRS, nor did they show the end of the plug so we could count the rings.

Kylie went to Walmart and found TRS headphones (the package had a graphic of the stereo plug, but she double-checked the plug itself by opening the packaging). Sony Clear Sound Headphones for $10.88!

Day 4:

Today we decided to re-brainstorm our name ideas. We changed from CompMEment to something along the lines of: “Voices in my Head”

 

Ramona also rewrote our script and put out a call for volunteers to read compliments (Ramona didn't want to listen to her own voice over and over).

SCRIPT

 

  • Your methodology is soooo sound. (Savaya)  
  • Tommy’s words of wisdom (TOMMY)
  • If you tried, you could probably be quite famous. (Savaya)
  • Everytime you smile, a kitten is born. (SEAN)
  • I’d love to speculate on possible futures with you.  (TOMMY)  
  • I wish I could deep dive into your eyes. (Emma)
  • You compute me. (KRISTY)
  • Is it a bird? Is it a plane? No. It’s you, you massive legend.  (EMMA)
  • I once looked at your bum. I regret nothing. (KRISTY)
  • You’re perfectly layered. Like a lasagne. (SEAN)

We recorded most of our compliments, brought them into Audition to adjust the volume, edit them, and convert them from m4a files to MP3 files, and then we uploaded them to the SD cards. We also wired all three of our projects.

24882889_10156017643858054_763725960_o

Day 5:

Today we finished recording all of our compliments and uploaded them to all of the SD cards. We began putting together our testing plans today:

    • Preparations
      • What needs to be done ahead of time?
      • Battery must be charged (with a special lithium polymer charger)
      • Do you need extra batteries?
        • No, our battery is rechargeable.
      • What goes into your repair kit?
        • Needle and thread
        • Wire
        • Electrical tape
        • Extra yarn
      • Be sure to take “before” photos.
    • The Plan
      • How will you document your own user experiences while using the device? Notes? Journaling? Photos? Video? Audio recording?
        • Notes/ Journaling/ Video
      • What will you do to ensure that your data collection methods are consistent for each group member?
        • Wear for the same amount of time, we all have the same sound bites to draw from, same functional design, all going to wear on our commutes
      • For each group member, what are the dates & times of testing?
        • Kylie: Wednesday and Thursday during the day
        • Ramona: Wednesday and Thursday after 4 (has work prior to)
        • Sana: Wednesday and Thursday after 4
      • If there is a reason that (2) 6-hour testing periods don’t make sense, include a proposal for a reasonable equivalent for your device and get sign off from Kate.
        • Our wearable is a winter-specific wearable so wearing them indoors for an extended period may prove a little odd and uncomfortable; also because we wear them on our ears, it may be difficult to wear consistently for 6 hours straight.
      • Will you be together or having your own experiences?
        • Own experiences
      • Will you be doing your day-to-day activities or doing a special activity?
        • Day-to-Day
      • Any other details?
    • End of Session Reports
      • You are required to create End of Session Reports. Create a survey / form using Google Forms for each group member to fill out at the end of their 6-hour testing periods. You will end up with 6 entries (3 users x 2 6-hour testing periods). Link to your form here.
    • After – Crunching the data & documentation
      • After the field testing, how will your team structure a debriefing conversation?
        • We will talk to one another and fill out a survey:

 

  • How did FlattEar Me feel after an hour?
  • How did it make you feel, in an emoji?
  • When did you find yourself pressing the button and why?
  • Did you feel like it changed your emotional state when you heard the voices?
  • Was it intuitive to press the button in that location?
  • What recommendations do you have for further development?

 

    • What will you do with the data and media once you find it?
      • Incorporate it into our video, and discuss what our next iterations in the future will be
    • How will you display or present your observations & findings?
      • Make a video with our user journals, videos, and surveys
    • Be sure to visually document each prototype after testing is complete and make notes on what state they’re in.

We are also reconsidering our name:

fb_poll

New Name: FlattEar Me

DAY 6:

OK OK OK OK..

So we started off the day hopeful because all our wiring was golden from Monday and we had our materials gathered from Michael's. BUT on Monday, near the end of class, we decided to remove the resistor near the button on our circuit to keep everything more trim and compact, and in hindsight we now believe this was our error. We started soldering one protoboard with the feather and feather wing, and a small protoboard piece with just the button, but we were having loads of issues with inconsistent playback. We thought it may have been the battery (perhaps it wasn't giving out a full charge, maybe it wasn't of a high enough voltage, etc.). Then we thought it was the wiring, so we unsoldered the button and tried placing it back on the breadboard. BUT then we found it still had the same inconsistencies in playback. SO, hours later, we thought to reintroduce the resistor (looking back at our preliminary Fritzing diagrams reminded us of our initial thinking) and lo and behold, a success! But still, only kind of – it doesn't play endlessly. Roughly every 30-40 presses of the button, we have to reset the feather wing music maker, and then it proceeds to play again. The batteries also seem like they need charging fairly often.
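For reference, we believe that resistor was acting as a pull-down: without it the button pin floats, which would explain the inconsistent playback. The standard arrangement we went back to is one button leg to 3.3V and the other leg to a digital pin plus a ~10k resistor to ground, so the pin reads LOW until the button is pressed. A minimal sketch of just that check (the pin number is a placeholder):

```cpp
const int BUTTON_PIN = 12;   // placeholder: digital pin wired to the button and pull-down resistor

void setup() {
  pinMode(BUTTON_PIN, INPUT);   // the external 10k pull-down keeps this LOW when idle
  Serial.begin(115200);
}

void loop() {
  if (digitalRead(BUTTON_PIN) == HIGH) {   // button pressed: pin pulled up to 3.3V
    Serial.println("pressed");
    delay(200);                            // crude debounce
  }
}
```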

24623668_10156017643798054_1418531805_o

24825794_10156017643768054_2060318178_o

We also started building the ear muff components from felt, wool yarn, and hot glue.

dsc_2021

We followed the Brit & Co tutorial (linked in our context section) to make the pom pom ear muffs out of the wool yarn.
We wanted the earmuffs to be very large so we could place the electronics within them. In our first attempt, the pom pom looked less fluffy and more like a mop or a small dog. This was because we made the yarn bunch longer, rather than using more yarn.

dsc_2022

We tried again, making the bunch with more yarn rather than longer yarn, and this fixed our prior issue.

dsc_2023

After finishing the pom poms, we started making the ear muff piece that presses against the user’s ears. We did this by cutting out a circular piece of felt, and swirling the yarn on top of it.

img_0556

We realized that gluing the pom pom and the earpiece to both pieces of the felt sandwich left it without enough structure to hold the earbuds, so we decided to purchase stiff felt and place a piece in the middle of the earmuff to provide more support for the two outer pieces along with the earbud.

img_0559

After hot gluing the stiff felt to the felt sandwich, we glued the pom pom and soft earpiece to the outside of the felt (in hindsight, I wish we had not hot glued this: we later realized we had to sew it to make sure it stays intact, and sewing through three pieces of felt and hardened glue proved extremely difficult. Next time, we will wait to hot glue until the very end).

img_0560

Our next step was to figure out how we would incorporate the various pieces of hardware onto the headband and earmuffs. While we originally wanted to place the hardware within the earmuff, it became clear that the weight distribution would be so uneven that the earmuffs could not be worn comfortably. We decided to move the electronic components further up the headband, where we could use the teeth of the headband to hold onto them. We weren't sure how hot glue would react with the protoboard, so we decided to zip-tie the protoboard (with the feather, music maker, and battery) to the headband, with the button and resistor on the opposite side of the headband. We continued to use zip-ties to hold the wires and button to the headband, and we also used heat shrink (left un-heated) to hold the wires in place on top.

Our next step was to figure out how to store the battery with the feather without gluing it down. We decided to use velcro to attach the battery to the side of the feather, so the electronics compartment would not be too tall and we would keep access to the reset button, the SD card, and the audio jack. Our last step was wrapping the earbuds around the components and positioning each earbud to hang down on opposite sides.

img_0561-1

img_0562

24956795_10155107013368148_1363986206_o

24879158_10155107015528148_1431790702_o

As you can see, combining the earmuffs with the electronic headband made us realize that our earmuffs were going to be HUGE. Our next steps are:

  • figuring out a way to make a compartment for the electronics that is easily accessible
  • cover the rest of the wires
  • leave the button visible and functional

We also need to figure out how we are going to implement the earbuds into the earmuffs, so they will be in the proper space and stay oriented properly without swiveling around.

Day 7:

Today was our workshop day! We built some stuff!

24992774_10214953196006313_1531868891_o

Our first step was creating a compartment for the electronics. We used stiff felt and normal felt to sew a box around the electronic components, with a hole for the USB cable to fit through. We also made a top piece that is sewn on one side, so we can flip it open to change the battery, hit the reset button, or reach the SD card.

24992806_10214953195726306_1394050810_o

The earbuds were swivelling around too much with the zip ties, so we decided to use duct tape to fasten the earbuds into the exact spot we wanted them to sit. We went ahead and built all three headbands the exact same way before we began layering the felt on top of the electronics.

img_0564

25035460_10214953195766307_708025606_o

img_0570

We cut small strips of felt and layered them on top of one another, hot gluing the pieces as we went. We had to cut a hole out for the button to remain visible. The felt was a good first layer to hide the wiring, as well as create a cushion for the earmuffs.

24956865_10214953195686305_778424385_o

Our next step was placing the earbuds within the earmuffs, and attaching the earmuffs to the headband. This took two sets of people and a bit of finessing. We attempted to measure where the earbud would fall in relation to the headband and earmuffs. This was dependent on the person wearing them, as well as the earmuff size.

Our next step was fastening the earbuds within the earmuffs. We first realized you could not hear the compliments very well through the felt and wool yarn, so we decided to cut a small hole for the earbud to fit into. This was a very difficult task, as our scissors did not like the felt, and the hardened hot glue was very tough to cut through. If we were to do this again, we would cut the holes for the earbuds before gluing the pieces together.

To fasten the earbuds to the earmuffs, we superglued the back of the earbud to a piece of felt. We were worried that placing hot glue on the earbud would melt and ruin the earbud, and we didn’t want to take that risk. We essentially used the piece of superglued felt as a piece of tape by stretching the felt and hot gluing it to the inside of the earmuff.

Next, we hot glued the felted headband to both sides of the earmuff, and also poured more hot glue inside the earmuff and sandwiched it together. This took large amounts of glue, and the earmuff would still open up at the edges, so we decided to sew around the edge to keep them secure and intact.

After attaching all of the earmuffs to the headbands through hot glue and sewing, we attached more wool yarn in order to hide the blue felt, create more cushion and warmth, and to make the earmuffs more aesthetically pleasing. We wrapped pieces around the headband, hot gluing as we went. We made sure to avoid wrapping over the button, the USB slot, and the top of the electronics compartment. 

img_0572

img_0571

dsc_2035-1

Voila! We completed all three earmuffs, and they look almost identical, yet they each have their own personalities based on the fluff on their muffs.

User Testing

User Testing Plan

For our user testing, we decided to rely on notes, journals, and photos. Video recordings seemed to be an ineffective tool for information gathering, because our project relies on personal experiences with discrete audio and a small button, versus visual cues and feedback or shared participatory experiences. We also decided that we would not wear these in two 6-hour sessions; rather, we would wear them during all of our commutes, while we were outside, and at other times we deemed plausible. We wanted to wear our earmuffs when we would normally wear them day-to-day, rather than forcing them into a specific timeframe. We believed wearing them for 6 hours straight could negatively impact the information we gathered, and it would also be uncomfortable and awkward to wear them at work or in class, etc.

User Testing Questions and Survey:

Before going out and testing our product, we decided to create a survey of questions / Google Form (https://goo.gl/forms/VEIMrOI7euqUYZc92) for us to fill out after testing. We decided on these questions:

  • How did FlattEar Me feel physically after your session?
  • Was it comfortable?
  • Did you feel like it changed your emotional state when you heard the voices?
  • Was it intuitive to press the button in that location?
  • When did you find yourself pressing the button and why?
  • How often did you press the button?
  • What recommendations would you have for further development?

Personal Reflections of User Testing:

Sana user testing:

In general, the earmuffs feel very comfortable. As far as these headphone-type things go, they're much looser than bluetooth earphones, which actually makes them easier to wear for long periods of time. I first tested them out on my walk home from the subway (around 20 minutes) and they actually did a GREAT job of keeping my ears warm. The music maker wing worked for the entirety of the walk, up until the point I walked into the house; I am tempted to believe the change of temperature might have disrupted the music maker, which we have learned is fragile. These actually (and not to toot our own horn here) did have a positive effect on my mood. Before putting them on it was a pretty normal commute experience, but I think just the fact that I was wearing this ridiculous headpiece and knew why people were staring at me as I walked by added this bit of hilarity, in addition to the fact that I could hear Tommy's voice saying “YASSS QUEEN” when I pressed the button. All in all, the only thing I wish we could fix is the aesthetic, so I don't feel quite so self-conscious wearing this in public… well, and I suppose the need to reset the music maker feather whenever it decides to freeze on us.

photo-on-12-7-17-at-2-54-pm

Also they do feel a little like they’re about to slip off all the time, and figuring out where to situate the earmuffs so that the earbud is right over your ear isn’t very intuitive. However, even when the sound isn’t being played right into my ear, the voice recordings are still fairly clear; but in situations where you’re in a loud space it might be a little hard to hear so that’s something that we should consider.

Kylie User Testing:

img_3536

I wore my earmuffs during my commutes in between school and home, on my walks around my neighborhood with my dog, and around my house. Here are some of my notes and journal entries I wrote down during my user testing experience:

  • The earmuffs are very warm. I have never worn earmuffs before, and I do not own a pair, so I was a new user to earmuffs. They kept me very warm outside, which was great in the cold – not so great when worn inside.
  • The earmuffs were very comfortable and soft at first. After a while, my head felt squeezed and compressed, and my ears started to hurt from the pressure, as well as the earbuds (I have a big head and small ears, so this is probably an issue with me more-so than the earmuffs)
  • The earbuds are made for specific ears (one for the left ear, one for the right ear) and we didn’t make it obvious which way to wear the earmuffs, so I found myself wearing them backwards numerous times, and I had to reverse them. It would be great if we could make it more obvious which way to wear them.
  • I kept having to readjust them on my head. They either felt like they were in the wrong spot on my head and right spot for my ears, or wrong spot for my ears and right spot for my head. It was tricky to measure these for each person, and it would be tricky to make a pair that fits every user, but I think in the next iteration we would need to finesse the measurements to fit better.
  • The earmuffs are (obviously) very big, and due to the electrical components within them, they are very heavy. If I rotated my head or looked down at all, they would fall off my head. This was my biggest issue with the earmuffs so far.
  • The earmuffs are also a dog magnet. My dog has gone to extreme measures to try and get ahold of my earmuffs (from attempting to climb my dresser, to using the chairs near my dining table as steps to get on top of the table.) Other dogs are also mesmerized by them when I walk near. Perhaps they look like a small animal?
  • I also felt very self-conscious and silly when wearing them. People definitely notice how huge they are (or maybe I am just more aware of people looking at me?). I was anxious wearing them at first, but the first compliment I pressed was Tommy's words of wisdom. It made me smile and laugh, and it did make me feel better and happier. My attention was drawn away from my head feeling like a bowling ball and more towards feeling good about myself.
  • Overall, I really like the compliments we selected. They are fun and silly, but also reassuring. I found myself wanting a larger variety of compliments, so that could be another iteration we could do in the future.
  • In my first session walking around, I was only able to receive about 8 compliments before they stopped working. My battery had died. The battery life seems relatively short on my pair, so I have to recharge them often.
  • After leaving them to charge overnight, I was able to get about 40 or so compliments before having to reset the feather. I think I have to reset the feather from timing out, or once I hit about 30 compliments or so. I also think the battery life is good for about 50 or so compliments.
  • I think the volume is a good amount, but I have also worn them when I am in quiet spaces, such as the subway or walking outside. Other people have commented that they are too quiet. Perhaps this is because the users are in loud spaces, or the earmuffs are measured for my own head and ears, so they don’t fit properly on other users.
  • Overall, I really like the emotional response this had on my user testing, I just wish the earmuffs were not as huge / ostentatious and a bit more comfortable. I found that I could not wear them for longer than 1 hour at a time. I had to give my ears a rest.

Ramona User Testing:

25075255_10214953195566302_1379136324_o

I similarly wore my earmuffs on my commute to and from work/school, and I also tried wearing them around the house a couple of times but was met by too much curiosity from my cat, who would not cease chasing me around. On my commute, I was initially quite comfortable wearing the earmuffs because of their warmth and comfort, but as I wore them longer, I was more aware of the imbalanced structure on my head. I had to be very cautious about moving my head too quickly or looking down because I was so nervous they would fall off. Wearing them during rush hour was also very uncomfortable because it was then that I was hyperaware of their size, and I felt a little pompous and extravagant. I did find myself reaching for the button more in these moments, to distract myself and maybe to also make it more obvious to people that I was ‘doing’ something with the earmuffs. As for the earmuffs themselves, I believe they worked as intended. When I was wearing them at the restaurant I work at (right as I was leaving), I found the noisy atmosphere made it quite difficult to hear the compliments, but all in all, volume was not that big of an issue for me. More often than not, a low volume was because I was wearing the earmuffs incorrectly (the earbud positioning was then off). I also found the battery life to be less than expected, but it still didn't require more than a couple of charges a day (each about 30 minutes). I was able to use them in a context and a way that was conducive to my life. I had quite the positive experience testing these – I was proud of what we had made and in the beginning, that definitely coloured my perspective (as I was in such a good mood). But as I wore them longer, I was able to point out the flaws in design, and that first-hand/lived experience was ultimately so helpful.

Data collection from survey:

Here is the data we received from user testing. We filled out our survey to view our own feedback. We additionally asked others from our cohort who tried FlattEar Me on to fill out the survey:

screen-shot-2017-12-13-at-12-02-14-pm

screen-shot-2017-12-13-at-12-03-40-pm

screen-shot-2017-12-13-at-12-03-56-pm

screen-shot-2017-12-13-at-12-04-25-pm

Our Overall Summary of User Testing:

Based on the survey above, this is a summary of the data we received:

  • FlattEar Me leans on the comfortable side for users
  • FlattEar Me improves the user’s mood, with the majority feeling happy, and others feeling both amused and… a little bit gassy?
  • While the button was pressed numerous times when worn by users, the button location is not intuitive.
  • The button was pressed when people were curious, bored, or wanted to hear all the compliments, rather than when they were wanting a mood boost.
  • The data is widespread for how often FlattEar Me would be used, from “never” worn to “always” worn, with more users leaning on the “always” worn side.
  • Recommendations include: color options, less lint and fluff, change in button location, and making the earmuffs lighter in weight… We were also told to make them larger??

As a group, we agreed that these were a bit awkward to wear out in public, but the compliments were self-esteem boosters and made us laugh and happy. We would like to implement more compliments for a greater variety and surprise. We also noted the weight of the earmuffs, and that they became a sort of balancing act to wear during commutes. The button location was also an area of discussion – perhaps it should be placed lower on the earmuffs, or within the pom pom of the earmuff, because it is a bit uncomfortable and awkward to press at this moment.

Our Next Steps and Future Iterations:

As we move this project forward, there are a few iterations we would like to make:

  • We would like to work on the button location, making it more intuitive for the user
  • We would like to address the size and weight of the earmuffs, making them easier and more comfortable to wear, store, and transport.
  • We would like to increase the number of compliments for a greater variety
  • We would like to change the design of the earbuds, making them more comfortable, louder, and suitable for various users. This could mean using speakers instead of earbuds.
  • We would like to find a more inconspicuous way to store the electronics, while still making them accessible for the user.
  • We would like to address issues with the music maker needing to reset after so many compliments, as well as the battery dying in a relatively short time span.
  • We would like to find ways to decrease the price of these earmuffs, making them more affordable for users.
  • We would like to make different designs and colour options, as well as an option more suitable for other seasons, such as a summer version.

References & Context:

FlattEar Me would not have been made without the help of Adafruit’s music maker feather wing, the Compliment Generator, and Brit + Co’s pom pom ear muffs tutorial. Adafruit gave a step by step tutorial (https://learn.adafruit.com/adafruit-music-maker-featherwing/overview) to help users with the music maker feather wing, from soldering the right pieces, to diagrams of the different pins available, to downloading the proper libraries and giving example code with fantastic notes and references. The Compliment Generator (http://www.complimentgenerator.co.uk/) gave us a script to work off of for our compliments. While we used some homemade compliments, we also used some of their compliments within our project. Brit + Co’s tutorial (https://www.brit.co/how-to-make-pom-pom-ear-muffs/) provided a different reference, as it helped us gather materials and form a product that we felt was capable of hiding the hardware components within the material. We were also mesmerized by the photo of the dog with the pom pom earmuffs.

screen-shot-2017-12-01-at-2-56-08-pm

Other wearable technology in headband form includes the Sparkfun illuminated pom pom headband (https://learn.sparkfun.com/tutorials/led-pompom-headbands/advanced-pompom-headband) as well as the Crystal Headband that lights up via LEDs (https://learn.sparkfun.com/tutorials/led-crystal-goddess-crown?_ga=2.35831017.830944120.1512079637-721508807.1510938523).

While we moved away from using biometric data to send compliments when the user is upset or stressed, we were motivated by other wearable technology that uses biometric data as an input. The AWElectric (https://makezine.com/2016/06/08/sensorees-biometric-jacket-has-3d-printed-goosebumps-that-move/) is a wearable technology that renders visual and tactile signals through lights and fractal goosebumps that raise out of the clothing when the user is in awe.

We would be remiss to not include where we first received the idea of complimenting accessories: 20th Century Fox’s Aquamarine (2006). In this film, a mermaid by the name of Aquamarine totes starfish earrings that give her compliments. She declares, “They literally give me compliments – in my ear. They talk to me. Starfish are notorious suck-ups. They love to give compliments. But it’s nice when you need a little boost.”

Compliments are an important aspect of our social lives. According to Psychology Today (https://www.psychologytoday.com/articles/200403/the-art-the-compliment), if compliments are given correctly, “they create so much positive energy that they make things happen almost as if by magic” (Marano 2004). “Focusing on and noticing the good qualities in the world around us gives our moods a boost all by itself” (Marano 2004). Earmuffs, for their part, are a celebrated invention, having received attention on HowStuffWorks' Stuff of Genius (https://www.youtube.com/watch?v=HpHRwXXkZnw) for keeping millions of ears warm and toasty. Therefore, the combination of earmuffs and compliments can keep people warm and fuzzy on the inside and outside!

Bibliography

Aquamarine. Directed by Elizabeth Allen Rosenbaum, 20th Century Fox, 2006.

Bryden, Kelly. “Warm Up With DIY Pom Pom Earmuffs.” Brit & Co, 29 Dec. 2015, www.brit.co/how-to-make-pom-pom-ear-muffs/. Accessed 8 Dec. 2017.

“Compliments Are Good.” Compliment Generator, www.complimentgenerator.co.uk/. Accessed 8 Dec. 2017.

“Ear Muffs: Where Did They Come From? | Stuff of Genius.” YouTube, uploaded by HowStuffWorks, 8 Nov. 2014, www.youtube.com/watch?v=HpHRwXXkZnw. Accessed 8 Dec. 2017.

Feldi. “LED Crystal Goddess Crown.” Sparkfun, learn.sparkfun.com/tutorials/led-crystal-goddess-crown?_ga=2.35831017.830944120.1512079637-721508807.1510938523. Accessed 8 Dec. 2017.

Feldi. “LED PomPom Headbands.” Sparkfun, learn.sparkfun.com/tutorials/led-pompom-headbands/advanced-pompom-headband. Accessed 8 Dec. 2017.  

Lady Ada, editor. “Overview: Adafruit Music Maker FeatherWing.” Adafruit, learn.adafruit.com/adafruit-music-maker-featherwing/overview. Accessed 8 Dec. 2017.

Marano, Hara Estroff. “The Art of the Compliment.” Psychology Today, 1 Mar. 2004, www.psychologytoday.com/articles/200403/the-art-the-compliment. Accessed 8 Dec. 2017.

Neidlinger, Kristin. “Sensoree's 3D Printed Fabric Animates Your Goosebumps.” Make:, 8 June 2016, makezine.com/2016/06/08/sensorees-biometric-jacket-has-3d-printed-goosebumps-that-move/. Accessed 8 Dec. 2017.

 

 

 

remember ocean

Karo Castro-Wunsch

live demo: realgoodinternet.me/remember-ocean

GitHub: https://github.com/KaroAntonio/remember-ocean

screen-shot-2017-12-01-at-6-43-50-am

Process Journal

Sometimes I forget about the ocean; sometimes I'm thinking about things that seem more important at the time but in retrospect really weren't, compared to endless water. The ocean does her own thing and takes care of herself, as far as I can tell, so she doesn't really get up in your face too often unless you're of the Moby Dick/Tom Hanks variety. So this piece is a somewhat agitated, demanding incarnation of the ocean, trapped in the internet as per usual. The central thought is ~ don't forget the ocean ~ i.e. remember the ocean. I put the demanding ocean next to population growth: the ocean at the whim of pop growth, becoming more agitated due to pop growth, becoming increasingly spastic. With life coming from the sea and all, it's probably pretty tough being the mother of all that and keeping your kelp beds in good order too. There's the calm ocean, more sphere-ish and gentle, and the cartoonishly wild ocean that doesn't even fit on the screen, it's having such a fit. Your job is to mind the ocean, let her know that yes, you are thinking of her, you do remember her. And that's the role of the peripheral: a salt water detector in a shell that is activated only when you tap it with a finger dipped in salt water. The sensor is tuned to only react to salt water, regular water carrying too faint a signal. By touching salt water to the shell (a small altar of sorts) you let the ocean know that you're thinking of her in a way that assures her also that you remember at least a small fragment of the sensation of the ocean. A drop of salt water on a finger is practically an ocean if you look close enough…? Touching the shell calms the ocean and she resumes peaceful orbliness, but she's only calm so long as you actively attend to her; otherwise she slips steadily back into agitation. And so it goes.

I'm really glad of the meditation on ocean vs. population this project allowed me, but what I'm less enthusiastic about is cats vs. conductive thread.

img_1741

^ It's a knot; there were many of them. Future iterations of this project will use a wireless connection. The initial thought was to wrap the shell with the thread and leave gaps that could be bridged. This proved difficult as the thread holds on to too much moisture, so I re-designed using a two-prong moisture detector. Also, yes, I am aware Creatron sells a moisture detector for $5; I thought I could make something that looked better, which was a successful negative result!

img_1740

It worked very well though and was able to handle large drops of water and not become waterlogged.

On the visual/technical end, learning more WebGL and shader systems is always fun. The ocean is a deformed icosahedron that uses 3D and 4D periodic Perlin noise in conjunction with vertex shader reflection to produce its wobbliness.

The population API doesn't give second-to-second population updates, so to get that working I took the daily population figure and interpolated it using the average growth in humans per second.
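The interpolation itself is just arithmetic. Written out as a standalone sketch (in C++ for consistency with the Arduino snippets elsewhere in this journal, even though the live piece does this in JavaScript; the numbers are illustrative only):

```cpp
#include <cstdio>

// Hedged sketch: estimate the current population by interpolating the API's
// daily figure with an assumed average net growth rate in people per second.
double interpolatePopulation(double populationAtMidnight,
                             double avgGrowthPerSecond,
                             double secondsSinceMidnight) {
  return populationAtMidnight + avgGrowthPerSecond * secondsSinceMidnight;
}

int main() {
  // Example numbers only: roughly 7.6e9 people in late 2017,
  // roughly 2.6 net new humans per second, one hour after midnight.
  double estimate = interpolatePopulation(7.6e9, 2.6, 3600.0);
  std::printf("estimated population: %.0f\n", estimate);
  return 0;
}
```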

To deal with the serial data from the water sensor, there was quite a bit of tweaking in the form of scaling and averaging, but the numbers still seemed to drift as I kept using the sensor; I assume it started to degrade a little. Poor guy.
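The averaging on the Arduino side amounts to smoothing the raw analog reading before it goes out over serial. A hedged sketch of that idea (the pin and window size are placeholders, not the exact values used):

```cpp
const int SENSOR_PIN = A0;      // placeholder: analog pin for the two-prong water sensor
const int WINDOW = 16;          // number of samples in the moving average

int samples[WINDOW];
int idx = 0;
long total = 0;

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < WINDOW; i++) samples[i] = 0;
}

void loop() {
  total -= samples[idx];                  // drop the oldest sample
  samples[idx] = analogRead(SENSOR_PIN);  // read the raw, noisy value
  total += samples[idx];                  // add the newest sample
  idx = (idx + 1) % WINDOW;

  int smoothed = total / WINDOW;          // moving average tames the jitter
  Serial.println(smoothed);               // the browser side reads this over serial
  delay(20);
}
```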

good_shell

The circuit for the sensor just involves grounding the sensor wire so that it's more constant when off, and passing the ground through a resistor so that more of the current is channeled towards the analog in. Is that how electronics works? Experimentally though, this circuit functions.

img_1738

untitled-sketch_bb

img_1727

The physical interface is a lil bowl of water and a shell. I have a good deal of remorse about not making the sensor more invisible. Ideally you just touch a bowl of water and that’s your interface. I’ve seen water bowls used as synths before but that’s a mk004 thing now.

img_1726

The simple contrast of the visuals and concept make me happy.

References / Resources

Reflective Deform Shader: http://www.clicktorelease.com/blog/experiments-with-perlin-noise

World population api:  http://api.population.io

Considerations on the Ocean: https://la.utexas.edu/users/jmciver/Honors/Fiction%202013/Hemmingway_The%20Old%20Man%20and%20the%20Sea_1952.pdf

Water Sensor Inspiration:  http://www.instructables.com/id/Simple-Water-Sensor/

 

 

Title: Love Corner

Group: Savaya Shinkaruk and Emilia Mason


Project Description

Our product was created while thinking up a project for our Messaging and Notifications assignment.

For this project you will design and create a custom device that allows the members of your group to communicate in new ways.  Using the Wifi connection of the Feather M0, you will create standalone objects that communicate over the internet.  Each group must consider how these devices create a specific language that allows the things, people, or places on the ends of the network to communicate.  – Nick and Kate.

When we were first assigned this project, Emilia wanted to make a Christmas gift for her boyfriend. She showed me some videos of concepts designed for couples in long-distance relationships, and since both of us are in long-distance relationships, we came up with our product, Love Corner.

We go more into depth about the journey of our process towards making our product in our blog, but the overall description of our project is:

We created a product for couples in long-distance relationships: each partner wears a bracelet that vibrates and lights up based on the other partner's pulse.

For this to work, you put on the bracelet (which looks like a Power Ranger watch) and attach the pulse sensor to your finger; over WiFi, the data is sent to and from each other's bracelets.

Even if you are not in a long-distance relationship but want to spice up your love life, test out Love Corner.

So, continue on to the rest of our page to read more about the ‘Love Corner’ team and our journey.


About team

Savaya Shinkaruk: Savaya Shinkaruk is a fashion stylist and journalist with a keen interest in wanting to blend components of the online fashion industry with design. She graduated with a BA in communications in 2017 and is completing her MDes in Digital Futures at OCAD University.

Emilia Mason: Emilia Mason is a content creator with a background in community-based work. She is currently an MDes student in Digital Futures at OCAD U and she is interested in using art, technology and design as tools for educational purposes.  


BLOG: PROCESS JOURNAL

DAY ONE

DAY ONE OF OUR EXPERIMENT:

November 13, 2017

Today we were introduced to our fourth assignment in our Creation and Computation class.

This assignment is to come up with and design a custom device that allows the both of us (Emilia and Savaya) to communicate in a new way.

During class hours, Emilia and I discussed some possibilities of what we could do for this assignment.

Emilia had a great idea based on Rafael Lozano-Hemmer's project Pulse Room. Pulse Room is an interactive art installation where he uses pulse sensors to set off light bulbs in the room in the same rhythm as a participant's heart rate.

Here is a link to show his project: http://www.lozano-hemmer.com/pulse_room.php

We both loved this idea, so we decided to iterate on it a bit and take the concept in our own direction.

We decided to do this because we thought it would be a cute gift for both of our boyfriends who do not live in Toronto haha.

Emilia came across this link, which shows the direction we want to take in iterating on Rafael Lozano-Hemmer's project.

Here is the link: https://blog.arduino.cc/2015/06/18/your-first-wearable-with-gemma/

The above link is from the Arduino Blog where they have made a wearable pulse sensor.

We also took a look at this website: https://www.precisionmicrodrives.com/tech-blog/2016/05/16/how-drive-vibration-motor-arduino-and-genuino

However, it uses a Genuino. It is still helpful information and research for us, so we can understand how it is run and what materials we will need to look into.

So, we took this idea and will make our project to be this:

You will have a bracelet that has a pulse sensor and your partner will have a large LED (light bulb) that will show the rhythm of your partner’s heart rate.

Our input being: pulse sensor

Our output being: 1 big LED

After we figured out our concept, we asked Kate and Nick about it and they approved of our idea.

We also asked Kate what kind of pulse sensor we should get. She said the one from Creatron is great.

We also asked, for design ease, if we could / should use the GEMMA, but she said we should stick with the Feather as it has WiFi – which is needed for this experiment.

So right after this class we went directly to Creatron to purchase one, but the gentleman there said they were sold out both in store and online.

We ended up going on Amazon to purchase one, but that was not going to work, as we would not be able to receive one until December 4th.

attempt-to-order-pulse-sensor

We kept looking around to see where to get a pulse sensor. After calling Creatron again (their website said it was available online, and we asked if they could bring it in store), they still told us they were sold out. So we are still searching for where to get one, as well as the other materials and tools we need.

From there, we kept talking via text messages about what we would need for this project.

Here is what we talked about needing for the project:

  • 1 feather for the pulse sensor (input)
  • 1 feather for the LED (output)
  • Materials to make the bracelet
  • Materials to make a box for the LED

We did talk about making two sets of this, but since we figured it would be too much money to purchase two more Feathers, we decided to make one set.

What we need to do going into this week:

  • Ask Sana, who used a light bulb (LED) for her third experiment, how she set that up to function.
  • Order our pulse sensor.
  • Do more research on how to get this working.
  • Decide who is going to work on the input and/or output.
  • Design the bracelet we want to make.

End of day one.

Thanks!


DAY TWO

DAY TWO OF OUR EXPERIMENT:

November 14, 2017

We met today to go over what we discussed yesterday about our project and we decided to change a few things.

After searching how we would make a bracelet for this project – that would look good but also be large enough to fit all the intended components for it to run – we decided to do some changes.

The changes we made and why:

  • We decided to scratch the idea of using a single large LED set up to give a light show of your partner’s heart rate.
  • We decided to use 3 small LED lights that would go on the bracelet and turn on and off based on your partner’s heart rate that is being sent.
  • We did this because we want you and your partner to each have a bracelet that gives you a light show based on your partner's heart rate, and we want to add a vibration component so you can also feel your partner's pulse.
  • In the end our changes allow our project for this experiment to become more intimate, which is what we wanted for our long distance relationships.

Here are the sketches for our two ideas:  sketch-1

sketch-2

After we changed the look of our experiment we started to order the intended materials and tools to make this experiment happen.

What we need:

  • 2 pulse sensor kits
  • Vibrating mini disc motor
  • IN 4001 Diode
  • Resistor ~200 1 K OHM – We have in our kit
  • 100 MAH Lipoly Battery – Having trouble sourcing
  • Heat Shrink Pack

Here is the new concept of our project:

You and your partner will EACH have a bracelet that includes a pulse sensor, a vibrating disc, a battery, and LEDs, so you can both send and receive each other's heart rate – seeing it on the LEDs and feeling it through the vibrating disc.

In the end what we are making will be a prototyped version of where we see this experiment going.

After doing some research to source the materials we needed, we came across a few snags, mainly with sourcing a battery.

Here is a link to show the recommended battery to get: https://www.adafruit.com/product/1570

Here is a link to show the battery we are going to get: http://www.canadarobotix.com/battery-chargers/battery-lithium-1000mah

We decided not to go with the recommended battery because the one we sourced has a higher capacity (so a longer-lasting charge) at the same voltage.

Here is the email Emilia sent Nick and Kate about some battery dilemmas we are finding:

email

We ended up ordering most of our materials from Canada Robotix, which was also great because it was half the price.

Here is the list of what we ordered from Canada Robotix:

canada-robotix-orders

Here is what we purchased from Creatron: 

creatron

The last thing we need to get is the batteries, which Kate and Nick will hopefully help us with before Friday, or on Friday at the latest.

Kate was in the DF Lab when we were looking to order these batteries, so we asked her about what we were looking at, and she said to go for it! So we added the battery to our order.

Now we just have to start working on the code, and designing the bracelets.

We also sketched some ideas of how we want to design the bracelets.

Here is the sketch:

front-side-bracelet

back-side-bracelet

We just need to figure out the materials to use for this.

End of day two.

Thanks!


DAY THREE

DAY THREE OF OUR EXPERIMENT:

November 17, 2017

We re-connected today to start working further on our project.

All of our materials have arrived via mail, and we have purchased all the extended materials needed for coding and tool building.

However, we still need to purchase the bracelet materials needed to make them. But we want to start to work on the coding right now.

Also, we have class today with Kate, so it will be a great time to ask questions about problems we are having.

Here is an image of Emilia with our new tools and materials to build this product:

nov-17-new-materials

For our first stage of coding research we stuck to these two websites:

https://makezine.com/projects/make-29/beating-heart-headband/

https://pulsesensor.com/pages/code-and-guide

These websites gave us the direction we needed to set up our Arduino and get our pulse sensor working with an LED, and to see how we can implement other outputs into our product.

We had some technical issues where the Arduino IDE had trouble compiling for our board, but Savaya just had to upload the code twice for the error message to disappear.

Here is a video of our first attempts to get our pulse sensor reading our pulse and having the LED blink. We are still watching our pulse readings to figure out where / how to turn the LED OFF and ON rather than just blinking.

We are also finding that when we want the LEDs to blink, it takes a while for them to catch onto the pulse ‘beat’, and then it takes a few minutes to stop after you remove your finger from the pulse sensor. We are unsure why this is.
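What we are working towards is a simple threshold check on the raw analog signal, along the lines of the pulsesensor.com getting-started sketch. A hedged version of that idea (the threshold value and pins are placeholders we are still tuning from the serial plotter):

```cpp
const int PULSE_PIN = A0;     // placeholder: analog pin for the pulse sensor signal
const int LED_PIN = 13;       // onboard LED
int Threshold = 550;          // placeholder: tune this by watching the serial plotter

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int Signal = analogRead(PULSE_PIN);  // raw pulse waveform, 0-1023
  Serial.println(Signal);              // watch this in the serial plotter to pick Threshold

  if (Signal > Threshold) {            // a beat pushes the signal above the threshold
    digitalWrite(LED_PIN, HIGH);
  } else {
    digitalWrite(LED_PIN, LOW);
  }
  delay(10);
}
```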

After we successfully got this portion working, we decided to call it a day and meet tomorrow to continue working on this experiment.

Some notes from class we took about our project based on what Kate taught us:

Facts:

A pulse sensor does not read the BEAT of the heart – it reads the blood flow to and from the pulse you are testing.

We also learnt that our battery is a great battery to have because the Feather M0 can charge it too.

However, if it needs replacing, these batteries are more difficult to source than just going to Shoppers Drug Mart to grab a pack. So if it came to commercializing this bracelet, that would be something to think about. But there are ways around it, as the Feather can charge it.

We also learnt about NeoPixels. These are LEDs that you can address individually; they are wired in an array so each pixel can be controlled on its own. We are thinking about swapping these in for our LED lights to make the bracelet look better. There is information on Canvas we will be looking at to learn how to make them work.

End of day three.

Thanks!


DAY FOUR

DAY FOUR OF OUR EXPERIMENT:

November 18, 2017

We met up in the DF lab to continue working on our project.

Our goals today are to:

  • Finish the coding for the pulse sensor and LED.
  • Find code for the vibrating motor.
  • Get these two things talking to one another.

Let's see how it goes! Continue reading to find out…

So we had a few issues with the pulse sensor: we did not know if the code we were using already had a pre-programmed heart rate in it, or if it was actually reading Savaya's.

Today we also worked on getting the vibrating motor working.

We followed this link on how to solder the motor so it would work with our breadboard in a more condensed manner:

And we used the code from our Pages link in Canvas to see what code we need to get the motor working.

From this link here: https://canvas.ocadu.ca/courses/24263/pages/kit-connections-slash-code-output

After a few hours of soldering and coding, we got the vibrating motor to work! Just one of them right now! But it is working. They are very fragile so we have to be careful with how to solder this piece together.

Here is an image of the wired vibrating motor:

vibrating-motor-set-up-emilia

We are still confused with the pulse sensor code as to why there is movement when we are not touching the pulse sensor. So we are going to wait for class on Monday to ask Nick why this is happening.

Savaya went home to continue working on it and lost all the coding files she had been working on for the pulse sensor – so she started fresh, deciding again what code to use.

In the end we will be using this code from GitHub: https://github.com/WorldFamousElectronics/PulseSensor_Amped_Arduino

End of day four.

Thanks!


DAY FIVE

DAY FIVE OF OUR EXPERIMENT:

November 20, 2017

We are back in action!

We have class today with Nick and we are going to ask him about the pulse sensor.

To simplify our rant yesterday, this is what we are unsure of:

  • Why is the serial plotter graphing movement when we are not touching the pulse sensor.
  • Is this because the code we are using has a pre-programmed heart pulse to it?

We kept working with the code from the link posted yesterday, but here it is again: https://github.com/WorldFamousElectronics/PulseSensor_Amped_Arduino

After talking to Nick about our questions, he said there is no pre-programmed pulse in the code we are using; the serial plotter is graphing movement because there is a lot of noise. It won't move the way it does now once it is put into a bracelet.

It calms down when you place your finger or wrist where there is a pulse on the pulse sensor.

So, here is a video to show the pulse sensor working with the proper code and LED: 

After that was up and functioning, we started to work on getting the pulse sensor working with PubNub. Roxanne H helped us to understand what code we needed and how to link it.

We got the examples working in class, but it is a new learning curve we are finding when we have to do it ourselves haha.

What we needed to understand for it to link was the pMessage being sent from the data the pulse sensor was gathering.

Along with that, the data that needs to be published is generated in the void loop – the Signal > Threshold check – and then sent. So to have it published, you have to add "publishToPubNub" there.
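In other words, the Signal > Threshold check lives in loop(), and that is where the publish call goes. A hedged sketch of just that structure, with publishToPubNub standing in for the PubNub publishing code from the class example (not reproduced here):

```cpp
const int PULSE_PIN = A0;      // pulse sensor analog pin
int Threshold = 550;           // same threshold used for the LED

// Stand-in for the class example's PubNub publishing code; here it just
// prints the message that would be sent on our channel.
void publishToPubNub(int signalValue) {
  Serial.print("{\"pulseSignal\":");
  Serial.print(signalValue);
  Serial.println("}");
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  int Signal = analogRead(PULSE_PIN);   // pulse sensor reading

  if (Signal > Threshold) {             // only publish when a beat is detected
    publishToPubNub(Signal);
    delay(200);                         // avoid flooding the channel with messages
  }
}
```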

Once we got this very small part working, we continued to work on getting the vibrating motor working.

The soldering has been an issue for us because the motor is so fragile! After hours and hours of soldering on Saturday, it was working, but then it fell apart on Sunday. We went back to the soldering table, but it still was not working. So we are going to ask Reza to assist us with this issue – or else we will need to purchase a new one.

After we do this, we will then need to work on code to link the pulse sensor signal to the vibrating motor, as we already have the code from Canvas working.

We also scheduled a meeting with Kate and Nick tomorrow at 3:15 PM to ask some questions about our project.

When we got both of the vibrating motors soldered, we started to work on connecting the vibrating motor to the same code the LED is running on. Aka our pulse sensor code.

We had Mudit help us with this.

img_9507

One of the things he noticed about our previous pulse sensor code was how we were using Signal as a variable.

He said if we use Threshold as a variable, it helps define the value at which the LED and vibrating motor turn on and off.

img_9508

We got this working by adding the same loop code the LED uses, but for the vibrating motor. After doing this, it still was not working… until we realized we didn't have it wired up correctly. We were missing the USB wire. As soon as we added that wire in – YAY, it started working again!
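The change Mudit suggested boils down to driving the motor pin (through the transistor) in the same branch as the LED. Roughly, with the pin numbers as placeholders:

```cpp
const int PULSE_PIN = A0;     // pulse sensor analog pin
const int LED_PIN = 13;       // LED output
const int MOTOR_PIN = 6;      // placeholder: pin driving the vibrating motor via the transistor
int Threshold = 550;          // value at which both outputs switch on

void setup() {
  pinMode(LED_PIN, OUTPUT);
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  int Signal = analogRead(PULSE_PIN);     // pulse sensor reading

  if (Signal > Threshold) {               // beat detected: light up and buzz together
    digitalWrite(LED_PIN, HIGH);
    digitalWrite(MOTOR_PIN, HIGH);
  } else {
    digitalWrite(LED_PIN, LOW);
    digitalWrite(MOTOR_PIN, LOW);
  }
  delay(10);
}
```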

We sadly didn’t get any video of this because then Savaya’s vibrating motor became unsoldered – so we will re-solder that tomorrow and get video.

But here is Emilia's vibrating motor working:

We decided to take a break from this; tomorrow we will work on setting up our PubNub code with the code we sorted out today.

img_9506

And also to work on building our bracelet. We are deciding between two ideas: to 3D print one, or to find one and then add some other pieces to it, like a pocket.

End of day five.

Thanks!


DAY SIX

DAY SIX OF OUR EXPERIMENT:

November 21, 2017

Today we are meeting at 11 AM to work on building our bracelets!

So we got our motor and pulse sensor code work together! But it did not happen with a few broken wires.

Both of our vibrating motors' wires broke, and Savaya's pulse sensor wire broke too. After talking to Nick and also reading online, the most important thing for us to do is hot glue or electrical tape the wires so there aren't any pressure points.

Here is an image of the hot glue we put on our pulse sensors and vibrating motors: ADD IMAGE HERE.

After we went to Reza to get his assistance soldering and gluing what we needed, we got to work figuring out why our motor was not working with our pulse sensor and LED code. We had set it up the same way as our LED, but it was not working. We ended up getting Orlando to look at it, and he suggested we re-check our wiring because he said our code should work.

And with the re-wiring, we realized that our transistor was facing the wrong way; when we flipped it, it started working! Thank you Orlando.

Here is a video to show this process working: ADD VIDEO HERE.

After this, we took a break and started to talk about how we wanted to make these bracelets. The idea we talked about is to 3D print a whole bracelet and / or 3D print a box where the Feather and battery will sit – kind of like a Power Ranger / Watch concept.

In the end we decided to do the Power Ranger / Watch concept!

We booked a meeting to go and 3D print our concept with Reza for tomorrow after our class tomorrow.

We also scheduled a meeting with Nick and Kate for tomorrow (November 22) at 3:15 PM to ask about our code to see if our PubNub is reading one another’s information.

End of day six.

Thanks!


DAY SEVEN

DAY SEVEN OF OUR EXPERIMENT:

November 22, 2017

Today we re-connected after our Research Methods class to start 3D printing the box for our bracelet, and to talk to Kate about our code.

The concept for our bracelet is: a box holding the Feather and our battery (with all our wires too) will sit on the top of the wrist, with a band (velcro, probably) that holds the pulse sensor on the inside of the wrist.

Some of the challenges / things we decided to change about this concept are:

Our meeting with Kate:

At our meeting with Kate, we had two main questions – the code and our wearable concept.

She said our concept of the Watch and Power Ranger look is great! Which was awesome to hear.

As for our code, she said we should include a BPM variable so it only reads the pulse within a specific time range rather than all the time – which we need to source. We have done some sourcing on this already by looking at past examples, but we cannot seem to find the proper library. We have tried multiple things from GitHub but they do not seem to work. Kate said she would send us some code that should fix that, but for now we will do more research to see if we can find something that works.

img_9537

We are finding that the Adafruit code does not work well with our Feather board – and we cannot seem to figure out how to change that. So we are looking for BPM code that does not come from Adafruit.

After our meeting, we wanted to go over our PubNub code to make sure it is reading what it should be.

So we broke it down.

Here is an image of the notes we took while breaking it down:

howtounderstnadpubnub

howtounderstandpubnub3

howtounderstandpubnub1

We broke it down by changing the values to our names, so we knew who was reading and sending what.

In the end we got it working and now know FOR SURE that Savaya’s pulse is showing on Emilia’s LED and vibrating motor, and vice versa!

Here is an image showing our channels sending the right information to one another:

pub-nub-channels

We are so happy haha.

So now we are going to work on researching BPM code to add in our code.

We are also going to meet tomorrow to work on our bracelet!

End of day seven.

Thanks!


DAY EIGHT

DAY EIGHT OF OUR EXPERIMENT:

November 23, 2017

Today we went to get the materials we need for our bracelet.

We stopped in at Creatron to purchase our enclosure boxes, went to a fabric store to purchase some velcro to use as the band, and went to Michaels to purchase some paint and stickers to make the enclosure boxes look good.

We also talked about our presentation and how we want it to look and work.

We decided to make ‘Love Corners’, where we will be on opposite sides of the room, each corner designed with a ‘romantic feel’ (pictures to come in tomorrow's blog post). We will each sit down, put on our Love Corner pulse bracelets, and read each other's pulses.

Also, the name Love Corner for our product came from us chatting about our presentation's look and appeal.

We got back to school, started to work on our BPM code, and also started designing our bracelets.

For the BPM code, we sourced code from this repository (link here: https://github.com/bmbergh/cheerios) and it worked! We then followed this YouTube video on how to code it, and added it into ours.

Here is the YouTube link: https://www.youtube.com/watch?v=gbk5T67KYcs

Once we added this in, we started to work on our bracelet. We need to condense our breadboard circuit onto a smaller board – which includes soldering our pieces and figuring out where we need to put our pins.

Here is an image of that process, showing how many things we need to connect to GND and what goes to which pin:

figuringoutwhatconnectstogrnd_fornewbreadboard

Here is our Fritzing breadboard diagram before we soldered and condensed the board: pulsesensor_breadboard

pulsesensor_schematic

 

This helped us see where everything needs to go, and whether we need to add another +/- (power and ground) rail, because we have multiple things connecting there and might not have enough room on the smaller breadboard.

Here is our breadboard before we transferred everything to the smaller version that will be placed in the bracelet:

breadboard-close-up-emilia

breadboard-set-up-emilia

We also decided to spray paint our enclosure boxes pink and add stickers to them, to evoke that romantic vibe and feel.

Here is an image of our enclosure boxes:

enclosurebox

Here is a video of the enclosure boxes being painted:

Here is an image of the painted enclosure boxes and the stickers we are going to use to decorate them:

enclosureboxpink

We are having an issue with our vibrating motors: Savaya's is not working, and Emilia's is vibrating non-stop.

But we got our bracelets built and soldered and placed into the smaller proto-boards:

building

building1

finish-build-inside

img_9567

But we are going to take a break tonight and work on it before class tomorrow to see if we can fix it. It was working before we transferred the boards, so we are assuming it is a wiring issue.
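
One quick check for tomorrow: load a motor-only test sketch that does nothing but switch the motor pin on and off. If the motor still misbehaves with this running, the problem is almost certainly in the wiring or the solder joints rather than in our main code. The pin number below is an assumption; it should be whichever pin the motor is actually driven from.

// Minimal vibration motor test: pulses the motor pin once a second so we can
// tell a broken wire or joint apart from a problem in the main code.
const int MOTOR_PIN = 12;   // assumed; use the pin the motor is actually on

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  digitalWrite(MOTOR_PIN, HIGH);  // motor should buzz
  delay(500);
  digitalWrite(MOTOR_PIN, LOW);   // motor should stop
  delay(500);
}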

One of Emilia’s wires broke too when we put the top on, so we have a soldering date tomorrow.

End of day eight.

Thanks!


DAY NINE: THE REVEAL OF LOVE CORNER

DAY NINE OF OUR EXPERIMENT:

November 24, 2017

Before the big finale of our presentation we had to solder and fix some of our items.

We are going to re-make Savaya's vibrating motor. The wiring on the breadboard was done correctly, so we think one of the other wires broke, but everything has been hot glued, so we cannot see what is happening.

So off we go to the soldering table in the DF lab.

We also had to change the threshold in our code again because we could not get our pulse readings to go over 1000, so we lowered it so the LED and vibrating motor would trigger, and after doing that we can see that Emilia's board is working.

In the end, the first run did not work. We know the wiring is correct and so is the code, so we think the problem is Savaya's Pin 13. So we re-soldered it to Pin 12 and changed the code.

And that did not fix the problem.
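
One small thing that would have made these last-minute changes easier is keeping the threshold and the output pins as named constants at the top of the sketch, so swapping Pin 13 for Pin 12 or lowering the threshold is a one-line edit. A rough sketch of what we mean (the numbers are examples, not our exact values):

// Hardware choices as named constants: a re-solder to a different pin or a
// lower threshold becomes a one-line change at the top of the file.
const int MOTOR_PIN = 12;    // was 13 before the re-solder
const int LED_PIN   = 11;    // assumed LED pin
const int THRESHOLD = 600;   // lowered because our readings never passed 1000

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  // Stand-in for the value received over PubNub in the real code.
  int partnerValue = analogRead(A0);
  bool beat = partnerValue > THRESHOLD;
  digitalWrite(MOTOR_PIN, beat ? HIGH : LOW);
  digitalWrite(LED_PIN, beat ? HIGH : LOW);
  delay(20);
}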

We were not able to sort it out, and for some reason our code stopped reading and sending fully, which was a huge bummer when we went to present to the class. But that's ok! We had a great idea and a strong prototype, and it was working well at some point during our process and journey.

screen-shot-2017-11-24-at-2-42-50-pm

To take a look at our code, click these links:

https://github.com/SavayaShinkaruk/experiment4

https://github.com/emiliamason/Experiment4

Here is an image showing the product:

img_3336

Here is a video showing you how to put on the bracelet:

What we learnt:

This project was a true learning curve on so many levels:

Writing code that takes a pulse sensor input and drives two different outputs, and using PubNub to send that input to be read by another device and vice versa: that was complicated!

Also, understanding the basics of how to handle pieces that will go into a wearable device. During our process we realized it would have been easier, more convenient, and less expensive to have two sets of everything: one set for the breadboard, to make sure the code works, and a second set to solder for the final device. That way the pieces (resistors, wires, sensors, motors, etc.) wouldn't be so worn out by the time we figured out the code, resulting in less soldering and fewer broken pieces.

Another valuable lesson from this project was to sketch where in the wearable device we want each piece to be located. How each piece should be soldered depends on this, and it also gives a better idea of what type of wire to use (stranded or solid core) and how long each wire should be.

Examples:

screen-shot-2017-11-27-at-3-15-34-pm

The first board we soldered has a significant amount of wire that could easily have been shorter and soldered on the backside. Having so much wire made closing the box almost impossible.

If we could do this again, instead of soldering the motor to the diode we would for sure solder on longer stranded wires. This would give us the opportunity to place the vibrating motor on the wrist band so the vibration would be felt more strongly against the skin. Having the vibrating motor inside the enclosure did not really allow the user to FEEL the pulse of the person wearing the other device.

End of day nine.

Thanks!


FINAL PROJECT BLOG POST

Lover Corner Product:

We created a product for couples in long-distance relationships: each partner can wear a bracelet that vibrates and lights up based on the other's pulse.

For this to work, you put on the bracelet (which looks like a Power Ranger watch) and attach the pulse sensor to your finger, and over WiFi the data is sent to and from each other's bracelets. (There is a video showing this process.)

Even if you are not in a long-distance relationship but want to spice up your love life, test out Lover Corner.

Project Members: Savaya Shinkaruk and Emilia Mason

Code:

https://github.com/SavayaShinkaruk/experiment4

https://github.com/emiliamason/Experiment4

Supporting Visuals:

There are images and video of our process and journey in our blog post above.

Design Files:

There are images and video of our process and journey in our blog post above.

Project Context:

The Lover Corner bracelets were created to be an accessory couples can wear when they are in long-distance relationships, or when they are looking to feel a connection to a loved one. Savaya and Emilia created this product because they are both in long-distance relationships and thought this accessory would be a way for each of them to connect with their boyfriends back home.

Through our understanding of Arduino, PubNub and product design we were able to create this prototyped version of Lover Corner bracelets.

We see this bracelet eventually looking less Power Ranger-like and more sleek, so couples don't have to wear it only in the privacy of their own home but can wear it out in public when they are missing their significant others. We also see it coming in other colour options, so there are choices for men, women, and unisex styles.

With the goal of our given assignment in mind (to create a product that sends and receives notifications), we took that concept away from a purely digital format like a screen and implemented it in a wearable.

Bibliography:

Morgan, B. (2015, November 21). BPM with an Arduino tutorial [Video file]. Retrieved from https://www.youtube.com/watch?v=gbk5T67KYcs

Pulse Sensor. (n.d.). Pulse sensor servo tutorial. Retrieved November 17, 2017 from https://pulsesensor.com/pages/pulse-sensor-servo-tutorial

Hartman, K., Puckett, N. (2017).  KIT: Connections / Code – INPUT. Retrieved from OCAD University Creation and Computation Canvas website: https://canvas.ocadu.ca/courses/24263/pages/kit-connections-slash-code-input

Hartman, K., Puckett, N. (2017).  KIT: Connections / Code – OUTPUT. Retrieved from OCAD University Creation and Computation Canvas website: https://canvas.ocadu.ca/courses/24263/pages/kit-connections-slash-code-output

Precision MicroDrives. (2016). How to drive a vibration motor with Arduino and Genuino. Retrieved from https://www.precisionmicrodrives.com/tech-blog/2016/05/16/how-drive-vibration-motor-arduino-and-genuino

GitHub [yury-g]. (March 24). Getting advanced code / PulseSensor & Arduino. Retrieved from https://github.com/WorldFamousElectronics/PulseSensor_Amped_Arduino

GitHub [bmbergh]. (2016). Cheerios. Retrieved from https://github.com/bmbergh/cheerios

Pulse Sensor. (n.d.). Getting Started. Retrieved from https://pulsesensor.com/pages/code-and-guide

Arduino. (2015). Make your first wearable with Arduino Gemma. Retrieved from https://blog.arduino.cc/2015/06/18/your-first-wearable-with-gemma/

Lozano-Hemmer, R. (n.d.). Pulse Room. Retrieved from http://www.lozano-hemmer.com/pulse_room.php

Adafruit. (n.d.). Lithium Ion Polymer Battery – 3.7v 100mAh. Retrieved from https://www.adafruit.com/product/1570

Stern, B. (n.d.). Beating Heart Headband. Retrieved November 17, 2017, from https://makezine.com/projects/make-29/beating-heart-headband/

Earl, B. (2014). Using millis() for timing. Retrieved from https://learn.adafruit.com/multi-tasking-the-arduino-part-1/using-millis-for-timing