Title: Love Corner

Group: Savaya Shinkaruk and Emilia Mason


Project Description

Our product came out of brainstorming ideas for our Messaging and Notifications assignment.

For this project you will design and create a custom device that allows the members of your group to communicate in new ways.  Using the Wifi connection of the Feather M0, you will create standalone objects that communicate over the internet.  Each group must consider how these devices create a specific language that allows the things, people, or places on the ends of the network to communicate.  – Nick and Kate.

When we were first assigned this project, Emilia wanted to make a Christmas gift for her boyfriend. She showed me some videos of concepts designed for couples in long-distance relationships. Since both of us are in long-distance relationships, we came up with our product, Love Corner.

We go more into depth about the journey of our process in our blog below, but the overall description of our project is:

We created a product for couples in long-distance relationships: each partner wears a bracelet that vibrates and lights up based on the other's pulse.

For this to work, you put on the bracelet (which looks like a Power Ranger watch) and attach the pulse sensor to your finger; over WiFi, the data is sent between the two bracelets.

Even if you are not in a long-distance relationship but want to spice up your love life, test out Love Corner.

So, continue on to the rest of our page to read more about the ‘Love Corner’ team and our journey.


About team

Savaya Shinkaruk: Savaya Shinkaruk is a fashion stylist and journalist with a keen interest in blending components of the online fashion industry with design. She graduated with a BA in communications in 2017 and is completing her MDes in Digital Futures at OCAD University.

Emilia Mason: Emilia Mason is a content creator with a background in community-based work. She is currently an MDes student in Digital Futures at OCAD U and she is interested in using art, technology and design as tools for educational purposes.  


BLOG: PROCESS JOURNAL

DAY ONE

DAY ONE OF OUR EXPERIMENT:

November 13, 2017

Today we were introduced to our fourth assignment in our Creation and Computation class.

This assignment is to come up with and design a custom device that allows the two of us (Emilia and Savaya) to communicate in a new way.

During class hours, Emilia and I discussed some possibilities of what we could do for this assignment.

Emilia had a great idea based on Rafael Lozano-Hemmer's project Pulse Room. Pulse Room is an interactive art installation in which pulse sensors set off light bulbs in the room in the same rhythm as a person's heart rate.

Here is a link to show his project: http://www.lozano-hemmer.com/pulse_room.php

We both loved this idea, so we decided to iterate on it and take the concept in a slightly different direction.

We decided to do this because we thought it would be a cute gift for both of our boyfriends who do not live in Toronto haha.

Emilia came across this link showing how we could iterate on Rafael Lozano-Hemmer's project.

Here is the link: https://blog.arduino.cc/2015/06/18/your-first-wearable-with-gemma/

The above link is from the Arduino Blog where they have made a wearable pulse sensor.

We also took a look at this website: https://www.precisionmicrodrives.com/tech-blog/2016/05/16/how-drive-vibration-motor-arduino-and-genuino

It uses a Genuino, but it is still helpful information and research for understanding how the motor is run and what materials we will need to look into.

So, we took this idea and will make our project to be this:

You will have a bracelet with a pulse sensor, and your partner will have a large LED (light bulb) that shows the rhythm of your heart rate.

Our input being: pulse sensor

Our output being: 1 big LED

After we figured out our concept, we asked Kate and Nick about it and they approved of our idea.

We also asked Kate what kind of pulse sensor we should get. She said the one from Creatron is great.

We also asked, for design ease, if we could / should use the GEMMA, but she said we should stick to using the Feather as it has WIFI – which is needed for this experiment.

So right after this class we went directly to Creatron to purchase one, but the gentleman there said they were sold out both in store and online.

We ended up going on Amazon to purchase one, but that was not going to work as we would not receive one until December 4th.

attempt-to-order-pulse-sensor

We kept looking around for a pulse sensor. After calling Creatron again to ask whether they had it available online and could bring one in store, they told us they were still sold out. So we are still searching for where to get one, along with the other materials and tools we need.

From there, we kept talking via text messages about what we would need for this project.

Here is what we talked about needing for the project:

  • 1 feather for the pulse sensor (input)
  • 1 feather for the LED (output)
  • Materials to make the bracelet
  • Materials to make a box for the LED

We did talk about making 2 sets of this, but because we figured it would be too much money to purchase two more feathers, we decided to make one set.

What we need to do going into this week:

  • Ask Sana, who used a light bulb (LED) for her third experiment, how she set that up.
  • Order our pulse sensor.
  • Do more research on how to get this working.
  • Decide who is going to work on the input and/or output.
  • Design the bracelet we want to make.

End of day one.

Thanks!


DAY TWO

DAY TWO OF OUR EXPERIMENT:

November 14, 2017

We met today to go over what we discussed yesterday about our project and we decided to change a few things.

After searching how we would make a bracelet for this project – that would look good but also be large enough to fit all the intended components for it to run – we decided to do some changes.

The changes we made and why:

  • We decided to scratch the idea of using a single large LED set up to give a light show of your partner’s heart rate.
  • We decided to use 3 small LED lights that would go on the bracelet and turn on and off based on your partner’s heart rate that is being sent.
  • We did this because we want you and your partner to each have a bracelet that gives a light show based on the other's heart rate, and we want to add a vibration component so you can also feel your partner's pulse.
  • In the end, our changes make this project more intimate, which is what we wanted for our long-distance relationships.

Here are the sketches for our two ideas:  sketch-1

sketch-2

After we changed the look of our experiment we started to order the intended materials and tools to make this experiment happen.

What we need:

  • 2 pulse sensor kits
  • Vibrating mini disc motor
  • 1N4001 diode
  • Resistor (~200 Ω – 1 kΩ) – we have these in our kit
  • 100 mAh LiPoly battery – having trouble sourcing
  • Heat Shrink Pack

Here is the new concept of our project:

You and your partner will EACH have a bracelet that includes a pulse sensor, a vibrating disc, a battery, and LEDs. You can both send and receive each other's heart rate, see it on the LEDs, and feel it through the vibrating disc.

In the end what we are making will be a prototyped version of where we see this experiment going.

After doing some research to source the materials we need, we came across a few snags, mainly with sourcing a battery.

Here is a link to show the recommended battery to get: https://www.adafruit.com/product/1570

Here is a link to show the battery we are going to get: http://www.canadarobotix.com/battery-chargers/battery-lithium-1000mah

We decided not to go with the recommended battery because the one we sourced has a higher capacity (longer run time) at the same voltage.

Here is the email Emilia sent Nick and Kate about some battery dilemmas we are finding:

email

We ended up ordering most of our materials from Canada Robotix, which was also great because it was half the price.

Here is the list of what we ordered from Canada Robotix:

canada-robotix-orders

Here is what we purchased from Creatron: 

creatron

The last thing we need to get is the batteries, which Kate and Nick will hopefully help us with before Friday, or on Friday at the latest.

Kate was in the DF Lab when we were looking to order these batteries, so we asked her about what we were looking at, and she said to go for it! So we added the battery to our order.

Now we just have to start working on the code, and designing the bracelets.

We also sketched some ideas of how we want to design the bracelets.

Here is the sketch:

front-side-bracelet

back-side-bracelet

We just need to figure out the materials to use for this.

End of day two.

Thanks!


DAY THREE

DAY THREE OF OUR EXPERIMENT:

November 17, 2017

We re-connected today to start working further on our project.

All of our materials have arrived via mail, and we have purchased all the extended materials needed for coding and tool building.

However, we still need to purchase the bracelet materials needed to make them. But we want to start to work on the coding right now.

We also have class today with Kate, so it will be a great time to ask questions about the problems we are having.

Here is an image of Emilia with our new tools and materials to build this product:

nov-17-new-materials

For our first stage of coding research we stuck to these two websites:

https://makezine.com/projects/make-29/beating-heart-headband/

https://pulsesensor.com/pages/code-and-guide

These websites gave us the direction we needed to set up our Arduino and get our pulse sensor working with an LED. And to also see how we can implement other outputs into our product.
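To give a rough sense of what those guides walk through, here is a minimal sketch of the pulse-sensor-to-LED idea. The pin numbers and threshold value are placeholders for illustration, not our exact wiring.

// Minimal illustration of the pulse-sensor-to-LED idea from the guides above.
// Assumed wiring: pulse sensor signal on A0, LED on pin 13, threshold of 550.
const int pulsePin = A0;
const int ledPin = 13;
const int threshold = 550;

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);                // lets the Serial Plotter graph the raw signal
}

void loop() {
  int signal = analogRead(pulsePin); // raw reading, roughly 0-1023
  Serial.println(signal);            // view this in the Serial Plotter

  if (signal > threshold) {
    digitalWrite(ledPin, HIGH);      // light the LED on each beat
  } else {
    digitalWrite(ledPin, LOW);
  }

  delay(10);                         // small delay to steady the readings
}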

We had some technical issues where the IDE had trouble compiling and uploading to our board, but Savaya just had to upload the code twice for the error message to disappear.

Here is a video of our first attempts to get our pulse sensor reading our pulse rate and having the LED blink. We are still testing to see our pulse rates to know where / how to turn the LED OFF and ON rather than just blinking.

We are also finding that when we want the LEDs to blink, it takes a while for them to catch onto the pulse 'beat', and then a few minutes to stop after you have removed your finger from the pulse sensor. We are unsure as to why this is.

After we successfully got this portion working, we decided to call it a day and meet tomorrow to continue working on this experiment.

Some notes from class we took about our project based on what Kate taught us:

Facts:

A pulse sensor does not read the BEAT of the heart – it reads the blood flow to and from the pulse you are testing.

We also learnt our battery is a great one to have because the Feather M0 can charge it too.

However, if in need of replacing they are more difficult to source than just going to Shoppers Drug Mart to grab a pack of batteries. So if it were to come to commercializing this bracelet that would be something to think about. But there are ways around it, as the Feather can charge it.

We also learnt about NeoPixels. These are LEDs that you can address individually; they are wired in an array so each pixel can be controlled on its own. We are thinking about swapping these in for our LED lights to make the bracelet look better. There is information on Canvas we will be looking at to learn how to make these work.
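If we do end up swapping in NeoPixels, a minimal test using the Adafruit_NeoPixel library might look something like this (3 pixels on pin 6 – placeholder values, not a final design):

#include <Adafruit_NeoPixel.h>

// Assumed setup for a quick test: 3 NeoPixels chained on data pin 6.
Adafruit_NeoPixel pixels(3, 6, NEO_GRB + NEO_KHZ800);

void setup() {
  pixels.begin();
}

void loop() {
  for (int i = 0; i < 3; i++) {
    pixels.setPixelColor(i, pixels.Color(255, 20, 60));  // soft pink
  }
  pixels.show();    // push the colours out to the strip
  delay(500);

  pixels.clear();   // all pixels off
  pixels.show();
  delay(500);
}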

End of day three.

Thanks!


DAY FOUR

DAY FOUR OF OUR EXPERIMENT:

November 18, 2017

We met up in the DF lab to continue working on our project.

Our goals today are to:

  • Finish the coding for the pulse sensor and LED.
  • Find code for the vibrating motor.
  • Get these two things talking to one another.

Let's see how it goes! Continue reading to find out…

We had a few issues with the pulse sensor: we did not know whether the code we were using already had a pre-programmed heartbeat in it, or whether it was actually reading Savaya's.

Today we also worked on getting the vibrating motor working.

We followed this link on how to solder the motor so it works when placed on our breadboard in a more condensed way:

And we used the code from our Pages link in Canvas to see what code we need to get the motor working.

From this link here: https://canvas.ocadu.ca/courses/24263/pages/kit-connections-slash-code-output

After a few hours of soldering and coding, we got the vibrating motor to work! Just one of them right now! But it is working. They are very fragile so we have to be careful with how to solder this piece together.

Here is an image of the wired vibrating motor:

vibrating-motor-set-up-emilia

We are still confused with the pulse sensor code as to why there is movement when we are not touching the pulse sensor. So we are going to wait for class on Monday to ask Nick why this is happening.

Savaya went home to continue working on it and lost all the coding files she had been working on for the pulse sensor, so she started fresh to figure out what code to use.

In the end we will be using this code from GitHub: https://github.com/WorldFamousElectronics/PulseSensor_Amped_Arduino

End of day four.

Thanks!


DAY FIVE

DAY FIVE OF OUR EXPERIMENT:

November 20, 2017

We are back in action!

We have class today with Nick and we are going to ask him about the pulse sensor.

To simplify our rant yesterday, this is what we are unsure of:

  • Why is the serial plotter graphing movement when we are not touching the pulse sensor?
  • Is this because the code we are using has a pre-programmed heart pulse to it?

We kept working with the code from the link posted yesterday, but here it is again: https://github.com/WorldFamousElectronics/PulseSensor_Amped_Arduino

After talking to Nick about our questions, he said there is no pre-programmed pulse in the code we are using; the serial plotter graphs the way it does because there is a lot of noise. It won't move the way it does now once we have it built into a bracelet.

It calms down when you place your finger or wrist where there is a pulse on the pulse sensor.

So, here is a video to show the pulse sensor working with the proper code and LED: 

After that was up and functioning, we started to work on getting the pulse sensor working with PubNub. Roxanne H helped us to understand what code we needed and how to link it.

We got the examples working in class, but it is a new learning curve we are finding when we have to do it ourselves haha.

What we needed to understand for the link was the pMessage being sent, built from the data the pulse sensor was gathering.

Along with that, the data to be published comes from the void loop() code – the Signal > Threshold check – which is then sent. To have it published, you add a call to "publishToPubNub".
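To sketch the shape of that loop (the publishToPubNub() here is just a placeholder standing in for the helper from the course's PubNub starter code, and the pin and threshold are illustrative):

// Sketch of the publish logic only. publishToPubNub() is a placeholder for the
// helper from the course's PubNub starter code; the WiFi/PubNub setup it relies
// on is not shown here.
const int pulsePin = A0;
const int threshold = 550;

void publishToPubNub(int value) {
  Serial.println(value);   // placeholder: the real helper sends the value to our channel
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  int signal = analogRead(pulsePin);

  if (signal > threshold) {   // a beat: send the reading out
    publishToPubNub(signal);
  }

  delay(20);
}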

Once we got this very small part working, we continued to work on getting the vibrating motor working.

The soldering has been an issue for us because the motor is so fragile! After hours upon hours of soldering on Saturday, it was working, but then it fell apart on Sunday. We went back to the soldering table, but it still was not working. So we are going to ask Reza to assist us with this issue – or else we will need to purchase a new one.

After we do this, we will need to then work on getting code to link the pulse sensor signal to the vibrating motor – as we already have the code from canvas working.

We also scheduled a meeting with Kate and Nick tomorrow at 3:15 PM to ask some questions about our project.

When we got both of the vibrating motors soldered, we started to work on connecting the vibrating motor to the same code the LED is running on. Aka our pulse sensor code.

We had Mudit help us with this.

img_9507

One of the things he noticed in our previous pulse sensor code was how we were using Signal as a variable.

He said that if we use Threshold as a variable, it defines the number at which the LED and vibrating motor turn on and off.

img_9508

We did this by adding the same loop code the LED uses, but for the vibrating motor. After doing this it still was not working… until we realized we didn't have it wired up correctly – we were missing the USB power wire. As soon as we added that wire in, YAY, it started working again!
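Roughly, the change looks like this – the same threshold check now drives both the LED and the pin that switches the motor's transistor (pin numbers and threshold are placeholders, not our exact wiring):

// Rough sketch of driving both outputs from the same threshold check.
// Assumed pins: pulse sensor on A0, LED on 13, transistor for the motor on 12.
const int pulsePin = A0;
const int ledPin = 13;
const int motorPin = 12;
const int threshold = 550;

void setup() {
  pinMode(ledPin, OUTPUT);
  pinMode(motorPin, OUTPUT);
}

void loop() {
  int signal = analogRead(pulsePin);

  if (signal > threshold) {       // a beat: both outputs on together
    digitalWrite(ledPin, HIGH);
    digitalWrite(motorPin, HIGH);
  } else {                        // between beats: both off
    digitalWrite(ledPin, LOW);
    digitalWrite(motorPin, LOW);
  }

  delay(10);
}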

We sadly didn’t get any video of this because then Savaya’s vibrating motor became unsoldered – so we will re-solder that tomorrow and get video.

But here is Emilia's vibrating motor working:

We decided to take a break from this and, tomorrow, work on combining our PubNub code with the code we sorted out today.

img_9506

We also plan to work on building our bracelet. We are deciding between two ideas – 3D printing one, or finding an existing one and adding some other pieces to it, like a pocket.

End of day five.

Thanks!


DAY SIX

DAY SIX OF OUR EXPERIMENT:

November 21, 2017

Today we are meeting at 11 AM to work on building our bracelets!

So we got our motor and pulse sensor code working together! But it did not happen without a few broken wires.

Both of our vibrating motors' wires broke, and Savaya's pulse sensor wire broke too. After talking to Nick and reading online, the most important thing for us to do is hot glue or electrical tape the wires so there aren't any pressure points.

Here is an image of the hot glue we put on our pulse sensors and vibrating motors: ADD IMAGE HERE.

After we went to Reza to get his assistance with soldering and gluing what we needed, we got working on the code to figure out why our motor was not working with our pulse sensor and LED code. We set it up the same way as our LED but it was not working. We ended up getting Orlando to look at it, and he suggested we re-check our wiring because he said our code should work.

While re-wiring, we realized that our transistor was facing the wrong way, and when we flipped it, it started working! Thank you, Orlando.

Here is a video to show this process working: ADD VIDEO HERE.

After this, we took a break and started to talk about how we wanted to make these bracelets. The idea we talked about is to 3D print a whole bracelet and / or 3D print a box where the Feather and battery will sit – kind of like a Power Ranger / Watch concept.

In the end we decided to do the Power Ranger / Watch concept!

We booked a meeting with Reza to 3D print our concept tomorrow after class.

We also scheduled a meeting with Nick and Kate for tomorrow (November 22) at 3:15 PM to ask about our code to see if our PubNub is reading one another’s information.

End of day six.

Thanks!


DAY SEVEN

DAY SEVEN OF OUR EXPERIMENT:

November 22, 2017

Today we re-connected after our Research Methods class to start 3D printing the box for our bracelet, and to talk to Kate about our code.

The concept for our bracelet: a box holding the Feather and battery (with all our wires too) will sit on top of the wrist, and a band (probably Velcro) will hold the pulse sensor against the inside of the wrist.

Some of the challenges / things we decided to change about this concept are:

Our meeting with Kate:

At our meeting with Kate, we had two main questions – the code and our wearable concept.

She said our concept of the Watch and Power Ranger look is great! Which was awesome to hear.

As for our code, she said we should include a BPM variable so it only reads in a specific time range rather than all the time – which we need to source. We have done some sourcing on this already by looking at past examples, but we cannot seem to find the proper library. We have tried multiple things from GitHub but they do not seem to work. Kate said she would send us some code that should fix that, but for now we will do more research to see if we can find something that works.
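Our rough understanding of the suggestion is to count beats over a fixed window with millis() and convert that to BPM, rather than reacting to every raw reading. A sketch of that idea (our own interpretation with placeholder values, not Kate's code or the library we are looking for):

// Rough sketch of sampling BPM over a fixed time window using millis().
// Placeholder pin, threshold, and window length.
const int pulsePin = A0;
const int threshold = 550;
const unsigned long windowMs = 10000;   // count beats over 10 seconds

unsigned long windowStart = 0;
int beatCount = 0;
bool aboveThreshold = false;

void setup() {
  Serial.begin(9600);
  windowStart = millis();
}

void loop() {
  int signal = analogRead(pulsePin);

  // Count a beat only on the rising edge, when the signal first crosses the threshold.
  if (signal > threshold && !aboveThreshold) {
    beatCount++;
    aboveThreshold = true;
  } else if (signal < threshold) {
    aboveThreshold = false;
  }

  // Once per window, convert the beat count to beats per minute and reset.
  if (millis() - windowStart >= windowMs) {
    int bpm = beatCount * (60000 / windowMs);
    Serial.print("BPM: ");
    Serial.println(bpm);
    beatCount = 0;
    windowStart = millis();
  }

  delay(10);
}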

img_9537

We are finding that the Adafruit code does not read well with our Feather board – and we cannot seem to figure out how to change that. So we are looking for BPM code that does not come from Adafruit.

After our meeting, we wanted to go over our code with PubNub to make sure that it is reading what it should be.

So we broke it down.

Here is a image of the notes we took while breaking it down:

howtounderstnadpubnub

howtounderstandpubnub3

howtounderstandpubnub1

We broke it down by: changing the value to our names so we knew who was reading and sending what.

In the end we got it working and now know FOR SURE that Savaya’s pulse is showing on Emilia’s LED and vibrating motor, and vice versa!

Here is an image to show our channels sending the right information to one another:

pub-nub-channels

We are so happy haha.

So now we are going to work on researching BPM code to add in our code.

We are also going to meet tomorrow to work on our bracelet!

End of day seven.

Thanks!


DAY EIGHT

DAY EIGHT OF OUR EXPERIMENT:

November 23, 2017

Today we went to get the materials we need for our bracelet.

We stopped in at Creatron to purchase our enclosure boxes, went to a fabric store to purchase some velcro to use as the band, and went to Michaels to purchase some paint and stickers to make the enclosure boxes look good.

We also talked about our presentation and how we want it to look and work.

We decided to make 'Love Corners': we will be on opposite sides of the room, each with a designed 'romantic feel' (pictures to come in tomorrow's blog post), where we will sit, put on our Love Corner pulse bracelets, and read each other's pulses.

The name 'Love Corner' for our product also came from us chatting about our presentation's look and appeal.

We got back to school and started to work on our BPM code and on designing our bracelets.

For the BPM code we sourced this code, from this website (link here: https://github.com/bmbergh/cheerios) and it worked! So we followed this YouTube video of how to code it, and added it into ours.

Here is the YouTube link: https://www.youtube.com/watch?v=gbk5T67KYcs

Once we added this in, we started to work on our bracelet. We need to condense our breadboard into a smaller one – which includes soldering our pieces and figuring out where we need to put our pins.

Here is an image to show that process, so we know how many things need to connect to GND and what goes to which pin:

figuringoutwhatconnectstogrnd_fornewbreadboard

Here is our fritzing board before we soldered and condensed the board: pulsesensor_breadboard

pulsesensor_schematic

 

This helped us see where everything needs to go, and whether we need to add another +/– ground rail, because we have multiple things connecting there and might not have enough room on the smaller breadboard.

Here is our breadboard before transferring to a smaller breadboard version to be placed in the bracelet:

breadboard-close-up-emilia

breadboard-set-up-emilia

We also decided to spray paint our enclosure boxes pink and add stickers to them, to evoke that romantic vibe and feel.

Here is an image of our enclosure boxes:

enclosurebox

Here is a video of the enclosure boxes being painted:

Here is an image of the painted enclosure boxes and the stickers we are going to use to decorate them:

enclosureboxpink

We are having an issue with our vibrating motors where Savaya's is not working and Emilia's is vibrating non-stop.

But we got our bracelets built and soldered and placed into the smaller proto-boards:

building

building1

finish-build-inside

img_9567

But we are going to take a break tonight, and work on it before class tomorrow and see if we can fix it, because it was working before we transferred the boards. But we are assuming it is a wiring issue.

One of Emilia’s wires broke too when we put the top on, so we have a soldering date tomorrow.

End of day eight.

Thanks!


DAY NINE: THE REVEAL OF LOVE CORNER

DAY NINE OF OUR EXPERIMENT:

November 24, 2017

Before the big finale of our presentation we had to solder and fix some of our items.

We are going to re-make Savaya’s vibrating motor because the wiring on the breadboard was done correctly, so we are thinking one of the other wires broke, but everything has been hot glued, so we can not see what is happening.

So off to the solder table we go in DF lab.

We also had to change the threshold in our code again because we could not get our pulses to go over 1000; we lowered it so the LED and vibrating motor would go off, and once we did that we could see that Emilia's board is working.

In the end, the first run did not work – and we know the wiring is correct and the same goes for the code, so we think it is Savaya's Pin 13. So we re-soldered it to Pin 12 and changed the code.

And that did not fix the problem.

We were not able to sort it out, and for some reason our code stopped reading and sending fully. That was a huge bummer when we went to present to the class, but that's ok! We had a great idea and a strong prototype, and it was working well at points during our process and journey.

screen-shot-2017-11-24-at-2-42-50-pm

To take a look at our code click this link:

https://github.com/SavayaShinkaruk/experiment4

https://github.com/emiliamason/Experiment4

Here is an image of the product:

img_3336

Here is a video to show you how to put on the bracelet:

What we learnt:

This project was a true learning curve on so many levels:

Writing code that takes a pulse sensor input and drives two different outputs, and using PubNub to send the input to be read by another device and vice versa – that was complicated!

Also, understanding the basics of how to handle pieces that will go in a wearable device. During our process we realized it would have been easier, more convenient and less expensive to have two sets of everything. One set to be used for the breadboard to make sure the code works and a second set to solder for the final device. This way the pieces (resistors, wires, sensors, motors, etc) wouldn’t be so worn out by the time we figured out the code, resulting in less soldering and less breaking of pieces.

Another valuable lesson from this project was to sketch where in the wearable device we want each piece to be located. This determines how each piece should be soldered, and it also gives a better idea of what type of wire to use – stranded or solid core – and how long each wire should be.

Examples:

screen-shot-2017-11-27-at-3-15-34-pm

The first board we soldered has a significant amount of wire that could easily have been shorter and soldered on the backside. Having so much wire made it almost impossible to close the box.

If we could do this again, instead of soldering the motor directly to the diode we would solder longer stranded wires. This would make it possible to place the vibrating motor on the wrist band, so the vibration would be stronger against the skin. Having the vibrating motor inside the enclosure did not really allow the user to FEEL the pulse of the person wearing the other device.

End of day nine.

Thanks!


FINAL PROJECT BLOG POST

Love Corner Product:

We created a product for couples in long-distance relationships: each partner wears a bracelet that vibrates and lights up based on the other's pulse.

For this to work, you put on the bracelet (which looks like a Power Ranger watch) and attach the pulse sensor to your finger; over WiFi, the data is sent between the two bracelets. (There is a video to show this process.)

Even if you are not in a long-distance relationship but want to spice up your love life, test out Love Corner.

Project Members: Savaya Shinkaruk and Emilia Mason

Code:

https://github.com/SavayaShinkaruk/experiment4

https://github.com/emiliamason/Experiment4

Supporting Visuals:

There are images and video of our process and journey in our blog post above.

Design Files:

There are images and video of our process and journey in our blog post above.

Project Context:

The Love Corner bracelets were created as an accessory for couples in long-distance relationships, or for couples looking to feel a connection to their loved one. Savaya and Emilia created this product because they are both in long-distance relationships and thought this accessory would be a way for each of them to connect with their boyfriends back home.

Through our understanding of Arduino, PubNub, and product design we were able to create this prototyped version of the Love Corner bracelets.

We see future versions of this bracelet being less Power Ranger-like and more streamlined, so couples don't have to wear it only in the privacy of their own home but can wear it out in public when they are missing their significant others. We also see it coming in other colour options, so there are choices for men, women, and unisex styles.

With the goal of our given assignment – to create a product that sends and receives notifications – we took that concept away from a digital format like a screen and implemented it in a wearable.

Bibliography:

Brandy Morgan. (2015, November 21). BPM with an Arduino Tutorial [Video file]. Retrieved from https://www.youtube.com/watch?v=gbk5T67KYcs

Pulse Sensor. (n.d.). Pulse sensor servo tutorial. Retrieved November 17, 2017 from https://pulsesensor.com/pages/pulse-sensor-servo-tutorial

Hartman, K., Puckett, N. (2017).  KIT: Connections / Code – INPUT. Retrieved from OCAD University Creation and Computation Canvas website: https://canvas.ocadu.ca/courses/24263/pages/kit-connections-slash-code-input

Hartman, K., Puckett, N. (2017).  KIT: Connections / Code – OUTPUT. Retrieved from OCAD University Creation and Computation Canvas website: https://canvas.ocadu.ca/courses/24263/pages/kit-connections-slash-code-output

Precision MicroDrives. (2016).  How to drive a vibration motor with arduino and genuino. Retrieved from https://www.precisionmicrodrives.com/tech-blog/2016/05/16/how-drive-vibration-motor-arduino-and-genuino

Github [yury-g]. (March 24). Getting Advanced Code / PulseSensor & “Arduino”. Retrieved from https://github.com/WorldFamousElectronics/PulseSensor_Amped_Arduino

GitHub [bmbergh]. (2016). Cheerios. Retrieved from https://github.com/bmbergh/cheerios

Pulse Sensor. (n.d.). Getting Started. Retrieved from https://pulsesensor.com/pages/code-and-guide

Arduino. (2015). Make your first wearable with arduino gemma. Retrieved from https://blog.arduino.cc/2015/06/18/your-first-wearable-with-gemma/

Rafael Lozano-Hemmer. (n.d.). Pulse Room. Retrieved from http://www.lozano-hemmer.com/pulse_room.php

Adafruit. (n.d.). Lithium Ion Polymer Battery – 3.7v 100mAh. Retrieved from https://www.adafruit.com/product/1570

Stern, B. (n.d.). Beating Heart Headband. Retrieved November 17, 2017, from https://makezine.com/projects/make-29/beating-heart-headband/

Earl, Bill. (2014). Using millis() for timing. Retrieved from, https://makezine.com/projects/make-29/beating-heart-headband/

 

Moon Gaze

A wearable interactive partner finder. A project by Yiyi Shao and Finlay Braithwaite for DIGF6037, Creation and Computation, Digital Futures, OCAD University.

人有悲欢离合,月有阴晴圆缺,此事古难全。但愿人长久,千里共婵娟 — 苏轼 《水调歌头》

Translation: The moon does wax, the moon does wane, and so men meet and say goodbye. May we all be blessed with longevity though far apart, we are still able to share the beauty of the moon together.

Our idea was inspired by an old, well-known Chinese poem, Shui Diao Ge Tou, by Su Shi (also known as Su Tungpo, 1037–1101). The poem describes the poet travelling long distances and missing his family. The moon is a comfort to him, because no matter how far apart people are, they are still watching and sharing the beauty of the same moon. Chinese people still carry on this tradition as part of the Mid-Autumn Festival.

Moon Gaze is a contemporary take on this desire for long-distance connectedness through facing one another. With Moon Gaze, one can find, face, and connect with their partner regardless of physical distance.

Moon Gaze creators Finlay Braithwaite and Yiyi Shao face one another at a short distance.

User Experience

With a concept taking shape, we began thinking about how we could create a meaningful experience for users. A primary consideration was making the experience as natural as possible, requiring little to no input from users.
We focused on the experience of two partners facing one another, albeit at a great distance. One output of our system would allow a user to determine if they were facing one another. Interacting with this output would also allow the user to hone in on their partner’s bearing. We also saw meaningful interaction in knowing if your partner was facing you. This feedback would be simpler, only revealing if your partner was facing towards you or not.

With Moon Gaze’s summary objective of partners facing one another, we resolved that the interaction should be connected to the users’ body. The interaction would not be with a screen or input device, rather the interaction would respond to the users’ movements. With this in mind, we proceeded to develop Moon Gaze as a wearable interactive technology.

In making this determination, we also investigated Moon Gaze as an object detached from the body. We thought it could be nice to have a small arrow that pointed towards your partner. We also flirted with the idea of implementing Moon Gaze in a chair; a middle ground between object and wearable.

Proceeding as a wearable, we weighed our options for physical notification. We went back and forth between using a vibration motor or an LED. We felt a vibration motor would be discreet for the user and create an intimate haptic connection between partners. However, we felt that an LED would make for a stronger presentation of our first iteration, and we felt confident in our ability to implement and finesse the behaviour of LEDs.

Moving forward with LEDs, we concluded that the blink rate of LEDs in the wearable would help orient partners to face one another. The faster the blinking, the closer the match. While we felt it could be interesting to know which direction, right or left, you needed to turn in order to find your partner, the math in determining left and right in a 360 degree bearing system was beyond the scope of the first iteration.

A second set of LEDs would turn on or off depending on whether your partner is facing you.

Design

Physical

Design Sketches

To match our concept, we attempted to get as close as possible to the theme of the moon. In deciding to make wearables, we faced a lot of options. There are so many possibilities for wearables; it could be a hat, a t-shirt, a jumper, a pair of trousers. Which one is better? Which kind of material would be the best to play with? What colour should we choose? Where should we put our LED moon? Where should we put our hardware? Imagining, designing, and creating our wearables was a significant portion of the overall project.

Sweaters on the rack at Uniqlo.

We chose to hack knit sweaters for our project, as they allow us to weave a string of LEDs into the knit itself. We went to Uniqlo to find suitable sweaters. In the men's clothing section we found dark grey knits that perfectly mimic the dark night sky, a perfect backdrop for our moons.

LED string woven into sweater.

We began to embed and sew the LED from the sweaters’ insides. However, it wasn’t a good look with only bare LEDs on the outside of the sweater. The light from the LEDs also appeared harsh without diffusion. Ruminating on this for quite some time, we designed a fabric cover for the LEDs but it still didn’t have the look we wanted. We found a bag of feathers in the Digital Futures studio space and they immediately found a home in our design. After playing around with feathers for a while, it seemed to be a perfect material to diffuse the light!

LED feather covering.

Tip: Adding a fabric backing helped cover the wires, making the sweater more comfortable to wear. Also, the LED string won’t be snagged when the user removes the sweater.

LED protective backing.
Moon Gaze wearables ready for presentation.

We wanted the LEDs to be highly visible and felt that individual LEDs would be difficult to implement. At this point, we weren't up for the challenge of working with NeoPixels, although they are the natural progression for our next iteration. We went on a trip to Michaels and found a fancy LED string which is used to decorate Christmas trees. We checked the voltage of the LED string (4.5V) and found it should be fine to work with the Feather M0 board if we connect to the USB pin to get 5V power. We cut off the built-in battery pack from the LED string to gain access to the string's positive and negative leads. As we were using 5V from the USB pin, we had to switch the LEDs using transistors connected to a digital output pin on the Feather.
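On the code side the switching is simple: the digital pin just turns the transistor, and therefore the 5V string, on and off. A minimal blink test, with the control pin number assumed for illustration:

// Minimal test for the transistor-switched LED string.
// Assumed: the transistor is driven from digital pin 5, with the string's
// positive lead on the USB (5V) pin.
const int ledStringPin = 5;

void setup() {
  pinMode(ledStringPin, OUTPUT);
}

void loop() {
  digitalWrite(ledStringPin, HIGH);   // transistor conducts, string lights up
  delay(500);
  digitalWrite(ledStringPin, LOW);    // string off
  delay(500);
}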

Coding

Code on GitHub

The main coding challenge we faced was determining a vector between two locations on earth. In investigating geographic calculations, we determined that we needed to resolve a rhumb line bearing between the two coordinates. A rhumb line bearing is a single bearing that will take you to a destination on Earth. On a flat map, a rhumb line appears as a straight line. However, the earth is not flat like a map. As such a rhumb line is not the shortest distance to a destination. If you consider the earth as a sphere, the shortest distance between two points is a ‘great circle’ route. The haversine formula allows you to calculate the route between two points on a sphere, allowing the calculation of a great circle route on Earth. Despite the benefits of following the shorter great circle route, it cannot be described as a single bearing. If following a great circle route, your polar bearing changes over the course of the route. Sailors often used rhumb lines for that very reason. With a rhumb line you can maintain a constant compass bearing for the entire voyage (‘Keep north on your left and sail till dawn!’). We resolved to use the rhumb line to calculate a single bearing between our Moon Gaze partners.

The math for calculating a rhumb line bearing is the stuff of geographical math textbooks and fairly easy to find on the internet if you know what you’re looking for. We found a goldmine in Chris Veness’ ‘Movable Type’ site Calculate distance, bearing and more between Latitude/Longitude points. This is a live portfolio piece demonstrating his capacity for interpreting complex systems and creating interactive online tools to study them. The material on this site is free to use as long as his copyright is cited in any resulting works.

In addition to having an interface to calculate rhumb line bearings, Veness breaks down both the math and Javascript code.

Rhumb line bearing calculations from ‘Movable Type Scripts’ © 2002-2017 Chris Veness

While all freely available, it was a formidable challenge as relatively inexperienced coders to make sense of the math, evaluate the code, and translate it for the Arduino IDE’s C/C++ compiler. Here’s our translation:

double rhumbDIFF = log(abs(tan(rLAT2 / 2 + M_PI / 4) / tan(rLAT1 / 2 + M_PI / 4)));
 double rhumbBRNGrad = atan2(deltaLONG, rhumbDIFF);
 double rhumbBRNGdeg = rhumbBRNGrad * 180.0 / M_PI;
 headingToPartner = fmod((rhumbBRNGdeg + 360), 360);

Our favourite part of this code is the final modulus function that corrects for a negative bearing. It was an effective introduction into the power of modulus to place a value within a range.

Not knowing what our GPS data would look like, we also created functions to convert between degrees.minutes.seconds, decimal degrees, and radians (required for the rhumb bearing calculation). This math is less specific and available from a number of sources. Our source was Steven Dutch’s site Converting UTM to Latitude and Longitude (Or Vice Versa).
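The conversions themselves are short. Here is a sketch of the kind of helpers we mean, following the standard formulas (written for illustration; the DMS helper assumes positive, i.e. northern/eastern, values):

// Simple conversion helpers, following the standard formulas.
// dmsToDecimal() assumes positive (N/E) values for simplicity.
#include <math.h>

double dmsToDecimal(double degrees, double minutes, double seconds) {
  return degrees + minutes / 60.0 + seconds / 3600.0;
}

double degreesToRadians(double degrees) {
  return degrees * M_PI / 180.0;     // needed for the rhumb line math
}

double radiansToDegrees(double radians) {
  return radians * 180.0 / M_PI;
}

void setup() {
  Serial.begin(9600);
  double dec = dmsToDecimal(43, 39, 11);    // e.g. 43 deg 39' 11" N
  Serial.println(dec, 6);                   // decimal degrees
  Serial.println(degreesToRadians(dec), 6); // radians
}

void loop() {}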

With the rhumb line bearing calculation in hand, we created functions to compare the local heading to the bearing pointing towards the remote partner. We wanted to determine the absolute difference in these values. The lower the difference, the faster the LED blink rate. The main challenge here is the 360 degree system of headings and bearings. 360 degrees equals 0 degrees. With this in mind, a heading of 10 degrees and a bearing of 350 degrees are only 20 degrees apart, yet with simple math they are 330 degrees apart. Both statements are true, but if you're moving from 10 degrees to 350 by rotating 330, you're going the long way around the circle. With this logic in mind, we created a correction for when the absolute difference between heading and bearing is greater than 180 degrees (the long way around the circle).

double blinkCalc = abs(bearingToPartner - localHeadingDegrees);
 if (blinkCalc > 180){blinkCalc = -blinkCalc + 360;}
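From there, the corrected difference (0–180 degrees) just needs to drive the blink rate: the closer to zero, the faster the blink. A rough non-blocking way to do that, with blinkCalc hard-coded and the interval range made up for illustration:

// Rough sketch of mapping the heading difference (0-180 degrees) to a blink rate,
// using millis() so the blinking doesn't block the rest of the loop.
// blinkCalc is hard-coded here; in the real sketch it comes from the comparison above.
const int matchLedPin = 13;
double blinkCalc = 45.0;            // placeholder heading difference in degrees
unsigned long lastBlinkTime = 0;
bool ledState = false;

void setup() {
  pinMode(matchLedPin, OUTPUT);
}

void loop() {
  // Closer match = shorter interval = faster blink. Interval range is a placeholder.
  long blinkInterval = map((long)blinkCalc, 0, 180, 100, 1100);

  if (millis() - lastBlinkTime >= blinkInterval) {
    ledState = !ledState;
    digitalWrite(matchLedPin, ledState);
    lastBlinkTime = millis();
  }
}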

The next challenge was connecting our paired devices to one another on the internet. For this we employed PubNub (pubnub.com) to publish live data from each device. PubNub’s history function was crucial for allowing interaction even if both devices were not online as it stores the last published messages for each channel in a buffer.

For our interaction, each device publishes its local GPS coordinates as well as a local match status boolean that describes if a user is facing the other. The publishing was fairly straightforward using PubNub’s Arduino API code as massaged by our professors Nick Puckett and Kate Hartman.

Appreciating that a local user might not have a GPS fix during a session, we created a ‘self read’ function that pulls back the last known GPS coordinates of the device for use in the bearing calculations. If a GPS fix is achieved, the current data is published and used in updated bearing calculations.
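For a sense of what each device sends, here is an illustration of assembling the outgoing message. The field names and the publishToPubNub() placeholder are ours for this sketch, not the exact payload or helper used in the project.

// Illustration of building the published message: local coordinates plus a
// boolean describing whether this user is currently facing their partner.
double localLat = 43.6532;       // last known latitude (placeholder)
double localLon = -79.3832;      // last known longitude (placeholder)
bool facingPartner = false;      // local match status

void publishToPubNub(const char *message) {
  Serial.println(message);       // placeholder for the real publish call
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  String message = String("{\"lat\":") + String(localLat, 5) +
                   ",\"lon\":" + String(localLon, 5) +
                   ",\"facing\":" + (facingPartner ? "true" : "false") + "}";
  publishToPubNub(message.c_str());
  delay(5000);                   // publish every few seconds
}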

A key challenge was the 'blocking' element of the Arduino PubNub code. While reading data from a subscribed PubNub channel, everything else in the loop function temporarily stops. This includes sensing magnetometer data and our blinking LEDs. This behaviour is so irksome that we are looking into other data publishing solutions for our second iteration, including using the Particle Photon instead of the Adafruit Feather M0.

Another requirement of our coding was integrating the Adafruit GPS and SparkFun Magnetometer (compass) sensors. However, both companies provide expansive documentation with example code, making the integration of these devices fairly straightforward.

Privacy and Safety

It is reckless to publish one’s GPS coordinates if unsure about the identity and motives of that data’s audience. For future versions of this project, all bearing calculations would happen server-side, negating the need to share coordinates between users.

Electronics

Moon Gaze provided an exciting opportunity to explore GPS and Magnetometer sensors which neither of us had used before.

Moon Gaze electronics overview.
Adafruit Ultimate GPS Featherwing

The FeatherWing made for easy integration with our Adafruit Feather M0 microcontroller. It is designed to stack on top of the Feather, connecting easily to the default pins.

The GPS unit connects via a serial connection, and the example code makes it easy to pull a GPS coordinate from the device. However, we struggled most with getting a GPS fix in the Toronto downtown core and, as mentioned, adapted our code to operate without a GPS fix by pulling the last known coordinates from PubNub on boot.

Link: Adafruit Ultimate GPS Featherwing

SparkFun HMC5883L Magnetometer

This little sensor board provides the local compass heading required to determine if you are indeed facing your partner. This device connects via I2C and the provided documentation and examples made this fairly straightforward to integrate.

Link: SparkFun HMC5883L Magnetometer

Future Developments

We see a lot of potential for future iterations of this project.

As a two-week crash project, we’re incredibly happy with our results, but are already planning next steps based on our team’s observations as well as feedback from our colleagues.

The first refinements would come in developing the physical wearable further. Scaling the electronics down and housing them in a wearable enclosure would make Moon Gaze a more natural experience. We are open to exploring different types of wearable notifications for Moon Gaze. We are interested in testing haptic feedback such as vibration motors for a future iteration.

Unhappy with problems inherent with PubNub’s Arduino implementation, we are looking at other data publish/subscribe options. Adafruit IO is top of the list as it could be more tightly integrated with the Feather M0. Another alternative would be to move to the Particle Photon which features built in networked functions for device-to-device variable sharing.

Moon Gaze could be further developed by adding a web component that displays connections for each relationship. On a meta level, this could be an aesthetic data visualization of all the connections on Earth. As our cohort mentioned during our presentation, it could also be a useful product for people finding their friends at a large event such as a music festival. We are mindful of our users' privacy, so we will be very careful about this aspect in any future developments.

Video Demo

 

Warm Thoughts

By Quinn Rockliff & Emma Brito

Project Description

Warm Thoughts allows friends to offer comfort to one another through the warmth of a remote hug. It operates with one friend pushing an inconspicuous button in their wallet which will then warm their friend’s matching wallet. It is a subtle way to offer support and warm wishes to one another while also letting them know that they aren’t alone. The idea is that this can be done from anywhere so that the warmth can be a spontaneous surprise. This project facilitates communication without the need for words even at a distance.

GitHub Code

https://github.com/quinnrockliff/WarmThoughtsExp4

Process Journal

Brainstorming and Beginnings – Initially we were thinking that this project would be a good way to expand on the ideas Quinn was working with in experiment 3. We would be able to create a supportive interaction between two people. We liked the idea of creating a warm and friendly interaction that could offer support remotely. We quickly ended up on the idea of using a heating pad to do this because of the therapeutic functions heat can offer physically and mentally. We were playing with the idea of pressure points but ruled them out because of the problems they posed in regards to wear. We still wanted to maintain the calming quality of the project and decided to try using the heating pad to warm up essential oils. These oils would create a relaxing environment with their aroma once heated, something that would be instigated from one friend to another.

Initial Idea – The full idea that we decided to pursue is one where both people would be receiving and publishing signals through PubNub. The whole interaction would be initiated by the first person pushing a button because they are in need of comfort and support from their friend. This push of a button would light up an LED on the friend's device so that they could in turn push a button to heat up the pad. This would then warm the essential oils of the first person, so that their environment would become more comforting thanks to the thought and action of their friend. The idea was that the LED and button circuit would be portable so that it could be reached wherever they are, while the essential oils and heating pad would have to be stationary.

Wiring and Code – We decided to first get started with wiring each individual circuit and testing it with sample code before having the devices speak to each other. The buttons and LED we were able to get up and running without too much issue.

img-2066

Button we used in the beginning

We tested the heating pad's circuit with a circuit and code used for a fan/motor. It required more voltage, so we had to add an additional power source. Once the fan started running we replaced it with the heating pad.

img-2073

The two circuit boards

Once all of the pieces were working separately it was time to try to get them to speak to each other. We used PubNub examples as a starting point for our code.  We had to alter them according to whether a light or motor was being triggered and different code was used on either side of the interaction. We got the button to LED light code working as well as the button to motor code, but had not written them together.

img-2068

LED being lit through PubNub

Our next goal would be combining the two.

Button Changes – One issue that we encountered with the button is that while it triggered the reaction on the other end, it did not hold it. For instance, in order to  properly heat the pad the button would have to be held down for a long period of time. Needless to say, this isn’t very user friendly. We decided to look at switches and other options that would be able to hold an “on” value. We looked at traditional switches but decided on self-lock switches that stay down when pushed, and are released when pushed again. This way the user does not have to physically hold the button down for the entire interaction. This meant that we had to rewire the button slightly, but were able to use the same code. There are few references for this online, but eventually we were able to find that there is a specific prong to draw the power to.

img-2084

Our new self-locking switch

Issues and Challenges – The big issues began to arise when it came time to merge the codes for the two PubNub interactions. We wrote “readFromPubNub” and “publishtoPubNub” in the Void Loop section of the code so that it is easily called on and that there is a stream of information when reading. We included the value information and if-statements in this section as well. We found a number of syntax errors after this point and had to spend some time cleaning up the code.

Upon going to office hours, it was suggested that we not constantly publish a value to PubNub because it can overwhelm the system. Instead, we were told to trigger a response in each other's device. We liked this idea a lot, but it turned out to be easier said than done. We adjusted the code accordingly for each breadboard, but they were not responding to each other properly. The buttons would get stuck on a "1" or "0" value and not switch when pushed the way they had before. The LED and heating pad would also turn on occasionally, but not from the buttons. We could not figure out what was triggering them. We tried putting the old code back in as well, with the printed values at a delay. The buttons became more reliable with this, but still were not triggering the motors or LEDs. We also re-wrote the if-statements slightly to include an "else", but there were no changes.
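In hindsight, the pattern we were reaching for is probably plain edge detection: remember the previous button state and only publish when it changes. A sketch of that idea, with the pin number and publish helper as placeholders rather than our final code:

// Sketch of the "only publish on change" idea. Pin number and the
// publishToPubNub() placeholder are for illustration only.
const int buttonPin = 6;        // self-locking switch input
int lastButtonState = LOW;

void publishToPubNub(int value) {
  Serial.println(value);        // placeholder for the real publish call
}

void setup() {
  pinMode(buttonPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  int buttonState = digitalRead(buttonPin);

  // Publish only when the switch actually changes position, not on every loop.
  if (buttonState != lastButtonState) {
    publishToPubNub(buttonState);
    lastButtonState = buttonState;
    delay(50);                  // crude debounce
  }
}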

Final Iteration & Prototyping – Due to the interaction that is encouraged with this project, there were a couple features we wanted to pursue in our final prototype. The first was that it can be easily carried around with someone on a daily basis, so that it can be used anywhere. Ideally it is useful even when not triggered. For instance, we thought that the button would be great if a part of a pencil case or wallet. This way it is handy and inconspicuous. For the heating pad, we played around with the idea of a scarf, a shirt, and even an eye mask. Ultimately we chose a matching little purse because it is one of a set, and can also be easily carried around while still offering warmth and support. We chose to get smaller breadboards to facilitate this as well.

img-2102

One of the matching change purses

img-2100

Breadboard for prototyping with trimmed wires

We thought that a rechargeable battery pack would be ideal, but they were sold out everywhere we checked. We decided that an AA battery pack would work as an alternative. We tested them out during our prototyping, but they started to spark and melt the wires. Because of this we decided to use our computer cables instead.

img-2101-1

The “X” to mark the button on one. The warm one has a heart to distinguish it.

Sketches, Design Files, and Photographs

img-2075

Brainstorming how to wear the items – t-shirt, eye mask, and necklace

img-2077

Stationery items – bedside aids

Project Context

As mentioned above, our initial inspiration was a companion piece to Quinn's experiment 3 project, in which she was able to push a button when she felt triggered by content seen online. Here we wanted to create an interaction that offers support to an individual who is going through a stressful period. It was significant that the heat is offered by another person, in order to emphasize that they are not a burden and that they are not alone in their experience. The heat is supposed to reference the warmth of a hug, even if it is at a distance.

Fritzing Diagrams

4

Button Breadboard

3

Heating Pad Breadboard

Video of the Device Being Used

24203501_10155940572809490_1414965372_n

The two final prototypes on display.

References

This heated pillow helped explore different forms that we could use the heating pad in. The pillow could be used to comfort a kid, as well as keep them warm.  https://www.smokonow.com/collections/pillows/products/foxyl

2

1

Comforting Items

PubNub Code: Creation and Computation GitHub Examples – Feather

Button Code: https://www.arduino.cc/en/Tutorial/Button

The Felt-Soothing Device

EXPERIMENT #4

Title: The Felt-Soothing Device

screen-shot-2017-11-23-at-8-21-44-pm

Team Members: Dikla Sinai & Ramona Caprariu

Project Prompt: For this project you will design and create a custom device that allows the members of your group to communicate in new ways.  Using the Wifi connection of the Feather M0, you will create standalone objects that communicate over the internet.  Each group must consider how these devices create a specific language that allows the things, people, or places on the ends of the network to communicate.

Project Description: The Felt-Soothing Device allows a parent to communicate with their child in the other room – as a way to trigger a soothing song that would help the child resume sleep if they are crying. We envision it coming with different felt options for casings so that each user can customize their paired devices in whichever way they want. 

Process Journal:

Day One

When we first met, we decided first to start brainstorming on the cons and failings of communication in our modern world, to see if we could target an avenue to address further in this assignment. We had initially set out to try and create a set of devices that would help foster easier communication.

screen-shot-2017-11-23-at-7-17-55-pm

We quickly figured out that we wanted to tackle an existing kind of social interaction. We first thought of the radio silence that is so often associated with first meet-ups/impressions, when there is awkwardness between people and they are cautious and nervous. Perhaps something that could function as a descriptive facilitator for awkward first conversations? But figuring out a specific enough way to create this device was too difficult for our time and skill constraints. Then we had the idea to make something targeted towards individuals with a hearing impairment communicating with somebody with a vision impairment. We conveniently found an Arduino project with this exact intention: https://create.arduino.cc/projecthub/skyseeker/deaf-blind-communication-with-1sheeld-arduino-bb3362?ref=tag&ref_id=communication&offset=65

Day Two

The next step was to create user flow diagrams so we could conveniently map out the different components we would need. Especially with how there are multiple channels and flows, developing a clear idea of how the devices would work was essential.

screen-shot-2017-11-23-at-7-45-46-pm

We brought up the idea with Kate and she advised us to hone in on a particular aspect of our overall idea because incorporating an Adafruit LCD screen + keyboard and then also a voice recording module would just be way too complicated and unattainable. She told us to focus on:

  • Visually impaired individual- input is button press and output is different vibration sequences
  • Hearing impaired individual – input is button press and output is different LEDs

This new iteration would then have focused on facilitating the start of a conversation between these two individuals. It would be a way to help these two people initiate a conversation with one another.

Day Three

We got more specific in terms of the definition of the outcome we want to achieve. We realized that what is most important to focus on in this prototype is the very basic first interaction between our two users.

And so we started working on the code for each device:

User type A – Hearing disability – should have a button to send a message and vibration to feel that they’ve received one.

User type B – Vision disability – should have a button to send a message and LED to see that they’ve received one.

We started researching design inspiration. We decided that we want to create a bracelet so that people can wear it as a fun accessory. The idea was to wear it on your wrist so it would be visible and close to a pulse point, so you can feel and be aware of it too.

We went shopping for all the electronic parts – more buttons and lithium-ion polymer batteries.

Day Four

We spent the entire day trying to figure out how to solve our two main problems:

  1. We keep getting ‘client error’ and ‘message read error’ notifications constantly on Ramona’s device. The error appears after we use the button 3-4 times, and then we need to re-upload the code to have it work again, because it won’t snap back into the rhythm of working for us.
  2. We managed to send values from both devices to PubNub but we couldn’t make them operate each other’s devices.

screen-shot-2017-11-23-at-8-09-12-pm

screen-shot-for-blog

After meeting with Nick we decided to focus on only one side (sending a message on a button press from one of our feathers and having that operate the other device).

Day Five

We’re still having the ‘client error’ message!  We tried to search this on Google and on forums for any kind of solution, and we could not find one. We also kept asking around the 6th floor studio space if other groups were having this issue, but we could not come to any long-term solution.

We decided we would have to switch to another output component since we were experiencing some issues with the vibration motor. We decided to switch to a speaker and had a quick chat with Kristy and Tommy to get their help (since they were using one).

screen-shot-2017-11-23-at-8-19-06-pm

screen-shot-2017-11-23-at-8-19-18-pm

We then started thinking about the new design concept and intentionality and decided we were going to make a device that would allow a parent to communicate with their crying/napping child in the other room – as a way to trigger a soothing song that would help the child resume sleep. We brainstormed attractive designs for this purpose and settled on creating felt sleeves with cute colourful designs fit for a young child.

screen-shot-2017-11-23-at-8-31-48-pm

We proceeded to add a tune library to the Arduino code to make the speaker work. We found a lullaby tune that would fit the idea.

We managed to make one device send ‘0’ to PubNub and the other device receive ‘0’ and operate the speaker. The problem we had now was that the first device was sending ‘0’ constantly, so the speaker was playing the tune over and over again.

We thought we could solve it by adding an ‘if’ statement, so that the first device would send ‘1’ regularly and, on button click, send ‘0’, which would make the speaker play the tune. But no matter what we tried we couldn’t make it work.

We scheduled another meeting with Kate and started to work on the design.

We created 2 decorative felt sleeves to cover the devices.

Day Six

‘Client error’ message is still there 🙁

We continued working on our code:

We changed the button value to a boolean (true/false) for the button click. We then defined that every time we click the button it will send “true”, and the other device will receive the “true” and be triggered to play the tune.
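
For illustration, here is a minimal sketch of the send side of that logic. It assumes the button is wired to pin 12 with a pull-down resistor and uses a placeholder publishToPubNub() helper in place of the class PubNub/Feather example code; detecting the rising edge of the button is what keeps “true” from being sent on every pass through loop().

```cpp
// Minimal sketch of the send-side logic described above (assumptions:
// button on pin 12 with a pull-down resistor; publishToPubNub() is a
// stand-in for the publish call from the class PubNub examples).
const int buttonPin = 12;
int lastButtonState = LOW;

void publishToPubNub(const char *message) {
  // Placeholder: in the real project this is where the Feather M0
  // publishes the message to the shared PubNub channel.
  Serial.print("publish: ");
  Serial.println(message);
}

void setup() {
  Serial.begin(9600);
  pinMode(buttonPin, INPUT);
}

void loop() {
  int buttonState = digitalRead(buttonPin);
  // Only publish on the transition from not-pressed to pressed,
  // so "true" goes out once per click instead of every loop.
  if (buttonState == HIGH && lastButtonState == LOW) {
    publishToPubNub("true");
    delay(50);  // crude debounce
  }
  lastButtonState = buttonState;
}
```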

We also tried to fix the client error. Again, no luck. We decided to try switching feathers for the send/receive functionality and test to see if we were still getting the same message.

We took video documentation of the two working together also!

Photo Documentation:

screen-shot-2017-11-23-at-8-34-01-pm

screen-shot-2017-11-23-at-8-33-55-pm

screen-shot-2017-11-23-at-8-33-46-pm

screen-shot-2017-11-23-at-8-33-32-pm

Video Documentation:

Fritzing/Circuity Diagrams:

wiringnov23

dikla2

Code: https://github.com/LolaSNI/feltsoothingdevice

 

Morsecode Messenger 2.0

By Feng Yuan and Roxanne Henry

Project Description

The project is a continuation of Feng’s experiment 3 morse code project. In this iteration, we configure two feathers to be able to send and receive morse code signals from one another. The input remains the same, but the output now goes to an on-board OLED display, which displays the signal’s letter equivalent.

Design Sketches

The original plan was a portable wireless Morse code sender. Feng and Roxanne imagined this device as a bracelet with an LED screen and several buttons attached. The batteries and feather board would be hidden inside the bracelet. The buttons would be used to enter the Morse code signals, and the screen would be used to display the received message (as in the image below).

img_2277

Because of the limitations of time and equipment (a sewing machine would have sped up the crafting process), they switched their idea and decided to make a tabletop, paper-made morse code device. This device would still include two parts: a button board (to input the message) and a screen (to output the signals). They decided to use white paper board. The final result would be white, clean and neat.

img_2280

Circuit File

experiment-4_bb

Here is the link to the video

https://vimeo.com/244580952

Here is the link to the code

https://github.com/rh11lp/rh11lp.github.io/blob/master/experiment%204/sketch_nov08b.ino.ino

Process Journal

Step One: The first step of the project, ideation, had essentially already been done in the previous experiment when Feng and Roxanne decided to reuse the idea and add networking to it. The biggest hurdle they encountered in the ideation phase was the physical design of the finished product. This would not affect the code necessary to get the product functioning, however, so Feng and Roxanne put that aside for the beginning.

img_2247

The first step to getting the code working was to reexamine the way Feng’s original code functioned. It wasn’t too challenging to transfer the functionality from publishing results via serial to publishing to PubNub. There was some consideration about whether or not Feng and Roxanne wanted to send whole words at a time, but in the end the decision was to keep the status quo: single letters were sent per package, in order to maintain the spirit of morse code messaging and for simplicity’s sake.
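
As an illustration of the “one letter per package” decision, here is a small sketch of how a buffered dot/dash sequence could be mapped to a single letter before being published. The lookup table and helper names are assumptions for the example, not the code from the repository linked above.

```cpp
// Illustrative only: turning a buffered dot/dash sequence into a single
// letter so it can be sent as its own message, per the decision above.
struct MorseEntry { const char *code; char letter; };

const MorseEntry MORSE_TABLE[] = {
  {".-", 'A'}, {"-...", 'B'}, {"-.-.", 'C'}, {"-..", 'D'}, {".", 'E'},
  {"..-.", 'F'}, {"--.", 'G'}, {"....", 'H'}, {"..", 'I'}, {".---", 'J'},
  {"-.-", 'K'}, {".-..", 'L'}, {"--", 'M'}, {"-.", 'N'}, {"---", 'O'},
  {".--.", 'P'}, {"--.-", 'Q'}, {".-.", 'R'}, {"...", 'S'}, {"-", 'T'},
  {"..-", 'U'}, {"...-", 'V'}, {".--", 'W'}, {"-..-", 'X'}, {"-.--", 'Y'},
  {"--..", 'Z'}
};

char decodeMorse(const String &sequence) {
  for (unsigned int i = 0; i < sizeof(MORSE_TABLE) / sizeof(MORSE_TABLE[0]); i++) {
    if (sequence.equals(MORSE_TABLE[i].code)) {
      return MORSE_TABLE[i].letter;
    }
  }
  return '?';  // unknown sequence
}

void setup() {
  Serial.begin(9600);
  Serial.println(decodeMorse("-."));  // prints 'N'
}

void loop() {}
```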

img_2250

Step Two:

Feng and Roxanne experimented with various output methods: a standard LCD screen, a piezo buzzer, LED lights, and an OLED screen.

  • The standard 16×2 LCD screen is too big for this project.
  • The piezo buzzer can only make a tone and its volume is very limited; a morse code buzzing noise doesn’t make the message easily “readable”.
  • LED lights look delightful, but also can’t make the signals readable.

After testing all these methods, they found the OLED FeatherWing the best match for the project. The OLED screen could easily be attached to the feather board, and both int values and string values could be displayed on the screen. Based on the results of the tests, they chose the OLED FeatherWing as the output section.

img_5546

Following the implementation of the publishing function and decisions about button layout, Feng and Roxanne looked into subscriptions. At first, it seemed to be doing fine. The tests were scripted and didn’t leave much room for error.

  1. Load program on each device
  2. Send from device A
  3. Receive on device B
  4. Send from device B
  5. Receive on device A

The trouble arose when device B, which used a timer to activate the subscription function, would try to receive a message before device A had managed to send one. Roxanne initially suspected the Feather of running out of memory and experimented with adjusting the buffer size, as well as the timing of memory allocation for the messages.

block

Of course, none of these things were the problem. The issue was with the actual subscribe function. PubNub’s official documentation describes the functionality as such:  

“Listen for a message on a given channel. The function will block and return when a message arrives.”

This means that whenever the server had nothing to provide as far as a new message went, the feather’s program would essentially hang, and wait, and wait, and wait, until PubNub answered with something. This would happen despite the timeout being specified. Roxanne suspects there is a bug in the API and is very upset about this.

img_2252

So, Roxanne and Feng decided to implement the subscription function to be activated on button-press. They used the built in button B on the OLED wing to accomplish this.
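
A rough sketch of that workaround is below. It assumes button B sits on pin 6 (as on the 128x32 OLED FeatherWing with a Feather M0) and uses a placeholder checkPubNub() in place of the actual blocking subscribe/read call.

```cpp
// Sketch of the workaround described above: only call the blocking
// subscribe when the OLED wing's built-in button B is pressed.
const int BUTTON_B = 6;   // assumption: OLED FeatherWing button B on a Feather M0

char checkPubNub() {
  // Placeholder for the PubNub subscribe + read call; this is the part
  // that blocks until a message arrives.
  return 'A';
}

void setup() {
  Serial.begin(9600);
  pinMode(BUTTON_B, INPUT_PULLUP);   // wing buttons read LOW when pressed
}

void loop() {
  if (digitalRead(BUTTON_B) == LOW) {
    char letter = checkPubNub();     // only block while the user asks for it
    if (letter) {
      Serial.print("received: ");
      Serial.println(letter);        // the real sketch draws this on the OLED
    }
    delay(200);                      // simple debounce / rate limit
  }
}
```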

Step Three:

The final step of the project was to build the physical product. While deciding the layout of the wires, Feng and Roxanne decided it would be a good idea to try and conserve space by attaching both the ground and input wires directly onto the resistor. Upon testing this model, they found that this did not work.

One solution they tentatively tried was to use copper tape instead of wires for the input pin connection. They discovered that the copper tape they had acquired was not conductive on both sides. Since the layout required a lot of turns, it was impossible to create a path without overlapping the tape. In the future, they are certain copper tape with conductive adhesive would be a better choice.

img_2260

So, Roxanne desoldered all the fancy Ys she had put together and mournfully put the input wire on the opposite side of the button where it belonged.

img_2262

Feng and Roxanne chose to use hard white paper board as the base of the device and thin white paper to make the case hiding the feather board and wires.

1. Measure the sizes of the board and button.

 

6a53c451-8d88-48d0-8949-d1ce4b9230c7

e3ea878f-9b24-417a-bda1-e04585075820

2. Lay out the button position and board position on the hard paper board.

 

img_5551

3. Make the box and stick it onto the paper board.

4. Cut out the button holes to make space for the buttons.

 

img_2258

img_5552

5. Organize the wires and make them orderly

6. Connect the buttons with the board

 

img_5553

img_2263

7. Hide the wires and close the box.

screen-shot-2017-11-26-at-11-28-34-pm

 

Mansplainer

Experiment #4: Mansplainer

Kristy Boyce and Tommy Ting

The Mansplainer in Situ.

Mansplainer is an emergency button for use in cases of “mansplaining.” Derived from the Latin, Mannus Interrupticus, it is the ancient art of a man (often white) explaining to a woman and/or person of colour how to better do, be, etc.

The button, which is connected to WiFi, triggers a WiFi speaker system to play the soon to be hit track “Mansplainer”.

A lot of our feelings about why one might need a “Mansplainer” station can be summed up in the video “The Handmaid’s Tale For Men”

https://youtu.be/ciPszqk703k

“This is the story of Manfred, a man just trying to survive in a world under the harsh rule of the feminazis”

 

Circuit Layout

Featherwing on Feather M0, then out to speaker (the speaker is connected to the featherwing with an audio cable; Fritzing did not have a part for audio cables)
Feather with large LED button

Code

https://github.com/livefastdynasty/Mansplainer

 

Supporting Visuals

Mansplainer img_3603
Process Journal

Day 1 – Brainstorming

We brainstormed a few initial ideas on our first day. We first looked at ideas that inspired us, such as the methane gas detector from last year. We were both drawn to ideas that were ridiculous and added to the “internet of shit”. We then asked ourselves, “what do we hate?” and came up with a list of pet peeves, most of which were masculine traits that we thought were disruptive to our daily lives. Two things really stuck – Mansplaining and Manspreading. Mansplaining was very well documented and succinctly described in Rebecca Solnit’s book, Men Explain Things to Me (2015).

We briefly described what these two prototypes would look like. A “Mansplainer” would be a kind of walkie-talkie system where, if the system detected a “female” higher pitch, it would garble up the speech into gibberish on the other end, but if a male or lower-pitched voice were speaking into the system, it would freely allow the speech to go through. A “Manspreader” would use a flex sensor down the legs and a force resistor on the bum. When a person sits down and spreads their legs on public transport, a shock would be sent. We let these two ideas sit until we met up again.

Brainstorming board

 

Day 2 – Finalizing idea

We decided on the Mansplaining walkie talkie idea. We spoke with Kate and she suggested that we simplify the idea, and we came up with a button trigger system instead, since we have to use WiFi to connect the devices and not radio. We quickly settled on this new idea and looked into the things we had to purchase, which included the Music Maker FeatherWing and a big button.

Sketching out interaction.
Rough sketches of button and speaker design.

 

Day 3 – Testing with codes and simple button, LED light and piezo speakers

We first played with two sample codes to ensure that we had a communicating system between two feathers. We adapted the PubNub samples from Nick and the LED samples from experiment 1. We changed parts of Nick’s code so that the button only generated either a 0 or a 1. We successfully used the button to turn the LED light on and off. As you can see in the video documentation, there was a delay between the button press and the LED lighting up, but we figured that was the WiFi connection. We also noticed that after a few attempts, “client error” would show up on the button side’s serial port.

We then switched the LED light out for a piezo speaker instead. This is essentially the very basic version of what our final project would look like: a button triggering some sort of tone/music. The piezo speaker required a pitches library, and we got it working by adapting the previous LED code with some sample piezo speaker code. Neither of us had done a lot of Arduino coding in our previous group experiments, so we were pleased with our ability to get a very simple system going on our first attempt.

screen-shot-2017-11-25-at-3-46-47-pm

Testing with piezo speaker
Button testing, the large push button LED was in the mail

 

Day 4 – Music Maker Feather Wing and LED Button, 3D Printing

We wanted our system to play music rather than an alarm, so we purchased the Adafruit Music Maker Feather Wing from an online shop based in Montreal. The music maker feather wing has a microSD slot and a 3.5mm audio jack slot to connect to a speaker that goes on top of the feather M0.

We also bought a bigger button that has an LED light that turns on when pressed. First, we ran into some problems with the music maker microSD formatting requirements, but once we figured that out we were able to successfully play a track using their example code. While the new button was being wired up, we combined the existing code for the piezo speaker with the new example code from the music maker library.

screen-shot-2017-11-25-at-3-47-09-pm

We removed all the code relating to the piezo speaker and pasted in the music maker code. In the music maker example code, the play music function was sitting in the void setup section, which we mistakenly did not catch during the first few attempts. Because the play music function was in setup instead of loop, the track would just play regardless of the button. There was also a lot of unnecessary code in the example, which we either removed or disabled. Once we put everything in the right place, we were able to use the button to trigger the music maker to play our track.
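
Below is a hedged sketch of that fix, with the play call living in loop() and gated on the incoming button value. It assumes the Music Maker FeatherWing’s default Feather M0 pins (check the Adafruit example for your board), a track named /track001.mp3 on the microSD card, and a placeholder readButtonFromPubNub() standing in for the PubNub subscribe/parse code.

```cpp
#include <SPI.h>
#include <SD.h>
#include <Adafruit_VS1053.h>

#define VS1053_RESET  -1   // reset not wired on the FeatherWing
#define VS1053_CS      6
#define VS1053_DCS    10
#define VS1053_DREQ    9
#define CARDCS         5

Adafruit_VS1053_FilePlayer musicPlayer =
  Adafruit_VS1053_FilePlayer(VS1053_RESET, VS1053_CS, VS1053_DCS, VS1053_DREQ, CARDCS);

int readButtonFromPubNub() {
  // Placeholder: returns 1 when the other Feather reports a button press.
  return 0;
}

void setup() {
  Serial.begin(9600);
  musicPlayer.begin();
  SD.begin(CARDCS);
  musicPlayer.setVolume(20, 20);   // lower numbers are louder
  // Note: no playFullFile() here -- leaving it in setup was the original mistake.
}

void loop() {
  if (readButtonFromPubNub() == 1) {
    musicPlayer.playFullFile("/track001.mp3");  // blocks until the track ends
  }
}
```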

The LED push button worked pretty much like our arcade button that we had been using for testing, except that it had two more attachments for power and ground to the light and a self-contained resistor within it.

We wired the button so that it stayed on continually until pressed. We did find this interaction a little fast for the eye to catch, and in future would most likely program in a blink to the beat of the music upon button press.

Testing LED button sending to featherwing on press, over wifi.
Large LED push button testing

 

 

Additionally, we spent several days attempting to 3D print our speaker encasement, but the print job failed 7 times before all of the 3D printers in the lab broke.

Success! The featherwing music maker worked!

Fusion 360 file that never turned into a speaker

 

 

Day 5 – Crafting Day and Recording

We decided to riff on the song “Maneater” by Nelly Furtado to create our new song “Mansplainer”. We wrote new lyrics for the chorus, recorded them, and combined them with an instrumental track of “Maneater”.

“Mansplainer,

cut you right off

with his man thoughts

can’t you please just shut the fuck up?

He’s a mansplainer,

make your life hard,

give you bad scars,

wish you never ever met him at all.”

https://www.youtube.com/watch?v=Pvr_ivXcuok

We chose a faux concrete finish spray paint to cover the wooden boxes for the button and speaker enclosures.
We hand painted wooden letters for the signage.
Speaker in the enclosure with USB power supply.
The speaker mesh was hand painted red and sculpted from a sheet of wire mesh into a circular shape.
Final prototype at critique

 


The Mansplainer in action at critique

Project Context

As people who have both experienced toxic masculinity, we wanted to make a project that explored and poked fun at the things we find irksome in terms of western presentations of masculinity.

There is statistical data showing the number of times a female doctor, for example, is interrupted versus her male counterpart. A look at many American TV stations will show male guests interrupting, talking over, or speaking in a patronizing manner to female anchors. In the 2016 U.S. election, Donald Trump repeatedly interrupted Hillary Clinton during her speaking allotment, then complained he hadn’t been given a fair amount of time to talk. While we see this as a serious problem, we also recognize that mansplaining can often be well-intentioned, so we wanted to approach the project with humour and a light touch, while still underscoring the issue at hand.

We took inspiration from different forms of satire and parody like Kate McKinnon’s Justin Bieber Calvin Klein ad and Saturday Night Live’s classic “Bag O’ Glass sketch” as well as their fake products like “Oops, I crapped my pants” and “Woomba.”

Bibliography

Solnit, Rebecca. Men explain things to me. Haymarket Books, 2015.

Internet of Shit – Reddit subreddit

https://www.reddit.com/r/theinternetofshit/

 

Calvin Klein Ad – SNL

https://youtu.be/OXvo6ksBHnI

 

Oops, I crapped my pants

https://youtu.be/iUP3PMLdoOs

 

Woomba – It’s a robot and it cleans my business

https://youtu.be/gqesEYUXr78

 

A Cultural History of Mansplaining

https://www.theatlantic.com/sexes/archive/2012/11/a-cultural-history-of-mansplaining/264380/

 

A look at the science behind one of the Internet’s favorite new words.

https://www.youtube.com/watch?v=t7GUjKv9qSI&feature=youtu.be

 

Large LED push button tutorial:

http://play.karlssonrobotics.com/tutorials/circuits/wire-big-dome-button/

 

Nelly Furtado’s cover of “Maneater” with the vocal removed:

https://youtu.be/bDoTqATJL6c

We used the PubNub sample setup from Kate Hartman and Nick Puckett’s class as well as their provided Arduino examples as a base for our code.

 

Responsive Selfie Light

Roxanne Baril-Bédard and Max Lander

We set out to make a good selfie light whose color is changed by a color sensor.

Code

https://github.com/naxwell/Selfie-Light

Photographs

dscf9258

img_20171123_152414

img_20171123_152425

img_20171123_211013

24139229_10155852001609437_1164711792_o

selfie3

dscf9255

 

Schematic

featherwing

24139150_10155852474139437_1710695805_o

Design File

24007946_10155846721349437_1842661338_o

Video of the device being used

Process journal

In the beginning we had some trouble coming up with an idea we both wanted to go with, so we decided to look at the sensors and outputs and see what we both felt excited about using. Max was pretty into trying out one of the featherwings, since that seemed like an accessible way to get into something that otherwise might be a little more challenging. Roxanne was interested in working with the colour sensor (RGB Color Sensor with IR filter and White LED – TCS34725), so we decided to think about something that could use a colour sensor on one end and something that could use that colour information on the other end, settling on the neopixel featherwing.

Setting up the feather wing was quite simple since there is lots of support on adafruit for it.

Neopixel featherwing set up and info we used: https://learn.adafruit.com/adafruit-neopixel-featherwing/overview

Neopixel uberguide used to install the library and get it all sparkly:

https://learn.adafruit.com/adafruit-neopixel-uberguide

Copying the relevant bits into one of the class examples used to connect to PubNub let us use the colorWipe function from the neopixel example as our light-up animation. From there we set the other feather up to send three random values between 0 and 255 (together giving us a random RGB value to output to the feather), so that we could be sure that the two feathers were speaking to one another.
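
For illustration, here is a minimal sketch of that receive side, assuming a 32-pixel NeoPixel FeatherWing on pin 6 (the wing’s default) and a placeholder readRGBFromPubNub() in place of the class PubNub example code; colorWipe is adapted from the NeoPixel examples.

```cpp
#include <Adafruit_NeoPixel.h>

#define PIXEL_PIN   6    // assumption: the wing's default data pin
#define PIXEL_COUNT 32   // 4x8 NeoPixel FeatherWing

Adafruit_NeoPixel wing(PIXEL_COUNT, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

bool readRGBFromPubNub(int &r, int &g, int &b) {
  // Placeholder: in the real project these values are parsed out of the
  // incoming PubNub message. Here we just make up a test colour.
  r = random(256); g = random(256); b = random(256);
  return true;
}

// colorWipe from the Adafruit NeoPixel examples: fill one pixel at a time.
void colorWipe(uint32_t color, int wait) {
  for (int i = 0; i < wing.numPixels(); i++) {
    wing.setPixelColor(i, color);
    wing.show();
    delay(wait);
  }
}

void setup() {
  wing.begin();
  wing.setBrightness(60);
  wing.show();  // start with all pixels off
}

void loop() {
  int r, g, b;
  if (readRGBFromPubNub(r, g, b)) {
    colorWipe(wing.Color(r, g, b), 20);
  }
  delay(500);
}
```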

We thought it would be rather painless to use the sensor. However, we noticed many of the colors were muddy. When we went to Kate, she told us we had two options, which we both considered and tried: fudging/pumping the numbers on the receiving end, or calibrating the sensor.

If we fudged the numbers on the receiving end instead of calibrating the color picker, we would lose a range in the breadth of colors because they would be exaggerated and estimated. We tried to calibrate the sensor by resetting the max values for all of its colors: R, G, B and Clear, though we are still unsure what Clear is.

We found a forum post (https://forums.adafruit.com/viewtopic.php?f=19&t=72140&sid=a45c405777f4715b60ba78e4f388ac32 ) that went into trying to write calibration code using the analog sensor calibration example that comes with the IDE.

We tried to read the colors (red, green, blue) for calibration off a computer screen and it absolutely did not work, so we resorted to printing a sheet with the colour. The printed colours were not that good either, so we bought some cardboard. Roxanne was really hung up on getting the perfect colours. We then put everything in boxes: the neopixels in a plastic ball (we found this at Michaels and it’s apparently used to make custom Christmas ornaments, which Max finds to be a horrific thought), and for the colour reader end, an appropriately sized cardboard box. We had to colour the area around the hole for the colour reader with a marker so that it would be black, not the brown of the cardboard, in order to get a correct colour reading. Since we already had the code set up to send and receive the three values, it was just a matter of plugging the colour reader values into the variables being sent to PubNub.
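
One possible shape of the calibration approach described above, in the spirit of the IDE’s analog sensor calibration example: scale each raw channel between a recorded minimum and maximum so it lands in 0-255. The min/max numbers below are placeholders, not our measured values, and the sketch assumes the Adafruit TCS34725 library.

```cpp
#include <Wire.h>
#include <Adafruit_TCS34725.h>

Adafruit_TCS34725 tcs = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_50MS,
                                          TCS34725_GAIN_4X);

// Placeholder calibration bounds per channel (measure against black and
// white references for a real calibration).
const uint16_t R_MIN = 40,  R_MAX = 900;
const uint16_t G_MIN = 40,  G_MAX = 1000;
const uint16_t B_MIN = 40,  B_MAX = 800;

int scaleChannel(uint16_t raw, uint16_t lo, uint16_t hi) {
  long v = map(raw, lo, hi, 0, 255);
  return constrain(v, 0, 255);
}

void setup() {
  Serial.begin(9600);
  tcs.begin();
}

void loop() {
  uint16_t r, g, b, c;
  tcs.getRawData(&r, &g, &b, &c);   // c is the unfiltered "clear" channel
  Serial.print(scaleChannel(r, R_MIN, R_MAX)); Serial.print(", ");
  Serial.print(scaleChannel(g, G_MIN, G_MAX)); Serial.print(", ");
  Serial.println(scaleChannel(b, B_MIN, B_MAX));
  delay(200);
}
```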

Project Context

LED lights used for photography are plentiful, but we thought it would be kind of neat to make something that could be portable and respond quickly to whatever environment it was in: a light that could respond to the colour of someone’s shirt, or the wall you were standing near. Max thought it would be interesting as a potentially useful series of photo lights, as this very basic tech could be expanded to panels of any size, and it would be very simple to add more lights to the network to create enough environmental light to use as professional photo lights.

We also thought about making software that can calculate complementary or tertiary colours from one scanned colour and output them on different neopixels, to make a more complete photography lighting environment, which could be interesting.

As for the other projects this one was inspired by: a lot of the code, and the figuring out of how to use the colour sensor, was found online, on the Adafruit website and on Arduino forums as mentioned earlier. https://learn.adafruit.com/adafruit-color-sensors/overview

Odd Man In & Consider Growth

Project Log – Odd Man In & Consider Growth (Experiment 4)

Dave Foster, Chris Luginbuhl, Karo Castro-Wunsch

Creation & Computation DIGF-6037-001 (Kate Hartman & Nicholas Puckett)

Project Description (from course Assignments tab):

Digital messaging is a pervasive part of our daily lives that has taken on a variety of forms and formats.  Working in small groups you will develop hardware and software systems that investigate methods of messaging and notifications.  Some of the topics we will cover include: synchronous/asynchronous communication, ambient data, alerts, web API’s, and messaging services.

For this project you will design and create a custom device that allows the members of your group to communicate in new ways.  Using the Wifi connection of the Feather M0, you will create standalone objects that communicate over the internet.  Each group must consider how these devices create a specific language that allows the things, people, or places on the ends of the network to communicate.

Project Initial Ideas:

From Dave (Balsamiq sketch below):

The Problem Being Addressed:

We are all, at some point in our lives, deeply immersed in one project or another (essays, art projects, etc.) and have no wish to be interrupted by Facebook Messenger, Twitter, etc. (even if these services are part of another assigned project).  What if there was a step back to the old-fashioned pager?  A self-standing, project-specific communication request alarm triggered by any content added to the project’s “group application” (Facebook, Twitter, etc.)?

The Concept:

A project specific communication request notification device (similar in concept to early pagers) which bypasses or steps back from direct Facebook/Chat-room/Web pop-ups/etc.  The idea is that the group’s 3 (or more) members would not have to be logged in to anything to receive notification that another group member was requesting communication.  It would be implied that this would be specific to messages about the project assigned to the group.

A project-specific Facebook group, Twitter account and E-mail account would be set up with membership restricted to the group members for a specific project.  Each communication modality would be assigned a colour (red, green or blue), and each member would carry/wear a small device containing his Feather board wired to differently coloured LED’s and (possibly) a vibration motor or similar noise maker.  The LED of the relevant modality’s assigned colour would blink (and the noisemaker, if included, would buzz) whenever any member had posted to that group service.

Use would be as a “filter” so that, if you’re frantically working on an essay or other assignment, you would not have to have any other service or application active (no distractions from your work due to pop-ups etc.).  All you would see is a blinking light if a group specific message was posted by another group member.

screen-shot-2017-11-25-at-3-03-48-pm

From Chris & Karo:

Concept 1:

The problem & context:

People who are trying to establish a consistent rhythm in their life often struggle to be consistent. Typical examples include:

  • self-employed people who wish to be at their desk and working by a certain time each day.
  • People learning to play a musical instrument by practicing daily
  • People wishing to establish a regular mindfulness meditation practice.
  • Writers struggling to finish a book

Establishing a routine is an important part of maintaining helpful habits (Gardner et al., 2012).

Accountability partners work in different ways, but one format is a quick, daily check in, along the lines of “It’s 9am and I’m at my desk”.

Accountability partnering is an idea that has gained momentum since the 90s. There is ample testimony that accountability improves the chances of sticking with a program (Inc., Quiet Rev, Huffington Post).

From Wikipedia:  Not having an accountability partner to help a person accomplish their goal is one reason 92% of people did not accomplish their New Year’s resolution according to a University of Scranton study[5] by Dan Diamond in Forbes and an article by Dale Tyson.[6]

See also: Why an Accountability Buddy Is Your Secret Weapon for Faster Growth (Entrepreneur Magazine)

At the same time, more and more people are working from home (3% of US workers work from home at least half the time, according to CNN and Global Workplace Analytics and FlexJobs (link)). This means that remote accountability partners, unlike on-site ones, typically need smartphones and/or social media to connect. One problem with this arrangement is that smartphones and social media are perfect tools for procrastination (Meier, 2016).

Solutions

One approach to accountability partnering that avoids these pitfalls follows the approach of calm technology (Weiser, 1995): that technology can help us best by existing at the periphery of awareness, rather than by demanding our attention.

Borrowing from Kate Hartman’s book “Make: Wearable Electronics: Design, prototype and wear your own interactive garments” pp 56-59, a sandwich switch could be used in a couch cushion, chair cushion or meditation cushion to send a wireless signal to an accountability partner that indicates the user is sitting at their work (practice, meditation, etc). This signal could result in a public post for accountability, or a private signal sent just to the accountability partner.

Numerous variations are possible… the partners could each have the same device and commit to both sitting down at 9am. Each partner receives a visible, tactile or auditory cue that the other has arrived. Progress and consistency could be tracked on an online dashboard.

This system could also be used (e.g. with exercise clothing having bend sensors integrated) for exercise accountability tracking, thereby overcoming some of the shortcomings of the much hyped but disappointing accountability ecosystem Gym Pact (www.pactapp.com/)

References:

Gardner, B., Lally, P., & Wardle, J. (2012). Making health habitual: the psychology of “habit-formation” and general practice. The British Journal of General Practice, 62(605), 664–666. http://doi.org/10.3399/bjgp12X659466

Hartman, Kate “Make: Wearable Electronics: Design, prototype and wear your own interactive garments” O’Reilly Media (2014) pp 56-59

Adrian Meier, Leonard Reinecke, Christine E. Meltzer “Facebocrastination”? Predictors of using Facebook for procrastination and its effects on students’ well-being Computers in Human Behavior 64 (2016) 65-76

Weiser, Mark, and John Seely Brown. “Designing Calm Technology.” Designing Calm Technology. Http://www.ubiq.com/weiser/calmtech/calmtech.htm, 21 Dec. 1995. Web. 28 Oct. 2015.

Concept 2

Problem – we want to connect with loved ones, but screen time takes us away from each other.

Solutions

A metallic pendant worn against the skin incorporating the feather, heating element and Li-poly battery. A squeeze sends a message to the partner’s matching pendant, causing it to glow, vibrate, and/or heat up briefly. Partners wear matching pendants and messaging is 2-way.

Concept 3

Artwork investigating networks – neural networks, ecosystem, societal networks.

Networks are familiar from their hundreds of examples in nature and underpin the structure of our own brain. Neural networks are at least partially responsible for brains.

Creating a wifi (or XBee) connected network of physical nodes (a node being e.g. an LED with a sensor or button in a housing), would it be possible to establish and demonstrate information being passed through the network?

Could human input or intervention alter, enhance or suppress the patterns of information?

If more people come to the party to interact, at what point does it enhance the complexity, connectivity or synchronicity of the network, and at what point does too much human interference make it collapse?

Final Form(s):

Dave:

  1. Free-standing container with Feather controller (coded for wireless access), lithium ion battery & 3 LED’s (Red, Green, Blue).  Container to be configured to “hook” over the screen of a laptop such that the LED’s are visible on a flat surface facing the user.
    1. Each LED coded to flash/blink (or at least turn on) indicating “communication request” through one of 3 pre-established group pages (Facebook, Twitter & E-mail).
    2. Allows for (semi)uninterrupted work on other projects while remaining potentially aware of communication regarding assigned group project.

screen-shot-2017-11-26-at-11-55-45-am

Plain English Logic Flow (not code)

Begin:

All LED’s to OFF

Link Feather to Pubnub account

Link Pubnub account to Facebook, Twitter and E-mail account

Monitor

  1. Facebook group “Odd Man In”
  2. Twitter feed “Odd Man In”
  3. OCADU student E-mail account

IF – posting to Facebook group = YES

Go to RED LED

IF – posting to Twitter feed = YES

Go to GREEN LED

IF – posting to OCADU E-mail = YES

Go to BLUE LED

RED LED

IF – Not logged into Facebook

RED LED at maximum

IF – logged into Facebook

Ignore

GREEN LED

IF – Not logged into Twitter

GREEN LED at maximum

IF – logged into Twitter

Ignore

BLUE LED

IF – Not logged into E-mail account

BLUE LED at maximum

IF – logged into E-mail account

Ignore
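
For illustration only, here is a rough Arduino sketch of the receive/display side of that logic flow, assuming each posting arrives as a simple code (1 = Facebook, 2 = Twitter, 3 = E-mail) on a single PubNub channel and that readMessageFromPubNub() stands in for the class PubNub example code. The “ignore if already logged in” branches would have to be handled upstream (e.g. by IFTTT), not on the Feather, and the LED pin numbers are assumptions.

```cpp
const int RED_LED   = 10;  // Facebook group
const int GREEN_LED = 11;  // Twitter feed
const int BLUE_LED  = 12;  // OCADU e-mail

int readMessageFromPubNub() {
  // Placeholder: returns 1, 2 or 3 when a posting notification arrives,
  // 0 when there is nothing new.
  return 0;
}

void setup() {
  pinMode(RED_LED, OUTPUT);
  pinMode(GREEN_LED, OUTPUT);
  pinMode(BLUE_LED, OUTPUT);
}

void loop() {
  switch (readMessageFromPubNub()) {
    case 1: digitalWrite(RED_LED, HIGH);   break;
    case 2: digitalWrite(GREEN_LED, HIGH); break;
    case 3: digitalWrite(BLUE_LED, HIGH);  break;
    default: break;  // nothing new; leave the LEDs as they are
  }
  delay(500);
}
```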

Project Daily Log:

Tuesday, Nov. 14 – 12:00 to 13:30:

Chris & Dave met in the D/F lab at 12:00 for a design conference.  Dave put his idea forward (see above Balsamiq sketch) noting the physical design simplicity and adherence to the assignment parameters.  Chris mentioned some related notification input methods involving pressure sensors or switches installed in cushions.

While searching for applicable IFTTT links to Facebook Messenger, Dave received an E-mail from PubNub regarding a new service called ChatEngine that merits further examination.  Chris and Dave to meet again (probably) late Thursday afternoon.

Thurs Nov 16 – 6-8pm

Experimenting with example code. Trying to understand some of the workings:

-how JSON objects are created & parsed

-how pointers (* and &) work in C++

-object-oriented programming principles (e.g. wifiClient object).

Fri Nov 17 – 13:00 and After Class

Chris & Dave met in DF lab and after C & C class.  Further discussion as to which idea to implement and method of implementation.  After brief discussion with Kate, Dave seems to be leaning strongly towards the pager application with Chris concentrating on “point of presence” or “accountability partner” application.  At base, we’re trying to find a relatively simple “hey you” function specific to group members.  Looked through IFTTT for applets that might work.  We may be able to go through Adafruit I/O directly rather than over-complicating the exercise with Pubnub’s ChatEngine or similar.  We might be able to push a notification with a colour code for each communication method required (blue for Facebook Messenger, red for Twitter, etc.).  Chris has some coding examples which will be examined Tuesday.

Mon. Nov 20 (in class)

Further discussion amongst Dave, Chris & Karo re:  final form for individual devices.  Dave is concentrated on the simple pager.  Chris is attracted to the “accountability partner” cushion idea.  There will be some differences in the three products.  Meeting in the D/F lab Tuesday.

We worked together to ensure everyone’s Feather worked and that we could publish and read from the same Pubnub channel. We also prototyped a version of the software which published a message (1, 2 or 3) depending on which one of three switches was pressed, then read that message back from pubnub and lit an LED corresponding to the message (see video: https://youtu.be/kjJLgw94Yiw)
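
A minimal sketch of that prototype’s publish side might look like the following, assuming the three switches sit on pins 5, 6 and 7 with pull-down resistors and that sendToPubNub() stands in for the class PubNub publish code; the message is then read back from the same channel and handled much like the receive sketch shown earlier.

```cpp
const int SWITCH_PINS[3] = {5, 6, 7};   // assumption: three switches with pull-downs

void sendToPubNub(int code) {
  // Placeholder for the PubNub publish call on the shared channel.
  Serial.print("publish: ");
  Serial.println(code);
}

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 3; i++) {
    pinMode(SWITCH_PINS[i], INPUT);
  }
}

void loop() {
  for (int i = 0; i < 3; i++) {
    if (digitalRead(SWITCH_PINS[i]) == HIGH) {
      sendToPubNub(i + 1);   // publish 1, 2 or 3
      delay(300);            // crude debounce
    }
  }
}
```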

screen-shot-2017-11-26-at-12-01-57-pm

Caption: pager circuit working on a breadboard

Tues. Nov 21

Dave working on container for over-screen paging system as well as wiring for the Feather controller in his version of the project.  Final testing of code and Feather assembly (all LED’s working to spec. — code link and Fritzing diagram below).

https://github.com/DFstop/Odd-Man-In/blob/master/almost_instant_messaging.ino

screen-shot-2017-11-26-at-12-03-49-pm

Thursday Nov 23

Final form for Dave’s prototype cut from foamcore and glued together (photo below).  To set and cure overnight.  Hopefully all connections will remain intact after assembly and will work tomorrow.

screen-shot-2017-11-26-at-12-05-03-pm

Friday, November 24:

Both Chris’ and Karo’s applications functioned as planned.  Dave’s appears to have suffered a disconnect during construction of the housing as it does not function (though it did Thursday night).  Tried multiple reloads/resets of the controller with no luck.  Believe I did at least explain the function adequately.

Consider Growth:

As mentioned above, “Consider Growth” branched off of our group project on Monday Nov 20, though we continued to meet as a group. We discussed the right kind of technology to bring to this problem – how to encourage users without distracting or irritating them.

We decided to build a sandwich switch which could be slid inside existing cushions (e.g. couch cushions or meditation cushions) or placed on top of a chair.

At the same time, we discussed different ways of reflecting users’ data back to them on a website or app. A simple line or bar graph showing daily and monthly sitting totals would be a conventional option, but we wanted to do something more imaginative to reflect the open-ended experience of taking up a new skill, discipline or hobby. More details on how the graphical representation evolved can be found below.

A typical use scenario works like this:

-At 9am, the server sends a message to start the session.

-If the user’s accountability partner is sitting, the user’s cushion EL wire lights up to indicate the accountability partner is sitting. And vice-versa.

-When the user sits, the user’s EL wire turns off.

-When both users are sitting, the server sends a message that rings the users’ bells, signalling the start of the session.

-When either user sits, a generative animation “grows” on the webpage. If both users sit, both animations grow. If either one gets up, their animation stops growing.

-When the timer reaches a set amount of time (e.g. 20 minutes), the bell rings signalling the end of the session.

-Note that once the users sit, they can choose to view the animation or not. They will not need to interact with or receive notifications from the system until the end bell rings. This is a deliberate measure to reduce distractions.

We needed to make a larger “sandwich switch” than the one illustrated in Kate’s book (Hartman,  2014 pp 56-59). We decided to include internal “springs” of felt. As luck would have it, our first guess about the design of insulator between layers of conductive fabric worked well in testing on a variety of cushions. The conductive fabric was cut into two identical shapes, ironed onto the felt, and along with the insulation layer, the 5 layer sandwich was stitched together with bar tacks in the corners.

screen-shot-2017-11-26-at-12-07-42-pm

Caption: Switch – black felt with silver conductive fabric

screen-shot-2017-11-26-at-12-08-38-pm

Caption: you can’t solder to this conductive fabric. We had to stitch with conductive thread.

screen-shot-2017-11-26-at-12-09-34-pm

Caption – the lower layer of conductive fabric is visible through the layer of insulating felt with holes cut into it. It was tempting to make the holes in a seasonal snowflake pattern.

screen-shot-2017-11-26-at-12-10-43-pm

Caption: Assembled switch. The garter clip provides strain relief for the barrel jack used as a connector. We wanted to use a two-prong plug that could not be accidentally connected to our battery back, which had a JST connector.

We chose EL wire for this application because of its soft, even light and inherent flexibility and adaptability to different cushions. We also wanted to use a solenoid to ring a meditation gong to signal the beginning and end of the sitting session, rather than a screen or mobile device-based notification.

Both the solenoid and EL wire required 5V, so we used NPN transistors to switch the 4.5V from our battery pack using the Feather’s 3.3V logic. We tested this assembly by having the sandwich switch operate the EL wire and solenoid using a simple Arduino sketch (see video: https://youtu.be/5M5sWKTFRKo)
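
A simple test sketch along those lines might look like this (pin numbers are assumptions, not the project’s actual wiring): the sandwich switch turns the EL wire on and pulses the solenoid once when someone sits down, with both loads driven through the NPN transistors described above.

```cpp
const int SWITCH_PIN   = A0;  // sandwich switch (with a pull-down resistor)
const int EL_WIRE_PIN  = 11;  // transistor controlling the EL wire inverter
const int SOLENOID_PIN = 12;  // transistor controlling the gong solenoid

bool wasSitting = false;

void setup() {
  pinMode(SWITCH_PIN, INPUT);
  pinMode(EL_WIRE_PIN, OUTPUT);
  pinMode(SOLENOID_PIN, OUTPUT);
}

void loop() {
  bool sitting = (digitalRead(SWITCH_PIN) == HIGH);

  // The sandwich switch directly drives the EL wire for this test.
  digitalWrite(EL_WIRE_PIN, sitting ? HIGH : LOW);

  // Strike the gong once on the transition from standing to sitting.
  if (sitting && !wasSitting) {
    digitalWrite(SOLENOID_PIN, HIGH);
    delay(100);                      // brief pulse to ring the gong
    digitalWrite(SOLENOID_PIN, LOW);
  }
  wasSitting = sitting;
  delay(50);
}
```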

Putting the whole assembly into the project box with strain relief was time consuming, but it was helpful to have connectors on everything so that it could be transported easily without yanking wires out of their connections accidentally.

screen-shot-2017-11-26-at-12-12-30-pm

Caption: The solenoid (top), feather and featherwing protoboard (upper middle) with EL wire inverter (bottom) in a project box.

screen-shot-2017-11-26-at-12-13-30-pm

Caption: The wiring diagram for the device. It was all connected to the featherwing protoboard shown at the top.

We didn’t have time to make a second copy of the switch and circuit, and decided to demonstrate the operation by using a second Feather (running the same code) with an SPST tactile switch attached.

Some design sketches are below:

screen-shot-2017-11-26-at-12-14-40-pm

Caption: System architecture v1. “Everything is going to fit easily and there will be no need for a box”

screen-shot-2017-11-26-at-12-15-37-pm

Caption: The two transistor-based switch circuits. The one for the solenoid has a diode to prevent a high reverse voltage from damaging the transistor when the solenoid is disconnected, since it is an inductive load.

screen-shot-2017-11-26-at-12-16-52-pm

Caption: Design notebook page showing final system architecture including connectors. The notes are a prioritized list of the issues to work on. We got to most of these…

screen-shot-2017-11-26-at-12-17-46-pm

Caption: Final assembly with sandwich switch removed from inside of cushion.

The code for the Arduino is here: https://github.com/ChrisLuginbuhl/consider_growth

The code uses a single channel to publish and read from Pubnub. Messages are JSON formatted and have a user name as the key, with a simple binary code indicating whether that user is sitting (e.g. {karoMessage:1} means Karo is sitting). The website javascript also receives these messages, and sends a message to ring the bell.
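
For illustration, here is how one of those messages could be assembled on the Feather before publishing; the helper and buffer handling are assumptions, and the actual code is in the linked repository.

```cpp
void publishToPubNub(const char *json) {
  // Placeholder for the PubNub publish call on the shared channel.
  Serial.println(json);
}

void publishSittingState(const char *userKey, bool sitting) {
  char msg[48];
  // e.g. {"karoMessage":1} means Karo is sitting; 0 means not sitting.
  snprintf(msg, sizeof(msg), "{\"%s\":%d}", userKey, sitting ? 1 : 0);
  publishToPubNub(msg);
}

void setup() {
  Serial.begin(9600);
  publishSittingState("karoMessage", true);   // -> {"karoMessage":1}
}

void loop() {}
```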

Consider Growth Visualization

Code: https://github.com/KaroAntonio/consider-growth

Demo: http://realgoodinternet.me/consider-growth/

The original Consider Growth visualization concept was to employ a vis that grows organically, i.e. morphogenic structures that imitate the growth and system interactions of biological forms. In pursuit of this, I implemented a JS port of the differential line algorithm, hoping to use the underlying growth pattern to inform the display. The implementation worked, but due to runtime considerations its current version isn’t usable for real-time use. Moving on from this, we employed trigonometric waves, perlin deformations and simple modular rhythms in combination to produce a series of parametric randomized forms that have a large amount of variation but are consistently visually engaging. The waves’ specifics can be investigated in the repo.

The waveforms were hooked into pubnub via pubnub’s JS API so that a waveform is turned on whenever someone sits on their pillow and announces their presence into the virtual space. The intention of using a line as an avatar is to produce an environment which is non-competitive and really stripped down, the limitations allowing users to be present without having to make any choices about their virtual actions and representation.

 

Shepko_Harkin_Exp4

Title

Get Me Coffee

 

Members

Sana Shepko and Sean Harkin

 

Code

GitHub Code

 

Project Description

A messaging system intended to notify members of a group, class, or office space when someone is going to get coffee, in case others want coffee too. This system would be materialized in the form of a small device in the shape of a symbolic coffee cup (perhaps a keychain) with a toggle and two LEDs. The toggle, when moved to the side of the green LED, can have two meanings: either, that you are going to get coffee; or as a response to the message that someone is getting coffee, indicating that YES, you want coffee. When moving the toggle to either side, it lights up the other person’s device. When it’s moved to the side of the red LED, it indicates the response to the message “do you want coffee” as NO, you don’t want coffee.

We envision this project as being used for groups larger than 2 people; especially if used in an office or studio setting, it could save time rather than having to message a large group.

FINAL

 

20171124_130736 20171124_130746

Process Journal

DAY 1

Talked about our initial idea. We are planning to build a message notification system that would communicate who is getting coffee and how many people would like coffee in a group. Basically, the way it would work is that each person (whether in a class, or in an office setting) would have a small button device with an LED.

We are planning to build this in a similar way to the Amazon Dash (see below).

screen-shot-2017-11-14-at-11-24-41-am

 

There are two parts of the device: an LED button and a small LED light. Each of these components will signal different messages.

SCENARIO

If person A is going out to get coffee and wants to let others know they can get coffee for them too, they press their LED button. Person A, who sent out the initial signal, would see that their LED button fades High-Low. Everyone else in the space will receive this message through their LED light flashing consistently.

If, for example, person B wanted to confirm with person A that they wanted coffee, they would press their LED button which thereby would turn off the flashing LED on their device but turn on the LED button light, which would remain on until person A turns off their outgoing signal.

When person B presses their LED button, and if, for example, person C and D also press their LED button, person A will receive a notification that allows them to know how many people would like coffee. Person A would receive a signal in their LED light; the number of times the LED flashes is the amount of people who want coffee. This LED flash would be on a loop, and would be a series of flashes which would stop, and then repeat.
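
A sketch of how that flash-count loop could look on person A’s device is below, assuming the indicator LED is on pin 13 and a placeholder countYesResponses() stands in for tallying the incoming PubNub messages.

```cpp
const int LED_PIN = 13;   // assumption: the indicator LED

int countYesResponses() {
  // Placeholder: in the real system this would count how many devices
  // have replied "yes, I want coffee" since the ask went out.
  return 3;
}

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int wantCoffee = countYesResponses();
  // Flash once per person who wants coffee, then pause and repeat.
  for (int i = 0; i < wantCoffee; i++) {
    digitalWrite(LED_PIN, HIGH);
    delay(200);
    digitalWrite(LED_PIN, LOW);
    delay(200);
  }
  delay(1500);  // pause so the series of flashes reads as one count
}
```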

dsc_0033

This is our initial idea and starting point, and we will see how it develops!

For today:

– begin to explore code with featherboard

DAY 2

Planning on 3D printing our devices. Some sketches:

The original idea was to design the casing to be small enough to be attached to a key-chain or to the side of your work monitor. However, due to the size of the Feather and the battery, we realized very quickly that this would not be possible. If the product were to be developed further, we would plan to design and build a custom board, which would shrink the overall size of the product.

For the battery we chose a lithium-ion polymer 3.7V 1000mAh. We chose this as the voltage required to power our relatively simple device could very easily be done with 3.7V, and would not require the use of transistors which may be required with larger voltage batteries. Since the LiPo batteries are also rechargeable, both Sana and Sean would be able to reuse the power-source for future projects.

cad-screengrab-2 cad-screengrab-4 cad-screengrab

We revised our sketches to these designs:

lid-sg n-top-sg-2 n-top-sg-3

DAY 3

Thinking about our coffee theme and the color palette we might use.

screen-shot-2017-11-20-at-9-50-12-am

 

 

screen-shot-2017-11-20-at-10-23-21-am

Also talked to Nick during class about our current plan and our progress.

Some interesting points that came out of this:

  • Although we were originally planning on having only one button, it seems that this complicates our project, since there would be more than one meaning for a button push (for example, an initial button press would mean “I am getting coffee” and essentially function as an ask to other people with the device; but there is also the second button press that others would send back as an answer to the ask, which would function as a “yes” or “no” to the “I am getting coffee”). That being said, Nick initially suggested modes, but it seems that we won’t be able to achieve this easily. THE EASIEST SOLUTION: include, along with the button, some sort of toggle or switch that would function as an answer to the ask. One value of the toggle would be a yes, and one value would be a no.
  • This being said, we will now have the following components: The feather, an LED button (to be purchased today), an orange LED, and a toggle, and the main functions of this Get Me Coffee device would be to ASK and ANSWER.
  • Another problem: Sana’s button does not seem to be sending to PubNub consistently. Hopefully this will not be a long term issue!

 

Sana testing a tactile switch code:

Got tactile button to somewhat work:

screen-shot-2017-11-21-at-1-29-41-am

DAY 4

Ok, so we have gotten some things to work, which is a great sign!

For documentation purposes, we have decided that for both of our devices:

myval1 = button switch

myval2 = toggle switch

This way when we are subscribing to each other’s messages we will be able to know that yourval1, when writing functions or whatever, will always refer to the button value, and yourval2 will always refer to the toggle value.
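
For illustration, here is how that convention reads on the receive side, with the subscribe-and-parse code stubbed out as a placeholder readValuesFromPubNub():

```cpp
int yourval1 = 0;  // other person's button state
int yourval2 = 0;  // other person's toggle state

bool readValuesFromPubNub(int &val1, int &val2) {
  // Placeholder: parse the two values out of the incoming PubNub message.
  val1 = 1;
  val2 = 0;
  return true;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (readValuesFromPubNub(yourval1, yourval2)) {
    // yourval1 is always the button, yourval2 is always the toggle,
    // so the rest of the sketch never has to guess which is which.
    Serial.print("button: ");   Serial.print(yourval1);
    Serial.print("  toggle: "); Serial.println(yourval2);
  }
  delay(500);
}
```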

DAY 5

We have had to move from our original idea of having both the button and the toggle, to just having the toggle as the main communication component. We managed to print our casing today, however there were some issues. To begin with, the parts were designed using Autodesk’s Inventor software. When exporting as .STL files for printing, they seemed to scale in an unpredictable manner, meaning that we had to very quickly resize the components. Unfortunately this meant that the fit for the case and lid was slightly tighter than designed. With some extra fabrication we were able to compensate for this. The other issue was that due to time constraints, the casing had to be sent for printing before we were able to build the full circuitry, meaning that we underestimated the height which would be required for the internal components. This required us to rethink the casing.

We also have found that although we originally only wanted to have the message to be sent to PubNub ONCE on toggle switch, it turns out that having it constantly publish the toggle state was a more achievable and realistic way to go about the code.

20171124_163611 20171124_163625

Sana came up with the idea of using actual coffee cups to contain the internal components.

Unfortunately the product was not fully functional for the critique, though we have included video of when we had both devices communicating successfully (the video can be found here: https://www.dropbox.com/s/buww21m29l2080w/WorkingVid.mp4?dl=0 )

As you can see, the product was working as intended (if a little slow due to the connectivity issues working with PubNub). The issues arose when we began soldering the components to the protoboards. Unfortunately we never found the exact cause, but Sean’s board would not communicate after being soldered. The obvious suspect was a short in the soldering; however, we could not find a problem under examination. We remedied this issue by removing the components from the protoboard and using a mini-breadboard, allowing us the flexibility of a more mobile device while ensuring functionality.

The next issue came the next day, when we realized we would have to upload the code to both devices at the same time to boot the sequence. If we did not upload the code at the same time, or if we disconnected the boards from our computers, the devices would stop publishing/subscribing to PubNub. Unfortunately we realized this issue the morning of the presentation and were unable to correct it before the critique.

 

References

Battery specs:

http://www.canadarobotix.com/battery-chargers/battery-lithium-1000mah

Figuring out how to use the tactile switches:

https://www.youtube.com/watch?v=n0VbHPB_2Ws

https://www.youtube.com/watch?v=tmjuLtiAsc0

http://www.instructables.com/id/Use-a-Momentary-or-Tactile-Switch-as-a-Pushbutton-/

https://learn.adafruit.com/adafruit-arduino-lesson-6-digital-inputs/arduino-code

 

Project Context

Some more information on the Amazon Dash buttons:

https://www.cnet.com/news/appliance-science-how-the-amazon-dash-button-works/

Everyone loves coffee. The idea for the initial concept came from months spent in the studio with everyone grabbing coffee for each other. Many times we would be travelling to the studio and would think about messaging our team to ask if anyone wanted a coffee, but this requires the time it takes to send a message. It would be far more convenient if we were able to send out a signal letting our peers know we were getting coffee and enable them to ask us to bring them one with one click of a button. The initial concept also included a p5 page which could track how many times individual users had been for coffee, so the group would know who was slacking on coffee-retrieval responsibilities.

 

 

Zapped Out

 

Zapped Out

Group Members: Jad Rabba and Kylie Caraway

experiment4_finalphotos

 

Project Description: Zapped Out is a “stress-o-meter” that sends a notification to a companion of your stress level. Through squeezing a stress ball implemented into a side pocket of a backpack, the user gets immediate stress relief, while also sending a signal that illuminates a range of thunderbolts on a piece of art that can be either hung or placed on a desk.

Final Video of Project: Zapped Out Real Time  &  Zapped Out Ad

Github Code:  Zapped Out Codes

Design Files:

backpack

Backpack & Stress Ball Design

boxdesign1

Box Design

 

Diagrams:

Flex Sensor Diagram

flexsensor_diagram_1_bb

LED Lights Diagram

lightsensor_diagram_1_bb

Sketches:

01

02

 

Process Journal:

Day 1:
Today we learned how to connect our feathers to the OCAD WiFi. We also learned how to use PubNub to send data between feathers. While we both made PubNub accounts, we decided to use Jad’s account to send and receive data. We had difficulty sending information at first, due to vague channel names (metoyou, youtome), as well as confusion about the switch of “pub key” and “sub key” in our codes. Luckily, with Nick’s help, we fixed our issues in PubNub. By the end of class, we managed to get our feathers to talk to one another! We sent data from Jad’s feather using a potentiometer to change the brightness of Kylie’s LED, and then we sent data from Kylie’s light sensor to change the brightness of Jad’s LED.

Day 2:
Today we began brainstorming.

We discussed how we both wanted to make a product that could potentially be used and marketed. We were both interested in using sensors for biometric data. Kylie mentioned that a potential useful tool would be to create a messaging system that would notify her partner if she had fainted from her Type 1 Diabetes. Diabetes sensors are very expensive, and this could prove to be a useful tool for parents of children with Diabetes, or a more cost effective tool for Diabetics to send for help when needed. This idea came with logistical issues: Kylie did not want to try to tinker with or program her glucose monitor, and she did not want to try and recreate a sensor for blood sugars. She tried to find research supporting a link between heart rate / pulse and her blood glucose levels, because there are currently no sensors at Creatron to monitor blood sugars. While there is a link between high heart rate and poor blood sugar management, the linkage for immediate data between blood glucose and heart rate was unclear. Although the idea to create notifications based on blood glucose looked difficult, we still wanted to explore the use of biometric data for other wearable products.

img_0493

(Notes for heart rate / pulse sensor alert)

After our meeting, we went to Creatron to purchase a pulse sensor. Unfortunately, Creatron was out of all biometric sensors except the heart rate monitor. We asked when they would receive more sensors, and the answer was unclear: either Monday or Thursday. Kylie decided instead to purchase the last biometric sensor they had, the heart rate monitor. Creatron also did not have the pads to go with it. We quickly realized this was not going to work easily for wearable technology, because the sensor was made to connect to three points on the body. The supporting documents suggested connecting the sensor to both sides of the heart and to a spot further away from it (such as the leg). We wanted to create something wearable on the wrist, rather than an EKG, so we decided to move away from this sensor to something more portable.

 

Day 3:
Today we met again to brainstorm project possibilities. We came up with three different options:

img_0497

  1. Use a heart rate monitor that measures when someone has fainted. When the user faints, it sends a message to a friend.
  2. Use a proximity sensor to detect when a dog sits in front of his bowl. When he sits in place, we get a notification. We can then click a button in P5 that triggers a servo motor to drop a treat into his bowl.
  3. Use a flex sensor, touch sensor, or force sensitive resistor to detect when someone squeezes or presses a stuffed animal / doll / toy. This could be used for children with autism to communicate with their family, friends, or teachers.

After brainstorming, we went to Home Hardware and bought a pulse sensor. Unfortunately, they did not have any pads to put on the end of the pulse sensor. We then went to Creatron and purchased a flex sensor.

Once Kylie got home, she started brainstorming about the flex sensor and a toy for autistic children. Jad made an important point that changed our ideation path: if we use sensors for autistic children to communicate, we would have to create a language of different colors and brightnesses to convey meaning to the other user. Perhaps we could instead use the flex sensor and a toy as a stress reliever that sends information to someone, letting them know you are in distress or need help. This could be a joke object (such as the Dammit Doll or the Panic Pete Squeeze Toy), or a serious object for stress relief. If the toy is squeezed for a short amount of time, a white light; for a medium amount of time, a yellow light; for a long time, a red light. This would send information to another stuffed animal, or some other sort of notification, to a friend, family member, or significant other.
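To make the idea concrete, here is a rough sketch of how that squeeze-duration “language” could be timed on the feather. The pin, the duration thresholds, and the cutoff for what counts as a squeeze are all assumptions, and the output is just printed to the serial monitor rather than driving a light:

```cpp
const int FLEX_PIN = A0;          // flex (or force) sensor inside the toy
unsigned long squeezeStart = 0;
bool squeezing = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  bool pressed = analogRead(FLEX_PIN) < 350;   // assumed "squeezed" threshold

  if (pressed && !squeezing) {                 // squeeze just started
    squeezing = true;
    squeezeStart = millis();
  } else if (!pressed && squeezing) {          // squeeze just ended: choose a colour by duration
    squeezing = false;
    unsigned long held = millis() - squeezeStart;
    if (held < 1000)      Serial.println("short squeeze -> white light");
    else if (held < 3000) Serial.println("medium squeeze -> yellow light");
    else                  Serial.println("long squeeze -> red light");
  }
}
```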

 

Day 4:
Over the weekend, we decided to go step by step and attempt to get the Flex Sensor working. In order to accomplish the end output, we began by making sure the LED light was working.

First step: Test light blink example code. Works perfectly.

Second step: Test light fade example code: Works perfectly.

Third Step: Test Flex Sensor – this took some trial and error to set up and receive data. We followed the Sparkfun example online. The first issue was with the hardware. We had a difficult time connecting the flex sensor to the breadboard, because the flex sensor is very fragile and we were afraid of breaking it. We managed to wiggle the sensor into the breadboard. The next issue was the code. The code called for calculating the voltage across the flex sensor, and we weren’t sure of the exact voltage we were receiving. The example allowed a range of divider resistors, from 10K ohms to 100K ohms, which meant a small calculation based on the resistor actually used on the board. We used a 10K resistor because it was included in our kit. After testing out different hardware setups and code calculations, we were finally able to retrieve data from the flex sensor. We received the data in ohms and degrees of bend, which took a while to wrap our heads around.

img_0503

step_2_flex_code
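For context, the Sparkfun calculation works out the sensor’s resistance from the voltage divider it forms with the fixed resistor, then estimates a bend angle from that resistance. A sketch along those lines, assuming our 10K resistor and the Feather M0’s 3.3 V supply; the straight and 90-degree resistance constants are Sparkfun’s reference numbers, and our particular sensor’s values will differ:

```cpp
const int FLEX_PIN = A0;                      // analog pin at the divider junction
const float VCC = 3.3;                        // Feather M0 runs at 3.3 V
const float R_DIV = 10000.0;                  // the 10K resistor from our kit
const float STRAIGHT_RESISTANCE = 37300.0;    // Sparkfun's resistance when straight
const float BEND_RESISTANCE = 90000.0;        // Sparkfun's resistance at 90 degrees

void setup() {
  Serial.begin(9600);
  pinMode(FLEX_PIN, INPUT);
}

void loop() {
  int flexADC = analogRead(FLEX_PIN);
  float flexV = flexADC * VCC / 1023.0;            // convert the ADC reading back to volts
  float flexR = R_DIV * (VCC / flexV - 1.0);       // solve the divider for the sensor's resistance
  float angle = map(flexR, STRAIGHT_RESISTANCE, BEND_RESISTANCE, 0, 90.0);  // rough bend estimate
  Serial.println(String(flexR) + " ohms, ~" + String(angle) + " degrees");
  delay(500);
}
```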

Fourth step: We tried combining the flex sensor with the LED light on the board. The code kept crashing Kylie’s featherboard and Arduino. Her computer would begin to read the code, stop reading, and freeze up, and she would have to close Arduino. She tried reloading the code onto her featherboard after unplugging and replugging it, and Arduino said it could not find the port. She then had to restart her computer. After multiple attempts, Arduino still could not find the port, over and over. She tried clicking the reset button on the feather; it still would not find the port. She finally reloaded the flex sensor code (not the flex sensor and light code) and it started working again. She went back to the flex sensor and light code, and it crashed her featherboard again. We are not sure what is happening, or why.

step_3_flex_light_code

Day 5:
Today we worked on the code for the first featherboard. First, we made the flex sensor affect a light.

Video : Flex Sensor and Light Test

Then, we sent the data from the flex sensor to Pubnub. At first, it was sending multiple values to Pubnub, such as ohms, flex degrees, and flexR (from the code we sourced from Sparkfun). We tried to send Pubnub only one value from the flex sensor, and after many iterations we succeeded in sending just the value of flexR. Next, we worked on the code of the second featherboard. We tried to make it read the same value from the other board, flexR, through Pubnub. At first it would only read 0, and the Arduino serial monitor showed it was still reading other values from the class example (randoval1 and randoval2). We believed we had removed these values from the code on both featherboards, so we kept searching for where randoval1 and randoval2 were still referenced in the code.
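Conceptually, narrowing the message down to one value just means packing a single key into the JSON that gets published and reading that same key back out on the other side. A small sketch of that idea, assuming ArduinoJson version 5 (the syntax current at the time); the helper names are ours, and the key name flexR matches what we settled on:

```cpp
#include <ArduinoJson.h>

// Sender side: wrap the single flex value in a one-key JSON object.
String buildMessage(int flexR) {
  StaticJsonBuffer<200> jsonBuffer;
  JsonObject& root = jsonBuffer.createObject();
  root["flexR"] = flexR;          // the only value we publish
  String msg;
  root.printTo(msg);              // e.g. {"flexR":512}
  return msg;
}

// Receiver side: pull the same key back out of whatever arrives on the channel.
int parseMessage(const String& json) {
  StaticJsonBuffer<200> jsonBuffer;
  JsonObject& root = jsonBuffer.parseObject(json);
  if (!root.success()) return -1; // malformed or unexpected payload
  return root["flexR"];           // a missing or misspelled key reads back as 0
}
```

A mismatched key name (or a leftover randoval1/randoval2 in one of the sketches) simply reads back as 0, which is consistent with what we were seeing on the second feather.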

 

Day 6:
Today we had various issues with our feathers and code. The first issue was Kylie’s board not connecting. Arduino would randomly find the port, and then the next time it couldn’t seem to find it. We tried pressing the reset button, but it still would not work. We unplugged the USB cable connecting the feather to the computer and replugged it, and the computer displayed a message about the power supply potentially not being enough for the USB device. We talked to Nick, and he suggested the reason we might be receiving this message is a hardware issue. We tried to simplify our wiring, and this seemed to fix the problem (for now).

We still had the issue of reading rando vals, but after sifting through the code, Jad realized we didn’t switch the channels correctly.

Jad decided to start clean with our flex sensor data, so he removed the voltage and resistor information, and mapped the values from 0 to 1023. We double checked with Nick, to make sure removing the resistor and voltage information would not affect our data. He reassured us that the extra information was not necessary. This proved to be a much cleaner approach to handling the values from the flex sensor.

step_5_flex_simplified_code

With our new sensor data, we found that 240 was the most bent value, and 1023 was as straight as it could go.
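In other words, the simplified approach skips the voltage and resistance math entirely and works with the raw analogRead() value. A minimal sketch of this, assuming the range we observed (about 240 fully bent, 1023 straight); the LED pin and the direction of the mapping are illustrative:

```cpp
const int FLEX_PIN = A0;
const int LED_PIN = 5;     // any PWM-capable pin

void setup() {
  Serial.begin(9600);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int flexPinVal = analogRead(FLEX_PIN);                 // ~1023 straight, ~240 fully bent on our sensor
  int lightPinVal = map(flexPinVal, 240, 1023, 255, 0);  // more bend -> brighter LED
  lightPinVal = constrain(lightPinVal, 0, 255);
  analogWrite(LED_PIN, lightPinVal);
  Serial.print(flexPinVal);
  Serial.print(" -> ");
  Serial.println(lightPinVal);
  delay(100);
}
```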

 

Day 7:

Today we focused on incorporating the flex sensor onto the stress ball. Soldering proved to be very difficult. The soldering iron tips in the lab were not working, so we had to use the side of the tip, making it very difficult to solder the two small prongs on the flex sensor. We also accidentally got solder on the sensor itself… we decided not to remove it because we were worried that scraping it off would ruin the sensor. We also worked on the Fritzing diagrams.

As we dived into the code, we started by mapping the sensor values from minimum to maximum. We planned on dividing these values into four separate stages: no light, one light, two lights, and three lights.

Our first step was to retrieve measurements from the flex pin value (the raw flex sensor reading) and the light pin value (the flex reading mapped from 0 to 255) with the sensor on the breadboard:

Flex pin val: Straight: 710 / Bent: 281

Light pin val: Straight: 103 / Bent: 241

Our second step was to monitor the same measurements with the sensor soldered and taped onto the stress ball:

Flex pin val: Straight: 500 / Bent: 285

Light pin val: Straight: 160 / Bent: 230

Our third step was to send data from the stress ball and receive it on the other feather, which was connected to three lights.

We started our attempt to map the sensor values into four categories (a rough sketch of this mapping follows the list below).

No light: 371-500

One light: 348-370

Two lights: 324-347

Three lights: 300-323
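Here is that sketch of the mapping on the receiving feather; the LED pin numbers are illustrative, and in the real project flexPinVal arrives over Pubnub rather than from a local analogRead:

```cpp
const int LED_PINS[3] = {10, 11, 12};   // the three thunderbolt LEDs (pins illustrative)

void setup() {
  for (int i = 0; i < 3; i++) pinMode(LED_PINS[i], OUTPUT);
}

void updateLights(int flexPinVal) {
  int litCount;
  if      (flexPinVal >= 371) litCount = 0;   // 371-500: relaxed, no light
  else if (flexPinVal >= 348) litCount = 1;   // 348-370: one thunderbolt
  else if (flexPinVal >= 324) litCount = 2;   // 324-347: two thunderbolts
  else                        litCount = 3;   // 300-323: hardest squeeze, all three

  for (int i = 0; i < 3; i++) {
    digitalWrite(LED_PINS[i], i < litCount ? HIGH : LOW);
  }
}

void loop() {
  updateLights(analogRead(A0));   // stand-in for the value received from the other feather
  delay(50);
}
```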

First try: Jad is able to read my data, but it is not affecting his lights.
We also notice that the flex sensor does not behave consistently.

Second Try:

We notice that the reason Jad is not receiving the data is that the F in FlexPinVal is not capitalized in the second sketch. We changed the category numbers to make it a progress bar:

No light: 371-500

One light: 300-370

Two lights: 300-347

Three lights: 300-323

The numbers received from the sensor remain erratic. We tried removing the resistor, but we only got 1023 (the maximum reading, since without the divider resistor the pin is simply pulled to the supply voltage). We tried a 1K resistor instead of 10K. The numbers were very small and would not give a wide, consistent range, which makes sense: against a sensor in the tens of kilohms, a 1K resistor only drops a small fraction of the voltage. We went back to our original 10K resistor. The numbers have a wider range, but they are once again very scattered. Perhaps it is because we are putting our hand directly on the flex sensor? We decide to put masking tape over the sensor to try to remedy that issue. It seems to reduce the fluctuation slightly.

image1

Third Try:
We remap our values with the masking-taped sensor:

No light: 356-360

One light: 337-355

Two lights: 319-336

Three lights: 300-318

We realize that we don’t need to say 300 as the minimum. It could be zero.

On the first try, Jad’s lights do not light up, so it appears it isn’t working. We reload the code and try again. It works! There is, however, a major delay between the data being sent and the data being received and displayed.

Video: Stress Ball & Light Test

Fourth Try:

No light: More than 340

One light: 320-339

Two lights: 300-319

Three lights: Less than 299

We believe the flex sensor has a limited life, because the values keep decreasing. The flex sensor is being bent more and more, and it doesn’t go back to its original position.

The delay is substantial, so we try removing the delay() built into the code. This changes it slightly, but not significantly; there is still a noticeable lag in the response time.

step_6_send_simplified_data

Day 8:
Today we met with Nick to discuss our issue with code delays, as well as our decreasing sensor values. We continued to notice a trend of decreasing values every time we squeezed the stress ball, and we were worried this meant something was wrong with our sensor, or perhaps that the sensor has a short life span. With Nick’s help, we removed one of the delays in the code. Nick also suggested we monitor the change in the values, rather than comparing the values themselves against fixed numbers, so that we would not have to update the code every time we use the sensor. Jad began adjusting our code to match this framework. We also began designing the graphics and backpack, implementing the pieces into the backpack and box, and fine-tuning the code.
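As we understood the suggestion, the idea is to track how far the reading has dropped from a resting baseline instead of hard-coding absolute thresholds, so the sensor’s gradual sag does not force a recalibration every time. A rough sketch of that approach, with entirely illustrative numbers:

```cpp
const int FLEX_PIN = A0;
int baseline = 0;

void setup() {
  Serial.begin(9600);
  baseline = analogRead(FLEX_PIN);        // take the resting value at startup
}

void loop() {
  int reading = analogRead(FLEX_PIN);
  int delta = baseline - reading;          // how far below "resting" the squeeze has pushed us

  // When the ball is at rest, let the baseline drift slowly toward the current
  // reading so the sensor's decreasing values get absorbed over time.
  if (delta < 5) baseline = (baseline * 9 + reading) / 10;

  int litCount = 0;                        // illustrative change-based thresholds
  if (delta > 60)      litCount = 3;
  else if (delta > 40) litCount = 2;
  else if (delta > 20) litCount = 1;

  Serial.println(litCount);
  delay(50);
}
```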

Video: Stress Ball Backpack Light Test

03

Sketch of Box Dimensions for us to match our design to fit around the box.

 

Day 9:
Today we are implementing the various pieces into our backpack and box. We ran into issues printing from Photoshop on a home printer: while the measurements are exact in Photoshop at 300 ppi, the printed document does not stay true to those measurements. After several attempts at printing at home, and lots of wasted ink, we brought the design to OCAD to print it here. The OCAD print shop printed it perfectly.

 

Context:

Many of us feel stress from time to time, from our jobs, to relationships, to other facets of life. In fact, seventy-two percent of adults surveyed by the American Psychological Association reported feeling stressed out for at least some portion of each month. In her book One Nation Under Stress: The Trouble with Stress as an Idea, Dana Becker describes how stress shifted from a mechanical description to something endured by humans, “a physical, chemical, or emotional factor that causes bodily or mental tensions and may be a factor in disease causation,” a shift she ties to our movement towards liberal individualism. Becker argues that, “now, particularly in the middle class, we ‘work’ to overcome stress; we don’t suffer it. And stress is not considered a sometime thing in contemporary Western societies; it is believed to be constant.”

To remedy these feelings, designers have created multiple objects to relieve stress: there is a multitude of stress balls, from traditional squeezable stress balls, to bean bags, to liquid stress balls, to Chinese stress balls that come in pairs. There is also the iconic Panic Pete Squeeze Toy, and another currently trendy stress object is the Dammit Doll. Parihug is a plush toy that allows users to send messages by hugging a stuffed animal.

Communication through tangible objects can be used beyond managing stress: it has been noted that many children with autism enjoy communicating through objects rather than through verbal communication.

From our research, very few stress relievers have communication built into the design. We want to incorporate a portable communication technique that helps users deal with stress by signaling their emotions rather than burying their feelings. With this design, users can relieve stress and send a message in a single action. Our project can be thought of as an SOS distress signal.

Simone Schramm’s concept piece, “Less Quantified Self – More Qualified You,” gives a haptic response to body data by transforming the texture of a rigid sphere. The sphere measures the conductance of the skin to determine the wearer’s stress level: when calm, the sphere is smooth; when stressed, small knobs stick out of it. Collecting user data to change the output of the object creates a personalized experience and sculpture for each individual user. Schramm believes that through a representation of our own body data, we can “influence human intuition.” We are inspired by Schramm’s data collection methods and visual response output, and we would like to move our project’s output in a similar haptic direction.

 

Future Development:

In order to improve on our prototype, our first step is to deal with lag times between the feathers and the input and output. As of now, the response time is still too far behind real time to feel like a seamless notification of the user’s stress level.

After fixing response times, our next step for this project is to integrate communication between both objects, so that the user can receive immediate feedback when they are stressed. This could be through visuals, such as lights or emojis, or it could extend into more haptic feelings, from vibrations to a temperature change via a heating pad.

The next step of development would involve refining the design so that the flex sensor and feather sit inside the stress ball, allowing the object not to be tethered to the backpack. The design could also be transferred to different objects beyond a stress ball, such as dolls, soft toys, or something as small as a keychain.

 

Bibliography:

Dumbleton, Trevor. “Reduce Your Stress with Stress Balls.” Health Guidance, www.healthguidance.org/entry/4756/1/Reduce-Your-Stress-With-Stress-Balls.html.  Accessed 22 Nov. 2017.

 

Goldsmith, Barton. “Don’t Bury Your Feelings.” Psychology Today, 4 Nov. 2013, www.psychologytoday.com/blog/emotional-fitness/201311/dont-bury-your-feelings. Accessed 21 Nov. 2017.

 

Olivo, Erin. “How to Tell If You’re Stressed or Depressed.” Psychology Today, 8 Jan. 2016, www.psychologytoday.com/blog/wise-mind-living/201601/how-tell-if-you-re-stressed-or-depressed. Accessed 23 Nov. 2017.

 

Popova, Maria. “Stress as Metaphor.” Brain Pickings, 19 Mar. 2013, www.brainpickings.org/2013/03/19/stress-as-metaphor/. Accessed 22 Nov. 2017.

 

Sashin, Daphne. “How Bobby Smith Learned to Talk at 9, and Other Autism Success Stories.” CNN, 29 Apr. 2014, www.cnn.com/2014/04/29/health/irpt-autism-communicating/index.html. Accessed 23 Nov. 2017.

 

“Stress Ball // Concept 2 // Less Quantified Self – More Qualified You.” Vimeo, uploaded by Simone Schramm, 2 Feb. 2016, vimeo.com/153951310. Accessed 20 Nov. 2017.