
The ‘Call Mom’ Project

By Frank, Jingpo, Tabitha

Project Description:

Mom misses you. She wants to know why you never call. The ideal moment to call Mom would be that fleeting period before bed when she’s snuggled under the blankets with a hot tea and a good book, just about to drift off to sleep. But of course she hasn’t told you this; you’re just expected to know through a nonexistent form of parent-child telekinesis.

Call Mom is an Arduino-based networking project that uses a light sensor to determine when her reading lamp is switched on and sends you a notification that she’s in relaxation mode, ready to hear from you. The device is housed inside a vintage book that blends in seamlessly with her bedroom decor. Powered by a simple battery pack, it’s a low-maintenance, internet-connected device that sits inconspicuously by her bedside. Some moms will want to know how it works and others won’t care in the least, but it’s a universal truth that moms just want to hear your voice and know that they haven’t been forgotten amidst the chaos of your busy life.

Github Link: https://github.com/imaginere/Experiment-4

Ideation:

The first iteration of our project was a simple device that allowed parents to send their young children messages while they were at school, kind of like a kid-friendly pager. As we continued to develop the idea we discussed how children have trouble perceiving time the way adults do, so parents and teachers could program reminders with friendly icons to mark key moments throughout the day. Based on Frank’s initial sketch we decided that the object should resemble a wooden children’s toy with an LCD screen on the front and a simple button on the top for the child to confirm that they had received the message.


Frank’s first sketch for the kid-friendly messaging device.

We quickly ran into trouble when we realized that the Feather and our LCD screen were not compatible with PubNub. After discussing the situation with Kate we determined that it was best to pivot towards a new idea. In the brainstorming session we explored other ways of marking the passage of time, but many of these ideas felt like watered-down versions of Google Calendar. So we went back to the initial concept – networking. What does it mean to network? Why do we seek connection? Rather than think it through intellectually we distilled our project down to the universal feeling of separation through distance. Longing to be with the person you care about the most. Late night calls to your loved one, though miles apart, still knowing that you’re both looking out the window at the same night sky. I just want to know that you’re thinking of me.


While discussing new directions Frank made drawings on the blackboard to help solidify our ideas.

We continued to explore the idea of a remote and networked self. Tabitha explained how she keeps her favourite travel destinations on her phone’s weather app to help her imagine that she’s somewhere else. On a rainy Toronto day she can see the current weather in Paris and concoct an escapist fantasy of the adventures she would be having if she were there instead. Jingpo told us that she does something similar to help her imagine what her mom is doing overseas. She described the experience of never knowing the best time to call her mom, who lives far away in another time zone; the best time to call is usually right before bed. Frank mentioned hearing about a project where the artist created a networked sensor that would alert him when his mother was seated in her favourite chair. It was these elements combined that caused the “ah-ha!” moment: a light sensor that could send you a message when Mom’s bedside lamp turns on and she has settled down for the night with a cup of tea and a good book. Now we had a project to address the question “What’s the best time to call Mom?”

Coding Process: (Frank)

Hardware & Coding the Device

This is the hardware we used to build this project:

– Adafruit Feather ESP8266

– 7mm photoresistor

– 1K Resistor

The Fritzing diagram for the circuit:


As you can see the circuit is super simple. We have focused primarily on the functionality of what we were trying to achieve.

What do the parts do?

The photoresistor is a simple component that measures the amount of light it is receiving. It feeds this constantly changing value to an analog pin on the Arduino (analog pins are labelled with the letter A before the number; in this case we used A3, since A1 and A2 cannot be used here because running WiFi on the ESP8266 disables those two pins).

When the Arduino is powered on, the loop() function constantly monitors the sensor reading, and when it rises above a certain threshold it triggers an event that sends a message to IFTTT (an internet service that provides hooks into various notification protocols). The IFTTT service, which in our case uses Webhooks, sends an email and a notification to the user ID set in the Arduino code.

We can easily send the notification to up to 12 different email addresses if we use the Gmail action through IFTTT. We did not use this during our presentation or in the final code, as it was unreliable at times and was causing our code to get glitchy.

The code also makes sure we receive only one notification until the lamp is turned off; if the lamp is turned on again it will send another notification.
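
To make this concrete, here is a minimal sketch of the approach described above. It is not our exact project code: the WiFi credentials, the analog pin, the light threshold and the IFTTT event name and key are all placeholders you would swap for your own values.

// Minimal sketch: photoresistor threshold + one-shot latch + IFTTT Webhooks call.
#include <ESP8266WiFi.h>

const char* ssid     = "YOUR_SSID";        // placeholder
const char* password = "YOUR_PASSWORD";    // placeholder
const char* host     = "maker.ifttt.com";
// IFTTT Webhooks URL pattern: /trigger/{event_name}/with/key/{your_key}
const String webhookPath = "/trigger/lamp_on/with/key/YOUR_IFTTT_KEY";  // placeholder

const int sensorPin = A0;    // analog pin reading the photoresistor (placeholder pin)
const int threshold = 600;   // tune to the lamp and the room's normal light level
bool lightOn = false;        // latch: only one notification per "lamp on" event

void setup() {
  Serial.begin(115200);
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) { delay(500); Serial.print("."); }
  Serial.println("\nWiFi connected");
}

void sendNotification() {
  WiFiClient client;
  if (!client.connect(host, 80)) return;   // give up quietly if IFTTT is unreachable
  client.print(String("GET ") + webhookPath + " HTTP/1.1\r\n" +
               "Host: " + host + "\r\n" +
               "Connection: close\r\n\r\n");
}

void loop() {
  int reading = analogRead(sensorPin);       // constantly changing light level
  if (reading > threshold && !lightOn) {     // lamp has just turned on
    lightOn = true;                          // latch so we only notify once
    sendNotification();
  } else if (reading < threshold && lightOn) {  // lamp turned off again
    lightOn = false;                            // re-arm for the next night
  }
  delay(500);
}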

A feature that we would like to add in the next iteration is time of day, so the notification would be bypassed outside of roughly 8pm to 12am. This would also make the device very power efficient if we used a switch to power down the WiFi when it is not needed, which would allow the device to be made very small using different hardware and a better power management circuit.

Troubleshooting the Code:

The code, although straightforward, needed some mental gymnastics to get sorted out. These were the main challenges facing us.

1. How could we reliably get the notification to trigger? We were given PubNub (www.pubnub.com) as an internet protocol to use, which is essentially a messenger service: we would have to get the Arduino to send a message to PubNub, which would in turn send us a notification. This was easier said than done. PubNub has a lot of APIs that connect to it which could do this, but setting them up on the server side and figuring out the API documentation in a matter of a few days was very technical. We are not inherently coders, and most of these technologies are introduced to us at the start of the week alongside two or three other projects we have on the go. Given that limitation, we looked for a simpler solution that could meet the wireless communication needs of this project. IFTTT was the answer; there were good YouTube videos demonstrating how to set up the applet in IFTTT and use the API key. The one I referred to was: https://www.youtube.com/watch?v=znFMNzT_Gms&t=107s

2. The second problem we faced was that the trigger would keep going off continuously as soon as the light was turned on. This is a simple fix in retrospect, but at the time it was a head-scratcher for people who don’t come from an intensive coding background. The solution was a conditional boolean check that bypasses the trigger once the light has been turned on and a lightOn flag is set to true.

3. The final piece of the puzzle was getting the notification to trigger reliably and finding a good threshold light reading that would not trigger under normal lighting conditions. We would also have liked to use the Gmail notification, which would notify multiple people (siblings) at the same time, but this proved unreliable because of the IFTTT service. It still works and can be used in the code, but it might skip a few of the times the lamp is turned on because IFTTT has an issue with this applet.

The Internet of Things

We looked at the Internet of Things for inspiration to build this project, which we envision becoming a standalone product in the future. The electronics are embedded in a real book, creating a sense of continuity through the nostalgic use of an artifact we are all used to having on a bedside table. We would like to add a practical use to the book in the future, such as a way of recording all the times we did call our mom because of this little book. It blends into our daily lives, silently performing what some would consider a mundane task, but it links us across continents, giving us a tangible feeling of knowing our loved ones are getting ready to call it a night.

We could extend this project by adding an online log of all the times the lamp was switched on and off, which would show our parents’ sleep times, though we question the usefulness of this and whether they would want that kind of monitoring. It could also be used for parents in aged-care homes: a gentle nudge on our phones just letting us know the rhythm of their lives.

Where the wild things are

We are taken into the world of the imagination, where we find a thread of connection in a simple SMS, sent not by our moms but by a machine silently placed beside their bedside tables, reminding us that they matter. We live in a world of over-communication from all sides; we may receive a message every day from loved ones forwarding jokes or making casual remarks, but there is something magical about a voice and about the time we spend just reflecting on the time and space they might be in.


A sketch I did thinking about our concept over the weekend.

Fabrication: (Tabitha)

We decided early on that it was important for our project to feel nostalgic. Working from the idea of a mom’s bedside table we began to think of objects that could house a light sensor, Arduino and battery pack that wouldn’t feel out of place. We settled on a book because it would be large enough to house the components and could be sent through the mail to wherever Mom lives.


Items from my home including the book corner.

I looked around my apartment for objects that fit within the vision we had for the project. My family is really into antiques and my husband works for the library so I had no shortage of materials to choose from in our home. In selecting these objects I tried to create a vignette with universal appeal. Even though the items come from my family it was important that I wasn’t recreating the bedside table of my own mom. To connect emotionally with our audience they needed specific details but also the freedom to project their own Mom onto these objects. So in our case that meant a cup of tea, an assortment of books, a lamp and an old family portrait.


Building the book required several hours of gluing and snipping.

After selecting a musty copy of Heidi from the never-to-be-read section of our bookshelf I headed to the Maker Lab where Reza shook his head and said there were no shortcuts in this case, each page had to be cut and glued by hand! At over 250 pages this had me wishing I had chosen a smaller book… but thankful that I hadn’t picked something ridiculous like Ulysses. The light sensor was placed on the front cover of the book and two holes were drilled to connect the wires to the Arduino. At the end I attached velcro to the front cover so the components would stay in place.

The final setup at the Grad Gallery.

Before the presentation we set up the table with the light-sensitive book as well as props to help build the world of this fictional mom character. There was a cup of tea – brown rice tea, low in caffeine since it’s just before bed! She keeps by her bedside a copy of Heloise’s Kitchen Hints as well as  Machiavelli’s The Prince and we leave you to decide which of the two books has the greater influence! After it was all set up we did one more test to make sure we were receiving messages from the Feather to our phones.

Future Development of a User Interface: (Jingpo)

We decided to keep the main function and the user interface simple this time. For future development of this project, we are thinking of a web interface that provides users with other useful ancillary functions. After the class critique last Friday, we found one interesting insight: sometimes it’s very difficult to call our parents when we haven’t been in contact for quite a while.

We got a very good reaction from the international students about this project. The emotional response while demonstrating the project exceeded our expectations. Many of them said they would buy the device if it were for sale.

Follow-up functions for products:


Possibilities for a calling interface.

1. Web page: When you receive an email or text message alert, you could click on a link to an external webpage.
The whole webpage would be created in p5.js (JavaScript). The image of the page changes when the light is turned on and off. Users can visually check multiple data points, such as local weather, current temperature, humidity and air quality, date and time. We found that the API response can include the central district of the city or town where mom lives, identified by its own parameter (city name).

2. Sensors:
If possible we could add a temperature sensor to this device, so users would know not only the local temperature but also the real temperature at home.

3. Click-to-call links:
In most cases an international call would be very expensive, so people usually choose to call their moms online. It would be great to create click-to-call links for mobile browsers. They could call their mom directly through the link without downloading or opening a video chat application such as FaceTime or Skype. We found some meta tags and syntax that call the Skype or FaceTime applications from a website.

4. Data generator:
Hopefully we can access the user’s personal data and provide some useful statistics, such as “What time does she usually go to sleep?”, “When was the last time you called her?”, or “How long did she read yesterday?” We care about our parents’ health and want to know if they go to bed on time even when we don’t call them every day.

5. Chat topics:
We are very interested in the insight that sometimes it feels difficult to call our parents. You miss her voice and want to call, but something holds you back; after struggling a bit, you end up choosing text instead. If possible we could randomly suggest some topics you can talk about with your mom.

A tracker that documents mom's routine before bed.


Class Critique and Conclusions:


Presenting our project to the class.

We had discussed doing a role play scenario where one of us would act as the mom but that didn’t seem like the right direction. As we were setting up Tabitha had the thought of calling her real mom during the presentation – coincidentally they had just been talking on the phone that morning. So Tabitha sent a synopsis of the presentation as well as some photos of the setup and told her mom to be herself and say whatever she’d normally say before bed. She was very excited to be asked to participate in the project!


Sending secret messages to my mom in class.

The feedback we received was complimentary, but the most striking part was the emotional response that happened while demonstrating the project with Tabitha’s mom. We were able to see the heart of the project reflected in the faces of our classmates. There was a sense of understanding why this simple device matters and how it can make a big difference in a very small way.

We were asked to think about practical aspects like future iterations of the device. Suggestions included shrinking it down to bookmark size, repurposing it for different family members and networking it with messaging services to develop a calling interface. The gallery project could be expanded by creating multiple character vignettes using the bedside table theme. However, no technology can fully address the question “Now that she’s on the phone, what do I say to her?” Sometimes it’s very difficult to relate to our parents as they are disconnected from our day to day. But perhaps that’s beside the point. Moms are resilient, and all they want is a brief acknowledgement that they are loved through the simple act of saying goodnight.

References/Context: The following are some items, blogs and resources that inspired us and helped develop our project.


https://learn.adafruit.com/wifi-weather-station-with-tft-display/software

This weather station was the initial inspiration for our parent-child communication device. The plan was to use this LCD screen; however, we changed the direction of our project.

http://blog.ocad.ca/wordpress/digf6037-fw201602-01/category/experiment-3/

As we were still exploring the calendar idea we found this project from a previous digital futures class.

https://www.hackster.io/createchweb/displaying-an-image-on-a-lcd-tft-screen-with-arduino-uno-acaf48

This was useful in trying to troubleshoot the lcd screen and arduino connections.

https://www.youtube.com/watch?v=znFMNzT_Gms&t=107s

This is the video I referred to for help with setting up IFTTT; he uses block code to set up the function.

http://easycoding.tn/tuniot/demos/code/

This is the block code editor for the ESP8266. It makes troubleshooting code a little easier if you can’t follow normal syntax. It also provides the C++ code when you build your logic in the block code editor, so you can just copy and paste the code into your Arduino sketch.


https://en.wikipedia.org/wiki/WireTap_(radio_program)

Tabitha – The inspiration to call my mom came from years spent listening to Jonathan Goldstein interview his family on CBC’s Wiretap as well as a general interest in ‘ordinary people’ as performers. Three years of Second City training has taught me the power of unscripted acting for its spontaneity and truthfulness, but I especially love it when untrained actors are brought onstage. All it takes is a short briefing about the premise and away they go. That’s when the magic happens.

 

Winter Is Here!

Experiment 3: Winter is Here

by Tabitha Fisher

Description

Winter is Here is a multimedia installation that fully embraces the inevitable. The project uses a p5 snowflake simulator in combination with Arduino which allows the user to “turn up the dial” on winter for a fully immersive experience.

Code

https://github.com/tabithadraws/winter

Process Journal

Step one: make a thing control another thing. A daunting task when you can barely understand your previous things. See, in the last two assignments I had the benefit of working with some very smart people. We balanced out each other’s strengths and weaknesses. I can delegate, I can facilitate and I can ideate, but I’m not great at doing when I don’t know what I’m doing. I am also not great at retaining information without having time to let it sink in. Which is a problem when that is the guiding philosophy of this particular course. To rephrase that Jenn Simmons quote: the only skill I know is how to identify what I don’t know… and then what???

I began this project knowing that I had to keep it simple. My goal was to have something as soon as possible with the thought that at any moment this can be taken away. Basically, working from the general to the specific. It’s a philosophy I use while drawing/writing/animating – the idea that your work is never truly finished so you must have something to show at every stage of the process. It’s not meant to be as melodramatic as it sounds.

I knew I wanted to understand the code. In the previous group projects I understood the theory behind some of the code, but I couldn’t confidently explain how it works. So, rather than start with some lofty plan that I couldn’t possibly execute I decided to work from what was already working as a basis for my experiment. My goal was to create a project that I could slowly take the time to understand.

In class we were learning about the JSON protocol and how it allows the Arduino and p5 to talk with each other. There was an in-class example that used the mouse position on-screen to control two LEDs. After a few tries and a bit of help I was able to get it working, which was quite exciting. I also happened to have my potentiometer hooked up from a previous exercise, and it was at this point that I realized I could have multiple devices on the same breadboard without having them mess each other up. There was also this serial port thing that was fairly new – in order for JSON to work we had to use this serial control software and add the serial port name into our code. I managed to get that working too. Knowing you can do something in theory is very different from actually doing it yourself.
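
For reference, here is a rough sketch of what the Arduino side of that exchange might look like. This is my own reconstruction rather than the class example: the pin and the JSON key name are placeholders. The potentiometer value is printed over serial as one small JSON object per line, which the p5 sketch can then read through the serial control software and parse.

// Minimal sketch: send the potentiometer reading as JSON over serial.
const int potPin = A0;   // potentiometer wiper (placeholder pin)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int potValue = analogRead(potPin);   // 0-1023
  // One JSON object per line, e.g. {"pot":512}
  Serial.print("{\"pot\":");
  Serial.print(potValue);
  Serial.println("}");
  delay(50);
}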


Fig. 1 – Two things on the thing and nothing exploded!

After this I thought since I’ve already got a potentiometer on here I should try to get it to control some stuff. In the p5 examples page there was a particle simulation for falling snowflakes. I ran this idea past Nick and we discussed possibilities for displaying the project. I thought it would be pretty lame to just have it on a laptop and shake it like a snow globe, so Nick suggested that I check out what’s available in the AV Rental room. That’s when I learned all about short throw and long throw projectors!

At this point, I had to remember how to get the P5 example to display on my computer. Basic, I know, but at this point we had tried out so many other new things in class that it was a struggle to remember the steps. Oh yeah… I had to make a new folder with an index.html document and copy the code into a javascript document. And open the index.html in Chrome. Baby steps, people…


Fig. 2 – Mockup of my dial that I used from a class example.

I got the snow working but I had no idea how to control it – or even begin trying to figure out that information. We had been given a document from the in-class assignment that I had already loaded into my Arduino but I wasn’t sure how to figure out the code. A few classmates had some ideas but I wasn’t able to fully follow along, which had me even more confused. One of them even got the snow to fall but I didn’t understand how they did it and there were way too many snowflakes, and I couldn’t fix that either. There were many challenges such as remembering the shortcut to access the console – not to mention remembering it was even called ‘console’ so I could google it and find out for myself. At this point my n00b brain was completely fried as I came to the realization that all my computer knowledge up until this point has relied solely on an understanding of a few select programs and outside of them I am a helpless baby kitten.


Fig 3. – Way too many snowflakes, my computer is dying! But at least the potentiometer is working!

After speaking with Nick he suggested that I put the Arduino away for now and spend the weekend trying to understand the code as well as the controls for the snowflake properties. Could I figure out how to change the number of snowflakes? How about the size? So I went home and adjusted some of the code on my own. To start, I knew I wanted to make the screen larger than a little rectangle so I played around with the canvas size. Then, I changed the colour to blue! Next, I wanted to try adding an image but the usual method wasn’t working.


Fig 4. – Bigger canvas! Blue canvas!

During Explorations class I tried to add images by putting one in my library and writing the title into the code, but nothing was happening. I found an example from the p5.js site that described how to load and display an image using the command line to create a simple server. It was not working as described, so I abandoned that option and took Kate’s advice to host my sketch on the OCADU server for now. I just wanted to see if the image was working!


Fig 5. – Oh yeah, remember cyberduck? What a quack… #dadjokes

After a bit of tinkering I got it to do the thing! Meaning, my image was on screen and my snow was falling. Super great. Except this wasn’t a permanent solution since Arduino can’t talk to my website over the OCAD server… le sigh.

Fig 6. – Webspace snow video

So I put aside the idea of working with the image on screen for the moment and focused on trying to get the Arduino to speak to the p5 sketch. Which had been working a few days ago. But when I sat down to recreate the interaction I couldn’t remember what I needed to do. I knew I had to load something into the Arduino, but what? I had taken notes but suddenly they no longer made sense. Was that something located in my project folder? Weren’t they supposed to be .oi files – and how come I wasn’t seeing one of those? And how come when I tried to load my index.html page I got… these horrible grey streaks??


Fig 7. – The saddest snow

At this point I was very concerned that I had somehow ruined the whole thing. It was already Wednesday and the project was due on Friday. Going back to my work ethos that you should always be at some form of “done” for every stage of a project… at this point, it seemed as though I had no project at all. Dark times. I spent hours fussing with the code, trying to get it back to where it was a few days earlier but to no avail. I had a meeting with Nick and Kate that afternoon where I was hoping to discuss presentation possibilities but at that point I had nothing to show and no idea how to fix it. I really can’t recall a time where I felt more lost on a project.

Nick got me back on the right track by starting over with the original p5 example. It appears that I had just messed up something in the code, but within a few minutes he was able to get it going again. I was relieved but also pretty frustrated that I wasn’t able to figure it out on my own. He also pointed out the videos on Canvas that showed us how to make a server on our computers. That’s exactly what I needed to run the sketch with my images. Then it was time to head off to work for the evening, so any testing would have to wait until the following day.

Fig 8. – Building it up again

I was able to spend the next day applying these changes in the time between my other classes, and for the first time I was able to get through it all smoothly. I was simply retracing my steps, but this time I was understanding why I was doing them. When my Arduino failed to speak with my sketch I knew to re-select the correct serial port. I had a better understanding of the purpose of the index.html file vs the javascript file. I swapped my old background image for a new one. I knew where to go if I wanted to upload the .ino file to my Arduino. I felt as though I was controlling my project rather than allowing it to control me!

Fig 9. – Hey, things are working!


Fig. 10 – Thanks, coding train!

The best part was getting the image up onto the wall. By chance I had positioned the projector so that it was tilted towards the corner of the room, and when I turned it on the image came to life. Because the image was a nature scene it created the feeling of a little immersive world. A simple one, but still. The projector took it from being what is essentially a 90s-era screen saver to an actual installation project. I’ve never made such a thing before. Generally, I don’t make ‘art’. I make assets that are part of a great big commercial venture (animated tv!). Or, if I make something for myself I do so using methods I already know (films! sketchbooks!). But nothing that I’ve made has ever been in a gallery. I have always wished I could do this, but modern forms of installation art have always seemed so mysterious. Nuit-Blanche type stuff for fancy art folks. How do they come up with those ideas? What are the guiding principles behind this style of making?

Fig. 11. – My baby on the big screen!

I can draw stuff and I can draw it pretty well, but DF requires an entirely different set of skills. It would have never dawned on me to consider the effects of scale on a projected image unless it was a comparison between a film screened in a festival vs on a phone. I remembered back to the second project when Kate described the opportunities of working with physical materials. In terms of code the project may be technically simple but the way we use the physical environment can turn the project into something special. I knew I wanted to create this immersive snow environment but when I saw my classmates react so positively to the projection I thought it could be about revelling in the thing we dread… winter… with its dark days and icy sidewalks. My project could be about embracing the best sides of winter and all the things that make it special. Cozy scarves and hats. Hot drinks and chocolate chip cookies. A little holiday ‘muzak’ for ambiance. The comfort of the familiar. A metaphor for this particular journey of mine, perhaps?


Fig. 12 – Toques!

Presentation

On the morning of the presentation I gathered the last of my supplies and left time to set up and ensure that my project was running properly. One of my classmates had suggested using Incognito while working on the project to ensure that my changes updated properly in the browser. It also had the side benefit of being quite dark, which helped it blend in with the rest of my image while I was presenting. In a moment of brazen recklessness I decided to pull the yellow and blue LEDs from my breadboard moments before the start of class. They weren’t exactly needed anymore, and I felt that I understood my setup enough to do it with confidence. Thankfully I was right. Then, glorious synchronicity: I learned that another classmate had brought an office-sized vessel of coffee to share, which they generously donated to my cause – I was planning to pick up something similar for my installation. I had grabbed some hats and scarves at Black Market and found some tasty-looking cookies at Rabba to share. They need to be tasty or what’s the point?

When it was time to present I cranked the Bublé tunes and found myself feeling… strangely exposed. My project was not nearly as sophisticated as the work of my classmates. It had taken me two weeks to use a preexisting snow simulation and make it work with a dial. What’s so special about that? Could you even call it an experiment? Well, for me it was and here’s why. Up until the moment I started at DF my value as an artist (and maybe as a person?) has been measured by my ability to draw. This has always been my currency. It’s at the very core of me, but in a way that’s very limiting. I came to this program to explore the unexplored and expose myself to methods of working that I know nothing about. Well, coding is one of those things.

Interestingly, during the critique Kate made the point that she wished I had used some of my drawings on this project. I agree that would have been great, but admittedly it hadn’t crossed my mind because my last few months of schooling have been about exploring worlds beyond that person. It is very easy for me to dress up one of my projects with a nice drawing. I know that’s not how she meant it, but I wanted to have a reason for using my drawings and I hadn’t quite arrived there yet. In a way I think I needed that reset. Maybe it’s silly to purposefully disengage from that part of myself but I was hoping that I’d benefit from the distance. Like going away on a very long holiday to somewhere completely new, only to return with a newfound appreciation for the familiar. Having gone through this process I now feel that I’m ready to reimagine what’s possible.

Fig. 13 – Final Snow

 

Context


Underwater Aquarium – Windows 98

Once I started working with the p5 snow example I noticed how it gave off a ’90s screensaver vibe. I really love that kitschy aesthetic. While I wasn’t able to fully explore the options here because I was so caught up with managing the basics, I kept them in mind when selecting the image. I have fond memories of those fish. https://www.youtube.com/watch?v=5j5HA3Z8CZQ


Office party photo booth

These days it seems as though every office holiday party needs to be equipped with some kind of photo booth. My favourite part about it is the fanciness of the suits in contrast with the silliness of the props. When I think of an installation this is what comes to mind – probably because I’ve experienced more office holiday parties than immersive art projects. But there’s an earnestness to it all that I love very much.

 


Winter as a national tragedy

I am very interested in the way citizens of a large city collectively gripe about specific topics throughout the year. It seems as though everything is always the worst. Winter especially. Maybe it’s somehow cathartic for us to perform this strange ritual at the turn of each season?

 

References

Original snowflake simulation example:

https://p5js.org/examples/simulate-snowflakes.html

 

Instructions on loading images (not super useful, however):

https://p5js.org/examples/image-load-and-display-image.html

 

Making a local server for the P5 Sketch:

https://www.youtube.com/watch?v=UCHzlUiDD10

https://www.youtube.com/watch?v=F6tP3joL90Q

 

Panic Mode

Panic Mode
By Omid Ettehadi, Lauren Connell-Whitney and Tabitha Fisher

Overview:

“Panic Mode” is a multi-person experiment that gives a physical and sensory form to the experience of human anxiety and introversion. Starting from the question “What happens when you go into panic mode?” we built a wearable Arduino device that measures the proximity of the people around you. When someone gets too close the plastic collar puffs up like the neck of a frightened animal and a mood indicator flips from a calming blue colour to an aggressive red.

The object itself comes in two parts that work together. An Arduino Micro board and its components are housed inside a circular box that is worn as a necklace draped along the chest. A mood indicator is painted on the bottom half of the front panel with a colour that ranges from blue (calm) to purple (in between state) to red (panic mode). A white arrow points to the current state, determined by data picked up by an ultrasonic sensor placed above to measure proximity. The second wearable is a plastic collar connected to two fans, one for inflation and another for deflation, and hidden inside is a string of red LED lights.

When worn the ultrasonic sensor gauges the distance of the user in relation to other people in the room. A safe distance is determined within the code and when someone crosses that threshold the mood indicator shifts closer to red. In this scenario the threatening figure is given a visual warning that they are invading the user’s space and have the opportunity to either step away and neutralize the situation or continue their advance. If they choose to press forward both the mood indicator and the string of LEDs flip to red and the plastic collar puffs up in an act of defence. To restore calm the threatening figure will need to back away to a safe distance at which point the deflation fan will kick in, the red LEDs will switch off and the mood indicator will turn back to blue.

Within the code the measurement of safe distance resets after each moment of full panic. Human moods can be unpredictable and sometimes we must approach with caution. In Panic Mode the responsibility to regulate emotion falls on both the wearer and the person initiating the distress. In order to restore calm both people will have to empathize with each other’s position and negotiate a comfortable distance that works for everyone.

Process:

We began ideating by talking about all of the things we as a group were interested in, and came up with several ideas, all quite different but generally relating to how technology changes the initial human assumption of output.

We talked about signalling and how technology can be used as an agent for human emotion, what this meant for the viewer and how this changes the interaction. Some initial ideas for making included an emoji container that could be used as a playful way to pass notes in class. It is interesting how much overlap there was in class with emoji use. It seems that many of us are examining how communication happens and why this particular mode has become the popular way of sending small thoughts to each other.


That being said, we decided not to pursue the emoji device. We began talking about how one human reaction could be read positively or negatively, how the reading was in the eye of the beholder, and how an interaction has two sides to it. Personal space was also something we began examining, and how the concept differs for all people depending on several factors: culture, relationship, comfort, mood. We had all done an exercise in another class that sparked a conversation about the comfort of touching each other and the levels of personal space that we each had. This was a catalyst for the Panic Mode object. In some initial ideas we had spoken about using animal sounds as an alert; this led us to talking about how different animals display discomfort or aggression, which then led us to talking about inflatables.

New rabbit hole: inflatables. We all loved this idea! Though, the creation of an inflatable depends entirely on airflow. So began a deep dive into all the fans of the non-Amazon world that were available to us for under $15. Our naive assumption was that the fan would be the least of our worries.

However, this was not the case. We tried a variety of fans around that price point, and all of them, in the end, turned out not to be able to spin in two directions. This was an issue since we needed inflation and deflation as mechanisms to display calm and agitation. More intensive research produced no better idea than to use two fans (at the price point we had initially set for our budget). Beyond the initial research, we found that no one working with inflatables had actually found a cheap way to produce a product that self-inflated.



Next we began talking about the object to hold the mood indicator: a container to house the circuit board, servo motor and ultrasonic sensor. The idea of a pre-made container was enticing… of course. So we took a trip to Chinatown for the wonders of pre-made things that could be repurposed. The container we settled on could be opened and closed easily and fit the PCB and all of our mechanisms nicely. It was a prefab bug collector for kids. We only had to drill holes for the ultrasonic sensor and the servo motor, so a trip to the Maker Lab was in order. We began hand drilling holes in stages because the hard plastic was very delicate. In speaking with Reza, we found that the drill press would solve our problems. The drill press was a dream, and we are all looking forward to using more of the capabilities of the Maker Lab at a later point.


Next we took to fashioning a more geometric shape for the inflatable. We began by ironing two pieces of plastic together as a test. Then we began drawing and shaping a paper pattern to make the final inflatable. Overall, the experiment did not turn out as planned but was a good test run for further development. We definitely learned about how this type of plastic shapes itself when inflated.


At the end we put all our findings together and came up with these final results:


Production Materials:

Github Project Link : https://github.com/Omid-Ettehadi/PersonalSpace

One thing we wanted to get around was the quantification of emotions that is done in most digital designs. We wanted our emotional indicator to be as close to reality as possible. In order to do that we thought about having two indicators: a single binary one, showing only angry or happy, and another indicator that could provide more stages within our happy or angry states. To check how each component related to our project worked and how we could use it, we tested each one individually and then tried to combine them. The servo motor played an essential role in our project, so to find out its capabilities we tested it first. We programmed it to start from 15 degrees and move up by 30 degrees, showing a different angle at different times.

Breadboard diagram: servo motor test.
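
A rough reconstruction of that servo test is sketched below; the servo pin and the timing between steps are assumptions rather than our exact values.

// Minimal sketch: step the servo through six positions, 30 degrees apart.
#include <Servo.h>

Servo indicator;
const int servoPin = 9;   // placeholder pin

void setup() {
  indicator.attach(servoPin);
}

void loop() {
  // Step through 15, 45, 75, 105, 135, 165 degrees - one angle per "state"
  for (int angle = 15; angle <= 165; angle += 30) {
    indicator.write(angle);
    delay(1000);            // hold each position for a second
  }
}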

We then tested the ultrasonic sensor. We needed a sensor to get a reading on the proximity of objects, and as we all had an ultrasonic sensor, we checked it. We programmed it to report the distance of objects from the sensor in centimetres and inches.

Breadboard diagram: ultrasonic sensor test.
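
A minimal sketch of that distance test, assuming an HC-SR04-style sensor with separate trigger and echo pins (the pin numbers are placeholders):

const int trigPin = 7;
const int echoPin = 8;

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // Send a 10 microsecond pulse and time the echo
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  long duration = pulseIn(echoPin, HIGH);   // microseconds until the echo returns
  float cm = duration / 58.0;               // standard speed-of-sound conversion
  float inches = cm / 2.54;

  Serial.print(cm);
  Serial.print(" cm, ");
  Serial.print(inches);
  Serial.println(" in");
  delay(200);
}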

We then connected the two circuits together. We programmed it so that, depending on the distance of the closest object, the servo moved to a specific angle out of the six options. It also printed the angle and distance to the serial monitor. One thing we found was how jerky the movements of the servo were. We realized that dividing the range into only six states limited the number of levels of emotion that the servo could show, so we decided to find a way to make the servo move through every single angle.
Breadboard diagram: servo and ultrasonic sensor combined.
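
Sketched below is one way to combine the two: read the distance, map it onto one of the six servo positions and print both values to the serial monitor. The 20 cm band size and the pin numbers are assumptions for illustration, not the exact values we used.

#include <Servo.h>

Servo indicator;
const int trigPin = 7, echoPin = 8, servoPin = 9;   // placeholder pins

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  indicator.attach(servoPin);
}

void loop() {
  // Measure distance (HC-SR04 style)
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  float cm = pulseIn(echoPin, HIGH) / 58.0;

  // Map the distance onto one of six servo positions (20 cm per band is an assumption)
  int state = constrain((int)(cm / 20), 0, 5);
  int angle = 15 + state * 30;                // 15, 45, 75, 105, 135, 165 degrees
  indicator.write(angle);

  Serial.print(cm); Serial.print(" cm -> "); Serial.print(angle); Serial.println(" deg");
  delay(200);
}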

We then started playing with light as a binary indicator of emotion; added to our previous steps, this could give us a much bigger range of emotion to play with. We added an RGB LED to the program so that if an object got closer than the safe distance, the LED would turn red and the safe distance would increase, and if an object moved further away than the safe distance, the LED would turn green and the safe distance would decrease. The servo motor was also set to show the safe distance. But the LED was not bright enough to be very visible and didn’t get the feeling across, and a brighter LED would have made the device much more expensive to build.

Breadboard diagram: adding the RGB LED.

At this stage we were sure that we wanted to work with fans as a binary indicator for emotional states, but we were still looking for a fan that could do the job. We first started working on the algorithm while still using the LED instead of the fan. Initially we kept track of the state using a variable: if an object was further away than the safe distance we added to the variable, and if not we subtracted from it. The value was then mapped onto the six states, and based on each state the servo would move to a specific angle and the LED would turn a specific colour.

Breadboard diagram: trial algorithm circuit.
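
The sketch below is a compressed reconstruction of that trial algorithm: a running “mood” variable is nudged up or down by each distance reading, then mapped onto the six servo states, with the LED acting as the binary indicator at the extremes. Pin numbers, the safe distance and the step size are all assumptions, and it leaves out the changing safe distance described above.

#include <Servo.h>

Servo indicator;
const int trigPin = 7, echoPin = 8, servoPin = 9;   // placeholder pins
const int redPin = 5, greenPin = 6;                 // two legs of the RGB LED (common cathode assumed)
const int safeDistance = 60;                        // cm - an assumed threshold
int mood = 0;                                       // 0 = fully calm ... 100 = full panic

float readDistanceCm() {
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  return pulseIn(echoPin, HIGH) / 58.0;
}

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  indicator.attach(servoPin);
}

void loop() {
  float cm = readDistanceCm();
  if (cm < safeDistance) mood = min(mood + 5, 100);   // someone is too close: build toward panic
  else                   mood = max(mood - 5, 0);     // they backed away: calm back down

  int state = map(mood, 0, 100, 0, 5);                // the six emotional states
  indicator.write(15 + state * 30);                   // servo shows the in-between states
  digitalWrite(redPin,   mood >= 100 ? HIGH : LOW);   // binary indicator: full panic
  digitalWrite(greenPin, mood == 0   ? HIGH : LOW);   // binary indicator: fully calm
  delay(100);
}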

It was very difficult to find a strong enough fan at the price point we had decided on in the time we had. All the fans we were able to find had a circuit built into them, making them work in only one direction, and we needed something that could do both the inflation and the deflation. One option was to use an air pump, but the cheapest one was 30 dollars (more than our budget), so we decided to stick with fans. Fans can’t handle pressure the way air pumps can, but they do move air into a bag. To get around the problem of a fan only moving air one way, we tried sticking two fans on top of one another, but because of how fans are designed they only allow air to flow easily into them from one direction, so this greatly reduced the power of the second fan. We decided instead to put a fan at each end of the bag. This made the design more symmetric, but also heavier. The fans that we found were not very strong, so we needed to give them extra time to fully inflate or deflate the bag. The servo was set to keep track of how long an object had been in the clear or in the personal space of the user, and the air bag was used as a physical indicator, so that when someone crossed the barrier it would light up and fill with air. The idea was to have a visual display of the user being choked by the inflatable, which in turn would cause the person on the other end of the interaction to move away out of sympathy.

The board in production in its many stages of iteration:


Product List:


Schematic:

Breadboard view and schematic diagram.

 

Related Works:

Our initial interaction with the idea of inflation came from several sources, one of which was Kate Hartman’s work on the Inflatable Heart: an external organ that you can inflate or deflate to show and communicate your emotional state.


In the early stages of development we looked to Rafael Lozano-Hemmer’s work Vicious Circular Breathing as an exploration of breath, anxiety and social discomfort. The crinkling effect of the paper bags in this project demonstrates the power of sound when working with physical materials.

Vicious Circular Breathing

In order to get a better idea of how to use inflation in our design and how to automate the process, we looked at Hövding, the inflatable helmet. This is something cyclists wear instead of a traditional helmet. If they get into an accident, a helmet made of air immediately inflates around their head, protecting it from hitting the ground.


Another project we looked at was the RE-Inflatable Vest by Ziyun Q., where she uses a micro pump to fill up the vest with air. It was designed for office workers who constantly hunch over their laptops. The design fills up with air every 20 minutes, forcing the user to fix their posture in order to let the air out of the bags.


Reflections:

Our main takeaway from this project was an understanding of the limitations of physical materials. In theory everything can work but sometimes components fail to operate as expected. With a limited budget and a quick turnaround we were unable to find the ideal fan for our inflatable so we settled on what was available, which did not provide the desired outcome. We also found that elements of the project worked separately while connected to the computer but failed to perform as needed once they were powered by the battery pack. Now we understand why it’s so important to get the components into the casing early to allow for more testing. Overall we feel that the project was a success because it taught us about the frustrations of working within the physical world and how a contingency plan is not just a luxury but a necessity.

The concept itself is solid and one of the benefits of working with real materials is they can become a source of inspiration. We were able to further refine our concept by exploring the form in 3 dimensions. For example, when placing our hardware into the circular casing we noticed that the configuration of the ultrasonic sensor in relation to the mood indicator gave a face-like appearance to our front panel. Later, when inflating the plastic bag around our necks, we noticed how uncomfortable it made us feel and realized how the sensory aspect to the project could relate directly to the core concept. The Panic Mode device is uncomfortable for the viewer but also for the person who wears it, making this work a multi-faceted exploration of both the experience and the effects of social anxiety in our society.

During the critique our project sparked a great discussion about personal space and how we experience anxiety in a public setting. The class felt that the device might serve the practical function of providing visual cues for a nonverbal person in distress such as someone experiencing a medical emergency or a psychologically triggering incident. We all agreed that it would be interesting to see how multiple devices would react to one another – perhaps a singular panic incident would cause a ripple effect throughout the group like a startled flock of birds. It was suggested that we consider adding sensors to other parts of the body since people are approached from many angles, not just the front. In the end it seemed as though Panic Mode was well received by our classmates and many were able to relate directly to the themes of our experiment.

References:

Antfarm. “Inflatocookbook.” Inflatocookbook, inflatocookbook.kadist.org/.

Cottrell, Claire. “A Beginner’s Guide to Inflatable Architecture.” Flavorwire, 6 July 2012, flavorwire.com/306518/a-beginners-guide-to-inflatable-architecture.

CrimethInc. “Inflatables.” Inflatables | Destructables, 14 June 2011, 12:00am, destructables.org/node/53.

Hartman, Kate. “The Art of Wearable Communication.” TED: Ideas Worth Spreading, ted.com/talks/kate_hartman_the_art_of_wearable_communication.

Hartman, Kate. “How to Make an Inflatable Heart.” Instructables.com, Instructables, 9 Nov. 2017, instructables.com/id/How-to-Make-an-Inflatable-Heart/.

Kraft, Caleb. “Learn Plastic Welding with Giant Inflatable Tentacles | Make:” Make: DIY Projects and Ideas for Makers, Make: Projects, 22 Oct. 2015, makezine.com/projects/learn-plastic-welding-giant-inflatable-tentacles/.

Lozano-Hemmer , Rafael. “Vicious Circular Breathing.” Rafael Lozano-Hemmer – Project “Vicious Circular Breathing”, www.lozano-hemmer.com/vicious_circular_breathing.php.

Q, Ziyun. “RE-Inflatable Vest.” Instructables.com, Instructables, 11 Oct. 2017, www.instructables.com/id/RE-Inflatable-Vest/.

“Test: Hövding’s Airbag 8X Safer than Traditional Bicycle Helmet! This Is How It Works.” Hövding, hovding.com/how-hovding-works/.

Dancy

Experiment 1: DANCY
By Norbert and Tabitha

Description:
Dancy is an interactive mobile site that lets you create a spontaneous dance party with friends. Using p5, it allows the user to select from multiple MIDI tracks to play back a unique dance mix – but the songs will only play when the phone is shaken. If you want to hear the music you’ve got to dance!

GitHub Link: https://github.com/Norbertzph/dancing3 

Process:

The idea for Dancy came about after a night of hanging out with our fellow classmates in Kensington when we realized that they love to dance! So we asked ourselves whether we could create a dance party experience using multiple phones. First we determined that the most important part of a dance party is the music. Each phone would represent an instrument, and when all the phones came together they would form a band.


First Wireframes

After our discussion we created separate wireframes to see if we were both on the same page about the project. We realized that this project would also need a menu where you can select the instruments.

Norbert shared video of an art project called “Butterfly Room” to help explain his vision for the symbols. We decided that the symbols should look cute and fun, so we also referenced the dust creatures from the animated film “Spirited Away”.

img_3094 img_3093

Early Brainstorming – Exploring Jazz Music/Garageband

Next, we asked ourselves what type of music would be playing. We knew we wanted it to sound good, and with the limited timeframe we could not record our own music tracks, so we experimented with Garageband. We tried importing jazz standards into Garageband and found that they were very complicated – if the tempo wasn’t perfect the music wouldn’t sound good. So we experimented with the Apple Loops provided in the Garageband library and pulled simple tracks with a clear beat.


Garageband audio files

In class we had a conversation with Nick that helped clarify our idea. He suggested that we explore the phone’s shake function as a way of playing our music. Kate had provided a document with a few p5 projects that we tried out in class, and we found inspiration in the “shake to change colour” example. Nick said that we could use the phone’s accelerometer to change the tempo of the music, and different gestures could achieve different results as well. So we dug into some more p5 functions to see if there was any code that could help us.


Animated character poses for Dancy that were later scrapped.

There was a period of time where the program had more complex animation. We wanted the characters on the homescreen to wiggle and when a user shook their phone in the music page the character would squish into the side of the screen. Also, we wanted the accelerometer to speed up the playing of the music when shaken so that the user would have to sync their gestures with their neighbour to achieve the same tempo. Ultimately we didn’t figure out how to do these functions in time for the deadline and dropped them from the project to focus on Dancy’s core functionality. Nick reminded us that the user would not be looking at their phone while shaking anyhow, so we feel that it was a good decision to invest our time in other things.


Working on the character animation inside a music page. 

A big milestone in the coding was when we discovered how to move a white box to either side of the screen. Later we would change the square to a colourful PNG character and make it play music when you shake your phone. For two people without any programming foundation, implementing such an idea is not an easy task. Therefore, once we determined the structure of the project we began to learn how to implement the various functions. Starting from videos, we worked our way from the most basic lessons of The Coding Train on YouTube to the function references in the libraries on p5js.org. Many times the code was technically correct, but it just didn’t work. We finally solved our major problems by consulting teachers and classmates. For example, playing music, which is a simple function, took two days of various attempts and still could not be achieved; but when Norbert asked Nick, he found that a line of code was missing and the file was being opened the wrong way. Through this process, we came to feel that writing code is the joy of trying, failing and finally achieving.

First movement of our white box

To be honest, before this project we were all a little bit afraid of coding. But in the process of completing this project we gradually fell in love with it. When we needed to implement a feature we went to the libraries to look up the code and found that there were many things we did not understand, such as abbreviations and logic, which hindered our understanding. Then we would go to YouTube to search for videos that explained these elements, and bit by bit we could break down a whole piece of code to fully understand its usage and logic. Then we could combine our own understanding and requirements to write the code that was suitable for Dancy. The whole process of learning can be frustrating in the beginning because trying is often met with failure, but when you calm down, learn a little bit and finally achieve even a small bit of progress, the feeling is very exciting.


Final Wireframe with shake function

Presentation:

On the day of our presentation we asked each of our classmates to select a colour from the six buttons on our homepage so that the music choices were evenly distributed. Tabitha ran through every button with the group so they understood that they play the music using the shake function. This was also an opportunity to isolate each sound before we played everything all at once. When it was time to combine all the tracks the class got into it immediately, and we noticed that the physicality of the shake gesture helped everyone get into the mood for a party. One of the students turned out the lights and – unprompted – people started to turn on their phone flashlights. The result was genuinely fun and we were surprised at how good the music sounded.


Classmates are instructed on how to use Dancy

During the group critique we got a lot of useful feedback. People wondered whether it would be more fun to allow the group to explore the sounds on their own rather than assigning people to groups. By doing so the music could shift and change as people discover different sounds. Another suggestion was to link Dancy to a playlist so that as more people play a particular track that music would spike, almost like a battle of the bands. Everyone agreed that Dancy’s simplicity was also its strength, and if we were to develop the project further we should double down on the program’s core functionality to make it even better.


Dance party begins!

Besides what was demonstrated in class, Dancy has many possibilities for play and group engagement. For example, under the guidance of a music teacher students could use the program to explore different instruments in a group setting and create unique combinations according to different styles. This way students can learn about a variety of musical styles using an approachable hands-on approach without having to master various instruments.

As mentioned in class, Dancy has another meaning for today’s isolationist society. At parties many people are busy looking at their phone screens, but real communication and interaction are lacking. Using Dancy, there is a direct correlation between the number of participants and the scale of the music and party atmosphere that’s created. Ultimately, it’s more fun if everyone participates. As a result, people can no longer stare at their phones because their bodies must always be in motion to generate the sound. No more texting in a dark corner while playing Candy Crush. They must shake their bodies to facilitate face-to-face communication with their friends.

Context:

We found a lot of inspiration in Norbert’s case study of Yuri Suzuki’s sound installation Sharevari, which relates to Dancy because of the common element of music and also his dedication to user experience. He creates a tool that is easily accessible for all groups so they can learn music easily by simply waving their hands. This inspiration helped us continue to develop the idea of Dancy. We wanted to create a small game that is easy to use without extra guidance and professional music knowledge. Anyone can pick up a phone with a friend and create an exciting musical experience through interesting interaction and exploration.


Yuri Suzuki’s sound installation Sharevari

As described earlier, we wanted the characters for Dancy to be simple and cute. Here we were both inspired by the dust characters from the animated film Spirited Away. We settled on a round shape with eyes because we had intended to animate the character bouncing from the side of the screen, and rubber balls are easy to animate. In the end we didn’t get to that part but it still worked well because they were easy to see on screen since they were not too complicated. The colours we chose were bright to create a more playful party aesthetic.


Creating the character buttons

In Conclusion:

If we were to continue to work on Dancy we would match each piece of music to a specific gesture. That way the user could explore the sounds without having to go back into the menu and they could jump to different tracks on the fly. This was a function that we were not able to create in time, but overall the project was still a success. If we were to run another group test with a new prototype it would be interesting to take the suggestion to eliminate the teaching element when introducing the game. We’d like to see what would happen if the users were able to explore the program on their own with minimal prompts. How would that affect the interaction? It would also be cool to include a flashlight function as that was such a large part of creating the party atmosphere for Dancy.

 

Project website: https://bit.ly/2yinzLJ

Github:  https://github.com/Norbertzph/dancing3

 

REFERENCES:

Code reference:

https://p5js.org/reference/#/p5/deviceShaken

https://p5js.org/examples/mobile-shake-ball-bounce.html

https://p5js.org/reference/#/p5/loadImage

https://p5js.org/reference/#/p5.SoundFile/loadSound

https://www.youtube.com/watch?v=nicMAoW6u1g&t=265s

Project reference:

https://www.taborrobak.com/butterfly-room

http://yurisuzuki.com/design-studio/sharevari

https://webspace.ocad.ca/~khartman/ColorShake/

http://midkar.com/jazz/jazz_01.html 

https://support.apple.com/kb/PH24971?locale=en_US
