
OptiTune

Title: OptiTune

Group members: Nana, Ginger, Jeffin

Code: https://github.com/jeff-in/visualizer/blob/master/optitune

Video: https://vimeo.com/195742513

https://vimeo.com/195900950

End of session report form: https://docs.google.com/forms/d/e/1FAIpQLScYtcUHOisW-cRpMRChXUwvzfECQ86sx5AraUiFuxYtsdoJAw/viewform?c=0&w=1

Data collected: https://docs.google.com/spreadsheets/d/1qDAdw28VxTC8LjGghtS2fVy7nuEwH92aJPYlenbDOWg/edit?usp=sharing

Item price list: https://docs.google.com/spreadsheets/d/1zHBtic3kez1F6UK0COSJLrJF5MW41BQQMshA4U6wR5g/edit?usp=sharing

What is the concept behind OptiTune?

Restoring the connection between an individual wearing headphones and the people around them, by displaying real-time visualizations of what the wearer is listening to.

What is the form?

A 3-inch cylinder that lights up to the beat of your music.

How does it work?

You can either plug in your headset and have the lights respond to your playlist, or take the headset out and have OptiTune visualize the sounds around you.
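To make this concrete, here is a minimal Arduino-style sketch of the basic idea: sample the incoming audio (or microphone) signal on an analog pin, estimate its loudness over a short window, and map that level to NeoPixel brightness. The pin numbers, pixel count, and biasing are illustrative assumptions only; the project's actual code is in the GitHub repository linked above.

// Minimal OptiTune-style sketch (assumptions: 16 NeoPixels on pin 6, audio
// signal biased to mid-rail through the resistor divider and read on A0).
#include <Adafruit_NeoPixel.h>

const int AUDIO_PIN  = A0;   // audio jack or microphone input (assumed)
const int PIXEL_PIN  = 6;    // NeoPixel data pin (assumed)
const int NUM_PIXELS = 16;   // number of pixels (assumed)

Adafruit_NeoPixel pixels(NUM_PIXELS, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  pixels.begin();
  pixels.show();             // start with all pixels off
}

void loop() {
  // Sample the signal for ~20 ms and keep its peak-to-peak amplitude.
  int lo = 1023, hi = 0;
  unsigned long start = millis();
  while (millis() - start < 20) {
    int v = analogRead(AUDIO_PIN);
    if (v < lo) lo = v;
    if (v > hi) hi = v;
  }
  int level  = hi - lo;                                   // rough loudness
  int bright = map(constrain(level, 0, 512), 0, 512, 0, 255);

  // Light every pixel with a colour/brightness that follows the beat.
  for (int i = 0; i < NUM_PIXELS; i++) {
    pixels.setPixelColor(i, pixels.Color(bright, 0, 255 - bright));
  }
  pixels.show();
}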

What materials does it use?

  1. Arduino Micro
  2. 3.5 mm audio jack (x2)
  3. 3.5 mm audio cable
  4. 100k resistor (x3)
  5. Proto board
  6. PCB mount pins
  7. Portable USB charger

Preparations

What needs to be done ahead of time?

  • Create prototype
    • Set up the board with the NeoPixel and microphone
    • Sew the prototype into a badge
    • Construct the circuit and decorate the badge
  • Charge the battery pack

Do you need extra batteries?

  • We have decided to use old phone batteries

What goes into your repair kit?

  • Fabric, thread and needle, battery charger

During the process:

First, we will document our own experience of using the OptiTune and select a group of testers to try it. We will document the process with video and written notes.

Second, we will carry the device with us and test it in daily life. We will take notes regularly and take selfies at different locations. We will observe other people’s reactions to us using the OptiTune and collect quantitative data, such as how many times people inquire about the product.

Crunching the data

How will you structure a debriefing conversation?

  • Meet for a debriefing conversation, and take notes

What will you do with the data and media once you find it?

  • Look for improvements for future iterations
  • What was liked about the device?
  • What was not liked about the device?
  • Was the device comfortable?

Where did you place your OptiTune?

We mostly put it next to our laptops when we were working on our projects at school. It was a fun and playful object that made stressful times entertaining.

What did you like about OptiTune?

Looking at the beautiful lights! Especially in the dark when the lights are off, or when you are taking a walk outside. It really is beautiful and exciting to see the music you are listening to.

What did you dislike about OptiTune?

The case, and the wires.

For future iterations we will have a case that can not only keep the device safe and waterproof, but can also be easily attached to anything.

How was the experience of the observers of your OptiTune?

We were on campus most of the time, so many of the people who passed us by were students. The fun part was when we were visiting the undergraduate buildings and the students would ask us if we had bought the devices from the OCAD store. Of course, with pride we told them that we had made them as part of our project.

Breadboard diagram: optitune_bb

Language of Love | NANA |


Project Title: Language of Love

Video:

https://vimeo.com/193072285

Code: https://gist.github.com/ex0self/bff5e57eb56597fe9a6120e3c1bae783

Project Description: According to a study at the University of California on the physical effects of being in a relationship, couples’ breathing patterns and heart rates synchronise when they sit close to each other. Drawing inspiration from this study, Orlando & I decided to put our relationship to the test.

We wore a heart rate & stretch sensor for the duration of 8 hours with the goal of altering the language of love through data. The more our breathing patterns and heart rates were synchronized, the higher the percentage of intimacy between the two of us.
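The post does not specify how the intimacy percentage was computed, so the following is only a hedged post-processing sketch of one possible approach: convert the difference between the two partners’ heart rates and breathing rates into a 0–100% similarity score and average them. All values and scaling factors here are illustrative assumptions, not the project’s actual formula.

#include <math.h>
#include <stdio.h>

// Map the absolute difference between two readings to a 0-100% score,
// where fullScale is the difference that counts as "completely out of sync".
float similarity(float a, float b, float fullScale) {
  float score = 100.0f * (1.0f - fabsf(a - b) / fullScale);
  return score < 0.0f ? 0.0f : score;
}

int main() {
  float bpmA = 72.0f,    bpmB = 76.0f;     // heart rates (example values)
  float breathA = 14.0f, breathB = 15.0f;  // breaths per minute (example values)

  // Average heart-rate and breathing similarity into a single percentage.
  float intimacy = 0.5f * (similarity(bpmA, bpmB, 40.0f) +
                           similarity(breathA, breathB, 10.0f));
  printf("Intimacy: %.0f%%\n", intimacy);
  return 0;
}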

Individual Work:

I was in charge of the pulse sensors.

Challenges:

  • Finding code that worked for the Feather (most of the available code is written for standard Arduino boards, which are more powerful). A circuit provided by a random guy on the internet scaled the sensor output down to a range of about 1 V (the Feather, unlike other boards, works with 1 V analog input), and because the Feather has just one analog pin and all of the sensors involved are analog, we had to use four boards. (A minimal reading sketch follows this list.)
  • Making the wearables was very time consuming.
  • Going through the 8-hour experiment while keeping our normal daily routine was very challenging; it was hard to behave naturally and not be constantly aware of the data we were collecting.
  • Wearables: the wires kept getting detached, so we had to constantly check whether we were “online”.
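As referenced in the first challenge above, here is a minimal reading sketch for a Feather-style board whose single analog pin expects roughly 0–1 V: read the divided-down pulse signal, detect a rising edge through a threshold, and print the resulting beats per minute. The threshold and baud rate are assumptions to tune per sensor; the project’s actual code is in the Gist linked above.

// Minimal pulse-reading sketch for a Feather-style board with a single
// analog pin (assumptions: sensor output divided down to ~1 V on A0,
// threshold tuned by hand -- not the exact code used in the project).
const int PULSE_PIN = A0;
const int THRESHOLD = 550;      // assumed; tune to your divider and sensor

unsigned long lastBeat = 0;
bool aboveThreshold = false;

void setup() {
  Serial.begin(115200);
}

void loop() {
  int signal = analogRead(PULSE_PIN);   // 0-1023 over the board's ~1 V range

  // Detect a rising edge through the threshold and treat it as a beat.
  if (signal > THRESHOLD && !aboveThreshold) {
    aboveThreshold = true;
    unsigned long now = millis();
    if (lastBeat > 0) {
      float bpm = 60000.0 / (now - lastBeat);   // beats per minute
      Serial.println(bpm);
    }
    lastBeat = now;
  } else if (signal < THRESHOLD) {
    aboveThreshold = false;
  }
  delay(10);
}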

Video Production:

Equipment:

  1. Canon T5i
  2. Two white fluorescent light stands
  3. A black backdrop
  4. Two wireless chest microphones
  5. One professional stationary microphone

Editing software:

  1. Adobe Premiere & After Effects
  2. GarageBand

Audio:

  1. A downloaded theme song (open source)
  2. A phone call conversation recorded and edited in GarageBand
  3. Narration by Rana, recorded and edited in GarageBand
  4. A downloaded heartbeat soundtrack (open source)
  5. Breathing sounds recorded by us

Storyboard:

Consists of 30 scenes.

Script:

Written in accordance with the storyboard.

Wearables:

Our main objective for the wearables was comfort. We had to decide where to place the hardware so that we would remain comfortable throughout the 8 hours. Also, we didn’t want to hide any of the hardware within the wearable (as part of the concept of our project, “displaying the language of love”).

Future Iterations:

  • Wearables: use thicker material for the shirts. The hardware seemed too heavy for the cheap shirts we had purchased.
  • Figure out a way to attach the hardware to the wearable properly so that the wires don’t detach throughout the experiment.
  • Run the experiment for a longer period of time to allow our behaviour to adapt to the idea of being analyzed.
  • Expand the team behind the work in order to get more precise data.

Questions Answered: 

How does the device blend in or stand out in the environment?

The device is displayed on top of the shirts worn by the couple. “Displaying the mechanical language of love”

How does this device respond to the data it is collecting?

The device does not respond to the data it is collecting.

How does this behaviour evolve as it collects more data about or for this person/place?

It gives more tangible and visual feedback about a very intangible feeling such as love.

What is the overall size and form of the object?

Shirts (size M).

Does the object encourage interaction from others?

Yes. You and your partner would want to know how your breathing and heart rate change when you interact with one another.

References:

http://www.businessinsider.com/the-science-behind-zestx-the-bachelor-love-lab-2016-2

https://www.zestxlabs.com

http://www.dailymail.co.uk/health/article-2277586/Two-hearts-really-DO-beat-youre-love-Scientists-couples-vital-signs-mimic-other.html

https://www.ncbi.nlm.nih.gov/pubmed/21910541

Script:

Phone conversation

Black background. Audio of a phone conversation between Nana & Orlando.

Beep… Beep… Beep…

Orlando: Alo.

Nana: Hey

Orlando: Hola

Nana: How are you?

Orlando: I’m good, how are you?

Nana: Good. Babe have you come up with anything for creation & computation?

Orlando: Nothing that I like so far.

Nana: Me too.

Orlando: I liked what you wanted to do before.

Nana: I donno.. Somehow I don’t like it anymore. Like I want it to be more conceptual you know.

Orlando: I know baby

Nana: I miss you

Orlando: I miss you too. Hey what if we did two assignments that somehow complemented one another.

Nana: what do you mean?

Orlando: Like we made things that worked separately and then worked together too.

Nana: OH MY GOD. I love it!!!! That would be so cool… We’d be like the coolest couple ever..

Orlando: Such geeks

Nana: hehehe

Orlando: So what if..

Nana: OH I KNOW I KNOW… hold on

Orlando: ok

Nana: I watched this thing before on TV … OMG IT WOULD BE SO COOL IF WE COULD DO IT… hold on hold on… Ima google it now… hold on

Orlando: Baby chill… Im holding

Nana: ok… wait.. let me call you back in like 5 min

Orlando: ok

Narration

“Love”. What does intimate love look like? One might see my boyfriend and I, and based on our behaviour come to an understanding that we are in love. But, what does the language of love really consist of? So, in the words of Mark Watney, we decided to science the shit out of it. According to a study at the University of California on the physical effects of being in a relationship, couples’ breathing patterns and heart rates synchronise when they sit close to each other. Drawing inspiration from this study, Orlando & I decided to put our relationship to the test. We wore a heart rate & stretch sensor for the duration of 8 hours with the goal to alter the language of love through data. The more our breathing patterns and heart rates were synchronized the higher the percentage.

 

OBLIVION


Oblivion experience

Name of group members: Rana Zandi (NANA), Natasha Dinyar Mody

Project title: Oblivion

Code: https://gist.github.com/ex0self/3088f3242eeac78f2aff551206abf8bb

Project description:

Oblivion is a non-linear, multi-screen narrative experience based on the functionality of memories within the human brain, using a hyperlink structure.

The narration explores the protagonist Lily’s memories of the people around her at school.

Although each reader goes through the narration on their own, the experience needs to be completed as a group. Since the narration uses a hyperlink structure, it provides readers with an illusion of choice and allows each individual to read the same story through a different path, enabling them to compare and contrast their knowledge of the events within the story with one another.

Supporting the narration is a timer that enhances the group experience: users can compare their results (pages completed) at the end of the given time (15:00 min).

Also embedded within each page of the story are supporting visuals (three clocks) that change based on the movement of the mouse and the path taken by the user, enhancing not only the concept but also the multi-screen experience.

Process context:

Oblivion explores several concepts:

  • Memories & the hyperlink structure – According to Ray Kurzweil in his book “How to Create a Mind”, memories are like hyperlink structure. Memories have the ability to be linked to each other and recall one another based on new memories (experiences). For-example, for 6 months you might have a neighbour with a funny looking moustache. Years later, walking down the street a man may pass you by who has a moustache. Suddenly, a memory of your old neighbour will pop into your head even though the man who passed you by might not at all look similar to your old neighbour. Oblivion explores the same notion. Through the hyperlink structure Lily’s memories (the protagonist) get recalled by one another.
  • Memories – What is a memory? Our brain doesn’t hold on memories like a computer or a folder cabinet that one can open and browse through. In fact, memories don’t exist at all even though the entire functionality of the human brain is based on memories and experiences. Each time an individual experiences something new, a specific neurological pattern (neutrons passing messages to one another, think of it like a lightning pattern) takes place within the brain. It is the recalling of this pattern that we call memory. When an individual recalls a memory, the same neurological pattern assigned to that “memory” needs to be recreated/gets recreated. This means that each time you recall a memory, the whole experience takes place in your brain as if it was the first time. However, each time this pattern gets recreated it is a bit different from the previous ones. Hence, some researches believe that each time you recall a memory it becomes further and further away from what actually took place the first time that neurological pattern was created. In short, the memories you never recall are the safest. Oblivion was written based on this concept. The story forces the reader to remain in a state of oblivion through loops of recalling memories, not being able to tell which of the events or characters are real according to the protagonist (Lily).
  • Time – The clocks within Oblivion – Oblivion starts with the first page holding 7 hyperlinks for the reader to choose from. Each link is linked to a different page that unfolds a certain event in time while holding several links that repeats the same process. The only way the reader can know if they are traveling backward or forward in time is through the clocks. When the mouse is not moving the clocks are frozen in time. As the reader moves the mouse on the screen, the clocks are in a state of oblivion; some are moving clockwise and some counter-clockwise with varied speed. When the reader has to make a decision between which link to choose, as the mouse hovers over different links the clocks will ALL either go clockwise or counterclockwise hinting to the reader of the direction of time.

Process journal: 

https://drive.google.com/drive/folders/0B0v0C2NFCXOiVzVST25RQnp2NDg?usp=sharing

  • Phase One – Writing the story – It was challenging to come up with the narrative. Knowing who our personas were helped us narrow down a theme. Since our readers were going to be our classmates, and the majority of them were girls, we decided to write a love story based on familiar events taking place within the school environment. Since NANA is currently researching memories and cognition, we decided to go with this concept. It was tricky, however, to embed the hyperlink structure while writing the story: each page had to link back to certain other pages, and this had to be kept in mind throughout the writing.
  • Phase Two – Coding the story – Coding the story wasn’t hard at all. Using basic HTML & CSS we had a layout and a base to work from.
  • Phase Three – The timer – Setting up the timer was very challenging. We had to figure out how to create a fixed frame that could hold both the countdown clock and the content. Both of us were very new to coding, and it was hard to understand pages and pages of JavaScript from various Google searches. We found a timer in one of our searches, but we had to study and learn its code. We asked for help from other classmates and even some second-year students, and finally understood it well enough to tweak it and add our own time and aesthetics. Through this phase we learned to be more organized with our code in order to understand what we were doing. Sometimes our code wouldn’t work and we couldn’t figure out why; a second-year student’s suggestion to duplicate our files, stay organized, and sometimes work backwards helped us finish this phase successfully.
  • Phase Four – The visuals – The p5.js experience was a mission of its own. Phases Three and Four took the longest. Creating the clocks wasn’t as challenging as embedding them within our story and coding them to go backwards and forwards in time according to the movement of the mouse. After figuring out the code, we had to go through the story and embed it for each link (and there were many links!).

References & resources:

http://p5js.org/reference/

http://www.genekogan.com/code/p5js-transformations/

http://www.animatedimages.org/img-animated-clock-image-0006-82099.htm

https://processing.org/examples/clock.html

https://github.com/processing/p5.js/issues/1375

https://www.sitepoint.com/build-javascript-countdown-timer-no-dependencies/

Panic-Pad

https://github.com/sharkwheels/materialMadlibs

Group: Material Madlib 3 – Nadine Lessio & Rana Zandi

Project Title: Panic-Pad

Project Description: Using fabric, a button, and a buzzer to simulate the physiological and psychological feelings of a panic attack for a user who has never experienced one.

Project Context: As a person (Rana) living with panic disorder, not only do I suffer from the illness itself, but I find it very challenging to cope with the stigma connected to having a mental illness. Many people (including loved ones) hold preconceived notions or lack understanding of the illness. It is common for a person who suffers from panic disorder to get comments such as “You have nothing to be nervous about” or “It is just in your mind.”

However, panic disorder is not just stressful, racing thoughts inside the mind; it has painful and very much physical symptoms. Panic-Pad is an attempt to remove this stigma by letting the user experience a panic episode through the sound and rhythm of the heart of a person who suffers from panic disorder.

Panic-Pad was tested on three different subjects. Subject A was a 29-year-old female suffering from panic disorder; she went through a panic episode right after trying the experience. Subject B was a 22-year-old female who said she felt anxious after trying the experience. Subject C was a 47-year-old male who was reminded of his childhood nightmares after trying the experience.

Related articles:

http://www.huffingtonpost.com/sarah-fader/stigma-mental-illness_b_4680835.html

http://anxietypanichealth.com/category/stigma/

Evolution of Panic-Pad Video Link: https://vimeo.com/185875466

Code Link: https://github.com/sharkwheels/materialMadlibs

Process Journal:

  • Phase 1: Getting familiar with the Arduino tone library and exploring some of the native sounds that could be made. Ideation about the concept took place while experimenting with the number of inputs and outputs. We ended up with a single input that switched between two sounds (or songs), while we were thinking about rape culture and sexual harassment: the binary of OK and not OK.
  • Phase 2: We explored multiple inputs and one output, switching between different states, while considering our fabric choices. More conceptual ideation took place as we explored the texture of the material we had and started using real sounds rather than tone sounds.
  • Phase 3: While exploring different sounds to represent “what is not OK”, we started considering various heartbeats that could communicate feelings of anxiety. As a result of this exploration, our concept was reframed around the stigma surrounding mental health and panic disorder; Panic-Pad was born at this stage. While aiming to include a resting state, we experimented with speed and volume to see how we could affect the system. At this stage, the longer you hold the Panic-Pad, the louder and faster the heartbeat gets (a minimal sketch of this behaviour follows this list).
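The sketch below illustrates the Phase 3 behaviour described above, using the Arduino tone library: the longer the button is held, the shorter the pause between simulated heartbeats. Pin choices and timing values are assumptions, and volume control is left out because tone() itself has no volume parameter; the final prototype moved sound playback to Processing (see Limitations below).

// Minimal "hold longer -> faster heartbeat" sketch (assumed wiring: button
// to ground on pin 2 with internal pull-up, piezo/speaker on pin 8).
const int BUTTON_PIN  = 2;
const int SPEAKER_PIN = 8;

unsigned long pressedSince = 0;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  pinMode(SPEAKER_PIN, OUTPUT);
}

void heartbeat(int gap) {
  tone(SPEAKER_PIN, 70, 120);   // "lub"
  delay(150);
  tone(SPEAKER_PIN, 50, 100);   // "dub"
  delay(gap);                   // pause before the next beat
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {          // button held down
    if (pressedSince == 0) pressedSince = millis();
    unsigned long held = millis() - pressedSince;
    // A resting gap of ~700 ms shrinks toward ~150 ms the longer you hold on.
    int gap = max(150, 700 - (int)(held / 20));
    heartbeat(gap);
  } else {
    pressedSince = 0;           // released: return to the resting state
    noTone(SPEAKER_PIN);
    delay(50);
  }
}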

Limitations:

  • Clarity – The Arduino’s sound ability is somewhat limited; it tends to be good for square waves and 8-bit noises, and not as good once you get into purposeful distortions.
  • Processing power – We tried a couple of libraries, including TMRpcm for the second prototype, to let the Arduino play sound clips from an SD card. This yielded some interesting tones to play with, but had issues with low growl or guttural sounds. We also looked at Mozzi, which turns the Arduino into a full modular synthesizer. It was really powerful, but it pushed the Arduino to capacity and disabled native Arduino functions like millis(), which we needed for timing.
  • A general lack of playback options.
  • In the end, our explorations led us to use Processing to manage our sound and speaker output (one way to wire up the Arduino side of that setup is sketched below).
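As noted in the last limitation, one way to wire up the Arduino side of that Processing-based setup is sketched below: the board only reports button presses and releases over serial, and a Processing sketch on the computer plays the heartbeat samples in response. The one-word-per-event protocol and pin choice are assumptions, not the project’s exact code (which is in the GitHub repository linked above).

// Arduino side of a serial hand-off to Processing (assumed wiring: button
// to ground on pin 2 with internal pull-up).
const int BUTTON_PIN = 2;
int lastState = HIGH;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  int state = digitalRead(BUTTON_PIN);
  if (state != lastState) {
    // Send one word per event; the Processing sketch reads it and starts
    // or stops the heartbeat sample accordingly.
    Serial.println(state == LOW ? "PRESS" : "RELEASE");
    lastState = state;
  }
  delay(20);   // crude debounce
}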

 
