Plants vs Kylie


by Kylie Caraway


Project Description:

Plants vs Kylie is a program to remind Kylie when her plants are thirsty! Combining Arduino, p5.js, Adafruit IO and IFTTT, Kylie’s plants remind her when they need water through an animated avatar on the web, as well as a notification on her phone. Using soil moisture sensors, Kylie can evaluate whether her plants are happy (less than 40% dry) or thirsty (40% dry or more). Through the anthropomorphization of plants, perhaps Kylie will feel greater kinship with the health and happiness of her fellow organisms.

Video of project:  https://vimeo.com/242331497
Web Link: https://webspace.ocad.ca/~3164325/kylieplanttext/
IFTTT Notification:


P5.js Code:
 https://github.com/kyliedcaraway/DIGF-6037-Creation-Computation-plants-vs-kylie/tree/master/kylieplanttext

Arduino Code:
https://github.com/kyliedcaraway/DIGF-6037-Creation-Computation-plants-vs-kylie/tree/master/Experiment_3_kylieplant

Circuit Diagrams:

[Schematic and wiring diagram: moisture sensor]


Design Files:

Plant reference photos

Preliminary Sketches

Color Palette

Final Graphics

GIF Animations: https://vimeo.com/242482539

Presentation Setup

Plants at my Desk


Process Journal:

Day 1: Brainstorming.

What could I use in the vicinity of my desk? What can I make that could improve my day to day tasks? I quickly decided I wanted to use Arduino for my input to send data to P5.js. First ideas:

– Audio increases if you fall asleep at the computer (proximity sensor and audio: interesting, but I decided I would probably never use this)

– Use a proximity sensor to detect me at my computer (Why? Also, it could detect anyone, not just me)

– Measure the temperature outside and display it (I realized I could just look at my phone for this information, so I abandoned the idea pretty quickly)

– If the temperature inside is too hot, turn fans on (more applicable in Texas than Toronto; also, I doubt the fans in our kits would actually cool me down)

– Lights change color based on the weather (I still like this one, but I wanted to do something a bit more creative)

– Measure the soil and create an automatic irrigation system (where is the computer in this idea? I would not need p5 to complete this)

– Measure a plant’s moisture and create a visual and audible response (very creative, but too experimental and less practical)

– Measure the soil and illustrate the plant’s happiness based on its moisture level (I really liked this idea: it was different from the examples in class, it could be used in the vicinity of my desk with my houseplants, it allowed for creative freedom, and it was a practical tool I would honestly use to help me take care of my plants)

 

Day 2

I made a decision on my project theme: in the end, I decided to measure soil moisture and illustrate the plant’s happiness as a visual. I then planned to create a button alongside the visual that, when pressed, would signal my irrigation system (a pump with water) to begin. This would be helpful when I am away from home.

I went to Sheridan Nurseries to purchase soil and a smaller planter that would make it easier to bring my desk plant to class for the presentation. I ended up buying three plants along with the soil and planters: I couldn’t help myself. I bought an ivy plant, a succulent, and something similar to a philodendron.

I went to Creatron to purchase soil moisture sensors and extra wire. I was very happy to find that the sensors were inexpensive, so I purchased three. After class on Friday, I decided to simplify and drop irrigation from this project, per Nick’s advice. I decided to focus on creating only an interface (an input that creates a visual), rather than an input that creates a visual that in turn creates a physical output. That chain was too complicated: too many things could go wrong, with too little time. I refocused on the soil moisture sensors to “Neopet” my plants.

 

Day 3:

Today I tested my soil moisture sensors. I began by using the Arduino demo code for soil sensors on Creatron’s website. According to Creatron, “this sensor uses the two probes to pass current through the soil, where it will measure the resistance to calculate the moisture level. More water makes the soil conduct electricity more easily (less resistance), while dry soil conducts electricity poorly (more resistance)” (Creatron). The sensors register moisture on a scale from 0 to 1023, 0 being perfect conductivity and 1023 being no conductivity. I created a short circuit to see how close I could get to 0: the closest I got was 31. When I hold the sensor in the air, it normally reads 1023, although sometimes it wavers to 1020 or so.


I began by testing the same soils with different amounts of moisture in them.



The sensor number changed based on plant, amount of soil, location of sensor in the soil, and whether I was moving the sensor back and forth between different plants. The numbers were all over the place. I could not find a consistent trend as I moved the sensor within the same plant, between plants, between moisture levels, etc.


I decided to use one sensor per plant, because my sensor was malfunctioning when I moved it around. I also didn’t like the feeling that I was stabbing my plants every time I re-inserted the sensor.

I later realized my dirt was not dry enough. I attempted to use flour, but it was not effective and would not register with the sensor. In another attempt, I placed a pot of soil on a heating fan to dry it out.


I also dunked my sensor in water, which did not work very well. I rarely got a reading, and I later realized that different types of water (tap versus bottled) have different effects on conductivity. While my first idea was to have levels of very dry, dry, normal, wet, and too wet, I quickly realized that the numbers were too varied to make strict, narrow distinctions. I changed my idea and decided to distinguish only dry and wet. Using Arduino, I mapped the range 0-1023 to a percentage, 0 to 100. I was then able to read the sensor value as well as the percentage (the percentage refers to dryness, so 10% dry does not need watering, whereas 60% dry needs water).
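For reference, that conversion is a single map() call, and map() has the same signature in Arduino and p5.js. A minimal p5.js sketch of the idea, with sensorValue standing in for a raw reading (this is an illustration, not the project’s actual code):

let sensorValue = 650; // stand-in for a raw serial reading (0 to 1023)

function setup() {
  noCanvas();
  let percentDry = map(sensorValue, 0, 1023, 0, 100); // 0 = soaked, 100 = bone dry
  console.log(percentDry.toFixed(1) + '% dry');       // prints "63.5% dry"
}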

 

Sticky note process: simplifying data from 0 to 1023 into categories of moisture

 

Day 4:

Today I connected the Arduino code to p5. I used a combination of code from our practice in class: the input example as a reference to get p5 to see my Arduino data, and the display portion of the output example to show the information, which I then tinkered with. I decided to use if statements, and I told p5 to categorize the reading into three areas: less than 40% dry = happy; 40% dry or more = thirsty; 100% = not connected to the sensor (this is air, and I did not want it to say “thirsty” when the sensor was not connected, not in the plant, or not placed correctly). I decided on 40% because, after reading recommendations online, the moisture reading should be around 600 out of 1023 for a plant to be satiated. While this depends on the specific plant, I decided to use this number as the basis.
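A minimal sketch of that three-way if statement, assuming percentDry holds the mapped value arriving over serial (the function and state names are placeholders, not the original code):

function categorize(percentDry) {
  if (percentDry >= 100) {
    return 'error';   // reading is air: sensor unplugged or placed incorrectly
  } else if (percentDry >= 40) {
    return 'thirsty'; // 40% dry or more: needs water
  } else {
    return 'happy';   // less than 40% dry
  }
}

draw() can then pick which of the three graphics to display based on the returned state.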

Next, I tried to use graphics rather than text, for aesthetic reasons. I ran into the issue of needing a server, because Google Chrome won’t load assets referenced straight from your local file system. I attempted to use a WAMP server, so that it was local and could reference the Arduino but also serve my graphics. This worked well. Next, I tried the OCAD webspace, because I was unsure whether it would receive data from my Arduino. To my surprise, it also worked. I ended up using both servers to test my program.

 

Days 5 and 6:

I used these days to draw up graphics for my thirsty plant, happy plant, and sensor error images.

 

Day 7:

I have successfully centered the image and text in the webspace. I am now attempting to connect my sensor information to Adafruit. No success so far: the feed is blank, and I am unsure how to monitor or fix this. I am going to talk to Kate or Nick about it tomorrow.

No Data Adafruit

Day 8:

Today has been an extremely stressful day. Nothing is working. I have to clear the cache and restart my computer to get the data to update from Arduino to p5. Arduino was working, but the local server was not updating the data from Arduino. (I believe this has to do with the serial port: perhaps it is timing out or going into some sort of sleep mode? I am not sure.)

Luckily, I have finished one of two animations using Photoshop and After Effects. Unfortunately, I cannot get p5.play to work. I only have 11 frames in a sequence in a folder, and they are small files; I am not sure why I can never get them to work. I tried p5.gif.js instead, and it works! Now I need to change my animations from image sequences into GIFs, which crunches the quality of my animations, but they work nonetheless. I also bought a box from Michael’s to put my breadboard in. I drilled a hole in the side so the USB cable could fit through to connect my Feather to my computer. I was going to glue moss around the box, but the smell of the moss was very chemical and I didn’t like how it looked aesthetically, so I went for the clean wooden box instead. It is easy to transport, it stylistically matches my desk, and the box could be used for various other things.
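Back to the GIFs: the basic p5.gif.js pattern that ended up working looks roughly like this. A sketch assuming the library is included in the HTML and the GIF sits beside the sketch file (the filename is a placeholder):

let happyGif;

function preload() {
  happyGif = loadGif('happy-plant.gif'); // loadGif() is added by p5.gif.js
}

function setup() {
  createCanvas(640, 480);
}

function draw() {
  background(255);
  image(happyGif, 0, 0); // draws the current frame; the library advances the animation
}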


Day 9:

I have finally animated my happy plant, thirsty plant, and sensor error GIFs. The program is working, but sometimes it gets stuck and stalls out. It seems the serial port is not updating the information to p5: while the information is changing in Arduino, the percentage value on my p5 webspace is not changing. To fix it, I have to open Arduino, then the p5 serial control app, then Atom, then clear the cache, then open the file. There must be another workaround, but this is the only way I can fix it at the moment.

 

Day 10:

My final attempt to connect p5 with Adafruit. I decide to start from the basics by following Nick’s code from scratch in a separate example. I try sending data and manage to get it to connect to Adafruit. Then I connect Adafruit to IFTTT. It works! Except that it uses a mouse-click example, and I want it to update without me interacting with the computer. There is also a major lag between my mouse click and the text I receive; it is somewhere in the communication between Adafruit and IFTTT.


Next step: I need to integrate this example into my own code. I manage to get it to work with a mouse click.

Next step: I need to change from a mouse press to a timer. This step was trickier, and it took me a while to figure out. I had to copy and paste various pieces of code from the sending-data-to-Adafruit example and the receiving-data-from-Adafruit example that Nick provided.

First iteration: I need it to check constantly, so I place the code in the draw function rather than in mousePressed.

Second iteration: I do not want to check data, I want to send data. So I swap the IFTTT check for the IFTTT send.

Third iteration: the polling time is way too quick at 2.5 seconds. I change it to 10 seconds (for the purpose of our presentations on Friday; I will likely change this to twice a day or longer when I use this at home).
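The timed send boils down to comparing millis() against the time of the last send inside draw(). A sketch of that pattern, posting to Adafruit IO’s REST API; the username, key, and feed name are placeholders, and Nick’s class example may structure the call differently:

const AIO_USER = 'username';     // placeholder
const AIO_KEY = 'your-aio-key';  // placeholder
const FEED = 'plant-moisture';   // placeholder feed key
const SEND_INTERVAL = 10000;     // 10 seconds, for the in-class demo
let lastSend = 0;
let percentDry = 0;              // updated elsewhere from the serial port

function setup() {
  noCanvas();
}

function draw() {
  if (millis() - lastSend > SEND_INTERVAL) {
    lastSend = millis();
    fetch('https://io.adafruit.com/api/v2/' + AIO_USER + '/feeds/' + FEED + '/data', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json', 'X-AIO-Key': AIO_KEY },
      body: JSON.stringify({ value: percentDry })
    });
  }
}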

Fourth iteration / problem: although the code sends every 10 seconds, Adafruit updates anywhere from every 10 seconds to almost a minute. There is a lag in the information somewhere. (I did not fix this, and I am not sure that I can.)

Fifth problem: just as there is a lag between my webpage and Adafruit, there is an even longer lag between Adafruit and IFTTT. It seems that IFTTT checks for data about once a minute. This results in me not receiving texts until the next minute, and then receiving three or four texts in a row. I also want to set a parameter so I don’t get a text when the data is 100, which means the sensor is not connected. It looks like IFTTT only has greater than, less than, equal to, and not equal to parameters. I cannot ask for a text between 40 and 99; I can only ask for a text when the value is greater than 40.


 

Day 11:

Final test: connect Arduino to computer, open Arduino, open serial port, clear cache, open webspace… wait for plant percentage on healthy plant… working! Change to dying plant… Working! Wait for text…

Not working. Lovely.

I guess I will reload the page. Now it’s working! It is very finicky about when it runs, but when it does, it works well! I just have to make sure the serial port doesn’t randomly close.

I wait for another text…. Minutes later, I get bombarded with 10 texts.

Then 5 minutes later, 10 more texts.

Soon, I am flooded with text messages that have piled up somewhere in the data transfer.

I get an email from IFTTT: I am almost at my limit of texts for the month… Fantastic! I need to stop testing my project so that it will (hopefully) work during the project critiques tomorrow.

Presentation Day:

I check the morning before presentations: IFTTT emails me letting me know they will not send me any more texts. I quickly download the IFTTT App to send me notifications. It works well, but still has some issues with data lag.

During my presentation, my page became unresponsive because the serial port closed. When I reloaded it, it worked exactly as it was supposed to. I need to figure out how to fix this, but other than that small issue I am very happy with my peripheral prototype! I plan on using this project at home and developing it further. I know a few friends and family members who could use this with their plants.

 


Context and References:
Context:

Plants are an important piece of our ecosystem. While many people treat plants as inanimate objects, it is important to note that plants are complicated, living organisms that can sense, communicate, and possess memory.

Yet, as we move into increased urbanization, nature is not as accessible to us as it once was. We are slowly moving away from the natural and towards the mechanically produced. This project is an attempt to connect the two, to create a relationship between plants and technology. Plants have numerous healing effects: they are known for improving concentration and productivity, as well as reducing stress. In fact, some plants are known to actually purify the air, according to studies by NASA. Caring for a living thing such as a plant gives us a sense of purpose, especially when you see it happily thriving. Unfortunately, many people are “plant killers” (the number of online articles titled something along the lines of “How to Not Kill Your Plants” or “Houseplants that are Impossible to Kill” must be in the hundreds). This project hopes to remediate this common issue through data visualization and communication that anthropomorphizes our houseplants.

The notion of plants combined with technology is not new: some artists have transformed plants into musical instruments, while others have opened up communication for plants to tweet or call you when they need some TLC. Plants can be used for scientific developments (such as plant lamps that use the bacteria in dirt and vegetation to generate electricity), as installation pieces (such as Botanicus Interacticus), and as artwork (Floral Automaton grows flowers digitally). Some artists have gone fully digital, such as Sasha Katz, who uses computer graphics to depict an ever-growing relationship between the natural world and technology.

While the combination of nature and technology might at first seem arbitrary, placing these pieces together can prove to be not only aesthetically pleasing, but also an important development as we become swallowed by technology and shift into “metropolitan dwellers” (Bryant). It is important that we don’t fall away from our connection to the natural world. In our often commodified, urbanized lifestyles, perhaps the relationship of plants and technology can reconnect us to our roots and environment. Perhaps pieces such as those listed above will invite humans to take a step closer in experiencing the consciousness of another organism.

[Facebook ad screenshot]

(As a side note: this image popped up on my Facebook during this project. They are obviously data mining my browser history)

References:

Reference for soil moisture sensors: I used Creatron’s demo code to understand how to properly use soil moisture sensors. My sensors came directly from Creatron (they were not the popular SparkFun version I found online), so my safest bet was to follow the code provided by Creatron. The code immediately worked in Arduino, which allowed my coding process to run smoothly.

Reference for sending data between Arduino and P5.js: In order to send my data from Arduino to P5.js, I used the serial-input to p5.js code on the website we used in class on the first day. I edited the code that displayed a graph to instead play one of three animations that would appear based on the data.

Reference for Adafruit and IFTTT: In order for me to send my plant moisture data to Adafruit IO, I followed Nick’s class examples and combined both versions of the IFTTT connect code. I used version 1 to understand how to send data to Adafruit. I used version 2 to implement a polling rate within my draw function, so that Adafruit would automatically receive data at specific time intervals, rather than relying on the user to click the screen or interact with the page to send data.

 


Next Steps:

I am very happy with how this project has turned out, and I plan on using it in my home. While it could be used at this very moment to monitor my plants, there are a few ways I would like to improve this project:

  • Monitor sunlight intake
  • Send information wirelessly, rather than having to plug my plants into my computer every time
  • Monitor different plants at once based on their individual water needs; additionally, create a webspace where you can view all plants at once with their individual data
  • Create an automatic irrigation system that keeps them alive and well without needing me to interact with them (I am honestly on the fence about this idea: while I of course want my plants to be consistently healthy, and this would help me in times when I am away from home, I also feel that removing myself from the picture removes the relationship between plant and human, something that I really value with my plants)

Bibliography:

 

Bryant, Taylor. “Why Are Millennials Obsessed with House Plants?” Nylon, 21 Mar. 2017, nylon.com/articles/millennial-house-plants-obsession. Accessed 9 Nov. 2017.

Cook, Gareth. “Do Plants Think?” Scientific American, 5 June 2012, www.scientificamerican.com/article/do-plants-think-daniel-chamovitz/. Accessed 9 Nov. 2017.

Faludi, Robert, et al., editors. Botanicalls. 2006, www.botanicalls.com/. Accessed 9 Nov. 2017.

Igoe, Tom. “Lab: Serial Input to P5.js.” ITP Physical Computing, 11 Oct. 2017, itp.nyu.edu/physcomp/labs/labs-serial-communication/lab-serial-input-to-the-p5-js-ide/. Accessed 9 Nov. 2017.

Jobson, Christopher. “Animator Sasha Katz Explores a Symbiotic Relationship between Plants and Technology.” Colossal, 25 May 2017, www.thisiscolossal.com/2017/05/animator-sasha-katz-explores-a-symbiotic-relationship-between-plants-and-technology/. Accessed 9 Nov. 2017.

“Moisture Soil Sensor.” Creatron Inc., www.creatroninc.com/product/moisture-soil-sensor/?search_query=soil+moisture+sensor&results=4. Accessed 9 Nov. 2017.

Puckett, Nick. “Data and APIs.” GitHub, 3 Nov. 2017, github.com/DigitalFuturesOCADU/creationANDcomputation/tree/master/P5examples/DataANDapis. Accessed 9 Nov. 2017.

Rodriguez, Natalia. “Hacking House Plants to Make Music When They’re Touched.” Fast Company, 18 Feb. 2014, www.fastcompany.com/3026612/hacking-house-plants-to-make-music-when-theyre-touched. Accessed 9 Nov. 2017.

Vezina, Kenrick. “‘Plant Lamps’ Turn Dirt and Vegetation into a Power Source.” MIT Technology Review, 23 Nov. 2015, www.technologyreview.com/s/543781/plant-lamps-turn-dirt-and-vegetation-into-a-power-source/. Accessed 9 Nov. 2017.

Visnjic, Filip. “Floral Automaton – Digital Growth with Physical Adaptation.” Creative Applications, 21 July 2017, www.creativeapplications.net/arduino-2/floral-automaton-digital-growth-with-physical-adaptation/. Accessed 9 Nov. 2017.

Wolverton, B.C., et al. Interior Landscape Plants for Indoor Air Pollution Abatement. NASA, 15 Sept. 1989, ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19930073077.pdf. Accessed 9 Nov. 2017.

 

 

Experiment 3: FoodNeg

The FoodNeg lets dog owners know when their dog is hungry and in need of food. FoodNeg uses a light sensor placed above the dog’s bowl. When the dog approaches the bowl, the sensor detects the change in light level and sends a notification to the owner’s browser to let them know their dog is hungry.
A happy dog is a well-fed dog!

Designed with love for my snow-bunny Lola

Link: https://webspace.ocad.ca/~3164180/foodneg/


Video

Animations


 

User flow


Diagrams


Code

https://github.com/LolaSNI/FoodNeg

Process and challenges
Along the way, I switched my original idea from a microphone to a light sensor. I came to realize that the Arduino microphone, which was my first choice, was not working as planned. I wanted to use the microphone to pick up my dog’s barks and howls and distinguish two sound levels, high and low, each representing a different need (food and play).

  • Day 1 – Try to define the Arduino microphone as the input and map the values.
    – Started the process with a sound detector, but realized it works mostly as an on/off device rather than giving a range of sound values, so I had to switch to a microphone to get the results I wanted (or at least I thought it would).
    – Searched online for a solution, since the microphone requires a different voltage (5V) than what my Arduino offers (3V). I used USB as a power source to overcome this problem.
    – Tested the input so I would be able to map the values. I figured out that the values fall between 0 and 110, so I created two groups of values: 0-49 and 50 and up.
  • Day 2 and 3 – Build the p5.js code and connect it to the Arduino.
    – First I did some research about sound to find out what options existed. I played with some code examples, with tiny modifications (mostly graphics).
    https://youtu.be/uqUKbtvUafg – Mic input from https://p5js.org/examples/sound-mic-input.html
    – Defined the structure of my code. I find the visualization process very helpful, especially if you are not very experienced with writing code.
  • Day 4 – Overcome the ‘need’ to use my computer’s microphone as an outside source for voice input.
    – Since I accidentally copied all the libraries from another project, I forgot to exclude the sound and speech libraries from the HTML.
    – Security issues in Chrome prevented me from using it as a browser, because it detects an outside device. I looked for a solution online, but the answers were too complicated for me to understand, so I couldn’t figure out what was wrong and asked for help. It was suggested that I use another browser, since I probably wouldn’t be able to fix it. I tried Firefox, and it worked fine there.
  • Day 5 – Lots of frustration and one MAGIC! (Continue with the p5.js code.)
    – I plugged the board in again and opened it in Chrome (out of habit), and everything worked (magic!).
    – Tried to map two levels of output, one for each sound level – complete failure.
    Howls = Low = Food animation
    Barks = High = Play animation
    – Turning point – After spending two days working on the microphone definitions and mapping, I realized that the values in my console were not changing. While looking for a solution online, I learned that the microphone, like the sound detector, can’t pick up different levels of sound (note: always read the spec BEFORE you start using any sensor!!!). So I switched my original plan and decided to use a light sensor instead. I created a basic light sensor Arduino sketch, mapped the values in the console, and changed the p5.js code based on the new values (see the sketch after this list).
    – Could not make GIF images work – I wanted the notifications to be dynamic so they would grab the dog owner’s attention. After trying a few examples I found online, I decided to replace the GIF images with simple p5.js animations.
    – Tried different animation code from the p5.play library.
    https://youtu.be/BNP5R1EWUlU – following the code from http://p5play.molleindustria.org/
    – Created the assets, wrote the animations, implemented them in the original code, and updated the HTML with the play library.
  • Day 6 – Design
    – Created the product design: a dog bowl station with the light sensor attached to it.
    – Improved the animations.
    – Tested the light sensor ranges with my dog to make sure it fits and works.
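A minimal p5.js sketch of that light-sensor threshold, with inData standing in for the mapped reading from the serial port; the cutoff value and messages are placeholders that would come from testing in the actual room:

let inData = 1023;       // latest light reading from the Arduino
const DOG_AT_BOWL = 300; // placeholder cutoff found by testing

function setup() {
  createCanvas(640, 480);
}

function draw() {
  background(255);
  fill(0);
  textAlign(CENTER, CENTER);
  if (inData < DOG_AT_BOWL) { // the dog blocks the light above the bowl
    textSize(32);
    text('Lola is hungry!', width / 2, height / 2);
  } else {
    textSize(16);
    text('All quiet at the bowl', width / 2, height / 2);
  }
}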

 

Testing and evaluation
– The design should fit the size of the dog, otherwise it won’t be able to detect it.
– Changes in natural light in different rooms affect the values in the if statements. I had to test again in the room we presented in to reset the values.
– Consider: the mess dogs make might harm the sensor.

Future development
Add more sensors to other parts of the house so you can track other needs, such as “go for a walk” if the dog stands next to the door, or “let’s play” if the dog is standing next to its toy box.

References
https://www.youtube.com/watch?v=setjiVH0_IY

https://p5js.org/examples/sound-mic-threshold.html

http://www.instructables.com/id/Arduino-Sound-Sensor-with-LED/

https://www.youtube.com/watch?v=lsW3upcvsxg

https://tkkrlab.nl/wiki/Arduino_KY-038_Microphone_sound_sensor_module

https://stackoverflow.com/questions/28799792/failed-error-in-connection-establishment-neterr-connection-refused

http://p5play.molleindustria.org/

https://www.freepik.com/premium-vector/hand-drawn-dog-illustrations_813814.htm

JOG – Quinn Rockliff

About

This piece deals with my personal experience moving through online spaces and being triggered by content. In my time online I have seen an increase in the use of content and trigger warnings, as well as a rising debate surrounding the experiences of those being reminded of trauma online. Often these experiences are delegitimized, challenged, and seen as overreactions. From my own experience, online content is just one of the inputs that can cause me to relive and be reminded of my trauma. This piece is both a statement and an experiment as I extend my internal experiences beyond my thoughts and translate them into a recorded physical reaction. Every time I am triggered by content online, I press the button, prompting a Facebook status to be sent. The record of my emotional reactions exposes my insecurities surrounding not being believed, fueled by years of being challenged. Social media is the platform I share all of my art on, and through it I have been able to reach a large audience and foster community. But I often wonder how this public platform complicates my healing, how it reinforces concepts of validation, and how the platforms themselves are tools often used to perpetuate rape culture online. This tool to record this data is not a page I will share with others – although those who I did show it to told me they found themselves checking it often, wondering if any new statuses arose.

Title Meaning 

The piece is titled JOG to reference the term “jog my memory”.

Making it Happen

I used the button setup from the Experiment 1 digital input example. The button parts were soldered on and connected using longer wires.

[Wiring diagram: Feather M0 WiFi digital input button]

 

Link to my code used for Arduino and P5js – https://github.com/quinnrockliff/Experiment-3-

 

Process Journal
This is where I began my train of thought. I broke down my process into steps and followed them pretty much as I expected, although it seemed much simpler at the time. I became very aware of the amount of content online I was bothered by – this made the project a little hard to work through, but ultimately, this was what I was trying to negotiate.

For step one, my Arduino code, I used the input example from Experiment 1. This worked well, and I was able to monitor whether the button was working by watching the Serial.write output. It was doing a weird thing where it wrote 48 and 49 instead of 0 and 1 (48 and 49 are the ASCII codes for the characters ‘0’ and ‘1’, so the state was arriving as text rather than as a raw byte), but this was fixed with a change to the Serial.write call.
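The same normalization could also have lived on the p5 side. A hypothetical sketch assuming the p5.serialport library (the port name is a placeholder):

let serial;

function setup() {
  noCanvas();
  serial = new p5.SerialPort();
  serial.open('/dev/tty.usbmodem1411'); // placeholder port name
  serial.on('data', serialEvent);
}

function serialEvent() {
  let b = serial.read();                           // one byte from the port
  let state = (b === 48 || b === 49) ? b - 48 : b; // map ASCII '0'/'1' back to 0/1
  console.log(state);
}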

I knew I had to connect my serial port (Arduino data) to a webpage. I knew how to do this from our in-class example that used graphing and a potentiometer, but I wanted to use a button. I started off building on the graphing code and got a very silly graphing function where the graph just went up one line and back down when you pressed and released the button.

This was something! Now I needed to figure out how to send this data to IFTTT using Adafruit IO. I used the code from the IFTTT example but had some trouble creating an if statement that made sense. It turned out this was really an issue of using = instead of == in the comparison.

Then came the next issue: Adafruit IO and IFTTT were reading the data, but every time I pressed the button, even for a second, it read a change in data upwards of 200 times. This meant 200 tweets were sent to my Twitter account, and I was locked out very quickly. I had to delete and recreate applets multiple times after this happened. I went to office hours and created a new variable called prevdata, initialized to 0, so a press only registers when the previous reading was 0; otherwise it does not.
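A minimal sketch of that rising-edge guard, with inData as the latest button state read from the serial port and the send function standing in for the actual Adafruit IO call:

let inData = 0;   // current button reading: 0 or 1
let prevdata = 0; // reading from the previous frame, as in the journal

function setup() {
  noCanvas();
}

function draw() {
  if (inData === 1 && prevdata === 0) {
    sendStatus(); // fires once per press instead of ~200 times
  }
  prevdata = inData;
}

function sendStatus() {
  console.log('press registered'); // placeholder for the Adafruit IO / IFTTT send
}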

As you can see in my initial sketch, I expressed some interest in housing the button in something other than the classic box, but I didn’t have time to do so. I put my breadboard in a box and wired my button through a hole in it. I wanted to add more emphasis to the narrative and doubts of this piece, so I did stream-of-consciousness writing on the outer layer of the box.

img_8064

The issue of presentation and demonstration also became an interesting aspect of my piece that I did not previously consider. I did not want anyone else to touch the button, as this would stop the data from being a reflection of my experience. This led me to write a little excerpt and place an acrylic box over the button. This raised issues of consent and temptation that I did not anticipate, but that I believe are actually fundamental to the extension of my emotions into physical action and the vulnerability associated with doing so.


Luckily and unsurprisingly, all my classmates were respectful and followed the instructions. Yet, I wonder if placed in a public place without my gaze if people would do the same. That is an equally interesting experiment tackling consent and gaze that I may consider in future iterations or new pieces.


Documentation of my project functioning, via Facebook statuses published by IFTTT.

Context 

I am influenced and inspired by Emma Sulkowicz, the artist best known for “Carry That Weight,” created after their university refused to expel their rapist. Their later work involves technology, including a life-size sculpture called Emmatron that answers questions so that they don’t have to. This is something I negotiate in my work often: how I answer questions, how I interpret those questions, and how they challenge my experience. How to be open without opening wounds. More about that piece here: http://www.latimes.com/entertainment/arts/culture/la-et-cm-emma-sulkowicz-los-angeles-coagula-emmatron-20160228-story.html

Furthermore, this was very much driven by my own experience. When I signed up to make a Twitter account for this assignment, this is what I saw:

Checking to see if my Facebook status had posted, I saw this:

The desire to both address these things and ignore them is what encouraged this design. Issues of validation and self-doubt drove the text written on the box. The prompt to make something that is an extension of what I see online made this decision obvious: what I see online reinforces what I experience daily, and connecting the two through a somewhat dramatic physical movement (pressing a big blue button) exposes the desire to make this experience more evident to both myself and others.

Thank you 

#HaveABreak (Experiment 3)

A project that paces work and break sessions #HaveABreak #HaveAKitKat
by Jad Rabbaa

LINKS: Project Video | Arduino Code | Interface Code | Live Site



PROJECT TITLE
“#HaveABreak” is inspired by the KitKat slogan: “Have a break, have a KitKat.”
The hashtag is there because this project posts it to Facebook.

 


INSPIRATION:
1- The Psychology Today website explains in this article why it is difficult to focus and describes different ways to avoid distractions.

2- The six underlying principles of the Pomodoro Technique, a time management method that uses a timer to break work down into intervals, traditionally 25 minutes in length, separated by short breaks of 3-5 minutes.

This website, http://www.marinaratimer.com/iZxavQJQ, for example, uses the Pomodoro Technique to time work sessions and notifies the user with a sound at the end of each session.

3- Productivity apps such as “Forest: Stay Focused,” which helps you:
• Stay focused and get more things done
• Share your forest and compete with friends
• Track your history in a simple and pleasant way
• Earn rewards and unlock more tree species
• Customize your whitelist: leaving Forest and using apps in the whitelist won’t kill your tree.

 


SUMMARY: Project Context
As a DF student, I have lately been swamped with work, studies, and assignments. Like almost all my classmates, I am continuously tired and need a way to stay focused and avoid distraction.
Once focused, I personally lose track of time, and because of the stress I forget or procrastinate taking any breaks, which affects productivity and health.
On a side note, due to the amount of work and time put into those assignments, free time has gone down to just a few minutes when I take small breaks. Friends have noticed I am being antisocial, since it is always a bad time to interact and connect with them: it is hard to sync my short free time with theirs.
Setting a digital timer manually gets old after a couple of attempts and is not effective; there was a need for a solution.

 


CONCEPT: Project Description
This project creates a new extension (peripheral) for the computer during working time. (I incorporated it into a box that is supposedly a laptop cooler, possibly distributed by KitKat as part of a promotion or advertising campaign aimed at high school and university students.)

It is a ‘physical timer’ that doesn’t need to be set digitally or manually, but rather monitors the time spent working. It starts counting the duration of the work session automatically from the first second the user sits in front of the laptop.
It notifies the user visually when it is time to take a break (or take a KitKat, if you may 🙂 ), following the Pomodoro Technique, which is known to increase productivity by pacing breaks and distributing work sessions in a certain pattern of time intervals.
Furthermore, to make the semester less antisocial, the notifier automatically posts to the user’s Facebook that a break is being taken, through a picture along with the hashtag.

 


What For? – OBJECTIVES:
1- Motivate the user to work, focus, and avoid distraction through a reward at the end of each work session (in the case of this project, a KitKat chocolate bar).

2- Notify the user that one pomodoro (25 min) of work is complete and it is time for a short break, to increase productivity and keep the mind healthy.

3- Inform the user’s social circle through Facebook that a break from work is being taken, so it is a good time to connect live and interact on the spot.

 


How? – PROJECT PROCESS:
In the case of this project, KitKat could use this “laptop cooler” as part of a promotional or awareness campaign (#HaveABreak #HaveAKitKat) and give its exterior the look of a KitKat chocolate bar, just as I did in this project.


1- KitKat distributes KitKat laptop coolers to high school and university students. The cooler would have a link on it for the students to follow when using the cooler with their laptop.

2- Students then go on doing their assignments while the webpage they visited shows their progress, in the form of a countdown animation, toward the end of the first work session (25 min), after which they deserve a break.

3- Once the pomodoro, or 25-minute work session, is completed, the student is notified with a pop-up window that it is time for a break, and a new page opens with the webcam and an OK button to snap a picture with the hashtags #HaveABreak #HaveAKitKat and post it to Facebook.

4- Facebook friends and classmates see the post online and can interact for a short while before going back to the next pomodoro.

5- KitKat could then run a competition to reward the most focused participant, the one who completed the most pomodoros, with gifts and prizes.

 


To Whom? – USER:
High school and university students, and/or any computer workaholic.

 


CODE:
You can find the final code for the project here, including the Arduino code as well as the p5.js interface.

 


JOURNAL:

DAY 1:
I started brainstorming the big idea of my project, wondering what could be helpful to me when I use my computer and what problem it should solve.

The first idea that came to mind was that I spend too much time in front of my screen without taking spaced breaks that would help me be more productive and mobile. I thought about many ideas, but I filtered them all down to my main aim: keep me focused when working on my computer and help me take breaks in a smart way while staying social and healthy.

I mainly wanted to use the proximity sensor to trigger a visual on the screen to keep me motivated and push me to keep going until the completion of my assignments without feeling overworked.

I started researching online and found that many studies prove the efficiency of taking small, paced breaks between work sessions and their benefits for productivity and motivation. I even found websites such as http://www.marinaratimer.com/iZxavQJQ, which I am using right now to time my work sessions and take a break every 25 minutes, following the Pomodoro Technique.
The Pomodoro Technique takes a low-tech approach, using a mechanical timer, paper, and pencil, but for me personally, if I had to physically set the timer I would procrastinate. I felt I needed something that detects the physicality of me working, starts counting, and notifies me digitally (or even physically, if possible) when it is time for a break.
So I started sketching the interaction:

[Sketch of the interaction]


Day 2:
First things first: setting up the Arduino circuit with an ultrasonic proximity sensor, following the examples provided in the course outline.


I altered the provided code and added some integer variables for the distance value and the threshold.

I had an issue where the Arduino kept saying an integer was already defined and showed errors when reading the sensor values, which took me some time to solve.

I realized I cannot upload the code to the Feather when two code files are open; I don’t know why. I got the error below.

[Screenshot of the upload error]

Then, through the serial port, I started reading the values of the sensor in my browser. So far so good. (Side note: sometimes the values drop to 0, as seen below, due to the finicky sensor.)

[Screenshot: sensor values in the browser]

So p5.serialport reads the distance from the Arduino now. Great, but that’s continuous; I need the trigger to happen at a certain value. Ideally, I need the console to read the time a user spends at less than “maxDistance” (35 cm) from the sensor.

After a lot of logical thinking back and forth, I figured out the steps I should take to test whether what I was thinking was correct:

So, back to Arduino: below, I am printing the distance from the sensor, and also the number of seconds that this distance has been below a certain value (35 cm). As you can see in the Arduino monitor, after “Seconds above threshold” I can monitor how long the user has spent at less than 35 cm from the sensor (i.e., at the computer).

[Screenshot: Arduino serial monitor]

Things in this phase don’t end here. I have to make sure my browser reads this value, so I left Arduino for now and went back to sketch.js:
I added a console.log to read those values in the browser’s console, and it was sweet to see it respond to the time I spent blocking my sensor with my hand, then go back to 0 when I released it.
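The project does this counting on the Arduino, but the same dwell-timing logic is easy to picture on the p5 side. A sketch under that assumption, with distance standing in for the latest serial reading:

let distance = 100;      // latest sensor reading in cm, fed by serial
const MAX_DISTANCE = 35; // the user counts as "at the computer" inside this range
let sittingSince = null; // millis() when the user entered the range
let threshold = 0;       // seconds spent in range, the value logged to the console

function setup() {
  noCanvas();
}

function draw() {
  if (distance > 0 && distance < MAX_DISTANCE) {
    if (sittingSince === null) sittingSince = millis();
    threshold = (millis() - sittingSince) / 1000;
  } else {
    sittingSince = null; // the user left, so the count resets to 0
    threshold = 0;
  }
  console.log(threshold);
}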

To visualize this on my canvas, I went back to Kate’s graph example and changed the “inData” to “threshold”, so the mapping runs from 0 to a “thresholdLevel” of 5, even though I would eventually change it to a completed pomodoro’s duration, which is 25 minutes x 60 seconds = 1500 seconds.


Ideally the work time should be 25 minutes, but for demo purposes in class I brought the time down to 5 seconds (thresholdLevel), so if a user’s work time (threshold) exceeds 5 seconds, the action is triggered. To get that value, I had to parse the logged value into a variable that triggers a function.

The JavaScript syntax would be like:

if (threshold >= thresholdLevel) {
  console.log("triggered"); // to make sure it works
  // ...my function lines go here
}

And… it worked!!! (I forgot to take a snapshot, probably because of how excited I was; it felt like I had done half of the coding work at this point.)
Yeaaaayyy !!


Day 3:

I dedicated this day to visualization.
Currently, when graphdata is running, I see a graph moving up as the threshold increases, before I see “triggered” in my console.
I now have to design the loader page (instead of what graphdata does in draw) and the interface page that is triggered by the completed threshold.
I thought the loader would be easy, just time-consuming to make look good, so I started with designing the interface page that shows when I see “triggered” in my console.
I want to:
– change the background color slightly
– show the image of the interface background
– show the camera capture
– show the OK button that snaps a pic from the webcam capture.

The interface is the design below:
[Interface design]

and I made sure the webcam capture was sized and positioned to fit perfectly in that black square.

button.show() makes the button appear, and I made sure it was in the right position as well.


Day 4:

Before joining the pieces, I was eager to make sure the OK button could post a message or hashtag to my Facebook, so I made an applet for that on IFTTT, as below:
[Screenshot: IFTTT applet]

and it worked perfectly.

Now back to the interface. The webcam works fine; it shows the capture immediately after graphdata hits value = 5. The webcam opens before graphdata reaches that value, but that is because the capture has to be created in the setup function.
The next step is to take a snapshot when the OK button is pressed and try to send this picture to IFTTT.
Following the capture example code at https://p5js.org/examples/dom-video-capture.html, I succeeded in hiding the capture and the button when it is pressed, with the snapshot image appearing instead on the canvas in the same position as the camera.
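The snapshot logic from that example boils down to a few p5 DOM calls. A condensed sketch, with the real interface graphics stripped away (positions are placeholders):

let capture, okButton, snapshot;

function setup() {
  createCanvas(640, 480);
  capture = createCapture(VIDEO);
  capture.size(320, 240);
  capture.hide(); // draw the feed on the canvas instead of as a DOM element
  okButton = createButton('OK');
  okButton.mousePressed(takeSnapshot);
}

function takeSnapshot() {
  image(capture, 160, 120);           // put the current frame on the canvas
  snapshot = get(160, 120, 320, 240); // copy that region as a p5.Image
  okButton.hide();
}

function draw() {
  background(230);
  if (snapshot) {
    image(snapshot, 160, 120); // frozen frame stays in the camera's position
  } else {
    image(capture, 160, 120);  // live feed until OK is pressed
  }
}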

Now on IFTTT i opened another applet as seen below that takes a picture and posts it to facebook:

[Screenshot: IFTTT applet for posting a photo]

But I realized that IFTTT needs an image at a URL, and the camera capture saves only to the hard disk, or just as a frozen frame on the canvas.
I tried many approaches and researched different options online, all in vain.

I gave up after so many trials and decided to discuss that issue with Kate the next day.

In the meantime, I thought that on top of the picture I could add a message and hashtags as part of the same post, so posting is not useless after all.



Day 5:

Today I met with Kate to discuss the progress of the project.
We talked about the visualization, the concept, the path to be taken, and what priorities should be given attention in the remaining time before the critique day.
As for the image not being found on IFTTT, we looked into the code to see if there was a way to solve the missing picture, but it was a long process that required another route, and more time and research.


Day 6:

At this point the first visual on the canvas is the graph moving upwards with the increasing threshold, and that needs to be changed into a 25-minute timer, or a loader that shows the user their progress through the pomodoro.
So I started working on the loader to replace the graph. I found this animation online:

[Loader animation, 5 frames]

It has 5 frames, so with the help of the p5.gif.js library I was able to tie the number of the frame that appears to the threshold values 1 to 5 that graphdata was using to show the level of my graph.

I made sure it worked, then changed the design from that animation to another one showing the 25-minute timer, and made sure it was positioned in the center of my canvas.

[25-minute timer animation]

Rather than deleting graphdata entirely, I changed its visualization from vertical to horizontal, colored it yellow to match my animation, and made it look like a progress bar running from left to right.
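That graph-to-progress-bar change is essentially one map() and one rect(). A small stand-alone sketch of the idea, with millis() standing in for the real serial-fed threshold:

const thresholdLevel = 5; // demo value from the journal, in seconds

function setup() {
  createCanvas(400, 60);
}

function draw() {
  background(40);
  let threshold = millis() / 1000; // stand-in: seconds since the sketch started
  let w = map(min(threshold, thresholdLevel), 0, thresholdLevel, 0, width);
  noStroke();
  fill(255, 204, 0);  // yellow, to match the animation
  rect(0, 20, w, 20); // fills from left to right as the session progresses
}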


Day 7 :

This day was dedicated to integrating my Arduino circuit into my laptop environment so it fits and operates without looking intrusive or foreign to my usual computer use.
When I spoke to Nick on the first day of this project, we talked about how the sensor could be hidden within a laptop cooler.
To follow the KitKat theme, I made sure this cooler would look like a KitKat bar. I used a box, measured it, and covered it with the same KitKat label design.

I cut a small square in the side for the USB port, and two circles in the front so the sensor could poke out of the box. For design purposes, I made sure the two circles would pass undetected by picking a location camouflaged by the outer label design.



 


LEARNINGS:
– This project helped enhance my coding skills and helped me memorize a lot of JavaScript syntax.

– I learned how to parse values displayed in the console into variables, which was very useful for triggering a particular function at a particular time.

– IFTTT is easy to work with, but it is a standalone platform that is rigid at times and cannot be altered.

 


EVALUATION – What’s Next:

If I had more time, I would make sure the webcam capture opens only when the webcam interface opens (it is tiring to see the green light next to the webcam stay on during working time).
I would also definitely fix the image-not-found issue on IFTTT. During the critique, Nick mentioned a little hack: save the image to Dropbox, which is reachable online through a URL, so IFTTT can take the picture from there. That sounded like a good solution.

Furthermore, I would like to time the breaks as well, to 3 or 5 minutes, before the interface goes back to the working timer.
Following the Pomodoro Technique, the fourth break should be longer, so I would want to give each pomodoro a value of 1 to 4 so that the break between the fourth and fifth work sessions is longer.

To follow healthy practice, the breaks could display information and tips about what the student can do in this time, such as what snacks to eat and suggestions for stretching exercises.


RESEARCH  –  THE POMODORO TECHNIQUE:
The Pomodoro Technique is a time management method developed by Francesco Cirillo in the late 1980s. The technique uses a timer to break down work into intervals, traditionally 25 minutes in length, separated by short breaks. These intervals are named pomodoros, the plural in English of the Italian word pomodoro (tomato), after the tomato-shaped kitchen timer that Cirillo used as a university student.

The technique has been widely popularized by dozens of apps and websites providing timers and instructions. Closely related to concepts such as timeboxing and iterative and incremental development used in software design, the method has been adopted in pair programming contexts.

Underlying principles
There are six steps in the original technique:

  1. Decide on the task to be done.
  2. Set the pomodoro timer (traditionally to 25 minutes).
  3. Work on the task.
  4. End work when the timer rings and put a checkmark on a piece of paper.
  5. If you have fewer than four checkmarks, take a short break (3–5 minutes), then go to step 2.
  6. After four pomodoros, take a longer break (15–30 minutes), reset your checkmark count to zero, then go to step 1.

The stages of planning, tracking, recording, processing and visualizing are fundamental to the technique. In the planning phase tasks are prioritized by recording them in a “To Do Today” list. This enables users to estimate the effort tasks require. As pomodoros are completed, they are recorded, adding to a sense of accomplishment and providing raw data for subsequent self-observation and improvement.

References:
https://lifehacker.com/productivity-101-a-primer-to-the-pomodoro-technique-1598992730

https://en.wikipedia.org/wiki/Pomodoro_Technique

https://play.google.com/store/apps/details?id=com.dacer.simplepomodoro

http://www.marinaratimer.com/iZxavQJQ

 


EXAMPLE CODES:
Arduino Code
https://github.com/DigitalFuturesOCADU/creationANDcomputation/blob/master/Arduino%20Examples/Basics/Input/proximity_wTimer/proximity_wTimer.ino

Code for Video capture and snapping photos
https://p5js.org/examples/dom-video-capture.html
https://github.com/processing/p5.js/issues/1496

Code to control Gif animation:
https://github.com/antiboredom/p5.gif.js/tree/master
https://www.youtube.com/watch?v=mj8_w11MvH8

Dwell Belt


Chris Luginbuhl

Description:

This project was motivated by the creative anarchy of the 6th floor digital futures lounge. In this chaotic and collaborative space, students work, talk, mix, rest, study and hide out.

The videos posted by DF students on this blog give a pretty good sense of the variety of creative activities taking place there. The air is filled with experimental music, flux fumes, hilarious conversation and the sights and sounds of creative collaboration. You often have the opportunity to test someone’s brand new prototype or learn how to do something new with or without a computer.

In addition, students working on individual assignments like this one, and those in second year sometimes require uninterrupted time to focus, to code and to think. But how can I know if the student sitting across from me is available to help with a technical question, or would rather be left alone to work?

We’ve seen a student C-clamp a shelving unit to a table to hide behind, or wear earphones to signal their intentions. We’ve rolled whiteboards over to a table to create some visual, if not auditory, seclusion. Despite this, it’s usually not clear when someone in the space is available to talk, or when they’re working against a tight deadline and would rather be approached later.

The Dwell Belt is intended to provide a friendly non-verbal cue about the wearer’s availability.

Highly focused time, such as when writing or coding can be interrupted by smartphones and notifications (Ward, 2017). Likewise, when brainstorming with a colleague, it’s detrimental to have a phone anywhere near, even if you don’t look at it (Przybylski 2013). How can wearable electronics provide us with the ability to connect professionally and socially with others without causing all of the distractions associated with smartphone use and desktop notifications?

The Dwell Belt provides the option of posting pre-programmed messages directly to a Facebook group, summoning coding help, or suggesting face-to-face social time.

Posted to the DF 2019 Facebook group wirelessly from the Dwell Belt

Hardware and software details
(Code, sketches, design files, photographs)

Final piece

The finished product consists of a belt and an electronic buckle that allows the user to select an LED animation, plus a button for wirelessly posting to Facebook.

Finished belt buckle
In-class demonstration. Audience like a deer frozen in headlights.
Code

https://github.com/ChrisLuginbuhl/belt/tree/master/utility_belt

System Architecture:
Hardware and software architecture

As produced, the Belt consists of a battery pack, on off switch, arduino with wifi, three LED rings, a six position mode switch, a button, software and a housing.

The switch permits various display animations, and the button posts to Facebook, via IFTTT. Since I wanted to post to a Facebook group rather than a page, I needed to use Gmail to post, since IFTTT supports posting to pages but not groups. (Zapier supports posting to groups, but the feature is in beta, and didn’t work for me).

Wiring diagram. The rotary switch is 2P6T; the neopixel array is round and has only 36 elements. The protoboard is used only for power and ground.
In use:

The Belt has several modes to signal the wearer’s state.
Off: power switch is off to save battery.
On/standby: code is running for a quick start, but no display.
Converse/collaborate: in the first switch position, the lights animate outward in a gentle, magnetizing display.
Party Time: this colourful flashing display of many colours is inspired by a disco ball. The wearer is looking to chat, have fun, go for coffee or drinks, and debrief about how the Creation and Computation critique went.
Need Help: the flashing red pattern indicates distress and sends the message that all is not well. Perhaps there is a persistent error in the code? People who are not otherwise occupied are invited to come over and ask, “How can I help?”
Summon Help: the animation changes to concentric red rings highlighting the “Post” button. When pressed, the Belt connects to WiFi to post to Facebook. The animation switches first to amber, and green rings appear as the WiFi connection completes. When the message has been sent successfully, a soothing pulsating green light displays. The wearer may keep debugging, or may adopt the recovery or fetal position until help arrives.

Video https://youtu.be/0Qi7TIDl5rQ

Mechanical design:

I originally intended to design a smaller housing with an ornamental design covering the Facebook button (“Svelte Dwell Belt”, see design v.1 below), but I ended up with a more boxy approach.

Miniaturization was a challenge. I used hook up wire and a small PCB bus for power. I made the buckle hardware from bicycle spokes and spoke nipples, since these were the smallest threaded parts I could find without machining anything or going to specialty stores.

Mechanical housing v.1 and ornamental design (hair becomes circuit board trace)
Belt buckle with removable 3oz flask – I thought the flask would be swapped for mechanical housing v.2
Buckle design v.2, based on a cowboy belt buckle which holds a 3oz flask
Mechanical design v.3. “Boxy but good”

 

buckle with hardware made from bicycle spokes
buckle with hardware made from bicycle spokes
Feather mounted to standoffs

Software Design

I started with example code from Adafruit – WifiClient from their WiFi101 demo, and striptest from the NeoPixel demo. First I got the Feather working as a server, then as a client; then reading switches and buttons; then making calls to adafruit.io when a button is pressed. I then added the animations, and finally “interwove” these so that the animations could be interrupted by a switch or button. I wrote rough pseudocode along the way to help me structure the code and prioritize the tasks.

I alternated between writing code and assembling the product. This seems like an unusual approach, but I didn’t want to get too far with any one part before discovering an insurmountable problem in another area that would force a redesign.

Pseudo-pseudo code v.1
Pseudo-pseudo code v.2

Process

As expected, the original concept evolved during the creation of this project due to technical constraints, part availability and schedule limitations.

In addition, while working on the project’s details – making parts, soldering wires, writing code – I noticed I spent a lot of time dwelling on the problem and the creative space; this is part of the reason for the Dwell Belt’s name. This was different from the more ordered and deliberate design phases of concept generation, selection, design, and so on. It made me think there is some value in the idea of “Critical Making” I was studying in my Research Methods class (e.g. Ratto, 2011). While building, I spent contemplative time, rather than just analytical time, with the question of how we can interact and give each other signals.

Issues with this version

The belt did work as intended, but there are a few questionable bits of code. First, it was necessary to comment out the serial port communication when working wirelessly. Second, in order to run the animations while reading the switches, I couldn’t separate out the functions for animation and switch-checking. A better solution would use “listeners” or other asynchronous computing techniques.
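The belt’s firmware is Arduino C++, but the asynchronous structure mentioned above is easy to sketch in p5.js terms, where the draw loop advances an animation one frame at a time and the input callbacks act as the “listeners”. This is a minimal illustration of the pattern, not the belt’s actual code; the two animation functions are invented stand-ins.

```javascript
// Browser-side analogue of a non-blocking structure: draw() advances
// the active animation one step per frame instead of looping, so input
// events can interrupt between any two frames.
let mode = 0;   // current animation mode (stands in for the rotary switch)
let step = 0;   // animation frame counter

function setup() {
  createCanvas(200, 200);
}

function draw() {
  background(0);
  if (mode === 0) drawGentlePulse(step);
  else drawPartyFlash(step);
  step++;
}

// acts as a "listener": fires between frames, never blocked by animation
function keyPressed() {
  mode = (mode + 1) % 2;
  step = 0;
}

function drawGentlePulse(f) {
  fill(255, 128 + 127 * sin(f * 0.05)); // slow fade in and out
  circle(width / 2, height / 2, 80);
}

function drawPartyFlash(f) {
  fill(random(255), random(255), random(255)); // disco-ball flashing
  circle(width / 2, height / 2, 80);
}
```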

In addition, the wifi call (to Adafruit IO, and from there to Facebook via IFTTT and Gmail) is pretty basic and not very secure: it includes my Adafruit credentials in the HTTP GET request, where they could be overheard or cached along the way, and it does not use HTTPS.
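For comparison, here is roughly what a less leaky call looks like against the Adafruit IO v2 REST API: the key travels in an X-AIO-Key header over HTTPS rather than inside the GET URL, where proxies may log or cache it. This is a browser-side JavaScript sketch (the belt itself makes its call from the Feather in C++), and the username, key, and feed name are placeholders.

```javascript
const AIO_USER = 'YOUR_USERNAME';  // placeholder account name
const AIO_KEY  = 'YOUR_AIO_KEY';   // placeholder key
const FEED     = 'summon-help';    // hypothetical feed watched by the IFTTT applet

// Post a value to the feed over HTTPS, with the key in a header
// instead of embedded in the request URL.
fetch(`https://io.adafruit.com/api/v2/${AIO_USER}/feeds/${FEED}/data`, {
  method: 'POST',
  headers: {
    'X-AIO-Key': AIO_KEY,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({ value: 1 })
}).then(res => console.log('posted:', res.status));
```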

I had hoped to have time to do more with the belt and not just the buckle. I would have liked to have a smaller buckle and more light signals from the belt itself. Some of the components (e.g. battery) could have been distributed around the belt rather than being solely contained in the buckle.

The critique feedback was pretty clear – the belt buckle is not a place people want to look for social cues. It would certainly be possible to move the LEDs to other areas. I would like to try sewing 6 or so LEDs around the belt for a more understated effect, and removing the animations from the belt buckle.

References

Przybylski, Andrew K. and Netta Weinstein. Can you connect with me now? How the presence of mobile communication technology influences face-to-face conversation quality. Journal of Social and Personal Relationships, Vol. 30, Issue 3, 2013.

Ward, Adrian F., Kristen Duke, Ayelet Gneezy and Maarten W. Bos. Brain Drain: The Mere Presence of One’s Own Smartphone Reduces Available Cognitive Capacity. Journal of the Association for Consumer Research, Vol. 2, Issue 2, 2017, pp. 140–154.

Ratto, Matt. Critical Making: Conceptual and Material Studies in Technology and Social Life. The Information Society: An International Journal, Vol. 27, Issue 4, 2011, pp. 252–260.

Experiment 3 – Emotions of Internet – Yiyi Shao

sadhappy-copy

Check out the website here!


Project Description

The internet has created a cyber world where people can live a second life, and with it a space for people to express their emotions and feelings, especially through social media. If we see the internet as a huge container, it is gathering every user’s emotions every second. If you have ever wondered how everyone in the world is feeling right now, this project will give you the answer. It reads particular emotional hashtags from Twitter. When most people are tweeting about being sad, it draws random blue ellipses on the website canvas and lights up a cloudy LED, representing a teardrop falling from the “internet container”. When most people are tweeting about being happy, it draws pink rectangles instead and turns the LED off. In this project, I considered the interactions and relationships not only between humans and computers but also between the physical world and the cyber world. Every user participates in the interaction and becomes part of the project. It also completes a transformation of emotions: from physical human feeling, to the digital form of a hashtag, and back again as a whole, in the physical form of a teardrop and a visualisation on the website. This world is bittersweet; there must be points in your life when you are in a really bad mood, but so is everyone else! I hope the next second you will be happy again.
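A minimal p5.js sketch of the drawing behaviour described above; the mood flag stands in for the value that would come from the Twitter/IFTTT feed, and the sizes and colours are guesses.

```javascript
let mood = 'sad'; // stand-in for the value read from the Twitter/IFTTT feed

function setup() {
  createCanvas(600, 400);
  background(255);
  noStroke();
}

function draw() {
  if (mood === 'sad') {
    fill(80, 120, 255, 120);  // translucent blue ellipses
    ellipse(random(width), random(height), random(10, 60));
  } else {
    fill(255, 150, 200, 120); // translucent pink rectangles
    rect(random(width), random(height), random(10, 60), random(10, 60));
  }
}
```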


About Me 

Yiyi Shao is a digital artist from China, interested in interaction design, motion graphics and VR. She is doing her MDes at OCAD University and is very passionate about the combination of cool technology and art.


Project Journey

Day 1

We were given a new assignment today, the Peripheral Experiment. I always want to explore new kinds of technology in my projects, so my aim for this experiment was to go as far as I could and try something I had never tried before.

After some research, I found a really cool levitation system that can work with Arduino. So I came up with the idea of using wind-speed data from a weather API to control a floating plant.

Inspirations and examples online:

http://www.dangerouslymad.com/projects/chapter-13-levitation-machine

http://www.dangerouslymad.com/download/chapter-13-levitation

http://www.instructables.com/id/Magnetic-Levitation/

https://medium.com/@luigifcruz/making-a-levitron-with-an-arduino-e32b1340376b

http://www.instructables.com/id/A-Levitating-Sphere-Rotates-Glows-and-Blinks-With-/#step9

Here is my sketch:

img_7151

How it will work:

image

image source:

https://www.tes.com/teaching-resource/electromagnetic-induction-6365198

The Feather board will connect to one hall effect sensor, one solenoid carrying electric current, and one small fan (an external force to make the plant spin). A website will read the wind speed from a weather API and transfer the data to the Feather board.

My attempt process:

img_7035

img_7039img_7041

Video:

Results:

Although I successfully made an electromagnet, it never levitated the metal or any of the magnets I had. I didn’t have a device to measure the magnetic force and keep them in balance, so I had to switch to another, more practical idea.

P.S.: Don’t touch the electromagnet with your hand – it will make you feel sick…

Day 2

I switched my idea to using a water pump and DC driver. Water is a very interesting material to work with: it’s flexible and compatible with lots of sensors. I came up with the following two ideas:

Idea 1: Keep working with the weather API, but I’m not sure which type of data to use – maybe the probability of rain.

Idea 2: Completely change my input and use the Twitter service in IFTTT. Every time people tweet something about being sad, a drop of water falls.

I really liked the second idea and thought it was brilliant, so I decided to keep going that way, and I literally ran to Creatron to buy the things I would need (a 12V liquid pump, a 5V driver, and a 12V power supply with wire).

My new sketch:

img_7154

How it will work:

When people tweet with “#sad”, IFTTT sends a trigger to Adafruit IO, which passes it to p5.js and finally, through SerialControl, turns on the DC driver, which makes the pump drip water.
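On the browser side, that chain can be sketched roughly like this in p5.js, assuming the p5.serialport library (with its serial server app running) and a feed named sad; the username, key, and port name are placeholders, not the project’s actual values.

```javascript
const AIO_USER = 'YOUR_USERNAME';   // placeholder Adafruit IO account
const AIO_KEY  = 'YOUR_AIO_KEY';    // placeholder key
const FEED     = 'sad';             // hypothetical feed fed by the IFTTT applet

let serial;

function setup() {
  noCanvas();
  serial = new p5.SerialPort();
  serial.open('/dev/cu.usbmodem1411'); // placeholder port to the Feather
  setInterval(checkFeed, 2000);        // poll the feed every 2 seconds
}

function checkFeed() {
  fetch(`https://io.adafruit.com/api/v2/${AIO_USER}/feeds/${FEED}/data/last`,
        { headers: { 'X-AIO-Key': AIO_KEY } })
    .then(res => res.json())
    .then(point => {
      // IFTTT writes "1" for #sad; forward it to the Feather,
      // which switches the DC driver and pump on
      if (point.value === '1') serial.write('1');
      else serial.write('0');
    });
}
```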

Fritzing diagram:

water-pump

Process:

img_7077

My first attempt created an absolute disaster and spilled water everywhere on the table, which almost destroyed my laptop and Feather board.

img_7158

But finally the water pump worked!

Video:

Day 3

Today I am working on IFTTT and p5.js. I have set up my IFTTT recipe to get data from Twitter and send a message to Adafruit IO. I changed the message from “On” to “1”, which is easier to work with. Then I used the class example to test the LED. After my meeting with Kate, I realised there would be a problem with this project: every second there are at least 15 tweets about being sad, so my output would always be on. I would need a threshold, or a counter with a critical value: for example, every 5 seconds, if there are more than 100 tweets with “#sad”, trigger the output.

screen-shot-2017-11-08-at-21-56-01

img_7090

Well, I spent the rest of the day trying to figure out the logic flow, with no luck. So I moved on to building the container that will hold everything inside: a small container of water, the DC driver, the water pump and the Feather board.

img_7083

img_7084

Day 4

I talked to Feng today about the timer and counter functions. She suggested I use setInterval, which triggers a function repeatedly at a set interval. The difficult part is that every time the counter is checked, it must go back to zero and count again. So the logic is: the counter needs to count and save the incoming data in an array, and then setInterval empties that array at a fixed interval.
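In JavaScript, that counter-and-timer pattern can be sketched like this; the handler name and the 100-tweet threshold come from the idea above, and the trigger itself is left as a placeholder.

```javascript
let incoming = [];            // "#sad" messages received since the last check

// hypothetical handler, called once per message arriving from the feed
function onSadMessage(msg) {
  incoming.push(msg);
}

// every 5 seconds: fire the output if the threshold is crossed,
// then empty the array and start counting again
setInterval(() => {
  if (incoming.length > 100) {
    console.log('threshold crossed – trigger the pump/LED here');
  }
  incoming = [];              // reset, so each window is counted fresh
}, 5000);
```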

This part confused me a lot, and I never got it right. I even asked a friend from computer science for help; we spent 5 hours working on the counter and timer functions and finally made them sort of work. However, it still didn’t produce the effect I expected.

img_7155img_7156

screen-shot-2017-11-12-at-23-56-56

At this stage, I didn’t have much time left to finish the project. Then another idea popped into my mind and saved my life: all I want is a “0” to turn off my output, so why not just add it from my input? This world can’t be sad all the time; there must be happiness somewhere, right?

screen-shot-2017-11-13-at-00-21-38

screen-shot-2017-11-13-at-00-21-49 screen-shot-2017-11-13-at-00-22-06

screen-shot-2017-11-13-at-00-36-07

After I added the happy feed to IFTTT, everything worked normally! I finally tested my LED, and it turned on and off based on the input from IFTTT and Adafruit IO. I was so excited that I quickly replaced the LED with the water pump and DC driver, but after I uploaded the code to the Feather board (the wrong code – it was actually the code for the LED), I heard a weird sound coming from the water pump; it eventually burned out and couldn’t function anymore. I googled around and realised the motor is really sensitive and easily burned out; it is also risky to drive a 5V motor from a 3V Feather board. It was the last night, so I had to fall back on the LED in the end.


Reflections and Evaluations

screen-shot-2017-11-12-at-13-46-10

I tried and learnt lots of new things and went out of my comfort zone in this experiment; it challenged my personal abilities. The whole process aligned with the main concept of the project, which is sad but also happy. The sad part is that the final result didn’t turn out as I expected; the happy part is that I tried my best and learned a lot along the way.

This project definitely has a lot of room to improve. I would like to make the water pump work, and I think I would also keep the LED. The visualisation could tie in even more with the sad-and-happy concept; maybe the content of each tweet could also be placed on the website. It would also be great to connect it with Kylie’s plants project, as Max suggested in class: the tear of the internet could water a plant and give it life. It reminds me of the quote “Without rain, nothing grows.” The concept could develop further, with more consideration of how to prevent the plant from becoming too dry or too drowned.

My code is available on GitHub.

Shepko_Experiment3

Sana Shepko, Creation & Computation: Experiment 3 – Digital Bus Stop

Description: An LED notification system designed to let the user know when the 504 bus is around 15, 10, or 5 minutes away, based on the color of the LED that lights up.

Code on Github

Final demo

Process Journal

Day 1 –

Thinking about what might be useful for me to have. The first thing that came to mind was something that would remind me of what is going on in my own body, and how (whether I like it or not) it affects the way I think, work and feel when I’m on the computer. As a person who has experienced periods for several years, I know the effect they have on my body. In terms of what would be personally helpful, I think some sort of notification system external to the computer would be useful to remind me of my physicality in an otherwise very digital working environment.

If this turns out not to work very well, I may create a similar notification system, but with to-do lists instead.

DAY 2

Struggled with even figuring out how to pull data from an API into my p5 file; started by watching this: https://www.youtube.com/watch?v=ecT42O6I_WI

DAY 3

After exploring this API, https://github.com/jessamynsmith/eggtimer-server, I think I may have to search for more usable ones. Its documentation is not easy to understand, nor very rich in resources! However, open-source APIs in the realm of women’s health are few and far between, and although I was very interested in the connection between body and technology, I think I will have to depart slightly from this initial concept.

Moving forward, I am interested in creating a notification system specifically for Google calendar events.

 

screen-shot-2017-11-07-at-11-09-03-pm

I started with reading up on how to use the Google Calendar API:

 

https://murze.be/2016/05/how-to-setup-and-use-the-google-calendar-api/

Which, to be honest, was not very illuminating. As time went on, I remembered that IFTTT was an available resource and began to explore it more fully.

As pictured above, I’ve figured out how to get IFTTT to talk to Adafruit IO, but am having trouble getting each tweet to be considered a data point, which I believe is necessary in order to get any sort of output from the API, as you can see in this next image.

The way I understand it is that if there’s no numeric data in the feed, it won’t trigger a response in Arduino.

http://www.instructables.com/id/WIFI-Notification-Flag/

^^helpful to figure out how to connect feather to adafruit io

http://www.makeuseof.com/tag/the-ultimate-ifttt-guide-use-the-webs-most-powerful-tool-like-a-pro/

^^article on using IFTTT

Trying out the code:

screen-shot-2017-11-08-at-5-38-27-pm

DAY WHATEVER

photo-on-11-8-17-at-5-02-pm

 

screen-shot-2017-11-08-at-5-08-00-pm

Thinking about ways to physically structure this project. I want to work on some sketches, but it would be a good idea to extend the legs of the LED so that the light can poke out of some sort of structure that would otherwise hide the breadboard. Thinking about possibly using some sort of styrofoam bowl or rounded top.

https://learn.adafruit.com/adafruit-io-basics-analog-output/arduino-setup

^^OK THIS ONE IS ACTUALLY HELPFUL

HOWEVER, it looks like all of the Adafruit examples use WiFi boards, not USB ones, so this is causing some problems in my understanding.

Starting over from the basics to sort of begin to understand better?

Got multiple LEDs to be controlled by the mouse and keys; now trying to figure out how to implement IFTTT data (Google Calendar data, essentially) and have that as the initiator for the LEDs.

 

Questions for Nick during office hours:

Is the fact that Adafruit IO isn’t reading the pieces of data from IFTTT as numeric data going to affect the way the feed shows up in the API? How can I get it to read event instances as a 1 or 0 piece of data? I should probably figure out how to run it constantly at 0 and then bring it to 1 when an event notification happens??

How do I implement the output-to-Arduino functionality and also pull data from the IFTTT applet (like in IFTTTconnectv2_poling)? (The numeric data might have something to do with this?)

And then, basically, talk through my plan… an anthropomorphic device that lives near the computer, whose eyes light up when an event notification comes in…

Materials? Etc.?

Thinking about ways to visualize this; I’m thinking it’ll be some sort of physical object – an anthropomorphized cute robot thing – that would live near your computer, static except for when its eyes light up to notify you…. Similar to this: https://www.geeky-gadgets.com/apripetit-the-talkative-and-cute-robot-03-06-201/

screen-shot-2017-11-08-at-6-39-19-pm

My agenda for tomorrow (thursday omg!!!!!!)

  • Figure out materials
  • Build it
  • Figure out how to connect api data to getting the lights to turn on!!!!
  • Is there a way to have the lights on some sort of timer? So they come on 15 min before event starts, and stay on until event begins or something?
  • IF I HAVE TIME, try to figure out sound activation… but realistically i probably won’t have time (unless i suddenly become a genius within the next 24 hours)

https://creative-coding.decontextualize.com/changes-over-time/

OK, so I am going to explore other APIs, AGAIN. I’m looking for something that changes more constantly (and something I don’t have to keep inputting into the calendar myself).

At this point my physical setup is a wooden box with three “light” bulbs (actually clear Christmas ornaments) resting on top. My plan is for each of these bulbs to represent something different about the current weather in Toronto: blue and red represent temperature, and green represents precipitation. If green is on, there’s a 50%+ chance of precipitation; if red is on, the temperature is above 10 celsius; if blue is on, it’s 10 celsius or below.
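The bulb logic itself is just a pair of thresholds. A small JavaScript helper might look like this; the parameter names and units are assumptions, since the weather API isn’t specified.

```javascript
// Decide which bulbs should be lit for a given forecast.
// tempC: current temperature in celsius; precipProb: chance of precipitation, 0–1
function chooseBulbs(tempC, precipProb) {
  return {
    green: precipProb >= 0.5,  // 50%+ chance of precipitation
    red:   tempC > 10,         // warmer than 10 celsius
    blue:  tempC <= 10         // 10 celsius or colder
  };
}

// e.g. chooseBulbs(12, 0.7) -> { green: true, red: true, blue: false }
```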

I’m trying to understand how to draw data from this weather API but it’s not working for me.

screen-shot-2017-11-09-at-9-10-11-pm

THE POSITIVE IS THIS: i feel like i am getting it a little bit??? Don’t want to jinx myself but i MAY BE UNDERSTANDING THINGS YAY

Day 6

After messing around with various APIs and trying to get them to work, I finally settled on the most accessible solution, given how little time was left before the presentation on Friday. I decided to use the TTC API that Nick and Kate had demonstrated in class and base my code on that example. My biggest struggle was understanding how to get Arduino and p5 to talk to each other in a way that would let Arduino use the data to turn on a light. It took a lot of tweaking.

My first problem when starting to work with the TTC API was that the data was triggering the lights to turn on, which was a good start, but they were ALL turning on at once.

dsc_0019

I was struggling a lot with this portion of the project. Basically, during my meeting with Nick we had talked about serial.write() in p5 and how that is the data sent to and read by Arduino; initially I had set up my code something like this:

screen-shot-2017-11-09-at-10-54-41-pm

But this wasn’t working well in my case, since I wanted three different types of data to be read by Arduino to light up three different lights in various instances. I had to write different if statements in p5, which were basically like “if minutesTilNext is [data] then serial.write([number]).” In the process of writing these statements I had to figure out the proper syntax for a range of values in an if statement, which was exciting to learn!

I was using these resources during this part of the project:

From API to LED: First Connection

https://www.arduino.cc/en/Tutorial/SwitchCase2

Ultimately, writing a raw number in serial.write was not working and was not read accurately by Arduino; there were problems with the light flickering, or all/none of them turning on. What finally ended up working was ASCII encoding: in p5 I set the red LED as ‘R’, which is read by Arduino as 82. This worked amazingly, and honestly I have no idea how I figured it out, but thank goodness!
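A sketch of the p5 side of that scheme, assuming a p5.serialport connection stored in serial and using the 15/10/5-minute bands from the project description; only the ‘R’ → 82 pairing is from the text above, and the other characters are placeholder choices.

```javascript
// Send one ASCII character per state; on the Arduino side the received
// byte is compared against character codes (e.g. 'R' arrives as 82).
function updateLed(minutesTilNext) {
  if (minutesTilNext > 10 && minutesTilNext <= 15) {
    serial.write('R');  // ~15 minutes away
  } else if (minutesTilNext > 5 && minutesTilNext <= 10) {
    serial.write('Y');  // ~10 minutes away (placeholder character)
  } else if (minutesTilNext >= 0 && minutesTilNext <= 5) {
    serial.write('G');  // ~5 minutes away (placeholder character)
  }
}
```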

Also, a little flop on my part while trying to figure out why Arduino wasn’t working perfectly: in one of my frantic edits I had set up the if{} with one pin and the else{} with another in my Arduino if statements, and only thanks to Max (thank you, Max) did I figure this out.

And with that, I finished up the project by styling the website a little bit more, and done!

Project context

Working within the age of information ensures that we have access to an enormous amount of information at all times. Often, this data is also programmed to notify us of certain events: our phones are constantly letting us know if someone has commented, liked, or reached out, and they even tell us about breaking news or the weather if we want them to. One research project I found particularly interesting was “An In-Situ Study of Mobile Phone Notifications.” This study reinforces the idea that phone notifications create distractibility and decrease a person’s ability to focus. I would argue that my project, in its physicality, is a compromise between our need for certain information and the way our screen technology distracts us.

Digital Bus Stop is also not quite a notification system; personally, I would describe it as a physical data visualization, although I suppose a notification is essentially the same thing.

Another project that is closely aligned with mine (and which helped to guide me!) is this project by Caleb Ferguson. Essentially this was also an LED setup, with two lights letting you know whether the price of Ethereum (a cryptocurrency) has gone up or down in the past hour. Ferguson took it a step further by programming the LEDs’ brightness to reflect the magnitude of the change from one hour to the next.

Experiment 3: On 2nd Thought … – Dave Foster

Experiment 3: Creation & Computation – “On 2nd Thought…” – Dave Foster

DIGF-6037-001 (Kate Hartman & Nicholas Puckett)

Project Description (from Canvas site):

Look around the room – All of your laptops are basically the same, but you all use them very differently.  What is it missing?  For this project you will work individually to create a new peripheral for your computer that customizes an input or output function specifically for you.  This could be a new input device that changes the way you interact with your digital world or a means of notifying you about physical or virtual events around you.  To achieve this, you will use P5.js in conjunction with a variety of web APIs and a USB connection to your controller to create your new device.  Beyond the intended functionality of your new peripheral, you should also consider its materiality and spatial relationship to you and your computer.

Project Idea:

  1. Use an ultrasonic proximity sensor, through the Arduino Feather board controller, as an “on-off” switch or trigger to activate the display screen (through IFTTT) and the Mac’s webcam to photograph the viewer’s reaction whenever anyone comes within 46 cm (2 ft.).
    1. Ideas for display:
      1. Text selections from joke sites on the web
      2. Image selections from joke sites on the web
      3. Text selections from “quote” sites on the web
      4. Image selections from photography sites on the web

Project Log:

Graphic of proposed logic flow:

screen-shot-2017-11-06-at-9-03-16-am

Simple English logic flow (THIS IS NOT CODE – to be translated into p5.js later):

Wire and program Feather controller to use proximity sensor – Set up IFTTT link (link to site tbd)

Start –

Set Mac screen to base display (“your 2nd thought for the day” background colour & font t.b.d.)

If – proximity from Feather board reads as <= 46cm

go to joke routine

joke routine

display 1 joke from site (site t.b.d.) through IFTTT – AND set timer @ 10 seconds

If timer = 10 seconds – trigger webcam routine

webcam routine

Capture image and store in file ‘2ndthought’ (location t.b.d.)

reset Mac screen to base display.
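As an illustration only, the flow above might translate into p5.js roughly like this, using the p5.serialport library and a local joke array; the port name, joke text, and styling are hypothetical, not the author’s code.

```javascript
let serial;            // p5.serialport connection to the Feather
let cam;               // webcam capture
let distance = 999;    // latest proximity reading, in cm
let currentJoke = null;
let jokeShownAt = 0;

const jokes = [        // placeholder joke text
  'I would tell you a UDP joke, but you might not get it.',
  'There are only 10 kinds of people: those who know binary and those who don’t.'
];

function setup() {
  createCanvas(640, 480);
  cam = createCapture(VIDEO);
  cam.hide();
  serial = new p5.SerialPort();
  serial.open('/dev/cu.usbmodem1411'); // hypothetical port name
  serial.on('data', () => {
    const line = serial.readLine();    // Feather prints distance in cm
    if (line) distance = Number(line);
  });
}

function draw() {
  textAlign(CENTER, CENTER);
  if (!currentJoke) {
    // base display
    background(30, 60, 120);
    fill(255);
    text('your 2nd thought for the day', width / 2, height / 2);
    if (distance <= 46) {              // someone is within 2 ft
      currentJoke = random(jokes);
      jokeShownAt = millis();
    }
  } else {
    background(255);
    fill(0);
    text(currentJoke, width / 2, height / 2);
    if (millis() - jokeShownAt > 10000) { // 10 seconds elapsed
      image(cam, 0, 0, width, height);    // grab the viewer's reaction...
      saveCanvas('2ndthought', 'png');    // ...and store it
      currentJoke = null;                 // reset to base display
    }
  }
}
```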

Feather board wiring:

capture

Failure Analysis:

So OK … how does one describe an utter and absolute debacle? Well, perhaps not absolute – I did get the Feather controller wired for the proximity sensor and programmed correctly, and it works as required. That portion was just a matter of following directions and modifying the code from Experiment 1 to my specifications (40 cm distance before trigger). It was the rest of the project that fried my self-image. You’ll notice I did not present on Friday, because I really had nothing to present.

Problem List:

  1. Most (almost all, in fact) of the humour sites I wanted to pull jokes from through IFTTT were not text-based but image- or video-based, and I could not code my way around an image-to-text translation requirement. This leaves aside the fact that a good 30–40% of them were highly offensive in nature.
  2. The one site I found that was (mostly, anyway) text-based (https://9gag.com/gag/a97g20j/26-clever-jokes-you-have-to-be-a-little-nerdy-to-find-funny) does not have an IFTTT link that I could locate in a search.
  3. Could not … could not … code an error-free feed or trigger through Adafruit IO.
  4. (Obviously) no feed or trigger = no link to p5.js (though I did try).
  5. On advice from Nicholas, and on Thursday night no less, I attempted to set up a simple array of text files usable by a random number generator triggered by my working Feather controller (roughly the approach in the p5.js sketch above). Nothing but crickets and error codes on my desktop at home, and the whirling beachball of death on my Mac Powerbook.

Following all of that, at 3:00 AM on Friday morning I finally surrendered. I erased all of my effort in complete frustration, caffeine overload and exhaustion, and now throw myself on the mercy of my peers and professors. Next time (perhaps … no, definitely) I’ll request assistance with coding well before the migraine sets in.