Shepko_Harkin_Exp4

Title

Get Me Coffee

 

Members

Sana Shepko and Sean Harkin

 

Code

GitHub Code

 

Project Description

A messaging system intended to notify members of a group, class, or office space when someone is going to get coffee, in case others want coffee too. The system takes the form of a small device shaped like a symbolic coffee cup (perhaps a keychain) with a toggle and two LEDs. Moving the toggle to the side of the green LED has two meanings: either that you are going to get coffee, or, as a response to someone else's message that they are getting coffee, that YES, you want coffee. Moving the toggle to either side lights up the other person's device. Moving it to the side of the red LED answers the question "do you want coffee?" with NO, you don't want coffee.
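The green position's double meaning depends on whether there is already an open "I'm getting coffee" ask. A minimal sketch of that rule (the position names and message strings here are illustrative, not taken from our actual code):

```javascript
// Map a toggle position to the message it represents.
// "awaitingReply" is true when someone else has already announced a coffee run.
function toggleMessage(position, awaitingReply) {
  if (position === "green") {
    // Green doubles as an ask ("I'm getting coffee") or an answer ("yes, please").
    return awaitingReply ? "YES_I_WANT_COFFEE" : "IM_GETTING_COFFEE";
  }
  if (position === "red") {
    // Red only makes sense as an answer to an open ask.
    return awaitingReply ? "NO_THANKS" : null;
  }
  return null;
}
```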

We envision this project being used by groups larger than two people; in an office or studio setting especially, it could save the time it takes to message a large group.

FINAL

 

20171124_130736 20171124_130746


Process Journal

DAY 1

Talked about our initial idea. We are planning to build a message notification system that communicates who is getting coffee and how many people in a group would like coffee. The way it would work is that each person (whether in a class or in an office setting) has a small button device with an LED.

We are planning to build this in a similar way to the Amazon Dash (see below).

screen-shot-2017-11-14-at-11-24-41-am

 

There are two parts to the device: an LED button and a small LED light. Each of these components signals different messages.

SCENARIO

If person A is going out to get coffee and wants to let others know they can get coffee for them too, they press their LED button. Person A, who sent out the initial signal, would see that their LED button fades High-Low. Everyone else in the space will receive this message through their LED light flashing consistently.

If, for example, person B wanted to confirm with person A that they wanted coffee, they would press their LED button, which would turn off the flashing LED on their device but turn on the LED button light, which would remain on until person A turns off their outgoing signal.

When person B presses their LED button, and if, for example, persons C and D also press theirs, person A receives a notification of how many people would like coffee: person A's LED light flashes once per person who wants coffee. This series of flashes loops, stopping briefly and then repeating.
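The flash-count notification described above can be sketched as a small function that builds the on/off pattern person A's LED would loop through. The timings and names here are assumptions for illustration:

```javascript
// Given the number of people who confirmed, build one cycle of the
// looping flash pattern: N flashes, then a pause before repeating.
function flashPattern(confirmations, onMs = 200, offMs = 200, pauseMs = 1000) {
  const pattern = [];
  for (let i = 0; i < confirmations; i++) {
    pattern.push(["on", onMs], ["off", offMs]);
  }
  pattern.push(["off", pauseMs]); // pause before the loop repeats
  return pattern;
}
```

Each cycle is replayed until person A turns off their outgoing signal.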

dsc_0033

This is our initial idea and starting point, and we will see how it develops!

For today:

– begin to explore code with featherboard

DAY 2

Planning on 3D printing our devices. Some sketches:

The original idea was to design the casing small enough to attach to a keychain or to the side of your work monitor. However, due to the size of the Feather and the battery, we realized very quickly that this would not be possible. If the product were developed further, we would design and build a custom board to shrink the overall size of the product.

For the battery we chose a 3.7V 1000mAh lithium-ion polymer cell. Our relatively simple device can easily be powered from 3.7V, which avoids the transistors that higher-voltage batteries might require. Since LiPo batteries are also rechargeable, both Sana and Sean will be able to reuse the power source for future projects.

cad-screengrab-2 cad-screengrab-4 cad-screengrab


We revised our sketches to these designs:

lid-sg n-top-sg-2 n-top-sg-3


DAY 3

Thinking about our coffee theme and the color palette we might use.

screen-shot-2017-11-20-at-9-50-12-am

 

 

screen-shot-2017-11-20-at-10-23-21-am

Also talked to Nick during class about our current plan and our progress.

Some interesting points that came out of this:

  • Although we were originally planning on having only one button, this complicates our project, since a button push would have more than one meaning: an initial press would mean "I am getting coffee" and function as an ask to other people with the device, while a second press, sent back by others, would function as a "yes" or "no" answer to that ask. Nick initially suggested modes, but it seems we won't be able to achieve this easily. The easiest solution: include, along with the button, some sort of toggle or switch that functions as the answer to the ask. One value of the toggle would mean yes, the other no.
  • That being said, we will now have the following components: the Feather, an LED button (to be purchased today), an orange LED, and a toggle. The main functions of the Get Me Coffee device will be to ASK and ANSWER.
  • Another problem: Sana's button does not seem to be publishing to PubNub consistently. Hopefully this will not be a long-term issue!

 

Sana testing a tactile switch code:

Got tactile button to somewhat work:

screen-shot-2017-11-21-at-1-29-41-am

DAY 4

Ok, so we have gotten some things to work, which is a great sign!

For documentation purposes, we have decided that both of our devices will use the same variable names:

myval1 = button switch

myval2 = toggle switch

This way, when we subscribe to each other's messages, we will know when writing functions that yourval1 always refers to the button value and yourval2 always refers to the toggle value.
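The naming convention can be sketched as a shared message shape (the PubNub wiring is omitted here; this only shows the payload convention, and the helper names are hypothetical):

```javascript
// Sender side: val1 is always the button, val2 is always the toggle.
function buildMessage(buttonState, toggleState) {
  return { myval1: buttonState, myval2: toggleState };
}

// Receiver side: the same fields read as the *other* device's values.
function readMessage(msg) {
  return { yourButton: msg.myval1, yourToggle: msg.myval2 };
}
```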

DAY 5

We have had to move from our original idea of having both the button and the toggle to having just the toggle as the main communication component. We managed to print our casing today; however, there were some issues. To begin with, the parts were designed using Autodesk's Inventor software, and when exported as .STL files for printing they scaled unpredictably, meaning we had to resize the components very quickly. Unfortunately this meant the fit between the case and lid was slightly tighter than designed; with some extra fabrication we were able to compensate. The other issue was that, due to time constraints, the casing had to be sent for printing before we had built the full circuitry, so we underestimated the height required for the internal components. This required us to rethink the casing.

We have also found that although we originally wanted the message to be sent to PubNub only ONCE per toggle switch, constantly publishing the toggle state turned out to be a more achievable and realistic way to write the code.
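The publish-once approach we abandoned amounts to edge detection: only publish when the toggle state differs from the last value seen. A sketch of that idea (an illustrative helper, not our actual code):

```javascript
// Wrap a publish function so it only fires when the state changes.
function makeChangePublisher(publish) {
  let last = null;
  return function (state) {
    if (state !== last) { // publish-once: only when the toggle moves
      last = state;
      publish(state);
    }
  };
}
```

Publishing on every loop iteration is simpler and more forgiving: if one message is dropped, the next one carries the same state anyway, which is part of why the continuous approach was more realistic for us.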

20171124_163611 20171124_163625


Sana came up with the idea of using actual coffee cups to contain the internal components.

Unfortunately the product was not fully functional for the critique, though we have included video of both devices communicating successfully (the video can be found here: https://www.dropbox.com/s/buww21m29l2080w/WorkingVid.mp4?dl=0 ).

As you can see, the product was working as intended (if a little slowly, due to connectivity issues with PubNub). The issues arose when we began soldering the components to the protoboards: Sean's board would not communicate after being soldered, and we never found the exact cause. The obvious suspect was a short in the soldering, but we could not find one under examination. We remedied this by removing the components from the protoboard and using a mini-breadboard, which gave us the flexibility of a more mobile device while ensuring functionality.

The next issue came the next day, when we realized we would have to upload the code to both devices at the same time to boot the sequence. If we did not upload the code at the same time, or if we disconnected the boards from our computers, the devices would stop publishing/subscribing to PubNub. Unfortunately we discovered this the morning of the presentation and were unable to correct it in time.

 

References

Battery specs:

http://www.canadarobotix.com/battery-chargers/battery-lithium-1000mah

Figuring out how to use the tactile switches:

https://www.youtube.com/watch?v=n0VbHPB_2Ws

https://www.youtube.com/watch?v=tmjuLtiAsc0

http://www.instructables.com/id/Use-a-Momentary-or-Tactile-Switch-as-a-Pushbutton-/

https://learn.adafruit.com/adafruit-arduino-lesson-6-digital-inputs/arduino-code

 

Project Context

Some more information on the Amazon Dash buttons:

https://www.cnet.com/news/appliance-science-how-the-amazon-dash-button-works/

Everyone loves coffee. The idea for the initial concept came from months spent in the studio with everyone grabbing coffee for each other. Many times we would be travelling to the studio and would think about messaging our team to ask if anyone wanted a coffee, but that takes the time needed to type and send a message. It would be far more convenient to send out a signal letting our peers know we were getting coffee, enabling them to ask us to bring them one with one click of a button. The initial concept also included a p5 page which could track how many times individual users had been for coffee, so the group would know who was slacking on coffee-retrieval responsibilities.

 

 

Shepko_Experiment3

Sana Shepko, Creation & Computation: Experiment 3 – Digital Bus Stop

Description: An LED notification system designed to let the user know when the 504 bus is around 15, 10, or 5 minutes away based on the color of the LED that lights up.

Code on Github

Final demo

Process Journal

Day 1 –

Thinking about what might be useful for me to have. The first thing that came into my head was something that would remind me of the things going on in my own body, and how (whether I like it or not) it affects the way I think, work, and feel when I'm on the computer. As a person who has experienced periods for several years, I know the effect they have on my body. A notification system external to the computer would be useful to remind me of my physicality in an otherwise very digital working environment.

If this turns out not to work very well, I may instead create a similar notification system for to-do lists.

DAY 2

Struggled with even figuring out how to pull data from an API into my p5 file; started by watching this: https://www.youtube.com/watch?v=ecT42O6I_WI

DAY 3

After exploring implementing this API https://github.com/jessamynsmith/eggtimer-server , I think I may have to search for more usable ones. This one's documentation is not easy to understand, or very rich in resources! However, open-source APIs within the realm of women's health are few and far between, and although I was very interested in the connection of body and technology, I think I will have to depart slightly from this initial concept.

Moving forward, I am interested in creating a notification system specifically for Google calendar events.

 

screen-shot-2017-11-07-at-11-09-03-pm

I started with reading up on how to use the Google Calendar API:

 

https://murze.be/2016/05/how-to-setup-and-use-the-google-calendar-api/

Which, to be honest, was not very illuminating. As time went on, I remembered that IFTTT was available, and began to explore it more fully.

As pictured above, I've figured out how to get IFTTT to talk to Adafruit IO, but am having trouble getting each tweet to be counted as a data point, which I believe is necessary in order to have any sort of output from the API, as you can see in this next image.

The way that I understand it is if there’s no numeric data in the API it won’t trigger a response in Arduino.

http://www.instructables.com/id/WIFI-Notification-Flag/

^^helpful to figure out how to connect feather to adafruit io

http://www.makeuseof.com/tag/the-ultimate-ifttt-guide-use-the-webs-most-powerful-tool-like-a-pro/

^^article on using IFTTT

Trying out the code:

screen-shot-2017-11-08-at-5-38-27-pm

DAY WHATEVER

photo-on-11-8-17-at-5-02-pm

 

screen-shot-2017-11-08-at-5-08-00-pm

Thinking about ways to physically structure this project. I want to work on some sketches, but it would be a good idea to extend the legs of the LED so that the light can poke out of some structure that hides the breadboard. Thinking about possibly using a styrofoam bowl or rounded top.

https://learn.adafruit.com/adafruit-io-basics-analog-output/arduino-setup

^^OK THIS ONE IS ACTUALLY HELPFUL

HOWEVER, it looks like all of the Adafruit examples use Wi-Fi boards, not USB ones, so this is causing some problems in my understanding.

Starting over from the basics to sort of begin to understand better?

Got multiple LEDs to be controlled by the mouse keys; now trying to figure out how to implement IFTTT data (google calendar data, essentially) and have that as the initiator for the LEDs.  

 

Questions for Nick during office hours:

Is the fact that Adafruit IO isn't reading the pieces of data from IFTTT as numeric data going to affect the way the feed shows in the API? How do I get it to read event instances as a 1-or-0 piece of data? Probably I should figure out how to run it constantly at 0 and then bring it to 1 when an event notification happens??

How do I implement the functionality of outputting to the Arduino file and also pull data from the IFTTT applet (like the IFTTTconnectv2_poling example)? (The numeric data might have something to do with this?)

And then basically talk about my plan… an anthropomorphic device that lives near the computer, whose eyes light up when an event notification comes up…

Materials? Etc.?

Thinking about ways to visualize this; I'm thinking it'll be some sort of anthropomorphized cute robot that would live near your computer, static except when its eyes light up to notify you… Similar to this: https://www.geeky-gadgets.com/apripetit-the-talkative-and-cute-robot-03-06-201/

screen-shot-2017-11-08-at-6-39-19-pm

My agenda for tomorrow (thursday omg!!!!!!)

  • Figure out materials
  • Build it
  • Figure out how to connect api data to getting the lights to turn on!!!!
  • Is there a way to have the lights on some sort of timer? So they come on 15 min before event starts, and stay on until event begins or something?
  • IF I HAVE TIME, try to figure out sound activation… but realistically I probably won't have time (unless I suddenly become a genius within the next 24 hours)
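The timer idea in the list above (lights on 15 minutes before an event, off once it starts) can be sketched as a simple window check. The function name and millisecond representation are assumptions for illustration:

```javascript
// True when "now" falls inside the lead-up window before the event starts.
// Times are epoch milliseconds; the lead time defaults to 15 minutes.
function lightShouldBeOn(nowMs, eventStartMs, leadMs = 15 * 60 * 1000) {
  return nowMs >= eventStartMs - leadMs && nowMs < eventStartMs;
}
```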

https://creative-coding.decontextualize.com/changes-over-time/

Ok so I am going to explore other APIs, AGAIN. Looking for something that will change more often (and something I don't have to constantly input into the calendar myself).

At this point my physical setup is a wooden box with three "light" bulbs (actually clear Christmas ornaments) resting on top. My plan is to have each of these bulbs represent something different about the current weather in Toronto: green means a 50%-or-greater chance of precipitation, red means the temperature is above 10°C, and blue means it is 10°C or below.
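The bulb rules above can be written as one small function, which also makes clear that red and blue are mutually exclusive while green is independent (the function name is hypothetical):

```javascript
// Which bulbs are lit for a given temperature (°C) and chance of precipitation (%).
function bulbsFor(tempC, precipPct) {
  return {
    green: precipPct >= 50, // 50%+ chance of precipitation
    red: tempC > 10,        // warmer than 10°C
    blue: tempC <= 10,      // 10°C or colder
  };
}
```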

I’m trying to understand how to draw data from this weather API but it’s not working for me.

screen-shot-2017-11-09-at-9-10-11-pm

THE POSITIVE IS THIS: I feel like I am getting it a little bit??? Don't want to jinx myself but I MAY BE UNDERSTANDING THINGS YAY

Day 6

After messing around with various APIs and trying to get them to work, I finally settled on the most accessible solution with so little time before the presentation on Friday. I decided to use the TTC API that Nick and Kate had demonstrated in class and base my code on that example. My biggest struggle was understanding how to get Arduino and p5 to talk to each other in a way that let Arduino use that data to turn on a light. It took a lot of tweaking.

My first problem when starting to work with the TTC API was that the data was triggering the lights to turn on, which was a good start, but the problem was that they were ALL turning on at once.

dsc_0019

I was struggling a lot with this portion of the project. During my meeting with Nick we had talked about serial.write in p5 and how that is the data sent to and read by Arduino; initially I had set my code up something like this:

screen-shot-2017-11-09-at-10-54-41-pm


But this wasn't working well in my case, since I wanted three different types of data to be read by Arduino to light up three different lights in various instances. I had to write different if statements in p5, basically like "if minutesTilNext is [data] then serial.write([number])." While writing these statements I had to figure out the proper syntax for a range of values in an if statement, which was exciting to learn!
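The range-style if statements look roughly like this sketch, which buckets minutesTilNext into one code per light. The exact cutoffs and return values here are illustrative, not the project's actual numbers:

```javascript
// Bucket the minutes-until-next-bus value into one code per LED,
// using &&-combined comparisons to express each range.
function codeForMinutes(minutesTilNext) {
  if (minutesTilNext > 10 && minutesTilNext <= 15) return "B"; // ~15 min away
  if (minutesTilNext > 5 && minutesTilNext <= 10) return "Y";  // ~10 min away
  if (minutesTilNext >= 0 && minutesTilNext <= 5) return "R";  // ~5 min away
  return null; // out of range: no light
}
```

In p5 the returned code would then be sent with serial.write so that exactly one light responds.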

I was using these resources during this part of the project:

From API to LED: First Connection

https://www.arduino.cc/en/Tutorial/SwitchCase2

Ultimately, writing a raw number with serial.write was not working and was not read accurately by Arduino; there were problems with the lights flickering, or all/none of them turning on. What finally worked was ASCII encoding: in p5 I set the red LED to 'R', which Arduino reads as 82. This worked amazingly, and honestly I have no idea how I figured it out, but thank goodness!
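Why this works: serial.write('R') sends a single byte, and on the Arduino side Serial.read() returns that byte's ASCII value, which for 'R' is 82. The same correspondence in JavaScript:

```javascript
// The byte that goes over the wire when p5 writes 'R'.
const byteSent = "R".charCodeAt(0); // 82, the value Arduino compares against

// Decoding it back confirms the round trip.
const charBack = String.fromCharCode(82); // "R"
```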

Also, a little flop on my part: while trying to figure out why the Arduino wasn't working perfectly, I found that in one of my frantic edits I had set the if{} to one pin and the else{} to another in my Arduino if statements. Only thanks to Max (thank you, Max) did I figure this out.

And with that, I finished up the project by styling the website a little bit more, and done!

Project context

Working in the age of information ensures that we have access to an enormous amount of information at all times. Often, this data is also programmed to notify us of certain events: our phones constantly let us know if someone has commented, liked, or reached out, and even tell us about breaking news or the weather if we want them to. One research project I found particularly interesting was "An In-Situ Study of Mobile Phone Notifications." This study reinforces the idea that phone notifications create distractibility and decrease a person's ability to focus. I would argue that my project, in its physicality, is a compromise between our need for certain information and the way our screen technology distracts us.

Digital Bus Stop is also not quite a notification system; I would describe it as a physical data visualization, although I suppose a notification is essentially the same thing.

Another project that is closely aligned with mine (and which helped to guide me!) is this project by Caleb Ferguson. This was also an LED setup, with two lights letting you know whether the price of Ethereum (a cryptocurrency) has gone up or down in the past hour. Ferguson took this a step further by programming the LEDs' brightness to reflect the amount of change from one hour to the next.

MadLibs_Experiment1_ShepkoShaoShinkaruk

WATER BOUNCE

An interactive music experience, Water Bounce is a device that allows the user to control the pitch of sound through light using three light sensors.

Project members: Sana Shepko, Yiyi Shao, Savaya Shinkaruk

Code:

Our Arduino code. 

Circuit Layout:

22292581_10159458279115451_1714976350_o

Supporting Visuals:

OUR PROTOTYPE VIDEO IS HERE

img_2756-1024x768

Process Journal:

 

Our blog is here! 

Project Context & Bibliography:

Water Bounce is a project which sits within the music DJ industry. As technology evolves, so does the way we interact with it, and Water Bounce allows people to engage with sound in an innovative way. Sana, Savaya, and Yiyi are all interested in evolving media and in creating fun environments for user experience. Our research was done primarily on YouTube.

In our research, we discovered that many projects involve sound affecting light, as in the many versions of water speakers seen in the YouTube videos we've cited. Water Bounce differs in that light affects the sound, a reversal of the other projects we found.
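The light-affects-sound idea can be sketched as a mapping from an analog light-sensor reading to a pitch. The 0–1023 input range matches a typical Arduino analog pin, but the frequency range and linear mapping here are assumptions, not the project's actual values:

```javascript
// Scale a light-sensor reading (0–1023) linearly into a frequency range.
function sensorToPitch(reading, loHz = 220, hiHz = 880) {
  const t = Math.min(Math.max(reading / 1023, 0), 1); // clamp to [0, 1]
  return loHz + t * (hiHz - loHz);
}
```

With three sensors, each reading would drive its own pitch in the same way.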

Water Bounce is a literal interpretation of our MadLibs words: water as material, light as input, sound as output, and floaty as adjective. We believe these constraints led us to create something wholly original and innovative, as we were required to think outside the box.

Bibliography:

ShowtimeSPL. (2013, October 9). Sound, Bass, Water, Sound makes water come alive with cymatics [Video file]. Retrieved from https://www.youtube.com/watch?v=THUMdTohWkI&feature=youtu.be

Al3xxxK. (2013, September 17). Water Speakers With Martin Garrix – Animals. [Video file]. Retrieved from https://www.youtube.com/watch?v=HI0cw8M7aDE&feature=youtu.be

BSHAB. (2014, May 11). home made water speakers [Video file]. Retrieved from https://www.youtube.com/watch?v=NrVtXI7OsaQ&feature=youtu.be

What’s Inside?. (2016, February 15). What’s inside a Water Fountain Speaker? [Video file]. Retrieved from https://www.youtube.com/watch?v=oYerOxLzbKQ&feature=youtu.be

All About Circuits. (2016, November 3). Build a Touchless MIDI Controller with an Arduino [Video file]. Retrieved from https://www.youtube.com/watch?v=lSX880PzL_A&t=114s

amandaghassaei. “Send and Receive MIDI With Arduino.” Instructables. Web, 2017.