Sana Shepko, Creation & Computation: Experiment 3 – Digital Bus Stop

Description: An LED notification system that lets the user know when the 504 bus is around 15, 10, or 5 minutes away, indicated by the color of the LED that lights up.

Code on GitHub

Final demo

Process Journal

Day 1 –

Thinking about what might be useful for me to have. The first thing that came into my head was something that would remind me of the things going on in my own body, and how (whether I like it or not) it affects the way I think, work, and feel when I'm on the computer. As a person who has experienced periods for several years, I know the effect they have on my body. As far as what would be personally helpful for me, I think having some sort of notification system external to the computer would be useful to remind me of my physicality in an otherwise very digital working environment.

If this turns out not to work very well, I may also create a similar notification system, but with to-do lists instead.


Struggled with even figuring out how to pull data from an API into my p5 file; started by watching this


After exploring this API's implementation, I think I may have to search for more usable ones. This one's documentation is not easy to understand or very rich in resources! However, open-source APIs that fit within the realm of women's health are few and far between, and although I was very interested in the connection of body and technology, I think I will have to depart slightly from this initial concept.

Moving forward, I am interested in creating a notification system specifically for Google calendar events.



I started with reading up on how to use the Google Calendar API:

To be honest, it was not very illuminating. As time went on, I remembered that IFTTT was an available resource, and began to explore it more fully.

As pictured above, I've figured out how to get IFTTT to talk to Adafruit IO, but I am having trouble getting each tweet to be considered a data point, which I believe is necessary in order to have any sort of output from the API, as you can see in this next image.

The way I understand it, if there's no numeric data in the API, it won't trigger a response in Arduino.

^^ helpful for figuring out how to connect the Feather to Adafruit IO

^^article on using IFTTT

Trying out the code:





Thinking about ways to physically structure this project. I want to work on some sketches, but it would be a good idea to extend the legs of the LED so that the light can poke out of some sort of structure that would otherwise hide the breadboard. Thinking about possibly using some sort of styrofoam bowl or rounded top.


HOWEVER, it looks like all of the Adafruit examples use WiFi boards, not USB ones, so this is causing some problems in my understanding.

Starting over from the basics to sort of begin to understand better?

Got multiple LEDs to be controlled by the mouse and keyboard; now trying to figure out how to implement IFTTT data (Google Calendar data, essentially) and have that as the initiator for the LEDs.
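As a rough sketch of that step (the function, key bindings, and command characters here are my own placeholders, not the original code), the key-to-LED mapping in the p5 sketch might look like:

```javascript
// Map a pressed key to a one-character LED command for serial.write().
// Hypothetical bindings: keys '1', '2', '3' each control one LED.
function keyToLedCommand(key) {
  const mapping = { '1': 'A', '2': 'B', '3': 'C' }; // one char per LED
  return mapping[key] || null; // null = key not bound to any LED
}

// In p5, keyPressed() would then do something like:
//   const cmd = keyToLedCommand(key);
//   if (cmd) serial.write(cmd);
```

The point is just to keep a single, explicit lookup between inputs and LED commands, so the same structure can later take API data instead of keystrokes.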


Questions for Nick during office hours:

Is the fact that Adafruit IO isn't reading the pieces of data from IFTTT as numeric data going to affect the way the feed shows in the API? How do I get it to read event instances as a 1 or 0 piece of data? I should probably figure out how to run it constantly at 0 and then, when an event notification happens, bring it to 1??

How do I implement the functionality of outputting to the Arduino file and also pull data from the IFTTT applet (like the IFTTTconnectv2_poling example)? (The numeric data might have something to do with this?)

And then like basically talk about my plan… anthropomorphic device that lives near computer, eyes light up when event notification comes up…

Materials? Etc.?
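The 0-or-1 feed idea from the first question could be sketched like this (function and parameter names are mine; this never made it into the final build):

```javascript
// Hold the feed at 0, and flip it to 1 while "now" falls inside the
// lead window before a calendar event starts (e.g. 15 minutes before).
function feedValue(nowMs, eventStartMs, leadMinutes) {
  const leadMs = leadMinutes * 60 * 1000;
  const delta = eventStartMs - nowMs; // time remaining until the event
  return (delta >= 0 && delta <= leadMs) ? 1 : 0;
}
```

Polling this on a timer would give Adafruit IO a numeric value that sits at 0 and jumps to 1 in the window before an event, which is exactly the kind of data point the feed seemed to need.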

Thinking about ways to visualize this; I'm thinking it'll be some sort of physical object/anthropomorphized cute robot thing that would live near your computer, static except for when the eyes light up to notify you… similar to this:


My agenda for tomorrow (Thursday omg!!!!!!)

  • Figure out materials
  • Build it
  • Figure out how to connect api data to getting the lights to turn on!!!!
  • Is there a way to have the lights on some sort of timer? So they come on 15 min before event starts, and stay on until event begins or something?
  • IF I HAVE TIME, try to figure out sound activation… but realistically I probably won't have time (unless I suddenly become a genius within the next 24 hours)

OK, so I am going to explore other APIs, AGAIN. Looking for something that will change more constantly (and also something that I don't have to constantly input into the calendar myself).

At this point my physical setup is a wooden box upon which three "light" bulbs (actually clear Christmas ornaments) are resting. My plan is to have each of these bulbs represent something different about the current weather in Toronto: blue and red represent temperature, and green represents precipitation. If green is on, there is a 50%+ chance of precipitation; if red is on, the temperature is above 10°C; if blue is on, the temperature is 10°C or below.
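Those thresholds can be written down as a small lookup, which is roughly what the p5 sketch would need to decide which bulbs to light (the function name is mine; the cutoffs are the ones from my plan):

```javascript
// Decide which bulbs light up for the current Toronto weather.
// green = precipitation chance >= 50%,
// red   = temperature above 10 C,
// blue  = temperature 10 C or below.
function bulbsFor(tempC, precipChancePct) {
  return {
    green: precipChancePct >= 50,
    red: tempC > 10,
    blue: tempC <= 10,
  };
}
```

Note that red and blue are mutually exclusive (exactly one is always on), while green can light up alongside either of them.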

I’m trying to understand how to draw data from this weather API but it’s not working for me.


THE POSITIVE IS THIS: i feel like i am getting it a little bit??? Don’t want to jinx myself but i MAY BE UNDERSTANDING THINGS YAY

Day 6

After messing around with various APIs and trying to get them to work, I finally settled on the most accessible solution, with so little time before the presentation on Friday. I decided to use the TTC API that Nick and Kate had demonstrated in class and base my code on that example. My biggest struggle was understanding how to get Arduino and p5 to talk to each other in a way that let Arduino use that data to turn on a light. It took a lot of tweaking.

My first problem when starting to work with the TTC API was that the data was triggering the lights to turn on, which was a good start, but they were ALL turning on at once.

I was struggling a lot with this portion of the project. Basically, during my meeting with Nick we had talked about serial.write in p5 and how that is the data that is sent to and read by Arduino; initially I had set my code up something like this:






But this wasn’t working well in my case since I wanted to have 3 different types of data specifically to be read by Arduino to light up 3 different lights in various instances. I had to write some different if statements in p5 which were basically like “if minutesTilNext is [data] then serial.write([number]).” During the process of writing these statements I had to figure out the proper syntax for a range of values in an if statement, which was exciting to learn!

I was using these resources during this part of the project:

From API to LED: First Connection

Ultimately, writing a random number in serial.write was not working and was not read accurately by Arduino; there were problems with the light flickering, or all/none of them turning on. What finally ended up working was using ASCII encoding: in p5 I set the red LED as 'R', which was read by Arduino as 82. This worked amazingly and honestly I have no idea how I figured this out, but thank goodness!
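In character form, the working version looks something like this (only the 'R' ↔ 82 pairing is from my actual code; 'Y' and 'G' here are guesses at what the other two characters might be):

```javascript
// p5 sends a single character per state; Arduino's Serial.read()
// receives it as the character's ASCII code (e.g. 'R' arrives as 82),
// so the Arduino side can compare the incoming byte against 82.
function ledChar(minutesTilNext) {
  if (minutesTilNext >= 0 && minutesTilNext <= 5) return 'R';  // red: bus is close
  if (minutesTilNext <= 10) return 'Y';                        // hypothetical 2nd color
  if (minutesTilNext <= 15) return 'G';                        // hypothetical 3rd color
  return null; // nothing to send
}

// What Arduino sees when p5 does serial.write('R'):
const byteSeen = 'R'.charCodeAt(0); // the ASCII code for 'R'
```

Sending one well-defined byte per state is why this cured the flickering: there's no ambiguity about how the value is encoded on the wire.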

Also, a little flop on my part was trying to figure out why Arduino wasn't working perfectly; apparently, during one of my frantic edits, I had set up the if{} in my Arduino if statements with one pin and the else{} with another, and only thanks to Max (thank you Max) did I figure this out.

And with that, I finished up the project by styling the website a little bit more, and done!

Project context

Working within the age of information ensures that we have access to an enormous amount of information at all times. Oftentimes, this data is also programmed to notify us of certain events; our phones are constantly letting us know if someone has commented, liked, or reached out, and they even tell us about breaking news or the weather if we want them to. One research project that I found particularly interesting was "An In-Situ Study of Mobile Phone Notifications." This study reinforces the idea that phone notifications create distractibility and decrease a person's ability to focus. I would argue that my project, in its physicality, is a compromise between our need for certain information and the way that our screen technology distracts us.

Digital Bus Stop is also not quite a notification system; I would describe it as a physical data visualization, although I suppose a notification is essentially the same thing.

Another project that is closely aligned with mine (and which helped to guide me!) is this project by Caleb Ferguson. Essentially this was also an LED setup, with two lights letting you know whether the price of Ethereum (not sure what this is) has gone up or down in the past hour. Ferguson took this a step further by programming the LEDs' brightness to reflect the amount of change from one hour to the next.
