Final Project Idea Revision – Muybridge Jump

In keeping with the theme of re-examining early cinematic technologies, I have a revised project idea based on Muybridge’s motion experiments, essentially the first high-speed photography. Muybridge set out to settle whether a running horse’s feet ever leave the ground all at once, and the resulting images display a very simple fascination with motion, flight, and gravity. We still share that fascination, as can be seen in the popular ‘jump’ photos and videos that show subjects with their feet off the ground. I want to take this fascination and recreate it in a collective space to ask why it persists, and to reconstruct Muybridge’s imaging with Processing in order to ask these questions.

Here is a breakdown of the steps:

-LED flashes to signal participant to jump

-capture jumping action  (Arduino button, camera button or pressure sensor to activate image capture)

-set length of capture (one second or 25 frames)

-store to buffer (PImage)

-Processing buffer to replay captured images in Muybridge matrix or filmstrip template

I’ve found Processing code to implement the matrix. Now I’m adding a boolean for the camera button and working out how the image buffer should work. Ideally it would store and replay the jumps from various people and shuffle the images within the matrix. I’d like to modify it in the future to display only images with feet off the ground, but right now that seems like a second step.
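A minimal sketch of that capture-and-replay buffer, written here in plain C++ for illustration (the real sketch would hold PImage objects; the 25-frame length comes from the step list above, and the shuffle step stands in for mixing jumps from different participants):

```cpp
#include <algorithm>
#include <random>
#include <vector>

// Fixed-size capture buffer: stores the last N "frames" (frame IDs here,
// standing in for PImage objects) and replays them in capture order.
struct FrameBuffer {
    static const int kFrames = 25;        // one second at 25 fps
    std::vector<int> frames;

    void capture(int frameId) {
        if ((int)frames.size() == kFrames)
            frames.erase(frames.begin());  // drop the oldest frame
        frames.push_back(frameId);
    }

    // Replay order for the Muybridge matrix: oldest to newest.
    std::vector<int> replay() const { return frames; }

    // Optional shuffle, for mixing jumps from several participants.
    std::vector<int> shuffled(unsigned seed) const {
        std::vector<int> out = frames;
        std::shuffle(out.begin(), out.end(), std::mt19937(seed));
        return out;
    }
};
```

In Processing the same idea would be a `PImage[]` of length 25 written round-robin each frame while capture is active.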




AR Project Proposal

I am proposing to write software that will use live camera tracking and function as an Augmented Reality application.

I am interested in using location data to add layers on the screen about a specific location, allowing a person to learn specific pieces of information about that area. The idea is seeing through revealing: while viewing a specific landscape on the tablet, the user can peel back layers overlaid on the screen. Some ideas I currently have are historical photos of the location, crime data, and historical events that have taken place there. I want to create a user experience that is playful but also informative. I also want to play with the idea of Augmented Reality as a pseudo-reality.

-C++ and OpenFrameworks

Recently Amazon released their Flow app, which scans products and finds prices, reviews, and similar products in their system. It’s a really smooth way of mixing “reality” with overlays of information.

Similar Work:
Bionic Eye

Final Assignment Proposal

For my final project, I want to push myself to learn more about Processing, or at least really reinforce what I should already know. With this as the goal, the emphasis will likely be on engaging the user with a screen interaction. For inspiration, I looked to some of the well-known games that have engaged people for generations. They have done so in part because of their simplicity, and simplicity as a concept bodes well for my ability to actually create something interesting that works. While thinking of some of the first games I played, I also remembered some of the first toys I played with.

Believe it or not, when I was a kid this was as high-tech as it got. When the Etch-a-Sketch came out, it was the biggest thing to hit the “fun” market since the hula hoop. I just had to have one, and I think it’s still my favourite toy today. Conceptually, I thought it would be interesting to remind the user just how far we have come with human/screen interaction. The idea of constraining a MacBook Pro to such a simplistic graphic interface would highlight the point.

THE APPROACH – A laptop would serve as the Etch-a-Sketch screen. An image representing the original Etch-a-Sketch would be loaded into the Processing sketch as a background image. The Arduino would be secured within a customized enclosure with two knobs or dials that function the same way as the original’s. The left dial controls the horizontal x-position and the right dial the vertical y-position. I hope I can buy two potentiometers with a bit more accuracy than the one used in our serial communication lab.
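The dial-to-cursor mapping can be sketched in plain C++ using the same integer arithmetic as Arduino’s map() (10-bit analog reads are assumed; the screen size is a placeholder — the real Processing sketch would use width/height):

```cpp
// Map a 10-bit potentiometer reading (0-1023) onto another range,
// using the same integer arithmetic as Arduino's map().
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

struct EtchCursor {
    int x = 0, y = 0;
    // w and h are placeholders for the drawing area's dimensions.
    void update(int leftPot, int rightPot, int w, int h) {
        x = (int)mapRange(leftPot, 0, 1023, 0, w - 1);   // left dial: horizontal
        y = (int)mapRange(rightPot, 0, 1023, 0, h - 1);  // right dial: vertical
    }
};
```

Each frame, the sketch would draw a line from the previous (x, y) to the new one, which is what produces the Etch-a-Sketch trail.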

I believe this would be a fairly simple project, but I have come to learn that I am advancing more slowly than others in the class, so I want to start with something simple that I can add on to. In this case, I might add a motion sensor to the Arduino that would erase the drawing if the “box” is shaken while turned upside down, which is how the original Etch-a-Sketch functioned. In addition, it would be interesting to use the camera on the laptop to capture an image of the user while they sketch. The line would sample the video capture, allowing them to create a self-portrait, similar to a previous Processing exercise completed in Week 4, where a line was programmed to draw using the random function. I experimented a bit with this sketch by altering the code in the serial communication lab. I wanted the potentiometer and light sensor to control the line, but I’m stuck on that. Here is the code I tried for the Processing sketch, and I’ve included the original from the Week 4 assignment. Not sure how to make this work…


I believe that I will need the following hardware to build my etch-a-sketch:

  1. Two rotary potentiometers – one to control the horizontal value and the other for the vertical values.
  2. Two knobs to add the finishing touch. I will be looking for ones that fit onto the potentiometers.
  3. A dual-axis accelerometer with digital PWM or analog output. The type I looked at can measure both static and dynamic acceleration, which means it is well suited both for sensing tilt and for sensing brute acceleration (rocketry and general motion-sensing applications). I hope this will allow for the “erase” function when shaken.
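A sketch of how the shake-to-erase check might work, in plain C++ (the resting baseline and threshold are guesses that would need tuning against the real accelerometer; “upsideDown” is a placeholder flag read from the static tilt component):

```cpp
#include <cmath>

// Detect a "shake" from dual-axis accelerometer readings by checking how
// far the combined acceleration strays from the resting baseline.
// Baseline (512, mid-scale on a 10-bit read) and threshold are placeholders.
bool isShake(int ax, int ay, int restX = 512, int restY = 512,
             int threshold = 200) {
    double dx = ax - restX;
    double dy = ay - restY;
    return std::sqrt(dx * dx + dy * dy) > threshold;
}

// Erase only when shaken while upside down, like the original toy.
bool shouldErase(bool upsideDown, int ax, int ay) {
    return upsideDown && isShake(ax, ay);
}
```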


The other idea that I am considering sprang from a project we are working on at our agency. The client is Big Brothers Big Sisters, and the program we are promoting is the in-school mentoring program. They have a particular need for young men at the moment, so activities will target university students, since their flexible schedules fit so well with this program. The campaign theme is “Start Something Big,” and volunteers will be busy raising awareness at various university fairs.

We are looking for ways to engage students at the Big Brothers Big Sisters trade-show booth. In addition to printed brochures and an overview video, it would be fun to engage interested volunteers with an interactive installation at the booth. They were going to give out “branded” balloons with the “Start Something Big” theme, and I wondered if participants could add a virtual balloon to the BBBS website to signal their commitment. This would involve building a device that people could blow into (or pump up with a bicycle pump). As they blow into the device, a balloon would inflate on-screen. After three “blows” the balloon is released and joins the others in the sky, creating a very colourful display. I think I would need a sensor to detect the physical act of someone blowing into the device. The challenge would be in designing an interesting visual display of this action within Processing. I have not thought this one through yet, but I would be interested in feedback on the feasibility of either idea at this point.
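The three-blow release could be sketched as a small counter, shown here in plain C++ (the sensor thresholding that detects a breath is left out; blow() would be called once per detected breath):

```cpp
// Balloon state for the "Start Something Big" booth idea: each detected
// blow grows the balloon; after three blows it is released into the sky.
struct Balloon {
    int blows = 0;
    bool released = false;

    // Called once per detected blow; returns true once the balloon releases.
    bool blow() {
        if (released) return false;
        blows++;
        if (blows >= 3) released = true;
        return released;
    }

    // Size of the on-screen balloon as a fraction of full inflation.
    double inflation() const { return released ? 1.0 : blows / 3.0; }
};
```

A released balloon would then be handed off to a simple particle system that floats it up to join the others.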

Thanks in advance for any input.

Revised Final Project Idea: Liz, Shuting & Linh

After a bit of revision and another couple of rounds of free-form / verbal and annotated brainstorming, we’ve come up with another final project idea. We had the opportunity to run the idea by Jim on Thursday evening and he was able to give us some extra direction on the specific sensors to look into.

Our new concept is based on asking a “user” or gallery-goer to consider their personal impact on the world at a global scale: how human life affects the health and wellness of the natural world. We are often asked to think of our natural world as “Mother Earth.” In this gallery installation, we’re turning the tables and repositioning the individual as the caretaker, or “Mother of the Earth.” The installation will consist of a replica of the Earth dangling from the ceiling, which encases the Arduino, breadboard, and sensors. The user’s engagement with the ball through shaking or tilting it, the level and range of sound coming from how one speaks to the Earth, and the amount of light the user shines on the Earth will trigger repercussions displayed on the wall (or in a frame on a white gallery wall): video footage of natural environmental occurrences — floods, earthquakes, melting glaciers, tsunamis, etc.

We’ll be utilizing three sensors: an X/Y/Z-axis sensor (accelerometer); a light sensor; and a sound sensor.
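One way to sketch the sensor-to-footage mapping, in plain C++ (the clip names, priorities, and thresholds are illustrative placeholders, not calibrated values; the real piece would trigger video playback from Processing):

```cpp
#include <string>

// Map the three sensor channels to environmental footage. If more than one
// channel is active, the more violent interaction wins. All numbers are
// placeholder 10-bit readings to be calibrated on the real hardware.
std::string clipFor(int shake, int sound, int light) {
    if (shake > 600) return "earthquake";       // violent handling of the globe
    if (sound > 600) return "tsunami";          // shouting at the earth
    if (light > 600) return "melting_glacier";  // harsh light on the earth
    return "calm";                              // gentle treatment
}
```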

Shuting drew up a mock-up of the installation.


final _ nitinol_1 (ad hoc experiment)

I received my .008″-diameter muscle wire, and I thought a quick first experiment would be fun. I mean, does it actually work?


So I wrapped the wire around a screw, heated it up with a blowtorch, and then let it cool.

Then I deformed the wire (stretched it out) and threw it in some warm water. Presto! It regained its memory form (mostly).

But does it have any strength? Here’s the test… you be the judge. The washer is at the very top of the screen in the first pic…



Time to run some current through the wire!



Project Proposal: Cathy Chen and Maayan Cohen

A music-creation game that allows users to toggle musical notes from a pentatonic scale on and off, creating a cascade of blended music.

Our goal is to provide children with a playful environment that encourages collaborative play, learning, and discovery in music.

Intended behavior and interaction:
The grid will be assembled as a series of sewn squares, in which each square is an individual note. Users trigger the notes by pressing, tapping, or putting pressure on a square. Ideally the squares can be used separately or together. It would also be fun to use the individual squares in a game of tag, where everyone holds a square and tries to tag people by lighting up the square with a press.



We considered a few different Arduino setups. Using three Arduino Unos would give us the multiple digital and analog pins we need for our many sensors. We also considered the LilyPad, as we thought it might be easier to sew into our project. However, after talking to Kate, she suggested using an Arduino Mega for the multiple pins we need; alternatively, we could use row-column scanning or a multiplexer to run multiple inputs and outputs over a few shared pins. For now, the Arduino Mega seems like the most straightforward option.
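Row-column scanning trades pins for time: an 8×8 grid needs 8 + 8 lines instead of 64. A sketch of the scan loop in plain C++ (the pin I/O is replaced with an in-memory matrix for illustration; on the Arduino, “energizing” a row would be a digitalWrite and each column check a digitalRead):

```cpp
// Row-column scanning: drive one row at a time and read every column,
// so an R x C grid needs R + C pins instead of R * C.
const int ROWS = 8, COLS = 8;

// Stand-in for the physical grid; true = square currently pressed.
bool pressed[ROWS][COLS] = {};

// One full scan: counts pressed squares, visiting each (row, col) once,
// as the Arduino would by energizing rows in turn and reading columns.
int scanGrid() {
    int count = 0;
    for (int r = 0; r < ROWS; ++r) {      // "energize" row r
        for (int c = 0; c < COLS; ++c) {  // read each column line
            if (pressed[r][c]) ++count;
        }
    }
    return count;
}
```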

We also considered a wireless connection using Bluetooth, as we would like each individual square to be remote and free to move around; however, we will need to explore that further and understand the pros and cons of using a wireless connection within our framework.

We originally wanted to generate sound with the Arduino sound library, but after talking to Kate, we decided to look into Processing sound libraries as another option. This change actually helps us simplify our overall design: we had originally wanted to add another visual element to the grid by having Processing project an image onto it as a puzzle.

Max/MSP would be very lovely for this, but neither of us understands the software.

We looked into a few Processing sound libraries to adopt. One of the interesting ones that we would like to probe further is SoundCipher: it provides an interface for playing notes and allows for audio playback. Other sound-library alternatives: Minim and ESS.

If anyone has any good suggestions, please let us know.

Musical Approach
We would like to use a pentatonic scale to make the sound more generally approachable to the ear. Pentatonic scales are prized for their lack of dissonant intervals between pitches, which leaves room for random combinations and random play without clashing results.

We will have a silent metronome in the back end to keep the piece at a consistent tempo. In our case, with eight squares across a row, the meter will be 2/4 time. Ideally our grid would have three octaves of the pentatonic scale (15 notes vertically), but this may change depending on our progress.
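The 15-note column could be generated from one octave of the pentatonic scale, sketched here in plain C++ (C major pentatonic and equal-tempered tuning are assumptions; the real pitches would come from the Processing sound library):

```cpp
#include <cmath>
#include <vector>

// Build N octaves of a major pentatonic scale as frequencies in Hz.
// Semitone offsets 0,2,4,7,9 form the major pentatonic (e.g. C D E G A).
std::vector<double> pentatonic(double rootHz, int octaves) {
    const int steps[5] = {0, 2, 4, 7, 9};
    std::vector<double> notes;
    for (int oct = 0; oct < octaves; ++oct)
        for (int s : steps)
            notes.push_back(rootHz * std::pow(2.0, (oct * 12 + s) / 12.0));
    return notes;
}
```

Three octaves from a C4 root (about 261.63 Hz, a hypothetical choice) gives exactly the 15 vertical notes described above.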

Pentatonic and Education
The pentatonic scale is useful for education due to its simplicity and expressive quality. Children can be encouraged to improvise and play with these notes without making real harmonic mistakes, which keeps the music sounding pleasant.

What Kind of Sensors
We looked into several touch-sensor choices: force sensors and piezo transducers, though the latter might require an amplifier. We also looked into proximity sensors; the thought of triggering a note from a distance is very intriguing, but they are very expensive. We were very interested in soft sensors and would like to explore this kind of sensor further.

Material Options
The main problem for us is to find a touchable physical material that allows LED lights to be embedded and accepts users’ physical input without breaking the LEDs. We looked into soft fabric and bubble wrap, and we got really interested in conductive Play-Doh.

Another idea we thought of, but which might not work, is to make each individual note a 3D sphere that people squeeze to trigger the note.


A few ideas for employing this grid elsewhere, in a different context:

  • Suggested by Jim: we could lay the grid out in a tent.
  • It would be interesting to put random squares on the walls of a room. People can tap the squares at random and music will play in a set rhythmic pattern, though the lights will not light up in order.
  • If the squares can be moved around and picked up individually, people can play tag with them.
  • Each square could be placed on the leaves of a tree; when many squares are lit up, it would create a beautiful light-up tree with pleasant-sounding music.
  • On the sidewalk (on the floor), like hopscotch.

Tone Matrix
Physical Tone Matrix
To think about how to put the design into a different context

Assignment 8 – Error / Video Brightness Controlled by Potentiometer


Here’s my Processing sketch with the printed sensor data at the bottom. It’s just a test of how the potentiometer on the Arduino would control video brightness. It should make the video controllable from dark to bright in a smooth, predictable way, but instead I get random amounts of brightness with no control. This isn’t my final assignment, but I can’t get past this point without some idea of what the error is. I’m confused as to why a potentiometer with accurate 0–255 control on the Arduino side would produce this erratic data input. Any help would be very much appreciated. Thanks!
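Without seeing the sketch, one common cause of this symptom is byte framing rather than the potentiometer itself: if the Arduino prints each reading as multi-digit ASCII but the receiving side reads single raw bytes, the digits of one number arrive as separate “values.” A sketch of delimiter-based parsing, in plain C++ for illustration (Processing’s bufferUntil('\n') plus trim/parseInt does the equivalent):

```cpp
#include <string>
#include <vector>

// Parse a raw serial byte stream in which each reading is sent as ASCII
// digits terminated by '\n' (the Serial.println convention). Reading whole
// newline-delimited tokens keeps multi-digit values from being split up.
std::vector<int> parseReadings(const std::string& stream) {
    std::vector<int> values;
    std::string token;
    for (char ch : stream) {
        if (ch == '\n') {
            if (!token.empty()) values.push_back(std::stoi(token));
            token.clear();
        } else if (ch >= '0' && ch <= '9') {
            token += ch;  // accumulate digits; ignore '\r' and stray bytes
        }
    }
    return values;
}
```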

Potentiometer and GSR Sensor


I found the potentiometer pretty simple to work with. I connected one outer pin to power, the other to ground, and the middle (wiper) pin to analog pin A0. I added the LED to the circuit, connecting the anode through a 330Ω resistor to pin 9 and the cathode to ground.

I used this code to light the LED based on the position of the potentiometer: the maximum potentiometer value gives the brightest LED, and vice versa.

int LEDpin = 9;
int potentiometerPin = 0;
int LEDvalue;
int potentiometerValue;

void setup() {
  pinMode(LEDpin, OUTPUT);
}

void loop() {
  potentiometerValue = analogRead(potentiometerPin);
  // scale the 10-bit reading (0-1023) to the 8-bit PWM range (0-255)
  LEDvalue = map(potentiometerValue, 0, 1023, 0, 255);
  analogWrite(LEDpin, LEDvalue);
}

GSR Sensor

Playing with sensors is one of the most fun things I have done this semester! Inspired by my wearable technology class, I started to look into the study of emotions and came across the galvanic skin response (GSR) sensor. When people experience an arousing stimulus, our bodies generate a variety of psychophysical responses. One of those responses is micro-pulses of sweat that increase the electrical conductance of the skin, which can be measured using a GSR sensor. I found a few companies that are manufacturing GSR sensors, but I really wanted to build one myself. I came across a pretty simple DIY tutorial online, which I followed to make my own GSR sensor.

I soldered an electrical wire to each of two penny coins. It wasn’t simple to solder to a coin, but after a few tries I managed to make a solid connection.


Soldering Copper Coins

I connected one side of the sensor to power and the other side through a 10k resistor to ground, with the junction going to pin A0. I connected an LED to pin 9.

I used this code to translate the sensor data into the brightness of the LED: the higher the excitement, the brighter the LED; the calmer, the dimmer. I based my code on the fade example and the sensor example. Since the GSR sensor reads about 20 in a relaxed state, I set that as the lower threshold: if the value is higher, the LED fades in (5 increments at a time), and if lower, it fades out the same way.

int ledpin = 9;
int GSRPin = 0;

int GSRValue;
int brightness = 0;
int fadeAmount = 5;

void setup() {
  pinMode(ledpin, OUTPUT);
}

void loop() {
  GSRValue = analogRead(GSRPin);

  if (GSRValue > 20) {
    // fade in from min to max in increments of 5 points:
    brightness = brightness + fadeAmount;
    if (brightness > 255) brightness = 255;
  } else {
    // fade out from max to min in increments of 5 points:
    brightness = brightness - fadeAmount;
    if (brightness < 0) brightness = 0;
  }
  analogWrite(ledpin, brightness);

  // wait for 30 milliseconds to see the dimming effect
  delay(30);
}
Here is the result:

I find the GSR sensor fascinating! Although it is probably not an accurate measurement, there are so many interesting things one can do with it. For my wearable technology class I experimented with building an emotional-meter necklace to measure how excited or calm someone is at any given moment. The GSR sensor is connected to a LilyPad Simple using conductive thread. I hooked up an RGB LED to it so it lights in different colours based on your emotional state. Here it is:

Emotional Meter- front

Emotional Meter- back

Metal snaps for modularity


Serial Communication

Missing the last class made things a bit more challenging for assignment 8. I started by connecting the button to the Arduino for a simple read. Using the example code, I managed to control the colour of the Processing square:

I tried using the second code example posted but kept getting a port error in Processing. I moved on to the second exercise and played with the potentiometer-and-graph example. I replaced the potentiometer with the photo light sensor and managed to get some interesting graph patterns.

I worked with the dimmer tutorial to control the brightness of the LED as the mouse moves. I then replaced the LED with a servo motor, and the motor spun in relation to where I moved my mouse. It worked a couple of times, and then Processing showed a warning about the serial port. Processing stopped working, I couldn’t close the sketches, and when I got back to Arduino it was also frozen and I couldn’t upload anything to the board. I rebooted and tried again and got the same problem. I’m not sure what went wrong exactly. Below is the code that I used and images of the connections:

const int ledPin = 9; // the pin that the LED is attached to

void setup() {
  // initialize the serial communication:
  Serial.begin(9600);
  // initialize the ledPin as an output:
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // check if data has been sent from the computer:
  if (Serial.available()) {
    // read the most recent byte (which will be from 0 to 255):
    byte brightness = Serial.read();
    // set the brightness of the LED:
    analogWrite(ledPin, brightness);
  }
}

import processing.serial.*;
Serial port;

void setup() {
  size(256, 150);

  println("Available serial ports:");
  println(Serial.list());
  port = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  // draw a gradient from black to white
  for (int i = 0; i < 256; i++) {
    stroke(i);
    line(i, 0, i, 150);
  }
  // write the mouse's x-position (0-255) to the serial port as a byte
  port.write(mouseX);
}

Mood light

I did the digital lab a couple of weeks ago. It was just a simple switch and two LED lights, and I thought it was boring, so I didn’t submit it.

For this assignment, I want to make a mood light. I saw many examples on the internet, and most of them use an RGB LED, but I am not going to use one.

Instead, I plan to use a few common LEDs and arrange the order in which they turn on to achieve the mood-light design.

1. I connected 4 LEDs and the switch in the circuit.

2. This is the Arduino code:

void setup() {
}

void loop() {
}
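One possible way to sequence the four LEDs, sketched in plain C++ (pin numbers 2–5 and the on-then-off ordering are guesses, not the original code; on the Arduino each step would become a digitalWrite plus a delay):

```cpp
#include <utility>
#include <vector>

// Hypothetical pin assignments for the four LEDs.
std::vector<int> ledPins = {2, 3, 4, 5};

// One full mood-light cycle: turn the LEDs on one at a time, then off in
// the same order. Returns the ordered list of (pin, state) steps.
std::vector<std::pair<int, bool>> moodCycle() {
    std::vector<std::pair<int, bool>> steps;
    for (int pin : ledPins) steps.push_back({pin, true});   // light up
    for (int pin : ledPins) steps.push_back({pin, false});  // dim down
    return steps;
}
```

In loop(), the Arduino would walk this sequence with a delay between steps and repeat it forever, which gives the slow pulsing effect.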



Here is the effect

3. Add a cover to diffuse the light.

Final effect

I don’t think the final effect is very good, because I don’t have many different colours of LED; if I used more colours and brighter LEDs, the effect could be better.