Final Assignment Proposal

For my final project, I want to push myself to learn more about Processing, or at least really reinforce what I should already know. With this as the goal, the emphasis will likely be on engaging the user with a screen interaction. For inspiration, I looked to some of the well-known games that have engaged people for generations. They have done so in part because of their simplicity, and simplicity as a concept bodes well for my ability to actually create something interesting that works. While thinking of some of the first games I played, I also remembered some of the first toys I played with.














Believe it or not, when I was a kid this was as high-tech as it got. When the Etch-a-Sketch came out, it was the biggest thing to hit the “fun” market since the hula hoop. I just had to have one, and I think it’s still my favourite toy today. Conceptually, I thought it would be interesting to remind the user just how far we have come with human/screen interaction. The idea of constraining a MacBook Pro to such a simplistic graphic interface would highlight the point.

THE APPROACH – A laptop would serve as the Etch-a-Sketch screen. An image representing the original Etch-a-Sketch would be loaded into the Processing sketch as a background image. The Arduino would be secured within a customized enclosure with two knobs or dials that function in the same way as the original: the left dial controls the horizontal x-position, and the right dial the vertical y-position. I hope to buy two potentiometers with a bit more accuracy than the one used in our serial communication lab.
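The dial-to-cursor mapping itself is simple. Here is a rough sketch of the logic (written in Python just for illustration – the real version would use Processing’s map() on the two values read over serial; the screen size and 10-bit ADC range are assumptions):

```python
# Illustrative logic only: two potentiometer readings (0-1023 on a
# 10-bit ADC) steer the Etch-a-Sketch cursor across the drawing area.
WIDTH, HEIGHT = 800, 600  # assumed size of the drawing area

def map_range(value, in_min, in_max, out_min, out_max):
    # Same idea as Processing's / Arduino's map() function.
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

def knob_to_cursor(left_pot, right_pot):
    # Left dial -> horizontal x-position, right dial -> vertical y-position.
    x = map_range(left_pot, 0, 1023, 0, WIDTH)
    y = map_range(right_pot, 0, 1023, 0, HEIGHT)
    return x, y
```

Each frame, the sketch would draw a short line segment from the previous cursor position to the new one, which is what produces the continuous Etch-a-Sketch line.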

I believe this would be a fairly simple project, but I have come to learn that I am advancing more slowly than others in the class, so I want to start with something simple that I could add on to. In this case, I might add a motion sensor to the Arduino that would erase the drawing if the “box” is shaken while turned upside down, which is how the original Etch-a-Sketch functioned. In addition, it would be interesting to use the camera on the laptop to capture an image of the user while they sketched. The line would sample the video capture, allowing them to create a self-portrait – similar to a previous Processing exercise completed in Week 4, where a line was programmed to draw using the random function. I experimented a bit with this sketch, altering the code in the serial communication lab. I wanted the potentiometer and light sensor to control the line, but I’m stuck on that – here is the code I tried for the Processing sketch, and I’ve included the original from the Week 4 assignment. Not sure how to make this work…


























I believe that I will need the following hardware to build my Etch-a-Sketch:












  1. Two rotary potentiometers – one to control the horizontal value and the other the vertical value.
  2. Two knobs to add the finishing touch. I will be looking for ones that fit onto the potentiometers.
  3. A dual-axis accelerometer with digital PWM or analog output. The type I looked at can measure both static and dynamic acceleration, which means it is well suited for sensing tilt as well as brute acceleration (rocketry and general motion-sensing applications). I hope this will allow for the “erase” function when shaken.


The other idea I am considering sprung from a project we are working on at our agency. The client is Big Brothers Big Sisters, and the program we are promoting is the in-school mentoring program. They have a particular need for young men at the moment, so activities will target university students, since their flexible schedules fit so well with this program. The campaign theme is “Start Something Big,” and volunteers will be busy raising awareness at various university fairs.

We are looking for ways to engage students at the Big Brothers Big Sisters trade show booth. In addition to printed brochures and a video as an overview of the program, it would be fun to engage interested volunteers with an interactive installation at their trade fair booth. They were going to give out “branded” balloons with the “Start Something Big” theme, and I wondered if participants could add their virtual balloon to the BBBS website to signal their commitment. This would involve building a device that people could blow into (or pump up with a bicycle pump). As they blow into the device, a balloon would inflate on-screen. After three “blows,” the balloon is released and joins the others in the sky – creating a very colourful display. I think I would need a motion detector to sense the physical interaction of someone blowing into the device. The challenge would be in designing an interesting visual display of this action within Processing. I have not thought this one through yet, but I would be interested in getting feedback on the feasibility of either idea at this point.
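The blow-to-release interaction boils down to a small counter. A minimal sketch of that logic, assuming three blows and written in Python purely for illustration (the real version would live in Processing, fed by the sensor over serial):

```python
# Hypothetical balloon state: each detected "blow" inflates it one step;
# after the third blow it is released to join the others in the sky.
BLOWS_TO_RELEASE = 3  # assumed threshold from the description above

class Balloon:
    def __init__(self):
        self.size = 0
        self.released = False

    def blow(self):
        # Ignore further blows once the balloon has floated away.
        if self.released:
            return
        self.size += 1
        if self.size >= BLOWS_TO_RELEASE:
            self.released = True
```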

Thanks in advance for any input.

Revised Final Project Idea: Liz, Shuting & Linh

After a bit of revision and another couple of rounds of free-form / verbal and annotated brainstorming, we’ve come up with another final project idea. We had the opportunity to run the idea by Jim on Thursday evening and he was able to give us some extra direction on the specific sensors to look into.

Our new concept is based on asking a “user” or gallery-goer to consider their personal impact on the world on a global scale – how human life affects the health and wellness of the natural world. We are often asked to think of our natural world as “Mother Earth.” In this gallery installation, we’re turning the tables and repositioning the individual as the caretaker or “Mother of the Earth.” The installation will consist of a replica of the earth dangling from the ceiling, which encases the Arduino, breadboard and sensors. The user’s engagement with the ball – shaking or tilting it, the level and range of sound from how one speaks to the earth, and the amount of light the user shines on it – will trigger repercussions displayed on the wall (or in a frame on a white gallery wall) as video footage of natural environmental occurrences: floods, earthquakes, melting glaciers, tsunamis, etc.

We’ll be utilizing three sensors: an X, Y & Z axis sensor; a light sensor; and a sound sensor.

Shuting drew up a mock-up of the installation.


final _ nitinol_1 (ad hoc experiment)

I received my .008″ diameter muscle wire; I thought a quick first experiment would be fun — I mean, does it actually work?


So I wrapped the wire around a screw, heated it up with a blow torch and then cooled it.

Then I deformed the wire (stretched it out) and threw it in some warm water – presto! It regained its memory form (mostly).

But does it have any strength? Here’s the test… you be the judge. The washer is at the very top of the screen in the first pic…



Time to run some current through the wire!



Project Proposal: Cathy Chen and Maayan Cohen

A music creation game that allows users to toggle musical notes from a pentatonic scale on and off, creating a cascade of blended music.

To provide children with a playful environment that encourages collaborative play, learning and discovery in music.

Intended behavior and interaction:
The grid will be assembled as a series of sewn squares, in which each square is an individual note. Users trigger the notes by pressing, tapping, or putting pressure on a square. Ideally the squares can be used separately or together. It would also be fun to use each individual square in a tag game, where everyone holds a square and tries to tag people by lighting up the square with a press.



We considered a few different Arduino setups for what we would like to accomplish, as we would need multiple digital and analog pins for our multiple sensors – for example, using three Arduino Unos. We considered the LilyPad, as we thought it might be easier to sew into our project. However, after talking to Kate, she suggested using an Arduino Mega for the multiple pins that we need, or we could use row-column scanning or a multiplexer to allow for multiple inputs and outputs on a single pin. For now, the Arduino Mega seems to be the most straightforward option for us.
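For anyone curious, the appeal of row-column scanning is that an R x C grid of squares needs only R + C wires instead of R x C. A rough illustration of the scan loop (in Python, with pressed squares simulated as a set – on the Arduino this would be digitalWrite on the rows and digitalRead on the columns):

```python
# Simulated row-column scan: drive one row at a time, read every column.
# "pressed" stands in for the electrical state of the real grid.
def scan_matrix(rows, cols, pressed):
    hits = []
    for r in range(rows):          # energize row r
        for c in range(cols):      # read each column input
            if (r, c) in pressed:  # row r connected to column c = a press
                hits.append((r, c))
    return hits
```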

We also considered a wireless connection using Bluetooth, as we would like each individual square to be remote and free to be moved around; however, we will need to explore that further and understand the pros and cons of using a wireless connection within our framework.

We originally wanted to do sound with the Arduino sound library, but after talking to Kate, we decided to look into Processing sound libraries as another option. This change actually helped us simplify our overall envisioned design: we originally wanted to add another visual element to the grid, with Processing projecting an image as a puzzle.

Max MSP would be lovely for this, but neither of us understands the software.

We looked into a few Processing sound libraries to adopt. One of the interesting ones that we would like to probe further is SoundCipher, an interface for playing notes that allows for audio playback. Other sound library alternatives: Minim, ESS.

However, if anyone has any good suggestions, please let us know.

Musical Approach
We would like to use a pentatonic scale to make the sound more generally approachable to the ear. Pentatonic scales are valued for their lack of dissonant intervals between pitches, which leaves room for random combinations and random play without clashing results.

We will have a silent metronome in the back end to keep the piece at a consistent tempo. In our case, with eight squares across a row, the time signature will be 2/4. Ideally, our grid would have three octaves of pentatonic scales (15 notes vertically), but this may change depending on our work progress.
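The note layout can be sketched concretely. Assuming a C major pentatonic rooted at middle C (MIDI note 60 – the root is our choice, not fixed yet), three octaves gives exactly the 15 vertical notes (Python for illustration):

```python
# C major pentatonic: C D E G A, i.e. semitone offsets 0 2 4 7 9.
PENTATONIC_STEPS = [0, 2, 4, 7, 9]

def build_scale(root=60, octaves=3):
    # 5 notes per octave x 3 octaves = 15 MIDI note numbers.
    return [root + 12 * o + s for o in range(octaves) for s in PENTATONIC_STEPS]
```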

Pentatonic and Education
The pentatonic scale is useful for education due to its simplicity and expressive quality. Children can be encouraged to improvise and play with these notes without making real harmonic mistakes, allowing the music to sound pleasant.

What kind of Sensors
We looked into several touch sensor choices: force sensors and piezo transducers, though the latter might require an amplifier. We also looked into proximity sensors – the thought of being able to trigger a note from a distance is very intriguing; however, they are very expensive. We were very interested in soft sensors and would like to explore this kind of sensor further.

Material options
The main problem for us is thinking of a way to make a touchable physical object that allows the embedding of LED lights and accepts users’ physical input without breaking the LEDs. We looked into soft fabric and bubble wrap, and we got really interested in conductive play-doh.

A few other ideas that we thought of, but that might not work, include making each individual note a 3D sphere that people squeeze to trigger the note.


A few ideas where we could employ this grid elsewhere in a different context:

  • Suggested by Jim, we could lay the grid out in a tent
  • It would be interesting to put random squares on the walls of a room. People can tap the squares randomly and music will play in a set rhythmic pattern; however, the lights will not be in order.
  • If the squares can be moved around and people can pick each one up individually, people can play tag with the squares.
  • Each individual square can be placed on leaves, and when many squares are lit up, it will create a beautiful lit-up tree with pleasant-sounding music.
  • Sidewalk (on the floor), like hopscotch

Tone Matrix
Physical Tone Matrix
To think of how to put the design into a different context

final project

I am putting sensors in place of the controllers on my helicopter. The sensors will detect the distance from my hand, and the helicopter should maintain the set distance and follow my hand as I move it around. I am looking into putting a camera on the end to replace the LEDs that are currently on the helicopter. The more I play with this thing, the more I want to add to it. I will not know yet whether the helicopter can handle the weight of the camera, so that’s phase 2.

Arduino Uno
Transmitter (camera)

final project _ interactive lotus flower

My project is inspired by a beautiful kinetic electronic flower – Kukkia – which I discovered while filming a documentary a few years ago in Montreal (props to Marc de Pape for archiving the project).


…the Kukkia flower is kinetic and animated…
…however, it is not responsive or interactive…



Overview: lotus is an electronic sculpture that provides physical and audio feedback based on external stimuli.

Step 1 (i.e. the final assignment) is prototyping a lotus flower that responds to analogue sensors (probably photocells)… Ultimately, lotus will provide an interface for biofeedback technologies, particularly EEG meditation values…


What it looks like: lotus is about 30 cm in diameter, made of felt (fire retardant and beautiful;), and shaped like a flower – much like Joanna Berzowska’s Kukkia (but larger).


How it works:

1)     Movement: lotus opens and closes, following a scale of 1-100… Ultimately, lotus will provide EEG biofeedback on a user’s meditative state – zombie to zen – i.e., as the user’s meditative state increases, the petals of the flower open up.

2)     Sound: An analogue input triggers acoustic feedback, again following a scale from 0-100.

3)     Zen: When both the movement scale and sound scale peak together, a secret message is unlocked (enlightenment).

The Lotus flower as a design principle:

The choice of a lotus flower as a motif of enlightenment is predictable. The flower is an enduring symbol of purity and non-attachment.

  • The unfolding of its petals suggests the expansion of the soul.
  • The lotus flower is able to regulate the temperature of its flowers within a narrow range, just as humans do. Likewise, lotus the sculpture will regulate its internal temperature (the temperature of its Nitinol wires) to ensure the safe functioning of the petals.


  • Felt and thread.
  • Conductive thread
  • Muscle wire – diameter 0.008″; Transition Temperature: 70°C
  • Arduino board, etc.
  • Field-effect transistor / transistor
  • External power source for Muscle Wire
  • Temperature sensor
  • 555 timer
  • Access to a kiln


“Kukkia” Kinetic Electronic Garment, XS Labs, Concordia University


1) Muscle Wires® project book v3.01

2) Kukkia and Vilkas: Kinetic Electronic Garments





Final Project Proposal – Faysal, Shino and Cathy

1. Concept

We wanted to work on a project that was practical, a device that would be used on a daily basis. With all the sensors we had access to, the idea of creating a little weather station came up. Weather applications and widgets are widely used and forecast the weather for a city, a region or a country. What about having access to the weather conditions just outside your house?

Looking outside the window was once the most intuitive way to check the weather conditions. Today our nifty mobile weather app definitely does the trick with more accuracy, but it doesn’t convey the weather conditions just outside your door. Let’s face it, small collapsible umbrellas are practical but break after their third use, while large umbrellas are solid but cumbersome. So if I was going to carry my big umbrella with me, it had better rain!

That’s where our weather station comes in handy. By measuring temperature, humidity, wind and light, the device will reflect the current outside conditions and display them along with the weather forecast retrieved from an internet source. By comparing the two “weather sources” we would have a pretty accurate idea of what our day will look like and plan accordingly.

Finally, the icons usually used to represent the weather (sun, clouds with rain, sun with clouds…) are somewhat limiting. An ideal way to visualize the weather would be an animated abstract drawing that changes colors and shapes according to the sensor data. The details collected by the weather station would translate into intuitive shades of colors illustrating cold, warm, windy or rainy weather. Also, with Processing taking snapshots of these drawings at specific time intervals, a timeline could be created and tweeted at the end of the day, displaying the day’s weather.
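As a first pass at the colour mapping, one reading could blend linearly between a “cold” and a “warm” colour. A sketch of the idea (Python for illustration; the temperature range and the blue-to-red blend are assumptions – the real Processing sketch would combine several sensors):

```python
# Map a temperature reading to an RGB shade: cold -> blue, warm -> red.
def temp_to_rgb(temp_c, cold=-20.0, warm=40.0):
    # Normalize into 0..1, clamping readings outside the assumed range.
    t = max(0.0, min(1.0, (temp_c - cold) / (warm - cold)))
    return (int(255 * t), 0, int(255 * (1 - t)))  # (red, green, blue)
```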

Ideally the animated drawings along with the internet weather report would be projected onto a window making the user’s access to accurate weather as easy as looking out the window.

2. Approach

As hardware, we plan to create a small station that could easily be mounted on a vertical or horizontal surface, enclosing the Arduino board along with the sensors. A battery would be attached to the board so that the station could power itself. Solid, water-resistant materials will be used so that the device can withstand severe weather conditions. The sensors would collect the data and feed it wirelessly to Processing via Bluetooth.

As software, Processing would take the data and assign a certain shape and color according to the different ranges provided by the sensors. It would also take a snapshot of the animated drawings every 2 hours or so, then consolidate them and tweet them as one image (the day’s weather report). Processing will also display in its sketch the weather forecast taken from an internet source (like The Weather Channel or Yahoo Weather).

3.  Similar work/projects

4.  Material

  • Arduino Uno
  • Laptop
  • Light sensor
  • Humidity sensor
  • Temperature sensor
  • Wind sensor
  • Wireless connector (Xbee)
  • Battery
  • Wires
  • Resistors

Final Project Proposal

1) Concept

My first idea, the more likely one, is about questioning the viewing experience of cinema by adding an element of effort to change or discover. A camera mounted on 1-2 servo motors is connected to Processing, running a motion detection sketch. A projection shows the image being captured within the space. When a participant tries to get into the range of capture, the camera moves away. The technology controls the image and makes participants discover that they have to try to be involved. I also want to complicate the ease and expectation of being on camera, as in daily life it can be a mundane expectation. This project gives the microcontroller a human quality that challenges viewers and gives personality to the machine.

A second idea: I’ve seen several great projects that convert information to a matrix of LEDs. What if instead there was a matrix of tactile sensors that sent information to a visual matrix in Processing? I like the idea of replacing the camera as a visual tool and making the matrix correspond to other senses instead (touch with fabric, etc.) with analog tools. This might have to be complicated to be visually appealing, or maybe it needs another layer.

2) Hardware / Software approach

Webcam / CMOS camera (my question is whether this needs to be a webcam wired into a computer, as opposed to one of these CMOS cameras)

Processing with a motion detection sketch, servo library and video library

Arduino Uno

Two servo motors (rotation and tilt)


Video Projector
3) Broader context / related works

I found the description of this workshop, called Vision Play, from Media Lab Prado really inspiring.

They talk about the mediation of image-making – it being so quick, ubiquitous and taken for granted – and propose slowing down our access to cinema in order to appreciate it. The substitution of tools and technology creates an unexpected experience with imaging that makes us question its qualities and its role in our lives. I’m fascinated by the history of cinema viewing experiences and the various technologies, so this is a great starting point for idea generation.

For example, what if we had to put in physical effort to view a video? Crank, press, jump, wind, etc. The action would produce imagery with comparable content, playback speed and style.

I appreciate work in which the video is both the artifact of the interaction or process, as well as a more integral part of the experience.

Here are a few pieces that I’ve found very useful in the process of learning about interaction and imagery mediated and affected by technology.

Truncated – Simone Jones / Lance Winn

“Truncated is a projector that reveals sections of an image of a torso looping in synch with the up and down movement of a motorized tripod head. As the projection device moves up and down in its looping motion, sections of the torso are simultaneously revealed and erased; the viewer never has the opportunity to see the entire image of the torso at once. Truncated reveals that cameras are forced to cut in order to capture; by offering only glimpses of an image, the projection device points to the limitations of the frame as a purveyor of information.”

Removal Studies – M. Kontopoulos

“Removal Studies are a series of videos made using time-lapse photography. These videos are sleep studies that observe the reaction of the unconscious body to the negative stimulus of removing the covers.
The covers are removed by a machine that attaches to the bed and tugs a slight amount off in increments throughout the night. By studying the sleeping body, my aim was to capture something very honest and very animal about human beings. I was interested in this gesture of removal — and subsequently, exposure — and how it could function as a larger metaphor.

The imagery is generated by a DSLR camera taking 30 second exposures every two minutes. This video is what I view to be the most successful iteration from a series of studies. It was shot during a full moon.”

These projects use LED matrices; one is a wearable with inputs, and the other interprets video and outputs it onto an LED matrix – in cider bottles.

Leah Buechley – LED Tank Top

Arduino Eye Shield – Reflections in Cider Installation

360-Degree Wall, 4000 LEDs, Made with Cardboard, Paper, and Needles, in Action

Really looking forward to this!

Final Project Proposal: Shuting, Linh & Liz

Creation and Computation: Final Project

 Group Members: Shuting Chang, Linh Do, Liz Coleman

Title: TBD

Project Concept:

Our senses are often forcibly segregated when we enter an art gallery – we are asked to look with only our eyes at a painting, a sculpture, an artifact or a photograph; to listen to and view a film without touch; and to hear music without ever being able to speak back. Our senses acting in unison allow us to have rich, meaningful experiences that become layers of coded memory.

Music, in the traditional sense, is a direct creation of the body—hitting a drum, playing a piano, shaking a tambourine, manipulating sound through the vocal cords, pulling a bow across a strings instrument. With the emergence of digital technology and its implicit impact on music making, production and consumption, the body has been greatly removed from all notions of music. We hope to reestablish the age-old connection between the body and music through an interactive sound and digital visual gallery installation manipulated by the body.

The user initiates the installation by stepping into the sensor field. Automatically, the music begins to play and the projected image slowly begins to change. The user follows and matches the tempo of the music by stepping on buttons placed on the ground. Each of the three buttons will correlate to a specific drum sound. This part of the installation corresponds to the user’s interaction with the music: each time the correct beat is achieved, the displayed image morphs and becomes more aesthetically enhanced. Simultaneously, as the user hits the correct tempo, the paper LED lamps turn on one by one. The more times the tempo is matched, the brighter the LED lamp lights become. The final outcome, provided the user matches the beat of the song sufficiently, is that the physical space of the installation will be completely illuminated by the LED lamplight.
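The beat-matching itself can be reduced to a timing window plus a brightness counter. A minimal sketch of that logic, assuming a 100 ms tolerance and a fixed brightness step (Python for clarity; the real version would run in Processing/Arduino):

```python
# A press counts as "on tempo" if it lands close enough to the expected beat.
TOLERANCE_MS = 100     # assumed tolerance window
BRIGHTNESS_STEP = 15   # assumed increment per matched beat

def on_beat(press_time_ms, beat_time_ms):
    return abs(press_time_ms - beat_time_ms) <= TOLERANCE_MS

def update_brightness(brightness, matched):
    # Each matched beat brightens the lamps, capped at full PWM output.
    if matched:
        brightness = min(255, brightness + BRIGHTNESS_STEP)
    return brightness
```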


  • Hardware
    • Projector
    • Speakers
    • Audio Cables
    • LED Lights
    • Paper Lamps
    • Laptop
    • Arduino
    • Sensors:
      • Buttons
      • Infrared Sensor
      • Coverage for Buttons
      • Lots of Wire
  • Software
    • Processing
    • Arduino
    • Sound: Music
    • Processing Music Library
    • Processing Image Library
    • Digital Image of the Painting

















Idea Tree: 3 Final Project Branches

I was never totally satisfied with my Week 3 Processing sketch, titled Elastic String(s). Though it worked, the physics were off (especially on the Y axis) and the control mechanism was kind of clumsy. I’ve spent the weeks since thinking about what I would do to fully realize the sketch… and ultimately about what form a fully realized sketch would take.

As of today, the seed that was planted in week three has spawned multiple branches. They each sequentially build upon the previous design… so we’ll see how far I can get before the end of the semester.













Branch one is closest to the root. Here I want to turn my sketch into a light-sensor-controlled ambient harp. The harp seems like the most natural metaphor for turning this sketch into a musical instrument. I am choosing to work with light sensors mainly because I have been exploring them for a few weeks now, but more importantly I want to challenge my Arduino coding abilities; I want to see if I can finesse the data from a minimum of 2 to a maximum of 4 sensors (either TSL235R Light to Frequency Converters or TEMT6000 Ambient Light Sensors) to create an ambient sensor interface that can detect passing bodies. It will either be a left-and-right sensor array or a cross-like setup programmed to react to differences in intensity. I will then send the data to Processing, where it will provide the input that strums the graphical harp strings. Here I will have to fix the physics and create pull and release functions that don’t relate to the mouse. The trick will be to make these functions behave intuitively based on sensor input, because true pull-and-release gesture input is not available.

To turn the sketch into an instrument, I will first write a melodic loop (something in the spirit of Erik Satie) and map that composition to its equivalent MIDI notes. In Processing, I will use the Ruin & Wesen MIDI library to control the output of these MIDI notes (sequenced in an array). To do so, I will have to construct a trigger that moves through the array, sending the corresponding note out through the MIDI library. At the moment, I imagine the notes will be triggered by the strings changing direction; in other words, when a string reaches its maximum elasticity, it triggers the next MIDI note in the sequence.
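The direction-change trigger might look something like this (a Python sketch of the logic only; the class and names are mine, not the MIDI library’s API – a sign flip in the string’s velocity stands in for “maximum elasticity”):

```python
# Advance through the note array whenever a string reverses direction.
class StringTrigger:
    def __init__(self, notes):
        self.notes = notes           # the composition, as MIDI note numbers
        self.index = 0
        self.prev_velocity = 0.0

    def update(self, velocity):
        # A sign change in velocity = the string hit max stretch and reversed.
        note = None
        if self.prev_velocity * velocity < 0:
            note = self.notes[self.index % len(self.notes)]
            self.index += 1
        self.prev_velocity = velocity
        return note
```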

Because of the number of strings swinging back and forth, this harp will likely be incredibly multi-timbral (like a true harp) and will thus require a sophisticated audio processor. Consequently, I will be running one of my IK Multimedia virtual instruments (likely SampleTron) as the MIDI receiver and audio generator.

The installation would ideally be quadraphonic, utilizing natural reverberation (though a stereo setup with simulated reverberation is also acceptable). I think the visualization would greatly benefit from an increase in scale. A large wall projection (with a proportional increase in the number of sensors to ensure input translates fluently) would be the ultimate “high-rent” version of this interactive. In the end, a large LCD will likely be the most realistic display option. Either way, the light sensors will be hidden in order to draw attention to the graphic representation of the instrument, which will hopefully provide low-latency feedback, encouraging experimentation, comfort and maybe even mastery of the input parameters, producing a complex harmonic sonic experience.


1 Arduino Uno
2 to 4 Light sensors (TSL235R or TEMT6000)
1 LCD Screen or Projector
1 Stereo Amplifier w/speakers
1 MacBook















Branch two explores the idea of light data controlling the cranking of a music box. I found a DIY music box kit that would allow me to use the same composition created for the Ambient Harp as the sound output (potentially microphone-amplified using an electret microphone combined with an op-amp – conveniently combined into a breakout board by SparkFun – to broadcast the tiny music box, maybe using FM?). The difference in this instance is that I would redirect the light sensor data from the Ambient Harp setup to control a 360-degree servo motor connected to the music box’s crank handle.

As an added feature, I would like the light sensor to send crank control information wirelessly. This can easily be done with an XBee. The trick is what form the sensor will take. At the moment, I imagine the sensor being fixed to a window with a suction cup. I want the control to be the ambient light changes created by passing clouds – the perfectly smooth gradient transitions that occur when a cloud passes in front of the sun.
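To get that smooth gradient rather than sensor jitter, the readings would need some smoothing before driving the servo. A sketch of one common approach (an exponential moving average – written in Python for illustration; the smoothing factor and servo range are assumptions):

```python
# Smooth raw light readings, then map brightness to crank speed.
def smooth(prev, reading, alpha=0.1):
    # Exponential moving average: small alpha = slow, cloud-like response.
    return prev + alpha * (reading - prev)

def light_to_crank_speed(light, max_light=1023, max_speed=180):
    # Brighter light -> faster cranking (servo units are illustrative).
    return int(max_speed * light / max_light)
```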









1 Arduino Uno
1 TEMT6000 Ambient Light Sensor
1 XBee Wireless Kit
1 DIY Music Box
1 Servo – Medium Full Rotation
1 Breakout board for Electret Microphone 
1 Suction Cup















Building on the remote music box, branch three uses the same wireless window light sensor, but instead of controlling a music box, it would control a kinetic sculpture. The sculpture would hark back to branch one in form, only more inspired by Naum Gabo’s meticulous geometric string sculptures, Reuben Margolin’s kinetic waves, and ordinary everyday wind chimes. However, instead of responding to wind, ambient light changes would make the kinetic sculpture chime.

This is where I get way outside my comfort zone as I have no discernible material skills, but really want to explore rapid prototyping. In the end, I’m hoping I can come up with a simple form that will at the very least conjure visions of those sculptures that inspire me.

I imagine the sculpture being turned like a mobile by a servo motor, with tines taken from a toy xylophone, making contact with a metal or wooden spine. This core would be wrapped by a Gabo like string pattern which would hopefully have wave like kinetic motion.

Again, this is way outside my skill set… but that’s the point right?

I am very interested in ambient presence and how it can relate to way-finding in interior spaces. In the same way that the wind chime translates the invisible waves of the world’s air currents into audible waves, and thus makes them identifiable in space, I want to translate light in a similar way. Ultimately, I want to do so in the hope of creating a useful beacon or signal for navigation. Also, most importantly, I love the randomness of wind chimes and would love to have one indoors.


1 Arduino Uno
1 TEMT6000 Ambient Light Sensor
XBee Wireless Kit
Servo – Medium Full Rotation
1 Toy Xylophone
1 Spool of thread (?)
1 Wooden/Metal Pole (?)
1 Set of Practical Skills (?)
















Wired’s documentary Creating The Nedbula: Part [1] [2] [3] [4]

Yellow Spiral


Fine Collection of Curious Sound Objects


Electric Wind Chimes