Author Archive

Digital Bell Tower




1. Project Context

For this project we wanted to re-imagine a bell tower for the digital age. Bell towers are a relic from a period in history when community was very location-based. At their inception, bell towers kept a community on the same rhythm by ringing out the time. Today, in a society where everyone has the time easily available on their phone, a bell tower that measures time is no longer of value in the way it once was. Furthermore, community is no longer strictly location-based. For example, the OCAD University community has several campus buildings, and for certain programs (such as Inclusive Design) the classes are also available online.

For our redesign of what a bell tower could be, we wanted to create an object that would allow for interaction between people in the physical space, as well as people in the community who are not present. Instead of measuring the time, our bell tower measures the mood of people within that community, and broadcasts it to all. Overall, instead of a bell tower measuring the tempo of time passing, our bell tower measures the tempo and mood of the people within it.



2. Project Description



The major part of this installation is the bell itself, which was built around the shape of a balloon. Several layers of paper were applied to its surface using the papier-mâché method; once the form had dried completely, we spray-painted the whole object metallic silver. Tinfoil tape was ultimately applied over the formed bell to smooth the surface.
Hung from the ceiling by chains attached to two sets of hooks, the bell is able to swing. The purpose of this project is to read the current mood of #Toronto citizens on Twitter, happy 🙂 or sad 🙁 , and to visualize it through colour as well as bell sounds.

There are two pods on the ground with LED strips inside, which are triggered when the swinging bell passes above them. Once triggered, a pod responds by lighting up bright white. Depending on the mood read from the tweets, the EL wires wrapped around the pods turn either pink, representing the happy mood, or blue, standing for the sad mood. The sound also changes as the bell swings, based on the tweet mood.
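The pod behaviour can be sketched as a small decision function. This is a Python sketch only; the 20 cm trigger distance and the exact colour encoding are illustrative assumptions, not values from the installation code:

```python
def pod_response(distance_cm, mood):
    """Pod behaviour: white LEDs when the bell passes overhead,
    EL wire colour driven by the tweet mood (0 = happy, 1 = sad)."""
    trigger_cm = 20  # assumed proximity threshold for "bell overhead"
    led = "white" if distance_cm < trigger_cm else "off"
    el_wire = "pink" if mood == 0 else "blue"
    return led, el_wire

print(pod_response(10, 0))  # ('white', 'pink')
print(pod_response(80, 1))  # ('off', 'blue')
```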
Four XBees, divided into two pairs with a sender and a receiver each, handle the wireless communication.

3. Process


The first step in our project was making the digital bell. We knew we needed a hollow interior in order to put our accelerometer and speaker inside, so we purchased a large balloon and used the papier-mâché technique with latex paint. The latex made the shell much harder and more durable than regular papier-mâché, which meant viewers could knock the bell around without worrying about damage. At the same time, the bell had to be light enough not to strain the ceiling. Overall, the structure we created was very durable. To make it look like a bell, we spray-painted it at 100 McCaul to give the structure a silver sheen. Unfortunately, the spray paint flaked off, so we used tinfoil tape to create the same effect instead. This ended up looking better and gave the bell a smoother surface.



In terms of electronics on the bell, we placed a wireless Bluetooth speaker inside the hollow interior and bolted it in place with large screws. At the top of the structure we added a light, and an accelerometer with the XBee to control the sound.


For the pods, we used two plastic lamps, ripped out the insides, and put an LED strip and an IR proximity sensor in each. Then we went to the metal shop, drilled holes in two metal bowls for the wires, and put the pods inside the bowls for a cleaner look.


Originally we wanted the EL wire to run across the bell. Unfortunately, when plugged into a battery instead of the wall, the light became much weaker; with a higher battery voltage, the EL board began to burn. Faced with this dilemma (one voltage too high, the other too low), we decided to put the EL wires around the pods instead.


To broadcast the digital bell tower we created a website with a quick description of the bell tower, buttons to influence it, and a live stream connected to the webcam facing the installation. We built the site from a Tumblr template, added the live stream and Twitter buttons with some HTML/CSS edits, and then purchased the domain and set it up. For streaming we used the service Ustream, which worked very well and was easy to use.

4. Code

The code part of the Digital Bell Tower project consists of a few individual programs: a Max/MSP patch that generates the bell sound; two Python scripts that read Twitter data and send OSC messages to Max/MSP and serial messages to one of the Arduinos; and three Arduino sketches, one that reads accelerometer data and sends it to the computer via XBee, one that reads distance data and triggers an LED strip every time the bell swings over one of the two sensors, and one that controls the EL wires and receives serial messages from Python via XBee.

The Arduino handling the accelerometer has a very simple sketch and won't be discussed in detail in this section. Completed along these lines (the serial output format here is an assumption), it reads the two analog pins and streams the values out over the XBee:

const int xPin = A5;
const int yPin = A4;

void setup() {
  Serial.begin(9600); // the XBee forwards whatever is written to serial
}

void loop() {
  // send the X and Y readings as comma-separated values
  Serial.print(analogRead(xPin)); Serial.print(",");
  Serial.println(analogRead(yPin));
  delay(50);
}

4.1 Python

Even though the GitHub page links to a single Python script, the actual project used two very similar scripts. This may have increased the project's complexity, but it simplified the configuration of the XBees. The script runs a Twitter search and gets the results for the most recent “Toronto :)” and “Toronto :(” tweets. It then compares the times at which each was posted and returns 0 or 1 depending on the result. Lastly, it sends the value to Max/MSP via an OSC message and to the Arduino by writing to the serial port. The script run on the Max/MSP computer did an additional search and, through the same process described above, sent another OSC message, this time with three possible values: 0, 1 and 2.
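The decision step of that script can be sketched as follows. This is a minimal Python sketch of the comparison only; the timestamps are hypothetical stand-ins for real Twitter search results, and the actual search/OSC code is omitted:

```python
from datetime import datetime

def mood_from_tweets(last_happy, last_sad):
    """Return 0 (happy mode) if the happy tweet is newer, else 1 (sad mode)."""
    return 0 if last_happy >= last_sad else 1

# hypothetical posting times standing in for real search results
happy = datetime(2014, 12, 10, 12, 30)
sad = datetime(2014, 12, 10, 12, 25)
print(mood_from_tweets(happy, sad))  # 0: the happy tweet is more recent
```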

4.2 Max/MSP


The Max/MSP patch is where all the sounds are generated. It is divided into six colour-coded sections to make it easier to read, and each section performs a specific task.

The first, red section receives the Arduino accelerometer data and parses it. The accelerometer only reads and sends X and Y acceleration data. This section is also where the data can be calibrated, by changing the values on the two “scale” objects.
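Max's “scale” object performs a linear range mapping; in Python terms the calibration step is roughly the following (the 0-1023 input bounds are an assumption about the raw readings, not taken from the patch):

```python
def scale(value, in_lo, in_hi, out_lo=0.0, out_hi=1.0):
    """Linear range mapping, like Max/MSP's `scale` object."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

print(scale(512, 0, 1023))  # a mid-range ADC reading maps to roughly 0.5
```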

The yellow section below receives OSC messages from Python. There are two incoming messages: one can carry a 0 or 1, and the second can carry 0, 1 or 2. These values control the type of sound that will be played. Mode determines which section of the Max/MSP patch generates the sound, and posMode changes the sound in one of the modes (sad tweet mode).

The second of the upper sections, the green one, calculates the acceleration from the incoming data. The initial idea was to find a way to calculate the difference between two subsequent incoming values, and ‘timloyd’ from the Cycling ’74 forum presented a simple solution using just a couple of objects. This acceleration data is later used to trigger the bell sound when the patch is running in Mode 0 (happy tweet mode).
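The same running-difference idea, sketched in Python (the function names are illustrative; the patch does this with a couple of Max objects, not code):

```python
def make_delta():
    """Return a function that reports the change between subsequent values."""
    prev = None
    def delta(value):
        nonlocal prev
        d = 0 if prev is None else value - prev
        prev = value
        return d
    return delta

accel = make_delta()
print([accel(v) for v in [10, 12, 20, 15]])  # [0, 2, 8, -5]
```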

The third section mixes between four different buffered bell sounds based on the X and Y values coming from the accelerometer. For this, an algebraic expression from Dave Haupt, found on a forum, was used. The audio technique used in this project is very similar to the one used in vector synthesis, but instead of mixing waveforms as a vector synthesizer would normally do, the patch mixes between four bell sounds of the same length. All four sounds were originally sampled Tibetan bowl sounds, edited in Ableton Live for this project. While the patch is running in sad tweet mode, this section mixes between four shorter segments of the buffered sounds, making a sound that resembles a traditional vector synthesizer. Vector synthesizers are traditionally controlled with a joystick that determines how the waveforms are mixed based on X and Y position. The Digital Bell Tower project builds on this idea and gathers the X and Y data from an accelerometer instead of a joystick; the bell itself can be understood as the joystick.

The fourth section is the happy tweet mode sound generator. As already mentioned, it is activated by incoming OSC messages from the Python script. If the message is 0, the “ggate” object opens the audio signal path and all four bell sounds are triggered every time the acceleration value exceeds the threshold. Each sound's gain is then multiplied by a value from 0 to 1 based on the vector algebraic expression; the sum of all four audio gains is always 1.
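A common bilinear form of such a four-way vector-mixing expression looks like this. This is a Python sketch; whether it matches Dave Haupt's exact formula is an assumption, but it has the property described above, that the four gains always sum to 1:

```python
def mix_gains(x, y):
    """Four gains from normalized X/Y in [0, 1], one per corner sound."""
    return [(1 - x) * (1 - y), x * (1 - y), (1 - x) * y, x * y]

g = mix_gains(0.25, 0.5)
print(round(sum(g), 6))  # 1.0: the gains always sum to one
```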

The fifth and last upper section is the sad tweet sound generator. It works very similarly to the section described in the previous paragraph. The main difference is that instead of using a “groove~” object to play the buffered sound from beginning to end, a “wave~” object plays just a section of the sound. The section's start and end positions are determined by the three possible posMode values, which come from the second OSC message sent by Python. This mode is activated when the mode value is equal to 1, and that value is used to control the “phasor~” object.
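The slicing behaviour can be sketched as follows. This is a Python sketch; dividing the buffer into three equal segments, one per posMode value, is an assumption about how the start and end positions were chosen:

```python
def segment_bounds(pos_mode, buffer_len, n_segments=3):
    """Start/end of the buffer slice for each posMode value (0, 1 or 2)."""
    seg = buffer_len / n_segments
    return pos_mode * seg, (pos_mode + 1) * seg

print(segment_bounds(1, 3000))  # (1000.0, 2000.0)
```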

4.3. Pods

4.4 EL wires

5. Sketches

5.1 Accelerometer


5.2 Pods – LED Strip and Proximity Sensors


5.3 EL Wire &  Twitter



6. Final Shots

[Photos of the final installation]


7. Feedback

Feedback from teachers & students (future aims):

– more variation within the physical push/accelerometer (make it more obvious when it is on, and when it is being pushed)

– make it more about the people in the space rather than those on twitter

– move EL wire to the bell (original plan, but work out the kinks)

8. Case Studies

8.1 – Public Face II


Public Face II is an installation created by artists Julius von Bismarck, Benjamin Maus and Richard Wilhelmer and displayed in the city of Lindau, Germany. The installation consists of a giant smiley built on top of a lighthouse.

A camera mounted on top of the lighthouse, facing down towards the crowd at ground level, uses computer vision algorithms to analyse people's faces, and the smiley reacts to the emotion data in real time.

Public Face II takes emotion data from a group of people's faces and displays it in a very straightforward manner: a monumental smiley. The Digital Bell Tower aims to display emotion data not in an iconic way but by turning the data into symbols. (Link1, Link2)

8.2 – D-Tower



D-Tower is a threefold project. It consists of an architectural tower, a questionnaire and a website, and all three elements work together. The inhabitants of Doetinchem, the Netherlands city that commissioned the art piece, answer a questionnaire that records the overall emotions of the city. Happiness, love, fear and hate are the emotions being measured. The results of the questionnaire are displayed on the website, and the tower also reflects the mood of Doetinchem.

Our project also aims to react to a city's overall feeling. Colours and lights installed on the bell and in the installation will change with the mood of Toronto (link1, link2, link3).

8.3 – Aleph of Emotion


Aleph of Emotion visualizes worldwide emotions. Twitter is the source of the collected emotions; the data was collected over 35 days in 2012 using openFrameworks. The D.I.Y. physical object consists of an Arduino and a tablet and is built to resemble a photographic camera. The user points the object in any direction and the tablet displays the emotion information for that particular region.

The Digital Bell Tower will similarly use Twitter to collect emotion data, but geography will be restricted to Toronto; unlike Aleph of Emotions, it won't source worldwide data. Data will be collected and responded to in real time, as opposed to collecting data during a specific period and displaying it afterwards (link).

8.4 – Syndyn Artistic Sports Game



This is an indoor badminton game created by, and named after, Syndyn. The shuttlecock is lit by a red LED, while each player's racquet is wired with electroluminescent wire (and sensors) to translate motion into light effects. The players turn physical movement into an audiovisual performance as they play: with signals transmitted from the sports equipment to computers, the game is performed through sound and light effects.

Similar to this installation, our project has a swinging bell with an accelerometer to detect its motion, XBees as senders and receivers for the data, and electroluminescent wires to display changes in state. In terms of presentation, visual and sound effects emerge from the motion, triggered as the bell swings and interacts with the two objects on the ground.

In order to have a visual memory of the badminton game, the EL wires on the racquets are also wrapped around the players' wrists, while a video camera records the motion of the players lit up by the wires. The shuttlecock is lit with a red LED so the camera can also capture its moving track and form an image like this.

An iPod touch is used at the beginning of the game to let players choose the theme colour and other visualizations. By contrast, our plan is to let users hashtag their mood before sending a tweet. We have two mood types, happy and sad, which are triggered by tweets and represented via pink and blue EL wires, along with the white-LED objects on the ground (link).

Your Body of Water – By Lee Jones – Project 2


Project Background: In undergrad I started a publication called Art & Science Journal that focuses on artworks with themes of science, nature and technology. My research specialized in artworks that used exaggeration of natural processes to present broader themes (a quick summary can be viewed in my talk at TEDxUOttawa).

When we were informed in our Creation and Computation class that our task was to use water in our next project, I wanted to do something that went along with my research and that used the water to connect to and visualize our inner body.

References: During my research I came across two projects that were influential to the making of Your Body of Water: Teo Park's May the Force be With You and Heart Bot (a collaboration between Intel, SMS Audio and Sid Lee). What drew me to these works was how they approached visualizing the body and its movements. Park's work is a fish tank that tilts in relation to how you move: if you move left it tilts left; if you move right it tilts right. Park used the natural movements of water to visualize your movements with a Kinect. This work excels at connecting the viewer to the water in a very visceral way, and you lose the sealed concept of yourself as your body “expands” to include the water.


 Figure 1: Teo Park, May the Force Be With You (2012)

In Heart Bot the effect is quite similar, but a circular form is used, which is an apt way of visualizing heart rate, as we often see the heart as the “centre” of our body. In this work the viewer's heart rate is recorded and then drawn by a robot. The overall effect is that many people's heart rates are combined to create one heart.



Figure 2: Intel, SMS Audio, Sid Lee, Heart Bot (2013)



Figure 3: Initial Project Plan

Your Body of Water: Our mind and body are connected. When you’re excited or stressed out your heart rate goes up, when you’re meditating your heart rate goes down. In this project a bowl of water spins based on your heart rate. A live video of the water is then projected and visualized on a large screen. As your heart rate increases the water swirls faster, and the music of crashing waves begins. As your heart rate decreases the water swirls slower, and the music turns to a slow trickling river. When no one is touching the work, the water goes still. The water’s movements and sounds become a visualization of how calm or excited you are.

Schematic Diagram with Arduino:


 Inside the Box:



  • Pulse Sensor
  • Continuous Servo
  • LED
  • Toggle Switch

Interaction: When someone turns on the machine with the toggle switch, the blue LED goes on to signal to the user that the pulse sensor can now begin its readings. Once the pulse sensor is put on, the servo motor spins based on the user's beats per minute.


On the Processing side, the webcam watches the bowl of water spinning on the servo. As your pulse increases and the servo speeds up, Processing plays music that moves from a quiet stream to a loud storm.


Code for Arduino and Processing is available on GitHub.

Draft Setup:



Experiment 1 – Webcam and Text

For this experiment the aim was to figure out how to stream an external webcam into Processing and add layers on top. This was the first step in creating the Processing interface for the projection part of the project. The external webcam input was used so that the camera could overlook the spinning fishbowl and, by amplifying the image onto a larger surface, also amplify the visualization of your heartbeat. The “second screen” in this case exaggerates the image for greater effect, in keeping with the project's theme: to see your heartbeat not only visualized in the water, but also made to overwhelm and surround you.

The text was also important to give viewers a clue about what the project was about. A metal box with a spinning fishbowl and a pulse sensor line isn't very intuitive, so by adding the centred text title I hoped to add some clarity to the projection and give viewers a sense of what they are looking at and involved in. Overall, adding the webcam and text elements ended up being rather easy, and the Processing website was very helpful.

Experiment 2 – Framing Circle

For this next step, I wanted to create a frame to surround the webcam stream. As I mentioned while discussing the Heart Bot project, a circular image is very apt for the subject matter of your heart, your “core”. By drawing attention to the centre of the body and visualizing that image for the viewer, I hoped to create an extension of the body beyond our sealed selves. To create the frame I added a .png file made in Photoshop and uploaded it to Processing. I wanted it to fill the whole screen, but this proved rather difficult: I could double the size of the .png file by ratios and percentages, but it wasn't very smooth and didn't adapt to different screen sizes.

Experiment 3 – Responsive Page for Project Screen

Right from the start I planned on using a projection, and was aware of the differences between what is full screen on your computer and what is full screen on the projector. To prepare for these size differences I needed to make the Processing page responsive to changes in size. To do so I changed the code so it would work for all sizes (i.e., using 0.5 * width instead of 450), and replaced the .png frame with a really thick white circle. It had the same effect as the frame, but was responsive and would resize to always stay centred. I got this idea from a really great resource for Processing examples and code.
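The responsive-sizing idea, expressed as a small Python sketch (the function name and the 0.9 scale factor are illustrative, not from the project code): the circle's centre and diameter are computed from the window dimensions, so the same code works at any projection size.

```python
def circle_frame(width, height, scale=0.9):
    """Centre and diameter of a framing circle as fractions of the window."""
    return width / 2, height / 2, scale * min(width, height)

print(circle_frame(800, 600))
print(circle_frame(1920, 1080))  # same code, bigger projection
```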

Experiment 4 – Your Pulse

One of the harder challenges was getting the pulse to connect to Processing. Because I didn't have previous experience with a pulse sensor, I wrongly assumed that when your finger was not on it the reading would be zero. This, I learned, is not how a pulse sensor works: when your finger is not on it you can get “random” readings, such as 34 or 133. I was never quite sure what to expect, and at first thought my pulse sensor was broken. Luckily, when my finger was on it the range was usually between 60 and 80, so I was able to figure out that it was working.
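One simple mitigation for those no-finger readings is a plausibility check. This is a Python sketch with illustrative bounds, not the project's actual solution (a reading like 133 would still slip through, which is part of why the project uses a toggle switch instead):

```python
def plausible_bpm(reading, low=40, high=180):
    """Reject readings outside a plausible human heart-rate range."""
    return low <= reading <= high

print(plausible_bpm(70))  # True: finger likely on the sensor
print(plausible_bpm(34))  # False: likely sensor noise
```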

The pulse sensor makers have provided some handy code for using processing and the pulse sensor (available at I incorporated this code [with some help from Phuong, thanks! 🙂 ] and was able to get a reading in processing. This was very important for next steps, and making sure that everything worked accurately.

Experiment 5 – Sounds Based on Your Pulse

To enhance the experience of the water spinning as your pulse rate increases, I also wanted to add sound, so that there would be a steady trickle when your heart rate was low, building up to a rainstorm as it increased. Once the beat rate was included in the Processing code, I incorporated the Minim library in order to play sounds based on the beat rate. For this experiment, Stephen and Hector taught me how to use Minim's AudioPlayer. At first we used the AudioSample code (which is why the clips are so short and repetitive), but later, by using AudioPlayer, I was able to get more natural and smooth sounds.
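Choosing which sound to play amounts to bucketing the beat rate. A Python sketch of that selection step; the thresholds here are illustrative assumptions, not the project's actual cutoffs:

```python
def track_for_bpm(bpm, thresholds=(70, 90)):
    """Index of the sound to play: 0 = quiet stream, up to 2 = storm."""
    index = 0
    for t in thresholds:
        if bpm >= t:
            index += 1
    return index

print(track_for_bpm(60))   # 0: quiet stream
print(track_for_bpm(110))  # 2: storm
```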

Experiment 6 – Pulse Sensor and Bowl

In this experiment, I coded the continuous servo motor to spin based on my heart rate. To do so, I grouped the heart rates into bands of ten (40-50, 50-60, etc.) and made the continuous servo speed increase as the heart rate went up (toward the maximum speed at a value of 0). The continuous servo is interesting to code and not really intuitive: 0 is full speed one way, 90 is stop, and 180 is full speed the other way. For effect, I used the 90-to-0 range for heart rates up to 100, and once they passed that threshold I jumped to 180, the other direction, to create even more turbulence. So the bowl swings back and forth once you reach a certain heart rate.
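That mapping can be sketched as a small function. This is a Python sketch; the exact band-to-speed steps are an illustrative reading of the "grouped into 10s" description, not the project's actual numbers:

```python
def servo_value(bpm):
    """Continuous-servo command from heart rate:
    90 = stop, 0 and 180 = full speed in opposite directions."""
    if bpm > 100:
        return 180  # flip direction past the threshold for extra turbulence
    band = max(0, min(5, (bpm - 40) // 10))  # 40-50 -> 0, ..., 90-100 -> 5
    return 90 - band * 18  # step from 90 (stop) down to 0 (full speed)

print(servo_value(45))   # 90: slowest band
print(servo_value(95))   # 0: full speed one way
print(servo_value(120))  # 180: full speed the other way
```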

In this experiment I wanted to control the “start and stop” of the viewer's experience. As previously mentioned, the readings get rather random when no finger is on the sensor (with numbers ranging from 33 to 136), and I didn't want the bowl spinning around if no one was using it. To fix this problem, I incorporated a toggle switch so that the viewer could turn on the machine, put their finger on the pulse sensor, and by the time they had done so the sensor would begin taking readings.

When you hit the toggle switch a blue LED light also goes on within the box so that you know it is on and running. This was important so that I could monitor that the machine was working properly throughout, and would also clue viewers in that the machine was running.

In this experiment, I used more natural-sounding audio, i.e., stormy winds and trickling waters. One problem I realized at the last minute is that the servo is rather loud. If I were to continue this project I would look into a quieter servo so that it doesn't overpower the water sounds.

Experiment 7 – Webcam on Water

In this video you can see the water spinning below. Because of the lights and the white background, you can barely see the water as it moves. This made me realize that I needed to add more effects so that viewers would be able to see the water moving.

Experiment 8 – Blue Light and Pearls

To solve the problem of the see-through water I added a blue light and pearls. The pearls made it easier to see the spinning motion and enhanced the visuals for the viewer. I used pearls because they are a common reference for thoughts (share your “pearls” of wisdom, etc.). So, in reference to the project's overall goal of measuring how stormy your sea is, and the emotional aspects of heart rate, the pearls were a fitting solution for better visuals.


Compartmental. Minim.

Intel, SMS Audio, Sid Lee. Heart Bot (2013).

P5Art. Experiments in Processing.

Park, Teo. May the Force Be With You (2012).

Processing 2. Processing Reference.

Pulse Sensor. Code & Guide.

Reas, Casey & Ben Fry. Getting Started with Processing. O’Reilly, Cambridge: 2010.


Special thank you to Phuong, Stephen and Hector for their coding help!


These Motors Got Moves!

Using timing instead of delay in Arduino to move servos and light up LEDs to the beat of Michael Jackson’s Billie Jean.


Team: Lee, Elliott, Sachi and Jason




The challenge: to make a ‘robot’ dance to the beat of a song.
The equipment: two servo motors gussied up as Starbucks sirens/go-go dancers, and three LEDs standing in for a discotheque display.
The song: Billie Jean by Michael Jackson (at 120 bpm).



#include <Servo.h>

// lights
int ledPin1 = 5;
int ledPin2 = 7;
int ledPin3 = 12;

Servo myservo; // dancer 1
Servo myservo2; // dancer 2

int pos = 0;
int target = 103; // sweep end position; seems to work for 120 bpm
long lastMove;
int moveRate = 5; // ms between servo steps
int dir = 1; // current sweep direction (reconstructed from the pos++/pos-- fragments)

long lastChange1;
int blinkRate1 = 500;
boolean ledState1;
long lastChange2;
int blinkRate2 = 250;
boolean ledState2;
long lastChange3;
int blinkRate3 = 1000;
boolean ledState3;

void setup() {
  myservo.attach(9); // attaches the servo on pin 9 to the servo object
  myservo2.attach(10); // dancer 2 (pin is an assumption; the original never attached it)

  pinMode(ledPin1, OUTPUT);
  digitalWrite(ledPin1, HIGH);

  pinMode(ledPin2, OUTPUT);
  digitalWrite(ledPin2, LOW);

  pinMode(ledPin3, OUTPUT);
  digitalWrite(ledPin3, HIGH);
}

void loop() {
  // sweep both dancers between 0 and target without blocking
  if (millis() - lastMove > moveRate) {
    pos += dir; // pos++ on the way up, pos-- on the way down
    if (pos >= target || pos <= 0) dir = -dir; // reverse at the ends
    myservo.write(pos);
    myservo2.write(target - pos); // dancer 2 mirrors dancer 1
    lastMove = millis();
  }

  // each LED blinks at its own rate, timed with millis() instead of delay()
  if (millis() - lastChange1 > blinkRate1) {
    ledState1 = !ledState1; // toggle the value
    lastChange1 = millis(); // store the time that you changed
  }
  if (millis() - lastChange2 > blinkRate2) {
    ledState2 = !ledState2;
    lastChange2 = millis();
  }
  if (millis() - lastChange3 > blinkRate3) {
    ledState3 = !ledState3;
    lastChange3 = millis();
  }

  digitalWrite(ledPin1, ledState1);
  digitalWrite(ledPin2, ledState2);
  digitalWrite(ledPin3, ledState3);
}

Black Box – Arduino Project 1 – Lee Jones

Concept – Black Box: For this project we were tasked with using the Arduino platform to create an object for a haunted spaceship. I created a black box activated by proximity, so that it turns on as people walk by. This in turn activates the speaker and the audio file I created of someone calling in from another space station for help. The person calling in is then attacked and the line goes dead.
In terms of technology, I programmed the Arduino so that the speaker turned on as people walked by, using an infrared proximity sensor. The speaker was attached to my iPhone, which played an audio file I had uploaded to it.

Inspiration: Aether Artifact (2012) by Michael Importico

In Michael Importico’s project Aether Artifact (2012) the aim was to create the illusion of a radio picking up sounds from the past. In the context of our space ship task, I immediately thought of the black box in airplanes and space ships, and was inspired to make one with sounds from a mysterious attack. Instead of using lights to reflect the sounds being made, I wanted it to be based on picking up people as they walked by. The idea was that once the black box picks up their proximity they would be able to hear how a previous ship was attacked.


What is a Black Box?

  1. (Aeronautics) an informal name for flight recorder (Collins English Dictionary)
  2. A device or theoretical construct with known or specified performance characteristics but unknown or unspecified constituents and means of operation (The American Heritage Dictionary)
  3. Any small black box containing a secret, mysterious, or complex electronic device (Webster’s College Dictionary)

Black Box: Cultural Context 

1. Witness to Events

“I am particularly fascinated by the “black box” obsession that follows each airplane crash – the wish (which I share) to witness the last moments, especially the moment that reveals the certainty of death entering the pilot’s consciousness. Why do I want to know this, over and over?”

– James Berger, “Trauma and Literary Theory”, Contemporary Literature Vol. 38 (1997).


2. Banality and Existential Crisis


– William Matthews, “Black Box” (1989).

3. Mystery

“That final surviving piece of technology in any disaster, that item so mysterious as to warrant the name ‘black box,’ gives us a notion of what went wrong, or at least, who to blame.”

– Douglas K. Currier, Black Box: Poems, Harvard Review (1999).

As these quotes demonstrate, in our cultural context a black box is associated with fatality, mystery, and existential crisis, which makes it a well-suited object for a haunted spaceship. You can only listen to a black box once tragedy has already struck, so it leaves you with absolutely no hope for the situation. The term ‘black box’ highlights its mystery: you can only hear the audio of what occurred, and in this case not being able to see the scene creates a sense of fear. A black box is also a banal item. It is always in the background, listening to everyday things, until it reveals how everything can go wrong.

Project Concept:


Situation: You are an astronaut walking through your spaceship when suddenly you hear a dial tone coming from the black box nearby. A man picks up and you hear his message for help, but cannot respond. The person's voice is from the past, and you know you will never be able to reach him. You then hear him being attacked and feel helpless. The man yells out a final call for help before the line goes dead and the dial tone returns. You are now alone.

Design Process – Creating a Black Box:

  1. Hacking a speaker – how can I make it make noise?
  2. Figuring out sensors – how can I make people’s bodies turn it on?
  3. Getting led astray! Trying to fit too much in and keeping on track.
  4. Music and Narrative

Designs and Diagrams:

Material Design:
  • Pantyhose (speaker)
  • Wire grating book end (speaker)
  • metal “project box”
  • Canadian Tire speaker unit
Watch the video of the awful screech here.

Video Pre-Critique:

Post-Critique Conclusions and Next Steps:

When I moved the black box from my quiet room to the loud classroom everything changed. With all the other noises you could no longer hear the dial tones and dialogue, only the monster’s yells and radio blips. Through this project I’ve realized how important it is to think about context. To fix this problem of sound levels my colleagues recommended an amplifier, which is something I will incorporate for all sound projects moving forward.

Another aspect I noticed through the critique environment was that my project was very small and rather discreet. In the future, I think adding LEDs behind the speaker screen would be something to consider. This way, when the black box is alert and playing, people will really know it is on.


Berger, James. “Trauma and Literary Theory.” Contemporary Literature 38.3 (1997): 569-82. JSTOR. Web.

Currier, Douglas K. “Black Box: Poems.” Harvard Review 17 (1999): 165-167. JSTOR. Web.

Importico, Michael. “Michael Importico – Final Project – Aether Artifact.” CMU EMS2 Fall 2012 Section A. Prof. Golan Levin's Introduction to the Electronic Media Studio: Computation for the Arts, 9 Dec. 2012. Web. 06 Oct. 2014.

Matthews, William. “Black Box.” New England Review and Bread Loaf Quarterly 5.4 (1984): 519. JSTOR. Web.
