Digital Bell Tower




1. Project Context

For this project we wanted to re-imagine the bell tower for the digital age. Bell towers are a relic from a period in history when community was highly location-based. At their inception, bell towers kept a community on the same rhythm by announcing the time. Today, in a society where everyone has the time easily available on their phone, a bell tower that marks time is no longer of value in the way that it once was. Moreover, community is no longer strictly location-based: the OCAD University community, for example, spans several campus buildings, and for certain programs (such as Inclusive Design) classes are also available online.

For our redesign of what a bell tower could be, we wanted to create an object that allows interaction between people in the physical space as well as members of the community who are not present. Instead of marking the time, our bell tower measures the mood of the people within the community and broadcasts it to all. In short, rather than measuring the tempo of passing time, our bell tower measures the tempo and mood of the people within it.



2. Project Description



The major part of this installation is the bell itself, which was built around the shape of a balloon. Several layers of paper were added to its surface using the paper-mâché method; the next stage was to wait for it to dry completely and spray-paint the whole object metallic silver. Tinfoil tape was ultimately applied to the formed bell to smooth the surface.
Hung from the ceiling by chains attached to two sets of hooks, the bell is able to swing. The purpose of this project is to read the current mood of Toronto residents from Twitter, happy 🙂 or sad 🙁 , and to visualize it as different colours as well as bell sounds.

There are two pods on the ground with LED strips inside, which are triggered when the swinging bell passes above them. Once triggered, a pod responds by lighting up bright white. Depending on the mood of the tweets, the EL wires wrapped around the pods glow either pink, representing a happy mood, or blue, representing a sad mood. The bell's sound also changes with the tweet mood as it swings.
Four XBees are divided into two pairs, each with a sender and a receiver, to achieve the expected communication.

3. Process


The first step in our project was making the digital bell. We knew that we needed a hollow interior in order to put our accelerometer and speaker inside. We purchased a large balloon and used the paper-mâché technique with latex paint. The latex made the shell much harder and more durable than regular paper-mâché, so that viewers would be able to knock the bell around without worrying about damage. At the same time, the bell needed to be light so it would not strain the ceiling mount. Overall, the structure we created was very durable. To make it look like a bell, we spray-painted it at 100 McCaul to give the structure a silver sheen. Unfortunately, the spray paint flaked off, so we decided to use tinfoil tape to create the same effect. This ended up looking better and produced a smoother surface on the bell.



For the electronics on the bell, we placed a wireless Bluetooth speaker inside the hollow interior and bolted it in place with large screws. At the top of the structure we added a light and the accelerometer with the XBee that controls the sound.


For the pods, we used two plastic lamps, ripped out the insides, and put an LED strip and an IR proximity sensor in each. Then we went to the metal shop, drilled holes in two metal bowls for the wires, and placed the pods inside the bowls for a cleaner look.


Originally we wanted the EL wire to run across the bell. Unfortunately, when powered from a battery instead of a wall outlet, the light was much weaker. With a higher battery voltage, the EL driver board began to burn. Faced with this dilemma (one voltage too high, the other too low), we decided to wrap the EL wires around the pods instead.


To broadcast the digital bell tower we created a website. The website includes a quick description of the bell tower, buttons to influence it, and a live stream connected to a webcam facing the installation. We built the site from a Tumblr template, added the live stream and Twitter buttons with some HTML/CSS edits, and then purchased the domain and set it up. For streaming we used the service uStream, which worked very well and was easy to use.

4. Code

The code for the Digital Bell Tower project consists of a few individual programs. A Max/MSP patch generates the bell sound. Two Python scripts read Twitter data and send OSC messages to Max/MSP and serial messages to one of the Arduinos. Three Arduino sketches handle the hardware: one reads the accelerometer data, which is sent to the computer via XBee; one reads distance data and triggers an LED strip every time the bell swings over one of the two sensors; and one controls the EL wires and receives serial messages from Python via XBee.

The Arduino handling the accelerometer runs a very simple sketch along these lines (the baud rate and send interval shown here are representative, not necessarily the project's exact values):

const int xPin = A5;  // accelerometer X axis
const int yPin = A4;  // accelerometer Y axis

void setup() {
  Serial.begin(9600);  // the serial port feeds the XBee
}

void loop() {
  Serial.print(analogRead(xPin));    // raw X reading
  Serial.print(",");
  Serial.println(analogRead(yPin));  // raw Y reading
  delay(50);                         // roughly 20 readings per second
}
4.1 Python

Even though the GitHub page links to a single Python script, the actual project used two very similar scripts. This may have increased the project's complexity, but it simplified the configuration of the XBees. The script runs a Twitter search and gets the most recent "Toronto :)" and "Toronto :(" tweets. It then compares the time each one was posted and returns 0 or 1 depending on which is newer. Lastly, it sends the value to Max/MSP via an OSC message and to the Arduino by writing to the serial port. The script run on the Max/MSP computer performs an additional search and, through the same process described above, sends another OSC message, this time with three possible values: 0, 1 and 2.
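The actual scripts live on the project's GitHub; as a minimal sketch of the comparison logic described above (the helper names, OSC port, and serial device below are illustrative assumptions, not the project's real configuration):

```python
from datetime import datetime

def mood_from_timestamps(happy_time, sad_time):
    """Return 0 (happy) if the "Toronto :)" tweet is newer, else 1 (sad)."""
    return 0 if happy_time >= sad_time else 1

def send_mood(mood):
    """Forward the mood to Max/MSP (OSC) and to the Arduino (serial).

    Not called here; the host, port, and serial device are placeholders.
    """
    from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc
    import serial                                     # pip install pyserial
    SimpleUDPClient("127.0.0.1", 7400).send_message("/mood", mood)
    serial.Serial("/dev/ttyUSB0", 9600).write(bytes([mood]))
```

The same comparison, extended to three outcomes, would produce the 0/1/2 values of the second OSC message.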

4.2 Max/MSP


The Max/MSP patch is where all the sounds are generated. It is divided into six colour-coded sections to make it easier to read, and each section performs a specific task.

The first, red section receives and parses the Arduino accelerometer data. The accelerometer only reads and sends X and Y acceleration data. This section is also where the data can be calibrated, by changing the values on the two "scale" objects.

The yellow section below receives OSC messages from Python. There are two incoming messages: the first can be 0 or 1, and the second can be 0, 1 or 2. These values control the type of sound that will be played. Mode determines which section of the Max/MSP patch generates the sound, and posMode changes the sound in one of the modes (sad tweet mode).

The second of the upper sections, the green one, calculates the acceleration from the incoming data. The initial idea was to find a way to calculate the difference between two subsequent incoming values, and 'timloyd' from the Cycling '74 forum presented a simple solution using just a couple of objects. This acceleration data is later used to trigger the bell sound when the patch is running in Mode 0 (happy tweet mode).
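The idea, differencing subsequent readings so that a sudden change (a hit) shows up as a spike, can be sketched outside Max/MSP like this (names are illustrative):

```python
def make_accel_tracker():
    """Return a function yielding the change between subsequent readings.

    This mirrors the Max/MSP trick of subtracting each incoming value
    from the one before it; a large delta means the bell was struck.
    """
    last = None

    def update(value):
        nonlocal last
        delta = 0 if last is None else value - last
        last = value
        return delta

    return update
```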

The third section mixes between four different buffered bell sounds based on the X and Y values coming from the accelerometer, using an algebraic expression posted by Dave Haupt on a forum. The audio technique used in this project is very similar to vector synthesis, but instead of mixing waveforms as a vector synthesizer normally would, the patch mixes between four bell sounds of the same length. All four sounds were originally sampled Tibetan bowl sounds, edited in Ableton Live for this project. While the patch is running in sad tweet mode, this section mixes between four shorter segments of the buffered sounds, producing a sound that resembles a traditional vector synthesizer. Vector synthesizers are traditionally controlled with a joystick that determines how the waveforms are mixed based on X and Y position; the Digital Bell Tower project builds on this idea and uses an accelerometer to gather the X and Y data instead, so the bell itself can be understood as the joystick.

The fourth section is the happy tweet mode sound generator. As already mentioned, it is activated by the incoming OSC messages from the Python script: if the message is 0, the "ggate" object opens the audio signal path and all four bell sounds are triggered every time the acceleration value exceeds the threshold. Each sound's gain is then multiplied by a value from 0 to 1 based on the vector algebraic expression, and the sum of all four audio gains is always 1.
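The exact expression from the forum post isn't reproduced in this write-up; a standard bilinear crossfade with the same sum-to-one property it describes would look like:

```python
def vector_mix_gains(x, y):
    """Bilinear crossfade gains for four corner sounds, with x, y in [0, 1].

    An assumed stand-in for the patch's vector expression: the four
    gains always sum to 1, so overall loudness stays constant while
    the accelerometer position crossfades between the sounds.
    """
    return ((1 - x) * (1 - y),  # sound 1: bottom-left corner
            x * (1 - y),        # sound 2: bottom-right corner
            (1 - x) * y,        # sound 3: top-left corner
            x * y)              # sound 4: top-right corner
```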

The fifth and last upper section is the sad tweet sound generator. It works in a very similar way to what was described in the previous paragraph. The main difference is that instead of a "groove~" object playing the buffered sound from beginning to end, a "wave~" object plays just a section of the sound. The section's start and end positions are determined by posMode's three possible values, which arrive in the second OSC message from Python. This mode is activated when the mode value equals 1, and that value is used to control the "phasor~" object.
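One way to picture the posMode mapping is as a choice of playback window within the buffer; dividing the buffer into equal thirds is purely an illustrative assumption, as the patch's actual segment positions are not given in this write-up:

```python
def segment_bounds(pos_mode, n_segments=3):
    """Map posMode (0, 1 or 2) to start/end playback fractions of a buffer.

    Equal-width segments are assumed here for illustration only.
    """
    start = pos_mode / n_segments
    return start, start + 1 / n_segments
```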

4.3 Pods

4.4 EL wires

5. Sketches

5.1 Accelerometer


5.2 – Pods – LED Strip and Proximity Sensors


5.3 EL Wire &  Twitter



6. Final Shots



7. Feedback

Feedback from teachers & students (future aims):

– more variation within the physical push/accelerometer (make it more obvious when it is on, and when it is being pushed)

– make it more about the people in the space rather than those on twitter

– move EL wire to the bell (original plan, but work out the kinks)

8. Case Studies

8.1 – Public Face II


Public Face II is an installation created by artists Julius von Bismarck, Benjamin Maus and Richard Wilhelmer and displayed in the city of Lindau, Germany. The installation consists of a giant smiley built on top of a lighthouse.

A camera mounted on top of the lighthouse, facing down towards the crowd at ground level, uses computer vision algorithms to analyse people's faces, and the smiley reacts to the emotion data in real time.

Public Face II takes emotion data from a group of people's faces and displays it in a very straightforward manner: a monumental smiley. The Digital Bell Tower aims to display emotion data not iconically but by turning data into symbols. (Link1, Link2)

8.2 – D-Tower



D-Tower is a threefold project: it consists of an architectural tower, a questionnaire and a website, and all three elements work together. The inhabitants of Doetinchem, the city in the Netherlands that commissioned the art piece, answer a questionnaire that records the overall emotions of the city. Happiness, love, fear and hate are the emotions being measured. The results of the questionnaire are displayed on the website, and the tower also reflects the mood of Doetinchem.

Our project also aims to react to a city's overall feeling: the colours and lights installed on the bell and in the installation change following the mood of Toronto (link1, link2, link3).

8.3 – Aleph of Emotion


Aleph of Emotions visualizes worldwide emotions. Twitter is the source of the collected emotions; the data was collected over 35 days in 2012 using openFrameworks. The D.I.Y. physical object consists of an Arduino and a tablet, and it is built to resemble a photographic camera. The user points the object in any direction, and the tablet displays the emotion information for that particular region.

The Digital Bell Tower will similarly use Twitter to collect emotion data, but its geography is restricted to Toronto rather than sourcing worldwide data as Aleph of Emotions does. Data is collected in real time, and the project also responds to Twitter data in real time, as opposed to collecting data over a specific period and displaying it afterwards (link).

8.4 – Syndyn Artistic Sports Game



It is an indoor badminton game created by, and named after, Syndyn. The speeder is coloured by a red LED, while each player's racquet is fitted with electroluminescent wire (and sensors) so that motion is reflected as light effects. The players turn physical movement into an audiovisual performance while playing. With signals transmitted from the sports equipment to computers, the game is performed through sound and light effects.

Similar to this installation, our project has a swinging bell with an accelerometer to detect its motion, XBees as senders and receivers for the data, and electroluminescent wires to display changing conditions. In terms of presentation, visual and sound effects are expected to emerge during the motion. All of these functions are triggered while the bell swings and interacts with the two objects on the ground.

In order to keep a visual memory of the badminton game, the EL wires on the rackets are also wrapped around the players' wrists, and a video camera records the motion of the players lit up by the wires. The speeder is lit with a red LED so the camera can also capture its moving track, forming an image like this.

An iPod touch is used at the beginning of the game for the players to choose the theme colour and other visualizations. By contrast, our plan for the theme is to let users hashtag their mood before sending a tweet. We have two mood types, happy and sad, which are triggered by tweets and represented by pink and blue EL wires, along with the white-LED objects on the ground (link).