by Nadine Valcin



The trumpet is a fitting instrument to serve as the starting point for an installation about the world’s most infamous Twitter user. The piece combines a display of live tweets tagged with @realDonaldTrump with a trumpet that delivers real audio clips of the American president. It is meant to be installed at room scale and provide a real-life experience of the social media echo chambers that so many of us confine ourselves to.

The piece constantly emits a low static sound, signalling the distant chatter that is always present on Twitter. A steady stream of tweets from random users, all tagged with the president’s handle, is displayed on the screen, giving a portrait of the many divergent opinions about the current state of the presidency.

Visitors can manipulate a trumpet that triggers audio. A sample of the Call to the Post trumpet melody played at the start of horse races can be heard when the trumpet is picked up. The three trumpet valves, when pressed, each play a short clip (the verbal equivalent of a tweet) from the president himself. Metaphorically, Trump is in dialogue with the tweets displayed on the screen in this enclosed ecosystem. The repeated clips create a live sonic echo chamber, physically recreating what happens virtually online.


My initial ideas were centered on the fabrication of a virtual version of a real object: a virtual bubble blower that would create bubble patterns on a screen, or a virtual kaleidoscope. I then flipped that idea and moved to using a common object as a controller, giving it a new life by hacking it in some way to give it novel functionalities. Those functionalities would have to stay close to the object’s original use yet be surprising in some way. The ideal object would have a strong tactile quality. Musical instruments soon came to mind. They are designed to be constantly handled, have iconic shapes and are generally well-made, featuring natural materials such as metal and wood.

Image from Cihuapapalutzin

In parallel, I developed the idea of using data in the piece. I had recently attended the Toronto Biennial of Art and was fascinated by Fernando Palma Rodríguez’s piece Cihuapapalutzin, which integrates 104 robotic monarch butterflies in various states of motion. They were built to respond to seismic frequencies in Mexico: every day, a new data file is sent from that country to Toronto and uploaded to control the movement of the butterflies. The piece is meant to bring attention to the plight of this unique species, which migrates between the two countries. The artwork led me to see the potential of data visualisation for making impactful statements about the world.

Image from Just Landed

I then made the connection to an example we had seen in class. Just Landed by Jer Thorp shows the real-time air travel patterns of Twitter users on a live map. The Canadian artist, now based in New York, used Processing, Twitter and MetaCarta to extract longitude and latitude information from a query on Twitter data to create this work.

Image from Listen and Repeat

Another inspiration was Listen and Repeat by American artist Rachel Knoll, a piece featuring a modified megaphone installed in a forest that used text-to-speech software to enunciate tweets labeled with the hashtag “nobody listens”.

As I wanted to make a project closer to my artistic practice, which is politically engaged, Twitter seemed a promising way to obtain live data that could then be presented on a screen. Of course, that immediately brought to mind one of the most prolific and definitely the most infamous of Twitter users: Donald Trump. The trumpet then seemed a fitting controller, both semantically and in its nature as a brash, bold instrument.


Step 1: Getting the Twitter data

Determining how to get the Twitter data required quite a bit of research. I found the Twitter4J library for Processing and downloaded it, but still needed more information on how to use it. I happened upon a tutorial on British company CodaSign’s blog about searching Twitter for tweets. It gave an outline of the necessary steps along with the code. I then created a Twitter developer account and got the required keys to use their API in order to access the data.
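The basic setup, roughly as the CodaSign tutorial outlines it, looks something like the sketch below in Processing with Twitter4J. The keys are placeholders for the credentials from the developer account, and the query and count are only illustrative:

```processing
import twitter4j.*;
import twitter4j.conf.ConfigurationBuilder;

Twitter twitter;

void setup() {
  // Credentials from the Twitter developer account (placeholders)
  ConfigurationBuilder cb = new ConfigurationBuilder();
  cb.setOAuthConsumerKey("CONSUMER_KEY");
  cb.setOAuthConsumerSecret("CONSUMER_SECRET");
  cb.setOAuthAccessToken("ACCESS_TOKEN");
  cb.setOAuthAccessTokenSecret("ACCESS_TOKEN_SECRET");
  twitter = new TwitterFactory(cb.build()).getInstance();

  // Search for recent tweets mentioning the president's handle
  Query query = new Query("@realDonaldTrump");
  query.setCount(20);
  try {
    QueryResult result = twitter.search(query);
    for (Status status : result.getTweets()) {
      println(status.getUser().getScreenName() + ": " + status.getText());
    }
  } catch (TwitterException e) {
    println("Search failed: " + e.getMessage());
  }
}
```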

Once I had access to the Twitter API, I adjusted the parameters in the code from the CodaSign website, modifying it to suit my needs. I set up a search for “@realDonaldTrump”, not knowing how much data it would yield, and was pleasantly surprised when it resulted in a steady stream of tweets.

Step 2: Programming the interaction

Now that the code was running in Processing, I set up the code to get data from the Arduino. I programmed three switches, one for each valve of the trumpet, and also used Nick’s code to send the gyroscope and accelerometer data to Processing in order to determine which data was most pertinent and what the threshold should be for each parameter. The idea was that the gyroscope data would trigger sounds when the trumpet was moved, while the three trumpet valves would manipulate the tweets on the screen with various effects on the font of the text.
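On the Arduino side, the valve switches can be read and reported over serial in a few lines. This is only a sketch of the idea: the pin numbers are hypothetical, and the gyroscope/accelerometer readings from Nick’s code (not reproduced here) would be appended to the same line:

```arduino
// Hypothetical pin assignments for the three valve switches
const int valvePins[3] = {2, 3, 4};

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 3; i++) {
    // Switches wired between pin and ground, using the internal pull-ups
    pinMode(valvePins[i], INPUT_PULLUP);
  }
}

void loop() {
  // Send the three valve states as a comma-separated line,
  // e.g. "1,0,0" when only the first valve is pressed
  for (int i = 0; i < 3; i++) {
    Serial.print(digitalRead(valvePins[i]) == LOW ? 1 : 0);
    if (i < 2) Serial.print(",");
  }
  // Gyroscope/accelerometer values would be appended here
  Serial.println();
}
```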

I soon hit a snag, as at first it seemed like Processing wasn’t getting any information from the Arduino. Looking at the code, I noticed several delay() calls at various points. I remembered Nick’s warning that the delay command is problematic because it blocks everything else from running, and realized that this, unfortunately, was a textbook example of it.

I knew the solution was to program the intervals using the millis() function. I spent a day and a half attempting to find a solution but failed, and required Kate Hartman’s assistance to solve the issue. I had also discovered that the Twitter API would disconnect me if I ran the program for too long. I had to test in fits and starts, and often found myself unable to get any Twitter data, sometimes for close to an hour.
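The non-blocking pattern replaces delay() with a timestamp check, so the loop keeps running and serial communication is never stalled. A minimal Arduino sketch of the idea, with a made-up 100 ms reporting interval:

```arduino
unsigned long lastSend = 0;
const unsigned long interval = 100;  // ms between sensor reports (arbitrary)

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Instead of delay(100), which would block everything,
  // check how much time has elapsed and act only when due
  if (millis() - lastSend >= interval) {
    lastSend = millis();
    // read the sensors and Serial.println() the values here
  }
  // the rest of loop() keeps running at full speed
}
```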

I attempted to program some visual effects, triggered by the activation of the valves, to manipulate the tweets. I had difficulty affecting only one tweet, as the effects would apply to all subsequent tweets. Also, given that the controller was a musical instrument, sound felt like a more suitable effect than a visual one. At first, I loaded cheers and boos from a crowd that users could trigger in reaction to what was on screen, but I finally settled on Trump clips, as it seemed natural to feature his very distinctive voice. It was suitable both because he takes to Twitter to make official declarations and because of the horn’s long history as an instrument announcing the arrival of royalty and other VIPs.
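Triggering a clip per valve can be sketched in Processing with the Sound and Serial libraries. This assumes the Arduino sends valve states as a comma-separated line such as “0,1,0”, and the clip filenames are placeholders:

```processing
import processing.sound.*;
import processing.serial.*;

Serial arduino;
SoundFile[] clips = new SoundFile[3];

void setup() {
  size(400, 200);
  // Placeholder filenames for the three audio clips
  for (int i = 0; i < 3; i++) {
    clips[i] = new SoundFile(this, "clip" + i + ".mp3");
  }
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');
}

void serialEvent(Serial s) {
  // Expecting a line like "0,1,0" — one flag per valve
  String[] states = split(trim(s.readString()), ',');
  for (int i = 0; i < 3 && i < states.length; i++) {
    if (states[i].equals("1") && !clips[i].isPlaying()) {
      clips[i].play();
    }
  }
}

void draw() { }
```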

As the clock was ticking, I decided to work on the trumpet and return to working on the interaction when the controller was functional.

Step 3: Hacking the trumpet

Trumpet partly disassembled

I was fortunate to have someone lend me a trumpet. I disassembled all the parts to see if I could make a switch that would be activated by the piston valves. I soon discovered that the angle from the slides to the piston valves is close to 90 degrees and, given the small aperture connecting the two, it would be nearly impossible.

Trumpet parts
Trumpet valve and piston
Trumpet top of valve assembly without piston

The solution I found was taking apart the valve piston while keeping the top of the valve, and replacing the piston with a piece of cut Styrofoam. The wires could then come out of the bottom casing caps and connect to the Arduino.



I soldered wires to three switches and then carefully wrapped the joints in electrical tape.

Arduino wiring


Step 4: Building the housing
A cardboard box was chosen to house a small breadboard. Holes were made so that the bottoms of the valves could be threaded through, and the lid of the box could be secured to the trumpet using the bottom casing caps. Cardboard was chosen to keep the instrument light and as close as possible to its normal weight and balance.

Finished trumpet/controller

Step 5: Programming the interaction part 2

The acceleration on the Y axis was chosen as the trigger for the trumpet sound. But given the imbalance in the trumpet’s weight, it tended to trigger the sound in rapid succession before stopping, and raising the threshold didn’t help. With little time left, I then programmed the valves/switches to trigger some short Trump clips. I would have loved to accompany them with a visual distortion, but the clock ran out before I could find something appropriate and satisfactory.
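In hindsight, a cooldown timer alongside the threshold might have tamed the rapid retriggering. A sketch of the idea in Processing, where the threshold and cooldown values are guesses and trumpetSound stands in for the loaded sample:

```processing
float accelY;            // latest Y-axis acceleration parsed from the Arduino
float threshold = 1.5;   // hypothetical trigger level
int lastTrigger = 0;
int cooldown = 3000;     // minimum ms between trumpet sounds (arbitrary)

void maybePlayTrumpet() {
  // A cooldown window keeps a single movement from firing
  // the sample several times in quick succession
  if (accelY > threshold && millis() - lastTrigger > cooldown) {
    lastTrigger = millis();
    // trumpetSound.play();
  }
}
```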


My ideation process is slow, which was definitely a hindrance in this project. I attempted to do something more complex than I had originally anticipated, and the bugs I encountered along the way made it really difficult. One of the things I struggle with when coding is not knowing when to persevere and when to stop. I spent numerous hours trying to debug at the expense of sleep and, in hindsight, it wasn’t useful. It also feels like the end result isn’t representative of the time I spent on the project.

I do think though that the idea has some potential and given the opportunity would revisit it to make it a more compelling experience. Modifications I would make include:

  • Adding more Trump audio clips and randomizing which one each valve triggers
  • Building a sturdier box to house the Arduino, so that the trumpet could rest on it, and contemplating attaching it to some kind of stand that would somewhat control its movements
  • Adding video or a series of changing photographs as a background to the tweets on screen, and making them react to the triggering of the valves

Link to code on Github:


Knoll, Rachel. “Listen and Repeat.” Rachel Knoll – Minneapolis Filmmaker. Accessed October 31, 2019.

Thorp, Jer. “Just Landed: Processing, Twitter, MetaCarta & Hidden Data.” Blprnt.blg, May 11, 2009. Accessed October 25, 2019.

“Fernando Palma Rodríguez at 259 Lake Shore Blvd E.” Toronto Biennial of Art. Accessed October 24, 2019.

“Processing and Twitter.” CodaSign, October 1, 2014. Accessed October 24, 2019.

“Trumpet Parts.” All Musical Instruments, 2019. Accessed November 2, 2019.



