Generative Harp… and Robo Music Box.

I set out to create a sensor-based version of my WEEK 3 experiment using light sensors. The objective was to produce a generative music instrument, like a wind chime, that would play a 9-note melody I had written. Ideally, such an instrument would be passive, set up as an ambient experience in a hall or hallway-type environment. For the purposes of this demo, the trigger sensitivity is tuned to user gestures (swipes that change the light readings) rather than to subtle light changes (which would be more analogous to a wind chime… and totally possible).

In order to accomplish this goal, I leaned heavily on my WEEK 7 light sensor experiments (though the sensor I chose was not part of that test) and my WEEK 8 Processing + Arduino creation.

STEP 1 – REWRITE THE ELASTIC STRINGS CODE

This took much longer than I had anticipated; creating an object out of my original sketch was a challenge, as I had to convert the many (and I mean MANY) arrays that stored all of the individual elastic variables into a single class representing each elastic. As mentioned, I leaned heavily on my WEEK 8 creation, as it was my first object-oriented sketch. I knew I wanted to make this Generative Harp prior to WEEK 8, so I purposefully used the Hoops creation to explore OOP and classes, but I also used it to figure out how to structure my code so that I could trigger and modify a single object while the other objects carried on uninterrupted. After numerous pots of tea, I managed to convert WEEK 3 into a sketch reminiscent of WEEK 8, except with the elastics still triggered by the mouse. Sensor input was the next step.
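
To give a sense of the shape this took, here is a minimal sketch of the idea (not the actual WEEK 3 conversion; class and field names are stand-ins): each elastic owns its own state and behaviour instead of living across parallel arrays, and only the plucked instance is disturbed while the rest keep animating.

// Toy version of the array-to-class conversion
Elastic[] elastics = new Elastic[6];   // one per light sensor

void setup() {
  size(800, 600);
  for (int i = 0; i < elastics.length; i++) {
    elastics[i] = new Elastic(i * width / elastics.length + width / 12, height);
  }
}

void draw() {
  background(0);
  for (Elastic e : elastics) {
    e.update();     // every elastic animates independently...
    e.display();
  }
}

void mousePressed() {
  // ...while only the elastic nearest the mouse is triggered
  int index = constrain(mouseX / (width / elastics.length), 0, elastics.length - 1);
  elastics[index].pluck(mouseX, mouseY);
}

class Elastic {
  float anchorX, anchorH;   // fixed top/bottom anchor positions
  float pullX, pullY;       // current control point of the bezier
  boolean active = false;

  Elastic(float x, float h) {
    anchorX = x;
    anchorH = h;
    pullX = x;
    pullY = h / 2;
  }

  void pluck(float x, float y) {
    pullX = x;
    pullY = y;
    active = true;
  }

  void update() {
    if (active) {
      // ease the control point back toward rest (a stand-in for the real spring math)
      pullX = lerp(pullX, anchorX, 0.1);
      if (abs(pullX - anchorX) < 0.5) active = false;
    }
  }

  void display() {
    noFill();
    stroke(255);
    bezier(anchorX, 0, pullX, pullY, pullX, pullY, anchorX, anchorH);
  }
}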

STEP 2 – CHOOSE A LIGHT SENSOR

I purchased 6 TSL235R Light to Frequency sensors, mainly because they were more sensitive than a photoresistor and cheaper than a TEMT6000 (though I bought a few more of those as back-ups, along with a couple of proximity sensors). After testing the TSL235Rs I quickly realized that I had luckily made the perfect choice, as these sensors are particularly sensitive to low light. Where the TEMT6000 reads light levels as an analog output with a positive correlation (more light = higher value), the TSL235R is digital, and the value I read from it has a negative correlation with the light level: an average daylight-lit indoor room read between 20 and 50, with pure daylight returning a value of 1. This was all very fortunate, because I had imagined these sensors as triggers based on a reduction in light. Covering a TSL235R with my hand produced a wide range of low-light values, anywhere from 200 to 1500+! This was nearly identical to the range of the TEMT6000, only inverted. Perfect.
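
A minimal sketch of one way to get such an inverted reading, assuming the sensor's output pulse is measured with pulseIn() (the full read code is linked under ARDUINO CODE below; the pin number is illustrative). Darker light means a lower output frequency, so a longer pulse and therefore a bigger number.

// Reading a TSL235R as a pulse length (microseconds)
const int sensorPin = 2;   // TSL235R OUT wired to a digital pin (illustrative)

void setup() {
  pinMode(sensorPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  // length of one HIGH pulse: a few us in direct sun, tens of us in a lit
  // room, hundreds or more when covered by a hand
  unsigned long pulse = pulseIn(sensorPin, HIGH, 25000UL);  // 25 ms timeout = "dark"
  if (pulse == 0) pulse = 25000UL;                          // timeout: treat as fully covered
  Serial.println(pulse);
  delay(20);
}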

I then had to program the Arduino to average the 6 sensors on boot in order to calibrate the system to its environment. I modified the Smoothing example code and created boot averages for each sensor and for all 6 sensors combined. I then had to send these 14 variables (6 live sensor readings, 6 per-sensor boot averages, 1 all-sensor boot average, and 1 live all-sensor average) to Processing as input for my newly coded elastics.
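
Roughly, the calibration and hand-off look like this (a simplified sketch assuming the pulse-width read from above; pins, sample counts, and the comma-separated serial format are illustrative, and the full sketch is linked under ARDUINO CODE below).

// Boot calibration + sending 14 values per line to Processing
const int NUM_SENSORS = 6;
const int sensorPins[NUM_SENSORS] = {2, 3, 4, 5, 6, 7};  // illustrative pins
const int BOOT_SAMPLES = 50;

unsigned long bootAvg[NUM_SENSORS];
unsigned long bootAvgAll = 0;

unsigned long readSensor(int pin) {
  unsigned long pulse = pulseIn(pin, HIGH, 25000UL);
  return (pulse == 0) ? 25000UL : pulse;
}

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < NUM_SENSORS; i++) pinMode(sensorPins[i], INPUT);

  // calibrate against whatever light the room has at boot
  for (int i = 0; i < NUM_SENSORS; i++) {
    unsigned long sum = 0;
    for (int s = 0; s < BOOT_SAMPLES; s++) sum += readSensor(sensorPins[i]);
    bootAvg[i] = sum / BOOT_SAMPLES;
    bootAvgAll += bootAvg[i];
  }
  bootAvgAll /= NUM_SENSORS;
}

void loop() {
  unsigned long live[NUM_SENSORS];
  unsigned long liveAll = 0;
  for (int i = 0; i < NUM_SENSORS; i++) {
    live[i] = readSensor(sensorPins[i]);
    liveAll += live[i];
  }
  liveAll /= NUM_SENSORS;

  // 14 values: 6 live, 6 boot averages, all-sensor boot average, live all-sensor average
  for (int i = 0; i < NUM_SENSORS; i++) { Serial.print(live[i]); Serial.print(','); }
  for (int i = 0; i < NUM_SENSORS; i++) { Serial.print(bootAvg[i]); Serial.print(','); }
  Serial.print(bootAvgAll); Serial.print(',');
  Serial.println(liveAll);
  delay(20);
}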

STEP 3 – CREATE LIGHT TRIGGERS

I had learned early in the semester that hard-coding values is ultimately very limiting because it isn't scalable. Consequently, my code is riddled with variables, both to allow for scalability and to account for the calibration I had just programmed into my Arduino sketch. Without going into great detail, each sensor's boot average is used as the control value against which any change in that sensor's live reading is judged. This is how I created light-sensor triggers to replace the mouse interactions in the object-oriented sketch from STEP 1. When a swipe is detected, the FreqCheck class moves the Elastic array forward one spot and sends the sensor data through to the Elastic instance at that array location, which then draws and animates the bezier. Simply put, this substitutes light sensors for the mouse, triggering X AXIS movement.
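
In simplified form (building on the toy Elastic sketch above; the threshold factor and the mapping here are stand-ins for the real FreqCheck internals):

// Detect a swipe over any sensor and pluck the next elastic in the array
float TRIGGER_FACTOR = 2.0;           // live reading must be well above its boot average
boolean[] wasCovered = new boolean[6];
int nextElastic = 0;                  // the array position the next trigger will pluck

void checkSensors(float[] live, float[] bootAvg) {
  for (int i = 0; i < live.length; i++) {
    boolean covered = live[i] > bootAvg[i] * TRIGGER_FACTOR;
    if (covered && !wasCovered[i]) {
      // a swipe just started over sensor i: hand the data to the elastic at this position
      // (mapping the reading onto the bezier is arbitrary in this toy version)
      elastics[nextElastic].pluck(map(live[i], bootAvg[i], bootAvg[i] * 10, 0, width), height * 0.5);
      nextElastic = (nextElastic + 1) % elastics.length;
    }
    wasCovered[i] = covered;
  }
}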

Because the TSL235R sensors were so sensitive to low light, I was able to use the live all-sensor average as a Y AXIS and colour manipulator (R: bezier X point, G: bezier Y point, B: light average). I built a function that checks for changes in the all-sensor average against the all-sensor boot average control, using exactly the same method as above.
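
The same compare-against-the-boot-average idea, sketched for the global check (names and ranges are stand-ins):

// Map the live all-sensor average onto a Y offset and the blue channel
float ambientYOffset = 0;
float ambientBlue = 0;

void applyAmbient(float liveAllAvg, float bootAllAvg) {
  // ratio > 1 means the room as a whole is darker than it was at boot
  float ratio = constrain(liveAllAvg / bootAllAvg, 1, 10);
  ambientYOffset = map(ratio, 1, 10, 0, height * 0.25);  // shifts every bezier's Y control point
  ambientBlue    = map(ratio, 1, 10, 0, 255);            // B channel; R and G come from the bezier X/Y points
}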

STEP 4 – TRIGGER MIDI NOTES

There are 3 different libraries available for MIDI in Processing, but I chose The MidiBus for no reason other than its example worked and the code was legible. I created an array of notes, fed it through the demo code, and soon had my melody playing on my virtual instrument (IK Multimedia's Sonik Synth running a wonderful Pianet sample, an electric keyboard I used to own and one whose sound I still very much love). The example code included a delay between the noteOn and noteOff commands, a delay which would also pause my animation were I to copy it directly into my sketch. Unfortunately, it's not as simple as just removing the delay, because a noteOn paired with a noteOff on the next line triggers a MIDI note so short that barely any sound is produced. I therefore had to remove the noteOff command entirely. This may seem obvious, but it's actually a big problem if not accounted for: without a noteOff command, a sound with infinite sustain (like an organ) would play forever. I thus had to make sure I chose a tine-like sound with a strong attack and a relatively fast decay.
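
A stripped-down sketch of that approach, assuming an IAC bus named "IAC Bus 1" and a placeholder 9-note array standing in for the actual melody:

// Send noteOn only: no delay() to stall the animation, no noteOff to cut the sample
import themidibus.*;

MidiBus midiOut;
int[] melody = {60, 64, 67, 71, 72, 71, 67, 64, 60};   // placeholder 9-note melody
int noteIndex = 0;

void setup() {
  MidiBus.list();                                 // print available MIDI devices to the console
  midiOut = new MidiBus(this, -1, "IAC Bus 1");   // no input device, output to the IAC driver
}

void draw() {
}

void playNextNote() {
  // no delay() and no sendNoteOff(): the animation keeps running and the
  // decaying, tine-like sample is left to ring out on its own
  midiOut.sendNoteOn(0, melody[noteIndex], 110);  // channel 0, velocity 110
  noteIndex = (noteIndex + 1) % melody.length;
}

void keyPressed() {
  playNextNote();   // stand-in trigger; the real sketch fires this at maximum elastic extension
}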

I ran into some trouble when trying to make the melody play in stereo. I knew I had to split the left and right notes into separate MIDI channels, but for some reason, inside both Logic and Ableton Live, the virtual instrument would only read channel 1 (channel 0 in Processing). I therefore had to go into Audio MIDI Setup and create a second IAC Driver bus, double up the MidiBus instances in my code (one for each bus), and duplicate my virtual instrument in Ableton. Once I had figured this out (I'm still mad at Logic for not handling channel commands properly!) I was able to trigger a hard-panned left version of the Pianet and a hard-panned right version. I could have used a sound-generation library in Processing, but I wanted to craft the sonic experience with a little more care, so for me the extra frustration was worth it: the Pianet is triggered in stereo while running through a nice reverb, a channel compressor, and a hard master limiter.
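
Sketched out, the stereo routing looks roughly like this (the bus names stand in for whatever the two IAC buses are actually called in Audio MIDI Setup):

// Two IAC buses, two MidiBus instances, one per hard-panned Pianet track
import themidibus.*;

MidiBus midiLeft, midiRight;

void setup() {
  midiLeft  = new MidiBus(this, -1, "IAC Bus 1");   // Ableton track 1: Pianet panned hard left
  midiRight = new MidiBus(this, -1, "IAC Bus 2");   // Ableton track 2: Pianet panned hard right
}

void draw() {
}

void sendStereoNote(int pitch, boolean leftSide) {
  if (leftSide) midiLeft.sendNoteOn(0, pitch, 110);
  else          midiRight.sendNoteOn(0, pitch, 110);
}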

Inserting the above MIDI code into my main Processing sketch was slightly more complicated than a simple copy-and-paste job. I knew I wanted the MIDI notes triggered at maximum elastic extension, so finding the right place to insert the command was simple (the X AXIS direction-switch function), but I spent quite a bit of time ensuring that it only triggered once: my directionX "if" constraints were not tight enough to guarantee that the direction switch happened only once per animation cycle, producing multiple consecutive trues or falses before the elastic reached its maximum extent and truly flipped. Once I refined the code, the whole system sang… almost exactly as I had imagined it would.
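
Boiled down, the fix amounts to a small latch per elastic (field names and thresholds here are stand-ins for the real directionX constraints): fire only on the first frame the bezier starts heading back toward rest, and re-arm only once it has actually returned.

// Trigger a note exactly once per pluck, at (roughly) maximum extension
class NoteLatch {
  float prevOffset = 0;    // distance of the bezier control point from its rest position
  boolean armed = true;

  void update(float offset) {
    boolean movingBack = offset < prevOffset;        // just passed the peak, heading home
    if (armed && movingBack && prevOffset > 20) {    // 20 px: ignore jitter near rest
      // fire the MIDI note here (e.g. the playNextNote() from the sketch above)
      armed = false;                                 // only once per pluck
    }
    if (offset < 5) armed = true;                    // re-arm once basically back at rest
    prevOffset = offset;
  }
}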

PRESENTATION

I struggled to find a way of presenting the system that was clean and functional. The sensors are so small, and needed to be spread out in such a way, that I was very limited in my mounting options. I decided to embrace the wires, because my animation is all about moving wires. I ended up buying a pack of suction cups, which held the braided wire quite nicely and, most importantly, adhered to the HDTV perfectly. In the end I liked the look, but I also liked the idea that the sensors could be moved. If I were to refine this idea further, I would design it so that the sensors were not set up as a left-to-right continuum, but rather as independent triggers that anyone could move anywhere they wanted. Doing so would also bring the system that much closer to its intended generative purpose by removing the direct cause and effect of triggering sensors affixed to the displaying screen.

PROCESSING CODE

TAB 1: Main Program
TAB 2: Elastic Class
TAB 3: Frequency Check Class

ARDUINO CODE

Light Sensor Read + Average

------------------------------------------------------------

ROBO MUSIC BOX

http://vimeo.com/33010630

My ultimate goal for this whole project was to create a system that would generate music. Although the Generative Harp is capable of this (it was playing on its own in bright afternoon sunlight when pointed out the window), I wanted to explore a more physical system that would also respond to light changes, specifically ambient light. I bought a DIY music box and scored the full melody on one of the supplied sheets. I then hooked a continuous 360-degree servo up to the crank (crudely joined by hot glue), plugged a TEMT6000 into an Arduino analog pin, and sat back. The Arduino code stores the difference between 2 averaged light readings in a container variable that, when filled, triggers two turns of the servo, and consequently a note or two from the music box. The TEMT6000 is fantastic at measuring ambient light in a room (don't point it out a window, as it will max out easily in direct sunlight).

The problem came from the competing noise of the motor: you couldn't hear the music box! This much I anticipated. Yet this rather crude exploration only further reinforced the beauty of the wind chime by reminding me that it isn't only about translating one type of wave (wind) into another (sound), in my case light (sensor) into sound (music box), but also about doing so without drawing attention to how. Though endearing in its own way, Robo Music Box wasn't quite as magical an experience as I had hoped… I know this because when the Generative Harp played on its own without anyone around, it did feel and sound like there was magic in the room.
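
In simplified form, the Arduino logic looks something like this (pins, thresholds, and the crank timing are illustrative guesses; the full sketch is linked below):

// Accumulate ambient light change into a "container"; when full, crank the box
#include <Servo.h>

const int lightPin = A0;       // TEMT6000 signal (illustrative pin)
const int servoPin = 9;        // continuous-rotation servo (illustrative pin)
const long FILL_LEVEL = 200;   // how much accumulated change triggers a crank
const int  STOP = 90;          // ~90 = stop on a continuous servo
const int  SPIN = 180;         // full speed in one direction

Servo crank;
long container = 0;
int prevAvg = 0;

int averagedLight() {
  long sum = 0;
  for (int i = 0; i < 20; i++) { sum += analogRead(lightPin); delay(5); }
  return sum / 20;
}

void setup() {
  crank.attach(servoPin);
  crank.write(STOP);
  prevAvg = averagedLight();
}

void loop() {
  int avg = averagedLight();
  container += abs(avg - prevAvg);   // store the change between two averaged readings
  prevAvg = avg;

  if (container >= FILL_LEVEL) {
    crank.write(SPIN);               // roughly two turns of the crank...
    delay(1500);                     // ...the timing here is a guess
    crank.write(STOP);
    container = 0;
  }
}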

ARDUINO CODE

Ambient Light Read + Average + Store Difference + Trigger Motor

------------------------------------------------------------


THE MELODY

http://vimeo.com/33010378
