Experiment 3: Blue Moon


Project Title: Blue Moon
Names of group members: Lilian Leung

Project Description

Blue Moon is a reactive room light that detects signs of anxiety through hand gestures and encourages participants to take time away from their screens and practice mindfulness. The gold mirror emits a blue glow when participants are detected clenching their fist; to return to a warm light, the participant's right hand needs to be unclenched. The second switch is activated by pressing the right hand over the left, which starts the music playing. The joined-hand switch keeps the participant focused and relaxed and stops them from reaching for a mobile device or computer. The project environment is a bedroom, for the time before rest or when trying to relax or meditate. The screen projection is intended for the ceiling, as the viewer should be lying down with their hands together.

Project Process

October 23 – 24
Beginning my initial prototype for the experiment, I started by mapping two potentiometers to two LEDs. By creating a small-scale prototype, I could gradually upgrade each section of the project into larger outputs, such as replacing the small LEDs with an LED strip and the potentiometers with flex sensors.
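The potentiometer-to-LED mapping can be sketched as follows. This is a minimal illustration, not the project's exact code; the pin references in the comments are assumptions. On the Arduino the scaling runs inside loop() with analogRead()/analogWrite(); here it is a plain function so the math is easy to follow.

```cpp
#include <cassert>

// Same behaviour as Arduino's map(): rescale x from [inMin, inMax]
// to [outMin, outMax] using integer math.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// On the board, a 10-bit analog reading (0-1023) becomes an 8-bit
// PWM value (0-255), e.g.:
//   int brightness = mapRange(analogRead(A0), 0, 1023, 0, 255);
//   analogWrite(LED_PIN, brightness);  // LED_PIN is an assumption
```

With two potentiometers, the same function is simply applied once per analog input, one for each LED.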


October 25
Starting from Shiffman’s example of creating rain ripples, I had difficulty controlling multiple graphic elements on screen because the ripples affected all the visible pixels. Exploring OpenProcessing, I found a simpler sketch by N.Kato (n.d.) that I could build on, which generates ellipses at a rate tied to the frame rate that I could control. Rather than having the animation begin abruptly, I added an intro screen, triggered by mouse click, that moves on to the main reactive animation.

Using my current breadboard prototype with the potentiometers and LED, I replaced the potentiometer with a pressure sensor made from aluminum foil and velostat. After adjusting the mapping of the values to reduce the sensor noise, I was able to map the pressure sensors for two hands:

Switch 1: Left hand – increases the rate of rain
Switch 2: Right hand – controls the audio volume
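The noise-reduction step can be sketched as a small moving average over the raw velostat readings before they are mapped. This is a hedged sketch: the window size, thresholds, and output ranges are assumptions, not the project's exact values.

```cpp
#include <cassert>

const int WINDOW = 8;          // number of samples to average (assumption)
int samples[WINDOW] = {0};
int idx = 0;
long total = 0;

// Feed in one raw reading (0-1023) and get back the running average,
// which smooths out the velostat sensor's jitter.
int smooth(int raw) {
    total -= samples[idx];
    samples[idx] = raw;
    total += samples[idx];
    idx = (idx + 1) % WINDOW;
    return (int)(total / WINDOW);
}

// Like Arduino's map(), but clamps the input to the useful sensor range
// first so noise outside it can't push the output out of bounds.
long clampMap(long x, long inMin, long inMax, long outMin, long outMax) {
    if (x < inMin) x = inMin;
    if (x > inMax) x = inMax;
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Left hand -> rain rate, right hand -> volume, e.g. (ranges assumed):
//   int rainRate = clampMap(smooth(analogRead(A0)), 200, 900, 1, 30);
```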

* I used the Mouse.h library during initial exploration to trigger the intro screen and connected it to a physical switch, though it started interfering with control of my laptop, so I temporarily disabled it.

October 26
To begin upgrading my breadboard prototype’s LED, I purchased a neon blue LED strip. I initially followed Make’s Weekend Projects – Android-Arduino LED Strip Lights (2013) guide to review how I could connect my LED strip (requiring 9V) to my 3.3V Arduino Nano. One problem I didn’t expect was that the video used an RGB LED strip, which has a different circuit, while mine was a single colour.

RGB LED strip: 12V, R, G, B
Single-colour LED strip: 12V input, 12V output

I went to Creatron and picked up the neon LED strip, a TIP31 transistor, an external 9-volt power source, and a power jack. Rewiring the breadboard prototype proved difficult, as I had to dedicate a separate rail of the breadboard solely to 9V and learn how to hook up the TIP31 with my digital pins to send values to the LED strip.

One realization from online references using different types of transistors was that the three pins (base, collector, emitter) are laid out in a different order depending on the type used.

Pin 1: Base (signal, via resistor)
Pin 2: Collector (negative end of the LED strip)
Pin 3: Emitter (ground)
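Since the TIP31 just switches the strip's 9V line on behalf of the digital pin, the sketch drives the strip like any PWM output. One detail worth sketching (as an aside, not something from the original build): LED brightness is perceptually nonlinear, so a gamma curve makes fades look even. The gamma value and pin name here are assumptions.

```cpp
#include <cmath>
#include <cassert>

// Map a linear 0-255 brightness level to a gamma-corrected 0-255 PWM
// duty cycle, so dim values don't jump up in perceived brightness.
int gammaCorrect(int level) {
    const double gamma = 2.2;               // common display gamma (assumption)
    double normalized = level / 255.0;      // 0.0 .. 1.0
    return (int)(pow(normalized, gamma) * 255.0 + 0.5);
}

// On the Arduino, the PWM pin feeds the TIP31 base through a resistor:
//   analogWrite(BASE_PIN, gammaCorrect(brightness));  // BASE_PIN assumed
```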

The diagram reference that ended up being the most useful was this:


Figure 1. Sound Activated Light Circuit, n.d.

After properly hooking up the LED strip to my Arduino and mapping the potentiometer to the strip’s brightness, I began replacing the potentiometer with a flex sensor. To test the sensor, I first used two copper coins and velostat.




From there I made two types of pressure sensors. The first, using aluminum foil and velostat as a flex sensor, could be hidden inside a glove to detect the pressure along my index finger, between the first and second knuckle, when I clench my fist. Another factor when attaching the flex sensor inside the glove was making sure it stayed attached when I clenched my fist, as the sensor would pull forward, following my finger joints, and easily detach the sensor cables.
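The clench detection itself can be sketched with two thresholds rather than one. This is a hedged sketch (the threshold values are assumptions): the gap between the "on" and "off" thresholds gives hysteresis, so the light doesn't flicker when the pressure reading hovers near a single cutoff.

```cpp
#include <cassert>

const int CLENCH_ON  = 700;  // reading above this -> fist counts as clenched (assumed)
const int CLENCH_OFF = 500;  // reading below this -> fist counts as open (assumed)

bool clenched = false;

// Update and return the clench state from one smoothed sensor reading.
// Readings between the two thresholds keep the previous state.
bool updateClench(int reading) {
    if (!clenched && reading > CLENCH_ON)  clenched = true;
    if (clenched  && reading < CLENCH_OFF) clenched = false;
    return clenched;
}

// In the loop, the state picks which strip is lit:
//   if (updateClench(reading)) { /* blue strip on */ }
//   else                       { /* warm light on  */ }
```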


October 27-29
Expanding the physical Arduino component of the project, I still needed to laser cut acrylic to make the circular ring that my LED strips would wrap around. I’d never laser cut before, but with some help from Neo I was able to lay out my two files: the first a 17” acrylic circle for the front, and the second five 11.5” rings cut from 0.25” board, which would house my Arduino and cables while also giving my LED strips a form to wrap around and creating enough space between the wall and the light.

Additional updates were made to the prototype, such as adding the second warm neon LED strip to use as the main light, and rewiring all my sensors with stranded wire instead of solid-core wire so the cables were more flexible and comfortable when participants put on the glove sensors.

The wooden ring was then sanded, with holes drilled in from the side for the power cables to connect to, and for the LED strips’ stranded wire to pass from the exterior of the ring into the interior where the Arduino is placed. Once all the wiring was finished, I moved the components from my breadboard onto a protoboard to hold everything securely in place. I also needed terminal blocks so that I could repurpose my LED strips later instead of soldering their wires into the protoboard.



Once the physical pieces were cut, I went back to redesigning the intro page to create more visual texture rather than just solid-coloured graphics. Within Processing, I added a liquid ripple effect from an OpenProcessing sketch by oggy (n.d.) and adjusted the code to map the ripples to the hand sensors instead of mouse coordinates.

The last issue to solve was how to make the intro screen disappear when my hands were together in a relaxed position. With multiple audio files playing, I had issues with audio looping when calling each page as a function and using booleans. In the end, I laid out the sketch using sensor triggers and if statements. Possibly due to the number of assets loaded within Processing, I struggled to get the animation to load consistently, as assets would go missing without any alterations to the code. Ultimately, I removed the rippling image effect.
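The final structure can be sketched as a small state machine with a one-shot guard. This is a minimal illustration of the sensor-trigger-plus-if-statements approach, not the sketch's actual code (the names are assumptions, and a counter stands in for Processing's song.play() so the logic is testable): the "started" flag is what stops the song from being re-triggered and looping every frame.

```cpp
#include <cassert>

enum Screen { INTRO, RAIN };

Screen screen = INTRO;
bool songStarted = false;
int songTriggers = 0;  // stands in for song.play() so the logic is testable

// Called once per frame with the joined-hands sensor state.
void updateScreen(bool handsTogether) {
    if (screen == INTRO && handsTogether) {
        screen = RAIN;        // dissolve the intro into the rain scene
    }
    if (screen == RAIN && !songStarted) {
        songStarted = true;   // guard: fire the song exactly once
        songTriggers++;       // in Processing this would be song.play();
    }
}
```

Because the guard flips before the trigger fires, repeated frames with the hands still together leave the song untouched instead of restarting it.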

Project Context

With the days getting darker and the nights getting longer, seasonal affective disorder is around the corner. My current room doesn’t have a window or an installed light, making the space usually dark with little natural light coming in. This, combined with struggling with insomnia and an inability to relax at night, led to creating this project.


One established treatment for seasonal affective disorder is light therapy. This treatment usually involves artificial sunlight from a UV bulb, or finding time to get natural sunlight. Terman (2018) suggests that treatment ranging from 20–60 minutes a day can lead to an improved mood over the course of four weeks.

With a bad habit of multitasking, I find it difficult to concentrate and to simply make time to rest without the urge to be productive. This is compounded by the fact that using social media can lead to feelings of depression and loneliness (Mammoser, 2018), driven by anxiety, fear of missing out, and constant ‘upward social comparisons’ that scale with how frequently a person checks their accounts.

To force myself to make time to relax, removed from my digital devices, the experiment’s main functions were, first, to detect signs of anxiety and, second, to force myself to stay still, present within my space, and immersed in a brief moment of meditation. Navarro (2010), writing for Psychology Today, describes nonverbal signs of stress such as rubbing the hands together or clenching them as “self massaging” or pacifying behaviours. To keep myself still and present, I decided my second trigger would be having both hands together in a relaxed position.


Aesthetically, I chose a circular form, symbolic of both the sun and the moon, as it would be the main light source in my room. With the alternating lights, the circle appears both as a sun and as an eclipse when the neon blue light is on. The visual style was inspired by Olafur Eliasson’s The Weather Project (2003), the artist’s elemental style, and his use of mirrors and light to create the sun and sky within the Tate Modern’s Turbine Hall. The Weather Project installation explored how weather shapes a city, and how the city itself becomes a filter for how locals experience fundamental encounters with nature.


The neon light is emitted when my right hand is clenched, limiting the light in the room and prompting me to unclench my fist. The Processing visuals are supported by the white noise of wind and rain as a buffer against my surroundings, as white noise is beneficial for cutting out background noise by providing stimulation to the brain without being overly exciting (Green, 2019).


The audio from the Processing sketch is also mapped to my right hand, with the volume lower when my fist is clenched and louder when my hand is open, prompting deeper immersion into the meditation. When I bring my hands together, right over left in a relaxed position, the sketch dissolves the intro screen to fill the ceiling with imagery of falling rain and louder audio. Within a few moments a selected song plays; for my Processing sketch, I used one of my favourite songs, Wide Open by The Chemical Brothers (2016).


During my exploration phase, I tried to trigger Spotify or YouTube to play, but because that would switch focus outside the Processing sketch and bring me back to social media and the internet, I opted to have the audio play within the sketch.

Additional Functionality for Future Iterations

  1. Connecting the Processing Sketch to a possible Spotify API that could be controlled with physical sensors. 
  2. Connecting to a weather API and having the images and audio switch depending on the weather.
  3. Adding additional hand gesture sensors to act as a tangible audio player.






Project Code Viewable on GitHub

Project Video

Citation and References

Sound Activated Light Circuit. (n.d.). Retrieved from https://1.bp.blogspot.com/-2s2iNAOFnxo/U3ohyK-AjJI/AAAAAAAAADg/gAmeJBi-bT8/s1600/8C37AE69E775698CB60B99DB1DCC86EA.jpg

Ali, Z., & Zahid. (2019, July 13). Introduction to TIP31. Retrieved from https://www.theengineeringprojects.com/2019/07/introduction-to-tip31.html.

Green, E. (2019, April 3). What Is White Noise And Can It Help You Sleep? Retrieved October 31, 2019, from https://www.nosleeplessnights.com/what-is-white-noise-whats-all-the-fuss-about/.

Make: (2013, August 23). Weekend Projects – Android-Arduino LED Strip Lights. Retrieved from https://www.youtube.com/watch?v=Hn9KfJQWqgI

Mammoser, G. (2018, December 14). Social Media Increases Depression and Loneliness. Retrieved October 31, 2019, from https://www.healthline.com/health-news/social-media-use-increases-depression-and-loneliness.

Navarro, J. (2010, January 20). Body Language of the Hands. Retrieved October 31, 2019, from https://www.psychologytoday.com/us/blog/spycatcher/201001/body-language-the-hands.

N.Kato. (n.d.). rain. Retrieved from https://www.openprocessing.org/sketch/776644

oggy. (n.d.). Liquid Image. Retrieved from https://www.openprocessing.org/sketch/186820

Processing Foundation. (n.d.). SoundFile::play() \ Language (API) \ Processing 3 . Retrieved October 31, 2019, from https://processing.org/reference/libraries/sound/SoundFile_play_.html.

Studio Olafur Eliasson (Photographer). (2003). The Weather Project. [Installation]. Retrieved from https://www.tate.org.uk/sites/default/files/styles/width-720/public/images/olafur_eliasson_weather_project_02.jpg

Tate. (n.d.). About the installation: understanding the project. Retrieved November 2, 2019, from https://www.tate.org.uk/whats-on/tate-modern/exhibition/unilever-series/unilever-series-olafur-eliasson-weather-project-0-0.

Tate. (n.d.). Olafur Eliasson the Weather Project: about the installation. Retrieved November 2, 2019, from https://www.tate.org.uk/whats-on/tate-modern/exhibition/unilever-series/unilever-series-olafur-eliasson-weather-project-0

Terman, M. (2018, December). Seasonal affective disorder. Retrieved October 31, 2019, from https://www-accessscience-com.ocadu.idm.oclc.org/content/900001.

The Coding Train (2018, May 7) Coding Challenge #102: 2D Water Ripple. Retrieved from
