FLUORESCE

CODE: 

Arduino / LED Matrix Control

– https://github.com/chrols2014/UVLEDMatrixArduinoCode/blob/master/UVLEDDrivers.ino

Processing / LED Model / Trackpad Interface

– https://github.com/chrols2014/UVLEDMAtrix

DIAGRAMS/SKETCHES: 


PHOTOS

VIDEO


DESCRIPTION

Several years ago, while I was still living in South Africa, I experienced a beautiful natural phenomenon. While walking through the shallow waters of a salt-water river mouth, I noticed a shimmer in the water as it swirled around my legs and feet. Thinking it was merely a figment of my imagination, I dismissed it; moments later, however, the water unmistakably illuminated all around me and I realized that something beautiful was in fact occurring. Bioluminescent plankton, carried into the water by a red tide earlier that day, were lighting up in amazing displays of colour as the water was disturbed. Fascinated by the effect, I collected some of the water and the organisms within it so I could observe them further at home. Unfortunately, they didn’t last long and died the next day. This experience stuck with me from that moment on and further fuelled my fascination with the natural world and the astonishing beauty it contains.

When starting this project with the assignment theme of water as an interface, my mind instantly went back to this childhood experience, and I wanted to convey it through technological means. I already knew that tonic water contains quinine, a substance known to fluoresce when exposed to ultraviolet light, and therefore wanted to exploit this property to achieve the effect. By using UV LEDs and tonic water, my hope was to create an interactive experience that would recreate the sense of wonder I felt as a child.

To aid me in this endeavour, I found various sources to help inform my design choices. The first, a YouTube video depicting the same effect I witnessed, provided a refresher on the visual aesthetics these bioluminescent creatures produce; I wanted to achieve the same level of vivid colour in my project. Another source of inspiration was an exhibition created by the artist Shih Chieh Huang after he had studied bioluminescent creatures in the deep ocean. Similar to my approach, he used neon-coloured LEDs to convey the same effect as those creatures.

This was accompanied by more research into bioluminescence, as well as other artworks that have incorporated the phenomenon. I was also able to find good examples of existing, non-art-based projects that successfully used UV LEDs and tonic water to create glowing liquid interfaces, one of which is referenced below. This was significant because it demonstrated that my intended use of the materials would work.

References:

http://ocean.si.edu/ocean-news/when-art-meets-science-exhibition-inspired-bioluminescence

https://www.youtube.com/watch?v=7kyP0XsF0zM&spfreload=10

http://www.instructables.com/id/Quantum-UV-LED-Display/?ALLSTEPS

Nemerov, Alexander. “The Glitter of Night Hauling.” Magazine Antiques 179.3 (2012): 146-155. Art Source. Web. 4 Dec. 2014.

PROJECT EXPERIMENTS

1 – UV LED Through Tonic Bottle

The first thing I decided to test was the effect the UV LEDs had on the tonic water. The desired result was for the tonic water to fluoresce a bright blue when UV light shone through it. My concern was that the low-power LEDs would not be a strong enough light source for the quinine in the tonic water to react.

As seen in the video, the UV LED did have an effect on the tonic water, albeit not as strong as I had hoped. I noticed that the effect was intensified when I aimed the light through the opening of the bottle (avoiding the plastic), which suggests that the plastic of the bottle filters out some of the already weak UV light.

My initial design had the LEDs below the plastic container, in an attempt to avoid the tricky work of submerging the LEDs and making their electrical connections water-tight. After these results, I think I may have to submerge the LEDs.

2 – UV LED Submerged vs Through Plastic

After the previous test results, it made sense to actually submerge an LED in the tonic water, to see if the improvement was enough to justify submerging all the LEDs and committing the time to waterproofing them.

I wasn’t sure how best to test the UV LED in liquid, so I rigged up the light using crocodile clips and simply dipped the encased surface of the LED into the tonic water, making sure not to get moisture between the electrical contacts.

As soon as the LED met the tonic water and began to submerge, I could tell that the effect was much better than with the LED on the exterior. The light seemed to create a cleaner beam, illuminating a clear patch of the tonic water. After this test, I decided I would have to submerge the LEDs in order to get the best effect.

3 – Testing UV LED Matrix With Arduino Code

This was a simple test of the LED matrix. I used a simple piece of Arduino code to write a PWM value to each LED in order, so that I could make sure each channel was working and that the corresponding LED was in the position I expected. Essentially, it was a test of my addressing system. As you can see in the video, I was fortunate that every LED worked the first time around. At this point I knew that if I could get the Processing sketch and the Arduino to communicate via serial, I could achieve the desired effect.
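The gist of that test code is just to step through each channel in turn. Here is a minimal Arduino sketch in the same spirit as the linked UVLEDDrivers.ino, not the file itself; the real matrix ran 16 channels through a PWM driver, while this simplified version uses the Uno's own PWM pins, and the pin numbers are assumptions:

const int NUM_CHANNELS = 6;                      // assumed: the Uno's PWM pins
const int pwmPins[NUM_CHANNELS] = {3, 5, 6, 9, 10, 11};

void setup() {
  for (int i = 0; i < NUM_CHANNELS; i++) {
    pinMode(pwmPins[i], OUTPUT);
  }
}

void loop() {
  // light each channel in order so a dead or misplaced LED is obvious
  for (int i = 0; i < NUM_CHANNELS; i++) {
    analogWrite(pwmPins[i], 255);                // full brightness
    delay(300);
    analogWrite(pwmPins[i], 0);                  // off again before the next one
  }
}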

4 – Processing Sketch

Another major part of the project is the middleware Processing sketch, which interprets the input (yet to be determined), plots those coordinates onto a matrix, and updates the Arduino's UV LED output depending on position. I took this approach because I was apprehensive about coding the project without a visual aid showing what the output would be. It also allowed me to test the system with a simple input: my computer's mouse.

A crucial part of the program was the collision detection between the input coordinates and the various LEDs' positions. I was unsure how to code collision detection, but after some research I found a solution. This test covers the collision detection as well as the addition of a buffer around each LED, which lets me configure how close a finger needs to be to each light to activate it.

The test was a success, and I'm confident the sketch will be able to handle the visual aspects of the LEDs: brightness, fade, location, etc.
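At its core, the collision test is a point-to-point distance check with a configurable buffer radius. The following stripped-down Processing sketch illustrates the idea (my simplified reconstruction, not the linked repo, with the mouse standing in for a finger and invented grid values):

int cols = 4, rows = 4;          // a 4x4 grid matching the 16-LED matrix
float spacing = 100;
float buffer = 60;               // activation distance around each LED

void setup() {
  size(500, 500);
  noStroke();
}

void draw() {
  background(0);
  for (int x = 0; x < cols; x++) {
    for (int y = 0; y < rows; y++) {
      float lx = 100 + x * spacing;
      float ly = 100 + y * spacing;
      float d = dist(mouseX, mouseY, lx, ly);      // the collision check itself
      // inside the buffer, brightness ramps up as the cursor gets closer
      float b = (d < buffer) ? map(d, 0, buffer, 255, 0) : 0;
      fill(b * 0.3, b * 0.6, b);                   // blue-ish stand-in for the UV glow
      ellipse(lx, ly, 30, 30);
    }
  }
}

The same brightness values computed here are what eventually get sent to the Arduino, which is what makes the on-screen model a useful stand-in for the physical matrix.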

5 – Serial Comms

Another major part of the project was testing the serial communication between the Arduino and the Processing sketch. The Processing sketch was responsible for controlling all of the light values and outputting the value for each LED via serial to the Arduino. The Arduino's only job was to receive the array of values from Processing and update the PWM driver, which in turn set the output level of each LED in the matrix.

My concern was that the Arduino would not be able to receive all 16 values at once, as serial communication can be finicky, and I knew some tweaking would be necessary to get the timing right. As you can see in the test, there were initial problems: the Processing sketch was transmitting too fast for the Arduino to keep up. I made some changes, and by the end of the test you can see a fairly fluid motion across the LED matrix.
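One common way to tame this kind of timing problem is to frame the 16 values behind a start byte, so the Arduino never reads mid-frame. A hedged sketch of the receiving side (the actual code is in the linked repository; the start byte and baud rate here are my assumptions):

const int NUM_LEDS = 16;
const byte START_BYTE = 255;     // assumes brightness values stay in 0-254
byte levels[NUM_LEDS];

void setup() {
  Serial.begin(115200);          // assumed baud rate
}

void loop() {
  // wait until the start byte plus a complete frame has arrived, so a
  // partial transmission never leaves the matrix half-updated
  if (Serial.available() >= NUM_LEDS + 1) {
    if (Serial.read() == START_BYTE) {
      Serial.readBytes((char *)levels, NUM_LEDS);
      // the real sketch would now push `levels` to the 16-channel PWM driver
    }
  }
}

On the Processing side, throttling sends to once per draw() frame, rather than on every value change, is the kind of tweak that smooths things out.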

6 – Kinect Test

The Kinect was another option for the input method. I had planned to position a Kinect pointing at the container of tonic water and use hand and finger tracking to watch users interacting with the liquid. I had concerns about this approach because of its limitations: the angle would have to be very precise, and there would be issues detecting fingertips once they were submerged in the liquid.

This test was to find out the complexities of setting up the Kinect with my software and the kinds of constraints I would have to employ to get usable results. Of all the data the Kinect can provide, I only needed the points of the fingers in the liquid.

As you can see in the experiment video, the Kinect worked relatively well; however, there would have to be some major additions to the library used. In the end I decided against using the Kinect, as I felt it was overcomplicating the project. I really wanted the interface to be self-contained, and using an external sensor seemed to detract from that.

7 – Cap-Touch Basic Test

The initial plan for the project was to include capacitive touch sensors within the liquid itself, in order to calculate the position of contact between a user's finger and the tonic water. This seemed like the best method for creating an actual interface out of the liquid. I was unsure that this method could work, and I knew it would take a while to work out the complexities of the hardware as well as the software.

This test was the first step in determining whether this approach would not only work, but also be viable within the time frame of the project. I wasn't sure that tonic water would even conduct the way plain water does, but I knew water had been used like this before, so I had hope it would work.

I was able to get the capacitive sensor to work well; however, due to time constraints I had to abandon the plan. I hope to do more testing, though, and perhaps incorporate the feature into a “version 2.0”.
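For reference, a basic single-electrode version of this kind of test looks like the sketch below. It assumes the widely used CapacitiveSensor Arduino library, with a high-value resistor between the send and receive pins and a wire electrode dipped into the liquid; the pin choices and resistor value are illustrative, not a record of my exact circuit:

#include <CapacitiveSensor.h>

// pin 4 sends, pin 2 receives, with a ~1 MΩ resistor between them and a
// wire electrode on the receive side dipped into the tonic water
CapacitiveSensor cs = CapacitiveSensor(4, 2);

void setup() {
  Serial.begin(9600);
}

void loop() {
  long reading = cs.capacitiveSensor(30);   // average of 30 samples
  Serial.println(reading);                  // values jump when a finger approaches
  delay(50);
}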

Incoming Doom

(My apologies for the late post everyone!!)

Project Description

The premise behind the project involved teleportation: a space traveller accidentally initiates reception of a rogue “teleport” signal and is horrified when it is revealed that a space ghost is coming through. The user experience is as follows. A user walks by the installation and an interesting visual appears in an enclosure of sorts (the teleport receiver pod). As the user passes, the touch screen lights up, notifying them of an incoming teleport and prompting them to touch the screen. If the user touches the screen, a sequence begins, detailing the incoming transmission and revealing that something non-human is coming through. At this point the teleport receiver begins revealing the monster as it lets out a harrowing cackle.

This project required various components to function as an interactive experience. Firstly, to initiate the entire interaction, an Arduino was used in combination with an IR depth sensor to detect a potential user passing by; when a user is detected, the Arduino announces it via serial. The Arduino also doubles as a controller for the ambient/environmental lighting, which allowed extra atmosphere to be added, for example flashing red when the teleport alarm goes off.

The remaining components were the projected light surface (aka the teleport receiver), driven by a Processing sketch running the incoming-ghost visuals and projected onto a translucent mesh surface from behind, and a second, separate Processing sketch running a pre-scripted on-screen dialogue accompanied by an audio readout of the events occurring. The latter is the primary method of interaction: once this application receives the serial data from the Arduino reporting that a user is nearby, it prompts the user to touch the large screen. Upon receiving a touch, the application begins a series of timed events and audio cues, during which it communicates with the application running the projected surface and starts the video of the ghost appearing.
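The timed-event portion can be pictured as a simple millis()-based schedule. The Processing sketch below is a hypothetical, stripped-down illustration; the event times and the comments are placeholders of mine, and the real sequence lives in the linked SpaceKookDesktopApp repository:

int sequenceStart = -1;                      // -1 means the sequence isn't running
int[] eventTimes = {0, 4000, 9000};          // ms offsets (placeholder values)
boolean[] fired = new boolean[3];

void mousePressed() {                        // stands in for the screen touch
  sequenceStart = millis();
  for (int i = 0; i < fired.length; i++) fired[i] = false;
}

void draw() {
  if (sequenceStart < 0) return;
  int t = millis() - sequenceStart;
  for (int i = 0; i < eventTimes.length; i++) {
    if (!fired[i] && t >= eventTimes[i]) {
      fired[i] = true;
      // e.g. i == 0: show the "incoming transmission" dialogue,
      //      i == 1: play the alarm audio and tell the Arduino to flash red,
      //      i == 2: signal the projector app to start the ghost video
    }
  }
}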

I hope this combination of on-screen dialogue, projected video, and choreographed lighting in the surrounding environment helped create an immersive experience.

Circuit Diagrams

Assembly List

Label   Part                        Properties
J1      Infrared Proximity Sensor
LED1    RGB LED (common cathode)    5 mm [THT]; pin order RGB
LED3    RGB LED (common cathode)    5 mm [THT]; pin order RGB
LED4    RGB LED (common cathode)    5 mm [THT]; pin order RGB
LED5    RGB LED (common cathode)    5 mm [THT]; pin order RGB
Part1   Arduino Uno (Rev3)
Q4      NPN transistor              TO92 [THT]; pinout EBC
Q5      NPN transistor              TO92 [THT]; pinout EBC
Q6      NPN transistor              TO92 [THT]; pinout EBC
R2      100Ω resistor               1206 [SMD]; ±5%
R3      100Ω resistor               1206 [SMD]; ±5%
R4      100Ω resistor               1206 [SMD]; ±5%
R5      100Ω resistor               1206 [SMD]; ±5%

Shopping List

Amount  Part                        Properties
1       Infrared Proximity Sensor
4       RGB LED (common cathode)    5 mm [THT]; pin order RGB
1       Arduino Uno (Rev3)
3       NPN transistor              TO92 [THT]; pinout EBC
4       100Ω resistor               1206 [SMD]; ±5%

Code

Arduino IR Distance and LED Code – https://github.com/chrols2014/IRDistanceforSpaceKook/blob/master/IRDistanceSensor.ino

Main Processing Application – https://github.com/chrols2014/SpaceKookDesktopApp

2nd Processing App, responsible for syncing video playback on the projector – https://github.com/chrols2014/SpaceKookVideo_Player

Sketches / Design Files

Photographs

Video

https://vimeo.com/108974484

Process

The assigned theme of a “Haunted Space Station” immediately brought up a memory of one of my favourite childhood episodes of Scooby-Doo. It involved a “Space Kook” that would appear out of nowhere and terrify people living in the area. I remember finding it hilarious that the show explained the glowing, levitating, enormous “ghost saucer” as merely a projection on the clouds overhead. This influenced my decision to take this primarily Arduino-based project and combine it with more screen-based and projected elements. I had the idea to project onto the mesh screen from the get-go, and planned on using it to achieve the correct height for the “space ghost”.

Unfortunately, it turned out to be much trickier than expected to drive two screens from a single Processing sketch. I spent some time trying various methods and eventually gave up, opting to create a workaround: I wrote two separate applications, one responsible for all the timing of the experience, and one that would remotely play a video once instructed to. This sounded daunting at the time, but I quickly found the reference for the server and client functions in Processing, and it was easier than expected. I simply used 127.0.0.1 instead of a public IP address to set up the server, and it worked.
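In rough outline, the two apps talk like this (a minimal sketch based on the processing.net Server and Client references below; the port number and the "PLAY" message are placeholders of mine, not the repo's actual protocol):

// --- Main (timing) app: runs the server on the loopback address ---
import processing.net.*;
Server server;

void setup() {
  server = new Server(this, 5204);    // placeholder port
}

void startGhostVideo() {
  server.write("PLAY\n");             // tell the video app to begin playback
}

// --- Video-player app: connects as a client on the same machine ---
import processing.net.*;
Client client;

void setup() {
  client = new Client(this, "127.0.0.1", 5204);
}

void draw() {
  if (client.available() > 0) {
    String msg = trim(client.readString());
    if (msg.equals("PLAY")) {
      // start the projected ghost video here
    }
  }
}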

The Arduino aspect required serial communication with the main Processing sketch. I decided to use an infrared depth sensor to detect whether a person was walking by; upon being triggered, the Arduino would send a simple serial message back to Processing. I also wanted to add a level of atmosphere with lighting, so, in addition to sensing for a user, the Arduino drove an array of RGB LEDs whose colour I could change by sending simple instructions from Processing. This part of the process went smoothly.
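A hedged reconstruction of that Arduino sketch (the real one is the linked IRDistanceSensor.ino; the pin numbers, detection threshold, and single-character colour commands here are my assumptions) might look like:

const int IR_PIN = A0;                                 // analog IR proximity sensor
const int RED_PIN = 9, GREEN_PIN = 10, BLUE_PIN = 11;  // PWM pins driving the transistor bases
const int THRESHOLD = 300;                             // assumed raw cutoff for "person nearby"

void setup() {
  Serial.begin(9600);
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

void loop() {
  if (analogRead(IR_PIN) > THRESHOLD) {
    Serial.println("DETECT");                          // announce the passer-by
  }
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    if (cmd == 'R') setColour(255, 0, 0);              // e.g. flash red for the alarm
    if (cmd == 'O') setColour(0, 0, 0);                // lights off
  }
  delay(100);
}

void setColour(int r, int g, int b) {
  analogWrite(RED_PIN, r);
  analogWrite(GREEN_PIN, g);
  analogWrite(BLUE_PIN, b);
}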

The multimedia aspect of the project turned out to be a lot more work than I expected. Exporting the text-to-speech files and synchronizing them with the on-screen dialogue was a tedious task. Furthermore, tying the projected video's timing to the triggers for lighting and sound made the choreography the trickiest thing to achieve, and I wish I had set aside more time to work on it, as it cost other areas of the project.

Challenges / Improvements

As mentioned above, the amount of time spent trying to perfect the choreography cost me seriously in the presentation aspect of the project. This was mostly a time-management issue, and could have been avoided if I had made a more realistic breakdown of the assets required for each component.

I did not expect to have to write two applications for my two displays instead of one. I had no idea that the multi-screen situation in Processing was so tricky; not accounting for it was another underestimate of the time required.

I wanted the final “teleportation chamber” to look better for the critique. Given the time frame and my proposed idea, I worked with what I had, but it could have been much better.

References

Scooby-Doo Space Kook source footage – http://youtu.be/swhAv9VcBWc (used without permission of the YouTube account holder and Hanna-Barbera).

Some handy Processing references:

Server Code – http://processing.org/reference/libraries/net/Server.html

Client Code – http://processing.org/reference/libraries/net/Client.html

Arduino code samples – referenced within the GitHub files.