“LightGarden” by Chris, Hart, Phuong & Tarik


A serene environment in which OCAD students are invited to unwind and get inspired by creating generative artwork with meditative motion control.

The final Creation & Computation project, a group assignment, required transforming a space into an “amusement park” using hardware, software, and XBee radio transceivers.

We immediately knew that we wanted to create an immersive visual and experiential project — an interactive space which would benefit OCAD students, inspire them, and help them to unwind and meditate.


The Experience

Image credit: Yushan Ji (Cynthia)

We chose to present our project at the Graduate Gallery, as the space has high ceilings, large unobstructed empty walls, and parquet flooring. We wanted participants to feel immersed in the experience from the moment they entered our space. A branded plinth with a bowl of pulsing, glowing rock pebbles greets participants and invites them to pick up one of the two “seed” controllers resting on the pebbles. These wireless, charcoal-coloured controllers, or “seeds”, have glowing LEDs inside them, and the colours of these lights attract the attention of participants from across the space.

Two short-throw projectors display the experience seamlessly across the corner where two walls meet. By moving the seeds around, participants can manipulate a brush cursor and draw on the display. The resulting drawing is generated by algorithms that create symmetric, mandala-like images. To enhance the kaleidoscope-like visual style, the projection is split between the two walls with the point of symmetry centered on the corner, creating an illusion of three-dimensional depth and enhancing immersion.

By tilting the seed controller up, down, left, and right, the participant shifts the position of their brush cursor. Holding the right button draws patterns based on the active brush. Clicking the left button changes the selected brush. Each brush is linked to a pre-determined colour, which is indicated by the LED on the seed as well as by the on-screen cursor. Holding down both buttons for 3 seconds resets that participant’s drawing without affecting the other user’s.

To complement and enhance the meditative drawing, ambient music fills the space and wind-chime sounds are generated as the player uses the brushes. Bean bags were also available, giving participants the option to experience LightGarden while standing or sitting.

The visual style of the projections was inspired by:

  • Mandalas
  • Kaleidoscopes
  • Zen Gardens
  • Fractal patterns
  • Light Painting
  • Natural symmetries (trees, flowers, butterflies, jellyfish)

Image credit: Yushan Ji (Cynthia)


Interactivity Relationships

LightGarden is an interactive piece that incorporates various relationships between:

  • Person to Object: exhibited in the interaction between the player and the “seed” controller.
  • Object to Person: visual feedback (the on-screen cursor responds predictably whenever the player tilts the seed or clicks a button, changing its location and appearance) and auditory feedback (the wind-chime sound fades in while the draw button is held) let users know that they are in control of the drawing.
  • Object to Object: our initial plan was to use the Received Signal Strength Indicator (RSSI) to illustrate the relationship between the controller and the anchor (e.g. the shorter the distance between the anchor and the “seed” controller, the faster the pulsing light on the anchor).
  • Person to Person: since there are two “seed” controllers, two players can use them to collaboratively produce a generative artwork with different brushes and colours.

The Setup

  • 2 short-throw projectors
  • 2 controllers, each with an Arduino Fio, an XBee, an accelerometer, two momentary switches, an RGB LED, and a lithium polymer battery
  • 1 anchor point with an Arduino Uno, a central receiver XBee, and an RGB LED
  • 1 Arduino Uno with a NeoPixel strip

System Diagram

[Image: system diagram]

The Software

Processing was used to receive user input and generate the brush effects. Two kinds of symmetry were modeled in the program: bilateral symmetry across the x-axis, and radial symmetry ranging from two to nine points. In addition to using different colors and drawing methods, each brush uses different kinds of symmetry to ensure that each one feels significantly different.
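As a rough illustration of how both symmetries can be modeled in Processing (a minimal sketch of the idea, not the project’s actual code), each stroke sample can be replayed once per radial point and mirrored across the x-axis:

// Minimal symmetry sketch: every stroke sample is replayed once per
// radial point, and mirrored across the x-axis when bilateral is on.
int radialPoints = 6;     // the brushes used two to nine points
boolean bilateral = true;

void setup() {
  size(800, 800, P2D);
  background(0);
  stroke(0, 255, 255, 120);
  strokeWeight(3);
}

void draw() {
  if (!mousePressed) return;
  // work in coordinates relative to the centre of the screen
  float x = mouseX - width / 2.0;
  float y = mouseY - height / 2.0;
  translate(width / 2.0, height / 2.0);
  for (int i = 0; i < radialPoints; i++) {
    pushMatrix();
    rotate(TWO_PI * i / radialPoints);
    point(x, y);
    if (bilateral) point(x, -y);  // mirror across the x-axis
    popMatrix();
  }
}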

Each controller is assigned two brushes, which it can switch between with the toggle button. A class was written for the brushes that kept track of its own drawing and overlay layers and handled all of the necessary symmetry generation. Each implemented brush extended that class and overrode the default drawing method. An extension of the default brush class also allowed for smoothing, which was used by the sand brush.
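The hierarchy might look roughly like this (a hedged sketch; the class and method names here are ours, not the project’s):

// Sketch of the brush hierarchy: a base class owns a layer and the
// symmetry replay; concrete brushes only define how a mark looks.
abstract class Brush {
  PGraphics layer;          // each brush composites into its own layer
  int radialPoints;
  boolean bilateral;

  Brush(int radialPoints, boolean bilateral) {
    // construct brushes in setup(), after size() has run
    this.layer = createGraphics(width, height, P2D);
    this.radialPoints = radialPoints;
    this.bilateral = bilateral;
  }

  // replay one stroke sample under every symmetry transform
  void addStroke(float x, float y) {
    layer.beginDraw();
    layer.translate(layer.width / 2.0, layer.height / 2.0);
    for (int i = 0; i < radialPoints; i++) {
      layer.pushMatrix();
      layer.rotate(TWO_PI * i / radialPoints);
      drawMark(layer, x, y);
      if (bilateral) drawMark(layer, x, -y);
      layer.popMatrix();
    }
    layer.endDraw();
  }

  // each concrete brush overrides this to define its look
  abstract void drawMark(PGraphics g, float x, float y);
}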

One major downside, discovered late in development, was that the P2D rendering engine won’t actually make use of the graphics card unless drawing is done in the main draw loop of the Processing sketch. Most graphics work in the sketch is first rendered off-screen, then manipulated and combined to create the final layer, so the graphics card was not utilized as effectively as it could have been.
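Structurally, the sketch followed the usual off-screen pattern, roughly like this simplified outline:

PGraphics layerA, layerB;   // in practice, one layer per brush

void setup() {
  size(1280, 720, P2D);
  layerA = createGraphics(width, height, P2D);
  layerB = createGraphics(width, height, P2D);
}

void draw() {
  background(0);
  // the brushes render into their layers off-screen; only this
  // compositing step happens in the main draw loop
  image(layerA, 0, 0);
  image(layerB, 0, 0);
}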

Here is a listing of the four brushes implemented for the demonstration:


1. Ripple Brush

This brush uses a cyan color, and fades to white as it approaches the center of the screen. It uses both bilateral symmetry and a sixfold radial symmetry, which makes it well suited for flower-like patterns. It draws a radial burst of dots around each of the twelve cursor positions (six radial points reflected across the x-axis), and continually shifts their position to orient them toward the center of the screen. With smoothing effects applied, this creates multiple overlapping lines which interweave to create complex patterns.

2. Converge Brush

This brush uses a dark indigo color, and draws lines which converge toward the center of the drawing. It has bilateral symmetry and eightfold radial symmetry. As the lines approach the edge of the screen, a noise effect is applied to them, creating a textured effect. Because all lines converge, it creates a feeling of motion, and draws the viewer toward the center of the image.

3. Sand Brush

This brush uses a vibrant turquoise color and, like the ripple brush, fades to white as it nears the center of the image. It draws a number of particles around the brush position; the size, number, and spread of these particles increase as the brush approaches the outer edge, creating a scatter effect. This brush uses sevenfold radial symmetry but does not have bilateral symmetry applied, which allows it to draw spiral patterns that the other brushes cannot make.
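A concrete brush in the hierarchy sketched earlier might implement its mark roughly like this (the particle counts, sizes, and spreads are illustrative guesses, not the project’s values):

// Sand-brush sketch: particle count, size, and spread all grow with
// distance from the centre; colour fades to white near the middle.
class SandBrush extends Brush {
  SandBrush() {
    super(7, false);  // sevenfold radial symmetry, no bilateral mirror
  }

  void drawMark(PGraphics g, float x, float y) {
    float r = dist(0, 0, x, y);                        // distance from centre
    float edge = constrain(map(r, 0, g.width / 2.0, 0, 1), 0, 1);
    int count = int(lerp(2, 12, edge));                // more particles at the edge
    float spread = lerp(2, 30, edge);                  // wider scatter at the edge
    g.noStroke();
    g.fill(lerpColor(color(255), color(64, 224, 208), edge));  // white at centre
    for (int i = 0; i < count; i++) {
      float size = lerp(1, 4, edge);
      g.ellipse(x + random(-spread, spread), y + random(-spread, spread), size, size);
    }
  }
}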

4. Silk Brush

This brush uses a purple color and has the most complex drawing algorithm of the four. It generates nine quadratic curves from the position where the stroke started to the current brush position. The effect is like strands of thread pulled up from a canvas. The brush has bilateral symmetry but only threefold radial symmetry so that the pattern is not overwhelming. Because it creates such complex designs, it is well suited for creating subtle backgrounds behind the other brushes.
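One way to sketch that curve generation (again illustrative; the control-point jitter and colour values are our assumptions):

// Silk-brush sketch: nine quadratic curves from the stroke origin to
// the current cursor, each with a jittered control point.
class SilkBrush extends Brush {
  float startX, startY;   // set when a stroke begins

  SilkBrush() {
    super(3, true);  // threefold radial symmetry plus bilateral mirror
  }

  void drawMark(PGraphics g, float x, float y) {
    g.noFill();
    g.stroke(128, 64, 192, 48);  // translucent purple strands
    for (int i = 0; i < 9; i++) {
      // jitter the control point so the nine strands fan out like threads
      float cx = lerp(startX, x, 0.5) + random(-40, 40);
      float cy = lerp(startY, y, 0.5) + random(-40, 40);
      g.beginShape();
      g.vertex(startX, startY);            // anchor at the stroke origin
      g.quadraticVertex(cx, cy, x, y);     // curve out to the cursor
      g.endShape();
    }
  }
}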


The Controllers and Receiver

Image credit: Yushan Ji (Cynthia) and Tarik El-Khateeb

Seed Controller – Physical Design

When considering the design and functionality of our controllers, we started with a couple of goals determined by very real limitations of our intended hardware, most notably the XBee transceiver and the 3-axis accelerometer. We knew we needed accelerometer data for our visuals, and in order to have reliable, consistent data, the base orientation of the accelerometer needed to be fairly standardized. Furthermore, the XBee transceiver’s signal strength drops severely when line of sight is blocked, whether by hands or other physical objects. Taking this into consideration, we designed a controller that would suggest the correct way of being held. The single affordance we used to do this was an RGB LED that illuminates and signifies what we intended to be the “front” of the controller.

Image credit: Tarik El-Khateeb and Phuong Vu.

Initially we hoped to create a 3D-printed, custom-shaped controller (by adapting a ready-made 3D model from thingiverse.com); however, after some experimentation and prototyping, we quickly came to the conclusion that it was not the right solution given the time constraints of the project. In the end, we decided to go with found objects that we could customize to suit our needs. A plastic soap dish became the unlikely candidate, and after some modifications we found it to be perfect for our requirements.

To further suggest controller orientation, we installed two momentary push-buttons that would act as familiar prompts for how to hold it. This would prevent the user from aiming the controller with just one hand. These buttons also engaged the drawing functions of the software and allowed for customization of the visuals.

The interaction model was as follows:

  1. Right Button Pressed/Held Down – draw pixels
  2. Left Button Pressed Momentarily – change draw mode
  3. Left and Right Buttons Held Down Simultaneously for 1.5 Seconds – clear that user’s canvas

Image credit: Tarik El-Khateeb

Seed Controller – Electronics

We decided early on to use the XBee transceivers as our wireless medium to enable cordless control of our graphical system. A natural fit when working with XBees is the Arduino Fio, a lightweight 3.3 V microcontroller board that would fit into our enclosures. Using an Arduino meant that we could add an accelerometer, an RGB LED, two buttons, and the XBee without worrying about a shortage of I/O pins, as is the case when using an XBee alone. By programming the Fio to poll the momentary buttons, we could account for the duration of each button press. This allowed some basic on-device processing of data before sending instructions over wireless, helping reduce unnecessary transmission. Certain commands like “clear” and “change mode” were handled by the controllers themselves, significantly increasing the reliability of these functions.
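A trimmed-down Arduino sketch of that polling logic might look like this (pin numbers, baud rate, and the “clear” packet format are assumptions; the full firmware would also read the accelerometer and transmit the state packet described below):

// Illustrative Fio firmware fragment: time the two-button press on the
// device itself, so "clear" is sent as a single reliable instruction.
const int DRAW_PIN = 2;             // right button: draw (assumed wiring)
const int MODE_PIN = 3;             // left button: change brush
unsigned long bothHeldSince = 0;    // for the long-press "clear" gesture

void setup() {
  pinMode(DRAW_PIN, INPUT_PULLUP);  // buttons pull the pins low when pressed
  pinMode(MODE_PIN, INPUT_PULLUP);
  Serial.begin(57600);              // the Fio's hardware serial feeds the XBee
}

void loop() {
  bool drawing = digitalRead(DRAW_PIN) == LOW;
  bool mode    = digitalRead(MODE_PIN) == LOW;

  if (drawing && mode) {
    if (bothHeldSince == 0) bothHeldSince = millis();
    if (millis() - bothHeldSince >= 1500) {
      Serial.println("<0,clear>"); // hypothetical on-device clear packet
      bothHeldSince = 0;
    }
  } else {
    bothHeldSince = 0;
  }

  delay(20);  // poll at roughly 50 Hz
}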

In the initial period of development, we had hoped to use the XBee-Arduino API; certain features seemed very appealing to us, but as experimentation began it became clear that, even with an API, there were still several low-level details that significantly complicated the learning process and overall interfered with our development. We made a strategic decision to cut our losses with the API and instead use the more straightforward, yet significantly less reliable, method of broadcasting serial data directly and parsing it on the other end in Processing, after the wireless receiver relays it. Here is an example of the data being transmitted by each controller:

<0,1,1,-1.24,0.56>

<controllerID,colorModeValue,isDrawingValue,X-axis,Y-axis>
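On the Processing side, a packet in that format can be parsed along these lines (a minimal sketch; the port index and baud rate are assumptions):

import processing.serial.*;

Serial port;

void setup() {
  port = new Serial(this, Serial.list()[0], 57600);
  port.bufferUntil('>');   // fire serialEvent() once per complete packet
}

void draw() { }  // keep the sketch alive so serialEvent() can fire

void serialEvent(Serial p) {
  String packet = p.readString();          // e.g. "<0,1,1,-1.24,0.56>"
  int open = packet.indexOf('<');
  int close = packet.indexOf('>');
  if (open < 0 || close < open) return;    // ignore malformed data
  String[] parts = split(packet.substring(open + 1, close), ',');
  if (parts.length != 5) return;
  int controllerID   = int(parts[0]);
  int colorModeValue = int(parts[1]);
  boolean isDrawing  = int(parts[2]) == 1;
  float tiltX        = float(parts[3]);
  float tiltY        = float(parts[4]);
  // ... update the matching player's cursor and brush from here
}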

 

Circuit diagrams for the Seed Controllers.

Wireless Receiver

In order to receive the wireless commands from both of our controllers, we decided to create an illuminated receiver unit. The unit comprises an Arduino Uno, an RGB LED, and an XBee; it acts as a simple relay, forwarding the serial data received via the XBee to the USB port of the computer for the Processing sketch to parse. We used the SoftwareSerial library to emulate a second serial port on the Uno so we could transmit the data as fast as it was being received. In terms of design, instead of hiding the device we decided to feature it prominently in the view of the user. A pulsing white LED indicates that it serves a functional purpose, and our hope was for it to remind users that wireless transmission is occurring, something we take for granted nowadays.
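The relay logic itself is only a few lines (pin choices and baud rates here are assumptions):

// Minimal relay sketch: every byte from the XBee is forwarded to USB.
#include <SoftwareSerial.h>

SoftwareSerial xbee(2, 3);     // RX, TX pins wired to the XBee (assumed)

void setup() {
  Serial.begin(57600);         // hardware serial: USB link to Processing
  xbee.begin(57600);           // emulated serial port: link to the XBee
}

void loop() {
  while (xbee.available()) {
    Serial.write(xbee.read()); // forward each byte as soon as it arrives
  }
}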

Circuit diagrams for the Wireless Receiver.


Design

Branding strategy:

The LightGarden logo is a mix of two fonts from the same typeface family: SangBleu serif and sans serif. The intentional mix of serif and sans serif fonts is a reference to the mix and variety of effects, colours and brushes that are featured in the projections.

The icon consists of outlines of four seeds in motion, symbolizing the four cardinal directions, the four members of the group, and the four main colours used in the visual projection.


Image credit: Tarik El-Khateeb

Colour strategy:

Purple, Turquoise, Cyan and Indigo are the colours chosen for the brushes. The rationale behind using cold colours instead of warm colours is that cold hues have a calming effect, as they are visual triggers associated with water and the sky.

Purple reflects imagination, Turquoise is related to health and well-being, Cyan represents peace and tranquility and Indigo stimulates productivity.


Sound

Sound plays a major role in our project. It is an indispensable element; without it the experience cannot be whole. Because the main theme of our project is to create a meditative environment, it was important to choose a type of sound that is itself meditative: enhancing the visual experience rather than distracting players from it. We needed to find a sound that was organic, could be looped, and yet would not become boring to participants over time.

In order to fulfill all of the aforementioned requirements, we decided to go with ambient music, an atmospheric, mood-inducing musical genre. The song “Hibernation” (Sync24, 2005) by Sync24 was selected as the background music. Using Adobe Audition (Adobe, 2014), we cut out the intro and outro of the song and beatmatched the ending and the beginning of the edited track so that it can be looped seamlessly.

Image credit: Screen captures from Adobe Audition

Sound was also used as a means of giving auditory feedback to the user of our “seed” controller: whenever a player clicks the draw button, a sound is played to signal that the action of drawing is being carried out. For this purpose we employed the sound of wind chimes, known for inducing an atmospheric sensation, as used in Ambient Mixer (Ambient Mixer, 2014). In our application, the ambient song plays repeatedly in the background, whereas the wind-chime sound fades in and out every time the player presses and releases the draw button, allowing the chimes to fuse organically into the ambient music. To do this we used Beads, a Processing library for handling real-time audio (Beads project, 2014). Beads provides features for playing audio files and for generating timed transitions of an audio signal, i.e., sequences of changes in its amplitude. When the draw button is clicked, the amplitude of the wind-chime signal ramps up; conversely, when the draw button is released, the amplitude ramps back down.
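In Beads this maps naturally onto a Gain whose level is driven by an Envelope. A minimal sketch of the idea (file names, baud-free wiring to the button logic, and ramp times are assumptions):

import beads.*;

AudioContext ac;
Envelope chimeEnv;   // drives the wind-chime amplitude

// helper: a sample player that loops its whole file
SamplePlayer loopingPlayer(String file) {
  SamplePlayer p = new SamplePlayer(ac, SampleManager.sample(dataPath(file)));
  p.setLoopType(SamplePlayer.LoopType.LOOP_FORWARDS);
  p.setLoopStart(new Static(ac, 0.0));
  p.setLoopEnd(new Static(ac, (float) p.getSample().getLength()));
  return p;
}

void setup() {
  ac = new AudioContext();

  // looping background ambience, always audible
  ac.out.addInput(loopingPlayer("hibernation_loop.wav"));

  // wind chimes behind an envelope-controlled gain, starting silent
  chimeEnv = new Envelope(ac, 0.0);
  Gain chimeGain = new Gain(ac, 2, chimeEnv);
  chimeGain.addInput(loopingPlayer("wind_chimes.wav"));
  ac.out.addInput(chimeGain);

  ac.start();
}

void draw() { }

// call when the draw button is pressed (true) or released (false)
void setChimes(boolean on) {
  chimeEnv.clear();                          // drop any queued segments
  chimeEnv.addSegment(on ? 1.0 : 0.0, 800);  // ramp amplitude over 800 ms
}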


Code

https://github.com/phuongvu/LightGarden


Case Studies

One: Pirates of the Caribbean: Battle for Buccaneer Gold

Pirates of the Caribbean: Battle for Buccaneer Gold is a virtual reality ride and interactive game at DisneyQuest, an “indoor interactive theme park” located in Downtown Disney at the Walt Disney World Resort in Florida (Wikipedia, 2014).

The attraction is five minutes long and follows a linear storyline in which Jolly Roger the Ghost Pirate appears on screen and tells the participants that their pirate ship is seeking treasure, and that they can win this treasure by sinking other ships and taking their loot. The ship sails through different islands and comes across many ships to battle. Four and a half minutes into the ride, the ghost pirate re-appears and informs the players that they must battle him and his army of skeletons in order to keep any treasure they have won by battling the ships. Once all the ghosts and skeletons have been defeated, the final score appears on the screen.

The attraction can be experienced by up to five participants. One individual steers the pirate ship using the realistic helm, navigating a detail-rich 3D computer-generated virtual ocean with islands and ships. Up to four players control cannons to destroy other ships. The cannons use wireless technology to “shoot” virtual cannonballs on the screen.

The attraction uses wrap-around 3D screens, 3D surround sound, and a motion-platform ship that fully engages the participants and makes them feel like real pirates on a real ship (Shochet, 2001).

https://www.youtube.com/watch?v=5cgtnXMbJcw

 

Two: Universal Studios Transformers: The Ride 3D

Transformers: The Ride 3D (Universal Studios, 2011) is a 3D indoor amusement ride situated in Universal Studios Hollywood, Universal Studios Florida, and Universal Studios Singapore. The ride is an exemplary case study of how a thrill ride, when combined with visual, auditory, and physical simulation technologies, can create an experience so immersive that it blurs the line between fiction and reality.

The setup of this attraction consists of a vehicle mounted on a motion platform that runs along a 610-metre track. Each vehicle can carry up to 12 riders who, throughout the ride, are exposed to different kinds of effects: motion, wind (including hot air and air blasts), water spray, fog, vibration, and 18-metre-high 3D projections showing various Transformers characters (Wikipedia, 2014). Along the ride, participants have the chance to “fight along side with Optimus and protect the AllSpark from Decepticons over four stories tall” (Universal Studios, 2011).

 

Three: Nintendo Amiibo

The Nintendo Amiibo platform is a combination of gaming consoles and physical artifacts that take the form of well-known Nintendo characters in figurine form (Wikipedia, 2014). The platform is one of many following the same trend: small NFC (near-field communication) equipped devices that, when paired with a console, add features to that console or game. NFC is a technology built on RFID (Radio Frequency Identification), and most smartphones are now equipped with it (AmiiboToys, 2014).

The Amiibos have a small memory capacity (only 1Kb-4Kb) and allow certain games to store data on the figurine itself (AmiiboToys, 2014). One example of this is the newly released Super Smash Bros. game for the Wii U: the figurines “contain” NPCs (non-playable characters) that match the appearance of the character. These characters improve their abilities based on the owner’s playing habits, and apparently become quite hard to beat! (IGN, 2014)

The interesting aspect of the Amiibo line and others like it, is the interaction between the digital representation of the character and the physical figurine itself. By using NFC, the experience seems almost magical, something that a physical connection would most likely ruin. There is a relationship between the player and the object but also between the player and the on-screen character, especially when said character is aggravating the player because its skills are improving. The transparency of the technology helps dissolve the boundaries between the physical object and the fully animated character.

 

Four: Disney MagicBand

The fourth case study does not focus on an attraction in an amusement park but rather on a new one-billion-dollar wearable technology that has been introduced in the Walt Disney parks: the MagicBand (Wikipedia, 2014).

The MagicBand is a waterproof plastic wristband that contains a short-range RFID chip as well as Bluetooth technology. The bands come in adult and child sizes and store information. The wearer can use them as a hotel room key, park ticket, special fast-pass ticket, and photo pass, as well as a payment method for food, beverages, and merchandise (Ada, 2014).

The MagicBand also contains a 2.4 GHz transmitter for longer-range wireless communication, which lets the parks track the band’s location and link on-ride photos and videos to guests’ photo-pass accounts.

Thomas Staggs, Chairman of Walt Disney Theme Parks and Resorts, says that the band might in the future enable characters inside the park to address kids by name. “The more that their visit can seem personalized, the better. If, by virtue of the MagicBand, the princess knows the kid’s name is Suzy… the experience becomes more personalized,” says Staggs (Panzarino, 2013).

 


References & Project Context

3D printing:

http://www.thingiverse.com/thing:376028

 

Sounds:

Adobe. (2014). Adobe Audition. Retrieved from https://creative.adobe.com/products/audition

Ambient Mixer. (2014). Wind chimes wide stereo. Retrieved from http://ambient-music.ambient-mixer.com/ambient-sleeper

Beads project. (2014). Beads library. Retrieved from http://www.beadsproject.net/

Sync24. (2005). “Hibernation.” Chillogram. Retrieved from http://www.last.fm/music/Sync24/_/Hibernation

 

Case Study 1:

Disney Quest – Explore Zone. (n.d.). Retrieved from http://www.wdwinfo.com/downtown/disneyquest/dquestexplore.htm

Shochet, J., & Banker, T. (2001). GDC 2001: Interactive Theme Park Rides. Retrieved from http://www.gamasutra.com/view/feature/131469/gdc_2001_interactive_theme_park_.php

Wikipedia. (2014). DisneyQuest. Retrieved from http://en.wikipedia.org/wiki/DisneyQuest

 

Case Study 2:

Inside the Magic. (2012). Transformers: The Ride 3D ride & queue experience at Universal Studios Hollywood. Retrieved from https://www.youtube.com/watch?v=ARJHpBgu1vM

Universal Studios. (2011). Transformers: The Ride 3D. Retrieved from http://www.universalstudioshollywood.com/attractions/transformers-the-ride-3d/

Wikipedia. (2014). Transformers: The Ride. Retrieved from https://en.wikipedia.org/wiki/Transformers:_The_Ride

 

Case Study 3:

AmiiboToys. (2014). Inside Amiibo: A technical look at Nintendo’s new figures. Retrieved from http://www.amiibotoys.com/2014/10/17/inside-amiibo-technical-look-nintendos-new-figures/

IGN. (2014). E3 2014: Nintendo’s Amiibo Toy Project Revealed. Retrieved from http://ca.ign.com/articles/2014/06/10/e3-2014-nintendos-amiibo-toy-project-revealed?abthid=53972d736a447c7843000006

Wikipedia. (2014). Amiibo. Retrieved from http://en.wikipedia.org/wiki/Amiibo

 

Case Study 4:

Ada. (2014). Making the Band – MagicBand Teardown and More. Retrieved from http://atdisneyagain.com/2014/01/27/making-the-band-magicband-teardown-and-more/

Panzarino, M. (2013). Disney gets into wearable tech with the MagicBand. Retrieved from http://thenextweb.com/insider/2013/05/29/disney-goes-into-wearable-tech-with-the-magic-band/

Wikipedia. (2014). MyMagic+. Retrieved from http://en.wikipedia.org/wiki/MyMagic%2B#MagicBand

 

Project Inspirations / Context:

http://www.nearfield.org/2011/02/wifi-light-painting

http://artacademy.nintendo.com/sketchpad/

http://www.procreate.si/

http://digital-photography-school.com/light-painting-part-one-the-photography/

http://www.wired.com/2011/01/bioluminescent-sea-creatures/