Experiment 1 – Camp Fire

Project Title: Campfire
Team Members: Nilam Sari (No. 3180775) and Lilian Leung (No. 3180767)

Project Description:

Our experiment explored how we could create a multi-screen experience that speaks to the value of ‘unplugging’ and having a conscious, present discussion with our classmates, using the campfire as its central metaphor.

From the beginning, we were both interested in creating an experience that would bring people together for a short digital detox and deeper face-to-face conversation. We wanted to play along with the current trend of digital minimalism and the hygge lifestyle, with its focus on simpler living and deeper relationships.


While our project would only provide about a ten-minute reprieve from our connected lives, we wanted to draw attention to the fact that, even in a digitally-led program, face-to-face conversation and interaction are just as important for improving our ability to empathize with one another.

Visual inspiration came from the aesthetic of the campfire as well as the abstract shapes used in many meditation and mental health apps, such as Pause, developed by ustwo.


ustwo, Pause: Interactive Meditation (2015)

How it Works:

The sketch is laid out with three main components: the red and orange gradient background, the fire crackling audio, and the transparent visuals of fire. 

On load, the .mp3 audio plays while the red and orange gradient background slowly fades in. The looped audio file’s volume depends on mic input, so the more the group talks, the louder the crackling becomes. The fire graphics at the bottom also resize with the mic input volume, creating a flickering effect similar to a real campfire.
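A minimal sketch of this mechanism in p5.js with p5.sound is below; the file names, thresholds, and mapping ranges are illustrative, not our exact values.

```javascript
// Minimal sketch of the mic-driven fire (p5.js + p5.sound).
// File names and mapping ranges are illustrative.
let crackle, mic, fireImg;

function preload() {
  crackle = loadSound('crackling.mp3');      // looping fire audio
  fireImg = loadImage('fire-gradient.png');  // transparent fire graphic
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  mic = new p5.AudioIn();
  mic.start();
  crackle.loop();
}

function draw() {
  background(20, 0, 0);
  const level = mic.getLevel();  // 0.0 – 1.0
  // Louder conversation -> louder crackle and taller flames.
  crackle.setVolume(map(level, 0, 0.3, 0.2, 1, true));
  const flameH = map(level, 0, 0.3, height * 0.2, height * 0.6, true);
  image(fireImg, 0, height - flameH, width, flameH);
}
```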

To lower the volume and fade the fire, users can shake their devices: acceleration along the x-axis lowers the volume and reduces the tint of the images to 0. The motion recreates shaking out a lit match.

Development of the Experiment

September 16, 2019

Our initial thought for visually representing the campfire was to recreate an actual fire. However, since we intended to have all the phones laid flat on a surface, we realized that fire is normally seen vertically while the phones would lie horizontally, so we went with a more abstract approach using gradients instead.

The colours were taken from the natural palette of fire, though we also explored a sense of contrast through the gradients.


Righini, E. (2017) Gradient Studies

Originally we tried building the gradient in RGB, but while digging into controlling the gradient and switching its values, Lilian wasn’t yet comfortable managing multiple values once they needed to change based on audio input level.

Instead, we began developing a set of gradients as transparent .png files. This gave us more control over how they looked and made the gradients more dynamic and easier to manipulate.


Initial testing of the .png gradients worked as a proof of concept: we managed to get the gradient image to grow using the mic AudioIn level.

While Lilian was working on the gradients of the fire, Nilam was figuring out how to add the microphone input and make the gradient correspond to the volume of the mic input. One of her solutions was to use mapping.

The louder the input volume, the higher the red value gets and the redder the screen becomes. This also let us change the background to a raster image: instead of lowering the RGB values to 0 to create black, we lowered the top image’s opacity to 0 to reveal the darker gradient image behind it.
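Assuming the mic from the sketch above and a hypothetical brightGradient image, the mapping idea looks roughly like this inside draw(); the ranges are placeholders:

```javascript
function draw() {
  const level = mic.getLevel(); // 0.0 – 1.0 from p5.AudioIn

  // RGB version: louder input -> higher red value -> redder screen.
  const r = map(level, 0, 0.3, 60, 255, true);
  background(r, 30, 0);

  // Raster version: in silence the bright gradient's opacity drops
  // toward 0, revealing the darker gradient image behind it instead
  // of fading the RGB values to black.
  const alpha = map(level, 0, 0.3, 0, 255, true);
  tint(255, alpha);
  image(brightGradient, 0, 0, width, height);
}
```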

Nilam then edited Lilian’s version of the experiment, integrating the microphone input and mapping into the interface that had already been developed.

September 19, 2019

Our Challenges

We were still trying to figure out why mic input and audio output worked on our laptops but not on our phones. Translating mic input into a larger fire also seemed laggy, even after attempting to save our images at lower resolutions.

On our mobile devices, the deviceShaken function seemed to work: it was laggy on Firefox, while running the sketch in Chrome gave better, more responsive results. Another issue was that once we started changing the tint transition in our sketch, deviceShaken would sometimes stop working entirely.

We wanted a smoother, less abrupt transition from the microphone input, so we looked for something like a delay function. We couldn’t find anything suitable, so we decided to try if statements instead of mapping.
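We don’t have the exact code from this stage, but the if-statement approach presumably looked something like the following, nudging the value in small steps each frame instead of jumping with the raw mapped level; the thresholds and step sizes here are guesses:

```javascript
// Hypothetical version of the if-statement approach: step the fire's
// size gradually toward its extremes rather than snapping to the raw
// mapped value every frame.
let fireSize = 100;

function updateFireSize(level) {
  if (level > 0.1 && fireSize < 400) {
    fireSize += 2;   // grow slowly while there is conversation
  } else if (level <= 0.1 && fireSize > 100) {
    fireSize -= 1;   // shrink even more slowly during silence
  }
}
```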

From our Google searches we learned of a possible bug that stopped certain p5.js functions, like deviceShaken, from working after the iOS update that summer: while laggy, the function still worked on Lilian’s Android Pixel 3, but it never worked at all on Nilam’s iPhone.

Audio Output

          Nilam (iPhone 6)   Lilian (Pixel 3)
Chrome    No                 Yes
Firefox   No                 Yes
Safari    No                 N/A


deviceShaken Function

          Nilam (iPhone 6)   Lilian (Pixel 3)
Chrome    No                 Yes
Firefox   No                 No
Safari    No                 N/A


Additionally, Lilian started on further explorations, such as mobile rotation and acceleration, to finesse the functionality of the experiment. She also began exploring how we could incorporate noise values to recreate organic movement, inspired by these examples using Perlin noise.


To add the new noise graphic, we used the createGraphics() and clear() functions to create an invisible canvas on top of the gradient, where bezier curves leave trails that look like a flame. The canvas clears itself and repeats the process every 600 frames to reduce performance problems in the sketch.
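A condensed sketch of the buffer set-up is below; the curve parameters and colours are illustrative, not our exact values:

```javascript
let trails; // offscreen buffer layered over the gradient
let t = 0;  // Perlin noise offset

function setup() {
  createCanvas(windowWidth, windowHeight);
  trails = createGraphics(width, height);
}

function draw() {
  // ...gradient background drawn here first...
  if (frameCount % 600 === 0) trails.clear(); // wipe to avoid buildup/lag
  trails.noFill();
  trails.stroke(255, 150, 0, 40);
  const x = width / 2 + (noise(t) - 0.5) * width * 0.4; // noise wander
  trails.bezier(x, height,
                x + (noise(t + 10) - 0.5) * 100, height * 0.7,
                x + (noise(t + 20) - 0.5) * 100, height * 0.5,
                x, height * 0.35);
  t += 0.01;
  image(trails, 0, 0); // composite the flame trails over the gradient
}
```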

September 21, 2019

After reviewing our code, we realized some of our audio issues were caused by Chrome’s privacy restriction that disables auto-playing audio. Our mic problem, meanwhile, was because we had placed the level-reading code inside function setup(), where it ran only once; once we moved it into function draw(), the audio worked much better.
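In code, the two fixes look roughly like this (crackle and mic are the variables from the earlier sketch; userStartAudio() is p5.sound’s helper for unlocking audio after a user gesture):

```javascript
function touchStarted() {
  userStartAudio();                         // satisfy Chrome's autoplay policy
  if (!crackle.isPlaying()) crackle.loop(); // start audio on first tap
}

function draw() {
  // getLevel() must be polled every frame here, not once in setup().
  const level = mic.getLevel();
  // ...use level for the fire size and crackle volume...
}
```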

September 23, 2019 – The Phone Stand


After getting feedback on our prototype, we started creating a stand to hold everyone’s phones during the presentation. We laid out two rows of stands, the outer circle holding 12 phones and the inner circle holding 8, as we explored how to better recreate the ‘fire’ in our multi-screen presentation.


We started by sketching the layout for the phone stand, sizing the slots to the widest phone in our class. We then went to the Maker Lab, drilled into the circular foam, and chiselled out the middle sections to create indents the phones could sit in.


The next step was to apply a finish to the foam, for which we used black matte spray paint. The foam deteriorated a little from the aerosol in the spray paint, which we had foreseen, but after a test coat it didn’t seem to damage the structure much, so we decided to proceed. However, the paint did have an impact on the foam around the slots: it melted the foam and wouldn’t dry. For future reference we could have applied gesso before the spray paint, but this time we had to improvise, so we used paper towels.

September 26, 2019 – deviceShaken to accelerationX


After finding that the mobile deviceShaken event wasn’t working reliably, Lilian created a new sketch testing opacity and audio level with accelerationX as the new variable. The goal was to have changes in acceleration lower the audio volume and fade out the images. accelerationX gave more consistent results and was added into the main Experiment 1 sketch.
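A minimal version of that test is sketched below, reusing the crackle and fireImg variables from the earlier sketch; the shake threshold and fade step are illustrative:

```javascript
let fade = 255; // shared by the image tint and the audio volume

function draw() {
  background(20, 0, 0);
  if (abs(accelerationX) > 20) {    // strong sideways shake
    fade = max(0, fade - 10);       // dim the fire...
    crackle.setVolume(fade / 255);  // ...and the crackle with it
  }
  tint(255, fade);
  image(fireImg, 0, height * 0.4, width, height * 0.6);
}
```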

User Flow

This experiment is primarily conversation-led. To set up, facilitators create a wide open space for everyone to sit and dim or turn off the lights to recreate a night scene. Users are then asked to load the p5 experiment and join together in a circular formation in the room.

Users should allow the fire animation to load and place their phones into the circular phone stand; the joined phones together recreate the digital campfire. Facilitators can then speak to the importance of coming together and of face-to-face conversation.

The session can run as long as needed. When the session is finished, users can shake their phones to dim the fire and lower the volume of the fire crackling audio.


The Presentation


Photo Credit: Nick Puckett


The code for our p5 experiment can be viewed on GitHub:
https://github.com/nilampwns/experiment1

Project Context

The project’s concept came from our mutual interest in creating a multi-screen experience that would bring classmates together in an exercise, rather than being just a demonstration of p5 events. After brainstorming a couple of ideas within our limited personal experience with programming, we arrived at the idea of ‘unplugging’ and giving our full attention to the people around us without the distraction of devices, except that here the experience is facilitated by screens.

We wanted the experience to be about ‘unplugging’, recognizing (even within a digital media program) that time away from screens is just as beneficial and an opportunity for self-reflection. While technology allows us to extend ourselves into virtual space, it also has many consequences for our real-life relationships and physical presence.

In the Fast Company article What Really Happens to Your Brain and Body During a Digital Detox (2015), experts explain that our constant connection to digital devices alters our ability to empathize, read each other’s emotions, and maintain eye contact in real-life interactions.

After our presentation, we looked at Sherry Turkle’s work in Reclaiming Conversation: The Power of Talk in a Digital Age (2016). Turkle describes face-to-face conversation as the most humanizing method of communication, one that allows us to develop a capacity for empathy. People use their phones for the illusion of companionship when real-life relationships feel lacking, and our connectedness online leads us to discount the potential for empathy and intimacy in face-to-face conversation.

We chose a campfire as the visual inspiration for our p5 sketch because of the casual ritual it represents today: one that provides warmth and comfort while people connect with nature. Fire is pervasive across human history, but in the present context we use it as a symbol of voluntarily disconnecting from technology and giving ourselves the opportunity to nurture our relationships with nature and with those close to us.

Expanding from the campfire to the ceremonial practice of the bonfire, fire has been used throughout history to bring people together for a common purpose, whether a celebration or a folklore custom.

Rather than working with the literal visual depiction of fire, we chose to take visual cues from mobile meditation apps. 

We don’t believe our experiment provides all these benefits, but we wanted to use it as a reminder that, while we’re in a digitally-led program, face-to-face interaction is just as important. By giving each of our classmates a moment of self-reflection, we also gave ourselves a unique opportunity to evaluate what we would like to offer one another and to create together.

We think our presentation helped our classmates take a break from the hecticness of constantly looking at multiple screens while working on their Experiment 1 projects. One piece of feedback we received was that the presentation would have been more successful toward the end of class, after everybody had spent more time looking at screens.

If we were to develop the experiment further, we could explore using the phone’s camera to dim the fire based on eye contact, encouraging users to look away from their screens while in conversation. A further improvement could be the ability to blow on the phone to dim the fire, which would require distinguishing ranges of mic input to tell conversation apart from blowing on the phone.

Citations

Kogan, G. (n.d.). 2D and 3D Perlin noise. Retrieved September 21, 2019, from https://genekogan.com/code/p5js-perlin-noise/

Righini, E. (2017). Gradient studies. Behance. https://www.behance.net/gallery/51830921/Gradient-Studies

Turkle, S. (2016). Reclaiming conversation: The power of talk in a digital age. New York, NY: Penguin Books.

ustwo. (2015). Pause: Interactive meditation [Mobile app]. https://apps.apple.com/ca/app/pause-interactive-meditation/id991764216

Experiment 1: Digital Interactive Sound Bath

Abstract

Our project is a digitized version of the experience of a sound bath. The objective was the same: to explore the ancient stress-relieving and sound-healing practice. However, we sought to achieve this using laptops and phones, which are often associated with being the cause of stress and anxiety. Our experiment made use of motion detection, WEBGL animation, and sound detection and emission.

Repo: https://github.com/jevi-me/CC19-EXP-1

Demo: https://jevi.app/CC19-EXP-1/


Table of Contents

1.0 Requirements
2.0 Planning & Context
3.0 Implementation
3.1 Software & Elements
3.1.1 Libraries & Code Design
3.1.2 Sound Files
3.2 Hardware
4.0 Reflections
5.0 Photos
6.0 References


1.0 Requirements

The goal of this experiment is to create an interactive experience expandable to 20 screens.

2.0 Planning & Context

 

Schedule

Stress affects many of us. The constant hustle and bustle of work deadlines, fast-paced city life, and overachievement can push you to the edge, and most people could benefit from self-care, meditation, relaxation, and a pause from busy life. Enter sound baths.

Sound baths use music for healing and relaxation; one definition describes them as an immersion in sound frequencies that cleanses the soul (McDonough). From Tibetan singing bowls to Aboriginal didgeridoos (Dellert), music has been used therapeutically for centuries. The ancient Greeks used sound vibrations to aid digestion, treat mental disturbance, and induce sleep, and Aristotle’s De Anima describes how flute music can purify the soul.

Since the late 19th century, researchers have been studying the correlation between sound and healing. These studies suggest that music can lower blood pressure, decrease pulse rate, and assist the parasympathetic nervous system.

So, essentially a sound bath is a meditation class that aims to guide you into a deep meditative state while you are enveloped in ambient sounds.

Brainstorming

Sound baths use repetitive notes at different frequencies to help bring your focus away from your thoughts. These sounds are generally created with crystal bowls, cymbals, and gongs. As in a yoga session, the instructor creates the flow of the sound bath. Each instrument creates a different frequency that vibrates in your body and helps guide you to a meditative and restorative state. Some people believe bowls made from certain types of crystals and gems can channel different restorative properties.

Our project is a digitized version of the experience of a sound bath. The objective was the same: to explore the ancient stress-relieving and sound-healing practice. However, we sought to achieve this using laptops and phones, which are often associated with being the cause of stress and anxiety. We allowed those experiencing it a moment to pause, reflect, and reconnect with their inner soul.

Requirements

The concept of our experiment was to let users interact with four primary zones in order to experience them:

  •     Zone A – Wild Forest – Green
  •     Zone B – Ocean Escape – Blue
  •     Zone C – Zen Mode – Purple
  •     Zone D – Elements of life – Pink
  •     Projections – a) Visually soothing abstract graphics. b) Life quotes
Zone Maps

We carefully separated the different experiences into the four corners of the space. Zone A consisted of motion-sensitive sounds of rain, birdsong, and crickets, along with a motion-sensitive green zonal colour. Similarly, Zone B consisted of motion-sensitive seascape sounds, such as ocean waves and seagulls, with ambient lighting in blue tones. Zone C, the zen zone, had meditation tunes as well as flute and bell melodies triggered by people passing by, with ambient lighting in a pink tone. The final zone, D, was meant to represent elemental sounds such as rain, fire, and earth triggered by motion; however, we ultimately opted for silence within that zone, providing a brief audio escape. The colours were tied together by the colour-cycling lamps near the floor.

The experience also included projections: eye-pleasing visualizations projected onto the ceiling. These projections were volume-sensitive, so as the audience interacted, the visualizations became brighter and more prominent. To go along with the theme of a digital sound bath, we also projected quotations about life intended to inspire users as they read them.

Once all these elements came together, the space became a digital sound bath where users could come and relax their minds. The experience took place in a dark room: only upon detecting motion would the room light up with different colours and play different ambient sounds. The result was a soothing, mind-relaxing experience for the audience.

3.0 Implementation

3.1 Software & Elements

3.1.1 Libraries & Code Design

For the zones, the Vida library was used for motion detection. The light emitted was a simple rectangle that slowly fades after motion is detected, and the volume of the audio files mimics this fade.
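A minimal sketch of one zone’s behaviour is below (p5.js + p5.sound). To keep it self-contained, a naive frame-difference check stands in for Vida’s motion detection, which the project actually used; the colours and thresholds are illustrative.

```javascript
let capture, zoneSound, prevPixels;
let brightness = 0; // alpha of the zone's light rectangle

function preload() {
  zoneSound = loadSound('sea-waves.wav'); // one of the blue zone's files
}

function setup() {
  createCanvas(640, 480);
  capture = createCapture(VIDEO);
  capture.size(160, 120);
  capture.hide();
  zoneSound.loop();
  zoneSound.setVolume(0);
}

function draw() {
  background(0);
  if (motionDetected()) brightness = 255;        // light up on movement
  else brightness = max(0, brightness - 2);      // then slowly fade
  noStroke();
  fill(0, 80, 200, brightness);                  // blue "ocean" zone
  rect(0, 0, width, height);
  zoneSound.setVolume(brightness / 255);         // volume mimics the fade
}

// Naive frame-differencing stand-in for the Vida motion check.
function motionDetected() {
  capture.loadPixels();
  if (capture.pixels.length === 0) return false;
  const current = capture.pixels;
  let diff = 0;
  if (prevPixels) {
    for (let i = 0; i < current.length; i += 200) { // sparse sampling
      diff += abs(current[i] - prevPixels[i]);
    }
  }
  prevPixels = current.slice();
  return diff > 4000; // arbitrary illustrative threshold
}
```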

WEBGL was used to generate the calming projection: a slowly rotating cosine-and-sine plot animated in 3D using spheres. It was sound-activated and glowed brighter as the sound level increased.
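A sketch of the idea: spheres on a rotating sine/cosine surface whose glow follows the room’s sound level (grid size, speeds, and colours below are illustrative, and p5.sound is assumed for the mic):

```javascript
let mic;

function setup() {
  createCanvas(800, 600, WEBGL);
  mic = new p5.AudioIn();
  mic.start();
  noStroke();
}

function draw() {
  background(0);
  const level = mic.getLevel();
  const glow = map(level, 0, 0.3, 60, 255, true); // louder -> brighter
  rotateY(frameCount * 0.005);                    // slow rotation
  rotateX(-0.5);
  for (let x = -5; x <= 5; x++) {
    for (let z = -5; z <= 5; z++) {
      // Height follows an animated cosine * sine surface.
      const y = 60 * cos(x * 0.6 + frameCount * 0.02)
                   * sin(z * 0.6 + frameCount * 0.02);
      push();
      translate(x * 50, y, z * 50);
      emissiveMaterial(glow * 0.5, glow * 0.3, glow); // purplish glow
      sphere(8);
      pop();
    }
  }
}
```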

The life quotes used an array and a timed interval to redraw new quotations.
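A minimal version of the quote rotation (the quotes and the 10-second interval here are placeholders):

```javascript
const quotes = [
  'Breathe in, breathe out.',
  'Be here now.',
  'Let the sound carry you.'
];
let current = 0;

function setup() {
  createCanvas(windowWidth, windowHeight);
  textAlign(CENTER, CENTER);
  textSize(32);
  fill(255);
  // Advance to the next quote every 10 seconds.
  setInterval(() => { current = (current + 1) % quotes.length; }, 10000);
}

function draw() {
  background(0);
  text(quotes[current], width / 2, height / 2);
}
```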

The code is designed to be centralised: although there are 14 unique programs running, they share base code where possible. For efficiency of set-up, a home page was created with buttons launching each program.

3.1.2 Sound Files

For sounds, the following were tested, but only the most audibly pleasing combination was ultimately implemented. These high-quality sounds were purchased and licensed for use.

pink-zen(D): gentle-wind.wav, ambience.wav, bells.wav

purple-elements(C): wind.wav, rain-storm.wav, thunder.wav

blue-ocean(B): humpback-whales.wav, sea-waves.wav, california-gull.wav

green-forest(A): thrush-bird.wav, robin.wav, forest-leaves.wav, cricket.wav

3.2 Hardware
Some of the hardware used. (Plain dim LED lamp, glass jar, wax paper)

We used round table lamps on the floor with remote-controlled LEDs that cycled through the rainbow, plus two plain dim LED lamps for safety in dark areas. Two projectors were used: one to project the life sayings onto the screen, and another to project the soothing animation onto the ceiling. Glass jars wrapped with decorated wax paper held the phones as they lit up. The wax paper was chosen to match each zone’s theme, and the glass jars were tall enough to hide most of the screen, providing a soft glow, yet short enough to keep the camera exposed for motion detection. An iPad was placed at the entrance to provide context for the space, and the space was decorated to simulate a sound bath.

4.0 Reflections

When approaching this topic, our group set out to explore a solution where participants would not have to physically touch their phones, but would instead have them as part of an experience they walk away from, while the phones aid in relaxing themselves and others. As participants walked meditatively around the space, their motion acted as the trigger for the light and soundscape. We noticed some participants becoming enveloped in the experience, lying down as one would in a traditional sound bath to absorb it with their senses. Others, entranced by the lights and affirmations, were curious about what different pleasing sounds and colours could be produced. Due to the amount of hardware and the number of programs involved, a lot of set-up was required before the room could be entered. An additional complication is that in this kind of set-up the phones belong to the creators, rather than being something the attendees bring with them into the experience.

The room initially requested was RHA 318, a smaller and more intimate space that would have allowed more interaction between the lights by having them closer together, and a better layout for the projections. That room had recently gone out of service, and in the larger room, RHA 511, some of that interaction was diluted, as pointed out in the post-discussion.

Additionally, despite being told that they only needed to walk around to trigger the sounds, many participants unfamiliar with the concept of a sound bath still tried to manipulate the devices or their holders, or to use sound to trigger the effects. This is likely due to memories of the previous tactile experiments, where manipulating the elements within the experiment produced positive results.

5.0 Photos

6.0 References

Allure: https://www.allure.com/story/sound-bath-meditation-benefits

Elite Daily: https://www.elitedaily.com/p/what-is-a-sound-bath-5-thing-to-know-before-you-bathe-in-the-sound-2975477

A Plus: https://articles.aplus.com/wtf-is-it-and-should-you-try-it/what-are-sound-baths-benefits?no_monetization=true

The Washington Post: https://www.washingtonpost.com/lifestyle/wellness/tune-in-and-chill-out-what-are-sound-baths-and-why-you-should-try-one/2017/05/02/e74c697c-2b7c-11e7-a616-d7c8a68c1a66_story.html