
Experiment 4: Call & Response


This project was created in response to the extended periods of physical distancing we’ve been facing as a result of pandemic-related conditions. Even as we slowly return to shared spaces, we still have to maintain some degree of social distancing and cover our faces with masks, making communication harder than it should be.

Call & Response is a project that allows two people in a shared space to communicate through musical messaging, sending synth phrases back and forth in response to each other and creating an ongoing, unending sonic conversation between the two participants.

Call & Response is inspired by the musical technique of the same name, in which one performer plays a sonic phrase and a second performer, or the audience, replies with a second musical phrase.

Using sound as a messaging mechanism lets us bypass the language barriers of traditional communication tools while also respectfully working within COVID restrictions by removing the need to speak through a face mask.

This project uses a phone-to-phone interaction, with the sonic messaging app developed in p5.js. Participants scan a QR code placed in the shared space, which opens the p5.js app link. The real-time connection between the participants’ phones is created with p5.party, a library for creating multiplayer apps with p5.js. The synth sounds are generated with Tone.js, a JavaScript library for creating interactive sounds in the browser using the Web Audio API.

 

Experience Video: https://youtu.be/SF0W1fUtb-E

 

How it Works: https://youtu.be/VoyMg0WICSA

 

Link to Code on p5: https://editor.p5js.org/demilade/sketches/8NYLrFP6a

 

Link to the code on github: https://github.com/DemiladeOla/call-and-response

 

 


 

Experiment 4: Game Mode


 

  • Project Description

“Game Mode” is an upgraded digital version of a game I used to play with my cousins as a kid. At the beginning of the game, we would choose a certain object to be the indicator. Then we blindfolded a player and asked them to walk around the room while another player tapped on a surface. When the blindfolded player got near the indicator, the tapper tapped harder and faster; when they moved away from it, the tapper tapped slower and softer. By listening only to the tapping, the blindfolded player had to find the indicator, which ended the round. Then another player took their place, and the game started again.

In this project, I decided to upgrade this game into a digital interactive version. The indicator is a cube that contains the Arduino circuit and a battery to power it. I gave the game a “The Incredibles” theme, since the ultrasonic sensor I’m using reminds me a lot of the big round glasses of Edna Mode, a character in that movie.

While the player moves their hands around the table to find the object, a song plays on the laptop. As their hands move closer to the sensor, the volume goes up. Since they are blindfolded, they have to keep moving their hands until they find the object.
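For reference, here is a minimal sketch of the sensing side. It assumes an HC-SR04-style ultrasonic sensor on pins 9 and 10 (illustrative pin choices, not necessarily my exact wiring) and simply streams the measured distance over serial, leaving the distance-to-volume mapping to the laptop:

// Minimal HC-SR04 distance reader; pin numbers are assumptions
const int trigPin = 9;
const int echoPin = 10;

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // Fire a 10-microsecond ultrasonic pulse
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // Echo time to distance in cm (sound travels ~0.034 cm/us, round trip)
  long duration = pulseIn(echoPin, HIGH);
  float distanceCm = duration * 0.034 / 2;

  Serial.println(distanceCm);  // the laptop side maps this to song volume
  delay(50);
}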

 

  • Final Project Images


 

  • Development Images


 

 

  • Circuit Diagram


Beautiful Now: Glowing Heart

Present Video: https://www.youtube.com/watch?v=A1bcsZj27S0

https://www.youtube.com/watch?v=HNWXCSz5GaY

 

For the Proximities project, we decided to create a piece of multi-color glowing intimate clothing whose colors can be controlled through a digital interface. We embedded Arduino-controlled, heart-shaped LED strings into a wearable velvet camisole.


How it works: https://www.youtube.com/watch?v=d4sWTWiissc&ab_channel=YTG
The color of the LED light is controlled by the number input on the digital screen, with the color code given in RGB order. The camisole reads as plain black velvet when the light isn’t shining. When you enter a color code and activate the LED, it shines through the velvet in a centered heart shape, and every time you change a number the LED lights up in a different color. The main color scheme that controls the range of colors can only be modified in the LEDcontrol code. For now, we chose to hold blue at zero to increase our chances of getting red and purplish colors.
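As a rough sketch of this control flow (not our exact code, which is linked above), an Arduino could read comma-separated “R,G,B” lines over serial from the interface and drive a common-cathode RGB LED; the pins and the message format here are assumptions:

// Assumed PWM pins for a common-cathode RGB LED
const int redPin = 9, greenPin = 10, bluePin = 11;

void setup() {
  Serial.begin(9600);
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
}

void loop() {
  // Expect lines like "255,0,128" sent from the screen interface
  if (Serial.available()) {
    int r = Serial.parseInt();
    int g = Serial.parseInt();
    int b = Serial.parseInt();
    if (Serial.read() == '\n') {  // only act on a complete line
      analogWrite(redPin, constrain(r, 0, 255));
      analogWrite(greenPin, constrain(g, 0, 255));
      analogWrite(bluePin, constrain(b, 0, 255));
    }
  }
}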

Design Concept:

The wearer must be close to the digital interface that controls the piece, in our case a laptop. This requires a shared spatial relationship between the controller and the wearer; limited by the length of the USB cable, the wearer and the controller can even be the same person. We chose to make the wearable an intimate piece because we want to emphasize and embrace the loving relationship between couples and with ourselves, our mind and body.

When worn by two people, the lit-up revelation of a heart is a warm message from the controller to the wearer about their love for them. When it is worn and played with by one user, it refreshes the experience of wearing underwear and reminds them to love themselves. Intimate wear is often overlooked as the basic layer we wear everything else over: it doesn’t embellish our look nearly as much as what covers it, and we often rush to put on other clothes instead of taking a moment to appreciate our most real selves. We wanted to create a fun piece that promotes a healthy, positive visual interaction between one and oneself. Through wearing and playing with our piece, we encourage our users to see themselves from a new perspective, and to notice the glowing heart as a reminder of the importance of self-love and confidence. You are beautiful the way you are. We picked LEDs under velvet because the darker the surroundings, the more clearly the LED shines through the black velvet, just as the darker it gets, the more important it is to hold onto our faith in ourselves to pull through.

By sending a visual message and provoking interaction with the wearer, this piece constructs a special spatial relationship that aims to put a smile on the wearer’s face, with some positive vibes alongside it!

Arduino Code: https://github.com/BECKY-BAY/arduino/blob/main/Experiment4-Arduino

P5JS Code: https://github.com/BECKY-BAY/arduino/blob/main/Experiment4-P5

 

Potential future development:


Orchestral Beat


Group Members: Adit Verma & Arjun Dutt

PROJECT DESCRIPTION

The original intent for this BLE-based sound installation was to create a “digital orchestra”. On a central device, the video and sound of a (live) orchestra would play. Each person who wanted to interact with the installation would be given a pin badge as an entry ticket and as an identifier that they were participating. Behind the badge we fixed an Arduino to a CR2032 watch battery; given its size, it was the only power source small and compact enough to fit behind a badge. The central device is programmed to scan for the peripheral devices (the Arduinos), and each Arduino (peripheral) is programmed to advertise continuously so the central can read its RSSI.

To each individual Arduino we assigned a particular instrument (drums, guitar, etc.) from a soundtrack we created in GarageBand and played through p5.js. Once you are in the vicinity of the central device, it reads the peripheral’s proximity, and that triggers a manipulation of the music: in this case, the volume of the instrument assigned to that Arduino. For example, if you were given a badge with a drum on it, the drums get softer as you walk away from the central device and louder as you move toward it. So, in essence, the volume of the instruments would continually vary as people walked around the room.
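At its core, the proximity reading is a mapping from RSSI to a volume value. A minimal sketch of that idea, assuming an Arduino central running the ArduinoBLE library, a badge advertising under the hypothetical name “DrumBadge”, and a serial link to the p5.js sound player, might look like this:

#include <ArduinoBLE.h>

void setup() {
  Serial.begin(9600);
  while (!Serial);
  if (!BLE.begin()) {
    Serial.println("BLE failed to start");
    while (1);
  }
  BLE.scan(true);  // report repeat advertisements so readings keep updating
}

void loop() {
  BLEDevice badge = BLE.available();
  // "DrumBadge" is a placeholder local name for one instrument's Arduino
  if (badge && badge.localName() == "DrumBadge") {
    int rssi = badge.rssi();  // roughly -30 (near) to -90 (far)
    int volume = map(constrain(rssi, -90, -30), -90, -30, 0, 100);
    Serial.println(volume);  // p5.js reads this over serial and sets the gain
  }
}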

As you can imagine, we ran into a lot of hurdles while working with the RSSI and programming the peripherals and the central. One was figuring out how to assign a single instrument from a soundtrack (with multiple instruments) to a single Arduino. Another was that, once switched on, an Arduino wouldn’t automatically try to pair with the central device; it had to be connected manually. This really took away from our hope of a seamless, fluid installation and of the peripherals being able to talk to each other.

Ultimately, with time ticking down, we split the song into the sum of its instrumental parts and assigned an Arduino to each of the three instruments. Instead of a video of an orchestra conductor, we split the laptop screen into three distinct areas corresponding to the three instruments. Once an Arduino badge is manually connected, its assigned sound starts playing, and the volume rises or falls depending on the viewer’s proximity to the central device.

Experience Video | How it Works Video | Code Link

Project Images:


Development Images:


Network Diagram:


There is Magic in the Air…and it’s called Bluetooth


With the world spiraling out of control over the past year or so, we have felt out of control in our own lives. We didn’t have a say in staying employed, or in whether a vacation could actually happen before another lockdown. We felt small compared to the world. That’s why our group wanted to explore the notion of “man controlling nature.” We wanted to use Arduinos to give users the opportunity to change the weather and feel in control of their lives, and even the sky. Our goal was to explore the proximities between people and their environment, and how these can shape the way we view the world and share connections. This was the interaction that held the most meaning for us.

The installation we created is made up of two parts: the cloud and the wand. We designed and built everything from scratch using a variety of materials. The cloud consists of a wooden structure made of bamboo sticks covered in an LED strip and cotton. The cloud was hung from the ceiling using wire; inside the wooden structure we placed an Arduino and a circuit that sends instructions to the LED strip. The wand was made from a stick and a plastic bottle, covered with hot glue and acrylic paint. Inside the wand we placed another Arduino and batteries, which collect information and send it to the Arduino in the cloud when in proximity to it. For this project we started with two command gestures, point down and twirl, which send different signals to the cloud. The first gesture causes the cloud to produce a thunderstorm-like effect with flashing blue and purple lights and a thunder sound, while the second causes the cloud to display a rainbow of changing colours.

For this assignment, we used two Arduino BLE devices: the one in the cloud is the central, which reads the data and generates an output, and the one in the wand is the peripheral, which collects the data and sends it along. Because the cloud is fixed in place, the user can move around it and cast spells from any angle, provided they are within range of the cloud. The cloud acts as both the technical beacon and the beacon for the magic to happen. As J.K. Rowling once said: “it is important to remember that we all have magic inside us.”
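A simplified sketch of the wand’s side, assuming an Arduino Nano 33 BLE with the ArduinoBLE and LSM9DS1 libraries, placeholder UUIDs, and deliberately crude gesture thresholds (our real detection logic is in the code linked below), could look like this:

#include <ArduinoBLE.h>
#include <Arduino_LSM9DS1.h>  // on-board IMU of the Nano 33 BLE

// Placeholder UUIDs; the project's actual values are in its repo
BLEService wandService("19B10010-E8F2-537E-4F6C-D104768A1214");
BLEByteCharacteristic spellChar("19B10011-E8F2-537E-4F6C-D104768A1214",
                                BLERead | BLENotify);

void setup() {
  if (!BLE.begin() || !IMU.begin()) while (1);
  BLE.setLocalName("MagicWand");
  BLE.setAdvertisedService(wandService);
  wandService.addCharacteristic(spellChar);
  BLE.addService(wandService);
  spellChar.writeValue(0);
  BLE.advertise();  // the cloud (central) scans for this service
}

void loop() {
  BLEDevice central = BLE.central();
  while (central && central.connected()) {
    float x, y, z;
    if (IMU.accelerationAvailable()) {
      IMU.readAcceleration(x, y, z);  // values in g
      // Crude guesses: pointing down puts gravity on one axis,
      // while a twirl shows up as a strong sideways swing
      if (z < -0.8) spellChar.writeValue(1);           // thunderstorm
      else if (fabs(x) > 1.5) spellChar.writeValue(2); // rainbow
    }
    delay(100);
  }
}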

Experience Video: https://clipchamp.com/watch/ZvYsHTc8RwI

How it Works Video: https://clipchamp.com/watch/92o5SPeDhna

Final Project Images: 


Development Images:


The Code: https://github.com/ichangimathprayag/Exp4_MagicInTheAir

The Network Diagram: 


 


Group members: Kelly Xu, Zhino Yousefi, & Prayag Ichangimath

Brains Talking


 Group members: Siyu Sun, Mehdi Farahani,  Anantha Chickanayakanahalli

“Brains by themselves tend to get noisy, especially with all the electronic devices to blame. When they are brought into proximity or closeness with other brains, a kind of synchronization becomes possible and there is scope for calmness. In this experiment, we try to represent this proximity through a physical installation of two objects that stand for the two brains. They are activated by changes in their positions: when moved, they play back the noise of everyday sounds reverberating. These sounds come to a standstill and fall silent when the brains are brought together (touching each other). This is meant to translate the effect people have on each other.”

 

Project Description

Brains Talking is an interactive artwork based on the study of Bluetooth Low Energy (BLE). In this installation, we dig into the idea of consciousness in the human mind. Consciousness involves experience, cognition, feeling, imagination, and volition. We perceive and receive information about the world, and the brain is the crucial tool that makes this possible. The many nerve systems in our brain give us sensation: we can speak, touch, taste, and smell, all of which help us interact with the world. Unconsciousness, by contrast, is easier to describe: no self-thought, no awareness, no ability to interact, like stones, sculptures, installations, even our technological world. So we took still acrylic material and used laser cutting to cut out the outline of a human brain, to mimic future human brains. When consciousness becomes unconscious, what can we humans do?

Output

  1. We sampled sounds from everyday life. Neurofeedback is a form of alternative treatment wherein brain waves are retrained through auditory or visual stimuli. Everything in our physical world can be turned into sound, and sound can activate our nervous system to produce feedback, so we imagined a virtual environment that could interact with future brains. The sounds in this audio track came from ordinary physical sources: a moving elevator, boiling water, shaking keys, tapping a handrail, a closing door, and so on. All of these inadvertently form our communication system. In this installation we set a defined distance that activates the sound; people can use that distance to control the sound level, which suggests that the distance between people is never far. As technology develops, relationships between humans are turning into a silent world: we focus on intelligent devices and ignore the people around us. We want to use this idea to critique that phenomenon.

 

2.  Mimic the hand’s power


 

Experience Video

 

 

 

How it Works Video

 

Final Images


 

 

Development Images

Laser Cutting Sketch


 


 

 

Circuit Diagram

 


 

Github

Processing File:

https://github.com/rewritablehere/digitalfutures/blob/main/brains_talking/brains_talking.pde

Arduino Code:

https://github.com/rewritablehere/digitalfutures/blob/main/brains_talking/central.ino

 

 

Bibliography

1. Jade, Laura. “INTERACTIVE BRAINLIGHT - Brain Sculpture Lights Up Your Thoughts.” YouTube. Accessed December 3, 2021. https://www.youtube.com/watch?v=ZQmA5X47ewU

2. “BRAINLIGHT.” Laura Jade. Published August 29, 2018. Accessed December 3, 2021. https://laurajade.com.au/interactive-brain-light-research-project/

3. “Laura Jade - Brain Light Project - Residency 2015.” Culture at Work. Published July 8, 2019. Accessed December 3, 2021. https://www.cultureatwork.com.au/laura-jade-culture-at-work-artscience-residency-2015/

4. Kanai, R. “Are Two Heads Better Than One? It Depends.” Scientific American. Published August 31, 2010. https://www.scientificamerican.com/article/are-two-heads-better-than/#

“Who’s There?”


Much like any final assignment, this one had its fair share of ups and downs. Originally we interpreted the assignment’s use of proximity more in line with what was apparently done last year (given distancing for COVID): we created three statues with three LEDs each that would respond to a p5.js “send message” action in a fake text thread, lighting up one of the LEDs in response, to draw attention to the image of people having a conversation while apart, since the three of us live completely spread out from each other.

We instead changed the project to be more representative of being together and of what happens when that state is reached. Given our misunderstanding of the first concept, our desire to salvage something from the original led us to keep one statue with three LEDs and to shift from phone-to-Arduino to Arduino-to-Arduino communication, using central-to-peripheral connections. We decided the sculpture would be a permanent piece of decor with its LEDs lit at all times; when the peripheral Arduino is sensed via Bluetooth, an LED blinks in response to someone being within range, creating a sense of curiosity about who is coming near.
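The central’s logic reduces to: scan, and blink when the known peripheral shows up. A minimal sketch of that idea with the ArduinoBLE library, a hypothetical peripheral name (“VisitorTag”), and an assumed LED pin, not our exact code, would be:

#include <ArduinoBLE.h>

const int blinkPin = 2;  // assumed pin for the statue's "visitor" LED

void setup() {
  pinMode(blinkPin, OUTPUT);
  if (!BLE.begin()) while (1);
  BLE.scan(true);  // keep reporting the tag while it stays in range
}

void loop() {
  // "VisitorTag" is a placeholder local name for the roaming peripheral
  BLEDevice tag = BLE.available();
  if (tag && tag.localName() == "VisitorTag") {
    // Someone carrying the tag is within Bluetooth range: blink once
    digitalWrite(blinkPin, HIGH);
    delay(250);
    digitalWrite(blinkPin, LOW);
    delay(250);
  }
}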

whatsapp-image-2021-12-10-at-2-26-43-pm-1 screen-shot-2021-12-11-at-2-48-01-pm

In an ideal world, with additional Arduinos, we would have enjoyed exploring how to connect multiple peripherals to the central, each disguised as something we would just keep on our person. For example, if we all carried some “Tile”-like item in our bags, the sculpture placed in the studio would blink its LEDs to signal to us (and others nearby) that someone is on their way.

*Disclaimer: given our sudden shift in concept, in order to bring the project in line with the requirements of the assignment, we used the Arduino example of central-to-peripheral communication with the BLE Sense gesture sensor and changed the code to blink an LED when the central connects to the peripheral.

Demo Video | How it Works | Code

References

Arduino. “Connecting Nano 33 BLE Devices over Bluetooth®.” By José Bagur. https://docs.arduino.cc/tutorials/nano-33-ble-sense/ble-device-to-device

Arduino Forum. “How to connect with one master to 2 slaves.” Posted by vyohai, 2020. https://forum.arduino.cc/t/how-to-connect-with-one-master-to-2-slaves/670681

Arduino. “Controlling RGB LED Through Bluetooth®.” By Fabricio Troya. https://docs.arduino.cc/tutorials/nano-33-ble-sense/bluetooth

Arduino. “BLEPeripheral constructor.” https://www.arduino.cc/en/Reference/BLEPeripheralConstructor

Arduino. “central().” https://www.arduino.cc/en/Reference/BLEPeripheralCentral

Let There Be Light


This project uses BLE to connect an Arduino central, equipped with a button, a potentiometer, a light sensor, a water sensor, and an LED, to an Arduino peripheral controlling a rotating lamp with a hanging rotating lattice. The aim is to explore the layers of central-peripheral control and to use visual cues to imply functionality. The step-by-step interactions are as follows:

    1. The blue LED inside a lotus origami on the packaged central lights up as the BLE connection is established. The user sees it, notices a button embedded in the origami, and presses it. This lights up the blue LED in the identical lotus on the peripheral.

    2. The light sensor detects a drop in brightness as the user interacts with the peripheral. An LED in the koi origami lights up when this happens.

    3. The user rotates the koi origami, making the lattice spin.

    4. The user notices the water-drop icons next to the water sensor and sprays water on its surface, increasing the brightness of the lamp.

I can see this being used as a playful lamp in a home setting. If two people are interacting with it, the multiple modes of control can invite surprising discoveries and interactions.
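As an illustration of step 1, a stripped-down version of the central’s side could look like the sketch below; the UUIDs, the button pin, and the single lotus-LED characteristic are placeholders standing in for the fuller sensor set in the actual code:

#include <ArduinoBLE.h>

const int buttonPin = 3;  // assumed pin for the lotus button
// Placeholder UUIDs; the project's actual values live in its code
const char* lampServiceUuid = "19B10000-E8F2-537E-4F6C-D104768A1214";
const char* lotusCharUuid   = "19B10001-E8F2-537E-4F6C-D104768A1214";

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);
  if (!BLE.begin()) while (1);
  BLE.scanForUuid(lampServiceUuid);  // look for the lamp peripheral
}

void loop() {
  BLEDevice lamp = BLE.available();
  if (!lamp) return;

  BLE.stopScan();
  if (lamp.connect() && lamp.discoverAttributes()) {
    BLECharacteristic lotusLed = lamp.characteristic(lotusCharUuid);
    while (lotusLed && lamp.connected()) {
      // With INPUT_PULLUP, LOW means pressed: light the peripheral's lotus
      lotusLed.writeValue((byte)(digitalRead(buttonPin) == LOW ? 1 : 0));
      delay(50);
    }
  }
  BLE.scanForUuid(lampServiceUuid);  // resume scanning after a disconnect
}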

Experience Video

How It Works Video

Final Project Images


Development Images


Code

Circuit Diagram


Network Diagram



Big Mable – The Crystal Ball Extraordinaire

by Nooshin Mohtashami

The crystal ball is a mysterious thing, and so is the reason why I ended up building one for this project. I started with a very different idea for this experiment, but all my attempts at the other ideas failed except for this one.

Big Mable in action

Big Mable is a “crystal ball” that houses an Arduino Nano 33 BLE Sense connected to an RGB LED. The Nano board is activated and controlled wirelessly by a web interface running a p5.js script.

Web Interface to Big Mable

This is a very basic setup: a web interface connects to the Arduino wirelessly and controls the peripheral BLE device, turning the LED connected to it on and off.

What’s inside the crystal ball


Inside Big Mable

How it works:

A BLE device configured in central mode scans for nearby BLE devices and determines their available services. In this case, the central is a web page running a p5.js script, from which the user can initiate a scan for other BLE devices. After discovering the peripheral with a specific UUID, the user pairs the two devices and uses the web interface to interact with the peripheral. Each interaction causes the central to write a characteristic value to the peripheral, and depending on this value the peripheral turns the attached RGB LED on or off.
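The peripheral side of this pattern is small. A sketch along these lines, with placeholder UUIDs and assumed RGB pins (Big Mable’s real values are in its source code), exposes a writable characteristic that drives the LED:

#include <ArduinoBLE.h>

// Placeholder UUIDs; Big Mable's actual UUIDs are in the source code
BLEService ballService("19B10000-E8F2-537E-4F6C-D104768A1214");
BLEByteCharacteristic ledChar("19B10001-E8F2-537E-4F6C-D104768A1214",
                              BLERead | BLEWrite);

const int redPin = 2, greenPin = 3, bluePin = 4;  // assumed wiring

void setup() {
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
  if (!BLE.begin()) while (1);
  BLE.setLocalName("BigMable");
  BLE.setAdvertisedService(ballService);
  ballService.addCharacteristic(ledChar);
  BLE.addService(ballService);
  ledChar.writeValue(0);
  BLE.advertise();  // now discoverable by the p5.js web page
}

void loop() {
  BLEDevice central = BLE.central();
  while (central && central.connected()) {
    if (ledChar.written()) {
      // 0 = off; 1/2/3 pick a colour (an assumed encoding, not Big Mable's)
      digitalWrite(redPin,   ledChar.value() == 1 ? HIGH : LOW);
      digitalWrite(greenPin, ledChar.value() == 2 ? HIGH : LOW);
      digitalWrite(bluePin,  ledChar.value() == 3 ? HIGH : LOW);
    }
  }
}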

Videos and Source Code


Potential extensions:

  • Use a mobile device as the central BLE to interface with the peripheral instead of a laptop computer.
  • Connect a Bluetooth speaker to the Arduino and have Big Mable respond by voice through the speaker when a question is asked.
  • I can easily modify the web interface of this experiment to create a very different and useful (to me) concept. By slightly changing the web page that connects to the peripheral BLE, and using the exact same hardware setup, I can communicate different messages to my son, who usually sits in his room with headphones on, using blinking LEDs that change colour. For example, I can turn the light green when it’s time to eat dinner, or purple when it’s time to take the dog out for a walk. The Bluetooth signal of the two devices is strong enough that I can connect to the peripheral from my laptop anywhere in the house.

Helicopter Parents

Group Member: Ellie Huang and Winter Yan

Project Description:

The impact of family on a child’s growth trajectory is hard to ignore. While being too aloof from one another might make a family less bonded, being too close risks sacrificing a certain degree of personal freedom. “Helicopter parents” refers to parents who pay extremely close attention to their children. As the name suggests, helicopter parents hover overhead, constantly micromanaging every detailed aspect of their child’s life.

With this project, we wanted to explore interpersonal relationships affected by physical and psychological proximities. “Parents” with the controller can tap a button and see where their “children” are via lit LED strips. The central control is deliberately designed and fabricated in the form of a game controller, mimicking the player-character simulation of parents projecting their own desires onto their children. The peripherals were built on top of neck collars, whose tightness also symbolizes the sense of being controlled. Despite not being leashed, as long as “parents” and “children” are co-located in a space, the “children” can be constantly monitored without noticing it themselves.
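A simplified sketch of the collar (peripheral) side, assuming the ArduinoBLE and Adafruit NeoPixel libraries, placeholder UUIDs, and an assumed strip pin and pixel count, could look like this:

#include <ArduinoBLE.h>
#include <Adafruit_NeoPixel.h>

// Assumed wiring and UUIDs, for illustration only
#define STRIP_PIN  6
#define NUM_PIXELS 12
Adafruit_NeoPixel strip(NUM_PIXELS, STRIP_PIN, NEO_GRB + NEO_KHZ800);

BLEService collarService("19B10020-E8F2-537E-4F6C-D104768A1214");
BLEByteCharacteristic pingChar("19B10021-E8F2-537E-4F6C-D104768A1214",
                               BLERead | BLEWrite);

void setup() {
  strip.begin();
  strip.show();  // start with all pixels off
  if (!BLE.begin()) while (1);
  BLE.setLocalName("ChildCollar");
  BLE.setAdvertisedService(collarService);
  collarService.addCharacteristic(pingChar);
  BLE.addService(collarService);
  BLE.advertise();
}

void loop() {
  BLEDevice central = BLE.central();  // the "parent" game controller
  while (central && central.connected()) {
    if (pingChar.written()) {
      // A button tap on the controller lights the collar; another tap clears it
      uint32_t colour = pingChar.value() ? strip.Color(255, 0, 0) : 0;
      for (int i = 0; i < NUM_PIXELS; i++) strip.setPixelColor(i, colour);
      strip.show();
    }
  }
}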

For future development, we envision collar strips with more behaviours, such as tightening or vibrating when the controller button is pressed. In an exhibition setting, we would give visitors controllers when they enter the gallery and let them identify people they want to keep an eye on throughout the visit (on a family basis it could be parents controlling children, but the direction could also be reversed, with children controlling their parents). It would be an interactive experience as visitors wander the gallery, without interfering with other art-appreciation activities.

Experience Video | How it Works Video | Arduino Code Link

Project Images:


Progress Photos:


Circuit Diagrams:

Central:


Peripheral:


 

References

[1] https://learn.adafruit.com/adafruit-neopixel-uberguide/arduino-library-use

[2] https://github.com/adafruit/Adafruit_MPR121

 
