Experiment 3: cronoScroll



cronoScroll is a tangible interface that allows a user to chronologically scroll through a museum's archive. This type of navigation lets the user explore the relationship between time and the artworks, observing their historical connections and the gradual evolution of various art forms.

The interaction is achieved using an ultrasonic distance sensor and a user-controlled draggable block: as the block is scrubbed closer to the control box fitted with the sensor, the view traverses the art timeline.

This project was created using an Arduino Uno connected to an ultrasonic distance sensor. Sensor values are fed over a serial connection to Processing, where the visual output is created.
Particular attention was paid to ensuring fluid interactions through smooth animations rather than jerky state transitions. The animation of the interface is powered by linear interpolation (lerp) to simulate smooth transitions between state changes. Lerp is also used to smooth out the noisy sensor readings and prevent jumpy values.
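The smoothing and mapping described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the real sketch runs in Processing (whose built-in lerp() behaves the same way), and the function names, easing factor, and distance range here are assumptions.

```javascript
// Linear interpolation: move a fraction t of the way from a toward b,
// mirroring Processing's built-in lerp(a, b, t).
function lerp(a, b, t) {
  return a + (b - a) * t;
}

// Smooth a stream of noisy sensor readings. Each frame, the displayed
// value eases toward the latest raw reading instead of jumping to it.
function smoothReadings(rawReadings, easing = 0.1) {
  let smoothed = rawReadings[0];
  const out = [];
  for (const raw of rawReadings) {
    smoothed = lerp(smoothed, raw, easing);
    out.push(smoothed);
  }
  return out;
}

// Map a smoothed distance (cm) onto an index into the art timeline,
// clamping readings outside the usable sensor range.
function timelineIndex(distanceCm, minCm, maxCm, artworkCount) {
  const t = Math.min(Math.max((distanceCm - minCm) / (maxCm - minCm), 0), 1);
  return Math.round(t * (artworkCount - 1));
}
```

With a small easing factor the displayed position lags slightly behind the block, which is exactly what hides sensor jitter.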

Images used are from the Getty Search Gateway and are part of the Getty Open Content Program, a program to “share images of works of art in an unrestricted manner, freely, so that all those who create or appreciate art will have greater access to high-quality digital images for their studies and projects.”



  • Arduino Uno
  • Ultrasonic Distance Sensor
  • Processing

Experience Video: https://youtu.be/GlS72-mmf04

How It Works: https://youtu.be/a5MLehEjEl0

Arduino Github Code: https://github.com/DemiladeOla/crono-scroll/tree/main/arduino/sensor

Processing Github Code: https://github.com/DemiladeOla/crono-scroll/tree/main/processing/art-scroller


Experiment 4: Call & Response


This project was created in response to the extended periods of physical distancing we've faced as a result of pandemic-related conditions. Even as we slowly return to shared spaces, we still have to maintain some degree of social distancing and cover our faces with masks, making communication harder than it should be.

Call & Response is a project that allows two people in a shared space to communicate through musical messaging, sending synth phrases back and forth in response to each other and creating an ongoing sonic conversation between the two participants.

Call & Response is inspired by a method of musical communication of the same name, in which one performer plays a sonic phrase and a second performer, or the audience, replies with another musical phrase.

Using sound as a messaging mechanism allows us to bypass the language barriers of traditional communication tools, while also respectfully working within COVID restrictions by eliminating the need for verbal communication through a face mask.

This project uses a phone-to-phone interaction, with the sonic messaging app developed in p5.js. Participants scan a QR code present in the shared space, which opens the p5.js app link. The real-time connection between the participants' phones is created using p5.party, a library for creating multiplayer apps with p5.js. The synth sounds are generated using Tone.js, a JavaScript library for creating interactive sounds in the browser using the Web Audio API.
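The back-and-forth turn-taking can be sketched as plain shared state. This is a hypothetical sketch, not the project's actual code: in the real app the shared object would come from p5.party's partyLoadShared() and playback would go through Tone.js, but the turn logic itself is just data.

```javascript
// Hypothetical shared state, standing in for a p5.party shared object.
// A phrase is an array of note names that Tone.js would play in sequence.
function createSession() {
  return { phrases: [], turn: 0 }; // turn: 0 = participant A, 1 = participant B
}

// A participant may only send a phrase on their own turn; sending a
// phrase hands the turn to the other participant, so the "conversation"
// alternates indefinitely.
function sendPhrase(session, participant, notes) {
  if (participant !== session.turn) return false; // not your turn yet
  session.phrases.push({ from: participant, notes });
  session.turn = 1 - session.turn;
  return true;
}
```

In the browser, each accepted phrase would then be handed to a Tone.js synth, note by note, for playback on both phones.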


Experience Video: https://youtu.be/SF0W1fUtb-E


How it Works: https://youtu.be/VoyMg0WICSA


Link to Code on p5: https://editor.p5js.org/demilade/sketches/8NYLrFP6a


Link to the code on github: https://github.com/DemiladeOla/call-and-response






Experiment 2: bedsideBox



bedsideBox is a room-ambience control device that sets a room's music based on user presence, user proximity, and room lighting.

The music is controlled by connecting an Arduino Uno to a simple webpage that plays embedded YouTube music videos based on the status of the sensors. The code for the connection is based on a tutorial video from Adam Thomas.

Mode 1 – Inactive: No active user detected in room & room is bright

  • Quiet ambient music plays in room
  • Front LEDs stay solid blue
  • Bedside LEDs stay off

Mode 2 – Active: Active user detected in room & room is bright

  • Lo-fi music plays in room to soundtrack activities
  • Front LEDs blink blue to indicate the user's presence (Calm Technology: "Technology should require the smallest possible amount of attention")
    1. Create ambient awareness through different senses.
    2. Communicate information without taking the wearer out of their environment or task.
  • Bedside LEDs stay off

Mode 3 – Bedside: Active user is on the bed

  • Bedtime/reading music plays in room
  • Front LEDs go off
  • Bedside LEDs come on to indicate the user's presence (Calm Technology: "Technology can communicate, but doesn't need to speak")

Mode 4 – Sleep: Room is dark

  • Music stops playing
  • Front LEDs go off
  • Bedside LEDs stay on
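The four modes above reduce to a small decision over three sensor states. A minimal sketch of that logic, assuming simple boolean readings (the function name, priority order, and thresholds behind each boolean are assumptions; the actual Arduino sketch differs):

```javascript
// Pick a bedsideBox mode from the three sensor readings.
// motionDetected: PIR motion sensor; nearBed: ultrasonic distance below
// some threshold; roomIsBright: LDR reading above some threshold.
function pickMode(motionDetected, nearBed, roomIsBright) {
  if (!roomIsBright) return "Sleep";    // Mode 4: a dark room wins outright
  if (nearBed) return "Bedside";        // Mode 3: user is on the bed
  if (motionDetected) return "Active";  // Mode 2: user moving about the room
  return "Inactive";                    // Mode 1: bright, empty room
}
```

Ordering the checks this way means darkness always overrides presence, matching Mode 4's description.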



  • Arduino Uno
  • PIR Motion Sensor
  • Ultrasonic Distance Sensor
  • LDR/Photoresistor
  • LEDs


Experience Video: https://youtu.be/nqJ9JKSrnsc

How It Works: https://youtu.be/w284vYH7nLw

Arduino Github Code: https://github.com/DemiladeOla/sensitive-objects


Experiment 1: Body As Controller

Internet Attention

My series of studies centered around the internet, the web, social media, and our relationships and experiences with these media. Given that this experiment deals with using our bodies as controllers, my focus was on visualizing the relationships we already have with these platforms and on adding friction to these technologies where necessary. That friction makes us more active participants in our relations with these platforms, taking back some control from the algorithms designed to keep us mindlessly scrolling, clicking, and consuming.

Scroll 1 – doomScroll


For my first scroll, I thought about our relationship with endless feeds and how we've become accustomed to scrolling through them for hours on end, even when we might not want to.
Using PoseNet, I prototyped a scenario where the user has to carefully hover their left hand over directional arrows to scroll. This interaction adds friction to the process, limiting time spent scrolling and making us more intentional about interacting with these feeds.
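The hover-to-scroll check boils down to a point-in-rectangle test on a PoseNet keypoint. A hypothetical sketch (the zone geometry and function names are assumptions, not the project's code):

```javascript
// True if a tracked keypoint (e.g. PoseNet's leftWrist, given as
// {x, y} pixel coordinates) sits inside a rectangular arrow zone.
function overZone(keypoint, zone) {
  return (
    keypoint.x >= zone.x && keypoint.x <= zone.x + zone.w &&
    keypoint.y >= zone.y && keypoint.y <= zone.y + zone.h
  );
}

// Decide the scroll direction for this frame: up, down, or none.
function scrollDirection(leftWrist, upZone, downZone) {
  if (overZone(leftWrist, upZone)) return "up";
  if (overZone(leftWrist, downZone)) return "down";
  return "none";
}
```

Running this every frame means the feed only moves while the hand is deliberately held over an arrow, which is the source of the friction.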

Present Link: https://editor.p5js.org/demilade/full/Ul9HVzur1

Edit Link: https://editor.p5js.org/demilade/sketches/Ul9HVzur1

Interaction Video: https://youtu.be/N67gJCCAJpY


Click 1 – peskyPopups


You’ve probably visited a website where you were assaulted by one popup after another, seemingly unending, each preventing you from accessing the content you came for. My first click study is a mini game where the user has to hover their left hand over the popup buttons and ‘clap to click’ them closed before the screen is overrun with popups and the health bar turns red. The whole process is reminiscent of clapping at pesky insects invading our personal space, reflecting the emotions pesky popups evoke.
I use PoseNet to track the user’s wrists, and the distance between the wrists simulates a clap click.
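The clap gesture can be sketched as a distance threshold between the two wrist keypoints, with an armed/fired flag so that one clap registers exactly one click. The threshold value and names here are assumptions, not the project's actual code:

```javascript
// Euclidean distance between two keypoints given as {x, y}.
function wristDistance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Detect a clap: fires once when the wrists come within `threshold`
// pixels of each other, then re-arms only after they separate again,
// so holding the hands together doesn't spam clicks.
function makeClapDetector(threshold = 60) {
  let armed = true;
  return function update(leftWrist, rightWrist) {
    const close = wristDistance(leftWrist, rightWrist) < threshold;
    if (close && armed) {
      armed = false;
      return true; // one "click"
    }
    if (!close) armed = true;
    return false;
  };
}
```

Each time the detector fires, the sketch would check which popup button the left hand is hovering and close it.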

Present Link: https://editor.p5js.org/demilade/full/SusMA6Svh

Edit Link: https://editor.p5js.org/demilade/sketches/SusMA6Svh

Interaction Video: https://youtu.be/rZBlnjZtTqM


Click 2 – clickSwarm


This is a visualization of how everything online is in a constant battle for our attention. Every corner of the internet is peppered with calls to action, begging us to click and give up a little more of our engagement.
For this study I use PoseNet to track the user’s face via the nose. The cursor follows the nose, and any cursors within a certain threshold ‘activate’ and follow you, no matter where you go. The user can blast an imaginary fireball by bringing their wrists together, kamehameha-style, to briefly make the cursors point away; then everything reverts, showing how we only ever escape this battle for attention for short periods of time.
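The per-cursor behaviour can be sketched as an activation radius plus an angle toward the tracked nose. This is an illustrative sketch under assumed names, not the project's code:

```javascript
// Angle (radians) a cursor should point to face the tracked nose.
function angleToward(cursor, nose) {
  return Math.atan2(nose.y - cursor.y, nose.x - cursor.x);
}

// A cursor "activates" once the nose comes within `radius` pixels and
// then keeps following it; a fireball (wrists brought together)
// deactivates it and flips it to point away, until the nose comes
// close again and everything reverts.
function updateCursor(cursor, nose, radius, fireball) {
  if (fireball) {
    cursor.active = false;
    cursor.angle = angleToward(cursor, nose) + Math.PI; // point away
  } else {
    const d = Math.hypot(nose.x - cursor.x, nose.y - cursor.y);
    if (d < radius) cursor.active = true;
    if (cursor.active) cursor.angle = angleToward(cursor, nose);
  }
  return cursor;
}
```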

Present Link: https://editor.p5js.org/demilade/full/J1UxhwUL6

Edit Link: https://editor.p5js.org/demilade/sketches/J1UxhwUL6

Interaction Video: https://youtu.be/s_OZ0YM1F5E


Scroll 2 – speedScroll


Building on doomScroll, this scrolling experience uses two hands instead of one. The user's left hand hovers over the direction they would like to scroll, while the right hand controls the scroll speed based on the intensity of a vertical hand wipe.
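The speed half of the interaction can be sketched as frame-to-frame wrist velocity. The gain and cap values below are assumptions for illustration, not the project's tuning:

```javascript
// Estimate scroll speed from the right hand's vertical "wipe"
// intensity: the faster the wrist moves vertically between frames,
// the faster the feed scrolls, capped at maxSpeed.
function wipeSpeed(prevY, currentY, maxSpeed = 40, gain = 0.5) {
  const intensity = Math.abs(currentY - prevY); // pixels per frame
  return Math.min(intensity * gain, maxSpeed);
}
```

Combined with the left hand's direction zone from doomScroll, the feed offset each frame is just direction times wipeSpeed.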

Present Link: https://editor.p5js.org/demilade/full/zP7C_Cbh8

Edit Link: https://editor.p5js.org/demilade/sketches/zP7C_Cbh8

Interaction Video: https://youtu.be/9qX_5uFL2vM



Experiment 4: Game Mode



  • Project Description

“Game Mode” is an upgraded digital version of a game I used to play with my cousins as a kid. At the beginning of the game, we would choose a certain object to be the indicator. Then we blindfolded a player and asked them to walk around the room. Another player started tapping on a surface: when the blindfolded player moved near the indicator, the tapper tapped harder and faster, and when they moved away from it, the tapper tapped slower and softer. By listening only to the tapping, the blindfolded player had to find the indicator, ending the game. Then another player took their place, and the game started again.

In this project, I decided to upgrade this game to a digital interactive version. The indicator is a cube that contains the Arduino circuit and a battery to power it. I gave the game a “The Incredibles” theme, since the ultrasonic sensor I’m using in this project reminds me a lot of the big round glasses of Edna Mode, a character in that movie.

When the player moves their hands around the table to find the object, a song plays from the laptop. When their hands move near the sensor, the volume goes up. They have to keep moving their hands – since they are blindfolded – to find the object.
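The distance-to-volume mapping echoes the faster tapping of the original game. A minimal sketch (the range values are assumptions, and the real project does this on the Arduino/laptop side):

```javascript
// Map the ultrasonic distance reading to playback volume: the closer
// the hand is to the sensor on the cube, the louder the song.
function volumeFromDistance(distanceCm, minCm = 2, maxCm = 60) {
  const clamped = Math.min(Math.max(distanceCm, minCm), maxCm);
  // 1.0 at minCm (hand right at the cube), 0.0 at maxCm (far away)
  return (maxCm - clamped) / (maxCm - minCm);
}
```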


  • Final Project Images



  • Development Images




  • Circuit Diagram


Beautiful Now: Glowing Heart

Present Video: https://www.youtube.com/watch?v=A1bcsZj27S0



For the Proximities project, we decided to create a piece of multi-color glowing intimate clothing whose colors can be controlled through a digital interface. We embedded Arduino-controlled heart-shaped LED strings into a wearable velvet camisole.


How it works: https://www.youtube.com/watch?v=d4sWTWiissc&ab_channel=YTG
The color of the LED light is controlled by the number input on the digital screen. The color code is in RGB order. The camisole is black velvet when the light isn’t shined. When you put in a color code and activate the LED, it shines through the velvet material into a centered heart shape. Every time you change a number, the LED lights up in a different color. The main color scheme that controls the range of colors can only be modified in the LEDcontrol code. In this piece for right now, we chose to put blue on zero to increase our chances of getting red and purple-ish colors.

Design Concept:

The wearer must be close to the digital interface that controls the piece – in our case, a laptop. This requires a shared spatial relationship between the controller and the wearer; limited by the length of the USB cable, the wearer and the controller can even be the same person. We chose to make the wearable an intimate piece because we want to emphasize and embrace the loving relationship between couples, and between ourselves, our mind, and our body.

When worn by two people, the lit-up revelation of a heart is a warm message from the controller to the wearer about their love for them. When it is worn and played with by one user, it refreshes the experience of wearing underwear and reminds them to love themselves.

Underwear/intimate wear is often overlooked as the base layer we wear everything else over. It doesn't embellish our look nearly as much as what we wear over it, and we often rush to put on other clothes instead of taking a moment to appreciate our most real selves. We want to create a fun piece that promotes a healthy and positive visual interaction between one and oneself. Through wearing and playing with our piece, we encourage users to see themselves from a new perspective, and to notice the glowing heart as a reminder of the importance of self-love and confidence. You are beautiful the way you are.

We picked LED under velvet because the darker the surroundings, the more clearly the LED shines through the black velvet – just like how the darker it gets, the more important it is to hold onto our faith in ourselves to pull through.

This piece, by sending a visual message and provoking interaction with the wearer, constructs a special spatial relationship that aims to put a smile on the wearer's face, with some positive vibes alongside it!

Arduino Code: https://github.com/BECKY-BAY/arduino/blob/main/Experiment4-Arduino

P5JS Code: https://github.com/BECKY-BAY/arduino/blob/main/Experiment4-P5


Potential future development:


Orchestral Beat


Group Members: Adit Verma & Arjun Dutt


The original intent for this BLE-based sound installation was to create a 'digital orchestra'. On a central device, the video and sound of a (live) orchestra would play. Each person wanting to interact with the installation would be given a pin badge as an entry ticket, identifying them as a participant. Behind the badge we fixed an Arduino to a CR2032 watch battery: given its size, it was the only power source small and compact enough to fit behind a badge. The central device is programmed to scan for peripheral devices (the Arduinos), and each Arduino (peripheral) is programmed to broadcast its RSSI to the central device continuously.

To each individual Arduino, we would assign a particular instrument (drums, guitar, etc.) from the soundtrack we created in GarageBand, played back through p5.js. Once you are in the vicinity of the central device, it reads the peripheral's proximity, and that triggers a manipulation of the music – in this case, the volume of the instrument assigned to that Arduino. For example, if you have been given a badge with a drum on it, the volume of the drums gets softer as you walk away from the central device and louder as you move toward it. So, in essence, the volume of the instruments would continually vary as people walked around the room.
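The proximity-to-volume idea can be sketched directly from the RSSI reading. RSSI is a negative dBm value that grows less negative as the badge approaches the central device; the endpoint values below are assumptions for illustration, not the project's calibration:

```javascript
// Map a BLE RSSI reading (dBm) to an instrument's volume (0..1).
// Around -40 dBm (badge very close) maps to full volume; around
// -90 dBm (badge far away) maps to silence; readings outside that
// band are clamped.
function volumeFromRssi(rssi, nearDbm = -40, farDbm = -90) {
  const clamped = Math.min(Math.max(rssi, farDbm), nearDbm);
  return (clamped - farDbm) / (nearDbm - farDbm);
}
```

In practice RSSI is noisy, so the reading would usually be smoothed (e.g. averaged over recent samples) before being applied to the track's gain.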

As you can imagine, we ran into a lot of hurdles while working with the RSSI and trying to program the peripherals and the central. One was figuring out how to assign a single instrument from a soundtrack (with multiple instruments) to a single Arduino. Another was that, once switched on, an Arduino wouldn't automatically try to pair with the central device; it had to be connected manually. This really took away from our hope of a seamless, fluid installation and of the peripherals being able to talk to each other.

Ultimately, with time ticking down, we split the song into the sum of its instrumental parts and assigned an Arduino to each of the three instruments. Instead of a video of an orchestra conductor, we split the laptop screen into three distinct areas corresponding to the three instruments. Once an Arduino badge is manually connected, its assigned sound starts playing, getting louder or softer depending on the viewer's proximity to the central device.

Experience Video | How it Works Video | Code Link

Project Images:



Development Images:



Network Diagram:


There is Magic in the Air…and it’s called Bluetooth



With the world spiraling out of control over the past year or so, we have felt out of control of our own lives. We didn't have a say in staying employed, or in whether that vacation could actually happen before another lockdown; we felt small compared to the world. That's why our group wanted to explore the notion of "man controlling nature." We wanted to use Arduinos to give users the opportunity to change the weather and feel like they are in control of their lives, and even the sky. Our goal was to explore the proximities between people and their environment, and how this can shape the way we view the world and share connections. This was the interaction that held the most meaning for us.

The installation we created is made up of two parts: the cloud and the wand. We designed and built everything from scratch using a variety of materials. The cloud consists of a wooden structure made of bamboo sticks covered in an LED strip and cotton. The cloud was hung from the ceiling using wire; inside the wooden structure we placed an Arduino and circuit that sends instructions to the LED strip. The wand was made from a stick and a plastic bottle, covered with hot glue and acrylic paint. Inside the wand we placed another Arduino and batteries, which collect and send information to the Arduino in the cloud when in proximity to it. For this project we started out with two command gestures, point down and twirl, which send different signals to the cloud. The first gesture causes the cloud to produce a thunderstorm-like effect with flashing blue and purple lights and a thunder sound, while the second causes the cloud to display a rainbow of changing colours.

For this assignment, we used two Arduino BLE devices: the one in the cloud is the central, which reads the data and generates an output, and the one in the wand is the peripheral, which collects the data and sends it along. Because the cloud is fixed in place, the user can make their way around it and cast spells from any angle, provided they are in proximity to the cloud. The cloud acts as both the technical beacon and the beacon for the magic to happen. As J.K. Rowling once said: "it is important to remember that we all have magic inside us."

Experience Video: https://clipchamp.com/watch/ZvYsHTc8RwI

How it Works Video: https://clipchamp.com/watch/92o5SPeDhna

Final Project Images: 






Development Images:



The Code: https://github.com/ichangimathprayag/Exp4_MagicInTheAir

The Network Diagram: 





Group members: Kelly Xu, Zhino Yousefi, & Prayag Ichangimath

Brains Talking


 Group members: Siyu Sun, Mehdi Farahani, Anantha Chickanayakanahalli

 “Brains by themselves tend to get noisy, especially with all the electronic devices to blame. When they are brought into proximity or closeness with other brains, a sense of synchronization becomes possible and there is scope for calmness. In this experiment, we represent this proximity through a physical installation of two objects that stand for the two brains. They are activated by changes in their positions: when moved, they play the noise of everyday sounds reverberating. These sounds come to a standstill and attain silence when the brains are brought together (touching each other). This is meant to translate the effect people have on each other.”


Project Description

Brains Talking is an interactive artwork based on the study of Bluetooth Low Energy (BLE). In this installation, we dig into the idea of consciousness in the human mind. Consciousness involves experience, cognition, feeling, imagination, and volition: we perceive and receive information from the world, and the brain is the crucial tool that makes this possible. Through the many nerve systems in our brain we sense the world – we speak, we touch, we taste and smell – and this helps us interact with it. Unconsciousness, by contrast, means having no self-directed thoughts, no awareness, and no capacity for interaction – like stones, sculptures, installations, even our technological world. So we took still acrylic, laser-cut the outline of a human brain, and used it to mimic future human brains, asking: when consciousness becomes unconscious, what can we humans do?


  1. Sounds from everyday life. Neurofeedback is a form of alternative treatment wherein brain waves are retrained through auditory or visual stimuli. Everything in our physical world can be turned into sound, and these sounds can activate our nervous system to produce feedback. So we thought about mimicking a virtual environment that can interact with future brains. The sounds in this audio track come from original physical sources: a moving elevator, boiling water, the shaking of keys, tapping a handrail, a closing door, and so on. All of these inadvertently form our communication system. In this installation, we set a defined distance that activates the sound; people use that distance to control the sound's volume, meaning the distance between each other is never far. As we all know, with the development of technology, the relationships between humans are turning into a silent world: we focus on intelligent devices and ignore the people around us. We use this idea to critique that phenomenon.


2.  Mimic the hand’s power





Experience Video




How it Works Video


Final Images







Development Images

Laser Cutting Sketch









Circuit Diagram





Processing File:


Arduino Code:






“Who’s There?”


Much like with any final assignment, we had our fair share of ups and downs. Originally, we had interpreted the assignment's use of proximity more in line with what was apparently done last year (given distancing for COVID): we had created three statues with three LEDs each that would respond to a p5.js "send message" in a fake text thread, lighting up one of the LEDs in response. The aim was to draw attention to the visual of people having a conversation while apart, as the three of us live completely spread out from each other.

We instead changed the assignment to be more representative of being together and of what happens when that is fulfilled. Given our struggles and misunderstanding with the first concept, our desire to salvage something from it led us to keep one statue with three LEDs and to shift from phone-to-Arduino to Arduino-to-Arduino communication, using central-to-peripheral communications. We decided the sculpture would be a permanent piece of decor with its LEDs lit at all times; when the peripheral Arduino is sensed via Bluetooth, an LED blinks in response to someone being within range, creating a sense of curiosity about who is coming near.


In an ideal world, with additional Arduinos, we would have enjoyed exploring how to connect multiple peripherals to the central, in some form of disguised Arduino kept on our person. For example, if we each had a "tile"-like item in our bags, the sculpture placed in the studio could blink its LEDs to signal to us (and others nearby) that someone is on their way.

*Disclaimer: given our sudden shift in concept to bring it in line with the requirements of the assignment, we used the Arduino example of central-to-peripheral communication using the BLE Sense gesture sensor, and changed the code to blink an LED in response to the central connecting to the peripheral.

Demo Video | How it Works | Code

Arduino. “Connecting Nano 33 BLE Devices over Bluetooth®”, Bagur, José. https://docs.arduino.cc/tutorials/nano-33-ble-sense/ble-device-to-device

Arduino. “How to connect with one master to 2 slaves”, Posted by vyohai, 2020. https://forum.arduino.cc/t/how-to-connect-with-one-master-to-2-slaves/670681

Arduino. “Controlling RGB LED Through Bluetooth®”, Troya, Fabricio. https://docs.arduino.cc/tutorials/nano-33-ble-sense/bluetooth

Arduino. “BLEPeripheral constructor.” https://www.arduino.cc/en/Reference/BLEPeripheralConstructor

Arduino. “central().” https://www.arduino.cc/en/Reference/BLEPeripheralCentral
