
Experiment 3: cronoScroll

 

screenshot-2022-04-06-at-12-57-10

cronoScroll is a tangible interface that allows a user to chronologically scroll through a museum’s archive. This type of navigation lets the user explore the relationship between time and the artworks while observing their historical connections and the gradual evolution of various art forms.

The interaction is driven by an ultrasonic distance sensor and a user-controlled draggable block: the art timeline advances as the block is scrubbed closer to the control box fitted with the sensor.

This project was created using an Arduino Uno connected to an ultrasonic distance sensor. Sensor values are fed over a serial connection to Processing, where the visual output is created.

Particular attention was paid to keeping the interactions fluid, with smooth animations instead of jerky state transitions. The interface animation uses linear interpolation (lerp) to ease between state changes, and lerp is also used to smooth out the noisy sensor readings and prevent jumpy values.
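As a rough illustration, here is how that lerp smoothing might look. The sketch below is written in p5.js so it can run on its own (Processing uses the same lerp() call); the noisy distance value is simulated with noise() as a stand-in for the real serial reading.

  // A minimal sketch of lerp-based smoothing, shown in p5.js.
  // In the project the raw value arrives from the Arduino over serial;
  // here it is simulated with noise() so the sketch runs on its own.
  let smoothed = 0;

  function setup() {
    createCanvas(600, 200);
  }

  function draw() {
    background(240);
    // Stand-in for a noisy ultrasonic reading in the 0-400 cm range
    const raw = noise(frameCount * 0.02) * 400 + random(-20, 20);

    // Move only 10% of the way toward the new reading each frame,
    // filtering out jitter while still tracking real movement
    smoothed = lerp(smoothed, raw, 0.1);

    // Map the smoothed distance onto a horizontal timeline position
    const x = map(smoothed, 0, 400, 0, width);
    circle(x, height / 2, 20);
  }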

Images used are from the Getty Search Gateway and are part of the Getty Open Content Program, which exists to “share images of works of art in an unrestricted manner, freely, so that all those who create or appreciate art will have greater access to high-quality digital images for their studies and projects.”

 

Microcontroller:

  • Arduino Uno

Input:

  • Ultrasonic Distance Sensor

Output:

  • Processing

Experience Video: https://youtu.be/GlS72-mmf04

How It Works: https://youtu.be/a5MLehEjEl0

Arduino Github Code: https://github.com/DemiladeOla/crono-scroll/tree/main/arduino/sensor

Processing Github Code: https://github.com/DemiladeOla/crono-scroll/tree/main/processing/art-scroller
screenshot-2022-04-06-at-14-22-04
screenshot-2022-04-06-at-14-19-56
screenshot-2022-04-06-at-14-26-14

screenshot-2022-04-06-at-14-17-01

Experiment 4: Call & Response

screenshot-2022-02-07-at-12-23-33

This project was created in response to the extended periods of physical distancing we have faced under various pandemic-related conditions. Even as we slowly return to shared spaces, we still have to keep some distance and cover our faces with masks, making communication harder than it should be.

Call & Response is a project that allows two people in a shared space to communicate through musical messaging, sending synth phrases back and forth in response to each other and creating an ongoing sonic conversation between the two participants.

Call & Response is inspired by the musical technique of the same name, in which one performer plays a sonic phrase and a second performer, or the audience, replies with an answering phrase.

Using sound as a messaging mechanism lets us bypass the language barriers of traditional communication tools, while also working within COVID restrictions by removing the need for verbal communication through a face mask.

This project uses a phone-to-phone interaction, with the sonic messaging app developed in p5.js. Participants scan a QR code present in the shared space, which opens the p5.js app link. The real-time connection between the participants’ phones is created using p5.party, a library for building multiplayer apps with p5.js. The synth sounds are generated using Tone.js, a JavaScript library for creating interactive sounds in the browser using the Web Audio API.
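A hedged sketch of how the messaging loop might fit together is below. The server URL, room names, phrase format, and note values are illustrative guesses based on the public p5.party and Tone.js examples, not the project’s actual code, and partyConnect’s arguments vary slightly between library versions.

  // Illustrative sketch: a p5.party shared object carries the latest
  // phrase, and each phone plays phrases sent by the other participant.
  let shared;
  const synth = new Tone.Synth().toDestination();
  const myId = Math.random().toString(36).slice(2); // crude per-phone id
  let lastPlayed = 0;

  function preload() {
    // Hypothetical server/app/room names
    partyConnect("wss://demoserver.p5party.org", "call-and-response", "main");
    shared = partyLoadShared("shared", { phrase: [], count: 0, sender: "" });
  }

  function setup() {
    createCanvas(400, 400);
  }

  function draw() {
    background(20);
    // Play any phrase we haven't heard yet, unless we sent it ourselves
    if (shared.count > lastPlayed) {
      if (shared.sender !== myId) playPhrase(shared.phrase);
      lastPlayed = shared.count;
    }
  }

  function mousePressed() {
    Tone.start(); // browsers only allow audio to start from a user gesture
    // Send a short response phrase; the notes here are placeholders
    shared.phrase = ["C4", "E4", "G4"];
    shared.sender = myId;
    shared.count += 1;
  }

  function playPhrase(notes) {
    const now = Tone.now();
    notes.forEach((note, i) => {
      synth.triggerAttackRelease(note, "8n", now + i * 0.25);
    });
  }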

 

Experience Video: https://youtu.be/SF0W1fUtb-E

How It Works: https://youtu.be/VoyMg0WICSA

Link to Code on p5.js: https://editor.p5js.org/demilade/sketches/8NYLrFP6a

Link to Code on Github: https://github.com/DemiladeOla/call-and-response

 

screenshot-2022-02-07-at-12-32-45

screenshot-2022-02-04-at-21-16-53

 

Experiment 2: bedsideBox


screenshot-2022-02-01-at-22-41-02

bedsideBox is a room ambience control device that sets a room’s music based on user presence, user proximity and room lighting.

The music is controlled by connecting an Arduino Uno to a simple webpage, which plays embedded YouTube music videos based on the status of the sensors. The code for the connection is based on a tutorial video from Adam Thomas. A rough sketch of the four-mode logic appears after the list below.

Mode 1 – Inactive: No active user detected in room & Room is bright

  • Quiet ambient music plays in the room
  • Front LEDs stay solid blue
  • Bedside LEDs stay off

Mode 2 – Active: Active user detected in room & Room is bright

  • Lo-fi music plays in the room to soundtrack activities
  • Front LEDs blink blue to indicate the user’s presence
    Calm Technology: “Technology should require the smallest possible amount of attention”
    1. Create ambient awareness through different senses.
    2. Communicate information without taking the wearer out of their environment or task.
  • Bedside LEDs stay off

Mode 3 – Bedside: Active user is on the bed

  • Bedtime/reading music plays in the room
  • Front LEDs go off
  • Bedside LEDs come on to indicate the user’s presence
    Calm Technology: “Technology can communicate, but doesn’t need to speak”

Mode 4 – Sleep: Room is dark

  • Music stops playing
  • Front LEDs go off
  • Bedside LEDs stay on
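As referenced above, here is a rough sketch of the four-mode decision, written in JavaScript as if the webpage received the raw sensor readings directly. The field names and thresholds are hypothetical; the real logic lives in the Arduino sketch linked below.

  // Hypothetical mode selection from the three sensors; names and
  // thresholds are illustrative, not the project's actual values.
  function pickMode({ motionDetected, distanceCm, lightLevel }) {
    const DARK = 200;    // photoresistor reading below this = dark room
    const BEDSIDE = 50;  // ultrasonic distance below this = user on the bed

    if (lightLevel < DARK) return 4;    // Sleep: room is dark
    if (distanceCm < BEDSIDE) return 3; // Bedside: user is on the bed
    if (motionDetected) return 2;       // Active: movement in a lit room
    return 1;                           // Inactive: lit room, nobody around
  }

  // Example: lights on, no motion, user right beside the bed sensor
  console.log(pickMode({ motionDetected: false, distanceCm: 30, lightLevel: 600 })); // 3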

 

Microcontroller:

  • Arduino Uno

Sensors:

  • PIR Motion Sensor
  • Ultrasonic Distance Sensor
  • LDR/Photoresistor

Actuators:

  • LEDs

 

Experience Video: https://youtu.be/nqJ9JKSrnsc

How It Works: https://youtu.be/w284vYH7nLw

Arduino Github Code: https://github.com/DemiladeOla/sensitive-objects

screenshot-2022-02-01-at-22-42-45
screenshot-2022-02-01-at-22-42-56
screenshot-2022-02-01-at-22-43-10
screenshot-2022-02-04-at-17-19-17

Experiment 1: Body As Controller

Internet Attention

My series of studies was centered around the internet, the web, social media, and our relationships & experiences with these media. Since this experiment deals with using our bodies as controllers, my focus was on visualizing the relationships we already have with these platforms and on creating friction in these technologies where necessary. That friction makes us more active participants in our relations with these platforms, taking back some control from the algorithms designed to keep us mindlessly scrolling, clicking, and consuming.

Scroll 1 – doomScroll

screenshot-2022-02-01-at-22-21-28

For my first scroll, I thought about our relationship with endless feeds and how we’ve grown accustomed to scrolling them for hours on end, even when we might not want to.
Using PoseNet, I prototyped a scenario where the user has to carefully hover their left hand over directional arrows to scroll. This interaction adds friction to the process, limiting the time spent scrolling and making us think more intentionally about interacting with these feeds.
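A hedged sketch of the hover-to-scroll check, assuming ml5.js supplies the PoseNet left-wrist keypoint; the arrow positions, radii, and scroll amounts below are invented for illustration.

  // Sketch of hover-to-scroll using ml5.js PoseNet; zone positions
  // and scroll increments are illustrative values.
  let video, poseNet, pose;
  let feedOffset = 0;

  function setup() {
    createCanvas(640, 480);
    video = createCapture(VIDEO);
    video.hide();
    poseNet = ml5.poseNet(video);
    poseNet.on("pose", (results) => {
      if (results.length > 0) pose = results[0].pose;
    });
  }

  function draw() {
    background(255);
    // Hypothetical arrow zones at the top and bottom of the canvas
    const upArrow = { x: width / 2, y: 60, r: 40 };
    const downArrow = { x: width / 2, y: height - 60, r: 40 };

    if (pose && pose.leftWrist.confidence > 0.5) {
      const w = pose.leftWrist;
      // Hovering the left hand over an arrow nudges the feed that way
      if (dist(w.x, w.y, upArrow.x, upArrow.y) < upArrow.r) feedOffset += 5;
      if (dist(w.x, w.y, downArrow.x, downArrow.y) < downArrow.r) feedOffset -= 5;
    }

    circle(upArrow.x, upArrow.y, upArrow.r * 2);
    circle(downArrow.x, downArrow.y, downArrow.r * 2);
    text("feed offset: " + feedOffset.toFixed(0), 20, 20);
  }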

Present Link: https://editor.p5js.org/demilade/full/Ul9HVzur1

Edit Link: https://editor.p5js.org/demilade/sketches/Ul9HVzur1

Interaction Video: https://youtu.be/N67gJCCAJpY

 

Click 1 – peskyPopups

screenshot-2022-02-01-at-22-26-03

You’ve probably been to a website where you were assaulted by one popup after another, seemingly unending, all preventing you from accessing the content you came for. My first click study is a mini game where the user has to hover their left hand over the popup buttons and ‘clap to click’ them closed before the screen is overrun with popups and the health bar turns red. The whole process is reminiscent of clapping at pesky insects invading our personal space, reflecting the similar emotions pesky popups evoke.
I use PoseNet to track the user’s wrists and the distance between them to simulate a clap click.
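The clap itself reduces to a distance test between the two wrist keypoints. A hedged sketch, meant to run inside a p5.js draw loop, where the threshold and the popup-closing helper are invented:

  // 'Clap to click' as a distance test between the wrists; the threshold
  // and debounce behaviour are illustrative guesses.
  let wasApart = true;

  function clapDetected(leftWrist, rightWrist, threshold = 60) {
    const d = dist(leftWrist.x, leftWrist.y, rightWrist.x, rightWrist.y);
    if (d < threshold && wasApart) {
      wasApart = false; // fire once per clap, not on every frame
      return true;
    }
    if (d >= threshold) wasApart = true; // wrists separated: re-arm
    return false;
  }

  // Usage inside draw(), with `pose` from PoseNet as in the earlier sketch;
  // closePopupUnderHand() is a hypothetical helper, not the project's code:
  // if (clapDetected(pose.leftWrist, pose.rightWrist)) closePopupUnderHand();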

Present Link: https://editor.p5js.org/demilade/full/SusMA6Svh

Edit Link: https://editor.p5js.org/demilade/sketches/SusMA6Svh

Interaction Video: https://youtu.be/rZBlnjZtTqM

 

Click 2 – clickSwarm

screenshot-2022-02-01-at-22-28-41

This is a visualization of how everything online is in a constant battle for our attention. Every corner of the internet is peppered with calls to action, begging us to click and hand over more of our engagement.
For this study I use PoseNet to track the user’s face via the nose. The cursor follows the nose, and the swarm cursors within a certain threshold ‘activate’ and follow you, no matter where you go. The user has to blast an imaginary fireball by bringing their wrists together, kamehameha-style, to briefly push the cursors away, but everything soon reverts, showing how we only escape this battle for attention for short stretches of time.
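One way to sketch the ‘cursors activate and chase the nose’ behaviour is below; the cursor count, activation radius, and chase speed are invented, and the mouse stands in for PoseNet’s nose keypoint so the sketch runs on its own.

  // Sketch of swarm cursors that activate within a radius of the tracked
  // nose and chase it; radius, speed, and cursor count are invented values.
  const cursors = [];
  let nose = { x: 320, y: 240 };

  function setup() {
    createCanvas(640, 480);
    for (let i = 0; i < 40; i++) {
      cursors.push({ x: random(width), y: random(height), active: false });
    }
  }

  function draw() {
    background(255);
    nose = { x: mouseX, y: mouseY }; // stand-in for pose.nose
    for (const c of cursors) {
      // Cursors inside the threshold switch on and keep following
      if (dist(c.x, c.y, nose.x, nose.y) < 150) c.active = true;
      if (c.active) {
        // Step a fixed fraction of the way toward the nose each frame
        c.x = lerp(c.x, nose.x, 0.05);
        c.y = lerp(c.y, nose.y, 0.05);
      }
      circle(c.x, c.y, 10);
    }
  }

In this framing, the fireball would simply flip every cursor’s active flag back to false and scatter them, which is why the reprieve is only temporary.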

Present Link: https://editor.p5js.org/demilade/full/J1UxhwUL6

Edit Link: https://editor.p5js.org/demilade/sketches/J1UxhwUL6

Interaction Video: https://youtu.be/s_OZ0YM1F5E

 

Scroll 2 – speedScroll

screenshot-2022-02-01-at-22-31-00

Building on doomScroll, this scrolling experience uses two hands instead of one. The user’s left hand hovers over the direction they would like to scroll, and the right hand controls the scroll speed based on the intensity of a vertical hand wipe.
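A hedged sketch of turning the wipe into a speed value: the scroll speed follows the right wrist’s frame-to-frame vertical velocity, smoothed with lerp as in cronoScroll. The scaling factor and smoothing amount are guesses, and this is meant to run inside a p5.js draw loop.

  // Derive scroll speed from the intensity of a vertical hand wipe.
  let prevY = null;
  let speed = 0;

  function wipeSpeed(rightWrist) {
    if (prevY === null) prevY = rightWrist.y;
    const velocity = abs(rightWrist.y - prevY); // pixels moved this frame
    prevY = rightWrist.y;
    // Smooth the raw velocity so the scroll doesn't stutter
    speed = lerp(speed, velocity * 0.5, 0.2);
    return speed;
  }

  // Usage inside draw(): direction from the left hand (as in doomScroll),
  // magnitude from the right:
  // feedOffset += scrollDirection * wipeSpeed(pose.rightWrist);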

Present Link: https://editor.p5js.org/demilade/full/zP7C_Cbh8

Edit Link: https://editor.p5js.org/demilade/sketches/zP7C_Cbh8

Interaction Video: https://youtu.be/9qX_5uFL2vM

 

 
