
“Who’s There?”


Much like any final assignment, this one had its fair share of ups and downs. We had originally interpreted the assignment's use of proximity more in line with what was apparently done last year (given distance for COVID): we created three statues with three LEDs each that would respond to a P5.js "send message" button in a mock text thread, lighting up one of the LEDs in response. The goal was to draw attention to the visual of people having a conversation while apart, as the three of us live completely spread out from each other.

We instead changed the assignment to be more representative of being together and what happens when that is fulfilled. Given our misunderstanding of the first concept, our desire to salvage something from the original led us to keep one statue with three LEDs and shift from phone-to-Arduino to Arduino-to-Arduino communication, using the Central to Peripheral approach. We decided the sculpture was to be a permanent piece of decor with its LEDs lit at all times; however, when the peripheral Arduino is sensed via Bluetooth, an LED blinks in response to someone being within range, giving a sense of curiosity as to who is coming near.


In an ideal world, with additional Arduinos, we would have enjoyed exploring how to connect multiple peripherals to the central, each in the form of a disguised Arduino kept on our person. For example, if we all carried some type of Tile-like item in our bags, the sculpture placed in the studio could blink its LEDs in response, signalling to us (and others nearby) that someone is on their way.

*Disclaimer: given our sudden shift in concept, and in order to bring it in line with the requirements of the assignment, we used the Arduino example of Central to Peripheral communication with the BLE Sense gesture sensor and changed the code to blink an LED in response to the central connecting to the peripheral.
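The sculpture's behaviour boils down to one rule: LEDs on at all times, with one blinking while the central detects the peripheral. A minimal sketch of that rule, written as illustrative JavaScript rather than the actual Arduino code (the function name and blink period are my own, not from the project):

```javascript
// Illustrative sketch of the sculpture's LED rule (hypothetical helper,
// not the actual Arduino code): the LED stays lit by default, and blinks
// on a fixed period while the peripheral Arduino is in Bluetooth range.
function ledIsOn(peripheralInRange, millis, blinkPeriod = 500) {
  if (!peripheralInRange) return true; // decor mode: always lit
  // blink: on for half the period, off for the other half
  return Math.floor(millis / blinkPeriod) % 2 === 0;
}
```

On the real hardware this decision would sit inside the central's loop, with `peripheralInRange` coming from the Bluetooth connection state.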

Demo Video | How it Works | Code

Arduino. “Connecting Nano 33 BLE Devices over Bluetooth®”, Bagur, José.

Arduino. “How to connect with one master to 2 slaves”, Posted by vyohai, 2020.

Arduino. “Controlling RGB LED Through Bluetooth®”, Troya, Fabricio.

Arduino. “BLEPeripheral constructor.”

Arduino. “central().”

“9” – Desk Companion & Halloween Decoration

“9”

“9” is the ninth Stitch-Punk from Shane Acker’s 2009 movie of the same name, set in a dystopian future where the split soul of a scientist lives on through burlap-sack dolls known as Stitch-Punks. I created a replica of the protagonist some time ago as decoration, and with Halloween approaching I decided to recreate some of his known actions from the movie: his eyes flicker when the microphone picks up sound, simulating blinking; his torch lights up once the lights are turned off to guide him through the darkness, and flickers when the lights are on (the flame/light going out); and his “soul/heart” lights up when something is near. All of these interactions can be both passive and active, as he can simply react to the environment he’s in without intervention, or one can interact with the character directly. I thought this character was the perfect vessel for the experiment, as it utilizes the functions explored using the Arduino, sensors, and LEDs while staying somewhat true to the character as he is seen in the movie.

In relation to the calm technology principles this project utilizes the following:

  • Technology should require the smallest possible amount of attention. It does not need to be directly interacted with in order to work; it can react entirely to the environment around it while keeping me company atop my desk.
  • Technology should inform and create calm. As a desk companion, it provides an ascribed sense of emotional support.
  • Technology can communicate, but doesn’t need to speak. The data used by the sensors can relay several different messages given context. For example: why did the eyes light up, was there a loud noise? Why is the soul getting brighter, is something approaching? Why is the torch going out, how much light is in the room?
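The three reactions above are each simple threshold rules on a sensor reading. A sketch of that mapping in illustrative JavaScript (not the actual Arduino sketch; the sensor names and threshold values are made up for the example):

```javascript
// Hypothetical sketch of "9"'s three reactions as threshold logic
// (illustrative only, not the actual Arduino code; thresholds are
// invented for the example).
function react(micLevel, lightLevel, distanceCm) {
  return {
    eyesFlicker: micLevel > 0.3,   // blink when a sound is picked up
    torchOn: lightLevel < 200,     // light the torch once the room is dark
    // the soul brightens as something approaches within ~50 cm
    soulBrightness: distanceCm < 50
      ? Math.round(255 * (1 - distanceCm / 50))
      : 0,
  };
}
```

Each output would drive an LED: the eyes and torch as on/off pins, the soul as a PWM brightness value.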

Experience Video | How it Works | Github Code



Official “9” Merch & Poster





*Please note that I do not own any rights to the design of the Stitch-Punks/9 doll or any rights to the movie 9.

9. Acker, Shane. Focus Features, 2009.

Unknown. “1.jpeg”, Pinterest, Uploaded by Edward Lee

Unknown. “2.jpeg”, Pinterest, Uploaded by IMDB

Experiment 1: Re-learning to learn…how to learn

When I was in my first year of undergrad, I was briefly tasked with learning Processing (similar to P5.js). That was its own challenge, and I remember thinking, “well, thankfully that’s over with, I won’t have to use that again.” That was nearly six years ago, and yet here we are. My stomach sank seeing Processing and P5.js on the course outline, but onward we go. I would say I succeeded in re-learning Processing/P5.js while trying to fight off the thought that I’m not good at it. I’m not the best, nor do I think I did all that well, but as with anyone learning for the first time, tutorials and remixing code are your best friends on the journey to learning a coding language. Looking back, I would say that I “failed” at being more creative in my studies. I spent more time learning, watching tutorials over and over, and making sure the code worked than testing out newer or more unique ways to click and scroll. I relied heavily on tutorials and peers, and I found myself reusing code across all four studies because my brain could not process anything more complex at the time. When in doubt, ask for help, I suppose; either way, this was an interesting experiment in resetting my mind to think about traditional coding methods.

Study 1: Changing images, in a snap! (Click)

Starting off with the one piece of code that doesn’t work properly. A common challenge I faced across all four studies was not declaring variables and the like properly, which would leave my code running but not registering anything. I tried to go into each study writing the code as simply as I could, without using the webcam until the very end. Clicking with a mouse is one of the first functions you learn, so I attempted (somewhat successfully) to show images, “-in a snap!”. I would consider this a partial success: the console shows that it registers my fingers snapping, but for whatever reason it is not loading the photos. I have a couple of theories as to why this might be, but I have not yet found a way to fix it.
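The part that does register correctly — each snap advancing to the next image — is a small piece of logic on its own. A minimal sketch (hypothetical helper, not my actual study code):

```javascript
// Hypothetical sketch of the image-cycling logic (not my actual study
// code): each registered snap/click advances to the next image index,
// wrapping back to the first image at the end.
function nextImageIndex(currentIndex, imageCount) {
  return (currentIndex + 1) % imageCount;
}
```

In a p5.js sketch, the snap handler would update `currentIndex` with this, and `draw()` would display `images[currentIndex]` — assuming the images were loaded successfully in `preload()` in the first place.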


Present     Edit     Demo

Study 2: Head banging until you see colours (Click)

This was inspired by literal headbanging and having the blood rush to your head, in which case there is a chance you may see colours. As with the previous study, where rapid movement registers the click, I’m still not certain how to refine the code to track movement more accurately. This was also a challenge because it worked selectively depending on how I wore my hair on webcam: hair up was fine, but hair down and obscuring my face was problematic. It also registers every head movement I make as a click, so it ends up looking more like what I was attempting for scrolling, minus the gradient transition between colours. Looking back, I was trying to find a way to register my head at one point on a grid seen by my webcam and use that as the click, but the variety that comes with the rapid colour changes is actually something I enjoy about the outcome, however unintentional.
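One way to keep every small head movement from counting as a click is to require the tracked position to jump farther than a threshold between frames. A sketch of that idea (illustrative JavaScript, not my actual study code; the threshold is an invented value):

```javascript
// Hypothetical sketch of "rapid head movement counts as a click"
// (illustrative, not my actual study code). A click registers only
// when the tracked head position moves farther than a threshold
// between two frames, so slow drift is ignored.
function isHeadBangClick(prev, curr, threshold = 40) {
  const dx = curr.x - prev.x;
  const dy = curr.y - prev.y;
  return Math.hypot(dx, dy) > threshold; // distance moved this frame
}
```

With face tracking, `prev` and `curr` would be the detected head position on consecutive frames.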


Present     Edit     Demo

Study 3: Groovy Baby!  (Scroll) 

This was a challenge just based on the type of movement needed: a leg movement. I’m 5’11”, so even fitting in the frame is a struggle, much less lifting my leg for long enough for my webcam to register it. Luckily, framing it as if I’m just moving my mouse across a screen to change the gradient helped immensely. My webcam had a hard time keeping track of my full body despite the target being my knee; the skeleton would often “slip” off my body and appear behind me, which quite scared me the first time it happened, as I thought there was someone behind me until I realized what was going on. It is intended to make a gradient of green, assuming it registers my knee, which is the one part of my leg I can keep in frame.
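The gradient itself is just a mapping from the knee's vertical position in the frame to a green value, in the spirit of p5.js's `map()`. A sketch (hypothetical helper, not my actual study code):

```javascript
// Hypothetical sketch of the knee-to-gradient mapping (illustrative,
// not my actual study code). The knee's vertical position is mapped
// from [0, frameHeight] to a green channel value in [0, 255].
function kneeToGreen(kneeY, frameHeight) {
  const clamped = Math.min(Math.max(kneeY, 0), frameHeight); // stay in frame
  return Math.round((clamped / frameHeight) * 255);
}
```

In the sketch, the detected knee keypoint's `y` would feed this, and the result would set the fill colour each frame.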


Present     Edit    Demo

Study 3: Groovy Baby! (Scroll) Part 2

I think this one is a bit more of an abstract interpretation of what “scrolling” is, as well as my most successful study. I envisioned my old home desktop running Windows XP, where the volume bar was just that, a bar, but you could use the mouse’s scroll wheel to toggle the volume up or down. So I attempted to code a kind of pseudo music player where groovy arm movements change the volume. The code tracks my hand (more precisely, my wrist) and lowers the volume as I raise my hand; while sitting to code I did not want to get up into frame to check whether putting my hand down would lower the volume, so I reversed it to let myself stay seated. Unlike the previous study, my webcam had less to track (only my upper body) and did so with more accuracy.
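The reversed control described above is a one-line mapping once you remember that webcam coordinates grow downward. A sketch (hypothetical helper, not my actual study code):

```javascript
// Hypothetical sketch of the wrist-to-volume mapping (illustrative,
// not my actual study code). In webcam coordinates y grows downward,
// so raising the hand (smaller y) LOWERS the volume, matching the
// reversed behaviour described above.
function wristToVolume(wristY, frameHeight) {
  const clamped = Math.min(Math.max(wristY, 0), frameHeight); // stay in frame
  return clamped / frameHeight; // 0.0 (hand raised) .. 1.0 (hand down)
}
```

In the sketch, the detected wrist keypoint's `y` would feed this, and the result would set the sound file's volume each frame.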


Present     Edit    Demo

*You will notice across my edit links that my projects have everything sorted into folders in the sidebar; this is just a preference, as I’d rather have my assets collapsed than stacked on top of each other.

*Please note my bibliography is quite long as I am citing every resource and media I used to learn.

Unknown. “11.jpg”, Pinterest, Uploaded by Отец всея Руси.

Unknown. “12.jpg”, Pinterest, Uploaded by Lemon Nomel.

Unknown. “13.jpg”, Pinterest, Uploaded by Unknown User.

Unknown. “14.jpg”, Pinterest, Uploaded by ylino. oui.

Turbo ft. Yoko Takahashi. “A Cruel Angel’s Thesis / Eurobeat Remix”, YouTube, Uploaded by Turbo, 2019.

Dan Shiffman. “7.6: Clicking on Objects -p5.js Tutorial”, YouTube, Uploaded by The Coding Train, 2015.

Dan Shiffman. “Q&A #1: Side-Scroller in p5.js”, YouTube, Uploaded by The Coding Train, 2016.

Biomatic Studios. “Let’s make Pong! (Tutorial for beginners) #1 – p5js”, YouTube, Uploaded by One Man Army Studios.

Dan Shiffman. “ml5.js: Webcam Image Classification”, YouTube, Uploaded by The Coding Train, 2018.

Dan Shiffman. “11.1: Live Video and createCapture() – p5.js Tutorial”, YouTube, Uploaded by The Coding Train, 2016.

Kazuki Umeda. “Face detection (webcam) for p5.js coders.”, YouTube, Uploaded by Kazuki Umeda, 2021.

Dan Shiffman. “7.4: Mouse Interaction with Objects – p5.js Tutorial”, YouTube, Uploaded by The Coding Train, 2017.

Dan Shiffman. “9.12: Local Server, Text Editor, JavaScript Console – p5.js Tutorial”, YouTube, Uploaded by The Coding Train, 2016.

Dan Shiffman. “P5.js Web Editor: Uploading Media Files – p5.js Tutorial”, YouTube, Uploaded by The Coding Train, 2018.

Lauren Lee McCarthy. “Reference: p5.SoundFile”, Webpage.

Dan Shiffman. “ml5.js Pose Estimation with PoseNet”, YouTube, Uploaded by The Coding Train, 2020.

Gloria Julien. “Code Test: Removing Images [p5.js]”, Webpage, 2019.
