Atelier (Discovery): Colour Tracking Audio Visualizer

By Madelaine Fischer-Bernhut

Github: https://github.com/MaddieFish/Atelier–Discovery—Assignment-1

Programming Language: JavaScript (p5.js and tracking.js)

For this project, I decided to stick with p5.js, as we had been learning about its capabilities in Atelier over the past couple of weeks. JavaScript is the language I have the most experience with, and p5.js is the library I am most familiar with (I have used Processing a lot in the past). Still, I wanted to challenge myself by integrating another library alongside p5.js, so I decided to try adding a computer vision library to my project. I have always been fascinated by computer vision, so exploring it was something I was excited to do. I found and decided to use the computer vision library tracking.js, which can do colour tracking, motion tracking, face tracking, and more. I decided to focus on colour tracking.
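To give a sense of how tracking.js is set up, here is a minimal sketch of the colour-tracking piece. It is not the exact code from my repo: the `<video id="webcam">` element id is my own placeholder, and tracking.js's ColorTracker happens to have 'magenta', 'cyan', and 'yellow' registered as built-in colours that can be requested by name.

```js
// A minimal sketch of the tracking.js setup (not the exact code from my repo).
// Assumes tracking.js is loaded on the page and there is a <video id="webcam">
// element showing the camera feed.
let tracked = []; // latest detections, read later by the p5.js draw loop

// ColorTracker comes with 'magenta', 'cyan', and 'yellow' built in
const tracker = new tracking.ColorTracker(['magenta', 'cyan', 'yellow']);

tracker.on('track', (event) => {
  // Each detection is a rectangle: x, y, width, height, and the colour name
  tracked = event.data;
});

// Start pulling frames from the computer's camera
tracking.track('#webcam', tracker, { camera: true });
```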

Project Description:

The project I created is a colour tracking audio visualizer. The program takes the pixel information from a webcam/computer camera to track the presence of specific colours (magenta, cyan, and yellow). The tracking data (which includes the x and y coordinates, the tracked colour, and the dimensions of each coloured object) is stored in an array of values. Within the script, I call those values to set the parameters of the representative ellipses and of the sounds for the tracked colours. I used p5.js's oscillation capabilities (via p5.sound) to create the sound output. The tracked cyan colour controls the oscillator's frequency and the tracked yellow colour controls its amplitude. The magenta colour adds a bandpass filter over the sound and controls the filter's resonance and frequency.
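Below is a rough sketch of how that mapping could look in p5.js, building on the tracking snippet above. The specific mapping ranges (for example, pitch between 100 and 800 Hz) and the sine waveform are illustrative guesses, not the exact values from my repo.

```js
// A rough sketch of the colour-to-sound mapping, assuming the `tracked` array
// is filled by the tracking.js handler above. Mapping ranges are illustrative.
let osc, bandpass;

function setup() {
  createCanvas(640, 480);
  bandpass = new p5.BandPass();
  osc = new p5.Oscillator('sine');
  osc.disconnect();      // route the oscillator through the bandpass filter
  osc.connect(bandpass);
  osc.start();
  osc.amp(0);
}

function draw() {
  background(0);
  for (const r of tracked) {
    // Draw a representative ellipse at the centre of each tracked blob
    fill(r.color);
    ellipse(r.x + r.width / 2, r.y + r.height / 2, r.width, r.height);

    if (r.color === 'cyan') {
      osc.freq(map(r.x, 0, width, 100, 800));        // cyan -> frequency
    } else if (r.color === 'yellow') {
      osc.amp(map(r.y, 0, height, 1, 0), 0.1);       // yellow -> amplitude
    } else if (r.color === 'magenta') {
      bandpass.freq(map(r.x, 0, width, 200, 2000));  // magenta -> filter frequency
      bandpass.res(map(r.y, 0, height, 1, 15));      // ...and resonance
    }
  }
}
```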

The coloured objects I used for tracking.

Note: In the future, I would like to create a glove of sorts with colourful fingers instead of individual objects. I think it would make it easier for the user to create more controlled and finessed sounds. For example, one would only have to hold a finger down to stop tracking a particular colour, instead of having to put an object down completely while juggling the objects for the other colours.

My goal for this project was to create something that was fun to interact with. I wanted to go beyond simple mouse events, toward an interactivity that involves more than just a gesture of your fingers. Because I decided to use colour tracking and computer vision, the user can interact with the program without even touching the computer. All they need is colour to control the sound and visuals of the project.

I have always been fascinated by installations and projects that use technology to track the movements of the human body to create generative works of art or immersive experiences. My use of colour tracking in this project is just a stripped-down way of implementing movement analysis within p5.js. Originally, I thought of using a Kinect, like many of the other projects I've seen, for its built-in motion tracking abilities, but I decided against it. Instead of using the form of the human body, I wanted to use colour, because I felt it would be easier to implement with my present skills.
