Atelier (Discovery): Colour Tracking Audio Visualizer

By Madelaine Fischer-Bernhut


Programming Language: JavaScript (p5.js and tracking.js)

For this project, I decided to stick with p5.js, as we had been learning about its capabilities in Atelier these past couple of weeks. JavaScript, in general, is the language I have the most experience in, and p5.js is the library I am most familiar with (I have used Processing a lot in the past). Still, I wanted to challenge myself by integrating another library alongside p5.js, so I decided to try adding a computer vision library to my project. I have always been fascinated by computer vision, so exploring it was something I was excited to do. I found and decided to use the computer vision library tracking.js, which allows the user to colour track, motion track, face track, and more. I decided to focus on colour tracking.
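Conveniently, tracking.js ships with magenta, cyan, and yellow as built-in tracker colours, and new colours can be registered with a per-pixel predicate. Here is a minimal sketch of that setup (the "orange" predicate, its thresholds, and the `#webcam` element id are my own illustration, not the project's actual code):

```javascript
// A custom colour is just a function that decides, per pixel, whether
// the raw RGB values count as that colour. A rough "orange" test:
function isOrange(r, g, b) {
  return r > 200 && g > 80 && g < 180 && b < 100;
}

// tracking.js wiring (browser only, with tracking.js loaded):
if (typeof tracking !== 'undefined') {
  // Register the custom colour under the name 'orange'.
  tracking.ColorTracker.registerColor('orange', isOrange);

  // 'magenta', 'cyan', and 'yellow' are built in, so a tracker for
  // all three can be created directly.
  var tracker = new tracking.ColorTracker(['magenta', 'cyan', 'yellow']);

  // Start tracking a <video> element from the webcam (id assumed here).
  tracking.track('#webcam', tracker, { camera: true });
}
```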

Project Description:

The project I created is a colour tracking audio visualizer. The program takes the pixel information from a webcam/computer's camera to track the presence of specific colours (magenta, cyan, and yellow). The tracking data (which includes the x and y coordinates, tracked colour, and dimensions of each coloured object) is stored in an array. Within the script, I was able to use those values as parameters for the representative ellipses and sounds for the tracked colours. I used p5.js's oscillation capabilities to create the sound output: the tracked cyan colour controls the oscillator's frequency and the tracked yellow colour controls its amplitude. The magenta colour adds a bandpass filter over the sound and controls the filter's resonance and frequency.
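The mapping described above could be sketched roughly as follows. The value ranges, helper names, and the way each rectangle's centre is used are my own assumptions for illustration, not the project's exact code; each entry that tracking.js reports in `event.data` has `x`, `y`, `width`, `height`, and `color` fields:

```javascript
// Linear re-mapping, equivalent to p5's map():
function mapRange(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Horizontal position of the cyan object -> oscillator frequency in Hz.
function cyanToFreq(rect, camWidth) {
  var cx = rect.x + rect.width / 2;
  return mapRange(cx, 0, camWidth, 100, 1000);
}

// Vertical position of the yellow object -> amplitude, louder near the top.
function yellowToAmp(rect, camHeight) {
  var cy = rect.y + rect.height / 2;
  return mapRange(cy, 0, camHeight, 1, 0);
}

// In the p5 sketch itself (browser only), the values would be applied
// on each tracking event, something like:
//
//   var osc = new p5.Oscillator('sine');
//   var filter = new p5.BandPass();
//   osc.disconnect();
//   osc.connect(filter);
//   osc.start();
//
//   tracker.on('track', function (event) {
//     event.data.forEach(function (rect) {
//       if (rect.color === 'cyan')   osc.freq(cyanToFreq(rect, width));
//       if (rect.color === 'yellow') osc.amp(yellowToAmp(rect, height));
//       if (rect.color === 'magenta') {
//         filter.freq(mapRange(rect.x, 0, width, 40, 5000));
//         filter.res(mapRange(rect.y, 0, height, 15, 1));
//       }
//     });
//   });
```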

The coloured objects I used for tracking.

Note: In the future, I would like to create a glove of sorts with colourful fingertips instead of individual objects. I think it would make it easier for the user to create more controlled and finessed sounds. For example, one would only have to hold a finger down if they did not want a particular colour tracked, instead of having to put an object down completely while juggling the objects for the other colours.

My goal for this project was to create something that was fun to interact with. I wanted to move past simple mouse events and into an interactivity that went beyond just a gesture of your fingers. Because I decided to use colour tracking and computer vision, the user can interact with the program without even touching the computer. All they need is colour to control the sound and visuals of the project.

I have always been fascinated by installations and projects that use technology to track the movements of the human body to create generative works of art or immersive experiences. My use of colour tracking in this project is just a stripped-down way of implementing movement analysis within p5.js. Originally, I thought of using a Kinect, like many of the other projects I've seen, for its built-in motion tracking abilities, but I decided against it. Instead of using the form of the human body, I wanted to use colour because I felt it would be easier to implement with my present skills.

One of my inspirations was Sony's VR Move controllers, which use a mixture of colour and brightness tracking with an external camera to allow the player to spatially interact with the VR world.


I also delved into a lot of Processing-based Kinect installations for inspiration.

Partycles – An Interactive Installation // Microsoft Kinect:


Partycles was created using Processing and utilized the movement of two people to influence a particle system of 20,000 particles. I was afraid that if I did something similar with p5.js it would be too much for the browser to handle. I wanted to create something that would be accessible to anyone with a regular computer webcam.

Ross Flight: Kinect Body Instruments:

Both Ross Flight's Kinect Body Instruments and the V Motion Project use motion to create music and turn the body into a musical instrument. After getting the colour tracking to work, I knew I wanted to add something else to make my project more dynamic. Before I even settled on colour tracking, I had wanted to create an audio visualizer of some sort, but decided against it when I realized that an audio visualizer would probably be a go-to project for most people. It was only after I started researching motion capture and came across installations and performances like the two above that I realized I could use colour tracking to create audio alongside visualizing that audio. The visuals impact the audio and the audio impacts the visuals. I was sold on the idea immediately.


Videos: (I was not able to embed the videos, so they must be downloaded)

Demo Video Colour Tracking Audio Visualizer 1 (Webcam)

Demo Video Colour Tracking Audio Visualizer 2 (Grey)


Image Documentation Gallery:

Click on thumbnails for more information and description.

Code References: 

BMOREN (Get started with color tracking)-

Colour Camera-

I used BMOREN's code as the core of the program and expanded beyond it by adding audio interaction and visualization. The initial code only created the basis for tracking colour, by converting parts of the colour camera code so it would be readable by p5.js. It was a very helpful reference!

