By Madelaine Fischer-Bernhut
The project I created is a colour-tracking audio visualizer. The program takes the pixel information from a webcam/computer's camera to track the presence of specific colours (magenta, cyan, and yellow). The tracking data (the x and y coordinates, the tracked colour, and the dimensions of each coloured object) is stored in an array. Within the script, I call those values to set the parameters for the ellipses that represent each tracked colour and for the sounds they control. I used p5.js's oscillation capabilities to create the sound output: the tracked cyan colour controls the oscillator's frequency and the tracked yellow colour controls the amplitude, while the magenta colour adds a band-pass filter over the sound and controls the filter's resonance and frequency.
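The mapping from tracked coordinates to sound parameters can be sketched in plain JavaScript. This is a minimal illustration, not the sketch's actual code: the function names, the choice of axes, and the frequency/amplitude ranges are my own assumptions; in the real program these values would come from the tracking array and feed a p5.Oscillator.

```javascript
// Linear re-mapping, equivalent to p5.js's map() helper.
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Illustrative mapping: the tracked cyan object's x position drives
// oscillator frequency, the yellow object's y position drives amplitude.
// The ranges (220-880 Hz, 0-1 amplitude) are assumptions for the sketch.
function soundParams(cyanX, yellowY, camWidth, camHeight) {
  return {
    freq: mapRange(cyanX, 0, camWidth, 220, 880),
    amp: mapRange(yellowY, camHeight, 0, 0, 1), // higher object = louder
  };
}
```

In the p5.js sketch itself, the resulting values would be passed to `osc.freq()` and `osc.amp()` each frame.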
Note: In the future, I would like to create a glove with colourful fingertips instead of individual objects. I think it would make it easier for the user to produce more controlled and finessed sounds. For example, one would only have to fold a finger down to stop tracking a particular colour, instead of having to put an object down entirely while juggling the objects for the other colours.
My goal for this project was to create something that was fun to interact with. I wanted to move past simple mouse events into an interactivity that went beyond a gesture of the fingers. Because I decided to use colour tracking and computer vision, the user can interact with the program without even touching the computer: all they need is colour to control the sound and visuals of the project.
I have always been fascinated by installations and projects that use technology to track the movements of the human body to create generative works of art or immersive experiences. My use of colour tracking is just a stripped-down way of implementing movement analysis within p5.js. Originally, I considered using a Kinect, like many of the other projects I've seen, for its built-in motion-tracking abilities, but I decided against it. Instead of tracking the form of the human body, I chose to track colour, because I felt it would be easier to implement with my present skills.
One of my inspirations was Sony's VR Move controllers, which use a mixture of colour and brightness tracking with an external camera to let the player interact spatially with the VR world.
I also looked into a number of Processing-based Kinect installations for inspiration.
Partycles – An Interactive Installation // Microsoft Kinect: https://www.youtube.com/watch?v=SEoyw8Slclc
Partycles was created in Processing and uses the movement of two people to influence a particle system of 20,000 particles. I was afraid that something similar in p5.js would be too much for the browser to handle. I wanted to create something accessible to anyone with a regular computer webcam.
Ross Flight: Kinect Body Instruments: https://www.youtube.com/watch?v=mnAb-uJYLNs
Image Documentation Gallery:
BMOREN (Get started with color tracking): https://gist.github.com/bmoren/3ff2cbc1f254092b82f12ab039fa5da2
I used BMOREN's code as the core of the program and expanded on it by adding audio interaction and visualization. His initial code only created the basis for tracking a colour, converting parts of the colour-camera code so it would be readable by p5.js. It was a very helpful reference!
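The core idea behind this kind of colour tracking can be sketched as follows. This is my own minimal illustration of the technique, not BMOREN's actual gist: it scans a flat RGBA pixel array (the format p5.js exposes via `loadPixels()`/`pixels`) and returns the position of the pixel closest to a target colour, within a tolerance threshold.

```javascript
// Find the pixel closest to a target [r, g, b] colour in a flat
// [r, g, b, a, r, g, b, a, ...] pixel array, within a distance threshold.
// Returns {x, y} of the best match, or null if the colour is not present.
function trackColour(pixels, width, target, threshold) {
  let best = null;
  let bestDist = Infinity;
  for (let i = 0; i < pixels.length; i += 4) {
    const dr = pixels[i] - target[0];
    const dg = pixels[i + 1] - target[1];
    const db = pixels[i + 2] - target[2];
    const dist = Math.sqrt(dr * dr + dg * dg + db * db); // RGB distance
    if (dist < threshold && dist < bestDist) {
      bestDist = dist;
      const idx = i / 4; // pixel index from byte index
      best = { x: idx % width, y: Math.floor(idx / width) };
    }
  }
  return best;
}
```

In the sketch, the returned coordinates for each of the three target colours (magenta, cyan, yellow) would then drive the ellipse positions and the oscillator/filter parameters.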