Daisy Max/MSP patch

Hey guys. Just thought I would post the Max/MSP patch for Daisy. The patch reads values from three proximity sensors through an Arduino running Firmata. If you are interested in MSP, or in sound and programming generally, check out the patch and just start playing! If you have any questions, I would love to share what I have learned so far. Flow-based programming could be a cool alternative for those of you who like to think more visually.

Here is a link to the patch.


M2=V (Music + Movement = Visualz)

Input: dancers' center of mass + beat-detector audio track = on-screen visuals

Source Code: https://www.dropbox.com/sh/etb2jxd98sh49yb/IrXRLIRJnS




We used the oscP5message example code by Andreas Schlegel to receive data from the Center Of Mass Server for Kinect by Nick Puckett.


From this base we added pieces of code from this example: http://www.openprocessing.org/sketch/2271, modified to create a substantially different output.


We then added code to convert the input from the Kinect into visuals based on the Kinect Server code by Nick Puckett.


Near the end of the project our awesome DF classmate Lindy Wilkins helped us out.



/*
 * Based on oscP5message by Andreas Schlegel,
 * an example that shows how to create OSC messages.
 * oscP5 website: http://www.sojamo.de/oscP5
 */

import oscP5.*;
import netP5.*;

import ddf.minim.*;
import ddf.minim.analysis.*;
import gifAnimation.*;

//import processing.net.*;

Minim minim;
AudioPlayer song;
FFT fft;

PImage[] animation;
Gif loopingGif;
Gif nonLoopingGif;

float kickSize, snareSize, hatSize;

OscP5 oscP5;
NetAddress myRemoteLocation;

int connectionPort = 5206;

float SW1 = 0;

// Person 1
String pattern1 = "/Person1";
//float x3d1;
//float y3d1;
//float z3d1;
float x2d1;
float y2d1;

float x2d1Prev;
float y2d1Prev;

float x2d2Prev;
float y2d2Prev;

float vel1;
float vel2;

// Person 2
String pattern2 = "/Person2";
//float x3d2;
//float y3d2;
//float z3d2;
float x2d2;
float y2d2;

// Persons 3-6 (unused in this version)
//String pattern3 = "/Person3";
//float x3d3;
//float y3d3;
//float z3d3;
//float x2d3;
//float y2d3;

//String pattern4 = "/Person4";
//float x3d4;
//float y3d4;
//float z3d4;
//float x2d4;
//float y2d4;

//String pattern5 = "/Person5";
//float x3d5;
//float y3d5;
//float z3d5;
//float x2d5;
//float y2d5;

//String pattern6 = "/Person6";
//float x3d6;
//float y3d6;
//float z3d6;
//float x2d6;
//float y2d6;

float dis1_2;

float SWdist1_2;
float SCvel1;
float SCvel2;

void setup() {
  //frameRate(24);
  oscP5 = new OscP5(this, connectionPort);

  minim = new Minim(this);
  song = minim.loadFile("04 Common People.mp3", 512);

  fft = new FFT(song.bufferSize(), song.sampleRate());
  loopingGif = new Gif(this, "rain.gif");
  //nonLoopingGif = new Gif(this, "rain.gif");
}

void draw() {
  // draw a vertical line that follows User 1 horizontally
  //println(y2d1);
  SW1 = int(map(y2d1, 0.0, 450.0, 10.0, 320.0));
  SCvel1 = map(vel1, 10.0, 25.0, 120.0, 255.0);
  SCvel2 = map(vel2, 10.0, 25.0, 120.0, 255.0);
  SWdist1_2 = map(dis1_2, 40.0, 120.0, 10.0, 100.0);

  // draw a horizontal line that follows User 2 vertically

  // draw a vertical line that follows User 3 horizontally
  //fill(175, 20, 40, 70);
  //rect(x2d3, 0, 40, displayHeight);

  // draw a horizontal line that follows User 4 vertically
  // draw a vertical line that follows User 5 horizontally
  // draw a horizontal line that follows User 6 vertically

  println(song.bufferSize());

  for (int i = 0; i < song.bufferSize() - 256; i++) {
    //rect(width, 900, song.left.get(i)*5000, song.left.get(i)*500);
    ellipse(width/2, -20, song.left.get(i)*2600, song.left.get(i)*800);
  }

  dis1_2 = dist(x2d1, y2d1, x2d2, y2d2);
  //println("dist 1 2: " + dis1_2);

  vel1 = dist(x2d1, y2d1, x2d1Prev, y2d1Prev);
  vel2 = dist(x2d2, y2d2, x2d2Prev, y2d2Prev);
  //println("velocity " + vel1);

  // remember current positions so next frame's velocities are per-frame
  x2d1Prev = x2d1;
  y2d1Prev = y2d1;
  x2d2Prev = x2d2;
  y2d2Prev = y2d2;
}

/* incoming osc messages are forwarded to the oscEvent method. */
void oscEvent(OscMessage theOscMessage) {
  // each /PersonN message carries (x3d, y3d, z3d, x2d, y2d)
  if (theOscMessage.checkAddrPattern(pattern1) == true) {
    x2d1 = theOscMessage.get(3).intValue();
    y2d1 = theOscMessage.get(4).intValue();
  }

  if (theOscMessage.checkAddrPattern(pattern2) == true) {
    x2d2 = theOscMessage.get(3).intValue();
    y2d2 = theOscMessage.get(4).intValue();
  }

  //if (theOscMessage.checkAddrPattern(pattern3) == true) {
  //  x3d3 = theOscMessage.get(0).intValue();
  //  y3d3 = theOscMessage.get(1).intValue();
  //  z3d3 = theOscMessage.get(2).intValue();
  //  x2d3 = theOscMessage.get(3).intValue();
  //  y2d3 = theOscMessage.get(4).intValue();
  //}

  //if (theOscMessage.checkAddrPattern(pattern4) == true) {
  //  x3d4 = theOscMessage.get(0).intValue();
  //  y3d4 = theOscMessage.get(1).intValue();
  //  z3d4 = theOscMessage.get(2).intValue();
  //  x2d4 = theOscMessage.get(3).intValue();
  //  y2d4 = theOscMessage.get(4).intValue();
  //}

  //if (theOscMessage.checkAddrPattern(pattern5) == true) {
  //  x3d5 = theOscMessage.get(0).intValue();
  //  y3d5 = theOscMessage.get(1).intValue();
  //  z3d5 = theOscMessage.get(2).intValue();
  //  x2d5 = theOscMessage.get(3).intValue();
  //  y2d5 = theOscMessage.get(4).intValue();
  //}

  //if (theOscMessage.checkAddrPattern(pattern6) == true) {
  //  x3d6 = theOscMessage.get(0).intValue();
  //  y3d6 = theOscMessage.get(1).intValue();
  //  z3d6 = theOscMessage.get(2).intValue();
  //  x2d6 = theOscMessage.get(3).intValue();
  //  y2d6 = theOscMessage.get(4).intValue();
  //}
}

void stop() {
  // release audio resources before exiting
  song.close();
  minim.stop();
  super.stop();
}
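If you are new to Processing, the two helpers doing most of the numeric work in the sketch are map() (linear rescaling) and dist() (Euclidean distance); they drive the stroke widths and the dancer-velocity estimates. Here is a minimal plain-Java re-implementation of both, just to show the arithmetic. These are illustrative stand-ins, not the Processing originals.

```java
// Stand-in implementations of Processing's map() and dist(),
// shown only to illustrate the arithmetic used in draw().
public class SketchMath {

    // Linearly rescale value from [start1, stop1] to [start2, stop2].
    // Like Processing's map(), this does NOT clamp out-of-range input.
    static float map(float value, float start1, float stop1,
                     float start2, float stop2) {
        return start2 + (stop2 - start2) * ((value - start1) / (stop1 - start1));
    }

    // Euclidean distance between (x1, y1) and (x2, y2).
    static float dist(float x1, float y1, float x2, float y2) {
        float dx = x2 - x1;
        float dy = y2 - y1;
        return (float) Math.sqrt(dx * dx + dy * dy);
    }

    public static void main(String[] args) {
        // A dancer at y = 225 (mid-range) maps to a mid-range stroke width.
        System.out.println(map(225.0f, 0.0f, 450.0f, 10.0f, 320.0f)); // 165.0
        // Two dancers offset by (3, 4) are 5 px apart.
        System.out.println(dist(0, 0, 3, 4)); // 5.0
    }
}
```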



We used code samples from the following at various points:







Cheers to our lovely dancers! We are the night. Let’s take over the world~

Mazi, Hank, Torin



The Drawing Board: Initial Prototype & Final Product

Our initial prototype consisted of the skeleton-tracking Processing code we were given by Nick, modified to track the user's hands: the right hand draws, while the y and x axes of the left hand control stroke width and colour, respectively.

Click here to download the processing sketch.

The final execution was built in Adobe AIR/AS3; it used the AirKinect extension to gather the depth-camera data and a UDP socket to transfer it over the network.
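The pattern of shipping tracked positions over UDP is worth a quick sketch. The actual Drawing Board code is AS3, so this is only a hypothetical Java equivalent of the idea; the payload layout (two floats), host, and port number are all invented for illustration.

```java
// Hypothetical sketch of sending one tracked hand position over UDP,
// in the spirit of the Drawing Board's AirKinect-to-display link.
// Payload format, host, and port are invented for illustration.
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class HandPositionSender {

    // Pack one hand position (x, y) into an 8-byte big-endian payload.
    static byte[] pack(float x, float y) {
        ByteBuffer buf = ByteBuffer.allocate(8);
        buf.putFloat(x);
        buf.putFloat(y);
        return buf.array();
    }

    public static void main(String[] args) throws Exception {
        byte[] payload = pack(320.5f, 240.0f);
        try (DatagramSocket socket = new DatagramSocket()) {
            // UDP is fire-and-forget: no connection, no delivery guarantee,
            // which is fine for a stream of rapidly refreshed positions.
            DatagramPacket packet = new DatagramPacket(
                payload, payload.length,
                InetAddress.getByName("127.0.0.1"), 9000); // invented port
            socket.send(packet);
        }
        System.out.println("sent " + payload.length + " bytes"); // sent 8 bytes
    }
}
```

Dropped packets simply mean the receiver draws from the next position that arrives, which is why UDP suits this kind of real-time tracking data better than TCP.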

Chop Shop

Chop Shop (Heather Phenix, Jeremy Littler and Mitzi Martinez) – OCAD DFI, October 2012



Chop Shop Logo

Chop Shop is a multi-player, screen-based, sensor-controlled game that speaks to the collaborative and competitive physicality of our work environment. Introducing a playful, tactile game with multi-player capabilities is a way to facilitate interaction between DFI workspace dwellers and potentially increase productivity; single-player mode is also possible. Short, frequent breaks from work are known to get ideas flowing with fresh perspective, and this is the main intention behind Chop Shop.

The interaction comes from a series of proximity sensor boards connected to a central Arduino Uno and a projector. The fluctuating proximity of players' hands over the sensor boards drives the on-screen Chop Shop icons toward a shared target. The illustrated floating body parts projected on the screen reference the senses involved in the human design process. The idea is to provide an opportunity to break free from the typical static sitting positions of digital workflows and to increase productivity by moving around and having fun in a collaborative environment.
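The core movement rule, where a closer hand advances the icon faster, can be sketched in a few lines. This is a hypothetical reconstruction, not the team's actual code: the sensor range, step size, and method names are invented for illustration.

```java
// Hypothetical sketch of the Chop Shop movement rule: the closer a
// player's hand is to the proximity sensor, the larger the step the
// on-screen icon takes toward the target. All numbers are invented.
public class ChopShopStep {

    static final int SENSOR_MAX = 1023; // typical Arduino analogRead range
    static final float MAX_STEP = 8.0f; // fastest icon advance per frame

    // Convert a raw proximity reading (0 = hand touching the sensor,
    // SENSOR_MAX = hand far away) into a per-frame step toward the target.
    static float stepFor(int rawReading) {
        float closeness = 1.0f - (float) rawReading / SENSOR_MAX;
        return MAX_STEP * closeness;
    }

    // Advance the icon toward targetX, never overshooting it.
    static float advance(float iconX, float targetX, int rawReading) {
        return Math.min(iconX + stepFor(rawReading), targetX);
    }

    public static void main(String[] args) {
        // A hand held right at the sensor (reading 0) moves at full speed.
        System.out.println(advance(0f, 100f, 0));    // 8.0
        // A distant hand (reading 1023) does not move the icon at all.
        System.out.println(advance(8f, 100f, 1023)); // 8.0
    }
}
```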




Initial Concept


Early Wiring Test



Assembly Test: It Worked!


Final Testing and Development (photographs by Mitzi)


Continue reading Chop Shop

‘Colour Me!’, the Pollen-picking Game

“Hello” from Peggy, John and Hudson. We’d like to introduce you to our new game, ‘Colour Me!’ ‘Colour Me!’ is a two-player, Kinect-based game.

The game world consists of a stick figure in a landscape of simple shapes and colours. The figure’s chest is round and contains a grid of 25 empty circles. This grid represents the task ahead and player progress. As the player brings flowers to her chest, their orange pollen is incrementally transferred to the grid. As pollen storage grows, the grid becomes more orange: hence the game’s name.

The first player must pick at least 20 flowers before the sun sets. If she doesn’t, a monster rises up to devour her. But getting and keeping the flowers is no easy task: the player is hindered by the random appearance of stinging bees, of poisonous flowers and by the effects of wind. Each of these elements, unless avoided or defeated, has a subtractive effect on the amount of stored pollen.

The bees are the most dangerous of all: it is the task of the second player to defend the first player by using wind to prevent them from stinging her.

The game uses Arduino and Processing to communicate with two actual, motorized fans set on either side of the playing area. The second player has been deliberately set behind and out-of-view of the game’s projection: he can only act according to instructions vocalized by the first player. This arrangement encourages loud, clear communication and teamwork between players. You can’t win on your own!
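The rules above reduce to simple bookkeeping: a 25-slot grid, additions from flowers, subtractions from bees, poison, and wind, and a win check at sunset. Here is a hypothetical Java model of that state; the class, penalty amounts, and method names are invented, and the real game's numbers may differ.

```java
// Hypothetical model of the 'Colour Me!' pollen grid: 25 slots, flowers
// add pollen, hazards subtract it, and the player wins if at least 20
// slots are filled by sunset. Penalty sizes are illustrative only.
public class PollenGrid {
    static final int SLOTS = 25;          // circles on the figure's chest
    static final int WIN_THRESHOLD = 20;  // flowers needed before sunset

    private int filled = 0;

    // Each delivered flower fills one more circle, up to the grid size.
    void pickFlower() { filled = Math.min(SLOTS, filled + 1); }

    // Bees, poisonous flowers, and wind each drain some stored pollen.
    void loseToHazard(int amount) { filled = Math.max(0, filled - amount); }

    boolean wonAtSunset() { return filled >= WIN_THRESHOLD; }
    int filled() { return filled; }

    public static void main(String[] args) {
        PollenGrid grid = new PollenGrid();
        for (int i = 0; i < 22; i++) grid.pickFlower(); // 22 flowers gathered
        grid.loseToHazard(3);                           // one bee sting
        System.out.println(grid.filled());       // 19
        System.out.println(grid.wonAtSunset());  // false: one sting too many
    }
}
```

The clamping in both methods matters: pollen can never go negative or exceed the 25 visible circles, so the on-screen grid always matches the stored state.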

This is a fun, physical game and we hope it piques your interest!

Please enjoy our final video:

Our code is available here:


‘KRF_kinectSkeletonServer_v1_4’ is the Processing sketch that serves up the Kinect data.

‘Fans_Rev5’ is the Arduino sketch which runs the fans and provides the fan activity data to our game.

‘Skeleton_Maker_Rev15’ is the Processing sketch of our game.  It contains many tabs to keep things organized and house the different game assets such as the bee, monster, and character body.

Here’s some early concept work that led to our final game:

Concept drawing by John


Concept sketch by Hudson

Concept drawing by Peggy


Here is a video of the early bee prototype from Peggy:

Here is a video of Hudson’s early skeleton generation code:

Here is a video of John’s early wind control mechanism:

More fan motor action:




Project 2: Dais/E

Here is our video of Borxu, Jaxson and Andrew’s Project 2 submission:

Daisy (or Dais/E?)


Our supporting paperwork is attached.



And here’s the demo in class!


Memory Room: Yuxi, Ryan & Cris

It’s impossible to move, to live, to operate at any level without leaving traces, bits, seemingly meaningless fragments of personal information.

– William Gibson

We were interested in creating a project that allows a user to engage with a person in another time and place.  By leaving a trace behind for others or engaging in the traces left behind for us, we become aware of the presence of the other even in their absence.

We wanted to use the Kinect to track an individual in a given space and record his position after he settled into the room. After he left, a projection of his image would appear in the exact spot he had just vacated and remain for the same length of time he had stayed there before fading away. The projection acts as a virtual trace left behind in the places we have visited.
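The trace behaviour described above, where dwell time becomes projection lifetime, boils down to a pair of timestamps. Here is a hypothetical sketch of that timing rule; the class name, linear fade, and structure are our invention, not the project's actual code.

```java
// Hypothetical timing model for the Memory Room trace: a visitor's
// dwell time at a spot becomes the lifetime of the projected "ghost",
// whose opacity fades linearly to zero over that same duration.
public class MemoryTrace {
    final long dwellMillis;   // how long the visitor stayed at the spot
    final long departureTime; // when they left, in milliseconds

    MemoryTrace(long arrivalTime, long departureTime) {
        this.dwellMillis = departureTime - arrivalTime;
        this.departureTime = departureTime;
    }

    // Opacity in [0, 1]: full at the moment of departure,
    // fully faded once dwellMillis have elapsed since then.
    float opacityAt(long now) {
        if (dwellMillis <= 0) return 0f;
        float remaining = 1f - (float) (now - departureTime) / dwellMillis;
        return Math.max(0f, Math.min(1f, remaining));
    }

    public static void main(String[] args) {
        // Visitor arrives at t = 0 ms and leaves at t = 10000 ms,
        // so the ghost lives for exactly 10 seconds after departure.
        MemoryTrace trace = new MemoryTrace(0, 10_000);
        System.out.println(trace.opacityAt(10_000)); // 1.0 at departure
        System.out.println(trace.opacityAt(15_000)); // 0.5 halfway through
        System.out.println(trace.opacityAt(20_000)); // 0.0 fully faded
    }
}
```

A longer stay leaves a longer-lived ghost, which is exactly the symmetry the concept calls for.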



Continue reading Memory Room: Yuxi, Ryan & Cris

The Drawing Board: Aesthetics

Below is a compilation of the various brushes created by our in-house illustrator Yifat! Reflecting her “cute drawing phase”, these lovely creatures allow you to create unique collages with a choice of several backgrounds and canvas options.