Week Update 4

I have been working on the tones that are generated when someone moves through the space. My intention is for body movements to trigger tones based on the person's location within the space, and for their hand movements to control how the sound moves among the 4 speakers. I have been testing this out in the actual space. The tones are working, but I am still having some problems with the spatialization of the sounds from the hand movements. Below is the SuperCollider code that generates the tones, triggered from Processing.

Synth Definition for Tone:

// Deep Drone Synth
SynthDef(\droneDeep, { | freq, modFreq, pmIndex, panX, panZ, amp = 1.0 |

	var en = Env.linen(1, 5, 3);

	// Phase-modulated oscillator shaped by the envelope;
	// doneAction: 2 frees the synth when the envelope ends
	var src = PMOsc.ar(freq, modFreq, Line.kr(0, pmIndex, 5), 0, 0.1) * EnvGen.ar(en, doneAction: 2);
	//var out = SplayAz.ar(4, src, spread: 1, center: 0);
	//var delay = CombL.ar(src, 2.0, 2.5, 6);
	var out = src; // + delay;

	// Out.ar's first argument is the output bus index, not the signal
	Out.ar(0, Pan4.ar(out, panX, panZ, amp));
}).add;
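The PMOsc line above is phase modulation: a carrier sine whose phase is offset by a modulator sine scaled by an index, with the index ramping from 0 to pmIndex over 5 seconds (the Line.kr). As a language-neutral sketch of that relationship in Python, with arbitrary example frequencies and sample rate that are not taken from the patch (and ignoring the 0.1 amplitude scaling):

```python
import math

def pm_sample(t, carrier_hz, mod_hz, index):
    """One sample of a unit-amplitude phase-modulated sine:
    sin(2*pi*fc*t + index * sin(2*pi*fm*t))."""
    return math.sin(2 * math.pi * carrier_hz * t
                    + index * math.sin(2 * math.pi * mod_hz * t))

def index_at(t, pm_index, ramp_secs=5.0):
    """Modulation index ramping 0 -> pm_index over ramp_secs,
    like Line.kr(0, pmIndex, 5) in the SynthDef."""
    return pm_index * min(t / ramp_secs, 1.0)

# One second of samples at an example rate with example frequencies.
sample_rate = 48000
samples = [pm_sample(n / sample_rate, 150.0, 75.0,
                     index_at(n / sample_rate, 20.0))
           for n in range(sample_rate)]
```

A higher index spreads more energy into sidebands around the carrier, which is what makes the drone thicken as the Line.kr ramp progresses.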


An Open Sound Control (OSC) responder listens for messages sent from Processing and triggers a new Synth based on the body's location.
Here is the OSC listener in SuperCollider:

// Receive from Processing (trigger Drone Synth)
~oscDrone = OSCresponder(nil, '/trigDrone', { arg time, responder, msg;

	// Location of Body (first argument after the address)
	var loc = msg[1];
	//var loc = 0.5;
	var cFreq = (loc + 2.0) * 50.0;
	var mFreq = cFreq / 2.8.rand.max(0.1); // clamp the random divisor away from zero
	var pIndex = 20.100.rand; //cFreq + 0; //100.2000.rand;
	var panX = rrand(-1.0, 1.0);
	var panZ = rrand(-1.0, 1.0);
	~drone = Synth(\droneDeep, [\freq, cFreq, \modFreq, mFreq, \pmIndex, pIndex, \panX, panX, \panZ, panZ]);
}).add;
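For reference, what the listener expects on the wire is an OSC message with address /trigDrone and a single float argument (e.g. loc = 0.5 gives cFreq = (0.5 + 2.0) * 50 = 125 Hz). A minimal sketch of that encoding using only Python's standard library, following the OSC 1.0 format; the helper names are my own, and 57120 is SuperCollider's default language port:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per the OSC spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def trig_drone_packet(loc: float) -> bytes:
    """Build the /trigDrone message the responder matches:
    padded address, type tag string ',f', then one big-endian float32."""
    return (osc_pad(b"/trigDrone")
            + osc_pad(b",f")
            + struct.pack(">f", loc))

packet = trig_drone_packet(0.5)
# This could be sent with socket.sendto(packet, ("127.0.0.1", 57120));
# in practice Processing would send the same message via an OSC library.
```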


