Sound + Vision: Anne, Nav and Jess Final Project

Our concept evolved significantly since its inception based on a number of factors, the most important being the feedback we received throughout the process. The largest criticism of our original idea was that the experience was geared towards the wearer of the camera, while the playback experience was completely passive. Based on this input, we decided to shift the concept towards a participatory experience, where users could interact with and choose moments of focus in the media they were experiencing.
As we developed our project, there was a major decision we needed to make: do we create this as a physical DJ tool, or do we lean towards a more experimental aesthetic? This decision would impact the music we selected, the visualization, and ultimately the relationship between movement and sound/image modulation.

Ultimately, we decided that we should move towards the experimental, as this would open up greater possibilities to the user and yield more interesting results. The final concept is to create sensors that can be used as a movement-based tool to generate sound and image art. The tool makes the art-making process accessible to more people and encourages users to see potential for expression in the world around them. This follows in the tradition of artists such as Alan Licht, who use strange objects to generate even stranger sounds. The difference with our tool is that the interface we created is very easy to use, though perhaps challenging to master.

See it in action below.

Read more about the project and check out our code here.

Muybridge Jump Final Project

Concept

This project started with an investigation into re-imagining early cinematic devices, and an examination of the early uses of moving images and entertainment in the context of digital media. It’s meant to investigate how the simplicity of cinematic devices generated wonder and enthusiasm, whereas in our era of over-saturation, such enthusiasm is more difficult to achieve. I drew an idea map that shows the process of defining what could be done technically, and how this might lend itself to the theme. Research into early cinematic devices helped identify a specific technology and point of reference: Eadweard Muybridge’s early high-speed photography. The intention of one of Muybridge’s early experiments was to determine whether horses’ feet left the ground while galloping. I immediately recognized a connection with a continued fascination with this very same idea; people are fascinated by capturing themselves in mid-air in photographs (as evidenced on blogs, etc.). Creating a Muybridge-based image matrix of non-gravitational imagery of participants would achieve not only an interactive experience, but would provide a comparison to what audiences of early cinema may have experienced. At a very basic level, it might demonstrate that simple technologies and physical participation are underestimated in a large proportion of entertainment.

Process

The original outline to create the project was:

-display images in a Muybridge-based grid. My first reference was this Processing code, which was a capture display only and did not allow for buffering or saving frames.

-set length of capture and loop

-capture to buffer  or folder – triggered by sensor

-indicate the start of capture with sound or light

-buffer replays mixture of captured videos

In terms of the Processing grid, my reference example only dealt with live capture. It took some research into the possibilities of capturing, saving, and displaying images. I wanted to modify this grid idea to have a captureEvent or serialEvent that saved images to a folder, and allowed random playback from a large pool of images.
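As a rough illustration of the direction I wanted (not the final code, which appears further down), a minimal Processing sketch along these lines could save each incoming frame to a folder and pull random indices back out for playback. The folder name and frame cap here are placeholders:

import processing.video.*;

Capture cam;
int saved = 0;
int maxFrames = 100;   // placeholder cap on the image pool

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
}

void captureEvent(Capture c) {
  c.read();
  if (saved < maxFrames) {
    c.save("pool/frame" + nf(saved, 3) + ".jpg");   // write the frame into a folder
    saved++;
  }
}

void draw() {
  if (saved > 0) {
    // playback: show a randomly chosen saved frame each cycle
    // (re-loading from disk each frame is slow; the final code below loads the pool into a PImage array instead)
    PImage img = loadImage("pool/frame" + nf(int(random(saved)), 3) + ".jpg");
    if (img != null) image(img, 0, 0);
  }
}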

Initially I wanted to capture only images of participants in mid-air, and have non-gravitational moments from many participants all play back randomly in a grid. This appeared to be the best illustration of my concept. As I explored the possibilities further, I also considered capturing the sequence of frames of a jump, from ground to mid-air to landing. The issue of how to isolate the beginning of the jump from the participants’ presence onscreen, and isolating their vertical movement with sensors, was a trial and error process. I conducted a series of tests using Processing motion detection code, handheld accelerometers, foot-triggered pressure sensors, and suspended tilt sensors that jumpers would activate mid-air. This will require more testing, as the sensors also need to work in tandem with the Processing sketch to capture properly.

Final Jump Project 

This iteration of my project allows participants to capture the complete sequence of their jump (or other movement) and experience their image in the Muybridge matrix immediately. A simple digital push button starts the countdown and the capture of up to 100 frames, or roughly 3 seconds, to the data folder. An initial Jump Movie would be playing in an installation context and would instigate similar interactions. I’m not yet sure of the best way to isolate the non-gravitational moment, so for this phase a very direct instruction set was the best way to go. The existing jump images act as a bit of a directive, and the push button and text countdown lead the participant into usage.

Seeing the interactions in my class presentation, I now find there’s more to explore in this version of the idea. I question whether a pool of images of people in mid-air would hold the same degree of user interest; it might have a very different outcome. Depending on the participants’ movement, the grid offered a lot to explore in terms of perception of movement and tracking eye movement. The video from my class presentation is evidence that users will inevitably experiment and use the framework to explore different uses and ideas; it was a great opportunity to see how people engage with it and what the more interesting aspects of the interaction are.

Future Steps

I’d like to make the button a handheld wireless object that lends itself to jumping. The isolated jump movement is something I’d like to try, to see how the interaction changes. This may be best achieved in combination with motion detection or background subtraction written into the Processing code.
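As a starting point for that future step, a minimal frame-differencing sketch in Processing might look like the following. The threshold is a guess that would need tuning, and triggerCapture() is a placeholder for the existing countdown/mode-switching code:

import processing.video.*;

Capture cam;
PImage prev;
float threshold = 500000;   // hypothetical motion threshold; needs tuning

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  prev = createImage(640, 480, RGB);
}

void captureEvent(Capture c) {
  // keep a copy of the previous frame before reading the new one
  prev.copy(cam, 0, 0, cam.width, cam.height, 0, 0, prev.width, prev.height);
  c.read();
}

void draw() {
  image(cam, 0, 0);
  cam.loadPixels();
  prev.loadPixels();
  float diff = 0;
  for (int i = 0; i < cam.pixels.length; i++) {
    // accumulate the brightness change between consecutive frames
    diff += abs(brightness(cam.pixels[i]) - brightness(prev.pixels[i]));
  }
  if (diff > threshold) {
    triggerCapture();
  }
}

void triggerCapture() {
  // placeholder: in the real sketch this would set countDown and mode, as in serialEvent()
}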

Documentation

Digital Push Button

Arduino code

void setup() {
Serial.begin(9600);   // open the serial link to the Processing sketch
pinMode(2, INPUT);    // push button wired to digital pin 2
}

void loop() {
int sensorValue = digitalRead(2);   // 1 while the button is pressed, 0 otherwise
Serial.write(sensorValue);          // send the raw byte to Processing's serialEvent()
}

Processing Code

//MUYBRIDGE JUMP – Creation and Computation Final Project
//By Maureen Grant
//Code Assistance by Jim Ruxton, Dan Fraser, and Dustin Freeman
//based on code from Learning Processing by Daniel Shiffman

import processing.video.*;
import processing.serial.*;

Serial myPort;        // The serial port
Capture myCapture;    // The video capture

// Declare Variables
int countDown = 0;
int pictureCount=30;
int mode = 0;
int maxImages = 100; // Total # of images in jump folder
int imageIndex = 0; // Initial image to be displayed is the first

// Declare an array of images, and the grid
PImage[] images = new PImage[maxImages];
ImageGrid grid;

int jumpThreshold;  // if on, go to mode 2, saves captured images to the image array
boolean jumpCapture;
int imageCount;

void setup() {
size(640, 480, P2D);
myCapture = new Capture(this, 640, 480); //for version using iSight
//   myCapture = new Capture(this, 640, 480, "IIDC FireWire Video", 30);  //for external camera

frameRate(24);
refreshImages();

String portName = Serial.list()[0];  //initializing serial port
myPort = new Serial(this, portName, 9600);
}

void refreshImages() {    // for loop playing images in a 10 x 10 grid, offset
for (int i = 0; i < images.length; i ++ ) {
images[i] = loadImage("jump/jump" + nf(i, 3) + ".jpg"); //jump folder instead of data, filename
}
grid = new ImageGrid(images, 10, 10);
}

void draw() {

background(0);   // Modes contributed by Dustin Freeman.
if (mode == 0) {   // Mode 0 = Play Images
grid.draw(imageIndex);
// increment image index by one each cycle
// use modulo " % " to return to 0 once the end of the array is reached
imageIndex = (imageIndex + 1) % images.length;
}
if (mode == 1) {    // Mode 1 = startCapture = Capture images and save in jump folder according to naming convention
if (pictureCount<maxImages) {
image(myCapture, 0, 0);
saveFrame("jump/jump" + nf(pictureCount, 3) + ".jpg");
pictureCount++;
}
else {
refreshImages();
mode = 0;
}
}
if (mode == 2) {  //Mode 2 is the countdown, followed by start capture
countDown--;
image(myCapture, 0, 0);
PFont font;
font = loadFont("Arial-Black-100.vlw"); //need to have file in data folder
textFont(font);
fill(255, 0, 0);
text(countDown/10 + 1, width/2, height/2);  // modify speed and length of countdown with countdown/##
if (countDown == 0) {
startCapture();
}
}
}

class ImageGrid {
int xSize;
int ySize;
int xTiles;
int yTiles;
int xTile;
int yTile;
int xImagePos;
int yImagePos;
int offset;
PImage[] images;
int imageCount;

ImageGrid(PImage[] tempImages, int tempXtiles, int tempYtiles) {
images = tempImages;
xTiles = tempXtiles;
yTiles = tempYtiles;
}

void draw(int startOffset) {
xSize = 64;
ySize = 48;

imageCount = 0;
for (yTile = 0; yTile < yTiles; yTile++) {
for (xTile = 0; xTile < xTiles; xTile++) {
xImagePos = xTile * (xSize);
yImagePos = yTile * (ySize);
offset = (startOffset + imageCount) % images.length;
image(images[offset], xImagePos, yImagePos, 64, 48);
imageCount++;
}
}
}
}

void captureEvent(Capture myCapture) {
myCapture.read();
}

void mousePressed() {
countDown = 48;
mode = 2;
}

void startCapture() {
mode = 1;
pictureCount=0;
}

void serialEvent(Serial myPort) {  // simple digital button
int inByte = myPort.read();
println(inByte);
jumpThreshold=inByte;
if (jumpThreshold >= 1) {    //if button is pressed, Processing receives a 1, else is 0. If equal to or greater than 1, go to mode 2 (startCapture)
countDown = 72;
mode = 2;
}
}

final _ the code — three concepts for chimes + a bonus

I chose to work with the headset’s “Meditation” values — the idea of meditation, or relaxation,  seemed to fit nicely with wind chimes. I  experimented with three different scenarios. Here they are, along with the code.

1) meditation by quintiles.

Meditation values are delivered to the Arduino on a scale of 0-100. There are 6 chimes. I decided to attach EEG to 5 of them, and leave one as the “observer” — a concept I picked up from reading about Mindfulness meditation. So, I divided the meditation values into quintiles and delivered them to the wind chime via five independent muscle wires (all powered by the same 12V external power supply, but each attached to a separate pin on the UNO and passing through individual MOSFETs).
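The quintile mapping itself is just integer division; here is a minimal sketch of the idea, using the pin defines from the full code at the bottom of the post:

// meditation arrives as 0-100; dividing by 20 buckets it into quintiles 0-4
void updateChimes(byte meditation) {
  int quintile = meditation / 20;   // 0-19 -> 0, 20-39 -> 1, ... 80-99 -> 4
  digitalWrite(MEDITATION0, quintile == 0 ? HIGH : LOW);
  digitalWrite(MEDITATION1, quintile == 1 ? HIGH : LOW);
  digitalWrite(MEDITATION2, quintile == 2 ? HIGH : LOW);
  digitalWrite(MEDITATION3, quintile == 3 ? HIGH : LOW);
  digitalWrite(MEDITATION4, quintile == 4 ? HIGH : LOW);
}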

Also, using PWM, I drove Meditation (and sometimes Attention) values to an LED as a visual cue that the headset was delivering data.

SEE CODE at bottom of post (1/2)

Here is a basic schematic I have drawn to illustrate the wiring of the wind chime.

2) Bandwave values

One of the most exciting moments for me during this project was the decision to tap into the bandwave values and tie them to the Arduino.

EEG bandwaves (as they relate to the MindWave headset) are as follows:

delta // 0.5 – 2.75Hz

theta  // 3.5 – 6.75Hz

lowAlpha // 7.5 – 9.25Hz

highAlpha // 10 – 11.75Hz

lowBeta // 13 – 16.75Hz

highBeta  // 18 – 29.75Hz

lowGamma // 31 – 39.75Hz

highGamma // 41 – 49.75Hz

I had to do a little research: basically, find the NeuroSky literature that explained how I could access the bandwaves. It didn’t take too long. Inside the original code provided by NeuroSky, there is “case 0x83”; that is where you access the bandwaves. But what is the specific code used to do it? I experimented with writing my own, but got only one value output for all of the bands. I looked around. Finally, I discovered a Processing sketch that had accomplished what I wanted to do. So I adapted that code to Arduino – and it worked!

see the code below (2/2).

Here is a pic of the output.

PROBLEM: all of that extra work seemed to really slow down the Arduino. I saw lots of chugging going on in the serial monitor, and so I decided not to work with the bandwave data.

3) pwm meditation

Lastly, I drove all of the Meditation data through a single pin, using PWM, so that the higher the M-values, the more current passed through to the muscle wires, causing (in theory) more activity.
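At its core this scenario is a single map-and-write; a minimal sketch, assuming the meditation byte from the packet parser and the MusclePin from the full code below (the full 0-255 ceiling is an assumption and would need tuning for the wire):

void driveMuscleWire(byte meditation) {
  int duty = map(meditation, 0, 100, 0, 255);   // higher M-values -> higher PWM duty
  analogWrite(MusclePin, duty);                 // -> more current through the Nitinol
}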

OUTCOME:

I experimented with all of these scenarios, hoping that one might actually trigger some chimes. But ultimately, the mechanics of making the wind chime “chime” were the problem. I should have consulted Jim, as he had mentioned to me that he had worked with muscle wire, but I had run out of time. Still, I am very excited about what I did accomplish:

— a Nitinol PWM circuit

— the Mindwave hack

— accessing the EEG bandwave values via some research and tweaking code

— powering muscle wire with EEG input!

–> I will keep experimenting with EEG, muscle wire and … wind chimes. I am still excited by the project.

4) BONUS!

…earlier on in the project, I used PWM to power an analog meter I found at Active Surplus — this was for kicks. But I thought it could be used to illustrate Attention values that were active during Meditation sessions on the chimes.

–> I was helped by a nice fellow at the Integrated Media wing to figure out how much voltage would power the meter. It turned out that the range for the meter was 0–0.3V DC. After consulting with Jim, the first step was to try PWM and map the headset values to the range 0 to (0.3/5)*255 = 15.

I found that I had to go much lower, and map between 0 and 4.
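In code the meter drive is one map() call; a minimal sketch, with meterPin as a hypothetical pin name (the full code below reuses MusclePin for this):

const int meterPin = 3;   // hypothetical PWM pin wired to the meter

void driveMeter(byte value) {
  // a tiny duty cycle keeps the average voltage inside the meter's 0-0.3V range;
  // (0.3/5)*255 = about 15 in theory, but 0-4 is what worked in practice
  int duty = map(value, 0, 100, 0, 4);
  analogWrite(meterPin, duty);
}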

But it worked! (the first time – the next few times the meter only tracked about 1/3 of the entire spectrum?!?)

What became obvious here was that the headset data was being transmitted only once per second. I already knew this, but you really see it with the meter.

I mocked up a new graphic for the meter – a sort of spoofy Design Fiction meter that reads when and what kind of memory is being formed by the headset user.

HERE IS THE CODE FOR WINDCHIMES

/*

Project Title: MindChimes
Group Member: Mark Thoburn

Course Code & Title: Creation and Computation DIGF6B02
OCAD University

Created: November, 2011

Based on:
Arduino Bluetooth Interface with Mindwave (example code provided by NeuroSky, Inc.)
NeuroSky MindSet packet parser for Processing, by Rory Nugent, Last updated October 27th, 2009

 

*/

//START analogWrite code
int MusclePin = 3;
int MuscleValue;
int MinWaveValue;

const int numReadings = 10;

int readings[numReadings];      // the readings from the analog input
int index = 0;                  // the index of the current reading
int total = 0;                  // the running total
int average = 0;                // the average
//END analogWrite code

#define LED 13
#define BAUDRATE 115200
#define DEBUGOUTPUT 0
//#define BLUESMIRFON 2

#define MEDITATION0 5
#define MEDITATION1 6
#define MEDITATION2 7
#define MEDITATION3 8
#define MEDITATION4 9
//#define MEDITATION5 10
#define MINDWAVE A0

// checksum variables
byte generatedChecksum = 0;
byte checksum = 0;
int payloadLength = 0;
byte payloadData[170] = {0};   // payload can be up to 169 bytes
byte poorQuality = 0;
byte attention = 0;
byte meditation = 0;

// system variables
long lastReceivedPacket = 0;
boolean bigPacket = false;

void setup() {

//START analogWrite code
pinMode(MusclePin, OUTPUT);

// initialize all the readings to 0:
for (int thisReading = 0; thisReading < numReadings; thisReading++)
readings[thisReading] = 0;
//    END analogWrite code

pinMode(MINDWAVE, OUTPUT);
pinMode(MEDITATION0, OUTPUT);
pinMode(MEDITATION1, OUTPUT);
pinMode(MEDITATION2, OUTPUT);
pinMode(MEDITATION3, OUTPUT);
pinMode(MEDITATION4, OUTPUT);
//pinMode(MEDITATION5, OUTPUT);
pinMode(LED, OUTPUT);
//pinMode(BLUESMIRFON, OUTPUT);
//digitalWrite(BLUESMIRFON, HIGH);
Serial.begin(BAUDRATE);           // USB
delay(3000) ;
Serial.print(194, BYTE);   // pre-Arduino-1.0 syntax; on 1.0+ this would be Serial.write(194)

}

// Read data from Serial UART //

byte ReadOneByte() {
int ByteRead;
while(!Serial.available());
ByteRead = Serial.read();
#if DEBUGOUTPUT
Serial.print((char)ByteRead);   // echo the same byte out the USB serial (for debug purposes)
#endif

return ByteRead;
}

void loop() {

// Look for sync bytes
if(ReadOneByte() == 170) {
if(ReadOneByte() == 170) {

payloadLength = ReadOneByte();
if(payloadLength > 169)                      //Payload length can not be greater than 169
return;

generatedChecksum = 0;
for(int i = 0; i < payloadLength; i++) {
payloadData[i] = ReadOneByte();            //Read payload into memory
generatedChecksum += payloadData[i];
}

checksum = ReadOneByte();                      //Read checksum byte from stream
generatedChecksum = 255 - generatedChecksum;   //Take one's complement of generated checksum

if(checksum == generatedChecksum) {

poorQuality = 200;
attention = 0;
meditation = 0;

for(int i = 0; i < payloadLength; i++) {    // Parse the payload
switch (payloadData[i]) {
case 2:
i++;
poorQuality = payloadData[i];
bigPacket = true;
break;
case 4:
i++;
attention = payloadData[i];
break;
case 5:
i++;
meditation = payloadData[i];
break;
case 0x80:
i = i + 3;
break;

case 0x83:
i = i + 25;
break;
default:
break;
} // switch
} // for loop

#if !DEBUGOUTPUT

// *** Add your code here ***

if(bigPacket) {
if(poorQuality == 0)
digitalWrite(LED, HIGH);
else
digitalWrite(LED, LOW);
Serial.print("PoorQuality: ");
Serial.print(poorQuality, DEC);
Serial.print(" Attention: ");
Serial.print(attention, DEC);

Serial.print(" Meditation: ");
Serial.print(meditation, DEC);

if(meditation > 40)
digitalWrite(MEDITATION1, HIGH);
else
digitalWrite(MEDITATION1, LOW);

Serial.print(" Time since last packet: ");
Serial.print(millis() - lastReceivedPacket, DEC);
lastReceivedPacket = millis();
Serial.print("\n");

// START analogRead code
//MinWaveValue = analogRead(MinWavePin);
MinWaveValue = analogRead(attention);   // note: this reads the analog pin whose number equals the current attention value
MuscleValue = map(MinWaveValue, 0, 100, 0, 4);
analogWrite(MusclePin, MuscleValue);
Serial.println(MinWaveValue);

// subtract the last reading:
total = total - readings[index];
// read from the sensor:
readings[index] = analogRead(attention);
// add the reading to the total:
total= total + readings[index];
// advance to the next position in the array:
index = index + 1;

// if we’re at the end of the array…
if (index >= numReadings)
// …wrap around to the beginning:
index = 0;

// calculate the average:
average = total / numReadings;
// send it to the computer as ASCII digits
Serial.println(average);

//  END analogWrite code

switch(meditation / 20) {   // meditation 0-100 divided into quintiles (cases 0-4; a value of 100 lands in case 5)

case 0:
digitalWrite(MEDITATION0, HIGH);
digitalWrite(MEDITATION1, LOW);
digitalWrite(MEDITATION2, LOW);
digitalWrite(MEDITATION3, LOW);
digitalWrite(MEDITATION4, LOW);
//digitalWrite(MEDITATION5, LOW);
break;

case 1:
digitalWrite(MEDITATION0, LOW);
digitalWrite(MEDITATION1, HIGH);
digitalWrite(MEDITATION2, LOW);
digitalWrite(MEDITATION3, LOW);
digitalWrite(MEDITATION4, LOW);
//digitalWrite(MEDITATION5, LOW);

break;
case 2:
digitalWrite(MEDITATION0, LOW);
digitalWrite(MEDITATION1, LOW);
digitalWrite(MEDITATION2, HIGH);
digitalWrite(MEDITATION3, LOW);
digitalWrite(MEDITATION4, LOW);
//digitalWrite(MEDITATION5, LOW);

break;
case 3:
digitalWrite(MEDITATION0, LOW);
digitalWrite(MEDITATION1, LOW);
digitalWrite(MEDITATION2, LOW);
digitalWrite(MEDITATION3, HIGH);
digitalWrite(MEDITATION4, LOW);
//digitalWrite(MEDITATION5, LOW);

break;
case 4:
digitalWrite(MEDITATION0, LOW);
digitalWrite(MEDITATION1, LOW);
digitalWrite(MEDITATION2, LOW);
digitalWrite(MEDITATION3, LOW);
digitalWrite(MEDITATION4, HIGH);
//digitalWrite(MEDITATION5, LOW);

break;
case 5:
digitalWrite(MEDITATION0, LOW);
digitalWrite(MEDITATION1, LOW);
digitalWrite(MEDITATION2, LOW);
digitalWrite(MEDITATION3, LOW);
digitalWrite(MEDITATION4, LOW);
//digitalWrite(MEDITATION5, HIGH);
break;

}
}

#endif
bigPacket = false;
}

else {
// Checksum Error
}  // end if else for checksum
} // end if read 0xAA byte
} // end if read 0xAA byte
}

HERE IS THE CODE FOR *BANDWAVES AND *MEMORY METER.

/*

THIS CODE CONTAINS TWO EXPERIMENTS THAT WORKED OUT WELL:
1) I accessed EEG bandwave data by adapting an existing Processing sketch
2) I powered a $5 analog meter bought at Active Surplus,
using PWM and mapping headset values (0-100) to the meter’s voltage input (0-0.3 volts).


Project Title: MindChimes
Group Member: Mark Thoburn

Course Code & Title: Creation and Computation DIGF6B02
OCAD University

Created: November, 2011

Based on:
Arduino Bluetooth Interface with Mindwave (example code provided by NeuroSky, Inc.)
NeuroSky MindSet packet parser for Processing, by Rory Nugent, Last updated October 27th, 2009

Wiring:
Inputs:
Outputs:

*/

//START analogWrite code
int MusclePin = 3;
int MuscleValue;
int MinWaveValue;

const int numReadings = 10;

int readings[numReadings];      // the readings from the analog input
int index = 0;                  // the index of the current reading
int total = 0;                  // the running total
int average = 0;                // the average

//END analogWrite code
#define LED 13
#define BAUDRATE 115200
#define DEBUGOUTPUT 0
//#define BLUESMIRFON 2

#define MEDITATION1 6
#define MEDITATION2 7
#define MEDITATION3 8
#define MEDITATION4 9
#define MEDITATION5 10

#define MINDWAVE A0

// checksum variables
byte generatedChecksum = 0;
byte checksum = 0;
int payloadLength = 0;
byte payloadData[170] = {0};   // payload can be up to 169 bytes
byte poorQuality = 0;
byte attention = 0;
byte meditation = 0;

//byte payloadData[24] = {
//  0};

unsigned long delta = 0;     // 0.5 – 2.75Hz (band powers are 3-byte values, too large for a byte)
unsigned long theta = 0;     // 3.5 – 6.75Hz
unsigned long lowAlpha = 0;  // 7.5 – 9.25Hz
unsigned long highAlpha = 0; // 10 – 11.75Hz
unsigned long lowBeta = 0;   // 13 – 16.75Hz
unsigned long highBeta = 0;  // 18 – 29.75Hz
unsigned long lowGamma = 0;  // 31 – 39.75Hz
unsigned long highGamma = 0; // 41 – 49.75Hz

// system variables
long lastReceivedPacket = 0;
boolean bigPacket = false;

void setup() {

//START analogWrite code

pinMode(MusclePin, OUTPUT);

// initialize all the readings to 0:
for (int thisReading = 0; thisReading < numReadings; thisReading++)
readings[thisReading] = 0;
//    END analogWrite code

pinMode(MINDWAVE, OUTPUT);
pinMode(MEDITATION1, OUTPUT);
pinMode(MEDITATION2, OUTPUT);
pinMode(MEDITATION3, OUTPUT);
pinMode(MEDITATION4, OUTPUT);
pinMode(MEDITATION5, OUTPUT);
pinMode(LED, OUTPUT);
//pinMode(BLUESMIRFON, OUTPUT);
//digitalWrite(BLUESMIRFON, HIGH);
Serial.begin(BAUDRATE);           // USB
delay(3000) ;
Serial.print(194, BYTE);   // pre-Arduino-1.0 syntax; on 1.0+ this would be Serial.write(194)

}

// Read data from Serial UART //

byte ReadOneByte() {
int ByteRead;
while(!Serial.available());
ByteRead = Serial.read();
#if DEBUGOUTPUT
Serial.print((char)ByteRead);   // echo the same byte out the USB serial (for debug purposes)
#endif

return ByteRead;
}

//MAIN LOOP//

void loop() {

// Look for sync bytes
if(ReadOneByte() == 170) {
if(ReadOneByte() == 170) {

payloadLength = ReadOneByte();
if(payloadLength > 169)                      //Payload length can not be greater than 169
return;

generatedChecksum = 0;
for(int i = 0; i < payloadLength; i++) {
payloadData[i] = ReadOneByte();            //Read payload into memory
generatedChecksum += payloadData[i];
}

checksum = ReadOneByte();                      //Read checksum byte from stream
generatedChecksum = 255 - generatedChecksum;   //Take one's complement of generated checksum

if(checksum == generatedChecksum) {

poorQuality = 200;
attention = 0;
meditation = 0;

delta = 0;
theta = 0;
lowAlpha = 0;
highAlpha = 0;
lowBeta = 0;
highBeta = 0;
lowGamma = 0;
highGamma = 0;

for(int i = 0; i < payloadLength; i++) {    // Parse the payload
switch (payloadData[i]) {

case 2:
i++;
poorQuality = payloadData[i];
bigPacket = true;
break;

case 4:
i++;
attention = payloadData[i];
break;

case 5:
i++;
meditation = payloadData[i];
break;

case 0x80:
i = i + 3;
break;

case 0x83:  // EEG band power data _ adapted from a Processing sketch _ see above
{
unsigned long eegData[24];   // eight 3-byte, big-endian band values = 24 bytes
int bytesParsed = i + 2;     // skip the 0x83 code byte and its length byte (0x18)

for(int j = 0; j < 24; j++) {
eegData[j] = (unsigned long)payloadData[bytesParsed + j];
}

delta = ((eegData[0] << 16) | (eegData[1] << 8)) | eegData[2];
theta = ((eegData[3] << 16) | (eegData[4] << 8)) | eegData[5];
lowAlpha = ((eegData[6] << 16) | (eegData[7] << 8)) | eegData[8];
highAlpha = ((eegData[9] << 16) | (eegData[10] << 8)) | eegData[11];
lowBeta = ((eegData[12] << 16) | (eegData[13] << 8)) | eegData[14];
highBeta = ((eegData[15] << 16) | (eegData[16] << 8)) | eegData[17];
lowGamma = ((eegData[18] << 16) | (eegData[19] << 8)) | eegData[20];
highGamma = ((eegData[21] << 16) | (eegData[22] << 8)) | eegData[23];

i = i + 25;   // advance past the length byte and the 24 data bytes
}
break;

default:
break;

} // switch
} // for loop

#if !DEBUGOUTPUT

// *** Add your code here ***

if(bigPacket) {
if(poorQuality == 0)
digitalWrite(LED, HIGH);
else
digitalWrite(LED, LOW);
Serial.print("PoorQuality: ");
Serial.print(poorQuality, DEC);

Serial.print(" Attention: ");
Serial.print(attention, DEC);
Serial.print(" Meditation: ");
Serial.print(meditation, DEC);

Serial.print(" lowAlpha: ");
Serial.print(lowAlpha, DEC);
Serial.print(" highAlpha: ");
Serial.print(highAlpha, DEC);

Serial.print(" lowBeta: ");
Serial.print(lowBeta, DEC);
Serial.print(" highBeta: ");
Serial.print(highBeta, DEC);

Serial.print(" lowGamma: ");
Serial.print(lowGamma, DEC);

Serial.print(" highGamma: ");
Serial.print(highGamma, DEC);

Serial.print(" Theta: ");
Serial.print(theta, DEC);

Serial.print(" Delta: ");
Serial.print(delta, DEC);

Serial.print(" Time since last packet: ");
Serial.print(millis() - lastReceivedPacket, DEC);
lastReceivedPacket = millis();
Serial.print("\n");

// START analogRead code
//MinWaveValue = analogRead(MinWavePin);
MinWaveValue = analogRead(attention);   // note: this reads the analog pin whose number equals the current attention value
MuscleValue = map(MinWaveValue, 0, 100, 0, 4);
analogWrite(MusclePin, MuscleValue);
Serial.println(MinWaveValue);

// subtract the last reading:
total = total - readings[index];
// read from the sensor:
readings[index] = analogRead(attention);
// add the reading to the total:
total= total + readings[index];
// advance to the next position in the array:
index = index + 1;

// if we’re at the end of the array…
if (index >= numReadings)
// …wrap around to the beginning:
index = 0;

// calculate the average:
average = total / numReadings;
// send it to the computer as ASCII digits
Serial.println(average);

//  END analogWrite code

/*switch(attention / 10) {

case 0:
digitalWrite(GREENLED1, LOW);
digitalWrite(GREENLED2, LOW);
digitalWrite(GREENLED3, LOW);
digitalWrite(YELLOWLED1, LOW);
digitalWrite(YELLOWLED2, LOW);
digitalWrite(YELLOWLED3, LOW);
digitalWrite(YELLOWLED4, LOW);
digitalWrite(REDLED1, LOW);
digitalWrite(REDLED2, LOW);
digitalWrite(REDLED3, LOW);
break;

case 1:
digitalWrite(GREENLED1, HIGH);
digitalWrite(GREENLED2, LOW);
digitalWrite(GREENLED3, LOW);
digitalWrite(YELLOWLED1, LOW);
digitalWrite(YELLOWLED2, LOW);
digitalWrite(YELLOWLED3, LOW);
digitalWrite(YELLOWLED4, LOW);
digitalWrite(REDLED1, LOW);
digitalWrite(REDLED2, LOW);
digitalWrite(REDLED3, LOW);
break;
case 2:
digitalWrite(GREENLED1, HIGH);
digitalWrite(GREENLED2, HIGH);
digitalWrite(GREENLED3, LOW);
digitalWrite(YELLOWLED1, LOW);
digitalWrite(YELLOWLED2, LOW);
digitalWrite(YELLOWLED3, LOW);
digitalWrite(YELLOWLED4, LOW);
digitalWrite(REDLED1, LOW);
digitalWrite(REDLED2, LOW);
digitalWrite(REDLED3, LOW);
break;
case 3:
digitalWrite(GREENLED1, HIGH);
digitalWrite(GREENLED2, HIGH);
digitalWrite(GREENLED3, HIGH);
digitalWrite(YELLOWLED1, LOW);
digitalWrite(YELLOWLED2, LOW);
digitalWrite(YELLOWLED3, LOW);
digitalWrite(YELLOWLED4, LOW);
digitalWrite(REDLED1, LOW);
digitalWrite(REDLED2, LOW);
digitalWrite(REDLED3, LOW);
break;
case 4:
digitalWrite(GREENLED1, HIGH);
digitalWrite(GREENLED2, HIGH);
digitalWrite(GREENLED3, HIGH);
digitalWrite(YELLOWLED1, HIGH);
digitalWrite(YELLOWLED2, LOW);
digitalWrite(YELLOWLED3, LOW);
digitalWrite(YELLOWLED4, LOW);
digitalWrite(REDLED1, LOW);
digitalWrite(REDLED2, LOW);
digitalWrite(REDLED3, LOW);
break;
case 5:
digitalWrite(GREENLED1, HIGH);
digitalWrite(GREENLED2, HIGH);
digitalWrite(GREENLED3, HIGH);
digitalWrite(YELLOWLED1, HIGH);
digitalWrite(YELLOWLED2, HIGH);
digitalWrite(YELLOWLED3, LOW);
digitalWrite(YELLOWLED4, LOW);
digitalWrite(REDLED1, LOW);
digitalWrite(REDLED2, LOW);
digitalWrite(REDLED3, LOW);
break;
case 6:
digitalWrite(GREENLED1, HIGH);
digitalWrite(GREENLED2, HIGH);
digitalWrite(GREENLED3, HIGH);
digitalWrite(YELLOWLED1, HIGH);
digitalWrite(YELLOWLED2, HIGH);
digitalWrite(YELLOWLED3, HIGH);
digitalWrite(YELLOWLED4, LOW);
digitalWrite(REDLED1, LOW);
digitalWrite(REDLED2, LOW);
digitalWrite(REDLED3, LOW);
break;
case 7:
digitalWrite(GREENLED1, HIGH);
digitalWrite(GREENLED2, HIGH);
digitalWrite(GREENLED3, HIGH);
digitalWrite(YELLOWLED1, HIGH);
digitalWrite(YELLOWLED2, HIGH);
digitalWrite(YELLOWLED3, HIGH);
digitalWrite(YELLOWLED4, HIGH);
digitalWrite(REDLED1, LOW);
digitalWrite(REDLED2, LOW);
digitalWrite(REDLED3, LOW);
break;
case 8:
digitalWrite(GREENLED1, HIGH);
digitalWrite(GREENLED2, HIGH);
digitalWrite(GREENLED3, HIGH);
digitalWrite(YELLOWLED1, HIGH);
digitalWrite(YELLOWLED2, HIGH);
digitalWrite(YELLOWLED3, HIGH);
digitalWrite(YELLOWLED4, HIGH);
digitalWrite(REDLED1, HIGH);
digitalWrite(REDLED2, LOW);
digitalWrite(REDLED3, LOW);
break;
case 9:
digitalWrite(GREENLED1, HIGH);
digitalWrite(GREENLED2, HIGH);
digitalWrite(GREENLED3, HIGH);
digitalWrite(YELLOWLED1, HIGH);
digitalWrite(YELLOWLED2, HIGH);
digitalWrite(YELLOWLED3, HIGH);
digitalWrite(YELLOWLED4, HIGH);
digitalWrite(REDLED1, HIGH);
digitalWrite(REDLED2, HIGH);
digitalWrite(REDLED3, LOW);
break;
case 10:
digitalWrite(GREENLED1, HIGH);
digitalWrite(GREENLED2, HIGH);
digitalWrite(GREENLED3, HIGH);
digitalWrite(YELLOWLED1, HIGH);
digitalWrite(YELLOWLED2, HIGH);
digitalWrite(YELLOWLED3, HIGH);
digitalWrite(YELLOWLED4, HIGH);
digitalWrite(REDLED1, HIGH);
digitalWrite(REDLED2, HIGH);
digitalWrite(REDLED3, HIGH);
break;

*/

if(meditation > 40){
digitalWrite(MEDITATION1, HIGH);
}
else
{
digitalWrite(MEDITATION1, LOW);
}

if (highAlpha > 30 && highAlpha < 60) {
digitalWrite(MEDITATION1, HIGH);
} else {
digitalWrite(MEDITATION1, LOW);
}

if (lowBeta >= 20 && lowBeta <= 50) {
digitalWrite(MEDITATION2, HIGH);
} else {
digitalWrite(MEDITATION2, LOW);
}

// remaining thresholds to wire up: theta 20-60, highGamma 10-35, delta 20-60

}
}

#endif
bigPacket = false;
}

else {
// Checksum Error
}  // end if else for checksum
} // end if read 0xAA byte
} // end if read 0xAA byte
}

final _ hello wind chime

After the trouble with the sticky felt, I started looking for alternatives —

i have always loved wind chimes …

– they are known to bring good luck and ward off evil spirits.

– more practically (?), wind chimes can be used to observe changes in wind direction. But in this case, they can be used to signal changes in mind states – how cool is that?

I loved this idea, and so I found a chime and started experimenting, using jewelry tools to attach muscle wires to chime strings…

first, I tried for proof of concept … and success!

here are photos of the set-up:

final _ EEG (and PWM)

After success with the muscle wire circuit, I experimented a little. Nitinol is tough to work with – and now I know why Berzowska et al. at Concordia went out of their way to get Nitinol with a 60°C transition temperature.

At 70°C, it singes the felt and sticks to its surface.

Regardless, I decided I wanted to keep pushing the tech side more, as opposed to looking for alternative materials to felt – so I set my mind to the MindWave Arduino hack.  This was, after all, the end goal – tying EEG to muscle wire. I knew I could swap in the headset values for the POT values. The sticky felt would have to wait.

….

I completed the hack, as outlined by the headset manufacturer, NeuroSky. WOW! Success. Very fun, instructive and educational. I learned a lot just by doing. The RF dongle that speaks to the headset is no longer usable for gaming, but so what? It’s now dedicated to the Arduino. Great!

MindWave and PWM

Because the headset sends data only once per second, PWM does not seem to be able to accurately map the data to an LED (it does, but with blackouts).

*note: I am referring to a single LED – not the row of 10 in the photo.

This does not bode well for subtle muscle wire contractions, dependent on the current running through the wire.
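One possible mitigation (not something I implemented): latch the last received value so the PWM duty holds steady between the once-per-second packets instead of dropping out. A minimal Arduino sketch of the idea, with the pin and parsePacket() assumed:

int ledPin = 3;           // hypothetical PWM pin
int lastMeditation = 0;   // latched value from the most recent packet

int parsePacket() {
  // placeholder: in the real sketch this is the NeuroSky sync-byte/checksum parser,
  // returning -1 until a complete packet has arrived
  return -1;
}

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int m = parsePacket();
  if (m >= 0) {
    lastMeditation = m;   // update only when fresh data arrives
  }
  // keep writing the latched value every loop, so the LED (or muscle wire)
  // holds steady between updates
  analogWrite(ledPin, map(lastMeditation, 0, 100, 0, 255));
}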

final_felt petals

just a quick FYI … I started making flower petals and sewing muscle wire into them.

final_1st felt experiment

a quick muscle wire experiment

final_ PWM/Nitinol circuit – success!

*big thanks to Jim Ruxton for help with the PWM/Nitinol circuit.

**PS Jim, I could not find an IRL530 – but the IRF530 seems to work fine.

—————————————

Step two was hooking a piece of Memory Wire up to an Arduino.

A very helpful read was the MIT PhD thesis of Marcelo Coelho, who worked on the Kukkia Flower back in Montreal. Marcelo’s work was specifically helpful in two ways.

1) techniques for memory shaping Nitinol

2) circuitry for working with Nitinol.

The take-away was that I should consider an independent 12V power supply for the Nitinol and I needed something called a MOSFET N-ch (a transistor).

Wikipedia describes the MOSFET as a device in which “a voltage on the oxide-insulated gate electrode can induce a conducting channel between the two other contacts called source and drain.”

I used the PWM code from the Analogue lab to run current through the muscle wire, starting at 0  and going up to 1023.

Here’s a diagram of the PWM/Muscle wire circuit.
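In sketch form, the software side of the circuit is very small. Assuming pin 9 drives the MOSFET gate (with the Nitinol and its 12V supply on the drain side), and remembering that analogWrite takes 0-255 while analogRead returns 0-1023:

int gatePin = 9;   // PWM pin to the MOSFET gate (assumed)
int potPin = A0;   // potentiometer setting the drive level (assumed)

void setup() {
  pinMode(gatePin, OUTPUT);
}

void loop() {
  int potValue = analogRead(potPin);           // 0-1023
  int duty = map(potValue, 0, 1023, 0, 255);   // scale to analogWrite's range
  analogWrite(gatePin, duty);                  // more duty -> more current through the Nitinol
}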

Now it’s time to design petal power (anchor Nitinol into felt and see what works).

not quite sure what I am going to do…

Testing Testing…

Setting up the potentiometers to control the horizontal and vertical lines is fairly basic and should be relatively simple, but I will take you through my odyssey nevertheless.

Test 1: I used the Arduino code from the Graphing a Sensor lab and proceeded to over-engineer it. In the Dimmer lab, I was unhappy that the potentiometer did not move the ball the full width of the screen, and I thought I could overcome this by mapping the analogRead to the width of my screen dimension (1200 pixels). I also purchased what I thought was a better potentiometer than the one in our kits, in case that was the issue. Below is the code and the results. It worked, in that I could produce a horizontal line. The problem, however, is that I get a dotted line if I rotate the potentiometer too quickly.
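For reference, the mapping itself is a single map() call in Processing; here is a minimal sketch of what I was doing (serial setup as in the labs). The dotted line happens because one point is plotted per serial reading, so a fast rotation leaves gaps between consecutive points:

import processing.serial.*;

Serial myPort;

void setup() {
  size(1200, 400);
  myPort = new Serial(this, Serial.list()[0], 9600);
  myPort.bufferUntil('\n');
  background(0);
}

void draw() {
  // drawing happens in serialEvent()
}

void serialEvent(Serial p) {
  String inString = p.readStringUntil('\n');
  if (inString != null) {
    float val = float(trim(inString));       // 0-1023 from analogRead
    float x = map(val, 0, 1023, 0, width);   // stretch to the full 1200px width
    stroke(255);
    point(x, height/2);   // one point per reading; fast rotation leaves gaps
  }
}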


Test 2: I tried re-purposing the code from the Graphing a Sensor lab, and it worked much better. It now draws a smooth line no matter how fast the rotation. But of course there is a new problem to solve: when I open the sketch, the line draws from right to left, yet the potentiometer rotation is from left to right. The code is below – I’m sure the solution is blindingly obvious, but I’m too tired to keep trying right now. Time for a break…

And after staring at it again, I see the error in the Processing code… the fix is highlighted below in yellow. It works now – left-to-right rotation = line drawn from left to right on screen.
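Since the yellow highlighting did not survive the move to this page, here is the shape of the fix, reconstructed as an assumption rather than the exact code: in the Graphing a Sensor sketch, a reversed drawing direction usually comes down to an inverted map() range.

float val;   // latest potentiometer reading (0-1023), set in serialEvent as in the lab code

void draw() {
  // likely culprit: an inverted range, e.g. map(val, 0, 1023, width, 0), which draws right to left
  float x = map(val, 0, 1023, 0, width);   // left-to-right version
  stroke(255);
  line(x, 0, x, height);
}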

Final Project Idea Revision – Muybridge Jump

In keeping with the theme of re-examining early cinematic technologies, I have a revised project idea based on Muybridge’s early motion experiments, essentially the first high-speed photography. Reading about Muybridge’s fascination with whether or not horses’ feet left the ground while running, I was struck by how the outcome displays a very simple fascination with motion, flight, and gravity. We still have this simple fascination, as can be seen in the popular ‘jump’ images and videos that show subjects with feet off the ground. I want to take this fascination and recreate it in a collective space, to ask why it still holds, and to re-construct Muybridge’s imaging with Processing to ask these questions.

here is a breakdown of the steps:

-LED flashes to signal participant to jump

-capture jumping action  (Arduino button, camera button or pressure sensor to activate image capture)

-set length of capture (one second or 25 frames)

-store to buffer (PImage)

-Processing buffer to replay captured images in Muybridge matrix or filmstrip template

I’ve found this Processing code to implement the matrix. Now I’m adding a boolean for the camera button, and working on how the image buffer works. Ideally it would be able to store and replay the jumps from various people and shuffle the images within the matrix. I’d like to modify it in the future to only display images with feet off the ground, but right now that seems like a second step.
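For the buffer step, one direction is an in-memory PImage ring buffer; here is a minimal sketch, assuming the 25-frame capture from the outline above and a keypress standing in for the camera button:

import processing.video.*;

Capture cam;
PImage[] buffer = new PImage[25];   // one second at 25 frames, per the outline
int writeIndex = 0;
boolean capturing = false;          // the camera-button boolean mentioned above

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
}

void captureEvent(Capture c) {
  c.read();
  if (capturing) {
    buffer[writeIndex] = c.get();   // copy the frame into the ring buffer
    writeIndex = (writeIndex + 1) % buffer.length;
    if (writeIndex == 0) capturing = false;   // stop after one full pass
  }
}

void keyPressed() {
  capturing = true;    // stand-in for the Arduino button trigger
  writeIndex = 0;
}

void draw() {
  // replay: cycle through whatever frames have been captured so far
  PImage f = buffer[frameCount % buffer.length];
  if (f != null) image(f, 0, 0);
}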