Photos of our building process.

CONNECT 4000

CONNECT 4000 is a modern interpretation of the classic game Connect Four. In this fun new game, players make their decisions online and the results play out at a location in the real world.

Log on to www.connect4000.com with a friend and pick a colour. Select a column in which to drop your marble. But choose quickly: if the timer on the game board runs out, a column will randomly be chosen for you. The first player to line up four of their marbles is the winner. Hit the RESET button to clear the game board and play again!
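The timeout rule above implies a simple fallback in the game logic. Here is a hedged C++ sketch of how an expired timer might resolve to a random column; the function and constant names are our own invention, not from the actual CONNECT 4000 code:

```cpp
#include <cstdlib>
#include <cassert>

// Hypothetical sketch of the per-turn fallback: if the timer runs out
// before the player has clicked a column, pick one of the 7 columns at
// random. A chosenColumn of -1 means the player has not clicked yet.
const int NUM_COLUMNS = 7;

int resolveColumn(int chosenColumn, bool timerExpired) {
    if (timerExpired || chosenColumn < 0) {
        return std::rand() % NUM_COLUMNS;  // random fallback column
    }
    return chosenColumn;  // the player chose in time
}
```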


HEARTSTRINGS

We began this project by brainstorming ideas on how to encompass our interests, address a problem in our lives, and connect people who may not necessarily be in the same space.

This led us to the conclusion that enhancing and augmenting webcam conversations was the direction we wanted to take. Since many of us in the program aren’t from Toronto, Skype conversations with a significant other have become routine.

The question, now formed, was: how can we make Skype conversations more intimate? After researching online, we came across an academic paper entitled ‘Intimate Heartbeats: Opportunities for Affective Communication Technology’ by Joris Janssen et al. The study found, “using self-report and behavioral tracking in an immersive virtual environment, that heartbeat perception influences social behavior in a similar manner as traditional intimate signals such as gaze and interpersonal distance”.

From this, we decided that effectively communicating heartbeat information to your Skype partner would be the ideal way to increase feelings of intimacy when conversing with one’s significant other. We also found existing projects that served both as inspiration and as a benchmark for which avenues not to pursue (Kissenger… a little creepy).

After picking up a Pulse Heart Sensor, we began working on some initial prototypes – a simple brooch that one would attach to their shirt, with an accompanying LED that would pulse to the beat of one’s heart.

However, as stated in the article we read and as common sense would have it, “the stimulus must be attributed to the conversational partner in order to have influence”, and as such, a blinking LED doesn’t really do the job on its own.

So we moved on and attempted to work with multiple inputs and outputs, including a vibe motor, heating pad, touch sensors, and RGB LEDs.

Due to the power requirements of the heating pad, vibe motor, and RGB LED, and the power constraints of the Arduino, we had to limit ourselves to two inputs (heartbeat information supplied by the pulse sensor, and a stretch sensor to control your mood) and a few outputs, including an RGB LED and a vibe motor for a tactile sensation.

From our initial brooch prototype, we tossed around some ideas and came to the conclusion that a wearable solution would be ideal – it would hide all the electronics nicely and also allow for the pulse sensor to be within an ear’s distance.

For the aesthetic, we wanted the prototypes to feel comforting and organic, so that the garment wouldn’t be cumbersome and would instead feel like an extension of oneself, “connecting” to one’s partner through technology.

And this required us (with some help from Borxu & Yushi) to learn how to use a sewing machine.

 

http://youtu.be/D-8PpsDpong

We used code from the Pulse Sensor website as well as Cosm to send heartbeat information over the internet and receive it in Arduino, such that the vibe motor and LED one was wearing would react to the heartbeat of one’s partner. (Arduino Code | Processing Sketch)
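As a rough illustration of how the received data might drive the wearable, here is a hedged C++ sketch of converting a partner's BPM into per-beat pulses; the helper names are ours, not from the Pulse Sensor or Cosm example code:

```cpp
#include <cassert>

// The Cosm feed gives us the partner's BPM; the wearable then pulses
// the vibe motor and LED once per beat.
int beatIntervalMs(int bpm) {
    if (bpm <= 0) return 1000;   // fall back to 60 BPM if the feed drops out
    return 60000 / bpm;          // milliseconds between beats
}

// In an Arduino loop() you would compare millis() against this interval
// and fire a short vibe-motor / LED pulse each time it elapses.
bool beatDue(unsigned long nowMs, unsigned long lastBeatMs, int bpm) {
    return nowMs - lastBeatMs >= (unsigned long)beatIntervalMs(bpm);
}
```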


The final iteration was sewn onto a red velour sweater, for that touch of class, and was made using a LilyPad and some conductive thread, with the components soldered onto a perf board.

We also used a stretch sensor, which would change the colour of the heartbeat LED from blue to red; however, that didn’t make the final iteration due to a broken sensor (it can, however, be seen in our sweet video).
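The intended stretch-to-colour mapping could be sketched like this. This is a minimal C++ illustration assuming a standard 0-1023 analog reading; the struct and function names are ours, not from our actual sketch:

```cpp
#include <cassert>

// The stretch sensor reading (0-1023 from analogRead) fades the
// heartbeat LED from blue (relaxed) to red (fully stretched).
struct Rgb { int r, g, b; };

Rgb moodColor(int stretchReading) {
    // Clamp the reading into the valid analog range.
    if (stretchReading < 0) stretchReading = 0;
    if (stretchReading > 1023) stretchReading = 1023;
    int red = stretchReading * 255 / 1023;  // fully stretched -> red
    return Rgb{ red, 0, 255 - red };        // relaxed -> blue
}
```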

After some debating, we decided that some sort of data visualization was needed to bring it all together. With a basic Processing sketch, we were able to map the two heartbeats, and as they came closer to unison, so too did the circles spatially.
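The convergence idea boils down to one small piece of logic, sketched here in illustrative C++ with made-up constants (the real version lives in the Processing sketch):

```cpp
#include <cstdlib>
#include <cassert>

// The horizontal gap between the two heartbeat circles grows with the
// difference between the two BPM values, so the circles drift together
// as the heartbeats fall into unison. Constants are illustrative only.
int circleGap(int bpmA, int bpmB) {
    int diff = std::abs(bpmA - bpmB);
    const int minGap = 10;   // circles nearly touch when in sync
    const int pxPerBpm = 4;  // extra spread per BPM of difference
    return minGap + diff * pxPerBpm;
}
```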

Going forward, we would develop our second prototype into a similar sweater-sewn iteration and include more outputs, such as heat and sound.

Moreover, since there is a heavy dependence on Skype, we imagine developing an API to integrate the data visualization into the Skype window so that it is not competing with the conversation window for your attention. The Processing sketch itself would also have to be developed extensively so that its aesthetic and efficacy are markedly improved.

For this particular project, we wanted the Arduino wired, to emphasize the organic feel of plugging in the USB cable that protrudes like the sweater’s umbilical cord. In a future rendition, however, a wireless version would be a good option to make the Heartstrings sweater more mobile.

Eschatology 1/4 – Lindy, Yifat, Alex, Torin

For full documentation go to: http://eschatology14.wordpress.com

Box man: Body remote robot car

–final video

This is Peggy and Hank’s project 3. Our concept is to use the player’s body gestures to control a robot worker that solves a “box man” (Sokoban-style) puzzle. Originally, we started by thinking about how to use wireless technology to connect two objects, rather than making a game. For instance, Hank’s wife has always wanted a coffee machine that starts brewing every morning the moment she gets up to wash her face; and I have always wanted to make something for couples in long-distance relationships (perhaps two umbrellas that project your lover’s surroundings inside your own umbrella, or two bangles that transform your pulse data into something else and send it to your lover’s bangle). However, we thought those ideas were not fun enough, so we changed direction toward a game. (Hank dreams of a robot car, and I like solving puzzles, so we combined our ideas.)

Step 1: build the robot

The biggest problem in this process was that the robot car was hard for us to program; we didn’t know where to start. Officially, it is programmed in C++, but we knew nothing about C++, so we did a lot of research to figure out how to code it in the Arduino environment. After that, we could control its speed and direction.

–forward: motors.setSpeeds(motorSpeed,motorSpeed);

–backward: motors.setSpeeds(-motorSpeed,-motorSpeed);

–right: motors.setSpeeds(motorSpeed,-motorSpeed);

–left: motors.setSpeeds(-motorSpeed,motorSpeed);

–controlled by the buttons on the car

lcd.clear();
lcd.print("Waiting");

unsigned char button = buttons.waitForPress(BUTTON_A | BUTTON_B | BUTTON_C);
lcd.clear();

if (button == BUTTON_A) {
  motorSpeed = 50;
  lcd.gotoXY(0, 0);
  lcd.print("left");
  motors.setSpeeds(-motorSpeed, motorSpeed);
}

if (button == BUTTON_C) {
  motorSpeed = 50;
  lcd.gotoXY(0, 0);
  lcd.print("right");
  motors.setSpeeds(motorSpeed, -motorSpeed);
}

if (button == BUTTON_B) {
  motorSpeed = 50;
  lcd.gotoXY(0, 0);
  lcd.print("move");
  motors.setSpeeds(motorSpeed, motorSpeed);
}

lcd.gotoXY(0, 1);
lcd.print("spd=");
lcd.print(motorSpeed);

buttons.waitForRelease(button);
motorSpeed = 0;
lcd.clear();
lcd.print("stop");
motors.setSpeeds(motorSpeed, motorSpeed);

Next, we studied the pin map for the 3pi robot car to find out which pins we could use (only 3 free pins available).

–button control: https://www.dropbox.com/s/wtzp9txx7p1obqx/buttonControl3.ino

And then, we tried to control the car wirelessly.

–Xbee talk

–Wireless button

–Keyboard control: read data from Processing

//Processing

https://www.dropbox.com/s/uik63h2rusfmsr5/keypadControl.pde

//Arduino 1: get data from Processing and send it through XBee

https://www.dropbox.com/s/kqh3x6fp6eyoxi6/controledByKeys.ino

//Arduino 2: the car receives data from the XBee

https://www.dropbox.com/s/mmhqd68g7xqznpr/Xbee_control_car.ino

–video
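The keyboard control above can be sketched as a single-character protocol: Processing sends one byte per keypress, and the car maps it to left/right wheel speeds. This C++ illustration mirrors the setSpeeds() calls shown earlier; the Command struct and the WASD key choices are our assumptions, not necessarily what the linked sketches use:

```cpp
#include <cassert>

// One wheel-speed pair per command, matching setSpeeds(left, right).
struct Command { int left, right; };

Command decode(char key, int motorSpeed) {
    switch (key) {
        case 'w': return Command{ motorSpeed,  motorSpeed };   // forward
        case 's': return Command{ -motorSpeed, -motorSpeed };  // backward
        case 'a': return Command{ -motorSpeed, motorSpeed };   // spin left
        case 'd': return Command{ motorSpeed,  -motorSpeed };  // spin right
        default:  return Command{ 0, 0 };                      // stop
    }
}
```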

Step 2: connect Kinect

–The code to read the Kinect:

https://www.dropbox.com/s/vwe4hf4m9fzztbi/kinectSkeletonServer.pde

–Processing:

https://www.dropbox.com/s/q4sxf81puu6cab6/kinectControlMode.pde

–Arduino: read from Processing

https://www.dropbox.com/s/kqh3x6fp6eyoxi6/controledByKeys.ino

–Robot car

https://www.dropbox.com/s/3uu7hajcj2yjnqq/XBeeControl.ino

Then we wanted code to control the car’s turning angle, to make sure the car turns exactly 90 degrees every time. We chose to change the code on Arduino 1. Here is the code:

https://www.dropbox.com/s/aj0synvuc7w1mo4/angle_test2.ino

–angle
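A minimal sketch of the dead-reckoned angle control we assume: spin the wheels in opposite directions at a fixed speed for a calibrated duration, so every turn comes out close to 90 degrees. The constants here are illustrative; the actual calibration values live in angle_test2.ino:

```cpp
#include <cassert>

// Scale a calibrated 90-degree turn time to an arbitrary angle.
// On the 3pi you would call motors.setSpeeds(-speed, speed), delay
// for this many milliseconds, then stop.
int turnDurationMs(int degrees, int msPer90) {
    return degrees * msPer90 / 90;
}
```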

Step 3: build the maze

The original design looked like this: But when the car pushed a box, we could not guarantee it pushed the centre of the box, so the box would deviate from the path and the car could not keep moving vertically or horizontally. Besides, more blocks in the maze made it easier to block the signal between the XBees. So I modified the maze into this: and added a board to the front of the car (to help it push). After all the red boxes are pushed into their target places, the LEDs turn on and a win sound effect plays.

Step 4: display

When I played the game before our presentation, I realized I had to keep looking at my laptop to see the background change in order to check my gestures, so I couldn’t focus on the maze or control the robot well. Kate and Nick suggested I set up a projector to project the background colour onto the maze, so that I wouldn’t need to pay attention to my laptop and the game could run more smoothly.

–maze with projector

Sketch Ball – and variables

float x = 0;
float y = 0;
float xspeed = 2.2;
float yspeed = 1.5;
float r = 32;

void setup() {
size(200,200);
smooth();
}

void draw() {
background(255);

// Add the current speed to the x location.
x = x + xspeed;
y = y + yspeed;

// Remember, || means “or.”
if ((x > width) || (x < 0)) {
// If the object reaches either edge, multiply speed by -1 to turn it around.
xspeed = xspeed * -1;
r = 64;
}
if ((y > height) || (y < 0)) {
yspeed = yspeed * -1;
r = 64;
}

// Display circle at x location
stroke(0);
fill(175);
ellipse(x,y,r,r);

r = constrain(r-2,32,64);

}

A Sketch I like

/**
A perlin-noise based force field by guru
*/
PVector[] pos;
int[] age;
void setup() {
size(300,300);
smooth();
background(255);

pos = new PVector[2000];
age = new int[2000];
for( int i=0; i<pos.length; i++) {
pos[i] = new PVector( random(width), random( height ));
age[i] = 0;
}
}

void draw() {
noStroke();
fill(255,10);
rect(0,0,width,height);
stroke(0);
for( int i=0; i<pos.length; i++) {
point( pos[i].x, pos[i].y );
}
for( int i=0; i<pos.length; i++) {
// random(1);
pos[i].add( new PVector( 2 *noise( pos[i].x * 0.02, pos[i].y*0.016 )-1, 2*noise( pos[i].x * 0.017, pos[i].y*0.011 )-1));
age[i]+=(int)random(3);
if ( pos[i].x < 0 || pos[i].x > width || pos[i].y < 0 || pos[i].y > height || age[i] > 100 ) {
pos[i] = new PVector( random( width), random( height ));
age[i] = 0;
}
}

if ( random(1000) > 999 ) {
noiseSeed( (long)random(10000));
for( int i=0; i<pos.length; i++) {
pos[i].add( new PVector( random(-2,2), random(-2,2)));
}
}

}

My Blog

Please see my blog HERE.