Category: Project 1

00001

 

 

hell_01_world CNC

hell_01_world:

An investigation behind the design process of arduino project 00001.

Challenge: create an object / artifact found in a “haunted spaceship”. H.R. Giger, anyone?

Concept:

Spinning head activates upon approach, spurts out blood as it gains velocity.

Foreseeable challenges:

  1. blood – create a barrier.
  2. Element of surprise – create a mirror effect, lighting up from behind the glass to reveal the spinning head.
  3. Casing – design product housing for sensors and breadboard / computer.

Sketch of concept

Object of inspiration

Object of inspiration

 

Method of process:

  • Break down the elements to trouble-shoot input ranges. This includes the sensors, mapping interface, and testing mechanical parts.
  • Test atomic parts of code to interact with the physical parts.
  • Stitch the components together physically, and build the code by integrating the parts step-by-step.
  • Trouble-shoot mechanics of code and physical output.
  • Design infrastructure and physical interface to house, protect, present and communicate the project.
  • Test in environment.

Step 1: investigating PIR sensor

Looking for a sensor that can detect movement, the PIR sensor (Passive InfraRed) is selected in hopes of greater reach for detecting movement.

It turned out, however, that the PIR detects movement across a wide spectrum and is not selective in what it detects, vs. the IR sensor, which has a narrower detection range of about 40 degrees in the direction the sensor is facing.

PIR sensor

First investigation into IR sensor

IRsensor

Illustration of sensor mechanics http://www.education.rec.ri.cmu.edu/content/electronics/boe/ir_sensor/1.html

Spectrum of sensor http://www.elecfreaks.com/wiki/index.php?title=PIR_Motion_Sensor_Module:DYP-ME003

Resolution: an IR sensor – Sharp 2Y0A21 for ranges between 4cm – 80cm was selected. Initially the IR was selected over the sonar sensor, as the design required the sensor to be hidden from view behind acrylic. This was not the case, as will be described later.

Upon reflection, the PIR sensor is better suited to HIGH / LOW applications, whereas the IR proximity sensor reports a range of values, conducive to mapping / controlling those ranges to activate the mechanical parts of the project.
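A minimal sketch of that difference, assuming the PIR's digital output is wired to pin 2 and the Sharp IR's analog output to A0 (the wiring and pin numbers are assumptions, not the project's final circuit):

// PIR vs. IR proximity, side by side (assumed wiring: PIR OUT -> D2, Sharp IR OUT -> A0).
const int pirPin = 2;   // PIR reports a simple HIGH / LOW digital signal
const int irPin = A0;   // Sharp IR reports an analog value that varies with distance

void setup() {
  pinMode(pirPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  int motion = digitalRead(pirPin);     // 1 = movement somewhere in a wide field of view
  int proximity = analogRead(irPin);    // 0-1023, higher readings = closer object
  Serial.print("PIR = ");
  Serial.print(motion);
  Serial.print("\t IR = ");
  Serial.println(proximity);
  delay(100);
}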

Step 2

 

Choice of motor for the spin.

A stepper motor with a limited rotational range was not an option, and once the spinning object was chosen, its weight suggested a motor more powerful than a 5V supply could drive. 12V motors were investigated; however, the design did not allow enough surface contact for the object to be attached. A 30V motor was obtained after viewing samples, with the intent of reducing the supply voltage to control the velocity (after many attempts, the affixed head simply flew off with significant trajectory).

30V mechanical DC motor.


Step 3

Testing coded parts

As stated, the PIR sensor was ill-suited for the project – the feedback was randomized, and no detectable pattern could be discerned.

PIR sensor with analog print values. Randomized.


Once an IR sensor was integrated, detectable value patterns were found between 4cm – 80cm, with 40cm being the optimal detection range desired.

Mapped threshold of detection for the IR sensor.

30-40cm threshold.

An external DC motor with 12V battery power was required. Driving the motor called for a TIP120 (Darlington) transistor to switch power in and out of the motor, and a diode is incorporated across the motor to absorb the voltage spike (back-EMF) that kicks back after the power is turned off – this information and the diagram below are courtesy of http://bildr.org/2011/03/high-power-control-with-arduino-and-tip120/
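For illustration only (not the bildr.org sketch itself), a minimal motor test for this kind of circuit, assuming the TIP120 base is driven through its resistor from PWM pin 9:

// Minimal TIP120 motor test (assumed wiring: pin 9 -> 2.2K resistor -> TIP120 base,
// motor between +12V and the collector, diode across the motor, emitter to GND).
const int motorPin = 9;

void setup() {
  pinMode(motorPin, OUTPUT);
}

void loop() {
  // ramp the motor up and back down with PWM
  for (int speed = 0; speed <= 255; speed += 5) {
    analogWrite(motorPin, speed);
    delay(30);
  }
  for (int speed = 255; speed >= 0; speed -= 5) {
    analogWrite(motorPin, speed);
    delay(30);
  }
}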

TIP120 DC motor fritz dwg

Image provided by bildr.org, demonstrating external DC motor w/ transistor and diode.

Test prototype of the DC motor and external power with TIP120 transistor and diode. 2.2K resistor required for the TIP120.

Glen and Stephen help to review the board after initial set-up.

Once the motor was integrated with the IR sensor, testing proved successful – the motor was activated, lights were added, and the IR sensor detected motion at a defined threshold. The lights and the motor were activated at different threshold levels, building the response as a two-step process (see the sketch below).
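A stripped-down sketch of that two-step idea, with illustrative threshold values (the thresholds actually used are in the full code at the end of this post):

// Two-step activation: lights at one threshold, motor at a closer one (values are illustrative).
const int irPin = A0;
const int ledPin = 8;
const int motorPin = 9;
const int lightThreshold = 300;   // visitor is close enough: lights on
const int motorThreshold = 350;   // visitor is closer still: motor spins

void setup() {
  pinMode(ledPin, OUTPUT);
  pinMode(motorPin, OUTPUT);
}

void loop() {
  int reading = analogRead(irPin);   // higher reading = closer visitor
  digitalWrite(ledPin, reading > lightThreshold ? HIGH : LOW);
  analogWrite(motorPin, reading > motorThreshold ? 200 : 0);
  delay(10);
}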

Running prototype of lights, DC motor in action, activated by IR sensor.

Step 4

Building the components:

The critical juncture is the connection between the motor and the spinning object. The weight, size and speed of the object, in conjunction with the motor's power and torque, define the physical parameters of the design. A plastic knob was acquired to be used as a connector to the head, which fit without any allowance for error. This in turn was bonded to the spinning disc of the motor with Weldbond sealant to fill the crevices and surface gaps.

Two plastic knobs were adhered together with a trichloroethylene bonding agent. An aluminum male connector was cut and customized to fit into the shaft of the motor head.


2014-10-05 14.46.56

 

Bonding components with different material properties calls for an integral system of design to withstand the rotational forces.

The base of the motor required the construction of a 6” plinth to house the mechanics, in addition to the 6” spinning head. This led to a clear acrylic box to house the performance, and to protect viewers from any possible projectiles.

Step 5

Eventually a decision was made to not incorporate the projectile blood, as this would be a one-time performance. Another decision eliminated the “two-way mirror” effect, as it became apparent that the transparency of the box indicated the desire to make the mechanics (but not the electronics) visible. Horror is often-times hidden to amplify the element of surprise, yet the opposite can be true: sometimes the most terrifying elements of humanity are seen when it is in plain sight, and the machinist aesthetic reveals everything about the desire to replace the once-human form spinning in a post-human eternal gesture.

Structural composition was determined through AutoCAD simulation for clearances and optimal visibility of the performance. The IR components, cabling and base platforms were laid out using the actual material thicknesses (3/8” optical acrylic).

ACAD screen

AutoCAD development of the template.

Graphics etched into the reveal obscure the lower base form where the electronic breadboard and cables are collected. Allowances for the cables were designed to ensure connectivity and access. As sharing code is part of the culture of the creative commons initiative, a decision was made (right or wrong) to etch the code used in the program right into the physical structure. This is the DNA of the interactive program.

Illustrator output file for CNC laser etching and cutting template. Precision to 1/16” allowance.

Graphics created in two-tone dispersion to simulate a gradient transition from translucent to clear.

Base housing for the DC motor – 6” acrylic customized with circular-cut framing.

The IR sensor's placement also had to change: during testing it was evident that the acrylic would have obscured any detection or reading if the sensor were placed behind the surface. The sensor was therefore left visible at the base of the housing.

IR sensor

IR sensor revealed through the acrylic, flush with the external surface.

Construction of the housing was performed with trichloroethylene solvent.

Additional views of the base and housing for electronics.

Wire (mis)management.

Step 6

Testing the product required calibrating the threshold of the IR sensor within the environment. Different environments changed the sensitivity, but it was easy enough to determine optimal distances. The final product entailed modifying the external power source, dropping the voltage from 12V to 6V. The torque of the rotation, in conjunction with the weight of the spinning head (weighted to simulate a real baby’s head), proved too powerful for the base, which was intentionally not affixed for ease of access. The application of a potentiometer resulted in a staggered on / off rotation, with the off intervals reflecting the mapped potentiometer value. This can be attributed to the way the digital signal is pulsed: the power level itself is not modulated, but the interval of the signal is, resulting in an adjustable rotational rhythm. The following video does not demonstrate the pulse / potentiometer intervention, but is more indicative of the reduced voltage fed into the motor.

machineHead

Step 7

Documenting and recording the display developed into a narrative of its own. The original intent of the design was to have the arduino produce an audio amplification of pre-recorded sound effects, however, post-production enabled the inherent audio of the DC motor gears to be amplified, generating its own authentic voice to the soundtrack. Two tracks were developed – one with a music score (video option embedded) and one without. A part of the track is slowed down to 20% to match the video speed, adding a different intimate dimension of time to the experience. The environment sound of the fluorescent ballast humming is also amplified to showcase the actual field recording process, at the beginning and end of the video. Very few transition effects were added, but timing and negative space / black-outs enhanced the true lighting effect reacting from the IR sensor.

Post-production editing – Adobe Premiere CC. Minimal transition effects were used, and sound effects were generated by the gnashing of the DC motor gears and field recording of the fluorescent lights. No added colour treatments were required.


Final thoughts:

Coding into the physical world is extremely satisfying when the connection is made. Digital input with physical output is like a pure synthesis of energy in raw power, refined only by the imagination. Like all things, bugs are inherent to the process, and, like all design, a considerable amount of hacking is required to manufacture the results envisioned. Process is purity, and all things eventually reveal themselves as the only real magic in this world.

Code

Note: a post-evaluation attempt to replace the delay with a long, millis()-based interval tied to the potentiometer changed little in the DC motor’s output; however, constant motor signals vs. delayed signals transferred variably at seemingly randomized mapped values. Further investigation is required.
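One possible direction for that investigation – a minimal, untested sketch of the same pot-controlled pulsing done with millis() instead of delay(), so the IR sensor keeps being read while the motor pulses (the structure is an assumption, not the code that was actually run; pin names follow the code below):

// Non-blocking pot-controlled pulsing (a sketch of the alternative, not the code used below).
const int analogInPin = A0;     // IR sensor
const int analogOutMotor = 9;   // TIP120 base via resistor
const int potPin = A5;          // potentiometer
const int threshold = 350;

unsigned long lastToggle = 0;   // when the motor state last changed
bool motorOn = false;

void setup() {
  pinMode(analogOutMotor, OUTPUT);
}

void loop() {
  int sensorValue = analogRead(analogInPin);
  int interval = analogRead(potPin);   // pot value sets the on / off interval in milliseconds

  if (sensorValue > threshold) {
    if (millis() - lastToggle >= (unsigned long)interval) {
      motorOn = !motorOn;              // toggle instead of blocking with delay()
      digitalWrite(analogOutMotor, motorOn ? HIGH : LOW);
      lastToggle = millis();
    }
  } else {
    motorOn = false;
    digitalWrite(analogOutMotor, LOW);
  }
}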

/* test_IRSensr_threshmotor_pot.ino
   The following code has been appropriated from 3 variations with a sample
   as follows. Hello_01_World activated 10 06 2014, Jay Irizawa

   Analog input, analog output, serial output

   Reads an analog input pin, maps the result to a range from 0 to 255
   and uses the result to set the pulse width modulation (PWM) of an output pin.
   Also prints the results to the serial monitor.

   The circuit:
   * potentiometer connected to analog pin 0.
     Center pin of the potentiometer goes to the analog pin.
     Side pins of the potentiometer go to +5V and ground.
   * LED connected from digital pin 9 to ground

   created 29 Dec. 2008
   modified 9 Apr 2012
   by Tom Igoe

   This example code is in the public domain.
*/

// These constants won't change. They're used to give names
// to the pins used:
const int analogInPin = A0;    // Analog input pin that the IR is attached to
const int analogOutMotor = 9;  // Analog output pin that the motor is attached to
const int ledPin1 = 8;
const int ledPin2 = 7;
const int potPin = 5;          // potentiometer analog read

int sensorValue = 0;   // value read from the IR
int outputValue = 0;   // value output to the PWM (analog out)
int threshold = 350;   // distance from IR to activate motor (approx 20")
int potVal = 0;        // pot variable to store the value coming from the sensor
int outputSpeed = 0;

void setup() {
  // initialize serial communications at 9600 bps:
  pinMode(analogOutMotor, OUTPUT);
  pinMode(ledPin1, OUTPUT);
  pinMode(ledPin2, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // read the analog in value:
  sensorValue = analogRead(analogInPin);
  // map it to the range of the analog out:
  outputValue = map(sensorValue, 0, 1023, 0, 255);
  // change the analog out value:
  analogWrite(analogOutMotor, outputValue);

  potVal = analogRead(potPin);
  outputSpeed = map(potVal, 0, 1023, 0, 255);
  analogWrite(analogOutMotor, outputSpeed);

  // potentiometer delay
  potVal = analogRead(potPin);  // read value from pot analog dial
  if (sensorValue > threshold) {
    digitalWrite(analogOutMotor, HIGH);
    delay(potVal);
    digitalWrite(analogOutMotor, LOW);
    delay(potVal);
  }
  else if (sensorValue < threshold) {
    digitalWrite(analogOutMotor, LOW);
  }

  // set up lights
  if (sensorValue > 300) {
    digitalWrite(ledPin1, HIGH);
  }
  else if (sensorValue < 300) {
    digitalWrite(ledPin1, LOW);
  }
  if (sensorValue > 300) {
    digitalWrite(ledPin2, HIGH);
  }
  else if (sensorValue < 300) {
    digitalWrite(ledPin2, LOW);
  }

  // print the results to the serial monitor:
  Serial.print("sensor = ");
  Serial.print(sensorValue);
  Serial.print("\t output = ");
  Serial.println(outputValue);

  // wait 2 milliseconds before the next loop
  // for the analog-to-digital converter to settle
  // after the last reading:
  delay(2);
}

Final Fritz diagram of arduino schematic


Incoming Doom

 

 

(My apologies for the late post everyone!!)

 

Project Description

The premise behind the project involved teleportation. A space traveller accidentally initiates reception of a rogue “teleport” signal and is horrified when it is revealed that there is a space ghost coming through. The user experience is as follows: as the user walks by the installation, an interesting visual appears in an enclosure of sorts – this is the teleport receiver pod. The touch screen lights up, notifying them of an incoming teleport, and the user is prompted to touch the screen. If the user touches the screen, a sequence begins, detailing the incoming transmission and how there is something non-human coming through. At this point the teleport receiver begins revealing the monster coming through as it lets out a harrowing cackle.

This project required various components to function as an interactive experience. Firstly, in order to initiate the entire interaction, an Arduino was used in combination with an IR depth sensor to sense a potential user passing by. When a user is detected, it announces it via serial. The Arduino also doubles as a controller for ambient/environmental lighting, which allowed extra atmosphere to be added – for example, flashing red when the teleport alarm goes off.
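The actual Arduino sketch is linked under Code below; as a rough illustration of just the detect-and-announce step (the pin, threshold and message are assumptions):

// Illustrative detect-and-announce loop (not the project's sketch; see the GitHub link under Code).
const int irPin = A0;
const int threshold = 300;     // assumed analog reading for "someone is close"
bool userPresent = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(irPin);
  bool nowPresent = reading > threshold;
  if (nowPresent && !userPresent) {
    Serial.println("USER_DETECTED");   // the Processing app listens for this message
  }
  userPresent = nowPresent;
  delay(50);
}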

The remaining components were: the projection surface (aka the teleport receiver), a Processing sketch running the incoming-ghost visuals, projected onto a translucent mesh surface from behind; and a second, separate Processing sketch running a pre-scripted on-screen dialogue accompanied by an audio read-out of the events occurring. This second sketch is the primary method of interaction. Once it receives the serial data from the Arduino reporting that a user is nearby, it prompts the user to touch the large screen. Upon receiving a touch, the application begins a series of timed events and audio cues. During this period it communicates with the application running the projected surface and starts the video of the ghost appearing.

This combination of on-screen dialogue with projected video and choreographed lighting in the surrounding environment helped, I hope, to create an immersive experience.

 

Circuit Diagrams

Assembly List

Label | Part Type | Properties
J1 | Infrared Proximity Sensor |
LED1 | RGB LED (com. cathode, rgb) | rgb RGB; package 5 mm [THT]; polarity common cathode; pin order rgb
LED3 | RGB LED (com. cathode, rgb) | rgb RGB; package 5 mm [THT]; polarity common cathode; pin order rgb
LED4 | RGB LED (com. cathode, rgb) | rgb RGB; package 5 mm [THT]; polarity common cathode; pin order rgb
LED5 | RGB LED (com. cathode, rgb) | rgb RGB; package 5 mm [THT]; polarity common cathode; pin order rgb
Part1 | Arduino Uno (Rev3) | type Arduino UNO (Rev3)
Q4 | NPN-Transistor | package TO92 [THT]; type NPN (EBC)
Q5 | NPN-Transistor | package TO92 [THT]; type NPN (EBC)
Q6 | NPN-Transistor | package TO92 [THT]; type NPN (EBC)
R2 | 100Ω Resistor | resistance 100Ω; package 1206 [SMD]; tolerance ±5%
R3 | 100Ω Resistor | resistance 100Ω; package 1206 [SMD]; tolerance ±5%
R4 | 100Ω Resistor | resistance 100Ω; package 1206 [SMD]; tolerance ±5%
R5 | 100Ω Resistor | resistance 100Ω; package 1206 [SMD]; tolerance ±5%

Shopping List

Amount | Part Type | Properties
1 | Infrared Proximity Sensor |
4 | RGB LED (com. cathode, rgb) | rgb RGB; package 5 mm [THT]; polarity common cathode; pin order rgb
1 | Arduino Uno (Rev3) | type Arduino UNO (Rev3)
3 | NPN-Transistor | package TO92 [THT]; type NPN (EBC)
4 | 100Ω Resistor | resistance 100Ω; package 1206 [SMD]; tolerance ±5%

 

Code

Arduino IR Distance and LED Code – https://github.com/chrols2014/IRDistanceforSpaceKook/blob/master/IRDistanceSensor.ino

Main Processing Application – https://github.com/chrols2014/SpaceKookDesktopApp

2nd Processing App, Responsible for syncing video playback on projector – https://github.com/chrols2014/SpaceKookVideo_Player

 

Sketches / Design Files

 

 

 

Photographs

 Video

https://vimeo.com/108974484

Process

The assigned theme of a “Haunted Space Station” immediately brought up a memory of one of my favourite episodes of Scooby-Doo as a child. It involved a “Space Kook” that would appear out of nowhere and terrify people living in the area. I remember finding it hilarious that the show explained the glowing, levitating and enormous “ghost saucer” as merely a projection on the clouds overhead. This further influenced my decision to take this primarily Arduino-based project and combine it with more screen-based and projected elements. I had the idea to project onto the mesh screen from the get-go and planned on using it to allow for the correct height of the “space ghost”.

Unfortunately, it turned out to be much trickier than expected to implement two screens in a single Processing sketch. I spent some time trying various methods and eventually gave up, opting to create a workaround. Instead, I wrote two separate applications: one responsible for all the timing of the experience, and one that would remotely play a video once instructed to. This idea sounded daunting at the time, but I quickly found the reference for the Server and Client functions in Processing and it was easier than expected. I simply used 127.0.0.1 instead of a public IP address to set up the server and it worked.

The Arduino aspect required serial communication between it and the main Processing sketch. I decided to use an infrared depth sensor to detect whether a person was walking by. Upon being triggered, it would send a simple serial command back to Processing. I wanted to add a level of atmosphere with lighting, so, in addition to sensing for a user, the Arduino also let me set up an array of RGB LEDs. I was able to change the colour of these lights by sending simple instructions from Processing to the Arduino. This part of the process went smoothly.
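The real code is in the GitHub repos linked above; as an illustrative sketch of that command pattern only (the single-letter protocol and pin numbers are assumptions):

// Illustrative command parser (not the project's code).
// Assumed protocol: Processing sends one character, e.g. 'R' = red alert, 'W' = white ambient, 'O' = off.
const int redPin = 9, greenPin = 10, bluePin = 11;   // PWM pins driving the RGB LEDs via NPN transistors

void setColour(int r, int g, int b) {
  analogWrite(redPin, r);
  analogWrite(greenPin, g);
  analogWrite(bluePin, b);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    char command = Serial.read();
    if (command == 'R')      setColour(255, 0, 0);     // teleport alarm red
    else if (command == 'W') setColour(255, 255, 255); // ambient lighting
    else if (command == 'O') setColour(0, 0, 0);       // lights off
  }
}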

The multimedia aspect of the project turned out to be a lot more work than I expected. Exporting the text-to-speech files and synchronizing them with the on-screen dialogue was a tedious task. Furthermore, tying together the projected video timing and the triggers for lighting and sound was the trickiest choreography to achieve, and I wish I had set aside more time to work on it, as it cost other areas of the project.

Challenges / Improvements

As mentioned above, the amount of time spent trying to perfect the choreography seriously cost me in the presentation aspect of the project. This was mostly a time-management issue and could have been avoided if I had made a more realistic breakdown of the assets required for each component.

I did not expect to have to write two applications for my displays instead of one. I had no idea that the multi-screen situation in Processing was so tricky; not accounting for that was another underestimate of the time required.

I wanted the final “teleportation chamber” to look better for critique. Given the time frame and my proposed idea, I worked with what I had but it could have been much better.

 

References

Scooby-Doo Space Kook source footage – http://youtu.be/swhAv9VcBWc (Used without permission of the YouTube account holder and Hanna-Barbera.)

Some handy Processing references:

Server Code – http://processing.org/reference/libraries/net/Server.html

Client Code – http://processing.org/reference/libraries/net/Client.html

Arduino Code Samples – referenced within github files.

Using Arduino to play two tones simultaneously

During the last hack section, Nick introduced us to the concept of using a timer instead of delay() to coordinate the actions executed by the Arduino. Rida, Glen and I tried to generate multiple tones to be played in sync with LEDs and a servo. But outputting audio from the Arduino can be a little tricky, so this post is about what went wrong, how to fix the problem and the shortcomings of generating audio on the Arduino.

Arduino generates audio with its built-in tone() function by using an internal timer. The Arduino Uno has three available timers and the Mega has six. To generate a second tone, the Arduino needs to allocate another timer, leaving an Uno with only one available. Each timer already has a preset function: (1) PWM, (2) the Servo library and (3) the millis()/delay() functions.

On a standard Arduino board, the first tone will use timer 2 (so PWM on pins 9 and 10 will not be available); the second tone uses timer 1 (preventing the Servo library and PWM on pins 11 and 12 from working). On a Mega board, each simultaneous tone will use timers in the following order: 2, 3, 4, 5, 1, 0. (Arduino Cookbook, p.335)

The standard tone() function allows only one sound to be played at a time; in order to play additional tones, you will need to download and install the Tone library. This library is, actually, the full-featured version of the built-in function:

A simplified version of the Tone library has been incorporated into the Arduino core since 0018. It only provides a single tone (since only one timer is used). You can find the core documentation here. (Tone Library website)

Once you install the library and import it into a sketch, you will get a compile error. This is because the library has not been updated to be compatible with versions 1.0+ of the Arduino IDE. To fix this you’ll need to go to “…\Arduino\libraries\Tone\Tone.cpp” and change #include <wiring.h> to #include <Arduino.h>. Now the library will work properly.

ToneLibrary

Bear in mind that using the Arduino to generate multiple square wave sounds will affect its functionality:

Playing three simultaneous notes on a standard Arduino board, or more than six on a Mega, is possible, but millis and delay will no longer work properly. It is safest to use only two simultaneous tones (or five on a Mega). (Arduino Cookbook, p.335)

Given that the exercise Nick assigned us involved using the millis() function and servos, what our group attempted was not really a good idea (especially when you have around half an hour to figure out how to make things work correctly!). The good news is that we can learn not only from what works, but also from what does not. After fixing the issues, tweaking the code written in class and adding a potentiometer for real-time frequency control, this is how the code looks. Below you can watch a video of two speakers playing different tones at the same time, with a quick mix so the video doesn’t sound like two old video game consoles playing together. And, yes, the Arduino tempo is not perfectly in sync with the other instruments.
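The tweaked class code isn’t reproduced here, but a minimal two-tone sketch along the same lines, using the patched Tone library with a potentiometer assumed on A0 and speakers assumed on pins 11 and 12, could look like this:

// Two simultaneous square waves; the second voice is retuned in real time by a potentiometer.
// Pin choices and the frequency range are assumptions, not the code written in class.
#include <Tone.h>

Tone voice1;
Tone voice2;

void setup() {
  voice1.begin(11);   // each Tone object claims one of the board's hardware timers
  voice2.begin(12);
  voice1.play(220);   // steady 220 Hz drone on the first speaker
}

void loop() {
  int pot = analogRead(A0);                 // 0-1023 from the potentiometer
  int freq = map(pot, 0, 1023, 110, 880);   // map the dial to 110-880 Hz
  voice2.play(freq);                        // retune the second voice
  delay(20);
}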

Just the two arduino square waves:

These Motors Got Moves!

Using timing instead of delay in Arduino to move servos and light up LEDs to the beat of Michael Jackson’s Billie Jean.

 

Team: Lee, Elliott, Sachi and Jason

 

CODE:

 

/*
The challenge: to make a 'robot' dance to the beat of a song.
The equipment: two servo motors gussied up as Starbucks sirens / go-go dancers, and 3 LEDs standing in for a discotheque display.
The song: Billie Jean by Michael Jackson (at 120 bpm)

by:
Elliott
Jay
Lee
&
Sachi
*/

#include <Servo.h>

// lights
int ledPin1 = 5;
int ledPin2 = 7;
int ledPin3 = 12;

Servo myservo;  // dancer 1
Servo myservo2; // dancer 2

int pos = 0;         // seems to work for 120 bpm
int target = 103;
long lastMove;
int moveRate = 5;

long lastChange1;
int blinkRate1 = 500;
boolean ledState1;

long lastChange2;
int blinkRate2 = 250;
boolean ledState2;

long lastChange3;
int blinkRate3 = 1000;
boolean ledState3;

void setup()
{
  myservo.attach(9);   // attaches the servo on pin 9 to the servo object
  myservo.write(0);
  myservo2.attach(8);
  myservo2.write(0);

  // pinMode takes the pin first, then the mode
  pinMode(ledPin1, OUTPUT);
  digitalWrite(ledPin1, HIGH);

  pinMode(ledPin2, OUTPUT);
  digitalWrite(ledPin2, LOW);

  pinMode(ledPin3, OUTPUT);
  digitalWrite(ledPin3, HIGH);
}

void loop()
{
  if ((millis() - lastMove) >= moveRate)
  {
    if (target == 103)
    {
      if ((pos + 1) <= 103)
      {
        pos += 1;              // step the servo position up
        lastMove = millis();
      }
      else
      {
        target = 0;            // reached the top, sweep back down
      }
    }
    if (target == 0)
    {
      if ((pos - 1) >= 0)
      {
        pos -= 1;              // step the servo position down
        lastMove = millis();
      }
      else
      {
        target = 103;          // reached the bottom, sweep back up
      }
    }
  }
  myservo.write(pos);
  myservo2.write(pos);

  if ((millis() - lastChange1) >= blinkRate1)
  {
    ledState1 = !ledState1;    // toggle the value
    lastChange1 = millis();    // store the time that you changed
  }
  if ((millis() - lastChange2) >= blinkRate2)
  {
    ledState2 = !ledState2;    // toggle the value
    lastChange2 = millis();    // store the time that you changed
  }
  if ((millis() - lastChange3) >= blinkRate3)
  {
    ledState3 = !ledState3;    // toggle the value
    lastChange3 = millis();    // store the time that you changed
  }
  digitalWrite(ledPin1, ledState1);
  digitalWrite(ledPin2, ledState2);
  digitalWrite(ledPin3, ledState3);
}

Whispering Space Helmet

Project Description

For the Haunted Spaceship assignment I decided to create a wearable space helmet that the user wears in the haunted house environment. Once they step on a plank of wood with a pressure sensor underneath it, the helmet is activated. There are two outputs activated by the analog input: a red LED string blinks and pulses in front of the user’s eyes in the clear vinyl portion of the helmet, and an mp3 shield plays scary audio with creaking doors, whispering, screams, etc. When the user steps off the pressure sensor, the LED string and audio turn off. This responsive wearable creates an experience that is surprising and defies the user’s expectations. This is emphasized by the fact that the wearable is worn on the head, so close to the ears and face. It is isolating and limits the user’s vision, adding to the suspense of the experience, and when the actuators are activated, the user cannot easily escape or ignore them.

Circuit Diagram

hauntedhelmet_bb

 

This is the basic circuit set-up for my whispering helmet. The mp3 shield is first attached to the Arduino UNO. An SD card with my sound file is inserted into the shield. The breadboard is attached to ground and 5V. The force sensor is also attached to the breadboard, a 10K resistor and the A0 analog pin. For my outputs, the LED string (shown here with a single LED) is attached to the breadboard and to digital pin 5. A set of computer speakers is also attached to the mp3 shield as an output.

Code

code

Sketches

photo 3 (12)

Design Files

helmet

Photos and Videos

photo 1 (15)

Process Journal

Before I began creating the physical “installation” for this project, I wanted to solidify the code I was going to use. I started by using the AnalogInOutSerial example to determine the minimum and maximum for the force sensor I was going to use. I then wrote code from scratch to have the LED blink when the force sensor went above a value of 100.

// pin and variable declarations (implied by the circuit description above)
const int sensorPin = A0;   // force sensor on analog pin 0
const int ledPin = 5;       // LED string on digital pin 5
int sensorValue = 0;

void setup() {
  // declare the ledPin as an OUTPUT:
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
  Serial.print(sensorValue);
}

void loop() {
  // read the value from the sensor:
  sensorValue = analogRead(sensorPin);
  if (sensorValue > 100) {
    // blink the LED
    digitalWrite(ledPin, HIGH);   // turn the LED on (HIGH is the voltage level)
    delay(1000);                  // wait for a second
    digitalWrite(ledPin, LOW);    // turn the LED off by making the voltage LOW
    delay(1000);                  // wait for a second
    // stop the program for <sensorValue> milliseconds:
    delay(sensorValue);
  }
  else {
    // turn the ledPin off:
    digitalWrite(ledPin, LOW);
    // stop the program for <sensorValue> milliseconds:
    delay(sensorValue);
  }
}

Once I figured that out, I knew I had to incorporate the code for the mp3 shield. With help from Ryan, I downloaded the two libraries associated with the shield and modified the code for the MP3 ButtonPlayer2 example, swapping out the button value for an analog sensor value. There were a few other modifications I had to make as well. I had to stop the track before playing it so that it would play at all; there’s no logical explanation I can think of for this, but that was a theme when working with this mp3 shield. I also had to define a step state, where 0 was not stepping on the sensor and 1 was stepping on the sensor. This made it so that when someone stepped off the sensor the audio would stop, rather than playing the whole track.
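A stripped-down sketch of just that step-state logic, with the shield-specific calls replaced by placeholders (startSound() and stopSound() are hypothetical stand-ins, not the mp3 library’s actual API):

// Step-state logic only; startSound()/stopSound() are hypothetical placeholders
// standing in for the mp3 shield library calls.
const int sensorPin = A0;
const int ledPin = 5;
const int threshold = 100;
int stepState = 0;          // 0 = nobody on the sensor, 1 = someone stepping on it

void startSound() { /* stop then play the track via the mp3 shield library */ }
void stopSound()  { /* stop the track via the mp3 shield library */ }

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int reading = analogRead(sensorPin);

  if (reading > threshold && stepState == 0) {
    stepState = 1;                 // user just stepped on
    startSound();
    digitalWrite(ledPin, HIGH);
  }
  else if (reading <= threshold && stepState == 1) {
    stepState = 0;                 // user just stepped off
    stopSound();
    digitalWrite(ledPin, LOW);
  }
}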

Once these problems were sorted out, I started making the physical helmet. I used plaster cast strips to cast a mould of my bike helmet. Once that was dry I cut an opening for the face to come through. Later I added clear vinyl as a kind of protective visor in this opening. I then dismantled computer speakers and resoldered the connections. Then I cut holes on each side of the helmet for the speakers to fit.

photo (19) photo (20)

I then glued the LED string back and forth across the clear visor. Finally I soldered the force sensor to long leads and affixed all the leads in the project to the helmet so there’d be no risk of the user pulling out any connections. My last touch was red tubing I sewed onto the base of the helmet to reinforce the space aesthetic.

photo (21)photo 2 (14)

For our critique class I used a plank of wood to spread out the force of the user so the sensor would be activated even if the user wasn’t directly on top of it. If this helmet was the key interactive element in a haunted spaceship, with no other factors, the sensor would be less obvious so as to startle the user.

photo 1 (14)

Using the mp3 shield was the biggest challenge in this project. If I were to do it again, I think I would find some other way to use audio, and maybe I would have added other sensors and outputs, because I am actually becoming more familiar with them and the simple code that makes them function. I would also have loved to spend more time on the actual fabrication of the helmet, but overall I am happy with the look and function of my whispering space helmet.

Project Context

As this project developed and came to an end, I started thinking about how it could be taken further and how it could work in different contexts. The idea of a wearable that is location-specific is very interesting to me. The most common version of this is the audio device you carry around with you at museums and galleries: once you punch in the number for the piece you’re looking at, the audio device tells you about it. This kind of interactive, contextual wearable is the category where I see my piece fitting.

http://mw2013.museumsandtheweb.com/paper/transforming-the-art-museum-experience-gallery-one-2/

This article showcases an interactive museum experience at The Cleveland Museum of Art. The gallery is an interactive environment where guests can connect with the space and the pieces in it in a totally new way. Multi-touch screens are all over the gallery. Guests can learn more about pieces through text, images and “games”. There is also a creative aspect where users are asked to make something (using the touch screen) related to a piece they are viewing; they are then able to share their creation on social media. There are also large touch screens for kids to create shareable drawings, and the option to use an iPad app called ArtLens, which provides even more information on pieces based on your proximity to them or by scanning a piece into the app. You can also create a playlist of favourites using this app. This interactive gallery relates to my project in that the triggers are the pieces themselves and the delivery is information or an additional interactive experience. The main difference is that the delivery is not done through a wearable, but instead through touch screens and iPads. This makes sense for this particular environment, because the users come from many different contexts and backgrounds, so the delivery needs to be simple and intuitive.

http://exertiongameslab.org/projects/lumahelm

This article describes a product called the LumaHelm. This has a more direct correlation with my project, as the wearable is worn on the head. The helmet, developed at RMIT University, has LED strips that are activated by an accelerometer, providing additional safety for a cyclist through added visibility. Some versions also have an embedded heart rate sensor. This reminds drivers that they are sharing the road with a fragile, living, breathing human and encourages them to be more aware. This project is different from mine because the output is externalized and shared with outsiders, whereas my helmet provides a personal, individual experience. In both cases the helmet is location-specific and uses sensors to create an output.

here comes the spider

Project Description 

“Here comes the spider” is part of a Haunted Spaceship interactive installation created for the Creation & Computation class. The assignment was to create a responsive environment using Arduino hardware and software.

In this project I used an Arduino Uno board, an infrared proximity sensor, a white LED, a mini servo motor and a 330Ω resistor. I also built a housing box from white foam core and mixed a large amount of green transparent sludge using water, guar gum, borax and glycerine. I used the recipe from the Stratford Festival prop department, which was given to me by Monica. I also used a plastic spider as a servo motor arm.

The final product was a spider covered in bright green goo on top of a box housing the board and all the components, moving at different speeds depending on the proximity of the viewer.

Circuit Diagrams

 

here goes the spider

goo_sweep_with_spiders _schem

 

 

Code

https://gist.github.com/tatianajennings/1e3b5b96031dae2d305f

Process Journal 

My goal was to use some of the sensors and motors we learned about, in combination with the Arduino board and the basics of the Arduino programming language, to make an object respond to human presence and behaviour in a very simple way. To tell the truth, I was not thinking about the artistic value of this work or the complexities of narration. In response to the “space station” idea combined with the haunted house theme, I came up with the most obvious visual cliché – goo. It must be the result of all those alien movies with insect-like repulsive creatures dripping something from their mouths.

The first idea I had was a transparent container / small fish tank full of green jello, with a servo motor and Arduino hidden in the lamp housing, agitating the jello when someone comes close. I was hoping the jello would move and shake as if something alive were about to come out of it and attack you. I researched jello-like synthetic substances, since I didn’t want to use food and have to refrigerate it. Googling slimy toys and alien brains produced a variety of exciting-looking stuff I imagined would be available at the local toy store. Unfortunately, aliens are out of fashion at the moment, and all I managed to find was one sad-looking ninja turtle capsule with a watery substance of strange colour. A trip to the halloween party store was unsuccessful as well, and in a fit of desperation I purchased half of a rubber heart and 2 plastic spiders.

I had to rethink my idea incorporating whatever I had at hand. This was the first time when I actually thought about what kind of sensor I would like to use. Since the only sensor I used in class so far was the light sensor I had to research what else was available and what it means to have a sensor measuring a change in certain condition.  The sensor measuring distance seems  to be appropriate and I found a few types available at Creatron. My first choice was an ultrasonic sensor but they were too expensive and I bought the proximity sensor instead. I also purchased a bigger and brighter  LED.

When I laid out the actual physical objects I had I realized that in order to hide the board, motor and the sensor I will have to build a box and whatever I wanted to move will have to live on top of it. I still was attached to the idea of something slimy but the issues of finding it and figuring out how to contain and move it was making it unnecessary difficult. However on Thursday after we were describing our ideas in class Monica told me that there is a recipe  for a sludge like  substance Stratford Festival uses in Alice in Wonderland show. They make buckets of it every show for their Humpty-Dumpty death scene. She emailed me the link and I was able to find all the ingredients. Unfortunately I left the sludge making till the very last moment.

I didn’t have much of a problem connecting the sensor and the light to the Arduino board, using the examples on the Arduino site and researching this particular model of proximity sensor (GP2Y0A41SK0F IR) online. I found Lucky Larry’s blog with this exact sensor used for distance calculation, and I used his code in my program.

Writing the program was a very difficult enterprise. I am an absolute novice, and although I am starting to understand the logic of code writing, I am still far from writing something on my own. The best I can do is combine some pieces from examples and exercises and try to adjust them. My plan was to use the two pieces of code we tried out in class and understood – Sweep and Calibration – and to add Lucky Larry’s code for the sensor and hope it would work. However, Larry’s code was more complex than my understanding. Since Kate mentioned that it’s ok to ask for help if it helps you learn things in the process, I asked my partner to help me, and together we tried to figure it out.

The main problem was calibrating the movement of the servo arm so that it produced a realistic movement and had just the right delay timing to move all the way. If the delay was too short, the arm didn’t have enough time to make the full sweep and return to position 0. The other problem was calculating the distance and relating it to the intensity of the movement. My idea was that the shorter the distance measured by the sensor, the faster the movement of the servo. We used the serial monitor to find minimum and maximum proximity values and to try out different combinations of proximity and delay time. If the delay was too short, the movement didn’t have enough range to be distinctly faster. The servo was moving in steps of 1 degree. I also wanted the LED to be connected to the proximity reading and to come on slowly depending on how close the person was; however, I was unable to make that work, and the LED had only two states – on and off.
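The full program is in the gist linked above under Code; as a bare-bones illustration of the idea of tying servo sweep speed to proximity (the pins, sweep angle and mapping range are assumptions, not the gist’s values):

// Bare-bones proximity-to-sweep-speed illustration (not the gist code; pins and ranges are assumed).
#include <Servo.h>

Servo spider;
const int irPin = A0;

void setup() {
  spider.attach(9);
}

void loop() {
  int reading = analogRead(irPin);                 // higher reading = viewer is closer
  int stepDelay = map(reading, 100, 600, 30, 3);   // closer viewer = shorter delay = faster sweep
  stepDelay = constrain(stepDelay, 3, 30);

  for (int pos = 0; pos <= 90; pos++) {            // sweep the spider up in 1-degree steps
    spider.write(pos);
    delay(stepDelay);
  }
  for (int pos = 90; pos >= 0; pos--) {            // and back down
    spider.write(pos);
    delay(stepDelay);
  }
}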

After I built the box and attached the Arduino and the motor inside it, I started on the goo. Unfortunately, the recipe didn’t seem to be accurate, and I spent hours trying to get it to the right consistency. I also wasn’t sure about the type of container that would work with the box I built and look good. It couldn’t be glass, because I had to cut a hole in the bottom of it for the servo. The spider was a last-second idea: I attached it to the servo instead of an arm and it fit perfectly. I couldn’t completely abandon the goo idea, since by that time I had made buckets of it, so I poured some of it on top of the spider, hoping it would look more alien. I was not sure the whole contraption would survive the trip downtown – but it did, and it worked perfectly all evening long!

 

Looking back, I find the whole experience was very involving and exciting. However, after I saw everybody else present their work and had time to reflect, I find that I would approach the next project (if we have to do similar projects in the future) differently. I would try to think about the environment and how I want to affect it in more complex ways, and I would think about technology in that context and see what’s possible, at least as a starting point. Making something work is challenging and interesting too, but having a more complex idea around it might help to focus the mind and to see some unexpected solutions.

photo 1 photo 2 photo 1 photo 4 photo 5 photo 3 photo 2 photo 1

Project_01 Trapped Soul

PROJECT DESCRIPTION

 

Overview

The name of this installation is Trapped Soul.

This work is an installation that uses muscle tension to control a “soul” escaping from a special container.

 

Background Story

This scene takes place in an abandoned spaceship. 50 years ago, there was research into artificial life. The scientists launched a spaceship with an embryo on board. Unfortunately, the embryo grew into an alien because of the cosmic radiation, and all the scientists on the spaceship were killed by it. The leader of the scientific team was the only one who survived the alien’s attack, but people on Earth did not allow him back because they were afraid the alien would come to Earth with him.

Even at his last moment, he still did not want to kill the alien. He had spent thousands of days with it and regarded it as his own child. His feeling was so strong that after he died, his soul became a spirit embryo trapped in the embryo container, waiting for someone to set it free.

Design Plan

The purpose of this work is to let people experience a magical moment in which they can control a ball’s movement, as well as its brightness and sound.

As the participant tenses their muscle, the “soul” lifts up and changes colour from white to red; when the participant relaxes the muscle, it dims gradually and goes down. It bursts out a scream when the participant tenses the muscle with full power.

To make the scene more immersive, I planned to build some pieces of a dead astronaut: when people get close, the hand would shake to call them over and a box would open to show the fabric pad.

But because of the time, I didn’t finish all of that content and just built the essential interactive part.

Structure

Materials:

  • Transparent plastic board *1
  • Tapes
  • Transparent ball *1
  • A roll of fishing line
  • Elbow pad *1

It includes two main parts. One is a large transparent plastic tube with an Arduino UNO, a motor shield, a motor, a speaker, an ultrasonic sensor and a muscle sensor with two 9V batteries.

The other part is a fabric pad that holds the electrodes of the muscle sensor stable on one’s arm. It’s reusable and provides a more accurate position for the electrodes.

Jenna02

 

 

Sketch

arduino

Function

The muscle sensor detects electric potential to measure muscle activity. When it is connected to the Arduino, it outputs a number depending on muscle activity: typically it is 0 when the muscle is totally relaxed, and the harder you tense your muscle, the higher the number. This number controls the motor, speaker and LED.

First, the number is used to control the direction of the motor to lift and lower the LED ball. With the fabric pad on the upper arm, tensing the biceps makes the motor rotate to lift the ball; relaxing the muscle makes it rotate the opposite way, so the ball drops down slowly.

Second, the LED is linked to the muscle sensor. The number maps to the brightness of the LED between 0~255, from dim to bright, so the harder the player tenses the muscle, the brighter the LED. When the number is 0, the LED goes out.

Third, the tone of the speaker is also linked to the muscle sensor with the map function: the harder the player tenses the muscle, the higher the tone becomes.

This is used to create a magical atmosphere, because people can move the ball and light it up with their own energy, without touching it.

In addition, I added some functions to avoid errors in this system. The first is denoising code to filter out noisy values. The second is an ultrasonic sensor that measures the distance between the ball and the bottom of the tube, to avoid the motor rotating the wrong way once the ball has reached the bottom.
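The full sketch is linked under CODE below; as a simplified illustration of the mappings described here (the pin numbers are assumptions, and liftBall() / lowerBall() are hypothetical placeholders for the motor shield calls):

// Simplified illustration of the muscle-sensor mappings; liftBall()/lowerBall() are
// hypothetical placeholders for the motor shield calls, and pin numbers are assumed.
const int musclePin = A0;
const int ledPin = 9;        // PWM pin for the LED in the ball
const int speakerPin = 8;
const int liftThreshold = 200;

void liftBall()  { /* motor shield: rotate to wind the fishing line up */ }
void lowerBall() { /* motor shield: rotate to let the fishing line out */ }

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int muscle = analogRead(musclePin);              // 0 when relaxed, higher when tensed

  int brightness = map(muscle, 0, 1023, 0, 255);   // harder tension = brighter LED
  analogWrite(ledPin, brightness);

  if (muscle > 10) {                               // small noise floor before making sound
    int pitch = map(muscle, 0, 1023, 100, 2000);   // harder tension = higher pitch
    tone(speakerPin, pitch);
  } else {
    noTone(speakerPin);
  }

  if (muscle > liftThreshold) {
    liftBall();
  } else {
    lowerBall();                                   // relaxed: the ball drops back down slowly
  }
  delay(20);
}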

 

CIRCUIT DIAGRAMS

Components:

  • Arduino UNO *1
  • 100 ohm resistor *2
  • Motor *1
  • Ultrasonic sensor *1
  • LED *1
  • Speaker *1
  • Muscle sensor *1

circuit

CODE

https://gist.github.com/Jennaltj/b94bb64bb8aeae964e7a

VIDEO

PROCESS JOURNAL

I changed my idea several times, and the Trapped Soul is the final version.

First, I planned to make a scary box, using a light sensor to trigger an action. When people look into the box through a hole, the light sensor detects the change in light and triggers a servo or a motor to pop out a fake hand. This was the basic idea, and I wanted to add some LEDs inside as the stars of the universe, to entice people to look into the box and keep watching until the hand appears.

7509c7d0f703918f9f0a3abe523d269758eec49d b2ecd81ed21b0ef4ffc89e2adec451da81cb3e54

I spent several days preparing the materials and trying to get it to move properly, but after I built it, I found it didn’t look like what I wanted: the box was too small to hide the hand, and it looked neither scary nor technical at all. So I decided to give it up and make something else.

At that time, there was less than a week before the deadline, but I hadn’t decided what I would do. I had another idea about a dead astronaut moved by a servo, but it was all about visual impact, so I was still unsatisfied with it. I wanted to build an installation with strong interaction with visitors, not just let them watch something happen as they passed by.

When I thought about interaction, the first thing that popped into my mind was the muscle sensor, which can be used to trigger activity directly. I had a brainstorm, listing every element related to spaceships, and came up with several ideas. In the end I chose anti-gravity as my theme.

Then I started to think about how to use the muscle sensor to show anti-gravity. It’s obvious that letting something float in the air without any support can remind people of anti-gravity. The project was also related to a haunted house, so a soul was a perfect theme to combine both anti-gravity and the haunted house.

After I decided on the theme, I started to learn how to use the muscle sensor. It was totally new to me. I found a few tutorials by googling, but not many. I wrote a very simple piece of code to print the output number for testing, and then I met my first challenge, which was also the hardest one: there was no output except zero. I spent two days testing it. I googled the problem but only found many people with the same problem and no solutions. I measured each connection joint of the battery, the board and the electrodes, but found nothing wrong with them. I was so anxious, because the deadline was near and I hadn’t even got the most basic function working.

20141003_013231

After struggling for 2 days, I finally found out why it didn’t work: I am not strong enough to activate the sensor effectively. My friends helped me test it, and I wrote down the numbers for triggering the other functions.

The next step was getting the motor to rotate in both directions. I used a motor shield to control the direction and speed. There is a video of the first success in using the muscle sensor to drive the motor; I’d like to share it, but I didn’t have enough time to edit it and upload it to Vimeo.

123

These are the steps in which I changed the material and tested the motor.

First I cut a foam ball as a wheel and used a lantern as the soul, but neither material was good.

20141005_205819

 

Two days before the deadline, Gary found a plastic ball for me. It was much better than the lantern.

20141005_205812

 

On the same day, I replaced the foam wheel with a frame of fishing line.

20141006_005038

 

On the last day I added the speaker, and then the tube for decoration. This idea also came from a brainstorm with Daniah and Gary.

20141006_034614

Challenges

The first challenge was no output from the muscle sensor. I tested it hundreds of times to find out why it wasn’t working. What I learnt from this experience is that the position of the electrodes is really important; that is one of the reasons I wanted a fabric pad to help locate the electrodes, ensuring they can be kept in the same position every time.

Second, I met another difficulty when I tried to use an RGB LED to change the soul’s colour from white to red: it didn’t light up when I connected it in this circuit. This problem still hasn’t been solved.

The third challenge was the material. It was difficult to find the required materials, and some of them were impossible to make myself, so the visual effect didn’t reach my expectations.

The last one was the wire and the fishing line. They twist around the motor too easily, and because of the tube it is extremely difficult to fix them.

 

Next Step

In the end I am still not very satisfied with my work. To solve some existing problems and make it more attractive, it could be improved in these aspects:

  1. Decoration: add a dead astronaut next to the tube, and 2~3 LED strips connecting the tube and the astronaut. Light the LEDs on the strips one by one to simulate the soul flowing from his body into the container.
  2. Wireless: comparing the pros and cons of the brightness-changing LED, I found that if this function were dropped, the whole system would be much more stable, because there would be no wire to disturb the fishing line and the motor. Another plan is to control the light wirelessly, putting a power source in the ball with the LED. Maybe an XBee could be used for this step (but I have never used one before).
  3. Gamification: if possible, I want to make the whole project a mini game with more interaction with visitors, including a complete background story for people to engage with, more triggers for surprise, and actions that can lead to different results. It could be designed like an electronic game, but with a physical installation and more immersion.

REFERENCE

The first idea came from the movie Alien, borrowing the idea of something popping out of a container, though I changed it a lot. The second idea appeared in my mind during brainstorming, without any reference, but I found some similar projects using muscle sensors and fabric equipment, like the two links below:

http://www.guokr.com/article/76098/

This is a project that uses a hand-made muscle sensor to control the frequency of a sound while people do sports.

 

http://www.advancertechnologies.com/2013/03/diy-conductive-fabric-electrodes.html

This is a tutorial on how to make conductive fabric electrodes.

 

Hands of The Undead

soulful hands3

A perfectly fit space station disappeared in the year 1976, with all of its crew vanishing without a trace. The “cursed space station” is what many of us in the field called it; all of us truly believe that the station is damned. Every astronaut sent to the station never made it back to Earth. Contact with the astronauts is always lost after a few hours in the station. The 7-year mark since it last appeared is fast approaching; we can’t afford to lose another one of our guys. Something must be done. Perhaps this time we can conquer the space station and solve its secrets without losing another astronaut.

We collected all the data we gathered over the last 40 years, everything we ever got from the astronauts sent to the station. The data from each astronaut gave us a limited perspective of the station; however, compiling the data from the 5 different astronauts gave us a more realistic and accurate perspective. With the help of our genius engineers, we proceeded to develop a virtual reality that replicates the station and its horrifying parts. The computer-generated environment was created through the use of sophisticated algorithms that utilized the different forms of data we gathered. The results shocked us to our core.

One of the most horrifying areas of the space station was a room full of severed hands. What makes this room the most horrifying is that these severed hands were placed inside the walls, which appear to be made of stretched fabric. The hands attack anyone who steps foot in the room. We called it “hands of the undead”. Once you step into the room, which is made possible thanks to the VR program, you are first welcomed with freaky sounds. Once you pass by the walls, the severed hands start attacking and attempting to suck you into the wall. After many hours of studying the room and the entire station, a revelation came to us: what we once perceived as an attacking mechanism, we now believe to be the body parts of the lost astronauts attempting, and failing, to escape the station.

 

Collaboration

What started as a passing idea turned into a collaborative effort to combine our individual projects and present them as part of one storyline. Four other classmates (Glen, Jenna, Cynthia and Gary) and I believed that we could take this individual assignment and turn it into something more, an initiative that was indeed a learning opportunity for all of us. The storyline is based on a cursed space station that disappeared in 1976 and kept mysteriously reappearing every 7 years. The story involves 4 astronauts who were sent to the cursed space station to investigate an unknown signal, each on their own, 7 years apart. Every astronaut met a different fate in the cursed space station, and each of our projects depicts the fate of a different astronaut. The first astronaut turned into an alien, as seen in Glen’s project; turning into half-robot, half-human was the fate of the second astronaut, as depicted by Gary’s project. Cynthia’s work represents the third astronaut, who faced a cruel fate and was turned into a vicious baby. My project represents all the astronauts who are trapped in the cursed space station, trying to escape but failing to do so. Jenna’s project demonstrates the spirit of these trapped astronauts, which escaped their bodies but remains trapped in the space station.

 

Inspiration

The inspiration behind the use of fabric, and the incorporation of object movement on fabric using servo motors, is based on the following project. I was inspired by the fact that the simple movement of several servo motors can produce a different outcome each time. This helped me form the idea of using fabric for the walls and placing the motors behind it.

http://vimeo.com/89183970

 

The following project helped in the birth of many great ideas, ones that are ambitious and challenging to develop in two weeks. The interaction between the object and the audience inspired me to construct an elaborate object that could be found in the space station.

http://vimeo.com/58533050

 

Circuit diagram:

haunted house_bb

 

Circuit components:

(1) Arduino board
(4) Servo motors
(1) Ping sensor
Multiple Wires
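The sketch itself is linked under The Code below; as a rough illustration of the basic behaviour only – a ping sensor reading that wakes the four hands when someone comes near (pins, distances and angles are assumptions):

// Rough illustration only (the actual sketch is on GitHub); pins and distances are assumptions.
#include <Servo.h>

Servo hands[4];
const int handPins[4] = {3, 5, 6, 9};
const int pingPin = 7;

long readDistanceCm() {
  // trigger the ping sensor and time the echo on the same pin
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH); delayMicroseconds(5);
  digitalWrite(pingPin, LOW);
  pinMode(pingPin, INPUT);
  long duration = pulseIn(pingPin, HIGH);
  return duration / 29 / 2;    // convert echo time to centimetres
}

void setup() {
  for (int i = 0; i < 4; i++) {
    hands[i].attach(handPins[i]);
  }
}

void loop() {
  if (readDistanceCm() < 60) {               // someone is within roughly 60cm of the wall
    for (int angle = 0; angle <= 60; angle += 5) {
      for (int i = 0; i < 4; i++) hands[i].write(angle);   // push the hands into the fabric
      delay(15);
    }
    for (int angle = 60; angle >= 0; angle -= 5) {
      for (int i = 0; i < 4; i++) hands[i].write(angle);   // and pull them back
      delay(15);
    }
  }
  delay(100);
}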

The Code:

https://github.com/daniah/Project-1_Haunted-House/blob/master/haunted_house.ino

 

Design Process:

DesignProcess

 

Final Outcome:

IMG_99150 copy

IMG_98690

IMG_98630000

 

Challenges & discoveries

The main challenge at the beginning was my limited knowledge of Arduino and its capabilities. This created a serious obstacle that I needed to overcome through intensive research and rigorous study. Coming from a graphic design background, my aim was to develop a visually appealing project. This derailed my understanding of the real objective of the project, which was to create a responsive, scary environment.

The planning and forecasting of the project was something I did not pay enough attention to, which led to the demise of my initial project. The initial project was ambitious; it involved the purchase of many motor shields. After purchasing the first one, I realized how expensive and time-consuming my idea was, and I immediately abandoned it.

After overcoming these obstacles, the next challenge was finding the right material to showcase the impact of the hands’ movement. Attaching the hands to the motors was a challenging task while taking the weight tolerance into account, and it was no easy task to mount the motors with the hands on the base. Using clay as a base worked at the beginning; however, it later turned out to be too weak. Due to the time constraint, this led to a shift to wood, held with duct tape, as the base.

 

 

Special thanks

I would like to thank each one of the following individuals for their appreciable amount of support and feedback.

Glen, Gary, Jenna and Cynthia.

 
