
Processing Fishing

Description

For the water park project I decided to use colour tracking to drive a Processing animation. I created a “fish tank” with various coloured laser cut fish and “barnacles” that the user fishes out of the tank using a magnetic fishing rod. When a fish or barnacle comes into the webcam’s view, its colour is detected and the swimming fish in my Processing animation change to the colour of the catch. The project is interactive because the user physically causes the change in the animation and can see that change happen. It fits the theme of water in three ways. First, thematically, the visual elements are related to fish and fishing. Second, the fish and barnacles sit in a tank of water waiting to be caught. Third, the Processing animation illustrates swimming, undulating fish. I modified code from: http://www.openprocessing.org/sketch/162912
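A minimal sketch of the core mechanic, assuming the Processing video library: the webcam frame's average colour stands in for whatever is held up to the camera, and that colour is applied to a stand-in fish. This is only an illustration, not the modified openprocessing sketch linked above.

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  cam.loadPixels();
  if (cam.pixels.length == 0) return;
  // average the frame's colour; holding a coloured fish up shifts the average
  float r = 0, g = 0, b = 0;
  for (int i = 0; i < cam.pixels.length; i++) {
    color c = cam.pixels[i];
    r += red(c);
    g += green(c);
    b += blue(c);
  }
  int n = cam.pixels.length;
  background(10, 30, 70);                  // "water"
  fill(r / n, g / n, b / n);
  ellipse(width/2, height/2, 120, 60);     // stand-in fish body
  triangle(width/2 + 55, height/2,
           width/2 + 90, height/2 - 20,
           width/2 + 90, height/2 + 20);   // tail
}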

Code:

https://github.com/tp14ga/Project-2/tree/6eccfc9fce59516e6453f10659aeb9b7da2d6d83/sketch_fish

Photos/Videos:


First I decided to have “fish” as my colour tracking objects, so I had them laser cut from transparent blue, orange and green plexi. My plan was to use these fish alone and in combination to create mixed colours (explained in the experiments below).

 


Once I abandoned my original idea of having the fish on sliders, I decided to change my vessel to a fish tank and use a magnetic fishing rod to retrieve the coloured pieces, so the webcam could view them as they came out of the tank. I made the fishing rod by drilling a hole in both ends of a square dowel and feeding a string through both holes, then gluing a magnet to the long end of the string.


The laser cut fish outfitted with magnets. All the pieces ready to go into the tank.

Project Context: 

http://www.andybest.net/2009/02/processing-opencv-tutorial-2-bubbles

This is an example of a project using Processing and OpenCV. It is similar to mine because it is also a one-player game built with the same software tools. Instead of colour tracking, it uses motion detection, comparing the current webcam frame to the previous one: if there is movement, a bubble pops; if not, a bubble is drawn. The goal is to pop all the bubbles. His code constantly updates to look for new movement, just as mine constantly updates to look for new colours. Like my game, nothing happens in the code once all the bubbles have been popped, as his project is also a basic first pass. I would have liked to add a different Processing animation that plays once all the fish and barnacles have been caught.
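The frame-differencing idea can be sketched in a few lines of Processing (an illustration of the technique, not Andy Best's actual OpenCV code):

import processing.video.*;

Capture cam;
int[] prev;  // pixels from the previous frame

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  prev = new int[width * height];
}

void draw() {
  if (!cam.available()) return;
  cam.read();
  cam.loadPixels();
  int moved = 0;
  for (int i = 0; i < cam.pixels.length; i++) {
    color c = cam.pixels[i];
    // a big brightness change at a pixel means something moved there
    if (abs(brightness(c) - brightness(prev[i])) > 40) moved++;
    prev[i] = c;
  }
  image(cam, 0, 0);
  // if enough of the frame changed, treat it as movement
  // (this is where a bubble under the moving region would pop)
  if (moved > cam.pixels.length * 0.01) {
    fill(255, 0, 0);
    text("movement detected", 20, 30);
  }
}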

Project Experiments

1: This is my first assignment from Body Centrics using Processing

For this assignment I chose to use a sensor I had never used before, the flex sensor. The first human movement that came to mind with regard to bending or flexing was exercise. This sensor setup measures how effectively someone is performing a specific exercise; in this case, a squat.


Flex sensors are attached behind each knee and a light sensor is placed under the person performing the exercise. The maximum is reached when both flex sensors are fully bent by the knees and the light sensor reads 105 (in this case). At that point the knees are fully bent in a squatting position and the body is low to the ground, blocking light from reaching the light sensor. I modified the provided code by changing the layout, colours, and circle positions. I then made the circles grow as each sensor approaches its maximum, so as the person performs the exercise the circles grow and shrink accordingly, with a large circle for each sensor being the goal.
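A minimal Processing sketch of this mapping, assuming the Arduino streams the two flex readings and the light reading as comma-separated values over serial (the port index and calibration numbers below are illustrative and would need re-measuring in each room):

import processing.serial.*;

Serial port;
int flexL = 0, flexR = 0, light = 900;   // latest readings

// calibration; re-measure these in every new room
int flexMin = 200, flexMax = 700;        // straight vs fully bent
int lightBright = 900, lightDark = 105;  // ambient vs covered

void setup() {
  size(600, 400);
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');
}

void serialEvent(Serial p) {
  String line = trim(p.readStringUntil('\n'));
  if (line == null) return;
  String[] vals = split(line, ',');
  if (vals.length == 3) {
    flexL = int(vals[0]);
    flexR = int(vals[1]);
    light = int(vals[2]);
  }
}

void draw() {
  background(255);
  // deeper squat -> bigger circles; a large circle per sensor is the goal
  float dL = map(flexL, flexMin, flexMax, 20, 150);
  float dR = map(flexR, flexMin, flexMax, 20, 150);
  float dC = map(light, lightBright, lightDark, 20, 150);
  noStroke();
  fill(255, 105, 180);                       // pink for the flex sensors
  ellipse(width * 0.25, height / 2, dL, dL);
  ellipse(width * 0.75, height / 2, dR, dR);
  fill(255, 210, 0);                         // yellow for the light sensor
  ellipse(width * 0.5, height / 2, dC, dC);
}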


These screenshots show the Processing animation. In the first, the body is at rest: both flex sensors are straight and flat and full ambient light reaches the light sensor. In the second, the sensors have reached their maximums. All the maximums and minimums in the code have to be recalibrated in each new room because of differing light sources. I chose a yellow circle for the light sensor and two pink circles for the flex sensors.


For the setup, the breadboard and Arduino sit on the floor underneath the person. The two flex sensors are affixed to velcro bands that wrap around the leg above the knee, with each sensor running down the leg behind the knee bend. This setup could be modified to test the efficacy of a push-up (with flex sensors at the elbow bends) or a sit-up (with the flex sensors along the stomach muscles).

2: Processing sketch of stick figure me

This is the second time I used Processing in class. I made a stick figure of myself that shows when the mouse is pressed. This was a good way to learn how to create a drawing in Processing from scratch. I started with a 100 × 200 canvas and centred a middle line, a triangle skirt and an ellipse head. I added two arcs for bangs, with a gap in the middle for a part, and two long rectangles for hair, then triangle shoes and two small ellipses for eyes. Finally I added an if statement on mousePressed, so the drawTegan function is drawn while the mouse is pressed and nothing is drawn otherwise.
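A minimal reconstruction of the idea (the coordinates here are illustrative, not the originals):

void setup() {
  size(100, 200);
}

void draw() {
  background(255);
  if (mousePressed) {
    drawTegan();   // the figure appears only while the mouse is held
  }
}

void drawTegan() {
  fill(255);
  rect(33, 30, 6, 55);                    // long hair, left
  rect(61, 30, 6, 55);                    // long hair, right
  ellipse(50, 35, 30, 30);                // head
  arc(43, 26, 14, 10, PI, TWO_PI);        // left bang
  arc(58, 26, 14, 10, PI, TWO_PI);        // right bang; the gap makes the part
  fill(0);
  ellipse(44, 36, 3, 3);                  // eyes
  ellipse(56, 36, 3, 3);
  line(50, 50, 50, 120);                  // centred middle line
  triangle(35, 140, 65, 140, 50, 105);    // triangle skirt
  line(43, 140, 43, 180);                 // legs
  line(57, 140, 57, 180);
  triangle(35, 186, 47, 186, 43, 178);    // shoes
  triangle(53, 186, 65, 186, 57, 178);
}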

3: Face tracking Body Centrics assignment


You will need white eyeshadow or face paint, black eyeshadow or face paint, foundation, tape and an optional curling iron.


First I did my hair. I curled all my hair to give it some body. I then added two buns on either side of my head to obscure the shape of my head. I also used a large front section to swoop over one of my eyes and pinned it to the side. The long curls also obscure the shape of my jawline.


I then wanted to remove dimension from my other ocular region. I used white eyeshadow all around my eye and on my eyelashes. I then used foundation to remove any darkness from my eyebrow.


My next step was obscuring the bridge of my nose. I taped a large triangular shape across my nose and filled it in with black eyeshadow.


To remove dimension from my mouth region, I taped another shape onto my lips and filled the lips in with foundation and the skin part with black eyeshadow to give the illusion of opposite light and dark areas.


Final look

This anti-face tutorial is based on Adam Harvey’s CV Dazzle project and follows his tips for reclaiming privacy.

1. Makeup

For my look, I avoided enhancing or amplifying any of my facial features. Instead I used dark makeup on light areas and light makeup on dark areas to confuse face-tracking software.

2. Nose Bridge

The nose bridge is a key element that face tracking looks for. I obscured it by using dark makeup in an irregular shape over the nose bridge area.

3. Eyes

The position and darkness of the eyes are key features for face tracking. Both ocular regions are dealt with here in different ways: a large section of hair completely obscures one, and light makeup conceals the dimension of the other.

4. Masks

No masks or partial masks are used to conceal my face.

5. Head

Face detection can also be blocked by obscuring the elliptical shape of one’s head. I used buns on either side of my head, and long curls to break up the roundness of my head and jaw.

6. Asymmetry

Most faces are symmetrical and face detection software relies on this. All hair and makeup elements are different on each side of my face.

4: Colour Tracking

This is me experimenting with colour tracking using the gesture-based colour tracker Nick provided us with. When I click on something in view, the colour under the cursor is recorded and the tracker draws a blob of that colour over the object and anything else that is that colour. Here I learned that the tracker is not as specific as I would like and, because of lighting, picks up on things you don’t want it to. This issue came up a lot when I was testing my project: anything I was wearing, the colour of my skin, the colour of the wall behind the tank, or even low lighting would make the tracker latch onto things I didn’t want it to.
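The click-to-pick mechanic looks roughly like this in Processing (a sketch of the technique, not Nick's actual code); the threshold constant is what makes the tracker more or less forgiving, which is exactly where the lighting problems creep in:

import processing.video.*;

Capture cam;
color target = color(255, 0, 0);   // colour recorded on the last click
float threshold = 40;              // colour-distance tolerance

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);
  cam.loadPixels();
  loadPixels();
  for (int i = 0; i < cam.pixels.length; i++) {
    color c = cam.pixels[i];
    // distance between this pixel's colour and the recorded target
    float d = dist(red(c), green(c), blue(c),
                   red(target), green(target), blue(target));
    if (d < threshold) pixels[i] = color(255, 255, 0);  // mark the match
  }
  updatePixels();
}

void mousePressed() {
  target = cam.get(mouseX, mouseY);  // record the colour under the cursor
}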

5: Modification of fish_sketch

At this point I had decided to modify this sketch for my game project. Here I try changing variables like size, position, fill colour and stroke weight. When I set the background to (0,0), the frame is never visibly cleared, so instead of the fish appearing to move, you see the track of where each fish has been over the whole run of the sketch, which also records every colour tracked during the game. This could potentially become part of the game in the future: a multi-player mode where whoever’s colour covers the highest percentage of the screen at the end wins.
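The trail effect is a consequence of Processing’s draw loop: if draw() never repaints the background, each new frame is layered over the last. A tiny self-contained illustration:

float x = 0;

void setup() {
  size(400, 200);
  background(0);   // cleared once, at the start
}

void draw() {
  // no background() call here, so every frame accumulates and the
  // ellipse leaves a track, like the fish paths described above
  x = (x + 2) % width;
  noStroke();
  fill(255, 120, 0);
  ellipse(x, 100 + 40 * sin(x * 0.05), 20, 12);
}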

 

6: Fish sliders


My first idea was to use a plastic container as my tank. I laser cut rectangles of plexi and glued them together to create channels for sliders that the fish would be attached to. The player would change the position of the fish to change the colour of the Processing fish, as in the final game, but in this setup the fish could also be placed in front of or behind each other. The colour tracker would then be able to pick up specific colour mixes, since the transparent plexi creates new colours when layered (two fish together, or all three). In the end I abandoned this idea because the sliders I made didn’t slide properly and the fish would not stay attached to them.

7: Fish transparency

The next issue with my original idea was that the colour tracker had trouble identifying the colour mixes, as they ended up too dark. In this video I attempt to have the colour tracker identify orangeblueFish, bluegreenFish and allFish; they all just end up a dark purplish colour. Another issue is that the surfaces of the fish reflect the computer screen, which confuses the tracker. At this point I decided to switch to my final “Processing Fishing” idea. Beyond the laser cut fish, I also wanted other coloured items that the tracker could identify.

8: Foam sea creatures


At this point I decided to try tracking these foam sticker sea creatures. I first tried gluing magnets to them so they could sit in the tank with the laser cut fish, but no amount of weight would make them sink to the bottom. I then tried attaching them to a stick so the player could bring them into the webcam’s view, but that didn’t really fit the fishing idea. In the end I made coloured “barnacles” as extra tank items that do sink to the bottom. In this video I show the tracker identifying all the colours of the extra foam creatures; some were too close in colour to the laser cut fish, which confused the tracker.

Whispering Space Helmet

Project Description

For the Haunted Spaceship assignment I decided to create a wearable space helmet that the user wears in the haunted house environment. Once they step on a plank of wood with a pressure sensor underneath it, the helmet is activated. The analog input triggers two outputs: red LED string blinks and pulses in front of the user’s eyes in the clear vinyl portion of the helmet, and an mp3 shield plays scary audio of creaking doors, whispering, screams, etc. When the user steps off the pressure sensor, the LED string and audio turn off. This responsive wearable creates an experience that is surprising and defies the user’s expectations, which is heightened by the fact that it is worn on the head, so close to the ears and face. It is isolating and limits the user’s vision, adding to the suspense, and when the actuators are activated the user cannot easily escape or ignore them.

Circuit Diagram


This is the basic circuit setup for my whispering helmet. The mp3 shield is first attached to the Arduino UNO, with an SD card holding my sound file inserted into the shield. The breadboard is attached to ground and 5V. The force sensor sits on the breadboard, wired with a 10K resistor as a voltage divider read by analog pin A0. For the outputs, the LED string (shown here as a single LED) is attached to the breadboard and to digital pin 5, and a set of computer speakers is attached to the mp3 shield.

Code


Sketches


Design Files


Photos and Videos


Process Journal

Before I began creating the physical “installation” for this project, I wanted to solidify the code I was going to use. I started by using the AnalogInOutSerial example to determine the minimum and maximum for the force sensor I was going to use. I then wrote code from scratch to have the LED blink when the force sensor went above a value of 100.

// pin assignments and state (declared here so the sketch compiles)
const int ledPin = 5;      // LED string on digital pin 5
const int sensorPin = A0;  // force sensor voltage divider on A0
int sensorValue = 0;

void setup() {
// declare the ledPin as an OUTPUT:
pinMode(ledPin, OUTPUT);
Serial.begin(9600);
}

void loop() {
// read the value from the sensor:
sensorValue = analogRead(sensorPin);
Serial.println(sensorValue); // log readings for calibration
if (sensorValue > 100) {
// above the threshold: blink, one second on, one second off
digitalWrite(ledPin, HIGH);
delay(1000);
digitalWrite(ledPin, LOW);
delay(1000);
}
else {
// below the threshold: keep the LED off
digitalWrite(ledPin, LOW);
}
}

Once I figured that out, I knew I had to incorporate the code for the mp3 shield. With help from Ryan, I downloaded the two libraries associated with the shield and modified the code for the MP3 ButtonPlayer2 example, swapping the button value for an analog sensor value. There were a few other modifications I had to make as well. I had to stop the track before playing it, or it would not play at all; I can’t think of a logical explanation for this, but that was a theme when working with this mp3 shield. I also had to define a step state, where 0 means no one is stepping on the sensor and 1 means someone is. This way, when someone steps off the sensor, the audio stops rather than playing out the whole track.
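A sketch of that step-state logic, assuming the SparkFun MP3 Player Shield with its SFEMP3Shield and SdFat libraries (the threshold, pins and track number here are illustrative):

#include <SPI.h>
#include <SdFat.h>
#include <SFEMP3Shield.h>

SdFat sd;
SFEMP3Shield MP3player;

const int ledPin = 5;       // LED string
const int sensorPin = A0;   // force sensor voltage divider
int stepState = 0;          // 0 = off the sensor, 1 = standing on it

void setup() {
  pinMode(ledPin, OUTPUT);
  sd.begin(SD_SEL, SPI_FULL_SPEED);  // SD_SEL comes from the library config
  MP3player.begin();
}

void loop() {
  int sensorValue = analogRead(sensorPin);
  if (sensorValue > 100 && stepState == 0) {
    stepState = 1;
    MP3player.stopTrack();   // stopping first made playback reliable
    MP3player.playTrack(1);  // plays track001.mp3 from the SD card
    digitalWrite(ledPin, HIGH);
  }
  else if (sensorValue <= 100 && stepState == 1) {
    stepState = 0;
    MP3player.stopTrack();   // cut the audio when the user steps off
    digitalWrite(ledPin, LOW);
  }
}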

Once these problems were sorted out, I started making the physical helmet. I used plaster cast strips to cast a mould of my bike helmet. Once that was dry I cut an opening for the face to come through. Later I added clear vinyl as a kind of protective visor in this opening. I then dismantled computer speakers and resoldered the connections. Then I cut holes on each side of the helmet for the speakers to fit.


I then glued the LED string back and forth across the clear visor. Finally I soldered the force sensor to long leads and affixed all the leads in the project to the helmet so there’d be no risk of the user pulling out any connections. My last touch was red tubing I sewed onto the base of the helmet to reinforce the space aesthetic.


For our critique class I used a plank of wood to spread out the user’s weight so the sensor would be activated even if the user wasn’t standing directly on top of it. If this helmet were the key interactive element in a haunted spaceship, the sensor would be hidden less obviously so it could startle the user.


Using the mp3 shield was the biggest challenge in this project. If I were to do it again, I think I would find some other way to play audio, and I might add other sensors and outputs now that I am becoming more familiar with them and the simple code that makes them function. I would also have loved to spend more time on the actual fabrication of the helmet, but overall I am happy with the look and function of my whispering space helmet.

Project Context

As this project developed and came to an end, I started thinking about how it could be taken further and how it could work in different contexts. The idea of a wearable that is location-specific is very interesting to me. The most common version of this is the audio guide you carry around at museums and galleries: once you punch in the number for the piece you’re looking at, the device tells you about it. This kind of interactive, contextual wearable is the category where I see my piece fitting.

http://mw2013.museumsandtheweb.com/paper/transforming-the-art-museum-experience-gallery-one-2/

This article showcases an interactive museum experience at The Cleveland Museum of Art. The gallery is an interactive environment where guests can connect with the space and the pieces in it in a totally new way. Multi-touch screens are all over the gallery, and guests can learn more about pieces through text, images and “games”. There is also a creative aspect where users are asked to make something (using the touch screen) related to a piece they are viewing and can then share their creation on social media, along with large touch screens for kids to create shareable drawings. An iPad app called ArtLens provides even more information on pieces based on your proximity to them, or by scanning a piece into the app, and lets you build a playlist of favourites. This interactive gallery relates to my project in that the triggers are the pieces themselves and the delivery is information or an additional interactive experience. The main difference is that the delivery happens not through a wearable but through touch screens and iPads, which makes sense for this environment: visitors come from many different contexts and backgrounds, so the delivery needs to be simple and intuitive.

http://exertiongameslab.org/projects/lumahelm

This article describes a product called the LumaHelm. It has a more direct correlation with my project, as the wearable is worn on the head. The helmet, developed at RMIT University, has LED strips activated by an accelerometer, giving a cyclist added visibility and safety. Some versions also have an embedded heart rate sensor, which reminds drivers that they are sharing the road with a fragile living, breathing human and makes them more aware. This project differs from mine in that the output is externalized and shared with onlookers, whereas my helmet provides a personal, individual experience. In both cases the helmet is location-specific and uses sensors to create an output.
