
SPOT & DOT – Tamagotchi Project

SPOT & DOT

General Summary

For my Tamagotchi project I created two robots, each of which takes sensory input that guides its output behaviour. SPOT is a web camera mounted on a robotic arm that moves along three degrees of freedom (i.e., three servo motors). This version of SPOT is a re-iteration of a research project completed by Egil Runnar, Ling Ding, Tuesday, and myself, where we had to create a life-like moving robot inspired by a robotic installation from the 1970s named The Senster (see image below). DOT, on the other hand, is a robot consisting of three concentric rings attached to each other by hinges and servo motors, allowing for three degrees of rotational movement. DOT’s movement along these three axes is pseudo-random, giving the sense that DOT is exploring its immediate environment and the possible combinations of movement and orientation. DOT has one sensory input: a photoresistor sensitive to the amount of light at its surface. The light level DOT senses is mapped onto the speed of its rotation, where more light leads to faster movement and less light to slower movement.

[Image: The Senster]

SPOT

[Image: SPOT]

Summary

Since this iteration of SPOT is based on a previous group project, I will not go into too much depth about the inspirational background and underlying concepts, but the core features, updated fabrication, and programmed behaviours deserve mention here.

Behaviour

SPOT’s behaviour can be summarized as three modes: random movements during an inactive phase, face tracking, and motion tracking. All of SPOT’s behaviour is driven by his camera input. SPOT is interactive: he can follow your face as you move around, you can distract him with physical motion in front of his camera, or you can simply watch his well-orchestrated random movements. SPOT’s camera feed goes straight into a MacBook Pro via USB, and the visual information is processed using Processing and a series of visual processing libraries.

Random Movements: When no faces or movements are detected, SPOT’s three servo motors are given random movement commands (varying in final position and speed) and move in burst intervals. In this mode, SPOT can take on a number of random movements and positions, often leading the viewer to anthropomorphize his actions. Each of these movements is executed in three steps – a fast initial movement, followed by a medium-speed movement, and then a final fast movement into the concluding position. Combined, these three steps give the impression of organic behaviour, since animal movements follow a similar pattern in nature, adding to the sense that SPOT is a living and intentional object.
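
SPOT’s movement commands were actually issued from Processing, but the three-step pattern itself can be sketched on the Arduino side. The snippet below is only an illustration: the pin number, speeds, and timing are placeholders, and it uses the VarSpeedServo library (the same one DOT uses), whose write(position, speed, wait) call blocks until the move finishes.

#include <VarSpeedServo.h>

VarSpeedServo baseServo;                     // hypothetical: one of SPOT's three servos, on pin 9

void setup() {
  baseServo.attach(9);
  randomSeed(analogRead(A5));                // seed randomness from a floating analog pin
}

void loop() {
  int current = baseServo.read();            // where the servo is now
  int target  = random(0, 181);              // random concluding position (0-180 degrees)
  int midway  = (current + target) / 2;

  // Step 1: fast initial movement toward the target
  baseServo.write(constrain(midway - 10, 0, 180), 200, true);
  // Step 2: slower, medium-speed movement
  baseServo.write(constrain(midway + 10, 0, 180), 60, true);
  // Step 3: quick movement into the concluding position
  baseServo.write(target, 200, true);

  delay(random(1000, 4000));                 // pause between movement bursts
}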

Face Tracking: When SPOT detects a face in his visual field, he moves his head up or down, or his base left and right, to centre the face in his camera input. If the face is far into the camera’s periphery he executes a fast corrective movement; otherwise he makes smaller adjustment movements.
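
The real adjustment logic lived in the Processing sketch; the little helper below only illustrates the idea with made-up thresholds – big corrections when the face sits far from the centre of the frame, small ones when it is only slightly off-centre.

// Illustrative only: given the face's horizontal position in the frame, decide how
// big a corrective step the base servo should take. Thresholds and step sizes are assumptions.
int panCorrection(int faceX, int frameWidth) {
  int offset = faceX - frameWidth / 2;          // pixels left (-) or right (+) of centre
  if (abs(offset) > frameWidth / 4) {           // face is far in the periphery
    return (offset > 0) ? 15 : -15;             // large, fast corrective step (degrees)
  } else if (abs(offset) > frameWidth / 16) {   // face is only slightly off-centre
    return (offset > 0) ? 3 : -3;               // small adjustment
  }
  return 0;                                     // face is centred; hold position
}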

Motion Tracking: SPOT’s final behavioural mode is motion tracking, which uses an algorithm that calculates the degree of pixel change from one frame to the next. This behaviour is new compared to SPOT’s original iteration. If enough change is registered relative to a threshold value, it is interpreted as movement. If motion is detected along either the left/right axis or the up/down axis relative to SPOT’s centre of vision, he makes a compensatory movement to bring the moving object closer to his centre of vision. One issue with motion detection is that motion would naturally be detected during SPOT’s own random movements, so motion detection is temporarily disabled until his movements are completed; otherwise he would get locked in a never-ending loop.
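
The motion tracking itself ran in Processing, adapted from the instructors’ algorithm, so the fragment below is only an illustration of the underlying idea in plain C: difference two grayscale frames, count changed pixels on each side of centre, and compare against a threshold. The frame size and threshold values are made up.

#include <stdlib.h>        // for abs()

#define WIDTH  160         // assumed frame dimensions
#define HEIGHT 120
#define PIXEL_DIFF   25    // per-pixel change needed to count as "changed"
#define MOTION_COUNT 300   // changed pixels needed to register motion

// Returns -1 for motion on the left half, +1 on the right half, 0 for none.
// The same comparison, run over the top and bottom halves, gives the up/down axis.
int horizontalMotion(const unsigned char *prev, const unsigned char *curr) {
  long leftChanged = 0, rightChanged = 0;
  for (int y = 0; y < HEIGHT; y++) {
    for (int x = 0; x < WIDTH; x++) {
      int diff = abs((int)curr[y * WIDTH + x] - (int)prev[y * WIDTH + x]);
      if (diff > PIXEL_DIFF) {
        if (x < WIDTH / 2) leftChanged++;
        else rightChanged++;
      }
    }
  }
  if (leftChanged > MOTION_COUNT && leftChanged > rightChanged) return -1;
  if (rightChanged > MOTION_COUNT && rightChanged > leftChanged) return 1;
  return 0;
}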

Parts

  • 3 servo motors (2 TowerPro servos + 1 mini-Servo)
  • laser-cut pieces of Baltic Birch plywood (1/8 inch thick)
  • 1 generic webcam compatible with OSX
  • various nuts, bolts, washers, and mini-clamps
  • Arduino Uno + USB cable + jumper wires + breadboard
  • 5V DC Power Supply
  • Computer (here a 2013 Macbook Pro)

Fabrication

The plywood pieces composing the arms and base structure were designed and laser-cut from Baltic Birch plywood. The CNC file used to cut the pieces incorporated the designs for both SPOT and DOT so they could be cut in a single job to save money and time. The file was created in Adobe Illustrator, as shown in Figure 1. The file encodes cutting order by colour: blue lines are cut first, green lines second, and red lines last.

[Figure 1: CNC laser-cut file laid out in Adobe Illustrator]

The servos and camera were fastened to the plywood body and the parts were put together. The servos were locked into the CNC slots and held in place via a nut, bolt, and a piece of metal used as a clamp. The camera is held onto the arm via a nut and bolt. None of the parts of SPOT were glued together; it is held together entirely via clamps or nuts and bolts.

Software

Some aspects of SPOT’s programming were fairly complex, particularly refitting the motion algorithm to suit the present purposes, and the overall control flow: camera input goes to Processing, which controls the Arduino (and moves the servos), which then sends feedback to Processing to gate the camera input. The face tracking came from an OpenCV library, while the motion tracking was adapted from an algorithm provided by the course instructors.

I adapted the motion-detection algorithm to distinguish not only movement at the left vs. right of the screen, but also motion near the top or bottom of the screen. Motion was not registered in the centre of the screen, only in the periphery.

Processing controlled the Arduino through the serial port, and the Arduino sent data back to Processing through the same USB serial connection. The Arduino needed to send data back because SPOT had to suppress motion detection while moving, or else he would be stuck in an infinite loop. This is where the control flow started to become complicated: managing face tracking and motion tracking while making the robot behave as smoothly as possible.
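
I haven’t reproduced SPOT’s actual Processing and Arduino code here, but the Arduino half of the handshake might look roughly like the sketch below. The message format (three angle bytes in, a ‘D’ byte back when the move is done) and the pin numbers are assumptions, not the exact protocol SPOT used.

// Illustrative Arduino-side sketch of the serial handshake described above.
#include <VarSpeedServo.h>

VarSpeedServo baseServo, armServo, headServo;   // hypothetical pin assignments below

void setup() {
  Serial.begin(9600);
  baseServo.attach(9);
  armServo.attach(10);
  headServo.attach(11);
}

void loop() {
  if (Serial.available() >= 3) {                 // wait for a full three-angle command from Processing
    int basePos = Serial.read();
    int armPos  = Serial.read();
    int headPos = Serial.read();

    // Move the servos; the 'true' flag blocks until each move completes,
    // so nothing is sent back while SPOT is still in motion.
    baseServo.write(constrain(basePos, 0, 180), 120, true);
    armServo.write(constrain(armPos, 0, 180), 120, true);
    headServo.write(constrain(headPos, 0, 180), 120, true);

    Serial.write('D');                           // tell Processing the move is done,
                                                 // so motion detection can resume
  }
}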

DOT

[Image: DOT]

Summary

DOT is a relatively large (about 2 feet tall) interactive installation with three concentric rings positioned within each other, affording three axes of physical rotation about a static central position. The rings are rotated by servo motors via independent commands. Two of the rings rotate about the y-axis (leftward and rightward) while one ring rotates about the x-axis (upward and downward). When all rings move simultaneously, the combination of motions is mesmerizing and provokes one to reflect on physical movement itself. DOT intentionally invokes a relation to astronomical devices, or to the rotations of the human eye. As DOT moves about, it can even give a sense of intentionality: one can relate to some of the movements it independently makes and some of the positions it finds itself in, and in some ways it ominously recalls biological motion. The speed of DOT’s motion is influenced by its one photoresistor, where more light leads to more agitated and faster movements.

Inspiration

[Image: Bohnenberger’s Apparatus]

The concept for DOT came about as I stumbled onto a website focused on mechanical movement (http://507movements.com/mm_356.html). One of the mechanical devices presented there was Bohnenberger’s Apparatus (http://physics.kenyon.edu/EarlyApparatus/Mechanics/Bohnenbergers_Apparatus/Bohnenbergers_Apparatus.html), the device that inspired the gyroscope. It got me thinking about rotational movements in general, for example how the eye moves about within its socket. I felt this movement pattern was unique and worth exploring, so I made some preliminary designs. The shop instructor told me that the best and sturdiest material to use would be either metal or plywood. Working with metal would have been an exciting project, but given the looming deadline I decided that laser-cutting plywood would give me the most satisfactory and reliable results.

When the device was complete and finally working, I could not believe how interesting the movements were to watch. The Bohnenberger apparatus was originally conceived as an astronomical device for studying rotational movement, but DOT’s movement as it behaved on its own was surprisingly organic. As the rings slowly moved into new orientations relative to each other, it became easier and easier to anthropomorphize the result. Sometimes it would move smoothly, and sometimes it would settle into what appeared to be strained and dramatic-looking positions. The final product gives me an adaptable platform for studying mechanical movement via microcontroller for the foreseeable future. Future projects may include giving DOT more sensors to guide its own movements and reduce the reliance on randomness.

Behaviour

DOT’s behaviour is relatively simple compared to SPOT’s multiple behavioural modes, but in many ways it is more mesmerizing. DOT’s three concentric rings rotate independently (except the outermost ring) and randomly update their positions relative to their current orientations. Each ring moves to a new position by rotating up to 10 degrees clockwise or counterclockwise from its current position (the value is determined randomly for each servo). New movement commands are sent to DOT’s three servos simultaneously, and are not updated until all servos have completed their movements. The speed at which DOT moves is determined at any given time by how much light its photoresistor detects.

When any of DOT’s rings reaches the maximum rotation of its servo in either direction (i.e., zero or 180 degrees), DOT goes through a routine in which that ring rotates to the opposite end of its travel, with 30 degrees of slack to prevent a series of redundant movements. While DOT is not as interactive as SPOT, which tracks your face and movements in real time, DOT’s behaviour is more independent and fascinating to watch for its own sake, and you can still control the relative speed of its movements. In further iterations, DOT’s central circle could be fitted with sensors to guide its behaviour, opening up a fascinating potential series of intentional rotational movements.
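
As a rough illustration, that reversal routine can be written as a small helper like the one below (using the same VarSpeedServo library described in the Software section); the speed argument and the exact slack handling are placeholders rather than DOT’s actual code.

#include <VarSpeedServo.h>   // assumed to be included once in the main DOT sketch

// Illustrative only: if a ring's servo has hit either end of its travel,
// send it back toward the opposite end, stopping 30 degrees short so this
// routine doesn't immediately re-trigger on the next pass.
void checkLimits(VarSpeedServo &ringServo, int servoSpeed) {
  int angle = ringServo.read();
  if (angle <= 0) {
    ringServo.write(180 - 30, servoSpeed, true);   // swing to the far end, minus the slack
  } else if (angle >= 180) {
    ringServo.write(0 + 30, servoSpeed, true);     // swing back the other way, plus the slack
  }
}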

Parts

  • 3 servo motors (1 TowerPro + 2 mini-Servos)
  • laser-cut pieces of Baltic Birch plywood (1/8 inch thick)
  • 5V DC Power Supply
  • Photoresistor
  • Arduino Mega + jumper wires + copper wire + breadboard
  • Nuts, bolts, washers, spacers
  • Rubber caps

Fabrication

The body of DOT is also composed of laser-cut plywood. The main challenge of DOT was the fabrication, specifically how to have multiple concentric rings fixed within each other, all turning on their own axes via servo motors. The solution was to cut two copies of the rings and fasten them together with nuts, bolts, washers, and spacers, and to fit the servo motors between them. An axle was attached to each servo motor, and another axle was attached to the opposite side of the ring. The axles were made of long bolts and fitted loosely into holes in the spacer of the next inner ring. This allowed the pieces to rotate smoothly, or in this case, to be turned by servo motor.

[Image: DOT assembly]

The wires for powering and controlling the servo motors were fitted through the space between the rings alongside the spacers, and fed along each axle, until they reached the base where they were connected to the microcontroller. As with SPOT, there was no gluing involved in this project. The servos and axle joints were set in place purely by pressure: tightening or loosening the adjacent ring prints (held apart via spacers) allows one to fix or release the servo motors. The arms of the servo motors were connected to the axles via circular clamps or rubber caps, which hold the arm of the servo to its associated axle.

Software

Compared to SPOT, DOT’s software was relatively simple. Unlike SPOT, DOT’s program could be housed entirely on the Arduino. Using the VarSpeedServo.h library, which lets you control a servo’s speed as well as its position, each servo was given independent commands. Photoresistor readings were mapped to servo speed values ranging from very slow to a medium pace. A Boolean passed as the third argument to the library’s write function tells it to wait until the movement is complete before issuing new commands, which had the effect of waiting until each servo finished its rotation before new commands were sent.
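
DOT’s full sketch isn’t reproduced here, but a condensed version of the loop described above might look like the following. The pin assignments, the photoresistor’s analog range, and the speed range are assumptions.

#include <VarSpeedServo.h>

VarSpeedServo ringX, ringY1, ringY2;   // the three ring servos (assumed pins 9-11)
const int lightPin = A0;               // photoresistor voltage divider (assumed pin)

void setup() {
  ringX.attach(9);
  ringY1.attach(10);
  ringY2.attach(11);
  randomSeed(analogRead(A5));
}

void loop() {
  // More light -> faster, more agitated movement (the speed range is an assumption).
  int light = analogRead(lightPin);
  int servoSpeed = map(light, 0, 1023, 10, 120);

  // Each ring moves up to 10 degrees clockwise or counterclockwise from where it is now.
  int xTarget  = constrain(ringX.read()  + random(-10, 11), 0, 180);
  int y1Target = constrain(ringY1.read() + random(-10, 11), 0, 180);
  int y2Target = constrain(ringY2.read() + random(-10, 11), 0, 180);

  // Issue the first two commands without waiting, then block on the third;
  // since all rings move at the same speed over similar distances, this
  // approximates waiting until every servo has completed its movement.
  ringX.write(xTarget, servoSpeed, false);
  ringY1.write(y1Target, servoSpeed, false);
  ringY2.write(y2Target, servoSpeed, true);
}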


DailyDevice – Speedometer – IR_Rotary_Sensor

My daily device is a Speedometer that measures the number of times a wheel turns over time.

I wanted to figure out the relationship between the rotations per minute (RPM) of a DC motor and the amount of PWM voltage applied (controlled via a transistor), and this method worked out.

To figure out the motor and transistor wiring, I used Marcello’s Daily Device (DC motor and potentiometer).

Parts for this Setup

– Potentiometer (to control wheel speed)

– Transistor + DC motor + SpokesWheel on axle (to measure amount of rotation)

– IR Rotary Sensor

IR Rotary Sensors

I bought an IR rotary sensor kit from Creatron for this; see the link below. It is basically an IR emitter and a sensor placed adjacent to each other. When read in a loop, it returns zeros while the sensor detects the signal from the emitter, and ones while the beam is interrupted.

http://www.tinyosshop.com/index.php?route=product/product&product_id=541

The IR sensor can therefore detect whether it is being interrupted by the spokes of a turning wheel. By sampling the 0s and 1s over a set period of time, and knowing the geometry of the spokes and wheel (e.g., 20 spokes per full rotation), one can calculate how far the wheel has turned over that period. This value can then be converted into rotations per minute, which can in turn be converted into speed (speed = distance / time) using the circumference of the wheel. For example, 40 signal transitions in one second correspond to 20 spokes passing, i.e. one full revolution per second, or 60 RPM; with a 20 cm circumference that is 1200 cm of travel per minute.

Visuals of Setup and Demo

Overview

IR Sensor + DC motor with black spokes wheel connected to transistor + Potentiometer

IR Sensor – 3 pins: 5V, Ground, and digital output


Clearer image of spokes wheel


Video Demo

The Serial.print values show the potentiometer reading, motor PWM value, digital IR value, transition count and revolutions per sample interval (1 second), and rotations per minute (RPM).

Code

// Function to use IR sensor to measure speed of turning spokes-wheel

// Motor and Potentiometer Values
int potPin = A4; int potValue = 0;
int motorPin = 3; int motorValue = 0;

// Rotary Wheel-Spin Sensor Values
int sensorRotaryPin = A0;
int sensorVal = 0; // 0 or 1 depending on whether the IR sensor detects the spokes or passes through them
int previousSensorVal = 0; // previous 0/1 value to use during loop calculations
int count = 0; // counts how many times the digital signal shifts between 0 and 1
float revolutions = 0; // number of times the wheel has turned 360 degrees
float rpm = 0; // rotations per minute
float speedVal = 0; // distance/time value calculated from revolutions and constants
int samplingInterval = 1000; // sample period (in milliseconds) over which wheel turns are counted
long previousMillis = 0;

// Setup and Loop
void setup() {
  Serial.begin(250000);
  pinMode(sensorRotaryPin, INPUT); // IR sensor's digital output
}

void loop() {
  potValue = analogRead(potPin);
  potValue = constrain(potValue, 0, 1023); // constrain() returns the value; it does not modify in place
  motorValue = map(potValue, 0, 1023, 0, 255);
  analogWrite(motorPin, motorValue);
  sensorVal = digitalRead(sensorRotaryPin);
  spokesCounter();
  GetSpeed();

  // Serial Value Printing
  Serial.print(potValue); Serial.print(" ");
  Serial.print(motorValue); Serial.print(" ");
  Serial.print(sensorVal); Serial.print(" ");
  Serial.print(count); Serial.print(" ");
  Serial.print(revolutions); Serial.print(" ");
  Serial.println(rpm);
}

// Functions
void spokesCounter() { // Counts the number of times the digital reading changes value; 20 spokes is 360 degrees
  if (sensorVal != previousSensorVal) {
    count++;
  }
  previousSensorVal = sensorVal; // store the value to compare against on the next pass through loop()
}

void GetSpeed() {
  if (millis() - previousMillis >= samplingInterval) { // use a timer to turn the count into rotation values and speed data
    revolutions = (count / 2.0) / 20.0; // each spoke produces two 0/1 transitions, so count/2 = spokes passed; 20 spokes = one full rotation
    rpm = revolutions * (60000.0 / samplingInterval); // revolutions per minute (60,000 ms per minute)
    speedVal = rpm * 20.0; // 20 cm is the wheel's circumference; speed = cm per minute
    previousMillis = millis(); // reset timer to resample
    count = 0; // reset counter for next sample
  }
}

Bird – RGB lights mapped to Sonar

I put the RGB LED into the bird and controlled its colour output using the incoming sonar values. The sonar values were also mapped onto tone values played through a speaker.

Each of the three RGB pins of the common-anode LED was given a mapped value between 0 and 255. The ratio of RGB values was correlated with the distance values (scaled from the sonar’s 0–40 cm range) obtained from the sonar distance sensor. (See the code below for how the values were mapped and calculated.)

Here are a couple of images of my wiring.

[Images: wiring]

Below is the code I used, if you wanna see how I controlled the RGB LED and mapped the sounds, distance, and colours onto each other.
// For RGB LED
int redPin = 11;
int greenPin = 10;
int bluePin = 9;
int micPin = 0;
int micVal = 0;
int sampleRate = 500;
long lastChange = 0;

// For Ping Sensor
#include <NewPing.h>
#define TRIGGER_PIN 6 // Arduino pin tied to trigger pin on the ultrasonic sensor.
#define ECHO_PIN 5 // Arduino pin tied to echo pin on the ultrasonic sensor.
#define MAX_DISTANCE 40 // Maximum distance we want to ping for (in centimeters). Maximum sensor distance is rated at 400-500cm.
NewPing sonar(TRIGGER_PIN, ECHO_PIN, MAX_DISTANCE); // NewPing setup of pins and maximum distance.

// For Speaker Output
#include <NewTone.h>
#define TONE_PIN 8
int speakerRate = 1000;
int minFreq = 250; // minimum and maximum frequencies
int maxFreq = 2000;

/*uncomment this line if using a Common Anode LED */
#define COMMON_ANODE

void setup() {
pinMode(redPin, OUTPUT);
pinMode(greenPin, OUTPUT);
pinMode(bluePin, OUTPUT);

Serial.begin(115200);
}

void loop() {
// PING
delay(50); // Wait 50ms between pings (about 20 pings/sec). 29ms should be the shortest delay between pings.
unsigned int uS = sonar.ping(); // Send ping, get ping time in microseconds (uS).
Serial.print("Ping: ");
int distance = uS / US_ROUNDTRIP_CM;
Serial.print(distance); // Convert ping time to distance in cm and print result (0 = outside set distance range)
Serial.println("cm");

int distanceScaled = map(distance, 0, MAX_DISTANCE, 0, 100); // Normalize ping distance values

// RGB Values calculated from distanceScaled
int red = getRed(distanceScaled);
int green = getGreen(distanceScaled);
int blue = getBlue(distanceScaled);

// Output Tone
if(distanceScaled > 0) {
int freq = map(distanceScaled, 0, 100, 1500, 250);
NewTone(TONE_PIN, freq);
delay(1);
noNewTone(TONE_PIN);
}

if((millis()-lastChange>=speakerRate) && (distanceScaled > 0)) {
int freq = map(distanceScaled, 0, 100, maxFreq, minFreq);
NewTone(TONE_PIN, freq);
delay(5);
noNewTone(TONE_PIN);
lastChange=millis();
}

setColor(red, green, blue); // write the computed colour values to their matching channels
}

void setColor(int red, int green, int blue) {
#ifdef COMMON_ANODE
red = 255 - red;
green = 255 - green;
blue = 255 - blue;
#endif
analogWrite(redPin, red);
analogWrite(greenPin, green);
analogWrite(bluePin, blue);
}

int getRed(int distance) {
if((distance > 0) && (distance < 26)) {
int red = map(distance, 1,25, 0,255);
return red;
}
else if((distance > 25) && (distance < 51)) {
int red = map(distance, 26, 50, 255, 0); return red;
}
else{int red = 0; return red;}
}

int getGreen(int distance) {
if((distance > 25) && (distance < 51)) {
int green = map(distance, 25,50, 0,255); // ramp green up over 25-50
return green;
}
else if((distance > 50) && (distance < 76)) {
int green = map(distance, 51, 75, 255, 0); return green; // ramp green back down over 51-75
}
else{int green = 0; return green;}
}

int getBlue(int distance) {
if((distance > 49) && (distance < 76)) {
int blue = map(distance, 50,75, 0,255);
return blue;
}
else if((distance > 75) && (distance < 101)) {
int blue = map(distance, 76, 100, 255, 0); return blue;
}
else{int blue = 0; return blue;}
}