SPOT & DOT – Tamagotchi Project

General Summary

For my Tamagotchi project I created two robots that each take sensory input to guide their output behaviour. SPOT is a web camera mounted on a robotic arm; the camera input drives the arm’s movement along three degrees of freedom (i.e., three servo motors). This version of SPOT is a reiteration of a research project completed by Egil Runnar, Ling Ding, Tuesday, and myself, in which we had to create a life-like moving robot inspired by a robotic installation from the 70s named The Senster (see image below). DOT, on the other hand, is a robot consisting of three concentric rings attached to each other by hinges and servo motors, allowing for three degrees of rotational movement. DOT’s movement along these three axes is pseudo-random, giving the sense that DOT is exploring its immediate environment and the possible combinations of movements and orientations. DOT has one sensory input, a photoresistor sensitive to the amount of light at its surface. The amount of light DOT senses is mapped to the speed of DOT’s rotation: more light leads to faster movement, less light to slower movement.

[Image: The Senster]

SPOT

[Image: SPOT]

Summary

Since this iteration of SPOT is based on a previous group project, I will not go into too much depth about the inspirational background and underlying concepts, but the core features, updated fabrication, and computed behaviours deserve mention here.

Behaviour

SPOT’s behaviour can be summarized as three potential behaviour modes: random movements during an inactive phase, face tracking, and motion tracking. SPOT’s behaviour is based on its camera input. SPOT is interactive: he can follow your face as you move around, you can distract him with physical motion in front of his camera, or you can simply watch his well-orchestrated random movements. SPOT’s camera input is fed straight into a MacBook Pro via USB, where the visual information is processed using Processing and a series of visual-processing libraries.

Random Movements: When no faces or movements are detected, SPOT’s three servo motors are given random movement commands (varying in final position and speed) and he moves in burst intervals. In this mode, SPOT can take on a number of random movements and positions, often leading the viewer to anthropomorphize his actions. Each of these movements is executed in three steps: a fast initial movement, followed by a medium-speed movement, and then a final fast movement to the concluding position. Combined, these three steps give the impression of organic behaviour, since animal movements follow a similar pattern in nature, adding to the sense that SPOT is a living and intentional object.
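
To make the three-step burst concrete, here is a minimal Arduino-style sketch of the idea for a single servo. It assumes the VarSpeedServo library for speed control; the pin number, angle range, speed values, and timing are placeholders rather than SPOT’s actual parameters (and in SPOT the movement commands were actually issued from Processing):

    #include <VarSpeedServo.h>

    VarSpeedServo panServo;                  // one of SPOT's three servos

    void setup() {
      panServo.attach(9);                    // signal pin (assumed)
      randomSeed(analogRead(A0));            // seed from an unconnected analog pin
    }

    void loop() {
      int start  = panServo.read();          // current position in degrees
      int target = random(20, 161);          // random final position, 20-160 degrees
      int stepA  = start + (target - start) / 3;
      int stepB  = start + 2 * (target - start) / 3;

      // Three-step burst: fast, then medium speed, then fast to the final position.
      // (Speeds: 1 = slowest, 255 = fastest; true = block until the move completes.)
      panServo.write(stepA,  200, true);
      panServo.write(stepB,  60,  true);
      panServo.write(target, 200, true);

      delay(random(1000, 4000));             // pause between bursts
    }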

Face Tracking: When SPOT detects a face in his visual field, he moves his head up or down, or rotates his base left or right, to centre the face in his camera input. If the face is far into the camera’s periphery, he executes a fast corrective movement; otherwise, smaller adjustment movements are made.
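
The centring logic boils down to a simple rule on the face’s offset from the middle of the frame. The sketch below illustrates that rule in plain C++ rather than the original Processing/OpenCV code; the frame size, periphery threshold, and step sizes are assumptions:

    #include <cstdio>
    #include <cstdlib>

    const int FRAME_W = 640, FRAME_H = 480;   // assumed camera resolution
    const int PERIPHERY = 180;                // px from centre counted as "far out"
    const int FAST_STEP = 15;                 // degrees for a fast corrective move
    const int SLOW_STEP = 3;                  // degrees for a small adjustment

    // Given the detected face centre, compute pan/tilt adjustments in degrees.
    void centreFace(int faceX, int faceY, int &panDelta, int &tiltDelta) {
      int dx = faceX - FRAME_W / 2;           // + means the face is right of centre
      int dy = faceY - FRAME_H / 2;           // + means the face is below centre

      int panStep  = (std::abs(dx) > PERIPHERY) ? FAST_STEP : SLOW_STEP;
      int tiltStep = (std::abs(dy) > PERIPHERY) ? FAST_STEP : SLOW_STEP;

      panDelta  = (dx == 0) ? 0 : (dx > 0 ? -panStep  : panStep);
      tiltDelta = (dy == 0) ? 0 : (dy > 0 ? -tiltStep : tiltStep);
    }

    int main() {
      int pan = 0, tilt = 0;
      centreFace(600, 250, pan, tilt);        // a face far to the right of centre
      std::printf("pan %+d deg, tilt %+d deg\n", pan, tilt);
      return 0;
    }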

Motion Tracking: SPOT’s final behavioural mode is motion tracking, which uses an algorithm that calculates the degree of pixel change from one frame to the next. This behaviour is new compared to SPOT’s original iteration. If enough change is registered relative to a threshold value, it is interpreted as movement. If motion is detected along either the left/right axis or the up/down axis relative to SPOT’s centre of vision, he makes a compensatory movement to bring the moving object closer to his centre of vision. One issue with motion detection is that motion would naturally be detected during SPOT’s own random movements, so motion detection is temporarily disabled until his movements are completed; otherwise he would get locked in a never-ending loop.
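
The core of the motion detector is a frame difference: count the pixels whose brightness changed by more than a per-pixel threshold, call it motion if the count exceeds a second threshold, and skip detection entirely while the servos are moving. A rough C++ sketch of that idea (the real version ran in Processing; the thresholds and frame size here are assumptions):

    #include <cstdlib>
    #include <vector>

    const int PIXEL_THRESHOLD  = 30;     // per-pixel brightness change that counts
    const int MOTION_THRESHOLD = 1500;   // changed pixels needed to call it motion

    bool servosMoving = false;           // set true while SPOT executes a movement

    // Compare two grayscale frames and report whether enough pixels changed.
    bool motionDetected(const std::vector<unsigned char>& prev,
                        const std::vector<unsigned char>& curr) {
      if (servosMoving) return false;    // suppress detection during SPOT's own moves
      int changed = 0;
      for (std::size_t i = 0; i < prev.size() && i < curr.size(); ++i) {
        if (std::abs(curr[i] - prev[i]) > PIXEL_THRESHOLD) ++changed;
      }
      return changed > MOTION_THRESHOLD;
    }

    int main() {
      std::vector<unsigned char> previous(640 * 480, 100), current(640 * 480, 100);
      for (int i = 0; i < 2000; ++i) current[i] = 200;   // fake a moving object
      return motionDetected(previous, current) ? 0 : 1;  // exits 0 when motion is seen
    }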

Parts

  • 3 servo motors (2 TowerPro servos + 1 mini servo)
  • laser-cut pieces of Baltic Birch plywood (1/8th of an inch thick)
  • 1 generic webcam compatible with OS X
  • various nuts, bolts, washers, and mini-clamps
  • Arduino Uno + USB cable + jumper wires + breadboard
  • 5V DC Power Supply
  • Computer (here a 2013 MacBook Pro)

Fabrication

The plywood pieces composing the arms and base structure were designed and laser-cut from Baltic Birch plywood. The CNC file used to cut the pieces incorporated the designs for both SPOT and DOT so they could be cut in one job, saving money and time. The file was created in Adobe Illustrator, as shown in Figure 1. The logic of the laser-cut file is that lines are cut in a set order: blue lines first, green lines second, and red lines last.

Figure 1. The laser-cut file for SPOT and DOT, designed in Adobe Illustrator.

The servos and camera were fastened to the plywood body and the parts were put together. The servos were locked into the CNC slots and held in place via a nut, bolt, and a piece of metal used as a clamp. The camera is held onto the arm via a nut and bolt. None of the parts of SPOT were glued together; it is held together entirely via clamps or nuts and bolts.

Software

Some aspects of SPOT’s programming were fairly complex, particularly refitting the motion algorithm to the present purpose and managing the overall control flow: the camera input feeds into Processing, Processing controls the Arduino (which moves the servos), and the Arduino sends feedback back to Processing to govern how the camera input is handled. The face-tracking algorithm was borrowed from an OpenCV library, while the motion-tracking algorithm was adapted from one provided by the course instructors.

I adapted the motion-detection algorithm to not only distinguish between movement at the left versus the right of the screen, but also to determine whether there is motion near the top or bottom of the screen. Motion is not registered in the centre of the screen, only in the periphery.
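
In other words, changed pixels are tallied per peripheral band and the centre is ignored. A simplified C++ sketch of that classification (the band sizes, thresholds, and winner-takes-all choice are my assumptions, not the original Processing code):

    #include <algorithm>
    #include <cstdio>
    #include <cstdlib>
    #include <string>
    #include <vector>

    const int W = 640, H = 480;          // assumed frame size
    const int BAND = 160;                // width/height of each peripheral band
    const int PIXEL_THRESHOLD = 30;      // per-pixel change that counts as motion

    // Count changed pixels per peripheral band; pixels in the central region
    // fall through every branch and are ignored.
    std::string motionRegion(const std::vector<unsigned char>& prev,
                             const std::vector<unsigned char>& curr) {
      int left = 0, right = 0, top = 0, bottom = 0;
      for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
          int i = y * W + x;
          if (std::abs(curr[i] - prev[i]) <= PIXEL_THRESHOLD) continue;
          if      (x < BAND)      ++left;
          else if (x >= W - BAND) ++right;
          else if (y < BAND)      ++top;
          else if (y >= H - BAND) ++bottom;
          // otherwise: centre of the screen, not registered
        }
      }
      int best = std::max({left, right, top, bottom});
      if (best == 0)     return "none";
      if (best == left)  return "left";
      if (best == right) return "right";
      if (best == top)   return "top";
      return "bottom";
    }

    int main() {
      std::vector<unsigned char> a(W * H, 100), b(W * H, 100);
      for (int y = 0; y < H; ++y) b[y * W + 5] = 200;   // changes in the left band
      std::printf("motion: %s\n", motionRegion(a, b).c_str());
      return 0;
    }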

Processing controlled the Arduino through the serial port, and the Arduino sent data back to Processing over the same USB serial connection. The Arduino needed to report back because SPOT had to suppress motion detection while moving, or else he would be stuck in an infinite loop. This is where the control flow started to become complicated: managing face tracking and motion tracking while making the robot behave as smoothly as possible.
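
The Arduino side of this handshake can be sketched roughly as follows: read a target angle from the serial port, move the servo, then report back so Processing knows it can re-enable motion detection. The plain-text command format, pin number, fixed delay, and the "done" reply are illustrative assumptions rather than SPOT’s actual protocol:

    #include <Servo.h>

    Servo panServo;

    void setup() {
      Serial.begin(9600);
      panServo.attach(9);                    // signal pin (assumed)
    }

    void loop() {
      if (Serial.available() > 0) {
        int angle = Serial.parseInt();       // Processing sends a target angle as text
        angle = constrain(angle, 0, 180);
        panServo.write(angle);
        delay(500);                          // crude wait for the move to finish
        Serial.println("done");              // feedback: safe to re-enable motion detection
      }
    }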

DOT

[Image: DOT]

Summary

DOT is a relatively large (about 2 feet tall) interactive installation with three concentric rings positioned within each other, affording three axes of physical rotation about a static central position. The three concentric rings are rotated by servo motors via independent commands. Two of the rings rotate about the y-axis (leftward and rightward) while one ring rotates about the x-axis (upward and downward). When all rings move simultaneously, the result is a mesmerizing combination of motions that provokes reflection on physical movement itself. DOT intentionally invokes a relation to astronomical devices, or to the rotations of the human eye. As DOT moves about, it can even give a sense of intentionality: one can relate to some of the movements it independently makes and some of the positions it finds itself in, and in some ways it ominously recalls biological motion. The speed of DOT’s motion is influenced by its one photoresistor, where more light leads to more agitated and faster movements.

Inspiration


The concept for DOT came about as I randomly stumbled onto a website focused on mechanical movement (http://507movements.com/mm_356.html). One of the mechanical devices presented on the website was Bohnenberger’s Apparatus (http://physics.kenyon.edu/EarlyApparatus/Mechanics/Bohnenbergers_Apparatus/Bohnenbergers_Apparatus.html), the device that inspired the gyroscope. It got me thinking about rotational movements in general, for example how the eye moves about within its socket. I felt that this movement pattern was unique and worth exploring, so I made some preliminary designs. The shop instructor advised me that the best and most sturdy material to use would be either metal or plywood. Working with metal would have been an exciting project, but given the looming deadline I surmised that laser-cutting plywood would give me the most satisfactory and reliable results.

When the device was complete and finally working, I could not believe how interesting the movements were to watch. The Bohnenberger apparatus was initially conceived as an astronomical device used to study rotational movement, but the movement of DOT as it behaved on its own was surprisingly organic. As the rings slowly moved into new orientations relative to each other, it became easier and easier to anthropomorphize the result. Sometimes it would move smoothly, and sometimes it would settle into what appeared to be strained and dramatic-looking positions. The final product gives me an adaptable platform for studying mechanical movement via microcontroller for the foreseeable future. Future projects may include giving DOT more sensors to guide its own movements and reduce the reliance on randomness.

Behaviour

DOT’s behaviour is less complex than SPOT’s multiple behavioural modes, but it is in many ways more mesmerizing. DOT’s three concentric rings rotate independently (except the outermost ring) and update their positions randomly relative to their current positions. Each ring moves to a new position by rotating up to 10 degrees clockwise or counterclockwise from its current position (the value is determined randomly for each servo). New movement commands are sent to DOT’s three servos simultaneously and do not update until all servos have completed their movements. The speed at which DOT moves is determined at any given time by how much light is detected by its photoresistor.

When any of DOT’s rings reaches the servo’s maximum rotation in either direction (i.e., zero or 180 degrees), DOT goes through a routine in which that ring rotates toward the opposite end of its range, stopping 30 degrees short, to prevent a series of redundant movements. While DOT is not as interactive as SPOT, which tracks your face and movements in real time, DOT’s behaviour is more independent and fascinating to watch for its own sake, though one can indeed control the relative speed of DOT’s movements. In further iterations, DOT’s central circle could be fitted with sensors to guide its behaviour, leading to a fascinating potential series of intentional rotational movements.
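
Read that way, the limit-handling rule can be sketched as a small function (plain C++ here for illustration; on the Arduino the random step would use random(-10, 11)). The 30-degree slack and 10-degree step come from the description above; the rest is assumed:

    #include <cstdio>
    #include <cstdlib>

    const int SLACK = 30;                           // degrees held back from the far limit

    // Next target for one ring's servo, given its current angle (0-180).
    int nextTarget(int current) {
      if (current <= 0)   return 180 - SLACK;       // at 0: swing toward 180, minus slack
      if (current >= 180) return 0 + SLACK;         // at 180: swing toward 0, plus slack
      int step = (std::rand() % 21) - 10;           // random step in [-10, +10] degrees
      int next = current + step;
      if (next < 0)   next = 0;                     // clamp to the servo's range
      if (next > 180) next = 180;
      return next;
    }

    int main() {
      std::printf("%d\n", nextTarget(180));         // prints 30
      return 0;
    }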

Parts

  • 3 servo motors (1 TowerPro + 2 mini servos)
  • laser-cut pieces of Baltic Birch plywood (1/8th of an inch thick)
  • 5V DC Power Supply
  • Photoresistor
  • Arduino Mega + jumper wires + copper wire + breadboard
  • Nuts, bolts, washers, spacers
  • Rubber caps

Fabrication

The body of DOT is also composed of laser-cut plywood. The main challenge of DOT was the fabrication, specifically how to have multiple concentric rings fixed within each other, each turning on its own axis via servo motor. The solution was to cut two copies of each ring, fasten them together with nuts, bolts, washers, and spacers, and fit the servo motors between them. Axles were attached to the servo motors, with another axle attached to the opposite side of each ring. The axles were made of long bolts and fitted loosely into holes at the spacers of the next inner ring. This allowed the pieces to rotate smoothly, or in this case, to be turned by the servo motors.


The wires for powering and controlling the servo motors were fitted through the space in between the rings, alongside the spacers, and fed along each axle until they reached the base, where they were connected to the microcontroller. As with SPOT, no glue was involved in this project. The servos and axle joints are held in place purely by pressure: tightening or loosening the adjacent ring pieces (held apart via spacers) allows one to fix or release the servo motors. The arms of the servo motors were connected to the axles via circular clamps or rubber caps, which fasten each servo arm to its associated axle.

Software

Compared to SPOT, DOT’s software was relatively simple. Unlike SPOT, DOT’s program could be housed entirely on the Arduino. Using the VarSpeedServo.h library, which allows you to control a servo’s speed as well as its position, each servo was given independent commands. Photoresistor levels were mapped to servo speed values ranging from very slow to a medium pace. A Boolean third argument to the library’s write function told it to wait until the movement was complete before issuing new commands. This had the effect of waiting until each servo completed its rotation before new commands were submitted.
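
A minimal sketch of DOT’s loop along these lines is shown below. The calls follow VarSpeedServo’s write(position, speed, wait) interface described above; the pin assignments, the photoresistor wiring into A0, and the 10–80 speed range are assumptions:

    #include <VarSpeedServo.h>

    VarSpeedServo ringA, ringB, ringC;          // the three ring servos
    int posA = 90, posB = 90, posC = 90;        // current targets, starting centred

    void setup() {
      ringA.attach(9);                          // signal pins (assumed)
      ringB.attach(10);
      ringC.attach(11);
      randomSeed(analogRead(A1));               // seed from an unconnected pin
    }

    // Random step of up to 10 degrees either way, kept within the servo's range.
    int step(int pos) {
      return constrain(pos + random(-10, 11), 0, 180);
    }

    void loop() {
      // More light -> faster movement. VarSpeedServo speeds: 1 = slowest, 255 = fastest;
      // the 10-80 range (very slow to medium) is an assumed mapping.
      int light = analogRead(A0);               // photoresistor reading, 0-1023
      int speed = map(light, 0, 1023, 10, 80);

      posA = step(posA);
      posB = step(posB);
      posC = step(posC);

      // Send all three commands at once; only the last call blocks (third argument
      // true), so the rings move together and no new targets are issued until it finishes.
      ringA.write(posA, speed, false);
      ringB.write(posB, speed, false);
      ringC.write(posC, speed, true);
    }

Blocking only on the final write keeps the three rings moving simultaneously while still pausing the loop until the movement finishes, which is roughly the behaviour described above.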