Category Archives: Tamagotchi Pen Pal

Hypnotoad Demands Harmony

 

HYPNOTOAD DEMANDS HARMONY

[Video]

 

WHAT DOES HYPNOTOAD WANT?

Hypnotoad is kind of complicated. When you’re happy, he’s happy. When you’re sad, he’s sad. When you and a friend are both happy, he’s ecstatic! Just don’t both be mad. Hypnotoad wants to read your mind, and he insists that everyone be happy. Remember…

Hypnotoad demands harmony…

This froggy character is the culmination of two ideas: the ambition to create a tandem biosensor that communicates back and forth between you and a partner, and a creature that mediates between two helplessly less psychic humans.

Silly humans.


WHAT IS HYPNOTOAD MADE OF?

Components

Yard Frog: Hypnotoad was upcycled from an abandoned corner of my mom’s front yard. Neglected and alone for nearly twelve years, this ancient frog gained timeless wisdom and psychic powers through a period of extended hibernation and meditation. As a partially cybernetic, water-permeable superbeing, he needed some functional repairs.

He came equipped with a microcontroller, a weather-corroded battery drawer, a raspy speaker, and an irreparably defunct motion sensor. Or light sensor. Or soul sensor. The mysterious lens lodged in his mouth was caked in years of mud and would not fire.

Adafruit NeoPixel ring: I daisy-chained a power-hungry (960 mA) ring of 16 addressable RGB LEDs, with sturdy throughputs, to a 7-pin, possibly cloned NeoPixel disk with flimsily connected flat contact pads. (I destroyed three of these disks just lightly applying solder.)

Pulse Sensor – Earring style: An open-source optical pulse sensor created by ITP and UCSD designers Murphy and Gitman. It reads your pulse based on how much light penetrates your ear lobe or finger, relative to the amount of blood coming and going moment to moment. This differs from typical heart sensors, which detect heart rate through electrical muscle activity near the heart. Unfortunately, it is sensitive to ambient light and will always return a signal. Testing was originally done on SparkFun’s dry-electrode heart sensor, which had two extra wires (LO+, LO-) designed to filter out junk data when the electrodes lost contact. It was dropped because it dropped too much data, far too often.
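The optical approach boils down to watching the analog signal for each beat and timing the gap between beats (the inter-beat interval, or IBI). A minimal sketch of that calculation in plain C++; the fixed threshold here is an illustrative assumption, since the real pulsesensor.com script adapts it dynamically:

```cpp
#include <utility>
#include <vector>

// Convert an inter-beat interval (IBI) in milliseconds to beats per minute.
int bpmFromIBI(unsigned long ibiMs) {
    if (ibiMs == 0) return 0;       // guard against divide-by-zero
    return 60000UL / ibiMs;         // 60,000 ms per minute
}

// Scan (timestampMs, analogValue) samples and return the BPM computed
// from the gap between the last two rising-edge threshold crossings.
int bpmFromSamples(const std::vector<std::pair<unsigned long, int>>& samples,
                   int threshold) {
    unsigned long lastBeat = 0, ibi = 0;
    bool above = false, haveBeat = false;
    for (const auto& s : samples) {
        if (!above && s.second > threshold) {  // rising edge = one beat
            if (haveBeat) ibi = s.first - lastBeat;
            lastBeat = s.first;
            haveBeat = true;
            above = true;
        } else if (s.second <= threshold) {
            above = false;
        }
    }
    return bpmFromIBI(ibi);
}
```

On the real sensor this runs inside a 2 ms interrupt rather than over a buffered vector, but the arithmetic is the same.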

Grove GSR: Originally based on lie-detector polygraph technology, this Arduino-specific sensor measures electrodermal activity (EDA) to detect moments of stress and excitement through changes in skin conductance. Testing was originally done on a Bitalino™ microcontroller sensor board, but the EDA data Bitalino generated rolled through broad changes in stress slowly, while Grove’s sensor picks up moment-to-moment changes.

Arduino Micro Pro: I tried out a variety of Micro Pro microcontrollers, including two clones (Funduino and __), a Trinket, and the original variety. Other than a few offset analog pinholes and messily scattered labels that were only visible from the bottom, the cheap Micro Pro clones worked exactly like the real deal, which in turn worked just as well as their more popular cousin, the Uno. The main difference from an Uno is that the Micro comes pre-soldered with male pins, which you’ll need to mate to a breadboard or female jumper cables to make use of; this costs you some of the size the Micro saves.

Green EL wire: Electroluminescent wire, which apparently gobbles a dangerous 120 volts straight from the wall if your microelectronics store doesn’t remind you to buy an inverter. Buy an inverter. Just do it.

Woodcut letters, letter stickers, foam core boards, a cafe table, folding chairs, 3 sizes of hobby boxes, fake pond flowers, green modelling clay, green glitter paper, green aquarium stones, green mesh ribbon, green foam frog eggs, green and black fingerless gloves, un-sewable green scaly embossed cowhide, a tiny purple dog cape.

Code

VarSpeedServo Library: This library let me control two servos independently and, most importantly, run them in parallel with a simple true/false parameter. I wish I could say multithreading the other components on Hypnotoad’s Arduino was as easy.

Adafruit_NeoPixel Library: This library lets you control multiple strips of NeoPixels from the same controller, as long as you create a unique object for each one, or daisy-chain them all together and tally up your total LED count.

Pulsesensor.com’s Interrupt: Not a library per se, but the script on the pulse sensor’s website is indispensable if you’re not familiar with introducing interrupts to your main script. I wasn’t, and I didn’t fully understand the components of the interrupt, so I discovered the hard way that anything you add to these functions out of logical necessity will run much faster than everything else: the interrupt pre-empts the main loop() every 2 milliseconds.


 

HYPNOTOAD’S DEMANDS

(Functionality & Logic)

 

A 2×2 grid of each of the four conditions I might come across in comparing live biodata of two people

Hypnotoad responds differently, through short and quick ribbits, relative degrees of rotation, and glowing eyes, to express what he likes and dislikes.
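The 2×2 grid above reduces to a simple decision table. A sketch of that comparison logic in plain C++; the state names and reaction strings are paraphrased from the behaviours described in this post, not taken from the actual sketch:

```cpp
#include <string>

enum Mood { HAPPY, UPSET };

// Map the four mood combinations of two partners to Hypnotoad's reaction.
std::string hypnotoadReaction(Mood a, Mood b) {
    if (a == HAPPY && b == HAPPY)
        return "green eyes, full croak, hops";        // harmony achieved
    if (a == UPSET && b == UPSET)
        return "red eyes, stuttered croaks, waggles"; // harmony violated
    // Mixed moods: Hypnotoad turns toward the more anxious partner.
    return (a == UPSET) ? "turns to partner A" : "turns to partner B";
}
```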

3 different configurations of microcontrollers

This rough diagram shows the three options I had for arranging one or three wearables, with either concealed cables or a Bluetooth module in every device. In the second option, both displays are slaves communicating with the master creature controller by Bluetooth, as are the untethered wrist sensors and pendant controllers.

See Hypno in action (IMG_0419)


 RESTORING HYPNOTOAD

(The Build Process)

I spent weeks testing all of the biosensors and trying to daisy-chain the cheap NeoPixels to the 16-pixel rings. I wired the necklace-suspended rings directly in front of a Micro Pro, and ran signal, ground, and voltage down arm-length cables to the pulse sensor and GSR finger-cuff electrodes. After heat-shrinking the cables in place, I realized I was going to need to run cables back up to the sensors and ring instead, coming up from a single hidden controller board under his throne. This was necessary because networking three Arduinos with alternating roles was more trouble than controlling everything from one.

Original necklace prototype with wires bundled incorrectly

Ultimately, the necklace and Arduino Micro Pro should fit into a case that looks like this…

Necklacle_cropped

I throttled the NeoPixels’ activity to stay within the Arduino’s power limitations, and turned both displays into separately instantiated objects of the same Adafruit library.

The next step was completing the creature assembly. I knew I was going to:

  1. Fix, replace, or augment the frog’s internal motion sensor
  2. Make the frog jump
  3. Add lights, servos, and metal components to the frog chassis, adding to its weight

I had to figure out a way to rotate the frog to look at you without exceeding the amount of torque the servo could handle before breaking. I bought a ball-bearing assembly sandwiched between two plates, and bolted the frog to existing holes on the plate that happened to line up with speaker holes in the frog’s base.

I picked a small round servo blade that could pass through the center of the ball-bearing assembly, protruding enough to clear the top opening. I bolted a rebar across the top assembly plate, drilled two holes into the sides of the plastic blade to mount it to the rebar, and then screwed the blade into the servo itself (it needed two screws so that the blade wouldn’t unscrew itself).

Ballbearing assembly with crossbar to spin the mount plate with the servo blade, independently

On the bottom of the ball-bearing assembly I attached two C-shaped metal bars, with only one screw each, to allow them to turn freely. Then I drilled holes in their sides, sandwiched my servo between them, and tensioned two springs across the servo, one to each bar. This kept the servo from twisting itself out of alignment while still being mounted independently of both the frog and the case the frog would sit on. It also gave me backup space in case I needed to change the servo to a smaller or larger size later on.

 

A large servo spinning the ballbearing frog mount, and the smaller “jump” servo

With the mounting assembly finished, discreet enough to hide under the frog but strong and modular enough to take any load, mount any servo, and mount anywhere, I moved on to the second servo.

I mounted the servo inside the frog, on top of its battery drawer, fixed a v-shaped lever to the servo blade, fed the second arm through another hole in the frog’s base, and made a simple kick for jumping.

The corroded guts (and power supply) of the frog

Hypnotoad’s Stomach

I bypassed the frog’s previous power supply entirely and wired directly to its board. I considered doing a back-and-forth analog-to-digital and digital-to-analog recording of a frog croaking, but I knew the frog had a prerecorded ribbit sequence. Sadly, the motion sensors were clogged with mud and damaged somehow. I debated splicing wires from the Arduino’s digital pins to the output of the two sensor wires and faking a signal, but there was a chance that the motion sensor produced a specific voltage level or pattern.

I realized the frog made a special croak, independent of the sensor, every time he boots up. So rather than messing with the sensor, adding a waveshield, or tackling the existing board and processor, I decided to exploit Hypno’s boot cycle. All I needed to do was briefly cycle power for single ribbits, and give him a lengthier cycle of power for his full croak. This provided two sounds for two different contexts. At slightly below nominal voltage, I found I could modulate and deepen his croak, opening up a third possible croak.
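Because the croak is tied to the boot cycle, “playing a sound” reduces to choosing how long to hold power on. A tiny sketch of that mapping; the durations are illustrative guesses, not measured from the toy:

```cpp
// Croak selection by power-on duration: a short power cycle yields a
// single ribbit, a longer one lets the full boot croak sequence play out.
enum Croak { SINGLE_RIBBIT, FULL_CROAK };

// Hypothetical durations in milliseconds; the real values would be
// found by experimenting with the frog's boot sequence.
unsigned long powerOnMs(Croak c) {
    return (c == SINGLE_RIBBIT) ? 300UL : 2500UL;
}
```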

The frog had plastic eyes, with unretractable pegs blocking a perfectly usable hole for LED eyes. I melted through these with my soldering iron, daisy-chained two single-LED NeoPixels, and gave Hypno indicator eyes.

The eyes were epoxied in place. I melted them out and flat, so the LEDs could color his glass eyes

I built a small protoboard inside the frog’s body to combine the grounds and voltages of the jump servo, eyes, and croak, feeding them out the bottom with a ribbon cable through a safe hole in the ball-bearing plates.

I used the VarSpeedServo library to create three motion profiles.

  1. The frog turns and looks at the most anxious person in the pair
  2. The frog points in between the two friends and waggles if both people are upset (his eyes glow red, and he stutters short croaks)
  3. The frog points in the center of the two friends and hops up and down if both people are happy (his eyes glow green, and he does his full cycle of 3 uninterrupted croaks)

The jumping mechanism worked fine, but required long, unanchored bolts going into Hypno’s chassis. I considered grinding down the last half of the bolts so that he couldn’t snag them on the (speaker) mounting holes at the top of a jump.

“Option Three”

My first and largest coding hurdle, passed on from the Golf project, was communicating from necklace to necklace, and between necklace and frog, by Bluetooth. I had already used Bluetooth directly, from Arduino to computer IDE, in previous projects to upload scripts wirelessly and print data. Networking three devices to both send and receive data from one another, alternating AT roles between master and slave… I decided to avoid this early on. It had already failed once with the HC-05 modules for Golf.

The second option involved combining four sensors, two vibrating motors, two ring displays and a frog display, two servos, and the croak, all on the same board. This worked well until I realized I had created my fluttery light show with fixed delays and an interrupt timer for the heart sensor. Timing two pulse sensors and performing the inter-beat calculations in the same interrupt, while also timing two Arduino displays, proved too difficult to program procedurally without creating my own classes. I needed to run two instances of the same class, with independent timers, at the same time: multithreading. Arduino isn’t meant to do multithreading.

Relative heart rates. One male, one female, differing in weight by 70 lbs. Values are not directly comparable.

A third option came to me. I could compare the two heart rates in their raw data, average or select from them conditionally, and just display one rate. This needed only one interrupt service routine and would output the same values to both necklaces. Even if I couldn’t time both separately, I could already treat the colour of each necklace separately, letting me indicate the dominant beat.
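Option three’s comparison can be sketched as one small helper: take both raw rates, pick a single value to display, and remember whose beat dominates so the necklace colours can differ. The names and the simple averaging rule here are illustrative, not from the actual script:

```cpp
#include <utility>

// Combine two raw heart rates into the single rate both necklaces show,
// plus a flag for which partner's beat dominates (to colour the necklaces).
std::pair<int, bool> selectDisplayRate(int bpmA, int bpmB) {
    bool dominantIsA = bpmA >= bpmB;    // whose beat leads right now
    int displayed = (bpmA + bpmB) / 2;  // simple average of the raw rates
    return {displayed, dominantIsA};
}
```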

Sadly, my script broke. The previously functional interrupt suddenly decided it was “not declared in this scope”. I debated returning to a single display, as I had already accomplished one before, but I hadn’t saved enough incremental versions of my script between major changes to recover all of the working functions I had added.

Hypnotoad awaits the day he can sit and judge us.


 CIRCUIT DIAGRAM

(Coming)


 

CHALLENGES

Millis: The death of multithreaded timing in Hypno’s code came from using a variable delay, with the interval matched to the interval between real beats. The next iteration will use an increment with identical timing criteria for the first 12 loops, and a variable time condition to meet before restarting the pulse. Both displays can have their conditions nested in the same loop.
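The fix described above is the classic millis() pattern: instead of delay(), compare elapsed time against each display’s own deadline so both can be serviced from one loop. A plain C++ sketch with the clock injected for testability; on the Arduino, `now` would come from `millis()`:

```cpp
// One non-blocking timer per display: each timer is polled with the
// current time and fires when its own interval elapses, then rearms.
struct PulseTimer {
    unsigned long last = 0;
    unsigned long interval;  // matched to the inter-beat interval

    explicit PulseTimer(unsigned long ms) : interval(ms) {}

    // Returns true once per elapsed interval.
    bool due(unsigned long now) {
        if (now - last >= interval) {
            last = now;
            return true;
        }
        return false;
    }
};
```

Each necklace gets its own PulseTimer, so one fast heartbeat no longer blocks the other display’s animation.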

Corrosion: Determining why the frog wouldn’t power up on its existing system took multimeter testing from the battery terminals to the motherboard (the battery drawer terminals had corroded in the rain).

Copper Traces: I spent more time than necessary trying to learn how to fix lifted solder pads. I scraped, sanded, and alcohol-washed the exposed metal, then tried using plumber’s solder and other metals to wet the bare copper. I found an obscure and offbeat YouTube video about replacing traces with rare conductive epoxy and not-so-rare copper sheets. Then I gave up.

Voltage Shortages & Simple Solutions: I kept getting an inadequate 2 volts from the Arduino’s digital pin outputs while trying to feed Hypnotoad’s hardware 5 or more. I spent half a day failing to step it up with every combination (or combined rating) of diodes, capacitors, and transistors. I went as far as using a reed switch supplying 5 volts directly, with the 2 V digital pin powering the switch, and still couldn’t trigger the delivery of 5 volts out the other end. It turned out I was missing the pinMode(pin, OUTPUT) command; a one-minute fix.

The reed switch takes voltage through an internal coil to magnetically pull a broken gate closed and let supply voltage through. It didn’t work. Neither did the TIP120 or 121. Two volts always came out.
Diodes filtered voltage fluctuations that gave the frog inconsistent power and modulated croaking. The 2020 transistor tried to regulate 5V from a digital pin.

Hypnotoad’s Throne: Figuring out how to rotate the frog without breaking the servo or snapping a cord, and without mounting the entire assembly off the fragile servo blade, was tough. The mounting assembly alone took two days to figure out, build, and calibrate.

 

Furgie and Furgurt: Virtual Cuddling Pets

Overview

Furgs_overview_diagram

Meet the Furgs – Furgie and Furgurt! The Furgs love to cuddle, and that is their main purpose: to allow partners to cuddle virtually. By hugging one Furg, you create vibrations within your partner’s Furg. When both partners hug their Furgs at the same time, the Furgs vibrate with greater intensity and purr. In this way, hugging your new furry friend allows you to feel the vibrations of your loved one, from anywhere in the world.

Watch the video:

 

Structure

The Furgs are made with the following:

Sensors: Bend sensors are used to enable the detection of hugging.  If a sensor is bent past a pre-determined threshold, a hug gesture is detected and the actuators are triggered.

Actuators: Multiple vibration motors and a buzzer motor are sewn into the inner layers.  When Furg A is hugged, the first vibration motor of Furg A and Furg B vibrate, and the buzzer starts making a purring sound.  When Furg A and Furg B are hugged at the same time, all vibration motors on both Furgs vibrate – the Furgs vibrate with greater intensity.
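The sensing-and-response logic above is essentially two booleans. A sketch in plain C++; the struct, threshold, and field names are illustrative assumptions, not the project’s actual code:

```cpp
// Outputs driven on one Furg in response to both bend-sensor readings.
struct FurgOutput {
    bool motor1;     // first vibration motor
    bool allMotors;  // full-intensity vibration on every motor
    bool purr;       // buzzer purring
};

// A bend reading above the threshold counts as a hug. Furg B reacts to
// Furg A's hug, and both intensify when the hugs are simultaneous.
FurgOutput respond(int bendA, int bendB, int threshold) {
    bool hugA = bendA > threshold;
    bool hugB = bendB > threshold;
    return { hugA, hugA && hugB, hugA };
}
```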

Boards: Each Furg runs off its own Arduino board and its own LilyPad board. Two boards are included in each Furg to provide enough pins for future functionality (described below). The Arduino and LilyPad boards communicate over a SoftwareSerial channel.

Connectivity: The Furg Arduino boards are linked via a SoftwareSerial channel. This was done for demonstration purposes only; future versions will include WiFi chips, so that two people in different locations can hug via the Furgs.

Note that Bluetooth chips were used initially, but they proved unreliable, often dropping pairings and/or running with inconsistent timing. Sometimes the Furg communications would transmit instantly, while at other times they would take several seconds. As a result, Bluetooth was dropped.

Materials: The Furgs are built around an egg-shaped stuffed-animal core. Their custom outer layer is made of very soft, shaggy fake fur. The ears are made of another type of fake fur that resembles cheetah fur, sewn together with the shaggy fur. Fabricating their shape was the most time-consuming process of all. (Sewing is not one of my top skills, but it was great fun giving the Furgs their playful form.)

 

Core Functionality

Furgs_functionality_grid

 

 

Code

Code for the project is available here: https://github.com/jamesessex/TheFurgs

 

Future Functionality

Although I believe that the ability to hug and cuddle virtually is awesome, I see a lot more potential in the Furgs and will be building out more features.  In addition to cuddling, multiple gestures will be added that enable virtual play fighting.

There will be no rules built into the pets, only a rich set of gestures. Users can then make their own games out of the gestures. For instance, ear tugging and belly poking will be enabled. Imagine a girlfriend poking the belly of her boyfriend. The boyfriend can poke back, or tug an ear, or send a playful sound or gesture (by selecting one from an LCD screen that will be added to the Furg bellies). Users can create their own playful rituals around the Furgs, and can change and evolve those rituals as they please.

An initial set of features to add include:

  • Rich sound set
  • Head petting
  • Belly rubbing & poking
  • Nose touching & nose rubbing
  • Ear tugging
  • Jiggle and bounce
  • LCD screen – displays animations like heart bursts, and provides a menu that enables sending of messages, sounds, and animations


Hope: The bird comes alive as you approach

vlcsnap-2015-12-15-11h00m59s4051 vlcsnap-2015-12-15-11h01m59s573

Overview:

My project is a bird which reacts to human presence and evokes a certain emotion in the viewer.

Inspiration: Every day we see birds around us, but we hardly notice them or stop to think about how they feel or what they think. The main motive of this project is to make people notice the beautiful environment around them. When we try to control the environment, it affects the creatures living in it. Thus the cage is a symbol: a metaphor used to depict how we have all, knowingly or unknowingly, controlled the environment surrounding us.

1989818538_129306bb8e_b

Interaction Model:

Stage 1: When no one is around, the bird shivers and looks around for someone to rescue it.

Stage 2: As a person approaches, the bird happily flaps its wings and moves its neck, hoping it is about to be rescued.

Stage 3: When the person departs, the bird is sad again and awaits someone to approach it.

DSC_6044 - Copy-min DSC_6052 - Copy-min

Attempt:

I basically tried to capture the natural movement of a bird: the way it reacts when it is happy, frightened, or sad. There were three movements involved, both wings and the neck; hence, three servos were controlled by a sonar sensor.
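The stage selection maps directly onto the sonar reading. A minimal sketch of that decision in plain C++; the detection range is an illustrative assumption:

```cpp
enum BirdStage { SHIVER_AND_LOOK, FLAP_HAPPILY };

// Choose the bird's behaviour from the sonar distance in centimeters.
// Someone within range: flap wings and neck; otherwise shiver and look around.
BirdStage stageFor(int distanceCm, int rangeCm) {
    return (distanceCm > 0 && distanceCm <= rangeCm) ? FLAP_HAPPILY
                                                     : SHIVER_AND_LOOK;
}
```

The three servos would then run the movement profile for whichever stage is returned.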

GOPR0089-min

Challenge:

Challenge: The most difficult part was crafting the bird. The look had to be very natural, beautiful, and pleasing to the eye in order to evoke an emotion in the user; the bird could not have a rusty, grungy look. The hardest task was hiding the three servos while still controlling the movement. It should not be fragile, but it should look delicate.

DSC_6177-min DSC_6257-min DSC_6115-min DSC_6334-min DSC_6285-min DSC_6300-min

Appearance:

I crafted the complete bird in pure white foam board, joining the pieces to give it a stylized look. The cage was purposely black and dark to create contrast and evoke an emotion.

Taking this forward, I would like to add more gestures and movements to my bird, which will make it more realistic. Sound would add even more. The neck could also move up and down with the addition of one more servo. So my next steps are to explore and try out new possibilities to make the project better.

The project was really inspiring, since it required all three skills: observation, articulation, and execution.

Thank You.



DO NOT DISTURB

DSC_6079-min

 

This interactive public installation was a project done as part of Creation & Computation in Digital Futures.

The process started with brainstorming. The aim of the interactive installation was to create a character that reacts to its environment and to the people interacting with it. At the very beginning I thought about the different emotions and reactions people normally deal with, and later I selected disturbance as a metaphor to start with.

iStock_000011513482Large-min

Reading mode is the best state of mind, one in which a person is in their own world of imagination. Being disturbed by people, noise, or any other interruption is frustrating, and no one likes it.

DSC_6233-min

DSC_6197 - Copy-min

The reaction of the audience to the installation was amazing.


 

 

 

Tamagotchi – Halloween Dog Ghost

My Tamagotchi project is literally a mechanical animal toy: a dog ghost who responds to people around him and reacts when they try to pat him.

tomagochi

It may seem strange, but the source of my inspiration is Tamagotchi itself. I owned a toy like this when I was a little kid, and I enjoyed the little digital pet since I couldn’t own a real one. That’s why I wanted to create something similar to a toy animal, but with a funny twist.

9382383_orig

The major design of my Tamagotchi includes two parts: the sonar sensor and light sensor, which detect motion around the toy, and the servo on top of the doll, which makes the doll vibrate and turn around as programmed.

Design In Process

My original design was quite different from the final one. It included:

-a sonar sensor array which can detect people approaching from three directions

-an MP3 shield which plays sound effects when needed

-a color sensor which can mimic the color of people’s clothes with an RGB LED

-a piezo element which detects tapping

 

Unfortunately, most of them were discarded for the following reasons:

-the MP3 shield somehow slowed down the whole program, which resulted in the sonar sensor not detecting approaches fast enough

-the color sensor can only read colors very close to it, which makes it impossible for it to function as an “eye” observing color from a relatively far distance

-the piezo element detects tapping over a large area rather than within a specific spot, which makes the idea of “tapping the right spot” meaningless

-the sonar array actually worked fine as expected; however, the idea of the doll “seeing” people approach and turning away from them always seemed to be missing some piece

 

Eventually it occurred to me that, rather than waiting for the doll to turn back and face the audience again, it would be a lot more interesting and funny if the doll actually invited people to tap it, and even better if the audience had to chase the patting spot, since the doll keeps turning away to evade their hands. The invitation to interact was the key to turning the doll into a robot.

20151210_115046

Diagram

The code itself is fairly long so it might be better if I just explain the logic behind the code here:

1) The sonar sensor keeps checking whether anything is approaching, while the servo constantly vibrates as if the ghost dog is slightly shaking

2) If anything enters the sonar’s detection range, the servo turns to a random direction

3) If the object (an audience member’s hand, in this case) follows the sensor, the servo turns in another direction

4) If the light sensor is tapped and can no longer detect light, the green LED lights up and the servo turns back to its original position. The sonar sensor is turned off.

5) After 3 seconds, the servo and sonar sensor come back online
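Those five steps form a small state machine. A sketch in plain C++ with the sensor reads abstracted into booleans; the state names, thresholds, and random turning are simplified assumptions rather than the project’s actual code:

```cpp
enum GhostState { IDLE_SHAKE, EVADE, PATTED };

// One step of the ghost dog's logic.
// sonarClose: something inside detection range; dark: light sensor covered.
// pattedAt records when the pat happened so the 3-second timeout can expire.
GhostState step(GhostState s, bool sonarClose, bool dark,
                unsigned long now, unsigned long& pattedAt) {
    switch (s) {
        case IDLE_SHAKE:
            if (dark) { pattedAt = now; return PATTED; }  // pat spot found
            return sonarClose ? EVADE : IDLE_SHAKE;
        case EVADE:  // keep turning away while being chased
            if (dark) { pattedAt = now; return PATTED; }
            return sonarClose ? EVADE : IDLE_SHAKE;
        case PATTED: // sonar off; come back online after 3 seconds
            return (now - pattedAt >= 3000) ? IDLE_SHAKE : PATTED;
    }
    return IDLE_SHAKE;
}
```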

Conclusion

Although the design of this project was altered a few times in the process, the final result functions quite close to what I originally wanted to create: a machine pet which reacts to people’s behavior, just as a traditional Tamagotchi does. From the feedback I received during the exhibition, I believe the audience did enjoy playing with it. I had a lot of fun trying to figure out the best way to achieve the result I wanted, and the final result is satisfying. There are two things I would like to change, though:

1) I hope I can find a way to make the MP3 shield work. It would be even funnier if the ghost dog could actually talk while moving.

2) The fishing line only emerged as a problem after testing was almost done. The robot was designed to hang from the ceiling by a fishing line, with the sonar sensor placed so that audience hands would be detected when they reached for the “pat spot” from the same level. However, after the fishing line accident (which ruined my first prototype), I switched to a clamp and mounted the doll between two chairs. Naturally, it ended up positioned a lot lower than I expected: people have to crouch to be detected by the sonar sensor, and if they try to tap the doll from above, the servo does not turn at all.

In future projects, I will make sure more testing is done before the final deadline to avoid these kinds of problems. Overall, the project was enjoyable and educational. Thank you!

Tamagotchi Project – Amaze, an autonomous turtle robot

tamaPosterupsideDown

Video to be uploaded…

Overview

Amaze is an autonomous robot with four robotic legs, mimicking the movement of a turtle. When it detects an obstacle, it gets scared and moves away from it, but there is a chance it becomes bold and dashes into the obstacle instead. An open-ended maze sits at the center of the top of its body; with the robot’s shaky movement, a ball moves around within the maze.

Emotion and Movement Patterns

  • Calm, moves forward/left/right randomly
  • Cautious, looks around to the left and to the right after each movement
  • Panic, simultaneous movement of fore limbs and rear limbs at a high speed
  • Scared, slowly makes a left or right U turn and moves away at a high speed
  • Bold, slowly makes a left or right U turn but turns around half way and dashes into obstacle

(Refer to Implementation -> Algorithm for more information)

Design

Behavior / Emotion

The turtle robot reacts to obstacles and shows fear, cautiousness, or calmness through a sequence of movements with variable directions and speed.
My original idea was to trigger some behaviors (of the turtle robot) if the maze is being solved. If the “egg” of the turtle gets lost in the maze, the turtle moves furiously. But when the “egg” is returned to the “ending point” of the maze, the turtle calms down and moves slowly again. In that scheme, the emotional feedback is based on the maze instead of the external environment or humans. So I decided to use distance sensors as input from the surroundings to allow more interactivity.

Kinematic / Kinetic Mechanism

To mimic four-legged locomotion, I initially experimented with four micro servo motors with extensions. They didn’t move effectively because the forward sweep of the servo arms countered the backward sweep to some extent. I decided to create four limbs, each with two parts and one joint, where each lower limb is limited to swing within approximately 90 degrees, like human limbs. I also researched turtle locomotion patterns and decided to go with what is called hatchling terrestrial locomotion (diagonally opposite limbs moved together), found among most sea turtles (Lutz and Musick). This pattern works well with the limb design. There are alternative gaits, such as a crawling type, or eight-legged locomotion like a spider’s. One possibility is to extend the servo arms with turtle flippers to enable swimming.

(Lutz, P. L., Musick, J. A., & Wyneken, J. (Eds.). (2002). The biology of sea turtles (Vol. 2). CRC press. Retrieved from http://www.science.fau.edu/biology/faculty/Wyneken/DOC050817-012.pdf)

Appearance / Material

In order to mimic the appearance of a turtle, I focused on mimicking animal limbs and the shell. I intended to build everything out of bamboo, which might suggest natural life forms, but since bamboo is hard to cut and sand, I went with wood for the limbs and base.

Interestingly, I found a plastic food platter that has the perfect shape and size for the base of a turtle robot. Without that, I might have had to handcraft the base out of wood or bamboo.

For the maze, I tried heating bamboo strips to create curvy maze walls, then found it more effective to heat acrylic and bend it manually.

For the shell, I initially wanted to make a dome with turtle shell texture. Since the texture may block the sight of the maze inside, I decided to go with a clear plastic dome.

Implementation

Hardware:
Arduino Mega 2560
Mini Breadboard x 2
Toggle Switch x 1
9g Micro Servo x 4
Ultrasonic Sensor
5V extended battery for smartphone

turtleInteriorBlog

Code:
https://github.com/Minsheng/tamagotchiturtlebot/tree/master/turtlemotion

Algorithm:
The following types of emotions/movement patterns are implemented:

  • Calm, default pattern in each cycle
  • Cautious, triggered after fear level reaches a threshold (currently 3)
  • Panic, triggered if distance from obstacle < 10cm
  • Scared, distance from obstacle < 40cm, probability of event set to 70% chance
  • Bold, distance from obstacle < 40cm, probability of event set to 30% chance

fearLevel: a variable that keeps track of the number of times Panic or Scared mode is triggered. If it reaches a threshold, Cautious mode is activated and the variable is reset.

becomeCautious: if set to true, the turtle robot moves to the left and to the right sequentially after a random move. It resets after the robot has been cautious for a certain number of cycles in the main loop.

To make the turtle react more slowly, the previous and current distance readings are both checked whenever an obstacle is detected. Only when both values cross a threshold does the emotion pattern change.
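The emotion selection and two-reading debounce described above can be sketched roughly as follows. This is a minimal model, not the project's actual source; names like `pickEmotion` and `FEAR_THRESHOLD` are my own, and the real sketch would call this from `loop()` with fresh sonar readings.

```cpp
// Illustrative model of the turtle's emotion logic (names assumed).
enum Emotion { CALM, SCARED, BOLD, PANIC, CAUTIOUS };

const int FEAR_THRESHOLD = 3;   // Cautious mode after 3 frights
int fearLevel = 0;
bool becomeCautious = false;

// prevDist/currDist are the last two sonar readings in cm; randPercent
// is a 0-99 roll deciding Scared (70%) vs Bold (30%).
Emotion pickEmotion(int prevDist, int currDist, int randPercent) {
    if (becomeCautious) return CAUTIOUS;
    // Require BOTH readings to cross the threshold, so one noisy
    // reading doesn't make the turtle flinch.
    if (prevDist < 10 && currDist < 10) {
        if (++fearLevel >= FEAR_THRESHOLD) { becomeCautious = true; fearLevel = 0; }
        return PANIC;
    }
    if (prevDist < 40 && currDist < 40) {
        if (randPercent < 70) {
            if (++fearLevel >= FEAR_THRESHOLD) { becomeCautious = true; fearLevel = 0; }
            return SCARED;
        }
        return BOLD;
    }
    return CALM;
}
```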

Library used:
NewPing (http://playground.arduino.cc/Code/NewPing)
VarSpeedServo (https://github.com/netlabtoolkit/VarSpeedServo)

Challenges / Resolution

  • achieving smooth movement / added two wheels at the bottom of the base and adjusted the limbs' length
  • synchronizing multiple servo motors at variable speeds / used VarSpeedServo library functions; always wait for the slowest servo to complete before setting the next positions for the others
  • failed to implement backward movement / let the robot make a U-turn instead
  • unable to upload programs to the Arduino Nano at some point, even after reburning the bootloader / replaced it with an Arduino Mega 2560
  • working with two or more ultrasonic sensors / used one to detect the front
  • determining where to showcase the robot / let it go and pray no one steps on it
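The "wait for the slowest servo" rule above amounts to figuring out which move takes longest before issuing the next set of targets. A minimal sketch of that idea (the helper name and degrees-per-unit-time model are my own; the actual project used VarSpeedServo's blocking write):

```cpp
#include <cstdlib>  // std::abs

// Returns the index of the servo whose move takes longest, modeling
// travel time as degrees-to-travel divided by speed. Waiting on this
// servo guarantees all others have already finished.
int slowestServo(const int current[], const int target[],
                 const int speed[], int n) {
    int slowest = 0;
    double worst = 0.0;
    for (int i = 0; i < n; ++i) {
        double t = (double)std::abs(target[i] - current[i]) / speed[i];
        if (t > worst) { worst = t; slowest = i; }
    }
    return slowest;
}
```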

 

 

Curious


Curious is an interactive installation that introduces viewers to the idea of technology having its own personality. This behaviour is achieved by making Curious interested in, yet timid around, the viewers observing it within the gallery. Curious is composed of three main sections. The first is the frame, suspended a couple of feet below the ceiling, which holds the majority of the electronics along with the two required stepper motors. The second is a horizontal light bar suspended by two cables, one at each end, each attached to a stepper motor located directly above the light bar on the installation frame. The final section is the sonar sensor array, which consists of 5 sensors located a couple of centimetres off the ground.

In its default state the light bar hangs a few feet below the installation frame, glowing from the one-metre, 60-pixel NeoPixel strip fed through the acrylic tube. The light bar creates light patterns to try to generate interest from people in the gallery; this is the installation's way of beckoning viewers towards itself. When a viewer comes close enough, the light bar starts to mimic their movement as they move from its left side to its right, essentially trying to bond with the user through imitation in motion. If the viewer gets too close, Curious becomes nervous and scared: it fades its lights completely off and raises itself back up to just under the suspended frame, trying to hide from the unfamiliar creature that is the gallery viewer. Finally, when the viewer backs away, Curious lowers itself again in an investigative manner, dipping one side of the light bar and then the other, animating the lights back and forth as it descends at a slow, consistent speed while the LEDs perform a number of creative light patterns.
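The beckon/mimic/hide behaviour described above boils down to a distance-driven state machine. A minimal sketch, with thresholds and state names that are my own assumptions rather than the installation's actual values:

```cpp
// Illustrative mapping from viewer distance to behaviour state.
enum Mood { BECKON, MIMIC, HIDE };

Mood moodFor(int viewerDistanceCm) {
    if (viewerDistanceCm < 50)  return HIDE;   // too close: lights off, raise the bar
    if (viewerDistanceCm < 200) return MIMIC;  // close enough: follow the viewer's movement
    return BECKON;                             // default: inviting light patterns
}
```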


Features

The frame that holds all of the electronics hangs from the ceiling, and there is no guarantee of what the height of the gallery ceiling will be. This made it pivotal to devise a way to adjust and configure Curious to behave correctly despite these uncertainties. To deal with not knowing how high the frame would be off the ground, and how far the motors would have to raise and lower the light bar, I created different modes in the Arduino logic to set the light bar's starting position and to adjust the steps for each motor in case they went out of sync. This communication happened over the serial port or through the HC-05 Bluetooth module. With this configurability in a “Configure Mode”, I was able to wirelessly control the motors individually to correct any errors preventing the light bar from sitting completely horizontal in its default state. Another state controllable via serial or Bluetooth changed how the installation behaved depending on how busy the gallery was. There is a low-traffic and a high-traffic setting: the low-traffic mode behaves as described above, with Curious exhibiting a pseudo-personality, while the high-traffic setting holds the light bar at a static height and illuminates it with glowing, rotating colours animating from left to right.
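A configure mode like this typically boils down to dispatching on single-character commands received over serial or Bluetooth. The command letters and fields below are illustrative assumptions, not the project's actual protocol:

```cpp
// Hypothetical single-character command handler for a "Configure Mode".
struct Config {
    int leftSteps = 0;       // manual step correction, left motor
    int rightSteps = 0;      // manual step correction, right motor
    bool highTraffic = false;
};

// Apply one received command byte (from Serial.read() or the HC-05).
void handleCommand(char cmd, Config &cfg) {
    switch (cmd) {
        case 'l': cfg.leftSteps  -= 1; break;  // nudge left motor down
        case 'L': cfg.leftSteps  += 1; break;  // nudge left motor up
        case 'r': cfg.rightSteps -= 1; break;
        case 'R': cfg.rightSteps += 1; break;
        case 'h': cfg.highTraffic = true;  break; // busy-gallery mode
        case 'q': cfg.highTraffic = false; break; // quiet-gallery mode
    }
}
```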


Prototypes

When coming up with the form and design of Curious I went through more than a couple of iterations and revisions. I sketched out my ideas and got feedback from classmates and friends on which they preferred. Below are some selected sketches to illustrate my thought process on the design before I started manufacturing the installation.

[Photos: selected design sketches]

Challenges

The main challenges I faced when working through the different iterations and stages of prototyping and developing Curious were:

  • getting the proper voltage for the NeoPixels to run correctly
  • getting the correct voltage and current for the stepper motors to work properly without burning out the Adafruit Motor Shield and its integrated circuits (ICs)
  • the weight and flexibility of the cables when winding up the light bar
  • the stepper motors' axle diameter compared to the diameter of the axle hole in the pulley
  • reading and storing returned distances from the five sonar sensors on the sonar bar

Solving the voltage issues with the NeoPixels and the stepper motors took a lot of reading and testing of different circuits. I had to use a 1000uF capacitor for the NeoPixels to work correctly. With the stepper motors, I originally tried driving them at 5 volts and 1 amp, but that was too weak; I needed more power. I then tried 9 volts and 2 amps, which also didn't work, since the motors needed exactly 5 volts. Given that requirement, I started using a voltage regulator, which got 5 volts to the motors, but the regulator overheated and would only output 1 amp, which, as mentioned above, wasn't powerful enough. The solution was a different power supply that output 5 volts and 3 amps. Once I had this piece of the puzzle working, I was in business. Even at 3 amps there was still a little motor slippage, which I suspect is because the motors could accept more current than the 3 amps my motor shield can handle, even after I added heat sinks to the shield's motor driver ICs.

In the next iteration of Curious I would like to use smaller but more powerful motors that also draw fewer amps. With what I learned about stepper motors, I could greatly improve how they are integrated into the installation; for this version I had to spend a lot of time figuring out the correct power and configuration to control them.

It was a challenge to connect the large-diameter pulleys to the very small axles of the stepper motors. Initially I wrapped tape around each motor axle to increase its diameter, but this came loose after the motors had been running for 10 to 15 minutes: as the motors heated up, the glue on the tape started to melt and lost its adhesive properties. My second solution was an axle extender, crafted out of wood, that attached to the motor axle. This worked, but it pushed the pulley too far out from the motor, which wasn't what I was looking for either. The final solution, embarrassing as it is, was to fold thin pieces of cardboard to half the diameter of the pulley's bore and use them to center the pulley on the axle when screwing down the pulley's axle bolt. This would need to be revisited in the future; perhaps a bushing or a piece of custom-made hardware is the answer.


For my prototype I used a very flexible, lightweight rope to connect the pulleys on the motors to the light bar. When getting ready for the gallery install, I switched to a thin cable for aesthetic purposes. This was problematic because the cable needed more weight pulling down on it to stay taut, and the light bar was not heavy enough. The lack of weight also prevented the cable from winding tightly onto the axle, which caused some slippage. To solve this in the future I would use a different kind of cable, or weigh down the light bar enough to keep the cables taut and winding neatly onto the pulley.

To store the sonar values I forked the Arduino Array library (https://github.com/jshaw/arduino-array), updated it to work with newer versions of the Arduino IDE, and added `getMinIndex` and `getMaxIndex` functions to help with Curious.
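A minimal version of what a `getMinIndex` helper does (the signature here is an assumption, not the Array library's exact API): scan the stored readings and return which of the five sensors reports the nearest obstacle.

```cpp
// Returns the index of the smallest value, i.e. the sonar sensor
// currently reporting the closest viewer.
int getMinIndex(const int values[], int n) {
    int minIdx = 0;
    for (int i = 1; i < n; ++i)
        if (values[i] < values[minIdx]) minIdx = i;
    return minIdx;
}
```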

The final hiccup in the development of Curious was how the Ping sonar library behaved when reading back values from multiple sensors. This was an issue because I was also controlling 60 NeoPixels, two stepper motors, and a Bluetooth chip along with the five sonar sensors. To fix the inaccurate readings, and the fact that a delay in the Ping library was throwing off the timing, I switched to the NewPing library, which allowed better readings from multiple sonar sensors.

I’m looking forward to iterating further on this project’s features and functionality, fine-tuning it, and taking it to the next level!

Video

Schematic

[Breadboard schematic image]

Source Code

https://github.com/jshaw/cc-antisocial

https://github.com/jshaw/cc-antisocial-processing

https://github.com/jshaw/arduino-array

Under the Table

 

My plan was to create a playful interaction with the public by referring to a cliché scene.
Re-interpreting a cliché makes the joke easier to tell and allows the scene to be set up with minimal elements, since there is already a reference in people’s minds.

After some overly complicated ideas, I decided to imitate the typical scene of someone playing footsie under the table. One great piece of advice from Demi in class was not to approach the piece as an on/off action: make sure it keeps moving, even in a subtle way, and then reveal the bigger surprise with the trigger. We therefore included a constant rocking movement, which helped catch people’s attention in the gallery as soon as they saw the legs.

COMPONENTS
– Two high-torque servos
– Maxbotix ultrasonic rangefinder: http://www.maxbotix.com/Ultrasonic_Sensors/MB1000.htm
The Maxbotix is more expensive than the sensors we generally use, but I especially recommend it for people detection. Coding it is easy, and everything from testing to execution went smoothly.
– Chicken wire
– Galvanized wire for the shoes and for forming the legs
– N-Channel MOSFETs (2 per servo)
– Capacitors and resistors
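Part of why the Maxbotix is easy to code is its analog output: the LV-MaxSonar line scales at roughly Vcc/512 volts per inch, which on a 5 V Arduino with a 10-bit ADC works out to about 2 ADC counts per inch. A sketch of the conversion (verify the scaling against the MB1000 datasheet before relying on it):

```cpp
// Convert a raw analogRead() value from an LV-MaxSonar into inches,
// assuming 5 V supply and a 10-bit ADC (~2 counts per inch).
int rawToInches(int analogReading) {
    return analogReading / 2;
}
```

In the main loop, comparing `rawToInches(analogRead(sensorPin))` against a threshold is enough to decide whether someone has walked up to the table.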

 

EXECUTION


Another great recommendation came from Reza: the chicken wire saved me an incredible amount of time during execution. I wrapped the wire around my own legs to create the form instead of using a display model as a template.

This amazing robot hand tutorial gave me insight into how to solve the mechanism. The most time-consuming part of the structure was creating the shoes. The core movement is focused on the foot, so it had to look as elegant as possible. I kept everything light and avoided using any clothing or shoes, not only because of the materials’ weight but also because I was afraid of making the concept look tacky.

The biggest challenge in the coding process was making the movements look as organic and smooth as possible. To do that, I was advised to use a ‘counter’ instead of a ‘timer’, which kept everything running at the same time. I also created different possible states for each leg position to control all the actions; the rest was fine-tuning to make the movement look as human as possible. I made the shoes/feet by combining wires and fishing line, which helped produce a more natural movement.
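The counter-plus-states approach can be sketched like this: a counter incremented on every pass of the main loop drives the animation frames, so the idle rocking and the triggered footsie gesture can run without blocking each other. State names and step counts below are illustrative assumptions, not the piece's actual code.

```cpp
// Hypothetical counter-driven state machine for the legs.
enum LegState { ROCKING, FOOTSIE };

struct Legs {
    LegState state = ROCKING;
    unsigned long counter = 0;

    // Called once per loop() pass; personDetected comes from the rangefinder.
    void update(bool personDetected) {
        ++counter;
        if (state == ROCKING && personDetected) {
            state = FOOTSIE;
            counter = 0;        // restart the gesture from its first frame
        } else if (state == FOOTSIE && counter > 200) {
            state = ROCKING;    // gesture done, back to subtle rocking
            counter = 0;
        }
        // Servo positions would be derived from (state, counter) here.
    }
};
```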

[Photo]

During the show, I noticed women had more fun playing with the piece. They approached right away to figure out whether something would happen when they got closer, and they came back and played again. Men were more distant unless they were in a group. Also, one guy kicked the foot a couple of times, which was weird. (Not the one in the picture.)

For now, the legs act on primitive instincts alone. In the future, adding some intelligence could make the legs more selective, instead of checking everyone, and make the whole experience more interesting.