Monthly Archives: December 2015

Weezy Wonder

Weezy Wonder is a Stevie Wonder fan, right down to the dark glasses and his mushy, song-loving heart. He waves you over, falls chronically in love when you approach (as indicated by his blinking heart), and insists on singing you a song while you hold his hand.

I wanted to keep my tamagotchi design simple but able to evoke amusement or joy in whoever interacts with it. Hence Weezy’s romantic crooning action.

Making Weezy: The Body & Computation

I bought a ‘rocker’ hamster toy and removed some of its old wiring and parts from the inside to make room for this version of the toy. The new wiring included:

  • Sonar for the blinking heart
  • Servo for the hand wave
  • Light sensor trigger for the hand-holding action
  • MP3 shield to play songs

The initial idea was to use a pulse sensor, as it was suggested that having Weezy sing a song based on someone’s heartbeat would be much more interesting. This was abandoned because the readings were unreliable and would not always trigger the music as they should.

[Image: Pulse sensor try-out]

The hand wave that stopped working a day before the presentation:

All the wiring was connected to an Arduino board stacked with the SparkFun VS1053B MP3 Shield, plus a breadboard, and inserted into various parts of Weezy’s body.

Weezy’s emotional cues and reactions are based on the following computation:

  • Code for a 180-degree servo motion
  • Sonar code that makes his heart blink when anyone comes within 450 cm
  • Light sensor code that triggers a song on the MP3 shield when the hand is covered or ‘held’
  • The MP3 shield sample code, modified to play a random song when the light level falls

GitHub link to the code: https://github.com/manikpg/Weezy-Sings-Your-Heart-Song
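To show the shape of that trigger logic, here is a minimal sketch, assuming the SFEMP3Shield library and tracks named track001.mp3 to track005.mp3 on the SD card; the pin and threshold values are placeholders, and the actual working version is in the GitHub repository above.

#include <SPI.h>
#include <SdFat.h>
#include <SFEMP3Shield.h>

SdFat sd;
SFEMP3Shield MP3player;

const int lightPin = A0;        // photoresistor voltage divider (assumed wiring)
const int darkThreshold = 200;  // tune to the room's ambient light

void setup() {
  sd.begin(SD_SEL, SPI_HALF_SPEED);  // SD_SEL comes from the shield library config
  MP3player.begin();
  randomSeed(analogRead(A1));        // unconnected pin as a noise source
}

void loop() {
  // A covered ("held") hand darkens the sensor and triggers a random song.
  if (analogRead(lightPin) < darkThreshold && !MP3player.isPlaying()) {
    MP3player.playTrack(random(1, 6));  // plays track001.mp3 ... track005.mp3
  }
}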


Challenges

Despite testing multiple code variations, the song trigger would not fire more than once unless the sketch was re-uploaded to the Arduino each time. Two sketches written from the sample code library provided with the shield produced no song at all, so they were abandoned.

In the days leading up to the gallery presentation of Weezy, I tried a few backup plans that were still based on the MP3 shield. I wanted to somehow get Weezy to play random songs based on a sensor. Since my code was so faulty, I resorted to trying a tested sketch in which a motion sensor triggers the playing of songs. While the sensor worked, the song would not play, so that too was abandoned. The shield seemed very sensitive to most other code additions.

I also tried having just one song play instead of a selection of random songs, so that it would still play each time someone held Weezy’s hand without the sketch needing to be re-uploaded from the Arduino application each time. Removing the randomness still did not help: the trigger would only fire once, a re-upload was required, and the piece stayed laptop-dependent to the end. Regardless, the version that worked, even with the re-uploading, was the only one I could get to perform close to the concept of playing a song each time.

It was baffling how the slightest change to related code stopped the music from playing. The day before the show, the hand wave had to be scrapped because the songs would not play if I included the servo code for hand waving, and the sonar stopped working no matter what I tried. His heart stayed ‘on’ all the time.

Marty

By Andrew Hicks


Concept: First Thoughts 
When first introduced to the tamagotchi project, the challenge of eliciting an emotional response through computational means led me to think about making computational elements as human as possible. Considering which parts of the human body visually express the most emotion, I settled on eyebrows. This stemmed from my brief involvement with studying (more so admiring) traditional character animation in the past, and how simple angled lines can convey so much.

Concept: Theme
Over the last year, I have been very fixated on the definition of “value” and what it means not only to myself, but to others. What creates value, with or without money? What do we get out of value, how long does the value of something last, and how quickly can value depreciate? When do people start caring or stop caring about the value of something, whether measured in time, money or emotion? I circled around these thoughts for my final tamagotchi presentation.

I knew I wanted to involve currency of some sort, but was not set on using actual money. After describing the focus of my tamagotchi project to my colleague, Leon Lu, he quickly guided me in the right direction of using time as the value for my project. Although using time was a great idea, I still wanted someone’s time to be valued or validated by something other than expression as the final transaction. I wanted emotion to manifest by some tangible means, and a transaction to be the (figurative) period at the end of the sentence when interacting and connecting with my tamagotchi pen pal. As a result, I used pennies as the final trade-off for someone’s time.

In the end, I wanted the tamagotchi to do the following (a rough sketch of this state logic follows the list):
1) Upon entry, spectators would notice the sad-looking tamagotchi.

2) Upon facing the tamagotchi (within 5 to 25 cm), the tamagotchi would slowly shift mood from sad to a happier state, raising its eyebrows over 20 seconds.

3) If the spectator spends more than 20 seconds with the tamagotchi, it provides them with a penny. After the spectator leaves, the tamagotchi defaults back to its sad state.

4) However, if the spectator leaves before 20 seconds have passed, the tamagotchi becomes angry for 5 seconds, and then defaults to being sad.
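Here is a minimal sketch of how that state logic could be wired together, assuming an HC-SR04-style sonar and hobby servos; all pin numbers, angles and the dispense routine are placeholder guesses, and the actual code is linked at the end of this post.

#include <Servo.h>

Servo leftBrow, rightBrow, dispenser;
const int trigPin = 12, echoPin = 11;
const int SAD = 0, WARMING = 1, HAPPY = 2, ANGRY = 3;
int mood = SAD;
unsigned long faceStart = 0, angryStart = 0;

long readDistanceCm() {
  // Raw HC-SR04-style ping; a helper library would also work.
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  return pulseIn(echoPin, HIGH, 30000) / 58;  // microseconds to cm
}

void setEyebrows(int m) {
  // Placeholder angles: drooped = sad, level = happy, tilted inward = angry.
  if (m == SAD)        { leftBrow.write(40);  rightBrow.write(140); }
  else if (m == HAPPY) { leftBrow.write(90);  rightBrow.write(90);  }
  else if (m == ANGRY) { leftBrow.write(140); rightBrow.write(40);  }
}

void dispensePenny() {
  dispenser.write(180); delay(400); dispenser.write(0);  // one push stroke
}

void setup() {
  pinMode(trigPin, OUTPUT); pinMode(echoPin, INPUT);
  leftBrow.attach(9); rightBrow.attach(10); dispenser.attach(6);
  setEyebrows(SAD);
}

void loop() {
  long cm = readDistanceCm();
  bool present = (cm >= 5 && cm <= 25);

  if (mood == SAD && present)           { mood = WARMING; faceStart = millis(); }
  else if (mood == WARMING && !present) { mood = ANGRY; angryStart = millis(); }
  else if (mood == WARMING && millis() - faceStart >= 20000) {
    mood = HAPPY; dispensePenny();      // 20 seconds of attention earns a penny
  }
  if (mood == HAPPY && !present)                      mood = SAD;
  if (mood == ANGRY && millis() - angryStart >= 5000) mood = SAD;

  // In the real build the brows rise gradually during the warming phase;
  // here they simply stay sad until the happy state is reached.
  setEyebrows(mood == WARMING ? SAD : mood);
  delay(60);
}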

Prototype
I started building a simple prototype using white foam-core, two 180-degree servo motors, a sonar sensor, an Arduino with a breadboard, and two popsicle sticks as eyebrows.

From here I was able to establish basic emotions such as anger, sadness and contentment using a few lines of code.

Initial Code
I knew early on that the timing of emotion would have to be controlled in order for spectators to understand the process of emotion in my project. As a result, a servo library that allows speed control of 180-degree servo motors was adopted. When referencing the library and code, you can see a set of arguments that controls the angle, speed and a boolean wait value, e.g. (45, 20, false). The speed of the motors is controlled between the values of 1 and 175.
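For anyone unfamiliar with that call style, here is a minimal sketch assuming the VarSpeedServo library; the angles and speeds are illustrative only.

#include <VarSpeedServo.h>

VarSpeedServo brow;

void setup() {
  brow.attach(9);
  // write(angle, speed, wait): low speed values move slowly;
  // wait = true blocks until the movement finishes.
  brow.write(45, 20, false);  // drift toward 45 degrees without blocking
}

void loop() {
  brow.write(120, 30, true);  // raise the brow at a moderate speed, then wait
  brow.write(45, 10, true);   // lower it slowly, then wait
}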


Fabrication
Having tested my eyebrow prototype and basic code setup, I began fabricating the coin dispenser for my final piece. I was lucky to find a plastic spool, which I believe was used for wire, that had the same diameter as an American penny.


Penny Dispenser
Referencing the shafts I created for our research group assignment, I fabricated a shaft that would push pennies off of the plastic spool chassis one by one. In order for this to work, a lot of time was spent creating just enough space for only one penny to be dispensed. Coincidentally, the popsicle sticks I used had almost the same depth as a penny. Along with some glue, the space created was enough for one penny to be ejected from the spool at a time. I got very lucky. I glued the servo and shaft into a slot I made (using a Dremel tool) in the plastic base of the spool. All worked well, except for getting each penny to drop from the spool. I had to provide a front-end lift for each penny so that the dispensing arm would hit it exactly on its edge. To do so, I glued on some wire that I could adjust to the exact height needed for each penny to be pushed out and dropped.

Face
The face of my tamagotchi was created using half-inch foamcore. Using Adobe Illustrator, I drew out the face over six 8.5×11 artboards to be printed and then transferred onto the foamcore. Doing this provided a solid plan as to how and where the back-end and front-end components should meet, as precision had already become a big learning curve for me on this project.


Using the maker lab, I cut the foamcore down with an exacto knife. I was hoping to use a bandsaw, but the foamcore was not dense enough. My concern was getting a perfect-looking circle for the face of my tamagotchi, but I was surprised at how well just an exacto knife and fine-grain sandpaper can smooth a circle out of foamcore.

I cut out the eyebrows and moustache by hand, as well as a mouthpiece that I did not have time to implement fully on the face. The mouthpiece was intended to provide a happier-looking expression when the tamagotchi was meant to be overly happy with someone’s prolonged presence.

Presentation
Closer to the final presentation, coding the complete set of emotional states and user-interaction variables became a challenge. As a result, I coded a plan-B state that has the tamagotchi provide a penny for every two seconds a spectator spends with it, while emoting an angry face as each penny is dispensed.

I placed my piece on a pillar facing the entrance of the gallery. This was intentional, so that users could see the piece’s sad emotional state from a distance.

I used an iPhone power adapter to provide a constant 5 V supply to the Arduino board on the back of the piece. Oddly enough, the power supply was the last thing I considered when presenting the piece, given the attention I paid to everything else in the project.

Conclusion
Overall, I was very pleased with the outcome of the project. I believe I set myself an attainable challenge that was enjoyable and stretched me without becoming frustrating. Going forward, I would add the mouth servo motor and spend more time on the coding component of the project to gain a better understanding of how to code states.

Fritzing Diagram:

[Fritzing diagram image]

Code:
https://docs.google.com/document/d/1d0y-OmjS3nz2YWyBzdpO7WhLzdrY4CTKLhjEPiSS-VM/pub

SPOT & DOT – Tamagotchi Project

SPOT & DOT

General Summary

For my Tamagotchi project I created two robots, each of which takes sensory input to guide its output behaviour. SPOT is a web camera mounted on a robotic arm that controls its movement along three degrees of freedom (i.e., 3 servo motors). This version of SPOT is a reiteration of a research project completed by Egil Runnar, Ling Ding, Tuesday, and myself, in which we had to create a life-like moving robot inspired by a robotic installation from the 70s named The Senster (see image below). DOT, on the other hand, is a robot consisting of three concentric rings attached to each other by hinges and servo motors, allowing three degrees of rotational movement. DOT’s movement along these three axes is pseudo-random, giving the sense that DOT is exploring its immediate environment and the possible combinations of movement and orientation. DOT has one sensory input, a photoresistor sensitive to the amount of light at its surface. The light level DOT senses is mapped to the speed of DOT’s rotation, where more light leads to faster movement and less light leads to slower movement.

[Image: The Senster]

SPOT


Summary

Since this iteration of SPOT is based on a previous group project I will not go into too much depth about the inspirational background and underlying concepts, but the core features, updated fabrication and computed behaviours deserve mention here.

Behaviour

SPOT’s behaviour can be summarized as three potential modes: random movements during an inactive phase, face tracking, and motion tracking. SPOT’s behaviour is based on his camera input. SPOT is interactive: he can follow your face as you move around, you can distract him with physical motion in front of his camera, or you can simply watch his well-orchestrated random movements. SPOT’s camera feed goes straight into a MacBook Pro via USB, where the visual information is processed using Processing and a series of computer vision libraries.

Random Movements: When no faces or movements are detected, SPOT’s three servo motors are given random movement commands (varying in final position and speed) and move in burst intervals. In this mode, SPOT can take on a number of random movements and positions, often leading the viewer to anthropomorphize his actions. Each movement is executed in three steps: a fast initial movement, followed by a medium-speed movement, and then a fast movement to the concluding position. Combined, these three steps give the impression of organic behaviour, since animal movements follow a similar pattern in nature, adding to the sense that SPOT is a living and intentional object.

Face Tracking: When SPOT detects a face in his visual field he moves his head up or down, or his base left and right to center the face in his camera input. If the face is far into the camera’s periphery, he will execute a fast corrective movement, otherwise smaller adjustment movements are made.

Motion Tracking: SPOT’s final behavioural mode is motion tracking, which uses an algorithm that calculates the degree of pixel change from one frame to the next. This behaviour was new compared to SPOT’s original iteration. If enough change registers against a threshold value, it is interpreted as movement. If motion is detected along either the left/right axis or the up/down axis relative to SPOT’s centre of vision, he makes a compensatory movement to bring the moving object closer to his centre of vision. One issue is that motion would naturally be detected during SPOT’s own random movements, so motion detection was temporarily disabled until his movements were completed; otherwise he would get locked in a never-ending loop.

Parts

  • 3 servo motors (2 TowerPro servos + 1 mini-Servo)
  • laser-cut pieces of Baltic Birch plywood (1/8 inch thick)
  • 1 generic webcam compatible with OSX
  • various nuts, bolts, washers, and mini-clamps
  • Arduino Uno + USB cable + jumper wires + breadboard
  • 5V DC Power Supply
  • Computer (here a 2013 Macbook Pro)

Fabrication

The plywood pieces composing the arms and base structure were designed and laser-cut from Baltic Birch plywood. The cut file incorporated the designs for both SPOT and DOT in a single job to save money and time. The file was created in Adobe Illustrator, as shown in Figure 1. The logic of the laser-cut file is that lines are cut in a set order: blue lines first, green lines second, and finally the red lines.

[Figure 1: The laser-cut file, prepared in Adobe Illustrator]

The servos and camera were fastened to the plywood body and the parts were put together. The servos were locked into the CNC slots and held in place via a nut, bolt, and a piece of metal used as a clamp. The camera is held onto the arm via a nut and bolt. None of the parts of SPOT were glued together; it is held together entirely via clamps or nuts and bolts.

Software

Some aspects of SPOT’s programming were somewhat complex, particularly refitting the motion algorithm to suit the present purposes, and the overall control flow: camera input feeds Processing, which controls the Arduino (moving the servos), which then sends feedback back to Processing to gate the camera input. The face tracking algorithm was borrowed from an OpenCV library, while the motion tracking was adapted from an algorithm provided by the course instructors.

I adapted the motion algorithm to distinguish not only between movement at the left vs. right of the screen, but also motion near the top or bottom. Motion was not registered in the centre of the screen, only in the periphery.

Processing controlled the Arduino through the serial port, and the Arduino sent data back to Processing through the same USB serial connection. The Arduino had to report back because SPOT needed a way to suppress motion detection while moving, or else he would be stuck in an infinite loop. This is where the control flow became complicated: managing face tracking and motion tracking while making the robot behave as smoothly as possible.
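The Arduino side of that handshake can stay small. Here is a hedged sketch of the idea, assuming the VarSpeedServo library; the actual command format lives in the project code, so the whitespace-separated protocol below is illustrative only.

#include <VarSpeedServo.h>

VarSpeedServo base, shoulder, head;

void setup() {
  Serial.begin(9600);
  base.attach(9); shoulder.attach(10); head.attach(11);
}

void loop() {
  // Expect "angle1 angle2 angle3 speed" lines from Processing.
  if (Serial.available()) {
    int a1 = Serial.parseInt();
    int a2 = Serial.parseInt();
    int a3 = Serial.parseInt();
    int sp = Serial.parseInt();
    base.write(a1, sp, false);
    shoulder.write(a2, sp, false);
    head.write(a3, sp, true);  // block until the last servo finishes
    Serial.println('D');       // tell Processing the move is done, so it can
                               // re-enable motion detection on its side
  }
}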

DOT


Summary

DOT is a relatively large (about 2 feet tall) interactive installation with 3 concentric rings positioned within each other, affording 3 axes of physical rotation about a static central position. The three concentric rings are rotated by servo motors via independent commands. Two of the rings rotate about the y-axis (leftward and rightward) while one ring rotates about the x-axis (upward and downward). When all rings move simultaneously, the result is a mesmerizing combination of motions that provokes one to reflect on physical movement itself. DOT intentionally invokes a relation to astronomical devices, or to the rotations of the human eye. When DOT moves about, it can even give a sense of intentionality, as one can relate to some of the movements it independently makes and some of the positions it finds itself in; in some ways it ominously reminds one of biological motion. The speed of DOT’s motion is influenced by its one photoresistor, where more light leads to more agitated and faster movements.

Inspiration

[Image: Bohnenberger’s Apparatus]

The concept for DOT came about as I randomly stumbled onto a website focused on mechanical movement (http://507movements.com/mm_356.html). One of the mechanical devices presented there was Bohnenberger’s Apparatus (http://physics.kenyon.edu/EarlyApparatus/Mechanics/Bohnenbergers_Apparatus/Bohnenbergers_Apparatus.html), the device that inspired the gyroscope. This got me thinking about rotational movements in general, for example how the eye moves about within its socket. I felt that this movement pattern was unique and worth exploring, so I made some preliminary designs. The shop instructor advised me that the best and sturdiest material to use would be either metal or plywood. Working with metal would have been an exciting project, but given the looming deadline I surmised that laser-cutting plywood would give me the most satisfactory and reliable results.

When the device was complete and finally working, I could not believe how interesting the movements were to watch. The Bohnenberger apparatus was originally conceived as an astronomical device used to study rotational movement, but the movement of DOT as it behaved on its own was surprisingly organic. As the rings slowly moved into new orientations relative to each other, it became easier and easier to anthropomorphize the result. Sometimes it would move smoothly, and sometimes it would move into what appeared to be strained and dramatic-looking positions. This final product gives me an adaptable platform for studying mechanical movement via microcontroller for the foreseeable future. Future projects may give DOT more sensors to guide its own movements and reduce the reliance on randomness.

Behaviour

DOT’s behaviour is relatively simple in comparison to SPOT’s multiple behavioural modes, but in many ways it is more mesmerizing. DOT’s three concentric rings rotate independently (except the outermost ring) and randomly update their positions relative to their current ones. Each ring moves to a new position by rotating up to 10 degrees clockwise or counterclockwise from its current position (the value is determined randomly for each servo). New movement commands are sent to DOT’s three servos simultaneously, and do not update until all servos have completed their movements. The speed at which DOT moves is determined at any given moment by how much light its photoresistor detects.

When any of DOT’s rings reaches the maximum rotation of the servo in either direction (i.e., zero or 180 degrees), DOT goes through a routine where that ring then rotates to the opposite end of rotation with 30 degrees of slack to prevent a series of redundant movements. While DOT is not as interactive as SPOT, which will track your face and movements in real time, DOT’s behaviour is more independent and fascinating to watch for its own sake, but one can indeed control the relative speed of DOT’s movements. In further iterations, DOT’s central circle could be fitted with sensors to guide its behaviour, leading to a fascinating potential series of intentional rotational movements.

Parts

  • 3 servo motors (1 TowerPro + 2 mini-Servos)
  • laser-cut pieces of Baltic Birch plywood (1/8 inch thick)
  • 5V DC Power Supply
  • Photoresistor
  • Arduino Mega + jumper wires + copper wire + breadboard
  • Nuts, bolts, washers, spacers
  • Rubber caps

Fabrication

The body of DOT is also composed of plywood laser cuts. The main challenge of DOT was the fabrication, specifically, how to have multiple concentric rings fixed within each other, all turning on their own axis via servo motor. The current solution to this problem was to print two copies of the rings and fasten them together via nuts, bolts, washers, and spacers, and to fit the servo motors within them. Axles were attached to the servo motors and an axle was attached to the opposite side as well. The axles were made of long bolts, and fitted loosely into holes attached to the spacer between the next inner ring. This allowed the pieces to rotate smoothly, or in our case, to turn via servo motor.


The wires for powering/controlling the servo motors were fitted through the space in between the rings alongside the spacers, and fed along each axle, until the wires reached the base where they were connected to the microcontroller. Similar to SPOT, there was no gluing involved in this project. The servos and axle joints were set in place purely by pressure, where tightening or loosening the adjacent ring prints (held apart via spacers) would allow one to fix or release the servo motors. The arms of the servo motors were connected to the axles via circular clamps or rubber caps, which held together the arm of the servo to the arm of its associated axle.

Software

Compared to SPOT, DOT’s software was relatively simple. Unlike SPOT’s, DOT’s program could be housed entirely on the Arduino. Using the varspeedservo.h library, which allows you to control a servo’s position as well as its speed, each servo was given independent commands. Photoresistor levels were mapped to servo speed values ranging from very slow to a medium pace. Using the boolean third argument to the library’s write function, each call was told to wait until the movement was complete before issuing new commands. This had the effect of waiting until every servo completed its rotation before new commands were submitted.
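A condensed sketch of that program, assuming the VarSpeedServo library; the pins, speed range, and the 30-degree slack values are placeholders standing in for the tuned ones.

#include <VarSpeedServo.h>

VarSpeedServo ringX, ringY1, ringY2;
int posX = 90, posY1 = 90, posY2 = 90;
const int ldrPin = A0;  // photoresistor voltage divider

int nudge(int pos) {
  // Step up to 10 degrees either way, bouncing off the servo limits
  // with roughly 30 degrees of slack, as described above.
  pos += random(-10, 11);
  if (pos <= 0)   pos = 150;  // jump back from the low end of rotation
  if (pos >= 180) pos = 30;   // jump back from the high end
  return pos;
}

void setup() {
  ringX.attach(9); ringY1.attach(10); ringY2.attach(11);
  randomSeed(analogRead(A1));
}

void loop() {
  // More light means faster, more agitated movement.
  int speed = map(analogRead(ldrPin), 0, 1023, 5, 60);
  posX = nudge(posX); posY1 = nudge(posY1); posY2 = nudge(posY2);
  ringX.write(posX, speed, false);
  ringY1.write(posY1, speed, false);
  ringY2.write(posY2, speed, true);  // wait here before the next round of commands
}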


Hypnotoad Demands Harmony


HYPNOTOAD DEMANDS HARMONY

[Video]


WHAT DOES HYPNOTOAD WANT?

Hypnotoad is kind of complicated. When you’re happy, he’s happy. When you’re sad, he’s sad. When you and a friend are both happy, he’s ecstatic! Just don’t both be mad. Hypnotoad wants to read your mind, and he insists: everyone must be happy. Remember…

Hypnotoad demands harmony…

This froggy character is a culmination of two ideas: the ambition to create a tandem biosensor that communicates back and forth between you and a partner, and a creature that mediates between two helplessly less psychic humans.

Silly humans.


WHAT IS HYPNOTOAD MADE OF?

Components

Yard Frog: Hypnotoad was upcycled from an abandoned corner of my mom’s front yard. Neglected and alone, for nearly twelve years, this ancient frog gained timeless wisdom and psychic powers, through a period of extended hibernation and meditation. As a partially cybernetic, water-permeable superbeing, he needed some functional repairs.

He came equipped with a microcontroller, a weather-corroded battery drawer, a raspy speaker, and an irreparably defunct motion sensor. Or light sensor. Or soul sensor. The mysterious lens lodged in his mouth was caked in years of mud and would not fire.

Adafruit NeoPixel ring: I daisy-chained a power-hungry (960 mA) ring of 16 addressable RGB LEDs, with sturdy through-holes, to a 7-pin, possibly cloned NeoPixel disk with flimsily connected flat contact pads. (I destroyed 3 of these disks just lightly applying solder.)

Pulse Sensor, earring style: an open-source optical pulse sensor created by ITP and UCSD designers Murphy and Gitman. It reads your pulse based on how much light penetrates your earlobe or finger, which varies with the amount of blood coming and going from moment to moment. This differs from typical heart sensors, which detect heart rate through electrical muscle activity near the heart. Unfortunately, it is sensitive to ambient light and will always return a signal. Testing was originally done on SparkFun’s dry-electrode heart sensor, which has two extra wires (Lo+, Lo-) designed to filter out junk data when the electrodes lose contact. It was dropped because it dropped too much data, far too often.

Grove GSR: Originally based on lie-detector polygraph technology, this Arduino-specific sensor measures electrodermal activity to identify moments of stress and excitement through changes in skin conductance (AKA electrodermal activity). Testing was originally done on a Bitalino™ microcontroller sensor board, but the EDA data Bitalino generated rolled through broad changes in stress slowly, while Grove’s sensor picks up moment-to-moment changes.

Arduino Micro Pro: I tried out a variety of Micro Pro microcontrollers, including two clones (Funduino and __), a Trinket, and the original variety. Other than a few offset analog pinholes, and messily scattered labels visible only from the bottom, the cheap Micro Pro clones worked exactly like the real deal, which in turn worked just as well as their more popular cousin, the Uno. The main difference from an Uno is that the Micro comes pre-soldered with male pins, which you will need to mate to a breadboard or female jumper cables to make use of. This costs you some of the size that the Micro saves.

Green EL wire: electroluminescent wire that apparently gobbles a dangerous 120 volts straight from the wall if your microelectronics store doesn’t remind you to buy an inverter. Buy an inverter. Just do it.

Woodcut letters, letter stickers, foam core boards, a cafe table, folding chairs, 3 sizes of hobby boxes, fake pond flowers, green modelling clay, green glitter paper, green aquarium stones, green mesh ribbon, green foam frog eggs, green and black fingerless gloves, un-sewable green scaly embossed cowhide, a tiny purple dog cape.

Code

Varspeedservo Library: This library let me control two servos independently and, most importantly, run them in parallel with a simple true/false parameter. I wish I could say multithreading the other components on Hypnotoad’s Arduino was as easy.

Adafruit_NeoPixel Library: This library lets you control multiple strips of neopixels from the same controller, as long as you create unique objects for each one, or daisy chain them all together, and tally up your total LED count.
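As a hedged illustration of those two approaches, with made-up pin numbers and LED counts:

#include <Adafruit_NeoPixel.h>

// Approach 1: one object per strip, each on its own pin.
Adafruit_NeoPixel ring(16, 6, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel disk(7, 7, NEO_GRB + NEO_KHZ800);

// Approach 2: daisy-chain both on one pin and tally the total LED count.
// Adafruit_NeoPixel chain(16 + 7, 6, NEO_GRB + NEO_KHZ800);

void setup() {
  ring.begin(); disk.begin();
  ring.setBrightness(60);  // throttling brightness also tames the power draw
  disk.setBrightness(60);
}

void loop() {}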

Pulsesensor.com’s Interrupt: Not a library per se, but the script on the pulse sensor’s website is indispensable if you’re not familiar with introducing interrupts into your main script. I wasn’t, and I didn’t fully understand the components of the interrupt, so I discovered the hard way that anything you add to these functions will, out of logical necessity, run much faster than anything else. It interrupts the main void loop() every 2 milliseconds.
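For reference, the shape of that interrupt, simplified from the PulseSensor example code; the register values assume a 16 MHz Uno-class AVR board.

volatile boolean QS = false;  // set true by the ISR when a beat is found

void setup() {
  Serial.begin(115200);
  // Configure Timer2 to fire every 2 ms: 16 MHz / 256 prescale / 125 counts.
  TCCR2A = 0x02;   // CTC mode
  TCCR2B = 0x06;   // prescaler = clock/256
  OCR2A  = 0x7C;   // count to 124, i.e. 125 ticks = 2 ms
  TIMSK2 = 0x02;   // enable the compare-match-A interrupt
  sei();
}

void loop() {
  // loop() runs at its own pace; the ISR keeps firing every 2 ms underneath,
  // which is why anything placed in it outruns the rest of the sketch.
}

ISR(TIMER2_COMPA_vect) {
  // The real PulseSensor ISR samples the sensor here and computes the
  // inter-beat interval and BPM, 500 times per second.
  QS = true;
}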



HYPNOTOAD’S DEMANDS

(Functionality & Logic)


A 2×2 grid of the four conditions I might come across in comparing live biodata of two people

Hypnotoad responds through a series of short, quick ribbits, his relative degree of rotation, and glowing eyes to express what he likes and dislikes.

3 different configurations of microcontrollers

This rough diagram shows the three options I had for arranging one or three wearables, with either concealed cables or a Bluetooth module in every device. In the second option, both displays are slaves communicating by Bluetooth with the master creature controller, between the untethered wrist sensors and the pendant controllers.

[Video: See Hypno in action]


 RESTORING HYPNOTOAD

(The Build Process)

I spent weeks testing all of the biosensors and trying to daisy-chain the cheap NeoPixels to the 16-pixel rings. I wired the necklace-suspended rings directly in front of a Micro Pro, and ran signal, ground and voltage down arm-length cables to the pulse sensor and GSR finger-cuff electrodes. After heat-shrinking the cables in place, I realized I would instead need to run cables back up to the sensors and ring from a single, hidden controller board under his throne. This was necessary because networking 3 Arduinos with alternating roles was more trouble than controlling everything from one.

Original necklace prototype with wires bundled incorrectly

Ultimately, the necklace and Arduino Micro Pro should fit into a case that looks like this…

[Image: necklace case mock-up]

I throttled the NeoPixels’ activity to stay within the Arduino’s power limitations, and turned the two displays into separately instantiated objects of the same Adafruit library.

The next step was completing the creature assembly. I knew I was going to:

  1. Fix, replace, or augment the frog’s internal motion sensor
  2. Make the frog jump
  3. Add lights, servos, and metal components to the frog chassis, adding to its weight

I had to figure out a way to rotate the frog to look at you without exceeding the amount of torque the servo could handle before breaking. I bought a ball-bearing assembly sandwiched between two plates, and bolted the frog to existing holes on the plate that happened to line up with speaker holes in the frog’s base.

I picked a small round servo blade that could pass through the center of the ball-bearing assembly, protruding enough to clear the top opening. I bolted a rebar across the top assembly plate, drilled two holes into the sides of the plastic blade to mount it to the rebar, and then screwed the blade onto the servo itself (it needed two screws so that the blade wouldn’t unscrew itself).

Ball-bearing assembly with a crossbar, letting the servo blade spin the mount plate independently

On the bottom of the ball-bearing assembly I attached two C-shaped metal bars, with only one screw each to allow them to turn freely. Then I drilled holes in their sides, sandwiched my servo between them, and tensioned a spring from each bar across the servo. This kept the servo from twisting itself out of alignment while still being mounted independently of both the frog and the case the frog would sit on. It also gave me backup space in case I needed to change the servo to a smaller or larger size later on.


A large servo spinning the ball-bearing frog mount, and the smaller “jump” servo

With the mounting assembly finished (discreet enough to hide under the frog, yet strong and modular enough to take any load, hold any servo, and mount anywhere), I moved on to the second servo.

I mounted the servo inside the frog on top of its battery drawer, fixed a V-shaped lever to the servo blade, fed the second arm through another hole in the frog’s base, and made a simple kick for jumping.

The corroded guts (and power supply) of the frog

Hypnotoad’s Stomach

I bypassed the frog’s previous power system entirely and wired directly to its board. I considered doing a back-and-forth analog-to-digital, digital-to-analog recording of a frog croaking, but I knew the frog had a prerecorded ribbit sequence. Sadly, the motion sensors were clogged with mud and damaged somehow. I debated splicing wires from the Arduino’s digital pins into the output of the two sensor wires and faking a signal, but there was a chance the motion sensor produced a specific voltage level or pattern.

Then I realized the frog makes a special croak, independent of the sensor, every time it boots up. So rather than messing with the sensor, adding a wave shield, or tackling the existing board and processor, I decided to exploit Hypno’s boot cycle. All I needed to do was briefly cycle power for single ribbits, and give him a lengthier window of power for his full croak. This provided two sounds for two different contexts. At slightly below nominal voltage, I found I could modulate and deepen his croak, opening up a third possible croak.
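In code this reduces to timed power cycling through whatever switches the frog’s supply line (a transistor driven by a digital pin, say); the pin number and timings below are illustrative guesses, tuned by ear in practice.

const int frogPowerPin = 7;  // drives the switch on the frog's supply line

void ribbit() {
  // A brief power cycle boots the frog just long enough for one ribbit.
  digitalWrite(frogPowerPin, HIGH);
  delay(300);                 // placeholder timing
  digitalWrite(frogPowerPin, LOW);
}

void fullCroak() {
  // A longer power window lets the whole boot-up croak sequence play out.
  digitalWrite(frogPowerPin, HIGH);
  delay(2500);                // placeholder timing
  digitalWrite(frogPowerPin, LOW);
}

void setup() {
  pinMode(frogPowerPin, OUTPUT);  // without this, the pin only musters ~2 V
}

void loop() {
  ribbit();
  delay(3000);
}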

The frog had plastic eyes, with unretractable pegs blocking a perfectly usable hole for LED eyes. I melted through these with my soldering iron, daisy-chained two single-LED NeoPixels, and gave Hypno indicator eyes.

The eyes were epoxied in place. I melted them out and flat so the LEDs could colour his glass eyes

I built a small protoboard inside the frog’s body to combine the grounds and supply lines of the jump servo, eyes, and croak circuit, feeding them out the bottom with a ribbon cable and through a safe hole in the ball-bearing plates.

I used the varspeedservo library to create three motion profiles.

  1. The frog turns and looks at the most anxious person in the pair
  2. The frog points in between the two friends and waggles if both people are upset (his eyes glow red, and he stutters short croaks)
  3. The frog points in the center of the two friends and hops up and down if both people are happy (his eyes glow green, and he does his full cycle of 3 uninterrupted croaks)

The jumping mechanism worked fine, but required long, unanchored bolts going into Hypno’s chassis. I considered grinding down the last half of the bolts so that he couldn’t snag them on the (speaker) mounting holes at the top of a jump.

“Option Three”

My first and largest coding hurdle, carried over from the Golf project, was communicating from necklace to necklace, and between necklace and frog, by Bluetooth. I had already used Bluetooth directly from the Arduino to the computer IDE in previous projects, to upload scripts wirelessly and print data. Networking three devices to both send and receive data from one another, alternating AT roles between master and slave… I decided to avoid this early on. It had already failed once with the HC-05 modules for Golf.

The second option involved combining four sensors, two vibrating motors, two ring displays and a frog display, two servos, and the croak, all on the same board. This worked well until I realized I had created my fluttery light show with fixed delays, plus an interrupt timer for the heart sensor. Timing two pulse sensors and performing the inter-beat calculations in the same interrupt, while also timing two displays, proved too difficult to program procedurally without creating my own classes. I needed to run two instances of the same class, with independent timers, at the same time: multithreading. Arduino isn’t meant to do multithreading.

Relative heart rates of one male and one female, differing in weight by 70 lbs. Values are not directly comparable.

A third option came to me: I could compare the two heart rates as raw data, average or select from them conditionally, and display just one rate. This needed only one interrupt service routine and would output the same values to both necklaces. Even if I couldn’t time the two displays separately, I could already colour them separately, letting me indicate the dominant beat.

Sadly, my script broke. The previously functional interrupt suddenly decided it was “not declared in this scope”. I debated returning to a single display, as I had already accomplished that before, but I hadn’t saved enough incremental versions of my script between major changes to recover all of the working functions I had added.

Hypnotoad awaits the day he can sit and judge us.


 CIRCUIT DIAGRAM

(Coming)



CHALLENGES

Millis: The death of multithreaded timing in Hypno’s code came from using a variable delay, with the interval matched to the interval between real beats. The next iteration will use an incrementing counter with identical timing criteria for the first 12 loops, and a variable time condition to meet before restarting the pulse. Both displays can then have their conditions nested in the same loop.
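The standard cure is the non-blocking millis() pattern, sketched here for two independently timed displays; the intervals are placeholders for values matched to each wearer’s inter-beat time.

unsigned long lastA = 0, lastB = 0;
unsigned long intervalA = 800;  // e.g. person A's inter-beat interval, in ms
unsigned long intervalB = 650;  // e.g. person B's inter-beat interval, in ms

void setup() {
  // initialize both NeoPixel displays here
}

void loop() {
  unsigned long now = millis();
  if (now - lastA >= intervalA) {  // display A's own timer elapsed
    lastA = now;
    // step display A's pulse animation one frame
  }
  if (now - lastB >= intervalB) {  // display B runs on its own clock
    lastB = now;
    // step display B's animation; no delay() ever blocks the other timer
  }
}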

Corrosion: Determining why the frog wouldn’t power up on its existing system took multimeter testing from the battery terminals to the motherboard (the battery drawer terminals had corroded in the rain).

Copper Traces: I spent more time than necessary trying to learn how to fix lifted solder pads. I scraped, sanded, and alcohol-washed the exposed metal, then tried using plumber’s solder and other metals to wet the bare copper. I found an obscure and offbeat YouTube video about replacing traces with rare conductive epoxy and not-so-rare copper sheets. Then I gave up.

Voltage Shortages & Simple Solutions: I kept getting an inadequate 2 volts from the Arduino’s digital pins while trying to feed Hypnotoad’s hardware 5 or more. I spent half a day failing to step it up with every combination (or combined rating) of diodes, capacitors, and transistors. I went as far as using a reed switch supplied with 5 volts directly, with the 2 V digital pin powering the switch coil, and still couldn’t trigger the delivery of the 5 volts I needed at the other end. It turned out I was missing the pinMode OUTPUT command; a one-minute fix.

The reed switch takes voltage through an internal coil to magnetically pull a broken gate closed and let supply voltage through. It didn’t work. Neither did the TIP120 or 121; two volts always came out.
Diodes filtered voltage fluctuations that gave the frog inconsistent power and modulated croaking. The 2020 transistor tried to regulate 5 V from a digital pin.

Hypnotoad’s Throne: Figuring out how to rotate the frog without breaking the servo, snapping a cord, or hanging the entire assembly off the fragile servo blade was tough. The mounting assembly alone took two days to figure out, build, and calibrate.


Like or Dislike by Leon Lu



Introduction

I wanted to explore the irreverent nature of people through an inanimate object that people could still connect with. I wanted it to be strange and grungy-looking, but something you might still consider cute. The idea was to get people to connect with it and try to get its attention. As people, we are easily distracted and change our minds without reason; that is what I wanted to explore through my project. I called him ‘Gunther’.

The Build


I simply used servo motors to control his motion, with an ultrasonic distance sensor as the trigger. The challenge was creating smooth, balanced motion that simulated life.

I also used wool, various forms of material, blue foam, Neopixel RGB LEDs and wood to construct the body and the structure.

Code

https://gist.github.com/exilente/682e4e0ccaadc49f56f0

(Note: I used the random function to create a sense of mystery in Gunther’s actions; based on the number it generated, he would either come out and meet you or be afraid and hide inside.)
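The core of that choice is tiny; a hedged sketch, with the pin, angles and odds as stand-ins for the real values in the gist above:

#include <Servo.h>

Servo body;

void setup() {
  body.attach(9);
  randomSeed(analogRead(A0));  // seed from an unconnected pin
}

void loop() {
  // In the real build an ultrasonic sensor detects an approaching
  // person before this decision is made.
  if (random(10) < 6) {
    body.write(160);  // come out and meet the visitor
  } else {
    body.write(20);   // get scared and hide inside
  }
  delay(4000);
}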

Challenges/Advancements

I had initially planned to use a web camera to detect faces through Processing, so the robot would react not just to one person but to several; for instance, if more than one face was detected, it would hide, scared of multiple people approaching at the same time. I ran into trouble with the communication between Processing and Arduino and decided to go with something simpler in the end. Given the opportunity to redo this project, I would make the computer vision work more seamlessly, adding more character to Gunther.

Furgie and Furgurt: Virtual Cuddling Pets

Overview

[Diagram: Furgs overview]

Meet the Furgs – Furgie and Furgurt! The Furgs love to cuddle, and that is their main purpose: to allow partners to cuddle virtually. By hugging one Furg, you create vibrations within your partner’s Furg. When both partners hug their Furgs at the same time, the Furgs vibrate with greater intensity and purr. In this way, hugging your new furry friend allows you to feel the vibrations of your loved one – from anywhere in the world.

Watch the video:


Structure

The Furgs are made with the following:

Sensors: Bend sensors enable the detection of hugging. If a sensor is bent past a pre-determined threshold, a hug gesture is registered and the actuators are triggered.

Actuators: Multiple vibration motors and a buzzer are sewn into the inner layers. When Furg A is hugged, the first vibration motor of both Furg A and Furg B vibrates, and the buzzer starts making a purring sound. When Furg A and Furg B are hugged at the same time, all vibration motors on both Furgs run, and the Furgs vibrate with greater intensity.

Boards: Each Furg runs off its own Arduino board and its own LilyPad board. Two boards are included in each Furg to provide enough pins for future functionality (described below). The Arduino and LilyPad boards communicate across a software serial channel.

Connectivity: The Furg Arduino boards are linked via a software serial channel. This was done for demonstration purposes only; future versions will include WiFi chips, so that two people in different locations can hug via the Furgs.

Note that Bluetooth chips were used initially, but they proved unreliable, often dropping pairings and/or running with inconsistent timing: sometimes the Furg communications would transmit instantly, while at other times a message took several seconds. As a result, Bluetooth was dropped.
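A minimal sketch of one side of that hug link, assuming a flex sensor on an analog pin and a software serial line to the partner Furg; the pins, threshold, and one-character protocol are placeholders (the real logic is in the repository below).

#include <SoftwareSerial.h>

SoftwareSerial link(2, 3);     // RX, TX: cross-wired to the partner Furg
const int bendPin = A0;        // flex sensor voltage divider
const int motorPin = 5;        // first vibration motor (via driver transistor)
const int hugThreshold = 500;  // tune to the sensor's bent reading

void setup() {
  link.begin(9600);
  pinMode(motorPin, OUTPUT);
}

void loop() {
  bool hugged = analogRead(bendPin) > hugThreshold;
  link.write(hugged ? 'H' : 'N');  // report our hug state to the other Furg

  bool partnerHugged = false;
  if (link.available()) partnerHugged = (link.read() == 'H');

  // The partner's hug makes this Furg vibrate; a mutual hug would drive
  // the extra motors and the purring buzzer, as described above.
  digitalWrite(motorPin, partnerHugged ? HIGH : LOW);
  delay(50);
}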

Materials: The Furgs are built around an egg-shaped stuffed-animal core. Their custom outer layer is made of very soft, shaggy fake fur. The ears are made of another type of fake fur that resembles cheetah fur, sewn together with the shaggy fur. Fabricating their shape was the most time-consuming process of all. (Sewing is not one of my top skills, but it was great fun giving the Furgs their playful form.)


Core Functionality

[Image: Furgs functionality grid]


Code

Code for the project is available here: https://github.com/jamesessex/TheFurgs


Future Functionality

Although I believe the ability to hug and cuddle virtually is awesome, I see a lot more potential in the Furgs and will be building out more features. In addition to cuddling, multiple gestures will be added to enable virtual play-fighting.

There will be no rules built into the pets, only a rich set of gestures. Users can then make their own games out of the gestures. For instance, ear tugging and belly poking will be enabled. Imagine a girlfriend poking the belly of her boyfriend. The boyfriend can poke back, tug an ear, or send a playful sound or gesture (by selecting one from an LCD screen that will be added to the Furg bellies). Users can create their own playful rituals around the Furgs, and can change and evolve those rituals as they please.

An initial set of features to add include:

  • Rich sound set
  • Head petting
  • Belly rubbing & poking
  • Nose touching & nose rubbing
  • Ear tugging
  • Jiggle and bounce
  • LCD screen – displays animations like heart bursts, and provides a menu that enables sending of messages, sounds, and animations


Hope: The bird comes alive as you approach


Overview:

My project is a bird that reacts to human presence and evokes a certain emotion in the people watching it.

Inspiration: We see birds around us every day, but we hardly notice them or stop and think about how they feel or what they think. The main motive of this project is to make people notice the beautiful environment around them. When we try to control the environment, it affects the creatures living in it. The cage is thus a symbol, a metaphor for how we all, knowingly or unknowingly, control the environment surrounding us.


Interaction Model:

Stage 1: When no one is around, the bird shivers and looks around for someone to rescue it.

Stage 2: As a person approaches, the bird happily flaps its wings and moves its neck, in the hope that it is about to be rescued.

Stage 3: When the person departs, the bird is sad again and waits for someone to approach it once more.


Attempt:

I tried to capture the natural movement of a bird: the way it reacts when it is happy, frightened, or sad. There were three movements involved, one for each wing and one for the neck; hence, three servos were controlled by a sonar sensor.
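A minimal sketch of that sonar-to-servo mapping, with placeholder pins, angles and timings standing in for the tuned choreography:

#include <Servo.h>

Servo leftWing, rightWing, neck;
const int trigPin = 12, echoPin = 11;

long distanceCm() {
  // HC-SR04-style ping, converted from microseconds to centimetres.
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  return pulseIn(echoPin, HIGH, 30000) / 58;
}

void setup() {
  pinMode(trigPin, OUTPUT); pinMode(echoPin, INPUT);
  leftWing.attach(9); rightWing.attach(10); neck.attach(6);
}

void loop() {
  long cm = distanceCm();
  if (cm > 0 && cm < 100) {
    // Someone is approaching: flap the wings and neck hopefully.
    leftWing.write(140); rightWing.write(40); neck.write(120);
    delay(250);
    leftWing.write(60);  rightWing.write(120); neck.write(60);
    delay(250);
  } else {
    // Alone: a small, sad shiver of the neck.
    neck.write(88); delay(120); neck.write(92); delay(120);
  }
}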


Challenge:

The most difficult challenge was crafting the bird. The look had to be natural, beautiful, and pleasing to the eye in order to evoke an emotion in the user; the bird could not have a rusty, grungy look. Hardest of all was hiding the three servos while still controlling the movement. The bird should not be fragile, but it should look delicate.


Appearance:

I crafted the complete bird from pure white foam board, joining the pieces to give it a stylized look. The cage was purposely black and dark to create contrast and evoke an emotion.

Taking this forward, I would like to add more gestures and movements to the bird to make it more realistic, and sound would add even more. The neck could also move up and down with the addition of one more servo. So my next steps are to explore and try out new possibilities to make the project better.

The project was really inspiring, since it required all three skills: observation, articulation and execution.

Thank You.



DO NOT DISTURB


This interactive public installation was a project done as part of Creation & Computation in Digital Futures.

The process started with brainstorming. The aim of the interactive installation was to create a character that reacts to its environment and to the people interacting with it. At the very beginning I thought through the different emotions and reactions people normally deal with, and later I selected disturbance as the metaphor to start with.

Reading mode is a state of mind in which a person is in their own world of imagination. Being disturbed by people, noise, or any other interruption is frustrating, and no one likes it.

The installation’s reaction to the audience was amazing.
