Experiment 2 – Pet Me (If You Can)


Project Title
Pet Me (If You Can)

Team Members
Jignesh Gharat, Neo Chen, and Lilian Leung

Project Description
Our project explores creating a creature character that can surprise viewers through interactivity, using two distance sensors. The experiment is an example of the living effect: giving a machine a life of its own and using different modes of operation to give the creature distinct emotions.

The creature was built with two Arduinos, three servos, a row of LEDs and two distance sensors. It sits on a pedestal and moves of its own accord, surprising viewers who come near by snapping its mouth shut and making its eye movements erratic.

Project Video

You can access the code for the experiment here:
https://github.com/lilian-leung/experiment2

Project Context

Our goal was to create a creature using servos and sensors. We explored the ongoing question "Why do we want our machines to appear alive?", as posed by Simon Penny, a new media artist and theorist. Caroline Seck Langill, in The Living Effect: Autonomous Behaviour in Early Electronic Media Art (2013), argues that we give machines lifelike characteristics to elicit a response from the audience that is suggestive of a fellow life-form and so achieve the living effect: we do not attempt to re-create life, but rather to “make things have a life of their own.”

Our original intention for the project was to create a Halloween-themed creature, or a security-like box that would guard a valuable item such as jewellery, or serve an everyday purpose such as guarding cookies in a cookie-jar-like shape.

Langill (2013) proposes three characteristics of the living effect: first, an adherence to behaviour rather than resemblance; second, the effect of a whole body in space with abilities and attributes; and third, the potential for flaws, accidents and technical instabilities, since imperfections allow one to acknowledge the living effect within a synthetic organism.

We began prototyping the eyes with two oscillating servos, taping Post-it notes over them as stand-in pupils so we could tune the movement to a natural speed, easing the motion with Nick’s animationTools Arduino library.
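
As a rough illustration of this easing approach (not the library’s actual API, which lives in the animationTools repository linked in the sources), a minimal sketch using only the standard Servo library and millis() might look like this, with placeholder pin numbers:

```cpp
// Minimal sketch: two eye servos easing toward a shared oscillation target.
// Uses only the standard Servo library and millis(); pins are placeholders.
#include <Servo.h>

Servo leftEye;
Servo rightEye;

float currentAngle = 90.0;   // where the eyes are now
float targetAngle  = 0.0;    // where the eyes are heading
float easing       = 0.1;    // fraction of the remaining distance covered per step

unsigned long lastUpdate = 0;
const unsigned long interval = 20;   // update every 20 ms instead of using delay()

void setup() {
  leftEye.attach(9);    // placeholder pins
  rightEye.attach(10);
}

void loop() {
  if (millis() - lastUpdate >= interval) {
    lastUpdate = millis();

    // Ease the current angle toward the target.
    currentAngle += (targetAngle - currentAngle) * easing;

    // When the eyes arrive, flip the target so they oscillate between 0 and 180.
    if (abs(targetAngle - currentAngle) < 1.0) {
      targetAngle = (targetAngle == 0.0) ? 180.0 : 0.0;
    }

    leftEye.write((int)currentAngle);
    rightEye.write((int)currentAngle);
  }
}
```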

In Robotics facial expression of anger in collaborative human–robot interaction (2019), Reyes, Meza and Pineda describe how expressive robotic systems encourage feedback and engagement from viewers; emotions such as anger produced the most effective responses from participants. Using the minimal facial expressions possible with the components available, we tried to replicate a human-like expression as an indicator of the mode of operation the creature was reacting in.

From there we incorporated the main body (the box) of the creature and began exploring ways we could have the box open. Our initial thought was to have some sort of lever outside and above the box that would pull the lid open with thread or fishing wire.

 exp2_wip-img1

We also explored mounting the servo on the side of the box, but were concerned the motor wouldn’t be able to handle pushing the lid open across its full width from one side. In the end, we placed the servo inside the box, in the back centre area, where it could push the lid open with the assistance of a curved shape that reaches the lid. We then tested to find the right angle and range for the servo, so as not to push it out of its spot inside the box or open the lid too wide.
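
A simple way to run that kind of range test is a sweep sketch like the one below; the pin number and angle values are placeholders to be tuned by hand against the physical lid:

```cpp
// Quick test sketch for the lid servo: sweep between a closed and an open
// angle to find a range that lifts the lid without straining the mount.
#include <Servo.h>

Servo lidServo;

const int CLOSED_ANGLE = 10;   // assumed resting position
const int OPEN_ANGLE   = 70;   // assumed angle that props the lid open

void setup() {
  lidServo.attach(6);          // placeholder pin
  lidServo.write(CLOSED_ANGLE);
  delay(1000);
}

void loop() {
  // Open slowly so the curved pusher doesn't slam the lid.
  for (int angle = CLOSED_ANGLE; angle <= OPEN_ANGLE; angle++) {
    lidServo.write(angle);
    delay(15);
  }
  delay(1000);

  // Close again.
  for (int angle = OPEN_ANGLE; angle >= CLOSED_ANGLE; angle--) {
    lidServo.write(angle);
    delay(15);
  }
  delay(1000);
}
```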

Servo Testing Gif

Before laser cutting all our final shapes, we tested each component separately on the breadboard to make sure the circuit was functioning before soldering each piece. From there we built out the new facial features: the opening box, and a laser-cut tongue-like shape that we lit up with red LEDs. We also laser cut the pupil and iris to attach to the servos, and made a small enclosure to hide the actuators. All the cables were then looped inside the box and tucked in the back to keep them tidy when the creature opens its mouth.

exp2_wip-img5

exp2_wip-img3

The creature has three modes of operation:

  1. Within the two meter “safety zone” away from viewers, the eyes oscillate slowly from 0 to 180 degrees at a speed of 0.1. Within this safety range, the servo controlling the mouth also props the mouth open, as the creature deems the area “safe”, and the LEDs within the tongue piece are lit up.
  2. Within the middle zone, the creature becomes “conscious” of viewers and the speed increases to 0.2. The increased speed of the eyes signifies hesitation or caution in the creature.
  3. When viewers come into the “danger zone”, within approximately one meter of the object, the speed of the eyes increases to 0.8 and the mouth snaps shut.
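
A rough sketch of how this zone logic can be driven from one of the distance sensors is shown below. It builds on the easing sketch above; the pins, thresholds and HC-SR04-style read routine are assumptions rather than our exact implementation (the actual code is in the repository linked above).

```cpp
// Sketch of the mode logic on the eye Arduino: read an ultrasonic sensor and
// pick an eye speed based on which zone the viewer is in. Pins and thresholds
// are placeholders.
#include <Servo.h>

const int TRIG_PIN = 2;
const int ECHO_PIN = 3;

Servo leftEye, rightEye;

float currentAngle = 90.0, targetAngle = 0.0;
float easing = 0.1;                 // updated each loop from the distance reading

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);   // timeout if nothing in range
  return duration / 58;                             // microseconds to centimetres
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  leftEye.attach(9);
  rightEye.attach(10);
}

void loop() {
  long distance = readDistanceCm();

  if (distance == 0 || distance > 200) easing = 0.1;   // safety zone (beyond ~2 m)
  else if (distance > 100)             easing = 0.2;   // middle zone
  else                                 easing = 0.8;   // danger zone (within ~1 m)

  currentAngle += (targetAngle - currentAngle) * easing;
  if (abs(targetAngle - currentAngle) < 1.0) {
    targetAngle = (targetAngle == 0.0) ? 180.0 : 0.0;
  }
  leftEye.write((int)currentAngle);
  rightEye.write((int)currentAngle);

  delay(20);
}
```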

To avoid overloading a single Arduino and to keep the electrical circuit consistent, we split the system in two: one sensor and Arduino drive the two servos for the eyes, while the other sensor and Arduino control the LEDs and the servo motor that opens the mouth.

One of our challenges was working with the noise generated by the sensors, which caused some of the modes of operation to fluctuate, opening the mouth and then immediately dropping it even when viewers were a safe distance beyond the threshold. We adjusted the settings to shorten the middle range so that the disturbance from the sensor noise reads more like a laughing motion by the creature.
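
Another way to tame this kind of jitter, sketched here only as an illustration and not as our final code, would be to smooth the readings with a running average and add hysteresis around the mouth threshold; the pins and threshold values below are placeholders:

```cpp
// Rough sketch: smooth the ultrasonic readings with an exponential moving
// average and only change the mouth state when the smoothed distance crosses
// the threshold by a margin (hysteresis).
const int TRIG_PIN = 2;
const int ECHO_PIN = 3;

float smoothedCm = 200.0;          // start assuming the area is "safe"
bool mouthOpen = true;

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);
  return duration / 58;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  long raw = readDistanceCm();
  if (raw > 0) {
    // Each new reading only nudges the estimate, so single spikes are ignored.
    smoothedCm = 0.8 * smoothedCm + 0.2 * raw;
  }

  // Hysteresis: close below 90 cm, reopen only once the distance exceeds 110 cm.
  if (mouthOpen && smoothedCm < 90)        mouthOpen = false;
  else if (!mouthOpen && smoothedCm > 110) mouthOpen = true;

  Serial.println(mouthOpen ? "open" : "closed");
  delay(50);
}
```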

After our presentation, we received feedback on how we could better incorporate the sensors into the experiment so that the piece could become more mobile and be placed more easily in different situations.

exp2_wip-img6

Our solution was to mount the creature on a pedestal and hide the sensors below the surface. The creature stands upright as if it were an exhibition piece. It takes on a personality of its own as the eyes oscillate, as if patrolling the surrounding area, and when viewers approach it closes its mouth and looks downwards in a more humble posture.


Sources

MaxBotix. (2017, January 13). How to Use an Ultrasonic Sensor with Arduino [With Code Examples]. Retrieved from https://www.maxbotix.com/Arduino-Ultrasonic-Sensors-085/.

Circuit Digest. (2018, April 25). Controlling Multiple Servo Motors with Arduino. Retrieved from https://www.youtube.com/watch?time_continue=9&v=AEWS33uEwzA.

Langill, C. (2013). The Living Effect: Autonomous Behavior in Early Electronic Media Art. In Relive: Media Art Histories (pp. 257–274). Cambridge, MA: MIT Press.

Programming Electronics Academy. (2019, July 2). Arduino Sketch with millis() instead of delay(). Retrieved from https://programmingelectronics.com/arduino-sketch-with-millis-instead-of-delay/.

Puckett, N. (n.d.). npuckett/arduinoAnimation. Retrieved from https://github.com/npuckett/arduinoAnimation.

Reyes, M. E., Meza, I. V., & Pineda, L. A. (2019). Robotics facial expression of anger in collaborative human–robot interaction. International Journal of Advanced Robotic Systems, 16(1), 172988141881797. doi: 10.1177/1729881418817972
