Synchrobots

By: Amreen Ashraf, Lauren Connell-Whitney, and Olivia Prior

Overview

Partner robots

Figure 1: Our two robots preparing themselves for their first demo in front of a large group of people.

Synchrobots are randomly choreographed machines that attempt to move in synch with each other. The machines receive commands to start and stop their wheels, as well as the length of the pause between movements. Even though the two machines receive the same commands, differences in weight, shape, and environment can cause the robots to fall out of synch. The machines execute the commands as they travel across the room, sometimes bumping into each other or into walls. The randomness of the movements creates a perceived intelligence, though there is none. When a machine nearly hits a wall but avoids it at the last second, it can read as very intentional, but it is the human watching that creates these stories. These randomized, synchronized movements bring the machines to life and delight viewers, who follow along to see what they will do next.

Code

https://github.com/alusiu/experiment-4/

Idea Process 


Figure 2: We had a huge range of ideas from the beginning. We did manage to keep the idea of duality in our final project.

We began ideating by talking about how we could affect machines using outside data, for instance, the number of people passing through the OCAD doors, or the wind at any given moment in Antarctica. We then began developing an idea for bots that would rove around and crash into things. They would be built from two pieces of steel that could crash together and make a thunderous noise. However, constructing something made to bump into things with force seemed like a tall order, and possibly not one we wanted to tackle just yet.

Quick sketches of how we wanted our kinetic sculptures to move

Figure 3: Quick sketches of how we could install our “thunder bots”.

Next we had an idea to create bots that would find each other in the dark: our Marco Polo bots. This was the idea we began moving forward with: three bots that would seek out a "leader" bot, and once it was found, another bot would become the leader. This idea led to a thought about migration and how birds pass around the leadership role so that the whole flock can stay strong.

One of our initial process sketches for our Marco Polo bots

Figure 4: One of our initial process sketches for our Marco Polo bots.

Our workflow written out for Marco Polo bots

Figure 5: Our workflow written out for Marco Polo bots.

Our "pseudo-code" handwritten for Marco Polo bots

Figure 6: Our “pseudo-code” handwritten for Marco Polo bots.

Creation Process 

Our prefab bots were simple to put together, but made of very cheap parts. One of the DC motors' gears ended up breaking inside its casing, so we performed emergency surgery using gears from a 180-degree servo. It was nice to take things apart and see how everything worked.

Figure 7: Emergency surgery on the broken DC motor gear, using gears from a 180-degree servo.

We began with a trip to Creatron, where the array of wheels and servo motors was vast and costly. We ended up bringing home a lovely red chassis kit that came with two rear wheels attached to two DC motors and a stabilizing front wheel. It was a great place to start, and getting the wheels working was proof that our little bot could move if we wanted it to.

Video 1: A clip showing the robot moving with both DC motors connected to the feather microcontroller.

Coding, Fabricating, & Development

Networking Platform & Iterating on Ideas

Our first step in development was to find a resource for networking our robots that was not PubNub. Unfortunately, the PubNub API was not able to receive data published from an Arduino – users were only able to subscribe to data. Since our initial idea, prior to Synchrobots, was for three robots to talk to each other by sending coordinates, we needed a platform that would allow us to both send and receive data. After some research, we decided to pursue adafruit.io to network our robots.

Adafruit.io allowed us to very easily send and receive data through different feeds. The primary issue adafruit.io presented was that the account could only send and receive data up to 30 times a minute. This meant that we could not continuously send or receive data the way we could with PubNub.

This presented some issues for our initial idea: we wanted our robots to be talking with each other continuously. Because we were developing for three robots, each one could only send and receive data ten times a minute. We discussed it amongst ourselves and decided to develop an idea that required two robots that were not continuously sending and receiving data. We also decided that if we changed our idea we would not scrap everything and start from the beginning; we would be able to reuse most of the code we had developed, along with ideas from our earlier handwritten pseudo-code.

Additionally, at this time we had purchased our prefabricated platforms for the robots. The movement of these bots was neither linear nor consistent. We decided that our initial idea of “Marco Polo bots” would not work well with these movements, and chose to pursue an idea that would allow the robots to move more freely.

This brought us to the idea of Synchrobots: two robots that would receive the same data set and execute it at the same time. We were interested in how the robots would respond to the same data, and how the presentation of two robots would bring up connotations of partners, a couple dancing, or friends moving around.

At the start of development, we found that adafruit.io has a feature that let us view a real-time visualization of the data being sent. This was a very useful tool for our development, since it let us see what values each feed was sending while the robots were moving; we were not required to have our microcontrollers connected to our computers to check the serial logs.

Adafruit dashboard showing a data visualization of the randomly generated times for our robots.

Figure 8: Adafruit dashboard showing a data visualization of the randomly generated times for our robots.

Issues with DC motors and Servos

Our prefabricated platforms each came with two DC motors and wheels. When purchasing them we were certain the motors in the cases were servos, and were surprised to find DC motors in servo casings. The DC motors would have worked well, but they did not have a data pin. This posed a problem, as we needed to command both wheels independently through the microcontroller. The motors were also very cheaply made, and one of the gears broke fairly early into development. We took the motor apart and attempted to fix the gear using the same part from a servo we were not using.

We attempted to fix the gear of our DC motor

Figure 9: We attempted to fix the gear of our DC motor.

Additionally, one of our feathers stopped working on us while we were attempting to control the DC motors; we suspect this was a power management issue we had not known could happen. After a day’s worth of research, a feather passing away, and attempts at controlling the DC motors with our feather microcontrollers, we decided to switch to servo motors instead.

One of our feather microcontrollers stopped working during development

Figure 10: One of our feather microcontrollers stopped working during development.

Unfortunately, this did not solve all of our problems. The servos were installed in a way that left them “flipped”: when executing code that turned them on and off, the wheels would spin in opposite directions. The next issue to tackle was getting the servos to rotate the same way. We finally found a solution that starts the wheels in opposite positions: one starts at 360 degrees and goes to 0, and the other goes from 0 degrees to 360.

Video 2: A clip featuring both of our wheels spinning in the same direction, and pausing between the times sent to the robots.

Setup

Our setup for the microcontrollers was to have three feathers in total: two robots with feathers that would receive the data, and one feather connected to wifi sending the data. We designed the process as follows:

Synchrobots process diagram

Figure 11: Synchrobots process diagram.

We wanted to use randomly generated times so that viewers would anthropomorphize the robots as they watched them. Random times would reduce the chance that viewers engaging with the robots for a long time could pick up on patterns. We were also attracted to this idea because, throughout the project, we had been talking about the character that machines take on through real-life obstacles and variances, which became apparent in our little bots as we constructed them: one wheel was slightly wobbly, and one robot was heavier because of a different battery pack, all small details that would make the real-life outcome slightly different between the two.

Deep in thought with the code and construction!

Figure 12: Deep in thought over the code and the servo circuit construction.

Next we began soldering the circuit, a simple one at that. Our initial plan also included adding blinking LEDs to the robots and possibly robotic chatter from a speaker, as if they were talking to one another while roving. Near the end of the process we realized that this may have been too much for one project, and decided to keep it for further iteration at another time.

Soldering the circuit, picky, fun, detail work.

Figure 13: Soldering the circuit; picky, fun, detail work.

Fabrication 

Once we had our components soldered, we discussed the casing for the microcontrollers. The debate amongst the team was how to keep the robots looking industrial while adding features that elevated the anthropomorphized qualities we wanted. We thought that a full casing hiding the wires and microcontrollers would lean too far toward the connotation of critters, while leaving the bare-bones hardware exposed would look unintentional.

We had some spare copper tubing and LED fairy lights around the studio, and experimented to see how adding more industrial material to the robots would change the experience of the machines roving around the room. We placed the copper tubing in the front with LED lights and found that it resembled an eye: the perfect middle ground between robot and critter. To complement the front, we encased some of the wires in copper tubing at the back.

Assembling the robots with the copper tubing as casing for the wires, and as an "eye" in the front

Figure 14: Assembling the robots with the copper tubing as casing for the wires, and as an “eye” in the front

We had two types of LED lights, one “cool” strip and one “warm” strip. To create small differences between the two robots, we decided that one robot would house the cool strip and the other the warm strip.

Experimenting with the LED lights and copper tubing on the front of the robot

Figure 15: Experimenting with the LED lights and copper tubing on the front of the robot

Final

Hardware

  • 3 × Adafruit Feather ESP32
  • 2 × Feetech FT-MC-001 kits (chassis)
  • 4 × Servo motors
  • 2 × Protoboards
  • 2 × Rechargeable battery packs

Figure 16: Synchrobots ready and waiting.

GO TEAM!

Figure 17: Team of Amreen, Olivia and Lauren holding both finished Synchrobots.

Project Context

A work that provided larger context for this project is Lichtsuchende: Exploring the Emergence of a Cybernetic Society by David Murray-Rust and Rocio von Jungenfeld, a project that created a community of robotic beings that interact through light. The project examined how we as humans can design for other beings, namely, machines. It is a beautiful project that went through many iterations, with the creators learning from the robots’ behaviours and group patterns as they reacted in the space.

Lichtsuchende: the robot community in action

Figure 18: Lichtsuchende: the robot community in action

Final Thoughts

Demoing our Synchrobots was a success. They seemed as if they were dancing at some points, they roved around separately, and they even crashed into people and walls. It was a wonderful display of humans and machines interacting. People were delighted by them; some didn’t know how to react. It was similar to watching people interact with a baby: some were overly cautious and timid about the interactions, while others actively and playfully engaged with the bots.

We received overall positive feedback and great advice on how we could carry our project forward. After the demonstration of our bots, Kate (Hartman) suggested that we could mount a GoPro camera on the wall and have it observe the bots as they move about a room until the battery runs out. This is something we might like to pursue to track the patterns of our bots through time and space.

As we saw from the reactions of our classmates, the movement and the meeting of the bots was a real cause for delight. One suggested direction was to look at LEGO Mindstorms, small bots from LEGO that pair LEGO hardware with preinstalled software. Nick Puckett suggested dissecting electric toothbrushes for their vibration motors, which could lead to creating small “dumb bots”. There were also suggestions to attach a pen or a marker to the bots as they moved around the room, as a look into bot art. This idea of letting the bots create work through movement was interesting, but we had looked at a similar project while researching and decided against it because there were already many such projects; we wanted the bots’ movement to flow freely without a purpose attached to it.

If we do take this project forward, the feedback we will implement next is thinking about interactions between the machines. During this project we explored the concept of interaction using LED strips; we got the LEDs working, but we had not coded how the LEDs would react when the bots interacted. This point would be the most crucial in the further development of the project.

References

  • Šabanović, Selma, and Wan-Ling Chang. “Socializing Robots: Constructing Robotic Sociality in the Design and Use of the Assistive Robot PARO.” AI & Society, vol. 31, no. 4, 2015, pp. 537–551, doi:10.1007/s00146-015-0636-1.
  • Murray-Rust, Dave, and Rocio von Jungenfeld. “Thinking through Robotic Imaginaries.” figshare, 20 Mar. 2017, figshare.com/articles/Thinking_through_robotic_imaginaries/4746973/1. Accessed 26 Nov. 2018.
  • DeVito, James. “Bluefruit LE Feather Robot Rover.” Adafruit Learning System, 2016, learn.adafruit.com/bluefruit-feather-robot.
  • Gagnon, Kevin. “Control Servo Power with a Transistor.” Arduino Project Hub, 2016, create.arduino.cc/projecthub/GadgetsToGrow/control-servo-power-with-a-transistor-3adce3.
  • McComb, Gordon. “Ways to Move Your Robot.” Servo Magazine, 2014, www.servomagazine.com/magazine/article/May2014_McComb.
  • PanosA6. “Start-Stop DC Motor Control with Arduino.” Instructables, 21 Sept. 2017, www.instructables.com/id/Start-Stop-Dc-Motor-Control-With-Arduino/.
  • Schwartz, M. “Build an ESP8266 Mobile Robot.” Adafruit Learning System, 2016, learn.adafruit.com/build-an-esp8266-mobile-robot/configuring-the-robot.