As a Graduate Research Project at OCAD U, we were challenged to make an autonomous robot using the Arduino Uno microcontroller. We built RedBot as an introductory idea for Red Bull events that could set up a new adventure with these little autonomous monsters.
The brief: create an autonomous device that follows a preset path to reach its destination while carrying a ping pong ball. The challenge included three different courses, listed below:
No details for ‘Course C’ were released until the presentation day. We were given 30 minutes to figure out the third course after the criteria were given. In the end, it looked a little like this:
Create a new path above the lift. When the ping pong ball is placed on the path, the first lift and the tree rise so the ball can reach point B automatically. When the ball arrives at the second lift, that lift lowers, using gravity to keep the ball moving. Following the brief, this creates the specific path needed to move the ball.
Inspired by Sphero, we considered altering the ball by adding features to it. The main way to add those features was orbBasic, the Sphero’s Basic-derived language. Those features included an accelerometer, a gyroscope, Bluetooth and, most importantly, the Sphero’s spherical nature. The Sphero was supposed to be the main engine behind a chariot that would carry the ping pong ball.
The RedBot modus operandi
The inspiration was to make an autonomous robotic car powered by Red Bull. Hence, the name RedBot!
Motivated by Red Bull events, a car was a natural association. RedBot, as much an homage to the energy drink as an aspiring fast car, was designed to detect a path and identify objects that helped it navigate its given course.
(2) N-Channel MOSFETs
(2) 5V DC Motors
(1) Arduino Uno
(1) Arduino Prototype Shield v.5
(2) HC-SR04 Ultrasonic Sensors
Each course has its own program specifically designed for it. It is possible to combine all three programs into one, but in their current state they are separate. Note that all turns are performed with one wheel moving and the other stationary. Turn durations are static, set within the programs themselves. Autonomy comes from the proximity sensors, which detect when to perform the pre-timed turns. The exact timing depends on the desired angle and turning speed, which in turn depend on battery power, motor specifications, the balance of the device, wheel traction, etc.
There are three movement modes used for course A.
Wall Correction Mode
When the on button is pushed, RedBot first checks whether there is a wall in front of it. If a wall is detected, RedBot turns right for a set amount of time, enough that the device no longer faces the wall, and then switches to orientation mode. If no wall is detected on startup, RedBot switches to orientation mode immediately.
Orientation Mode

RedBot turns right in small steps until it detects a wall (the same wall used for wall correction mode). RedBot then turns left for a certain amount of time, enough so that the device is now aligned with the course. The device then switches to movement mode.
Movement Mode

RedBot moves forward in small steps until it detects a wall in front of it, then stops. The first wall is placed on the course at a fixed distance from the starting point to standardize RedBot’s alignment. Note that the wheel acting as the centre of the turn (the right wheel) must be placed at the exact centre of the starting point for the alignment standardization to work. The second wall is placed at a fixed distance beyond the goal point, facing RedBot, so that the device stops exactly at the goal when it detects the wall.
When RedBot is turned on it moves forward in steps until it detects the half-wall placed at the corner via its lower proximity sensor. RedBot then turns left for a specific amount of time, enough so that it now faces the goal point. The device continues to move forward until it detects the full wall placed a bit beyond the goal with both its proximity sensors. The device then stops.
Half-wall detection is the code that tells the device when to turn, while full-wall detection is the code that tells the device when to stop. Given this logic pattern, it is possible to have RedBot navigate its way around any course, so long as the turn durations are pre-declared within the program. For example, it is possible to set all turn durations to the standardized time needed for a 45-degree turn, and use half walls to navigate the device around a course composed of turns whose angles are multiples of 45 degrees.
Due to time constraints, the program for course C is a quick rig of the course B code. The upper proximity sensor is disabled, and the full stop at the goal is triggered by a variable that counts how many walls the device has encountered.
When RedBot is turned on it moves forward in steps until it detects the half-wall placed at the corner via its lower proximity sensor. The device then turns left for a specific amount of time, enough so that it now faces the next wall. This bit of code is repeated for the next two angles. By now, the counter will have reached 3, meaning the device has so far encountered 3 walls. When it detects the last wall positioned beyond the goal point, the counter reaches 4, and upon reaching that count RedBot stops all actions.
To conclude, RedBot achieved its goal of following a path from point A to point B and bringing its ping pong ball to the finish line. Although not as fast as we originally conceived, our Red Bull-sponsored idea for an autonomous vehicle was a success.
Our minimalistic approach to RedBot’s design demonstrated how a crazy idea can eventually lead to a streamlined engine of simplicity. With sensing limited to sonar and code logic simplified to left/right instructions, RedBot achieved greatness through the speed of thought.
So Spidey Bird had a little trouble with its circuitry getting glitched by a soldering iron, but she still looks pretty!
My approach was to interface Spidey Bird with a web control switch (no pun intended). The purpose was to show the ability to have Arduino circuits controlled via the web with a little help from Processing. Here’s how:
1. First start a web server with PHP.
2. Then upload a power switch program to Arduino.
3. Create a Processing script to listen for web commands from a web site.
4. Go to http://localhost:8000 for the web interface switch.
The following screenshots show how simple the code that runs on the web server is:
In the end, the goal was simple: turn on the fan via the web switch, then blow or whistle into Spidey’s microphone to spin the fan intermittently.