The Golf Project was a challenge to move a ping-pong ball from point A to point B by any means necessary. Each group had to develop a robotic system that would sense the course, its surroundings, and any obstacles, and then react accordingly. We were given course one and course two; course three would remain a secret until the day of the presentation. Below we have included images of all three courses we had to navigate.
To figure out how we were going to build a semi-autonomous car, we started by researching possible solutions that would allow the car to sense its surroundings. Below are some ideas we (Jordan and Manik) considered:
- Multiple sonar sensors that would wrap around the vehicle.
- Infra-red (IR) emitters lining the track with IR sensors on the car ensuring it doesn’t travel outside of the IR bounds.
- Program different routes into the car’s “navigation system” so that the car would know its route: e.g. “go straight for 1 meter, then make a 30 degree left turn, then go straight for 0.8 meters, then stop”.
- Employ an array of IR sensors and receivers to track lines on the floor and use a sonar sensor for initial orientation for starting and stopping.
When we learned that the first challenge would start with the vehicle oriented in an arbitrary direction away from the start direction, we removed possibility 3 as an option. We knew that we needed to program the car to move based on sensors rather than hand-coding its route.
We decided to go with the sonar sensor and the QTR-8A IR (infrared) array.
Once we decided on a strategy, a number of issues needed to be addressed:
- We didn’t have extensive knowledge of motors before this project, so we started by prototyping the motor and navigation system for the vehicle.
- The first step was confirming that we could turn DC motors on and off.
- We followed up by confirming that switching the direction of the current reversed the direction of the motors.
To give us more control over the DC motors we added an L293 motor driver, also known as a dual H-bridge chip. This allowed us to control both the speed of the DC motors and the direction in which they spin.
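Conceptually, each L293 channel is a pair of input pins whose logic levels set the direction of current through the motor. A minimal sketch of that truth table (the pin mapping is illustrative, not our exact wiring):

```cpp
#include <utility>

// Direction requests for one DC motor.
enum class Direction { Stop, Forward, Reverse };

// Logic levels for the two input pins of one L293 channel. Which
// physical spin counts as "forward" depends on how the motor is wired.
std::pair<bool, bool> bridgeInputs(Direction d) {
    switch (d) {
        case Direction::Forward: return {true, false};  // current one way
        case Direction::Reverse: return {false, true};  // current reversed
        default:                 return {false, false}; // both low: motor coasts
    }
}
```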
This let the vehicle make turns at various radii and speeds using PWM. To prototype the vehicle’s behaviour with the motor driver, we engineered a set of motor controls that looped on timers to drive the motors in multiple directions. A potentiometer allowed us to adjust the speed of the motors based on the values it sent. The motor motions we tested were stop, start, left, right, and reverse.
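The potentiometer-to-PWM scaling and the differential wheel speeds behind a given turn radius can be sketched like this (the geometry values are assumptions for illustration, not measurements from our car):

```cpp
// Scale a 10-bit potentiometer reading (0-1023) to an 8-bit PWM duty
// (0-255), the same arithmetic as Arduino's map(pot, 0, 1023, 0, 255).
int potToPwm(int pot) { return static_cast<long>(pot) * 255 / 1023; }

// Differential drive: for a turn of radius r (cm, measured to the
// vehicle centre) at base speed v, the inner and outer wheels must run
// at different speeds. trackWidth is the distance between the wheels.
struct WheelSpeeds { double inner, outer; };
WheelSpeeds turnSpeeds(double v, double r, double trackWidth) {
    return { v * (r - trackWidth / 2.0) / r,
             v * (r + trackWidth / 2.0) / r };
}
```

Feeding the scaled PWM value to each side of the L293 then produces the stop, straight, and turning motions we tested.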
Our next challenge was setting up the IR array, allowing the vehicle to follow a line on the ground to its destination.
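Line following with the QTR-8A boils down to computing a weighted average of the eight reflectance readings and steering toward the centre. A minimal sketch of that calculation, following the same idea as the Pololu QTRSensors library’s readLine():

```cpp
#include <array>

// Weighted average of eight calibrated reflectance readings; higher
// values mean darker, i.e. over the line. Returns a position from
// 0 (line under sensor 0) to 7000 (under sensor 7), or -1 if no
// sensor sees the line.
int linePosition(const std::array<int, 8>& v) {
    long num = 0, den = 0;
    for (int i = 0; i < 8; ++i) {
        num += static_cast<long>(v[i]) * i * 1000;
        den += v[i];
    }
    return den ? static_cast<int>(num / den) : -1;
}
```

The steering error is then the position minus 3500 (the centre of the array); the sign of the error tells the car which pair of wheel speeds to adjust.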
Once we had basic navigation solved, we realized that we didn’t always want the car to start right away once we uploaded a new Arduino sketch or turned on the power. To give us a bit more control we added a start/stop button to the car. This allowed us to calibrate the IR sensors and set the vehicle up for its tests, and it came in handy for preventing runaway projects.
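The start/stop behaviour can be modelled as a small latch that toggles the run state on each button press. A hedged sketch, assuming the button level has already been debounced:

```cpp
// A start/stop latch: each rising edge of the (already debounced)
// button toggles whether the vehicle runs. Names are illustrative.
struct RunLatch {
    bool running = false;
    bool lastButton = false;
    // Feed the current button level; returns the updated run state.
    bool update(bool pressed) {
        if (pressed && !lastButton) running = !running; // rising edge
        lastButton = pressed;
        return running;
    }
};
```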
Once we had the car following the set lines, we moved on to defining a solution that would allow the vehicle to orient itself towards the finish line. Initially, we considered boxing in the vehicle except for an exit door. For visual simplicity, Chris suggested we develop the vehicle so that it “looks” for a pillar parallel to the direction it needs to travel.
We also spent some time thinking about ways to add some flair to the vehicle. We looked at adding a sound system or some other kind of movement. One difficulty was picking something that wouldn’t use too much power and could be done with only the two Arduino pins we had available.
We decided to add a fun tune using the Arduino.cc ToneMelody sketch, but ran into problems with the coding. When the melody “turned over”, its delay halted the car’s movement as well. We (Manik and Alison) tried replacing the “delay” in the melody with alternative code, without success. Unfortunately, we couldn’t resolve this issue before presentation time.
The parts used in manufacturing the car are as follows:
- 1 Arduino Uno
- 1 L293 Motor Driver
- 1 QTR-8A IR array sensor
- 2 DC motors
- 2 wheels
- 1 ball bearing caster for the front wheel
- 1 ½ sized breadboard
- 1 button switch
- 1 sonar sensor
- 1 220k resistor
- 2 9V battery packs
- cardboard for the body
We debugged a couple of issues in getting the QTR-8A array to properly control wheel direction and speed. We needed to ensure enough power was being sent to all of the components.
The solution was to separate the power for the Arduino and sensors and the two DC motors. To get this circuit to work we also had to make sure all of the parts had a common ground.
We also had to do some physical debugging for the orientation of the vehicle in the first challenge. Considerations included how thick the line used by the IR sensor needed to be and at what turn radii the sensor was still able to keep following the line.
The courses that we were given are shown below. We were given them in actual size to be able to test and debug our cars before the final presentation.
While testing we saw that the vehicle often worked, but would occasionally deviate from the line. From debugging and analyzing these situations, we were fairly certain the cause was inconsistent line colour. We had used markers to draw the lines, but the black wasn’t equally dark everywhere and the line edges were not perfectly straight, which affected the sensors. To solve this we switched to black electrical tape for the line. After implementing this solution, the vehicle’s performance was more consistent.
We also came up with ways to try to correct navigation errors by adding additional lines to the course. This worked, to a degree.
The interesting aspect of the error-checking lines is that they are visual representations of issues that could instead have been handled in code. Developers add error checking to their code to help control the end result of its execution. Essentially, we have visual artifacts of these error checks on the course, which we find fascinating.
With our current setup, the car successfully navigates and completes courses one and two. The difficulty came when we moved on to course three.
After going through the first two challenges we were presented with the mystery course three. We had 45 minutes to come up with a solution that would allow our vehicle to navigate this challenge.
The Infamous Mystery Course Three!
In this time, we realized that even though our car was semi-autonomous, we still had some hard-coded values and areas where we had made assumptions based on the known courses, without accounting for a third course. For instance, we hadn’t considered making multiple turns on and off a patterned or textured surface that could prevent the IR array sensor from working correctly.
In the 45 minutes we had to update our car, we made some tweaks so the vehicle could begin handling more complicated courses. We did this by switching the purpose of the sonar sensor: once the car found its orientation, the sonar was repurposed to look for pillars on the course, telling the car to make a required left turn.
Reflecting on how we could improve our vehicle, it’s clear we could improve our logic. Instead of trying to find a line and follow it, ensuring the car never leaves that line, we could reconfigure the logic so the car goes straight until a sensor tells it to change course. This change offers greater flexibility across a variety of scenarios.
If we had a total of four sonar sensors, one on each side of the vehicle, we could use each sensor for a specific task in the navigation process. The vehicle would drive straight as long as the front sonar returned a valid distance, while the left and right sonars looked for physical signifiers indicating which direction the car should turn; once the turn was executed, the car would continue moving forward.
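That four-sonar idea could be expressed as a simple decision function (the sensor layout and the 20 cm threshold are assumptions for illustration, not a built or tuned design):

```cpp
enum class Action { Forward, TurnLeft, TurnRight, Stop };

// Hypothetical four-sonar policy: a side sonar seeing a nearby marker
// triggers a turn in that direction; an obstacle dead ahead stops the
// car; otherwise it keeps driving straight.
Action decide(double frontCm, double leftCm, double rightCm,
              double thresholdCm = 20.0) {
    if (leftCm  < thresholdCm) return Action::TurnLeft;
    if (rightCm < thresholdCm) return Action::TurnRight;
    if (frontCm < thresholdCm) return Action::Stop;
    return Action::Forward;
}
```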
The circuit diagram was made with Fritzing and is exported below.
We have the code for our vehicle on Github under Creation & Computation: Autonomous Vehicle.
Project by: Jordan Shaw, Manik Perera Gunatilleke and Alison Bruce.