All posts by Marcus Gordon

Ruby – The Fauxlographer

Ruby at rest

Project Description
Ruby is a kinetic light installation built around a red 640 nm, 5 mW laser. It is an experiment in what I call “fauxlography”: a holographic-like presence appears when the light travels through the 2″ x 2″ red acrylic pieces, giving them a ghost-like quality at certain angles. The effect is simply coherent light projected onto a translucent surface, which makes the acrylic pieces and the refracted light “seem” holographic in nature when they are “not” holographic at all.
An ultrasonic sensor detects a presence and turns on a servo that sweeps the laser back and forth. The Ruby pieces are sometimes hit by the coherent light at angles that make the acrylic glow and/or cast long- and short-range shadows in constant motion. This coordination is more semi-random than wholly calculated, since the presence of people within 100 cm of the installation affects the motion of the servo, showing up as frequent on/off switching of the servo and laser light.
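For anyone curious about that behaviour, here is a minimal sketch of the logic on an Arduino, assuming an HC-SR04-style ultrasonic sensor, a hobby servo and the laser switched from a digital pin; the pin numbers, distance threshold and sweep speed are placeholders rather than Ruby's exact values.

#include <Servo.h>

const int trigPin = 7;    // ultrasonic trigger (assumed pin)
const int echoPin = 8;    // ultrasonic echo (assumed pin)
const int laserPin = 4;   // transistor driving the laser module (assumed pin)
Servo sweepServo;

int angle = 0;
int stepSize = 2;

long readDistanceCm() {
  // Standard HC-SR04-style ping: 10 us trigger pulse, then time the echo
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 30000);  // 0 if nothing echoes back
  return duration / 58;                           // microseconds to centimetres
}

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(laserPin, OUTPUT);
  sweepServo.attach(9);
}

void loop() {
  long distance = readDistanceCm();
  if (distance > 0 && distance < 100) {   // a presence within the 100 cm environment
    digitalWrite(laserPin, HIGH);         // laser on
    angle += stepSize;                    // sweep the servo back and forth
    if (angle <= 0 || angle >= 180) stepSize = -stepSize;
    sweepServo.write(angle);
  } else {
    digitalWrite(laserPin, LOW);          // no one nearby: laser off, sweep paused
  }
  delay(20);
}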

Video
Before: The Prototype
After: In Motion

Circuit Diagram
Ruby Diagram
Photography
DSC_8307 IMG_0538 IMG_0550

Process
The original intention was to create a kinetic sculpture that moves under normal lighting conditions, but the idea then shifted to moving the light itself, using a laser.
The first step was still to design and build an acrylic sculpture. The acrylic was cut at the plastics shop at 100 McCaul. Building the sculpture with hot glue proved less reliable (and less pleasant to look at) than with resin bond, which I only discovered while working with the Maker Lab. Next, a laser-based structure had to be made. The first iteration involved a laser pointer stripped down and placed into a 35 mm film canister. This worked very well; however, the first laser became faulty after a few days of testing. When the time came to build another, a new laser pointer was stripped down to its circuitry and placed into a Staedtler Designer Series pen casing, which worked much better for alignment and for servo control of the laser. The servo and circuitry were then built into a cardboard box that I spray painted a cadmium red colour.

Context
This installation was inspired by the Erwin Redl series known as “Breath of Light”. Fauxlography is a word I coined while assisting with the academic research “Visualization of Complex Medical Data Using Next-Generation Holographic Techniques” by Michael Page (OCAD University, Canada), presented at the 10th International Symposium on Display Holography in St. Petersburg, Russia. The minimal setup of Ruby acted as an additional exaggeration of the tendency to present video projection, Pepper’s Ghost and other lighting effects as examples of holography.

RedBot – Autonomous Robot Vehicle

RedBot

As a Graduate Research Project at OCAD U, we were challenged to make an autonomous robot using the Arduino Uno microcontroller. We built RedBot as an introductory concept for Red Bull events, one that could set up a new kind of adventure with these little autonomous monsters.


TEAM

Nimrah Syed
Ling Ding
Jason Tseng
Marcus Gordon



THE CHALLENGE

To create an autonomous device that follows a preset path to reach its destination carrying a ping pong ball. The challenge included three different courses listed below:

Course A

img58

Course B

img62

Course C

img66

No details for Course C were released until presentation day. We were given 30 minutes to figure out the third course once the criteria were given, and in the end it looked a little like this:

img103

STARTING IDEAS

LEGO LIFT

Create a new path above the lift. When the ping pong ball is placed on the path, the first lift (and the tree) goes up so that the ball can travel to point B automatically. When the ball arrives at the position of the second lift, that lift goes down so that gravity keeps the ball moving. Following the challenge requirements, the lifts would create the specific path needed to move the ball.

SPHERO

img79
Inspired by Sphero, we considered altering the ball by adding features to it. The main way to add those features was Orb Basic, the Sphero’s BASIC-like language. The features included an accelerometer, a gyroscope, Bluetooth and, most importantly, the Sphero’s spherical nature. The Sphero was to be the main engine behind a chariot that would carry the ping pong ball.

CONDUCTIVE PATH

img82 img84
The idea was to use a conductive material as the path for the ping pong ball. This was beneficial because the fabric could double as the layout of the course design itself.


STRATEGY

The RedBot modus operandi
The inspiration was to make an autonomous robotic car powered by Red Bull. Hence, the name RedBot!

Motivated by Red Bull events, a car was a natural thing to associate with the brand. RedBot, as much an energy drink as a fast car in spirit, was designed to detect a path and identify the objects that helped it navigate its given course.

ELECTRONIC PARTS

(2) N-Channel MOSFETs
(2) 5V DC Motors
(1) Arduino Uno
(1) Arduino Prototype Shield v.5
(2) HC-SR04 Ultrasonic Sensors

PROTOTYPING LIFECYCLE

img86 img88

img91 img99

CIRCUIT DIAGRAM

RedBot Final - Sketch_bb

TECHNICAL OVERVIEW

Each course has its own program specifically designed for it. It would be possible to combine all three programs into one, but in their current state they are separate. Note that all turns are performed with one wheel moving and the other stationary. Turn durations are static, set within the programs themselves. Autonomy comes from the proximity sensors, which are used to detect when to perform the pre-timed turns. The exact timing depends on the desired angle and turning speed, which in turn depend on battery power, motor specifications, the balance of the device, wheel traction, and so on.
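As a rough illustration of that pattern (and not the actual programs linked below), here is a minimal sketch of the shared pieces: reading the HC-SR04 sensors, inching forward in short bursts, and pivoting on one stationary wheel for a fixed time. The pin numbers, wall-distance threshold and turn durations are placeholders, not the team's tuned values; the course-specific sketches further down swap in their own loop().

const int leftMotorPin  = 5;   // N-channel MOSFET gate, left 5V DC motor (assumed pin)
const int rightMotorPin = 6;   // N-channel MOSFET gate, right 5V DC motor (assumed pin)
const int trigLow = 7, echoLow = 8;     // lower HC-SR04 (assumed pins)
const int trigHigh = 11, echoHigh = 12; // upper HC-SR04 (assumed pins)
const int WALL_CM = 15;                 // "wall detected" threshold (placeholder)
const int TURN_MS_90 = 600;             // pre-timed pivot of roughly 90 degrees (tune on the day)

long readDistanceCm(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 30000);   // 0 if no echo in range
  return duration / 58;                            // microseconds to cm
}

bool wallAhead(int trigPin, int echoPin) {
  long d = readDistanceCm(trigPin, echoPin);
  return d > 0 && d < WALL_CM;
}

void forwardStep() {                    // both wheels on for a short burst
  digitalWrite(leftMotorPin, HIGH);
  digitalWrite(rightMotorPin, HIGH);
  delay(150);
  digitalWrite(leftMotorPin, LOW);
  digitalWrite(rightMotorPin, LOW);
}

void turnLeftFor(int ms) {              // pre-timed pivot: right wheel drives, left wheel stays stopped
  digitalWrite(leftMotorPin, LOW);
  digitalWrite(rightMotorPin, HIGH);
  delay(ms);
  digitalWrite(rightMotorPin, LOW);
}

void turnRightFor(int ms) {             // pre-timed pivot: left wheel drives, right wheel stays stopped
  digitalWrite(rightMotorPin, LOW);
  digitalWrite(leftMotorPin, HIGH);
  delay(ms);
  digitalWrite(leftMotorPin, LOW);
}

void setup() {
  pinMode(leftMotorPin, OUTPUT);
  pinMode(rightMotorPin, OUTPUT);
  pinMode(trigLow, OUTPUT);  pinMode(echoLow, INPUT);
  pinMode(trigHigh, OUTPUT); pinMode(echoHigh, INPUT);
}

// Demo loop only: inch forward until a wall appears, then do one pre-timed turn
void loop() {
  if (wallAhead(trigLow, echoLow)) {
    turnLeftFor(TURN_MS_90);
  } else {
    forwardStep();
  }
  delay(50);
}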

Course A:

There are three movement modes used for course A.

Wall Correction Mode

When the on button is pushed, the RedBot first checks to see if there is a wall in front of it. If a wall is detected the RedBot turns right for a certain amount of time, enough so that the device no longer faces the wall. RedBot then switches to orientation mode. If RedBot does not detect a wall when it is turned on, RedBot switches to orientation mode immediately.

Orientation Mode

RedBot turns right in small steps until it detects a wall (the same wall used in wall correction mode). RedBot then turns left for a certain amount of time, enough so that the device is now aligned with the course. The device then switches to movement mode.

Movement Mode

RedBot moves forward in small steps until it detects a wall in front of it, and then stops. The first wall is placed on the course at a fixed distance from the starting point to standardize RedBot’s alignment. Note that the wheel acting as the centre of the turn (the right wheel) must be placed at the exact centre of the starting point for this alignment standardization to work. The second wall is placed on the course at a fixed distance beyond the goal point, facing RedBot, so that the device stops exactly at the goal when it detects the wall.
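Reusing the helper functions from the sketch above, the three modes could chain together roughly like this; it is a sketch, not the program linked below, and this loop() (plus the mode enum) would replace the generic demo loop.

enum Mode { WALL_CORRECTION, ORIENTATION, MOVEMENT, DONE };
Mode mode = WALL_CORRECTION;

void loop() {
  switch (mode) {
    case WALL_CORRECTION:                      // turn away from a wall seen at start-up
      if (wallAhead(trigLow, echoLow)) turnRightFor(TURN_MS_90);
      mode = ORIENTATION;
      break;
    case ORIENTATION:                          // step right until the reference wall appears...
      if (!wallAhead(trigLow, echoLow)) {
        turnRightFor(60);                      // small step (placeholder duration)
      } else {
        turnLeftFor(TURN_MS_90);               // ...then turn back to align with the course
        mode = MOVEMENT;
      }
      break;
    case MOVEMENT:                             // inch forward until the wall beyond the goal is seen
      if (wallAhead(trigLow, echoLow)) mode = DONE;
      else forwardStep();
      break;
    case DONE:                                 // stopped at the goal
      break;
  }
  delay(50);
}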

[Arduino Code for Course A]

Course B:

When RedBot is turned on it moves forward in steps until it detects the half-wall placed at the corner via its lower proximity sensor. RedBot then turns left for a specific amount of time, enough so that it now faces the goal point. The device continues to move forward until it detects the full wall placed a bit beyond the goal with both its proximity sensors. The device then stops.

Half-wall detection is the code that tells the device when to turn, while full-wall detection is the code that tells it when to stop. Given this logic, it is possible to have RedBot navigate its way around any course, so long as the turn durations are pre-declared within the program. For example, all turn durations could be set to the standardized time needed for a 45-degree turn, with half walls used to navigate the device around a course made up of turns at multiples of 45 degrees.
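In the same spirit (and again reusing the helpers sketched under the technical overview, not the linked program), the Course B logic reduces to roughly this loop():

bool finished = false;

void loop() {
  if (finished) return;                            // full stop at the goal

  bool lowerWall = wallAhead(trigLow, echoLow);    // half walls only reach the lower sensor
  bool upperWall = wallAhead(trigHigh, echoHigh);  // full walls are seen by both sensors

  if (lowerWall && upperWall) {
    finished = true;                               // full wall beyond the goal: stop
  } else if (lowerWall) {
    turnLeftFor(TURN_MS_90);                       // half wall at the corner: pre-timed left turn
  } else {
    forwardStep();
  }
  delay(50);
}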

[Arduino Code for Course B]

Course C:

Due to time constraints, the program for course C is a quick rig of the code for course B. The upper proximity sensor is disabled, and the full stop at the goal is triggered by a variable that counts how many walls the device has encountered.

When RedBot is turned on it moves forward in steps until it detects the half-wall placed at the corner via its lower proximity sensor.  The device then turns left for a specific amount of time, enough so that it now faces the next wall. This bit of code is repeated for the next two angles. By now, the counter will have reached 3, meaning the device has so far encountered 3 walls. When it detects the last wall positioned beyond the goal point, the counter reaches 4, and upon reaching that count RedBot stops all actions.
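The Course C rig of that code, with the upper sensor ignored and a simple wall counter deciding when to stop, might look like this (same caveats and helpers as above):

int wallsSeen = 0;

void loop() {
  if (wallsSeen >= 4) return;                    // the fourth wall sits beyond the goal: stop

  if (wallAhead(trigLow, echoLow)) {             // only the lower sensor is used here
    wallsSeen++;
    if (wallsSeen < 4) turnLeftFor(TURN_MS_90);  // turn at each of the first three walls
  } else {
    forwardStep();
  }
  delay(50);
}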

[Arduino Code for Course C]

 

CONCLUSION

To conclude, RedBot achieved its goal of following a path from point A to point B and bringing its ping pong ball to the finish line. Although not as fast as we originally conceived, our Red Bull-sponsored idea for an autonomous vehicle was a success.

Our minimalistic approach to RedBot’s design demonstrated how a crazy idea can eventually lead to a streamlined engine of simplicity. With sensors limited to sonar and code logic simplified to left/right instructions, RedBot achieved greatness through the speed of thought.

img101

LED Sprite Display

Pedestrian_signal,_Central_Park
Sprite: a computer graphic that may be moved on-screen and otherwise manipulated as a single entity.
My story is that I wanted an 8×8 LED matrix to mimic the pedestrian signal beside street lights that tells people to either walk or stop. To begin replicating this, my process was to test the ability to control all 64 LEDs of the matrix to display a sprite of my choice. Starting simple, I wanted to create two sprite symbols, an “X” and an “O”, to represent the two states this display will show.
Materials List
– Arduino Uno
– 16 pin 8×8 LED Matrix
– Breadboard
– Jumper wires
– 1k ohm resistors (8)
– Lots of patience
LED Sprite Display_schem
Discovery
At first, working with LEDs in a matrix seemed overly complicated and frustrating if your research isn’t thorough.  However, after just a little practice, the matrix became “a little” less complex.  The primary thing to notice is whether you are using a common-cathode or common-anode matrix.

What is the difference between Common Anode and Common Cathode?
 
• Using seven-segment displays as an example, when all the anodes are connected to one point, it is a common-anode display. Common cathode means that all seven cathodes of a 7-segment display are connected together.
 
• To function, a positive voltage should be supplied to the common anode and the common cathode should be grounded.

The next thing I did was test the lights to understand how the diodes were mapped to their pins. My trial and error resulted in one diode burning out when I accidentally missed a resistor on one of the connections. Luckily, the matrix continues to function with the remaining lights.
Long story short, understanding that the matrix works as a system of rows and columns was key, and the datasheet is what provided that understanding. Even so, it still wasn’t quite enough, as there seemed to be inconsistencies when trying to simply isolate one LED in code.
The fact that this matrix can show both red and green may have a bearing on isolating a single light, but throughout my testing I have only been able to show red. Further research suggested I may have been better off adding a shift register to the circuit.
IMG_0426
48 wires later

Code
Screen Shot 2015-10-10 at 6.59.15 PM
/*
 *  Creation & Computation: Daily Devices Workshop
 *  LED Sprite Display
 *
 *  Marcus A. Gordon
 *
 *
 *  ASCII MAPPING OF 8×8 MATRIX
 *  ---------------------------
 *  This ASCII map identifies LED Matrix rows, red/green columns and first pin (1).
 *  Below each row is a red and green column.
 *  Column numbers are the same as the row number above them.
 *
 *  -------------------------------------------
 *  ROW 4 ----|*  *  *  *  *  *  *  *|---- ROW 8
 *  COLRG ----|*  *  *  *  *  *  *  *|---- COLRG
 *  ROW 3 ----|*  *  *  *  *  *  *  *|---- ROW 7
 *  COLRG ----|*  *  *  *  *  *  *  *|---- COLRG
 *  ROW 2 ----|*  *  *  *  *  *  *  *|---- ROW 6
 *  COLRG ----|*  *  *  *  *  *  *  *|---- COLRG
 *  ROW 1 ----|*  *  *  *  *  *  *  *|---- ROW 5
 *  COLRG ----|*  *  *  *  *  *  *  1|---- COLRG
 *  -------------------------------------------
 *
 *  ^ < v
 *  (Pins are read from pin 1: up - left - down)
*/
// Arduino pins connected to the matrix
int rMatrix[] = {2,3,4,5,6,7,8,9}; //rows
int cMatrix[] = {10,11,12,13,14,15,16,17}; //columns
void setup() {
  Serial.begin(9600);         //Open the Serial port for debugging
  for(int i = 0; i <8; i++){  //Set the 16 pins used to control the array as OUTPUTs
    pinMode(rMatrix[i], OUTPUT);
    pinMode(cMatrix[i], OUTPUT);
    digitalWrite(rMatrix[i], HIGH);
    digitalWrite(cMatrix[i], LOW);
  }
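  // Light the test pattern: with this wiring, an LED is lit by driving its row LOW and its column HIGH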
  digitalWrite(rMatrix[5], LOW);
  digitalWrite(rMatrix[4], LOW);
  digitalWrite(rMatrix[0], LOW);
  digitalWrite(cMatrix[1], HIGH);
  digitalWrite(cMatrix[3], HIGH);
  digitalWrite(cMatrix[5], HIGH);
  digitalWrite(cMatrix[6], HIGH);
  digitalWrite(rMatrix[6], LOW);
  digitalWrite(rMatrix[7], LOW);
  //digitalWrite(rMatrix[3], HIGH);
}

void loop() {
  // Nothing to do here yet: the static test pattern is latched once in setup()
}
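To go from that single test pattern to the X and O sprites, one possible next step (a sketch of the idea, not the finished project code) is to store each sprite as eight bytes and scan it one row at a time, keeping the same rMatrix/cMatrix pin arrays and the row-LOW/column-HIGH convention above. The empty loop() would be replaced with a row-scanning one, and the per-LED writes in setup() would come out, leaving only the pinMode/initialization loop. The bitmaps and timing below are assumptions.

// Hypothetical bitmaps: one byte per row, bit 7 = leftmost column
byte spriteX[8] = {
  B10000001, B01000010, B00100100, B00011000,
  B00011000, B00100100, B01000010, B10000001
};
byte spriteO[8] = {
  B00111100, B01000010, B10000001, B10000001,
  B10000001, B10000001, B01000010, B00111100
};

void drawSprite(byte sprite[]) {
  for (int row = 0; row < 8; row++) {
    for (int col = 0; col < 8; col++) {            // set this row's columns from the bitmap
      digitalWrite(cMatrix[col], bitRead(sprite[row], 7 - col) ? HIGH : LOW);
    }
    digitalWrite(rMatrix[row], LOW);               // enable the row (lit = row LOW + column HIGH)
    delayMicroseconds(500);                        // hold briefly; persistence of vision blends the rows
    digitalWrite(rMatrix[row], HIGH);              // disable the row before moving to the next
  }
}

void loop() {
  // Alternate between the X and O sprites every two seconds
  drawSprite((millis() % 4000) < 2000 ? spriteX : spriteO);
}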

Spidey Bird’s Web

So Spidey Bird had a little trouble with its circuitry getting glitched by a soldering iron, but she still looks pretty!

IMG_0422

My approach was to interface Spidey Bird with a web control switch (no pun intended).  The purpose was to show the ability to have Arduino circuits controlled via the web with a little help from Processing.  Here’s how:

1.  First start a web server with PHP.

Screen Shot 2015-10-05 at 9.03.38 AM Screen Shot 2015-10-05 at 9.03.16 AM

2. Then upload a power switch program to Arduino.

Screen Shot 2015-10-05 at 9.04.52 AM
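For readers without the screenshot, a minimal version of such a power-switch program might look like this. It assumes the Processing script forwards a ‘1’ or ‘0’ character over serial and that the switched load sits on a transistor driven from pin 9; both are assumptions, not necessarily what ran on Spidey Bird.

const int switchPin = 9;        // transistor/relay driving the load (assumed pin)

void setup() {
  pinMode(switchPin, OUTPUT);
  digitalWrite(switchPin, LOW); // start with the load off
  Serial.begin(9600);           // the Processing script connects at the same baud rate
}

void loop() {
  if (Serial.available() > 0) {
    char command = Serial.read();
    if (command == '1') digitalWrite(switchPin, HIGH);  // web switch turned on
    if (command == '0') digitalWrite(switchPin, LOW);   // web switch turned off
  }
}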

3. Create a Processing script to listen for web commands from a web site.

Screen Shot 2015-10-05 at 9.06.59 AM

4. Go to http://localhost:8000 for the web interface switch

IMG_0423

The following screenshots show how simple the code that runs on the web server is:

Screen Shot 2015-10-05 at 9.53.24 AM Screen Shot 2015-10-05 at 9.53.09 AM

 

In the end, the goal was to have the web switch simply turn on a fan: turn the switch on, then blow or whistle into Spidey’s microphone to make the fan spin intermittently.

MVI_0419(mini)