Project 01 – Apeiron

Description

My project consists of a circular light box that combines video, light, sound and interactivity. The video is displayed on an Android tablet placed behind a translucent acrylic panel. Strong LED lights dim in and out to create a constantly but slowly changing glow emanating from the box. An infrared distance sensor detects when someone approaches the box from above (the box lies flat on a table) and accelerates the LED dimming cycles in relation to the viewer's distance. At the same time, the video also changes according to the distance detected by the sensor. When the viewer comes close enough, vibration motors inside the box are triggered, creating a loud metallic noise. The wires inside the box were purposely left longer than needed so that they cast shadows on the panel resembling an organic creature's circulatory system.

Apeiron is a Greek word meaning "infinite". It was used by the ancient Greek philosopher Anaximander to name an eternal and boundless reality from which everything comes. This piece is a reflection on the interactions between technology and our biological nature. The video was made by combining and manipulating footage of microscopic life forms, the human circulatory system and images from space, mostly from the Mars HiRISE database and the early Moon expeditions. The images that appear when the user approaches were made by mixing images of the human skull with images of dump sites showing different kinds of waste: metal, wood, plastic and electronics.

Apeiron is an object left behind by an alien species that becomes a window to our subconscious and essence as techno-organic beings.

The ideal presentation of this piece would be in a small-to-medium (minimum 4×4 meters) dark room with clean white walls (a gallery white box) where the piece is the only source of light (if needed, additional soft indirect lighting can be added near floor level to illuminate the ground). The piece would be embedded into a short cylindrical plinth (about 1.4 m tall) or mounted on an S-shaped metallic structure, as shown in the mockups below (made using Blender, click images for full view):

apeiron_setup04

apeiron_setupB02

apeiron_setupB01

This is the full video used in this piece:

These are the skull images (click images for full view):

img0 img1 img2 img3

 

Creation Process Journal

Day 0

This project is part of the larger project “Space Station Haunted House” for the Creation and Computation class at OCADU (Master’s in Digital Futures). Shortly after the theme was decided, I had the idea of someone exploring an unknown, abandoned space station where they would come into contact with some kind of bio-cyber presence via an object or machine. I first thought of a lab setup where the explorer would find containers holding unknown small bio-mechanical creatures that react in some way to a person’s presence.

Day 1

To get some inspiration I went to Active Surplus in Toronto. Simply browsing its wide variety of unexpected objects has served me before as a way of getting ideas on how to solve a problem. After looking at some of the lab equipment they carry, I came across a speaker enclosure that sparked the idea of using it as some kind of viewport into something.

20141001_110334

I imagined it as a window to some other dimension where the apparent size of the object doesn’t represent the infinity it contains. It brought to my mind the short story “The Aleph” by Jorge Luis Borges:

“… The Aleph? I repeated.

Yes, the only place on earth where all places are — seen from every angle, each standing clear, without any confusion or blending. I kept the discovery to myself and went back every chance I got. …”

Day 2

After struggling for some time with the final concept, this day I got a sudden idea: I could use the speaker enclosure and cover it with a translucent acrylic (Plexiglass) panel to create a sort of glowing light box. This box could also house a video display showing images that change according to the interaction of someone looking at it. For the video display I could use a tablet, and to control the lighting and monitor the sensors I first thought of using an IOIO board, which I had found very stable and easy to use in other projects. Later I thought it would be interesting to find a way to communicate with an Arduino Uno via a USB serial connection (using the tablet’s USB host capabilities). Searching around I came across a library named usb-serial-for-android that allows an Android app to communicate in this way.

Day 3

Today I sourced all the materials required to build my idea:

  • Speaker enclosure and translucent acrylic panel: The panel was hard to find at first since the local hardware stores only had transparent acrylic. I thought I could buy a transparent one and use sandpaper to make it opaque. I’ve also read that acetone can be used for this purpose since it melts away the material, but some people report mixed results with this. In the end I found out that OCADU has a plastics workshop that sells materials, so I was able to get it there. The next challenge was to get it cut precisely in a circle, but fortunately I was able to get it done at OCADU’s Rapid Prototyping Centre, where they have laser cutters. I just had to create an Illustrator file with the paths to be cut and it was done in a few minutes.

20141003_115330

  • Electronic components: An infrared distance sensor (SHARP GP2Y0A21YK), two powerful 12v LED arrays (4 LEDs each) in red and green, two 5v vibration motors, an Arduino Uno and a Nexus 7 Android tablet. I decided to feed the Arduino from a 12v DC power supply and power the LEDs from the Arduino’s Vin pin. Since I didn’t have the full power specs of the vibration motors, I decided to power them independently to avoid burning the Arduino, so I needed a 5v regulator connected to the 12v supply. To control the LEDs I use a ULN2003 transistor array IC (easy to configure and cheap), an L78S05 voltage regulator, and 2N2222 NPN transistors to control the motors, plus the extra capacitors and resistors needed to interface everything together (view schematic below).

Day 4

This day I worked on getting the electronic circuit working and figuring out the layout. I also programmed the Arduino (see code section below).

20141001_212755 20141003_095019 20141003_102034 20141003_104325 20141003_130921 20141003_115303

Day 5

Today I worked on programming the Android app for playing video, overlaying images on the video and communicating with the Arduino. More details about the app are in the Code section below.

Day 6

Created the visual materials (video/images) and installed the electronics and the stands for the tablet. I attached the vibration motors to small zip ties using hot glue to give them some rigidity while still allowing them to bounce. A hole was cut in the rim of the enclosure to make room for the distance sensor, matching the laser-cut window on the acrylic panel.

20141005_172312 20141005_182947 20141005_183753 20141006_101243 20141008_124509

The finished project:

20141008_124551

 

 

Circuit diagram

(click image for full view)

apeiron_schematic

 

The main sections of the circuit are:

  • Arduino Uno: It probably would have been easier to use an IOIO board, as mentioned in the journal, since communication with the Android app would have been simpler. I decided to stick with the Uno because I wanted to explore USB serial interfacing with Arduinos.
  • 5v voltage regulator: This is needed to power the vibration motors without risking drawing too much current from the Arduino’s 5v out. 12v is still within the accepted input range for powering the Arduino itself, so I decided to use a single power supply. The C3 and C4 capacitors are needed to stabilize the power into and out of the regulator.
  • Vibration motors and controller: I used Arduino digital outputs connected to the 2N2222 transistors to switch the supply power to the motors. The R1 and R2 resistors limit the current flowing, and the D1 and D2 diodes, in parallel with the motors, protect the transistors and the Arduino from any back-EMF from the motors (as with any motor or solenoid).
  • LEDs and controller: The LED modules consist of two arrays of 4 LEDs each (one red and one green) requiring 12v, supplied via the Vin pin on the Uno. To switch the power to the LEDs using PWM, I decided to use the ULN2003 transistor array IC in case I wanted to add more LEDs in the future, and also because it is a simple, inexpensive solution that allows PWM switching from the Uno’s digital pins.
  • IR distance sensor: This one is very straightforward to connect. It requires 5v and its power consumption is low enough to use the 5v out from the Arduino.

See the pictures above (Day 4) for how everything was laid out on the circuit board.

 

Code

The full source code for the Arduino and the Android app is available at GitHub: https://github.com/hectorC/Apeiron

Arduino:

I wanted the LED pulsation, motors and proximity sensor to not interrupt each other, so I needed to use timers instead of delays. For past projects I’ve programmed timers by hand using the millis() function, but for this project I came across a handy library named arduino-softtimer that does this for you and emulates concurrent tasks that can be started, stopped or scheduled. To send the proximity sensor data to the Android app, I fixed the size of the data at 3 digits, preceded by a hash character (#) used for sync.

The LEDs are dimmed in and out constantly at different rates to achieve a more organic and varied mix between the red and green colours. The green LED dims at a fixed rate, whereas the red dims with a random variation applied to the step size. The dimming functions are called via SoftTimer at two different rates: a slow one when there is no interaction with the piece (the distance sensor reading is below the FIRST_LIMIT threshold constant) and a faster one when interaction begins. The LEDs are dimmed using PWM, varying from low to high (0–255) in steps that increase or decrease according to the distance reported by the IR sensor, for example:

void dimLED2(Task * task) {
  // Step the level up or down; the small random offset varies the dimming rate.
  led2Level = led2Level + (led2Dir * (random(0, 2) + led2Step));

  // Reverse direction at the ends of the PWM range (0-255).
  if (led2Level > 255) {
    led2Dir = -1;
    led2Level = 254;
  }
  if (led2Level < 0) {
    led2Dir = 1;
    led2Level = 1;
  }

  analogWrite(led2Pin, led2Level);
}

For the vibration motors I also added a random factor to make the sound more organic but in this case I randomized the on/off state:


if (motorState) {
  // Randomly toggle each motor on or off to roughen the sound.
  if ((int)random(0, 2) == 0) {
    digitalWrite(motor1pin, LOW);
  } else {
    digitalWrite(motor1pin, HIGH);
  }

  if ((int)random(0, 2) == 0) {
    digitalWrite(motor2pin, LOW);
  } else {
    digitalWrite(motor2pin, HIGH);
  }
} else {
  // Interaction ended: make sure both motors are off.
  digitalWrite(motor1pin, LOW);
  digitalWrite(motor2pin, LOW);
}

The value from the IR sensor is zero-padded to a length of 3 digits and a # character is added at the beginning. I did this to facilitate the communication with the Android app, as explained below.

Android app:

The Android app uses code from the example app included with the usb-serial-for-android library for polling the USB manager, detecting a connected Arduino and establishing the connection. I then created a second Activity that is launched after a successful serial connection. This VideoActivity uses a VideoView and an array of Bitmap objects containing the still images. The video and images are loaded from the internal memory of the tablet (this can be changed in the source code to point to any other files or location). The app enters fullscreen mode and starts playing the video. Each time data arrives from the Arduino, a listener is triggered and the sensor data is parsed. The challenge here was that the data could arrive split in the middle of a message (each message composed of a # symbol plus 3 digits, as explained above), and this split had to be detected and handled by storing the incomplete message and appending it to the next data chunk:

private void updateReceivedData(byte[] data) {
        int position;
        String message;
        String value;

        try {
            // Prepend whatever was left over from the previous chunk.
            message = remaining + new String(data, "UTF-8");

            // Consume every complete "#ddd" message in the buffer.
            while ((position = message.indexOf("#")) != -1) {
                if (message.length() - position >= 4) {
                    value = message.substring(position + 1, position + 4);
                    controlVideo(Integer.valueOf(value));
                    message = message.substring(position + 4);
                    remaining = "";
                } else {
                    // Incomplete message: keep it for the next data chunk.
                    remaining = message.substring(position);
                    message = "";
                }
            }

        } catch (UnsupportedEncodingException e) {
            e.printStackTrace();
        }
}

The sensor values are checked against the same threshold levels as on the Arduino, using the same two constants: FIRST_LIMIT for when the viewer first approaches and SECOND_LIMIT for when the viewer gets close enough to trigger the vibration motors and blink the LEDs at a fast rate. When the values go above FIRST_LIMIT, the image of the skull is composited over the video with variable transparency and scaling that follow the increase or decrease of the IR sensor value (distance changes). The skull images alternate randomly at a rate that also varies randomly, but with a non-uniform distribution (again, to create a more “organic” variation):

private void controlVideo(int distance) {

        textView.setText(String.valueOf(distance));

        if (distance > FIRST_LIMIT) {

            imageView.setAlpha(linLin(distance, FIRST_LIMIT, SECOND_LIMIT, 0f, 1f));
            imageView.setScaleX(linExp(distance, FIRST_LIMIT, SECOND_LIMIT, 1f, 2f));
            imageView.setScaleY(linExp(distance, FIRST_LIMIT, SECOND_LIMIT, 1f, 2f));

            int random1 = getRandom(1,10);
            int random2 = getRandom(1, 4) - 1;

            if (random1 > 6) {
                imageView.setImageBitmap(bitmaps[random2]);
            }

        } else {
            imageView.setAlpha(0f);
        }
 }

Context

My piece could be contextualized within the area of interactive video art. An example of this type of work is the piece by Canadian artist Marie Chouinard titled Cantique 3, which I had the fortune to see live in Montreal (link to video and description of the work). Although obviously much more complex, that piece illustrates the direction I would like to take when developing further projects along the same lines as Apeiron.

 

Conclusions

I really enjoyed making this project. Until now I’ve been working as an independent sound artist/electroacoustic music composer, sometimes collaborating with visual artists as a system designer/programmer, but I’ve always been interested in producing my own installation art pieces. I took this project as an opportunity to start my explorations in this area, and I feel it has been a good first small step. Having to stick to a deadline also taught me to brainstorm concepts for this kind of project, settle on one and take it to completion. Building a physical piece also got me started with fabrication, sourcing materials and figuring out how to process them. I’m looking forward to future projects!