Experiment 2 final presentation Google slide link

Link: Google slide document.

In the slide presentations, each student presents their own rule and its expression in p5.js, Processing desktop, or a non-programming-based form.

Students will also select a rule from one of their fellow artists in the current class or the previous class, with at least one expression for each.

Experiment 4 – Script redux

Nik Szafranek

I ended up trying to push the code further for Experiment 2. The code still doesn’t work, but I’m getting somewhere. I was particularly drawn to this as I’ve always been fascinated by writing and linguistics. I really enjoyed the challenge despite not getting it working in the end, and I feel I’ve become more comfortable with code. I hope to explore further.

Here is the link to the code:


I have gotten it to recognize chunks of text and split them into component characters, but I’m still struggling with concatenating and displaying them.

Jin Zhang(3161758)
Siyue Liang(3165618)

Documentation: Sound Interactive Installation


Project Description
For this project, our group decided to experiment with the capacitive sensor. CapacitiveSensor is an Arduino library that turns any conductive material into a sensor that senses proximity. Our original idea was to make an installation with metal wire that people could put their hands close to in order to control the amplitude and speed of the audio. We didn’t achieve it in the end because the sensor didn’t work the way we thought it would. We ended up having to touch the metal with our hands in order to make the values change, which was not what we aimed for.
Inspiration & Related Works
In the beginning, we were inspired by the wire loop game, in which the player needs to guide a loop wand along a metal wire maze without touching the wire. The cool thing about this well-known game is the metal wire: it can be bent into any shape you want and plays the role of a sensor. Based on this game, we thought the idea of using metal wire as a sensor to control sound/visual effects would be super cool. (Here is the link to a simple wire loop game that we found online: https://www.instructables.com/id/Wire-Loop-Game-Tutorial/)

We did some research online and found lots of cool metal artworks. We wanted to build cool metal sculptures and connect each of them to a different piano note.


Building Process

  • Code For Arduino

#include <CapacitiveSensor.h>

/*
 * https://forum.arduino.cc/index.php?topic=188022.0
 * CapacitiveSense Library Demo Sketch
 * Paul Badger 2008
 * Uses a high value resistor e.g. 10M between send pin and receive pin
 * Resistor affects sensitivity; experiment with values, 50K – 50M. Larger resistor values yield larger sensor values.
 * Receive pin is the sensor pin – try different amounts of foil/metal on this pin
 */

CapacitiveSensor cs_4_2 = CapacitiveSensor(4, 2); // 10M resistor between pins 4 & 2, pin 2 is sensor pin, add a wire and/or foil if desired
CapacitiveSensor cs_8_7 = CapacitiveSensor(8, 7);
CapacitiveSensor cs_7_6 = CapacitiveSensor(7, 6);
CapacitiveSensor cs_9_8 = CapacitiveSensor(9, 8);
CapacitiveSensor cs_11_10 = CapacitiveSensor(11, 10);
CapacitiveSensor cs_13_12 = CapacitiveSensor(13, 12);

void setup() {
  cs_4_2.set_CS_AutocaL_Millis(0xFFFFFFFF); // turn off autocalibrate on channel 1 – just as an example
  /* cs_8_7.set_CS_AutocaL_Millis(0xFFFFFFFF);
  cs_13_12.set_CS_AutocaL_Millis(0xFFFFFFFF); */
  Serial.begin(9600); // open the serial port that Processing reads from
}

void loop() {
  long start = millis();
  long total1 = cs_4_2.capacitiveSensor(50);
  /* long total2 = cs_8_7.capacitiveSensor(50);
  long total3 = cs_7_6.capacitiveSensor(50);
  long total4 = cs_9_8.capacitiveSensor(50);
  long total5 = cs_11_10.capacitiveSensor(50);
  long total6 = cs_13_12.capacitiveSensor(50); */

  // tab character for debug window spacing

  Serial.println(total1); // print sensor output 1
  /* Serial.println(total2);
  Serial.println(total6); */

  delay(30); // arbitrary delay to limit data to serial port
}

  • Code For Processing
import processing.serial.*;
import processing.sound.*;

Serial myPort;

int sensor1;

// Declare the processing sound variables
SoundFile sound;
Amplitude rms;

// Declare a scaling factor
float scale = 5;

// Declare a smooth factor
float smooth_factor = 0.25;

// Used for smoothing
float sum;

public void setup() {
  size(640, 360, P3D); // added: box() needs the P3D renderer

  String portName = "/dev/cu.usbmodem141301";
  myPort = new Serial(this, portName, 9600);

  // Load and play a soundfile and loop it
  sound = new SoundFile(this, "sound2.mp3");
  sound.loop();

  // Create and patch the rms tracker
  rms = new Amplitude(this);
  rms.input(sound);
}

public void draw() {
  // Set background color, noStroke and fill color
  // (example values – the originals were lost from this fragment)
  background(0);
  noStroke();
  fill(255);

  //println("sensor1: " + sensor1);

  sound.rate(map(sensor1 * 0.05, 0, width, 1, 4.0));
  //sound.amp(map(mouseY, 0, width, 0.2, 1.0));

  // smooth the rms data by smoothing factor
  sum += (rms.analyze() - sum) * smooth_factor;

  // rms.analyze() returns a value between 0 and 1. It's
  // scaled to height/2 and then multiplied by a scale factor
  float rms_scaled = sum * (height / 2) * scale;

  //if (sensor1 > 1000) {
  //  rotateY(PI / 3 + sensor1 * PI);
  //}

  translate(width / 2, height / 2); // added: centre the boxes on the canvas
  for (int size = 10; size < 400; size += 50) {
    box(size, size, rms_scaled);
  }
}

void serialEvent(Serial myPort) {
  //yPos = myPort.read(); // like serial.write in arduino
  // read the serial buffer:
  String myString = myPort.readStringUntil('\n');

  if (myString != null) {
    // println(myString);
    myString = trim(myString);

    // split the string at the commas
    // and convert the sections into integers:
    int sensors[] = int(split(myString, ','));
    for (int sensorNum = 0; sensorNum <= 0; sensorNum++) {
      print("Sensor " + sensorNum + ": " + sensors[sensorNum] + "\t");
    }
    if (sensors[0] != 0) {
      sensor1 = sensors[0];
    }
    /* sensor2 = sensors[1];
    sensor3 = sensors[2]; */
  }
}
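The serial parsing and the smoothing line in the sketch above boil down to two small pieces of logic: splitting a comma-separated serial line into integers, and an exponential moving average. Here is an illustrative Python sketch of the same logic (not part of the project's code):

```python
# Illustrative sketch of the Processing sketch's serialEvent parsing and
# exponential smoothing, re-stated as plain Python functions.

def parse_serial_line(line):
    """Split a comma-separated serial line like '123,456\n' into ints."""
    line = line.strip()
    if not line:
        return []
    return [int(v) for v in line.split(",")]

def smooth(previous, new_value, factor=0.25):
    """Exponential moving average: nudge `previous` toward `new_value`."""
    return previous + (new_value - previous) * factor

sensor1 = 0
reading = parse_serial_line("1024\n")  # one line from the Arduino
if reading and reading[0] != 0:
    sensor1 = reading[0]

level = 0.0
for sample in [1.0, 1.0, 1.0]:  # pretend rms.analyze() returned 1.0 three times
    level = smooth(level, sample)
print(sensor1, round(level, 4))  # -> 1024 0.5781
```

The smoothing factor controls how quickly the displayed amplitude catches up to the real one; 0.25 means each frame closes a quarter of the remaining gap.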



Features & Goals
Our goal was to make a sound interactive installation where the speed and amplitude are controlled by the value of the capacitive sensor.

In terms of the code, we were planning on using 4 to 6 capacitive sensors for the installation. However, after wiring and connecting the 6 sensors, they all stopped working for some reason. So we took one out at a time and checked to see if it worked, but it wouldn’t work until there was only one sensor left in the circuit. We also wanted to have the sensor control some LEDs apart from the visuals, but when we added the output pin for the LED, the serial port would stop writing values to Processing, which was really frustrating to see.

In terms of the physical build, when we noticed that the metal wire wasn’t reacting to proximity, we tried using some other materials to replace it. We were not sure if the conductivity of the metal was affecting the effectiveness of the code. So we put conductive paint on paper and used it as the capacitive sensor, but the same thing happened.
So we switched back to using metal wires. We spent a lot of time considering the form of the installation. We thought it would be better to make the visual and the installation somehow relate to each other, hence the music notes. A cube shape was the other option, but we decided to go with the notes eventually.





The Team

Salisa Jatweerapong, 3161327

Melissa Roberts, 3161139

Mahnoor Shahid, 3162358

Samantha Sylvester, 3165592


About AerForge

AerForge is a project about change, and how the different spaces in which we create change the creation itself. The entity with which the user interacts transitions through realms and dimensions of creation, exploring and connecting the environments we learned in class.

To begin with, the AerForge experience is intangible. The user draws an imaginary line into the air in front of them, and out of thin air comes a visual. Projected on a screen in front of the user is the line they began to draw, and as the user continues to move their hand through nothing, they create something. Thin columns appear on the projection, their height matching the height of the user’s line. With a wave, the user rotates the projected image and out of lines and columns emerges a form. Though empty-handed, the user is able to create and engage with a virtual object. The user places their hands out in front of them, palms up as if asking for something. This gesture cues the virtual object to be downloaded and sent to a 3D printer. The final transformation brings AerForge into the physical world, and into the hands of the user.

Continue reading “AerForge”





Experiment 4: Final Report


Isaak Shingray-Jedig
DIGF 2004-001
December 7, 2018


The inspiration for this project came from past experience in the Digital Futures program in the form of a past project. Accio, the culminating assignment from first year that Roshe Leynes and I worked on, introduced me to computer vision. The project that we developed allowed people to interact relatively easily and naturally with an onscreen render, in large part thanks to the amazing potential of computer vision (in that case through the use of a Kinect). With Experiment 4 I thought to explore the uses of the Kinect further through a drawing experience that would use a large piece of conductive or resistive fabric as a makeshift paper or medium, in combination with a Kinect tracking the tip of a finger. In practice, the Kinect 1 wasn’t well suited to fine movement detection, so instead I looked to video-analysis computer vision and found my answer: colour tracking. Colour tracking allowed for high levels of accuracy without any of the general depth noise that the Kinect was prone to producing, so I settled on it as my method of computer vision.



Users sit in front of a computer running the program. They place a small fabric ring on the index finger of their dominant hand. They hold the space bar to use the colour-tracking calibrator and see the webcam feed, where they click on the coloured band attached to the ring on their finger. They then release space and begin to draw in the same way that one would finger paint or draw in sand.



Features and Components

The components used in Contact were:

  • USB webcam
  • Boom arm desk lamp
  • Large piece  of conductive fabric for desk
  • Small conductive fabric ring for finger with a patch of red electrical tape
  • Wires
  • Arduino

The physical construction of Contact was relatively simple but had to be set up in a very particular way, because the webcam had to have a clear line of sight on the entire pad of fabric, which in turn had to be very brightly lit. In combination with the code, these components came together to produce a responsive touch drawing experience. By far the most difficult part of the development of Contact was the implementation of reliable colour tracking. A few hurdles in that process were understanding how to think of colours in a space relative to X, Y, and Z, as well as understanding that every single pixel on screen had to be checked each frame. In terms of rendering, when I tried to use an array to store line values, the system became very slow and unresponsive. This is due to the heavy load that checking every pixel puts on the computer every frame. For this reason, I put the background in the setup function and made two arrays two places long, so that the program would draw one line at a time without erasing the previous lines.
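The per-pixel colour-tracking pass described above can be sketched as follows. This is an illustrative Python version (an assumption, not the project's actual code): each pixel's RGB value is treated as a point in a 3D space, and the pixel closest to the calibrated target colour wins.

```python
# Illustrative sketch of full-frame colour tracking: every pixel is checked
# each frame, which is exactly the heavy load described in the text.

def color_distance_sq(c1, c2):
    """Squared Euclidean distance between two (r, g, b) triples."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2))

def find_target(pixels, width, target):
    """Scan every pixel; return (x, y) of the best colour match."""
    best_i, best_d = 0, float("inf")
    for i, px in enumerate(pixels):          # the full scan over all pixels
        d = color_distance_sq(px, target)
        if d < best_d:
            best_i, best_d = i, d
    return best_i % width, best_i // width   # 1D index back to 2D coords

# tiny 2x2 "frame": the reddish patch sits at (1, 0)
frame = [(0, 0, 0), (250, 10, 10), (30, 30, 30), (200, 200, 200)]
print(find_target(frame, 2, (255, 0, 0)))  # -> (1, 0)
```

On a real webcam frame this loop runs over hundreds of thousands of pixels per frame, which is why the rendering had to be kept as cheap as possible.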




The goal of the project was to create a responsive and reliable, but above all else natural, drawing experience.  I think that in many ways Contact succeeded, especially in terms of usability and the responsive and reliable operation of the program through computer vision.  In terms of a natural experience, the physical setup of using a camera attached to the back end of a lamp seemed to produce a relatively natural feeling, but after having received feedback from the professors and seeing my peers interact with it, I think that the gestural nature of the experience should be further explored.  I will look to expand upon what exactly makes the experience unique from other forms of touch technology and play on its natural strengths as a system.  To me, the most immediate area for experimentation is to move the experience away from a seated position and into some form of more physical experience.  After having written the word sand in the experience section, I would also like to explore sand in a similar format.





Contextual works

Drawing using Kinect V2



Computer vision for Artists and Designers



The pixel array




The Soundscape Experience

Experiment 4: Final Report

Francisco Samayoa, Tyra D’Costa, Shiloh Light-Barnes
DIGF 2004-001
December 5, 2018


project details

  1. User starts at any location on the map
  2. They put on the headphones
  3. Depending on the position of the head, the headphones will play a particular sound.
  4. If a particular sound is playing then the screen will display an associated image or video
  5. The user must experience all the audio and visual data in order to try and make a connection between the media and a place that exists in the real world before their opponent.
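Step 3 above, choosing a sound based on head orientation, can be sketched by dividing the head's yaw into sectors, one per soundscape. This is a hypothetical Python sketch (the sound names and four-sector split are illustrative assumptions, not the team's actual mapping):

```python
# Hypothetical mapping from head yaw (degrees) to a soundscape sector.
SOUNDS = ["beach", "rain forest", "temple", "city intersection"]

def sound_for_yaw(yaw_degrees):
    """Map a yaw angle onto one of the evenly sized sound sectors."""
    sector = int(yaw_degrees % 360 // (360 / len(SOUNDS)))
    return SOUNDS[sector]

print(sound_for_yaw(45))   # -> beach
print(sound_for_yaw(200))  # -> temple
```

With more soundscapes the sectors simply shrink; the modulo keeps any angle, including negative ones, inside the 0–360 range.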

project inspiration

Our project seeks to encompass elements of sound, sensory perception, and nature in an installation-based artwork. Our process began by discussing the types of sensors we wanted to explore; eventually we decided on the MPU-9250 accelerometer and gyroscope. Our proposed idea is to integrate the sensor into headphones, which can then change the sound and visual experiences of the user. As the user navigates the room and moves their head about, the world in which they are digitally immersed will enhance the real world. Essentially, it is a 4D sound experience. If we had more time we would add visual elements such as 360 video or a 3D Unity-based environment.

Background Work

In the end, we were able to get the MPU sensor working with Arduino so that we receive X, Y, and Z coordinates, after scouring the web for solutions to library-related issues and calibrating the code in order to receive accurate data readings. Additionally, we were able to take these coordinates and map them to a camera in Unity, so that the sensor’s orientation changes the perspective of the viewer. This meant we had pitch, yaw, and roll functioning for the gyroscope component. For the purpose of this experiment we disabled roll, since the user wouldn’t be rolling their head per se. However, there were a few “blind spots” the camera couldn’t pick up, such as the 180-degree mark. The interface was fully functional, for the most part. The main issue was that data overload kept causing Unity to freeze, so our solution was to reset the Arduino.
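The pitch and roll mentioned above can be derived from the accelerometer axes with the standard tilt formulas (yaw additionally needs the gyro or magnetometer). This is an illustrative Python sketch, not the team's calibrated code:

```python
# Illustrative tilt calculation from raw accelerometer axes (gravity only).
import math

def pitch_roll(ax, ay, az):
    """Pitch and roll in degrees from accelerometer readings in g."""
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    return pitch, roll

# sensor flat on the table: gravity entirely on Z, so no tilt
print(pitch_roll(0.0, 0.0, 1.0))  # -> (0.0, 0.0)
```

These formulas only hold while the headphones are not accelerating, which is one reason real IMU code fuses accelerometer and gyroscope data instead of relying on either alone.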

Links To Contextual Work

How to play multiple audio files

Research on Arduino + Gyroscope

Good Link explaining how IMU sensor works

Tutorial for IMU and Unity set up

How to set up MPU and Arduino Library

Unity + Arduino Library

Goals & Features

  1. Engaging UX Design: We want to solidify the conceptual ideas of the soundscape experience to create an immersive UI/UX design. We want this to include elements of storytelling and gamification.
  2. Sound-Sensory Experience: Moving forward we will have to test that the sensor allows smooth transitions between sound files. The media assets will consist of ethereal soundtracks, data visualizations (natural algorithms and patterns), and nature-related images.
  3. Integrate Wearables: We also need to design a way to integrate the sensors into the wearable technology (the headphones).
    The MPU unit was housed in a sewn leather pouch, with velcro attached underneath in order to stick to the top of the headphones. This way, the wires were completely out of sight since they hung from above. For debugging purposes we wanted the MPU unit to be detachable from the headphones. In the end, we were successful.
  4. Discuss concepts of nature, ecosystems and natural algorithms: Lastly, we want to think about how these concepts can work together to create a narrative and gameplay element.
    Using a blindfold we acquired, we were able to gamify the experience. With the blindfold on, the user would have to guess which environment they were placed in. We would randomly select 1 of 9 unique ecosystems, including 2 songs created by Francisco and Shiloh. These include a war scene, temple, beach, rain forest, and city intersection.

Pictures & videos







The Light


Kiana Romeo (3159835) || Dimitra Grovestine (3165616)



When it came to our project, everything we did was based on our original inspiration, and our vision never wavered from it. There was a very specific message we wanted to get across, and in the end the concept of the piece trumped the need for flawless execution. We were inspired by the idea of dying and “going into the light”; something grim and undesirable becoming something beautiful and inviting. Our project aimed to give people the experience of dying and going into the afterlife without it actually happening. We also aimed to let people interpret the afterlife as they saw fit, so as not to make any assumptions. Using visuals and audio associated with this type of scenario, we immersed people in a heavenly world in which they could briefly escape life on earth and ascend into something higher.


Contextual material

https://www.youtube.com/watch?v=axSxCo_uMoI&frags=pl%2Cwn – Don’t Go Into the White Light (philosophical video)

This video goes over philosophical and religious reasons why one should not go into the light when they die. It was an interesting video to watch, as for the most part people believe that the white light is a good thing, meaning you are going up into heaven and towards God. But if this happy light were a trap, going towards it would be a bad thing. This is why, during the presentation of the project, we decided to push people towards the light and then pull them out quickly, in order to give them a chance to decide how they felt about the light.

 https://www.youtube.com/watch?v=hOVdjxtnsH8 – choir of angels singing (audio used in project)

Church choirs can have some of the most beautiful sounds and music coming from them, and after researching our concept, this music was incredibly inspiring. It expanded our concept as we wanted to create a fully immersive experience and while strong visuals and lighting could definitely help create this environment, sounds are very important too which is why we found it necessary to find just the right sounds to use in our project.

https://www.youtube.com/watch?v=lWqHRLjNZbE&frags=pl%2Cwn – Man details what it was like going to heaven after death

Although the credibility of this video was questionable, the idea of heaven in a religious and philosophical sense is all guesses at what it’s really like. Therefore, having this individual’s take on it was important, as it covered some of the beliefs we wanted to include in the video.


Features of the project

Overall, we had set out to create a heavenly atmosphere. We believe that we took steps towards creating this; however, we feel that our installation required more elements to bring the piece together. Obviously, we would have loved for all of our elements to work together cohesively, but after receiving feedback we think we would have added additional elements to the piece. We would have considered an alternate shape of projection, versus the classic rectangular projection. We believe this is necessary to add a unique element that feels like it belongs in the piece, whereas the rectangular projection looks plain and slightly out of place. We would also consider adding smoke and mirrors to create an ambiguous and more interesting space. Not being able to see, and not necessarily knowing what you’re looking at, would help create and add interest to the piece.


Proximity sensors:


The proximity/distance sensors were meant to control all the elements of the installation. We were to use two sensors in the exhibit: one would control the LED light strips that would illuminate as one got closer, as well as the whisper sound effect in the room, while the other would control the brightness of the cloud visuals at the front of the room and the volume of the angelic singing. Unfortunately for us, the quality of the sensors wasn’t the best and they did not work as well as we had wanted.

The room:


Upon finalizing our concept, we had decided we wanted to construct our installation in a critique room, where it would be dark enough to have a great effect. Unfortunately, though, we could only book a meeting room, which was not as dark and was very large. We made it work, but a smaller space may have been a better option.

The lights:









It was a real challenge getting the lights to work even without the sensors. We ultimately fixed it by using a battery pack after realizing the lights needed a 12V power source instead of 5V. This was a crucial part of our process and took a whole class to figure out, but once we did it was more or less smooth sailing.




atelierfinal (Clouds and visuals code)

atelierfinal (Lights and sound code)


Experiment 4 – Snow Day

Brian Nguyen – 3160984

Andrew Ng-Lun – 3164714

Michael Shefer – 3155884

Rosh Leynes – 3164714

Snow Day



Experiment 4 has gone through many developments and alterations when compared to our initial stage of the assignment. Essentially, our inspiration came from the concept of how movements could manipulate an environment. With the use of matter.js, ml5.js, and the Posenet library, we set out to create an interactive installation that tracks an individual’s body movement and builds it into a skeleton capable of interacting with the environment within P5. The environment is set to mimic a snow day, where particles gradually drop to the bottom of the canvas, and the individual is able to interact with the particles’ physics through movements of the arms. The purpose is to provide the experience of playing in the snow via P5. Additionally, the installation promotes interactivity with others, as it is capable of registering more than one individual on the canvas and allowing all participants to interact with the environment.


Related Work

The inspiration for our concept stemmed from an article that introduced us to Posenet and described its capabilities in depth. With a basic understanding of it and its implementation in P5, we then continued to explore and develop the idea of physical body interactivity by looking at particle systems on CodePen before looking into various other libraries. Additionally, some of our group members had previously worked with the webcam and its capability to manipulate particles in P5 via a webcam feed; this previous knowledge allowed us to jump-start our concept development.

Background Related Work



Goals for the Project


Our first goal was to implement Posenet in P5 to register a body with a background particle system on the canvas. This was achieved as pictured above. The basic points of the head, shoulders, and limbs were able to be registered and constructed into a skeleton; furthermore, it managed to capture more than one person. From there we continued to refine the presentation of the project by altering the particles, canvas, and skeleton.


With Posenet and our particle system working well in P5, our next goal was to actually implement the interactivity. While this goal was achieved come presentation day, we did encounter difficulty implementing it. With the body movement tracked and represented as a skeleton in P5, we added squares at the points of the hands that would follow the movement of the arms and interact with the falling snow via physics upon touching it. The boxes weren’t always responsive, especially when they had to follow the movement of multiple people. Additionally, we experimented with which shape would be able to manipulate the snow better and ultimately settled on squares.
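The hand-square interaction described above can be sketched as a simple containment test: a square is centred on each wrist keypoint, and a snow particle reacts when it falls inside it. This is a simplified Python sketch (an assumption, not the actual P5/matter.js code):

```python
# Simplified sketch of the hand-box interaction: an axis-aligned square
# follows each wrist keypoint and tests snow particles against itself.
BOX_SIZE = 40  # assumed size in pixels

def hand_box(keypoint):
    """Axis-aligned square (x1, y1, x2, y2) centred on an (x, y) keypoint."""
    x, y = keypoint
    half = BOX_SIZE / 2
    return (x - half, y - half, x + half, y + half)

def touches(box, particle):
    """True if a snow particle (x, y) lies inside the hand box."""
    x1, y1, x2, y2 = box
    px, py = particle
    return x1 <= px <= x2 and y1 <= py <= y2

box = hand_box((100, 100))
print(touches(box, (110, 95)))   # -> True
print(touches(box, (200, 200)))  # -> False
```

In the real installation the physics engine handles the collision response; this sketch only shows the detection step that decides which particles the hands affect.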

Our final goal came from an issue that we encountered during critique day. In order to register the subject effectively, the body had to be well lit. We managed to achieve this by installing light stands to illuminate the subject. We experimented with different ways of eliminating shadows and with the angles at which the light fell onto the subject. In the end we used two LED studio lights installed alongside the webcam, with a white backdrop, in order to capture the subject’s movement effectively.


Code with References and Comments


Working Demo


Alternative Methods of Input: Synchronization of Music & Gameplay by Angela & Denzel

Alternative Methods of Input: Synchronization of Music & Gameplay

a project by Denzel Arthur & Angela Zhang

Denzel Arthur: Unity Gameplay

For this assignment I wanted to continue exploring Unity, most importantly the developer side of things. I wanted to connect my love for music and video games in order to give meaning to the back-end side of the project. Through this assignment I got to work with the Animator in Unity, scripting with C#, serial port communication between Arduino and Unity, music synchronization within the Unity application, and Ableton in order to create and cut sound clips.


The first part of this project was researching the concept in order to find out how I could replicate it. I immediately found resources from the developers of “140”, a music-based platformer that became the main inspiration for this project. As the game was developed in Unity, the majority of the information provided by the developers aided me in my quest. They had information about frames per second, beats per second, and how to match them in the Unity game engine. Although the code they added to the PDF was no longer active, the explanation itself was enough for me to create a prototype of the concept.
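The core of the FPS/BPM matching idea is converting the track's tempo into seconds per beat and quantizing game events to that grid. Here is an illustrative Python sketch with assumed numbers (not the project's actual tempo or code):

```python
# Illustrative beat-grid math for syncing gameplay to music.

def seconds_per_beat(bpm):
    """Duration of one beat in seconds at a given tempo."""
    return 60.0 / bpm

def nearest_beat(time_s, bpm):
    """Index of the beat closest to a given timestamp."""
    return round(time_s / seconds_per_beat(bpm))

BPM = 120                       # assumed tempo
print(seconds_per_beat(BPM))    # -> 0.5
print(nearest_beat(1.26, BPM))  # -> 3 (the beat at t = 1.5 s)
```

In the game engine, this is the step that decides which frame an enemy move or platform shift should land on so it stays locked to the soundtrack.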

The second part of the project included setting up the Unity scene. This part of the process involved using basic Unity primitive objects to create a basic concept of the game. The primitive objects were used to arbitrarily represent the player object, enemies, and environment. With these basic assets, I was able to implement and program collision systems, death and respawn animations, and other triggers like button events.

The third part of the process was the physical computing. I was initially supposed to work with another classmate to create more elaborate buttons, so instead I created basic prototype buttons from parts found in the Arduino kit. These buttons became the main source of communication between the game and the player during the presentation. A very lackluster physical presentation, but that seems to be a trend in my work here in the Digital Futures program. Nonetheless, after the buttons were created I proceeded to connect the physical buttons, attached to the microcontroller, to the Unity engine. This proved more challenging than it needed to be due to poor documentation on the free methods of connection, but after purchasing the Uduino kit, the connection became a seamless process. This process also included programming the buttons and adjusting the animations, mechanics, scripts, and audio files in order to get a prototype that was playable and had the right amount of difficulty.

The final part of this process was creating a visually appealing product within the game engine by adjusting the virtual materials and shaders within Unity, and swapping out any assets with ones that fit the concept of the game. I still went with primitive shapes in order to achieve a simplistic aesthetic, but certain shapes were modified in order to make the different levels and enemy types seem more diverse.



Angela Zhang: Physical Interface

For Experiment 3, I wanted to use capacitive sensing to make a series of buttons that would control the gameplay in Unity for Denzel’s game. I started with a 9×9” wooden board primed with gesso, which also has a ¾” thick border so that there is a bit of material to protect the electro-galvanized nails that would be nailed in as solder points.

I did a digital painting in Procreate to plan out my design for the buttons.

conceptual drawing - digital drawing on iPad & Procreate

I ended up using the blue triangle design and splitting the yellow design in half to be two triangles that look like left and right arrows, which I intended to use as FRONT and BACK navigation; the blue triangle would be the JUMP button.

process - stencil for design, graphite on tracing paper
tracing paper stencil - shaded w 5B pencil on backside

I traced the design onto some tracing paper with graphite pencil.

On the opposite side of the tracing paper, I used a 5B graphite pencil to shade in the entire shape to make a transferable stencil.

Tracing with a pen, with the shaded side down, I transferred the design onto the gesso board.

process - transferred design onto gesso-ed 9x9" wooden board.

Once the design was transferred, I applied green painter’s tape around the edges so that when I applied the black conductive paint overtop, the edges would be clean. I added three more rectangular buttons for additional functionality. Once I had transferred it, I used a hammer to nail some electro-galvanized nails, about 1.5 cm long, into each of the ‘buttons’ [not really buttons yet, but empty spaces where the buttons should be]. Because the nails were so small, I used a thumb tack to make some of the registration holes for better accuracy.

process – back of gesso board with electro galvanized nails.

I then applied a generous coat of conductive paint by Bare Conductive, mixed with a little bit of water, as the carbon-based paint is a little sticky and hard to work with; water is conductive, so this did not prove to be a problem. After I finished painting the designs with conductive paint, I sealed them with a layer of acrylic varnish spray to prevent the paint from rubbing off when touched. For some of the buttons, I planned to put another layer of acrylic paint on top to see if it was possible to activate the conductive paint with coloured paint overtop, to allow for more than just the black designs I had planned.

process - conductive paint applied and tape taken off
final button - conductive paint, acrylic varnish spray, regular acrylic paint, final coat of acrylic varnish
Final Painting.
back of board – set up

I painted the background with regular acrylic paint to make the board more aesthetically pleasing. With the final layer of acrylic paint and a final coat of varnish, I was ready to test my connections. Using a soldering iron, I soldered wires to each of the connections, then alligator-clipped each of these wires to an electrode on the Touch Board microcontroller.

The LED on the Touch Board lit up when I touched each button, so it was all working. The only thing I noticed was that the connections were very sensitive to touch, so even if the wires in the back were touching one another it would sometimes trigger an electrode it was not supposed to. This can be solved with better cable management and by enclosing the microcontroller inside the back of the board if I want to make it a standalone project.

The original idea was to hook up the board to Unity so that they could replace the tactile buttons that Denzel described using in his half of the documentation. Using the Arduino IDE, I uploaded the following code to the Touch Board [screenshots do not show code in entirety]:

screenshot – Uduino code

screenshot – Uduino code [cont'd]
screenshot – Uduino code [cont'd]
The Uduino code (which bridges the serial connection between Unity and Arduino) uploaded successfully onto the Touch Board's ATmega32U4 chip, the same chip used in the Arduino Leonardo. The problem with the connection, however, was that the conductive paint buttons used capacitive sensing logic rather than digital ON/OFF switch logic, and neither Denzel nor I was proficient enough in C# to change the Unity scripts so that the capacitive touch buttons could trigger movement in the game engine. I looked at a few tutorials on this and watched one about sending analog inputs to Unity, which used a potentiometer as an example. I wasn't sure this would get me what I needed in the time I had, so I ended up settling on another idea and decided to attempt the Unity game controller with Denzel at a later date, when we both had time to dig through the forums.
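The gap we ran into can be illustrated with a small sketch: a capacitive sensor reports a continuous value, so a thresholding step (ideally with hysteresis) is what turns it into the clean ON/OFF state a digital button script expects. The structure and threshold numbers below are assumptions for illustration, not taken from the Touch Board library:

```cpp
// Hypothetical sketch: converting a raw capacitive-sense reading into the
// ON/OFF "digital switch" state that a Unity input script expects.
// Threshold values are made up; on the Touch Board's MPR121 the real
// touch/release thresholds have to be calibrated per electrode.
struct TouchState {
    bool pressed = false;

    // Hysteresis: press only above TOUCH, release only below RELEASE,
    // so a reading hovering near one threshold does not flicker on/off.
    static constexpr int TOUCH   = 40;  // assumed touch threshold
    static constexpr int RELEASE = 20;  // assumed release threshold

    // delta = baseline reading minus current filtered reading
    bool update(int delta) {
        if (!pressed && delta > TOUCH)        pressed = true;
        else if (pressed && delta < RELEASE)  pressed = false;
        return pressed;
    }
};
```

The two-threshold design is the part that matters: with a single cutoff, a hand hovering near the button makes the state chatter, which in a game engine reads as rapid repeated presses.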

I changed the function of the conductive painting to be a MIDI keyboard, as the Touch Board is particularly well suited to use as a MIDI/USB device. I uploaded this code instead to the Touch Board:

Arduino IDE – MIDI interface example sketch from Touch Board Examples Library
Arduino IDE – MIDI interface example sketch from Touch Board Examples Library
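The core of a MIDI interface sketch like this is a simple mapping from electrode number to MIDI note. A minimal sketch of that idea, assuming twelve electrodes spanning one chromatic octave (the base note of C3 is my assumption, not necessarily the example sketch's default):

```cpp
// Hedged sketch of an electrode-to-note mapping for a 12-electrode
// capacitive MIDI controller like the Touch Board. Each electrode is
// one semitone of a chromatic octave starting at an assumed base note.
int electrodeToNote(int electrode) {
    const int BASE_NOTE = 48;      // MIDI note 48 = C3 (assumption)
    return BASE_NOTE + electrode;  // electrodes 0..11 -> C3..B3
}
```

In the real sketch, a touch event on electrode N would send a note-on for `electrodeToNote(N)` over USB-MIDI, and the release would send the matching note-off.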

I then used Ableton Live as a DAW to make the MIDI data produce sound. In Ableton > Preferences > Audio I set the input to the Touch Board, and did the same for the output. I also turned on Track, Sync, and Remote so I could map any parameter within Ableton to my conductive painting, just like any regular MIDI keyboard. I used Omnisphere for a library of sounds I could play with the painting; because the capacitive buttons are analogue, I can map parameters like granular synthesis and pitch bends onto them, as well as trigger tracks, pads in a drum rack or sampler, or any of the Session View clips in Ableton to launch whole loops.

Omnisphere 2.0 VST in Ableton – sound library
Conductive painting inside of Ableton

Even though we did not successfully link Unity and the painting together, I still feel I learned a lot from creating this unusual interface, and I will push the idea further in both Unity and Ableton; I want to use Max for Live to trigger even more parameters in physical space, eventually things like motors.

Physical Interface w/ Digital Feedback: MIDI Data using Unity/Ableton/MaxMSP with Soft Tactile Buttons [Analog and Digital, Capacitive Sensing]

created by Denzel Arthur & Angela Zhang

For this final assignment, we decided to continue working on what we explored in the previous project: controlling a 3D object in a game engine or another digital environment with unorthodox physical input switches. We made a lot of good progress during the initial experiment, and because we have an affinity for music, we decided to keep pursuing it for this project, to solidify a foundation for how we create multimedia works in the future. The initial goal was to find a unique way of visualizing audio and to make it interactive, either through the progression of the music itself (sonic feedback) or through changes in a visual component that corresponds to the music (visual feedback). This led us to experimenting with the Unity game engine, Ableton, and building physical buttons for an unusual combination of hard and soft user inputs.

conductive thread



arduino micro
conductive ribbon

We wanted a variety of tactile experiences on our game-board, including both analog and digital inputs and outputs. Using methods explored in class, we used materials such as conductive thread, conductive fabric, velostat, and some insulating materials such as felt and foam to interface the buttons and the electronic elements. We also wanted to use some elements that would normally be used in regular electronic setups but not necessarily in a soft circuit, such as light cells, infrared sensors, and actual tactile buttons.

schematic for digital button
Angela’s schematic for digital button, [+] and [-] intertwined but not touching, in order to maximize the surface area that can be activated.
schematic for analogue button
Angela’s schematic for analogue button [top down view]
diagram of construction of velostat button
Angela’s schematic diagram: construction of velostat button [layers, component view]

mid production - analogue button, velostat, conductive thread, conductive fabric, felt, foam, embroidery floss.

Our first soft button was an analogue pressure-sensing button made of velostat sandwiched between two pieces of conductive fabric, with three connecting lines of conductive thread sewn into each piece of conductive fabric on either side of the velostat. One side's conductive thread is positive, the other's negative. These are sewn to the edge of the cut square that forms the button, come off the square approximately 2 cm apart, and are eventually sewn into the blue felt that becomes the base of the button. The yellow foam and red felt were added both for aesthetics and for the haptic feedback of a range of pressure, in the hope of allowing a wider range of pressure sensitivity from the velostat. Without the added layers of material the button felt very flat, and there did not seem to be a large margin of input for the user; that matters for an analog input, which is meant to provide a range of numerical data to control some element in Unity, as with the other components of the project.
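The pressure button's output can be sketched in a few lines: the velostat's changing resistance is read as an analog value (on the Arduino Micro this would be `analogRead()`, returning 0–1023) and rescaled into the 0–127 range of a MIDI control change. The raw value is passed in as a parameter here so the scaling can be checked off-board; the pin and range are assumptions:

```cpp
// Sketch: rescaling a velostat pressure reading (0..1023, as from
// Arduino's analogRead) into a 0..127 MIDI CC value. Same scaling as
// Arduino's map(raw, 0, 1023, 0, 127), with clamping for safety.
int pressureToCC(int raw) {
    if (raw < 0)    raw = 0;      // clamp out-of-range readings
    if (raw > 1023) raw = 1023;
    return (raw * 127) / 1023;    // integer rescale to MIDI's 7-bit range
}
```

In practice a noisy velostat reading would also want some smoothing (e.g. averaging a few samples) before being sent as a CC, or the mapped parameter will jitter.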

The completed pressure button on the board, and a light cell beside it. Two bits of silver thread on the button are conductive; one is positive and one is negative, detecting how much the velostat is being pressed.
legs of the components are fed through the board, and twisted to be flush to the back of board so they can be sewn into the prototyping board, also on the back.

a tactile button, three 5mm neopixel LEDs, and the analog pressure button, with some conductive thread connections [front view]
LED connections sewn with conductive thread to the prototyping board [mid prod, back view]

The main idea was to use these buttons to control gameplay within Unity, for a game that Denzel had programmed for Experiment 3. However, both the Arduino Micro and the Touch Board by Bare Conductive that Angela used in Experiment 3 to create the conductive painting [intended as an input for Unity as well, but last used with Ableton] have the ability to send MIDI data. We decided to switch it up and see if we could get a 3D object in Unity to work with Ableton and Max for Live (Max MSP's integration with Ableton), making the object respond in real time to a MIDI signal sent from one of our buttons or sensors. Unfortunately we did not have time to hook up all the different sensors and components, but there is potential to keep adding different analog and digital parameters to the board, and this will be an ongoing experiment for us to see how many different components we can combine.

For the connection of Unity -> Ableton -> Max:

screenshot: GitHub for The Conductor
screenshot – YouTube video describing connection of Unity and Max using The Conductor
screenshot – Ableton 10 Remote Script OSC to MIDI Note forum
screenshot – we looked at some of the possibilities between Ableton and Max using Max for Live
mapping audio parameters (OSC data) from Ableton to Max to visualize changes in audio through colour mapping

The final product is a set of buttons that send a MIDI signal to Ableton, which triggers or changes a sound. The track the sound is on has a script connecting it to Max, which takes the MIDI data it receives and outputs a corresponding behaviour in the 3D object, changing its shape or colours relative to the change happening in the sonic component. In theory, anything that sends a MIDI signal can work with this setup, so it works with soft circuit buttons, conductive paint, or any regular MIDI keyboard; any input device works as long as you can get it to communicate over MIDI. We experimented with other MIDI controllers, such as the OP-1 [by Teenage Engineering] and the conductive painting [using the Touch Board by Bare Conductive] from the previous experiment, which outputs MIDI.
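The reason "anything that sends a MIDI signal" works is that a MIDI note-on message is just three bytes, regardless of the device producing it. A minimal sketch of building that message (the struct and function names are mine, not from any library):

```cpp
#include <cstdint>

// A MIDI note-on message is three bytes: a status byte (0x90 plus the
// channel), a note number, and a velocity. Any device that emits these
// bytes -- soft button, conductive paint, or keyboard -- looks identical
// to Ableton and Max.
struct MidiMsg {
    uint8_t status;
    uint8_t note;
    uint8_t velocity;
};

MidiMsg noteOn(uint8_t channel, uint8_t note, uint8_t velocity) {
    MidiMsg m;
    m.status   = 0x90 | (channel & 0x0F);  // note-on, channels 0..15
    m.note     = note & 0x7F;              // data bytes are 7-bit (0..127)
    m.velocity = velocity & 0x7F;
    return m;
}
```

The matching note-off uses status `0x80` (or a note-on with velocity 0, which many devices send instead); everything downstream in the Ableton/Max chain only ever sees these bytes.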

Works with Touch Board – which outputs MIDI


The Conductor – Initial setup inside Unity
Ableton – Live view; jit.window Max object with Ableton tracks (audio source)
Final set up – jit.window Max object, Max patches and MIDI tracks in Ableton Live View (audio source)


Yay!! Now you can be a VJ 😀