Alternative Methods of Input: Synchronization of Music & Gameplay

a project by Denzel Arthur & Angela Zhang

Denzel Arthur: Unity Gameplay

For this assignment I wanted to continue exploring Unity, most importantly the developer side of things. I wanted to connect my love for music and video games in order to help give meaning to the back-end side of the project. Through this assignment I got to work with the Animator in Unity, scripting in C#, serial port communication between Arduino and Unity, music synchronization within the Unity application, and Ableton, which I used to create and cut sound clips.

 

The first part of this project was based on researching the concept in order to find out how I could replicate it. I immediately found resources from the developers of “140”, a music-based platformer that became the main inspiration for this project. As the game was developed in Unity, the majority of the information provided by the developers aided me in my quest. They had information about frames per second, beats per second, and how to match them in the Unity game engine. Although the code they added to the PDF was no longer active, the explanation itself was enough for me to create a prototype of the concept.
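The core idea from the 140 material, as I understood it, is to drive gameplay off the audio clock rather than frame time: convert the track's tempo into a beat interval and check every frame whether a new beat has started. The sketch below is a minimal illustration of that idea in Unity C#, not the developers' original code; the 120 BPM value, the field names, and the UnityEvent hook are placeholders of my own.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Minimal beat clock: converts BPM into a beat interval and fires an event
// every time the audio clock crosses a new beat. The 120 BPM default is a
// placeholder, not the tempo used in the actual project.
public class BeatClock : MonoBehaviour
{
    public float bpm = 120f;      // beats per minute of the looping track
    public AudioSource music;     // the track everything syncs to
    public UnityEvent onBeat;     // hook animations, spawns, etc. here

    private double secondsPerBeat;
    private double startDspTime;
    private int lastBeat = -1;

    void Start()
    {
        secondsPerBeat = 60.0 / bpm;
        startDspTime = AudioSettings.dspTime;   // audio clock, not frame time
        music.Play();
    }

    void Update()
    {
        // How many whole beats have elapsed on the audio clock?
        double elapsed = AudioSettings.dspTime - startDspTime;
        int currentBeat = (int)(elapsed / secondsPerBeat);

        if (currentBeat != lastBeat)
        {
            lastBeat = currentBeat;
            onBeat.Invoke();   // gameplay reacts on the beat, independent of FPS
        }
    }
}
```

Because the comparison uses AudioSettings.dspTime rather than frame counts, the beats stay locked to the music even when the frame rate fluctuates, which is what makes the frames-per-second and beats-per-second matching work.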

The second part of the project involved setting up the Unity scene. This part of the process used basic Unity primitive objects to create a rough version of the game: the primitives arbitrarily represented the player object, enemies, and the environment. With these basic assets, I was able to program collision systems, death and respawn animations, and other triggers like button events.
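For the collision and respawn triggers, the pattern boils down to Unity's trigger callbacks. The snippet below is a simplified, hypothetical version of that kind of script rather than the exact one from the project; the "Enemy" tag and the respawn point field are placeholder names, and it assumes the enemy colliders are set as triggers and the player has a Rigidbody so that OnTriggerEnter fires.

```csharp
using UnityEngine;

// Simplified death/respawn trigger: when the player touches an object tagged
// "Enemy", snap the player back to a respawn point. Names are placeholders
// for illustration only.
public class PlayerRespawn : MonoBehaviour
{
    public Transform respawnPoint;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Enemy"))
        {
            // In the real project this is also where a death animation and
            // an audio cue would be triggered before moving the player.
            transform.position = respawnPoint.position;
        }
    }
}
```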

The third part of the process was the physical computing portion. I was initially supposed to work with another classmate to create more elaborate buttons, so in the meantime I created basic prototype buttons from the parts found in the Arduino kit. These buttons became the main source of communication between the game and the player during the presentation. A very lackluster physical presentation, but that seems to be a trend in my work here in the Digital Futures program. Nonetheless, after the buttons were created I proceeded to connect the physical buttons attached to the microcontroller to the Unity engine. This proved more challenging than it needed to be due to poor documentation of the free methods of connection, but after purchasing the Uduino kit, the connection became a seamless process. This stage also included programming the buttons and adjusting the animations, mechanics, scripts, and audio files in order to get a prototype that was playable and had the right amount of difficulty.
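For context, the "free method" of connection essentially means reading the Arduino's serial output in C# yourself, which is roughly what Uduino wraps in a friendlier manager component. The sketch below is an illustration of that manual approach, not the Uduino setup we ended up using; the port name, baud rate, and one-character protocol are assumptions, and in Unity this route also requires the .NET 4.x API compatibility level so that System.IO.Ports is available.

```csharp
using System.IO.Ports;
using UnityEngine;

// Rough sketch of reading button presses from an Arduino over serial without
// Uduino. Assumes the Arduino sketch prints a single character per press
// ("J" for jump, "L"/"R" for movement). The port name and baud rate are
// placeholders and depend on your machine.
public class ArduinoButtons : MonoBehaviour
{
    private SerialPort port;

    public bool JumpPressed  { get; private set; }
    public bool LeftPressed  { get; private set; }
    public bool RightPressed { get; private set; }

    void Start()
    {
        port = new SerialPort("COM3", 9600);
        port.ReadTimeout = 10;   // keep Update() from blocking on an empty port
        port.Open();
    }

    void Update()
    {
        JumpPressed = LeftPressed = RightPressed = false;
        try
        {
            string line = port.ReadLine().Trim();
            if (line == "J") JumpPressed  = true;
            if (line == "L") LeftPressed  = true;
            if (line == "R") RightPressed = true;
        }
        catch (System.TimeoutException) { /* no data this frame */ }
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```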

The final part of this process was creating a visually appealing product within the game engine by adjusting the virtual materials and shaders within Unity, and swapping out any assets for ones that fit the concept of the game. I still went with primitive shapes in order to achieve a simplistic aesthetic, but certain shapes were modified in order to make the different levels and enemy types seem more diverse.


Angela Zhang: Physical Interface

For Experiment 3, I wanted to use capacitive sensing to make a series of buttons that would control the gameplay in Unity for Denzel’s game. I started with a 9×9” wooden board primed with gesso, with a ¾” thick border so that there is a bit of material to protect the electro-galvanized nails that would be nailed in as solder points.

I did a digital painting in Procreate to plan out my design for the buttons.

conceptual drawing – digital drawing on iPad & Procreate

I ended up using the blue triangle design and splitting the yellow design in half to be two triangles that look like left and right arrows, which I intended to use as FRONT and BACK navigation; the blue triangle would be the JUMP button.

process – stencil for design, graphite on tracing paper
tracing paper stencil – shaded w 5B pencil on backside

I traced the design onto some tracing paper with a graphite pencil.

On the opposite side of the tracing paper, I used a 5B graphite pencil to shade in the entire shape to make a transferable stencil.

Tracing over the design with a pen, with the shaded side facing down, I transferred it onto the gesso board.

process – transferred design onto gesso-ed 9×9″ wooden board.

Once the design was transferred, I applied green painter’s tape around the edges so that when I applied the black conductive paint overtop, the edges would come out clean. I added three more rectangular buttons for additional functionality. I then used a hammer to drive electro-galvanized nails, about 1.5 cm long, into each of the ‘buttons’ [not really buttons yet, but empty spaces where the buttons should be]. Because the nails were so small, I used a thumb tack to make some of the registration holes first for better accuracy.

process – back of gesso board with electro galvanized nails.

I then applied a generous coat of conductive paint by Bare Conductive, mixed with a little bit of water, as the carbon-based paint is a little sticky and hard to work with; water is conductive, so this did not prove to be a problem. After I finished painting the designs with conductive paint, I sealed them with a layer of acrylic varnish spray to prevent the paint from rubbing off when touched. For some of the buttons, I planned to add another layer of acrylic paint overtop to see if it was possible to activate the conductive paint through coloured paint, allowing for more than just black designs.

process – conductive paint applied and tape taken off
final button – conductive paint, acrylic varnish spray, regular acrylic paint, final coat of acrylic varnish
Final Painting.
back of board – set up

I painted the background with regular acrylic paint to make the board more aesthetically pleasing. With the final layer of acrylic paint and a final coat of varnish, I was ready to test my connections. Using a soldering iron, I soldered wires to each of the nail connections, then used alligator clips to connect each of these wires to an electrode on the Touch Board microcontroller.

The LED on the Touch Board lit up when I touched each button, so it was all working. The only thing I noticed was that the connections were very sensitive to touch, so if the wires in the back were touching one another it would sometimes trigger an electrode it was not supposed to. This can be solved with better cable management and by enclosing the microcontroller inside the back of the board if I want to make this a standalone project.

The original idea was to hook up the board to Unity so that the painted buttons could replace the tactile buttons that Denzel described using in his half of the documentation. Using the Arduino IDE, I uploaded the following code to the Touch Board [screenshots do not show the code in its entirety]:

screenshot – Uduino code
screenshot – Uduino code [cont’d]
screenshot – Uduino code [cont’d]
The Uduino code (to bridge the serial connection between Unity and Arduino) uploaded successfully onto the Touch Board’s ATmega32U4 chip, the same chip used by the Arduino Leonardo. The problem with the connection, however, was that the conductive paint buttons were using capacitive sensing logic rather than digital ON/OFF switch logic, and neither Denzel nor I was proficient enough in C# to change the Unity scripts accordingly so that the capacitive touch buttons could be used to trigger movement in the game engine. I looked at a few tutorials on this and watched one about sending analog inputs to Unity, which used a potentiometer as an example. I wasn’t sure this was what I needed within the time I had, so I settled on another idea and decided to attempt the Unity game controller with Denzel at a later date, when we both have the time to look at forums (lol).
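Looking back, the missing piece on the Unity side may have been fairly small: treat each electrode’s capacitive reading as an analog value and threshold it into the ON/OFF state the existing button scripts already expected. The sketch below is only a guess at what that adapter could have looked like, not code we actually wrote; the threshold, the electrode-to-action mapping, and the assumption that some other component fills in the raw readings each frame are all invented for illustration.

```csharp
using UnityEngine;

// Hypothetical adapter between capacitive readings and the ON/OFF logic the
// existing player scripts expected. Wherever the raw electrode values come
// from (Uduino, a serial reader, etc.), this turns them into plain booleans.
public class CapacitiveToButton : MonoBehaviour
{
    public int jumpElectrode = 0;
    public int leftElectrode = 1;
    public int rightElectrode = 2;
    public int threshold = 40;              // tune by watching the raw values

    // Filled each frame by whatever component reads the Touch Board.
    public int[] rawValues = new int[12];   // the MPR121 exposes 12 electrodes

    public bool Jump  => rawValues[jumpElectrode]  > threshold;
    public bool Left  => rawValues[leftElectrode]  > threshold;
    public bool Right => rawValues[rightElectrode] > threshold;
}
```

A player controller could then poll Jump, Left, and Right exactly as it would poll a physical button, which is essentially the change we never got around to making.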

I changed the function of the conductive painting to act as a MIDI keyboard instead, as the Touch Board is particularly well suited to being used as a MIDI-over-USB device. I uploaded this code to the Touch Board:

Arduino IDE – MIDI interface example sketch from Touch Board Examples Library
Arduino IDE – MIDI interface example sketch from Touch Board Examples Library [cont’d]

I then used Ableton Live as a DAW to make my MIDI notes produce sound. In Ableton’s preferences I enabled the Touch Board as the MIDI input as well as the output, and turned on Track, Sync, and Remote so I could map any parameter within Ableton to my conductive painting, just like any regular MIDI keyboard. I used Omnisphere for a library of sounds I could play with my MIDI controller; because the capacitive buttons are analogue, I can map parameters like granular controls and pitch bends onto the buttons, as well as trigger tracks, pads in a Drum Rack or Sampler, or clips in Ableton’s Session View to launch whole loops.

Omnisphere 2.0 VST in Ableton – sound library
Conductive painting inside of Ableton

Even though we did not successfully link Unity and the painting together, I still feel like I learned a lot from creating this unusual interface, and I will push this idea further in both Unity and Ableton; I want to use Max for Live to trigger even more parameters in physical space, eventually things like motors.
