CONVOY 2117 – Assignment 2 – Multiscreens


Chris Luginbuhl

Sean Harkin




Players must work as part of a team to complete a relay race game. Each team (10 players) stands in a line opposite the other. The objective of the game is to complete the mazes before the opposing team.



The year is 2117 and the world has been ravaged by nuclear war, climate change, general terrible dystopian tragedies etc. etc. The home you share with your comrades has been destroyed and you must seek new shelter. News has traveled of a safe haven to the North – but wait! You have received word that your bitter rivals are also seeking refuge. You must arrive before them to ensure your survival.

You must board your trusty Travasphere™ and race across the barren wasteland. Each player takes 1 leg of the journey in turn, collecting the fuel source – the rare and valuable WD-50 lubricant – in each level before reaching the end. Only once you have arrived at the end of your portion of the journey can the next player begin theirs.

Good luck.


Github repository for Convoy 2117

(note – the folder /js/ace is not needed but is hard to delete from git!)

Github repository for “shake to play” mobile percussion with GUI interface is here. Try it at



Figure 1 – Drum battle concept


Figure 2 – Sound wall concept




Figure 3 – Video capture + animation overlay testing


Figure 4 – First working prototype for tilt-ball maze


Figure 5 – Maze layout 1


Figure 6 – Maze layout 2


Figure 7 – Maze layout 3


Figure 8 – Working prototype for instrumental phone concept



Introducing Convoy 2117:

More photos and video



Battle royale:


We generated a bunch of concepts and spent some time experimenting with them. Here are a few:

Fractal painting

Each mobile phone screen becomes a canvas: use a recursive/fractal algorithm to break up a line to form a mountain range, meadow or water.  Each student controls the landscape they create – mountains, water, meadows, starry sky and colour. They then physically arrange the phones to form a landscape “painting”, with each phone providing one tile of the mosaic.
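One common way to break a line into a mountain range like this is midpoint displacement. A minimal sketch of the idea (function names are ours; this is not the project's code):

```javascript
// Midpoint displacement: split every segment at its midpoint and nudge
// the midpoint vertically by a random amount, halving the amplitude at
// each level so finer detail gets smaller bumps.
function displaceOnce(points, amplitude) {
  const out = [];
  for (let i = 0; i < points.length - 1; i++) {
    const a = points[i], b = points[i + 1];
    const mid = {
      x: (a.x + b.x) / 2,
      y: (a.y + b.y) / 2 + (Math.random() * 2 - 1) * amplitude,
    };
    out.push(a, mid);
  }
  out.push(points[points.length - 1]); // keep the final endpoint
  return out;
}

// Generate a mountain ridge between two endpoints.
function ridge(x0, y0, x1, y1, levels, amplitude) {
  let pts = [{ x: x0, y: y0 }, { x: x1, y: y1 }];
  for (let i = 0; i < levels; i++) {
    pts = displaceOnce(pts, amplitude);
    amplitude /= 2;
  }
  return pts;
}
```

In a p5 sketch the returned points could be drawn with `beginShape()`/`vertex()` and filled down to the bottom edge of the canvas to read as a mountain, meadow or water line.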

Inside a party box

Phones line the inside of a space (possibly a cardboard box that you wear, covering your head). They create a surround sound and light show that becomes your personal party in a box.

Lighting umbrella

Ever stumbled into a patch of bad lighting, just when photographers approach or you see the person you’re trying most to impress? By lining the inside of an umbrella with mobile screens and controlling the intensity & colour of light on each with touch (or clicks and whistles!), the person holding the umbrella carries a patch of photogenic lighting with them wherever they go.


Zoetrope

This concept was to take the sophisticated power of a mobile phone back to the earliest days of motion pictures. Mobile screens are arranged inside a traditional spinning drum-and-slit zoetrope. Rotating the drum and peering through any slit, the observer sees the static images on each screen combine to form a moving image. By making use of each phone’s orientation sensor, the images can cycle on each screen so that a different “movie” is shown depending on which slit you peer through.
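The orientation-to-frame mapping could be as simple as the following sketch (this concept was never built; in p5 the angle might come from the `rotationZ` global, and the chosen frame would be drawn with `image()`):

```javascript
// Map a device orientation angle (degrees) to one of nFrames stored
// images, so rotating the drum cycles each phone through its sequence.
function frameForAngle(angleDeg, nFrames) {
  const norm = ((angleDeg % 360) + 360) % 360; // wrap into [0, 360)
  return Math.floor((norm / 360) * nFrames) % nFrames;
}
```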

Interactive Drum machine

We experimented with this quite a bit. Our initial concept was to create a sort of interactive drum machine with 20 different inputs. We explored multiple concepts: 1 person using 20 devices; 20 people using 1 device each; 2 people using 10 devices each in a drum battle (Fig. 1). Although we liked this last concept, the team felt that, to work effectively, it would require networking, which is outside the brief of this project.

We began by exploring video capture as a source of input. The idea was to have the users interact with an object on screen to activate the sound effect mapped to each device. The devices would be arranged in a manner that would require the user to dance around in front of the setup in order to create an improvised musical performance (Fig. 2). To test this concept we used examples from the p5.js Examples page to combine video capture with example code from Make: Getting Started with p5.js (Fig. 3).

Capturing video from the laptop’s webcam, animating shapes on top of the video feed and creating a randomised sound board from a library were all relatively straightforward. However, when we began trying to find methods to use the video as an input, things became difficult. We found a number of examples online for using different kinds of video capture as input within a p5 script (colour tracking, symbol tracking, movement tracking, converting video to pixels etc.), but since neither the camera nor the client was designed for this purpose, the reliability of these functions was questionable to say the least. Due to the time constraints of the project, the team decided that a new direction should be taken.
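Movement tracking, one of the approaches we tried, usually boils down to frame differencing: compare successive frames pixel by pixel and measure how much changed. A minimal, testable sketch of that core step (function names are ours; in p5 the buffers would come from `createCapture(VIDEO)` and `loadPixels()`):

```javascript
// Frame differencing: compare two grayscale pixel buffers from
// successive video frames and report the fraction of pixels that
// changed beyond a threshold. A sound could be triggered whenever
// this fraction exceeds some level.
function motionFraction(prev, curr, threshold) {
  let changed = 0;
  for (let i = 0; i < curr.length; i++) {
    if (Math.abs(curr[i] - prev[i]) > threshold) changed++;
  }
  return changed / curr.length;
}
```

The unreliability we hit in practice comes from exactly the parameters this sketch glosses over: lighting changes and camera noise shift many pixels at once, so the threshold that works in one room fails in another.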

We returned to one of our initial ideas: 20 people using 1 device each. This had earlier been discarded for being too simplistic, so we decided to explore the space further. We looked into different sound libraries and sliders which would enable users to choose their own sound board and play music together, but this still didn’t feel like enough. We wanted to create both a co-operative and a competitive experience for users. This is how the project became about designing a game.

We researched some uses of p5 for games and found a library that would help us create sprites and check for overlaps and collisions.

We used one of the library’s simple example programs to understand how to use sprite groups, collisions and collection. In the end, only a few lines of the example code were left in our program.

Unlike the example, we wanted to use device tilt as the input technique rather than keyboard or mouse. After some experimentation we created a tilt-controlled sprite. This formed the basis of our game design (Fig. 4).
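The core of a tilt control scheme is mapping the device's orientation to an acceleration on the ball. A sketch of the idea, assuming p5's `rotationX`/`rotationY` globals as the tilt source (the function and gain value are ours, not the project's exact code):

```javascript
// Convert device tilt (degrees, e.g. from p5's rotationX / rotationY
// globals) into an acceleration for the ball sprite. The gain tunes
// how sensitive the ball is to small tilts.
function tiltToAccel(rotX, rotY, gain) {
  return {
    ax: rotY * gain, // tilting left/right rolls the ball in x
    ay: rotX * gain, // tilting forward/back rolls the ball in y
  };
}

// In the p5 draw loop (sketch only):
// const a = tiltToAccel(rotationX, rotationY, 0.05);
// ball.velocity.x += a.ax;
// ball.velocity.y += a.ay;
```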

Initial experiments worked well. We could tilt the device to move a sprite until it collided with another, stationary sprite. We designed our first map layout (Fig. 5), built it in code and tested our concept. To add variety to the gameplay, we also added 2 more mazes, which would be randomly assigned to players when they open the game (Fig. 6 + 7).
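A hand-built maze layout like these can be encoded as a grid of 1s (walls) and 0s (paths) and converted into wall rectangles. A minimal sketch (the grid shown is illustrative, not one of our actual three layouts):

```javascript
// Turn a maze layout grid into wall rectangles {x, y, w, h} in screen
// coordinates. With a sprite library, each rectangle would become a
// stationary sprite added to a walls group for collision checks.
function wallsFromGrid(grid, cellW, cellH) {
  const walls = [];
  for (let row = 0; row < grid.length; row++) {
    for (let col = 0; col < grid[row].length; col++) {
      if (grid[row][col] === 1) {
        walls.push({ x: col * cellW, y: row * cellH, w: cellW, h: cellH });
      }
    }
  }
  return walls;
}

// Example layout: 1 = wall, 0 = path.
const layout1 = [
  [1, 1, 1, 1, 1],
  [1, 0, 0, 0, 1],
  [1, 0, 1, 0, 1],
  [1, 0, 1, 0, 1],
  [1, 1, 1, 1, 1],
];
```

Picking one of the three layouts with `random()` at startup gives the random assignment described above.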

We initially planned to implement a maze creation algorithm, and wrote the code to be scalable to any size screen, and any number of cells. Even the simplest maze algorithms are not super easy to implement, however, and we realized we didn’t have time.
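For reference, the algorithm we had in mind was along the lines of a recursive backtracker (depth-first carving). A sketch of what that would have looked like, scalable to any number of cells (this was never part of the shipped game):

```javascript
// Recursive-backtracker maze generation over a cols x rows grid.
// Each cell starts with four walls [top, right, bottom, left];
// carving removes the wall on both sides of a passage.
function generateMaze(cols, rows) {
  const cells = [];
  for (let i = 0; i < cols * rows; i++) {
    cells.push({ walls: [true, true, true, true], visited: false });
  }
  const idx = (c, r) => r * cols + c;
  const stack = [[0, 0]];
  cells[0].visited = true;
  while (stack.length > 0) {
    const [c, r] = stack[stack.length - 1];
    // Unvisited neighbours: [dc, dr, wall index here, wall index there]
    const options = [
      [0, -1, 0, 2], [1, 0, 1, 3], [0, 1, 2, 0], [-1, 0, 3, 1],
    ].filter(([dc, dr]) => {
      const nc = c + dc, nr = r + dr;
      return nc >= 0 && nc < cols && nr >= 0 && nr < rows &&
        !cells[idx(nc, nr)].visited;
    });
    if (options.length === 0) { stack.pop(); continue; } // dead end: backtrack
    const [dc, dr, wHere, wThere] =
      options[Math.floor(Math.random() * options.length)];
    const nc = c + dc, nr = r + dr;
    cells[idx(c, r)].walls[wHere] = false;   // carve passage out
    cells[idx(nc, nr)].walls[wThere] = false; // and into the neighbour
    cells[idx(nc, nr)].visited = true;
    stack.push([nc, nr]);
  }
  return cells;
}
```

The remaining (and fiddly) work would be translating each cell's surviving walls into wall sprites sized to the screen, which is where the time went.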

Most of our testing at this point was focused on removing bugs and fine tuning the physical parameters of the ball. After testing with a few classmates, we understood quickly that this game ran very slowly on Android phones.

We were able to add algorithms to limit the speed of the ball (which also helps prevent the ball from “teleporting” through walls). We also added “dish” – a tendency for the ball to roll towards the centre of the screen. Without this bias, the small random fluctuations in the gyroscope caused the ball to jitter and roll around randomly even when the device was laid flat on a table.
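Both adjustments can live in a single per-frame velocity update. A sketch of the idea (function and parameter names are ours, not the project's exact code):

```javascript
// Per-frame velocity update: clamp the ball's speed so one fast frame
// can't carry it through a wall, and add a gentle "dish" pull toward
// the screen centre so gyro jitter doesn't move the ball when the
// device is lying flat.
function stepVelocity(vel, pos, centre, maxSpeed, dishStrength) {
  // Dish: accelerate slightly toward the centre of the screen.
  let vx = vel.x + (centre.x - pos.x) * dishStrength;
  let vy = vel.y + (centre.y - pos.y) * dishStrength;
  // Speed limit: scale the velocity down if it exceeds maxSpeed.
  const speed = Math.hypot(vx, vy);
  if (speed > maxSpeed) {
    vx *= maxSpeed / speed;
    vy *= maxSpeed / speed;
  }
  return { x: vx, y: vy };
}
```

The dish term only needs to be large enough to dominate sensor noise; too strong and the maze starts playing itself.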

In testing we found that adding friction and a speed limit to the ball caused it to run slower on older iPhones, so we removed the speed limiting code and any other unnecessary processing that could slow the animation of the ball.

In order to add more depth to the game play, we added an artifact that each player in the team must collect in their map before they can finish the map. This fitted well into the narrative we had created for the game. The original concept for this collectible was that each team member would have to decide whether to “risk” going out of their way for the artifact to gain points.
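Gating the exit on the artifact is a pair of overlap checks. The sprite library's own overlap call handles the geometry in the real game; the underlying idea, with names of our own choosing, is roughly:

```javascript
// Axis-aligned bounding-box overlap between two rectangles
// {x, y, w, h} - essentially what a sprite library's overlap()
// does under the hood for rectangular sprites.
function overlaps(a, b) {
  return a.x < b.x + b.w && a.x + a.w > b.x &&
         a.y < b.y + b.h && a.y + a.h > b.y;
}

// Gate the exit on having collected the artifact first.
let hasArtifact = false;
function checkCollect(ball, artifact, exit) {
  if (!hasArtifact && overlaps(ball, artifact)) hasArtifact = true;
  return hasArtifact && overlaps(ball, exit); // true => level complete
}
```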

We added sounds for the start, artifact collection and the end, as well as a nerve-wracking techno soundtrack. We added the music so that people on opposing teams could gauge the progress of their opponents.

We tested the final version on multiple devices (makes and models) to check reliability. We found that older phones had issues with frame rate, while newer models (specifically iPhones) worked well.



Music by JGaudio – ‘Aspiration to Success’ (

Splash screen modified from stock image:

The story was inspired by the dystopian futurescapes of classic ’80s films such as the Mad Max franchise, along with our own bleak perspectives on the future based on political, socio-economic and environmental issues which exist today.

Another game that inspired our careful choice of music to fit our setting is Pac-Mondrian (2002), a take on Namco’s classic Pac-Man (1980) that blends Mondrian’s signature graphic painting with the boogie-woogie jazz Piet Mondrian loved. I couldn’t find a playable version of Pac-Mondrian, but you can see a screenshot here:

Pac-Mondrian showed how a maze game could be completely transformed by a small change to its graphics along with the right musical soundtrack.

These games can be traced all the way back to the first patented ball-maze game, 1889’s “Pigs in Clover” by Charles Martin Crandall. This concept has been adapted and recreated countless times (everything from Super Monkey Ball to every third free puzzle game on the Android Play Store), but what makes our game unique is the physical, cooperative aspect of the gameplay. We believe this interaction adds another level of enjoyment to the game, one which cannot be found in a single-player experience.

McCarthy, Lauren, Casey Reas, and Ben Fry. Getting Started with p5.js: Making Interactive Graphics in JavaScript and Processing. Maker Media, 2015. Print.

These texts were referenced when writing the code, although the majority of the code was written by the team specifically for the project.