Final Design Journal

The Blender Workflow – Creating Assets for use within Unity

My general workflow for creating assets in Blender started with finding appropriate reference material: typically a set of images from different angles and viewpoints for each object, so that I could create an accurate rendition of the model with decent proportions and scale. I would then block out a rough model using primitive shapes and slowly refine each shape in Edit Mode. Because we chose a low-poly aesthetic to ensure the game ran smoothly, I had to keep my models at a generally low poly count. This meant I only had to capture the more important details in geometry, while hiding the less important details with textures.

I can’t deny that I have a fairly intense and detailed workflow that takes a decent amount of time per asset; this is why I could only make a few models within my time constraints, as I’m a bit of a perfectionist. My modelling work also had to stop partway through, as I had to focus all my attention on game development and coding. Still, it was an interesting learning experience, since my modelling expertise lies mainly in the fantasy genre. The fantasy aesthetic is usually loose and imaginative, while modelling furniture and appliances is very exacting: you are trying to create an exact copy of something in the real world rather than something from your imagination.

[Asset renders: radio/television and piano]


Creating Immersive Game Design through Intuitive and Creative Interactions – VR and Events

Creating VR interactions within Unity was a bit tricky at first, and it became especially convoluted to come up with more intuitive and creative designs as the game’s development progressed. The XR Interaction Toolkit provides a basic set of interactions, controls, and mechanics; it is up to the developer to use these components creatively, thinking outside the box and applying them in ways the package’s creators didn’t originally design or plan for. To accomplish this, I combined various non-VR mechanics within Unity, as well as my own scripts, with the XR Toolkit to create my own unique interactions.

If I were to list the most creative interactions I managed to create, I would have to say the main 360 video viewing mechanic, placing all the cans within the recycling bin, and placing the flower on top of the burning grave.

The main 360 video viewing mechanic looked like the simplest mechanic, but behind the scenes it was the most convoluted in terms of coding/scripting, which I will describe in a following entry. As this was the main mechanic, it was one of the first to be developed and the one that needed to be perfected; the core idea was viewing past memories through objects. It went through many iterations: at first, the 360 bubble instantiated as soon as the object was picked up; this version was later tweaked so that the bubble spawned directly over the player’s centred position. Next, the buttons had to be spawned. This was done in a rather flawed way that would later haunt my development, as they spawned onto a canvas that wasn’t parented to the actual player object; this meant I had to manually transform the canvas to the player’s position on spawn and manually tweak its 3-axis coordinates. Halfway through development, the next iteration had the bubble spawn from a listener event triggered by player input. The mechanic reached its final iteration when the held object’s mesh renderer was disabled so that it wouldn’t obstruct the player’s sight when viewing a scene.
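A rough sketch of how the final iteration might look in code, assuming the XR Interaction Toolkit’s grab interactable and its select listener; all names here (MemoryObject, bubblePrefab, playerHead) are illustrative, not the project’s actual identifiers:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Spawns a 360 "memory bubble" centred on the player when this object is
// grabbed, and hides the held object's mesh so it doesn't block the view.
[RequireComponent(typeof(XRGrabInteractable))]
public class MemoryObject : MonoBehaviour
{
    public GameObject bubblePrefab;   // inverted sphere carrying the 360 video material
    public Transform playerHead;      // XR rig camera transform

    private void Awake()
    {
        var grab = GetComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(OnGrabbed);
    }

    private void OnGrabbed(SelectEnterEventArgs args)
    {
        // Spawn the bubble over the player's position rather than the object's.
        Instantiate(bubblePrefab, playerHead.position, Quaternion.identity);

        // Disable the renderer so the held object doesn't obstruct the scene.
        GetComponent<MeshRenderer>().enabled = false;
    }
}
```

Listening to the toolkit’s select event like this is what lets an ordinary grab drive a completely custom mechanic.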

Finally, the next two interactions were not as script-heavy as the previous one, but I think they were the most creative and intuitive, as they were the last to be implemented and were shaped by the skills I picked up over the course of the game’s development. I wanted to create something that wasn’t inherently complicated; rather, I wanted something that gave an immediate reaction to the player’s interaction in an interesting way. That said, these interactions probably had the most thought and conceptualization behind them.

I thought about having the player perform some extra events to earn extra karma points towards the best ending. The ones that immediately came to mind were depositing the beer cans into the garbage, reflecting the grandfather’s alcoholism and the player overcoming it, and placing the flower on the burning grave as a sign of forgiveness, letting go of the past and of the daughter’s resentment towards her grandfather. The immediate reactions in these cases were the cans being destroyed in the fire, and the flower being ‘socketed’ on top of the grave and the fire being extinguished.
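The can interaction boils down to a trigger volume; a minimal sketch, assuming a "Can" tag and a static karma holder (both hypothetical names), might look like this:

```csharp
using UnityEngine;

// Any can dropped into the bin's trigger volume is destroyed immediately
// and adds a karma point towards the best ending.
public class RecyclingBin : MonoBehaviour
{
    public int karmaPerCan = 1;

    private void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Can")) return;
        Destroy(other.gameObject);          // the can vanishes into the fire
        KarmaManager.Score += karmaPerCan;  // assumed shared score holder
    }
}

// Minimal stand-in so the sketch is self-contained.
public static class KarmaManager
{
    public static int Score;
}
```

Destroying the can inside the same trigger callback is what gives the player that instant, visible response to the interaction.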



Constructing the Scene and adding the Element of Surrealism – Particle Effects and Events

Creating a sense of surrealism can be difficult, as there are many renditions of what a dreamlike landscape looks like; my idea was something along the lines of Salvador Dalí and the unknown. While implementing surreal elements was somewhat limited by our timeframe, I was able to implement a few.

First were the perpetual rain and clouds. I wanted a thick miasma of clouds to obstruct the player’s view and give a sense of mystery and the unknown; the player doesn’t know where they are, or whether they’re even in the real world. This was done through a particle system that included rain with collision enabled (so it didn’t fall through the roof of the house) as well as a fairly performance-heavy cloud emitter (which I think emitted around 500 clouds into the scene). To complement the clouds, I also added a blue fog using Unity’s lighting settings, further obscuring the player’s field of view and objects in the distance.
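The fog itself is a one-liner in spirit; a minimal sketch, assuming the built-in RenderSettings fog rather than a post-processing volume (the colour and density values here are illustrative):

```csharp
using UnityEngine;

// Scene-wide blue fog to obscure distant objects and the player's view.
public class SurrealFog : MonoBehaviour
{
    private void Start()
    {
        RenderSettings.fog = true;
        RenderSettings.fogColor = new Color(0.3f, 0.4f, 0.6f); // hazy blue
        RenderSettings.fogMode = FogMode.ExponentialSquared;
        RenderSettings.fogDensity = 0.03f;                      // tune to taste
    }
}
```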

Next was the fire. As normal fire isn’t very dreamlike, I decided to colour it a combination of ghostly teal and green. The fire was composed of three particle systems: the fire’s core, the outer glow, and the black smoke.

Finally, I added a little spooky touch by having the front doors of the house open when the player enters a collider. This was done by giving the doors a Timeline animation that is played through an event. This one isn’t as important as the others, but I like it in particular because it’s unexpected and catches the player off guard.
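A hedged sketch of that trigger, assuming Unity’s Timeline is driven through a PlayableDirector and the player rig carries a "Player" tag (the one-shot guard is my own assumption):

```csharp
using UnityEngine;
using UnityEngine.Playables;

// When the player's collider enters this volume, play the doors'
// Timeline animation once.
public class DoorTrigger : MonoBehaviour
{
    public PlayableDirector doorTimeline; // Timeline that swings both doors open
    private bool played;

    private void OnTriggerEnter(Collider other)
    {
        if (played || !other.CompareTag("Player")) return;
        played = true;            // only startle the player once
        doorTimeline.Play();
    }
}
```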



Developing the Gameplay/Mechanics – Scripting and Unity

Transitioning from languages such as JavaScript and Python to something more strictly typed such as C# was a bit of a jump; gone were the days when I could just put any value into a var and call it a day. Likewise, Unity was a bit disorienting at first, as I had little to no experience with the game engine before. However, after a bit of a learning curve, I was surprised by how much I was able to learn and develop my skill set within three months; I now know Unity like the back of my hand in terms of its specific classes and methods, as well as core C# concepts such as interfaces, inheritance, and dynamic dispatch (polymorphism), all of which were essential in developing my VR interactions.

The core gameplay loop is probably the interaction I spent the most time on: picking up an object, viewing a scene, and getting an ending depending on your karma score. Essentially, it boils down to four scripts: one that spawns the 360 scene, one that regulates the button inputs and their locations, a conditional check that verifies whether all scenes have been viewed, and one that assigns the player an ending depending on their score. Naturally, the first script in the list is the most important, as the majority of the other scripts in the game reference it, inherit from it, or call upon an instance of it at some point; if the spawn-scene script doesn’t work, the game is pretty much a dud.
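The ending-assignment script is, at its core, a threshold check on the karma score. The bands and names below are hypothetical stand-ins, not the project’s actual values:

```csharp
using UnityEngine;

// Illustrative sketch of the ending conditional: map the accumulated
// karma score onto one of several ending indices.
public class EndingSelector : MonoBehaviour
{
    public static int KarmaScore;          // incremented by scenes and bonus events

    public int ChooseEndingIndex()
    {
        if (KarmaScore >= 10) return 0;    // best ending (all bonus events done)
        if (KarmaScore >= 5)  return 1;    // good ending
        if (KarmaScore >= 0)  return 2;    // neutral ending
        return 3;                          // bad ending
    }
}
```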


What I learned from the Experience as well as Future Plans – VR, Unity, and C#

Overall, developing Opa has been an invaluable learning experience, and I was surprised at how much I was able to learn within three months; I learned the core, essential aspects of Unity and C#, and I’m excited about what else I can learn and improve upon in the months before I begin my thesis.

In terms of VR, I was also pleasantly surprised at how relatively easy it was to develop games once you understand how the VR kit works on a technical level and from a more programming-centric viewpoint. That said, VR is something I will definitely explore in further detail, and I may even try to develop another game in the future that is more responsive, sophisticated, and technically detailed than my first attempt at a VR game within Unity.


Blog Post 6 – Getting to Completion

The week leading up to the submission of the final build mainly involved incorporating our final assets, building the scene, improving the controls/mechanics, and fixing any existing bugs; as the main script was completed much earlier than expected, I had more time to add script components and an extra layer of polish.

The first step was to assemble the final scene. After receiving the final house and graveyard assets from Yasmine and more furniture assets from Pedro, I assembled the floating island scene. There were a few issues with the assets: most were FBX exports where transformations hadn’t been applied, which meant the normals and meshes were poor and collisions were particularly hard to detect; the environmental rain would fall through the house, for example. I quickly fixed this by directly incorporating the assets that weren’t going to be interacted with, while reimporting the interactable objects into Blender, applying transformations, and re-exporting them to Unity as OBJ files. After placing all the assets in a way that looked relatively natural and streamlined, I added my particle effects and completed the main scene.

With a decent scene complete, I incorporated Kiana’s new 360 video assets; this was relatively quick, as I had already created the framework/template for a multiple-ending system long before, so all I had to do was modify the ending script to drive a video player with multiple clips and change the math in the ending conditional to support a fifth video clip.
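A rough sketch of that multi-clip ending player, assuming Unity’s built-in VideoPlayer and an index chosen elsewhere from the karma score (the clamp is my own defensive addition):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Plays one of several ending clips based on an index computed from karma.
public class EndingVideoPlayer : MonoBehaviour
{
    public VideoPlayer player;     // renders onto the 360 sphere's material
    public VideoClip[] endings;    // five clips after the final addition

    public void PlayEnding(int index)
    {
        // Clamp so an out-of-range karma result can't throw.
        index = Mathf.Clamp(index, 0, endings.Length - 1);
        player.clip = endings[index];
        player.Play();
    }
}
```

Adding a fifth ending then only means appending a clip to the array and adjusting the selection math, which matches how little rework the change needed.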

After assembling the scene, I found that I luckily had enough time to add the two bonus interactions I had originally included as stretch goals, namely throwing all ten beer cans into the garbage bin and placing the flower on the flaming grave in the graveyard, both for extra karma points towards the best possible ending. Implementing these was a lot faster than expected, as I had learned much more about C# and Unity than I knew a month or two earlier when I originally planned them; they worked better than intended, and I didn’t have to tweak much. The next step was the creation of an intro scene, another stretch goal I had extra time to implement. After spending a day learning Cinemachine, I was disappointed to find that it didn’t work very well with VR, as it dropped a lot of frames and was fairly nauseating to look at; luckily, we were able to salvage the cinematic with an intro scene Yasmine had made, which plays between the title screen and the main game.

The last step in terms of code was to fix any pre-existing bugs; this mainly concerned the mechanic that disables the mesh renderer of each 360 video interactable object so that it doesn’t obstruct the player’s view while viewing scenes. Although the game ran perfectly fine in the editor, I ran into some serious issues with the build version, as a good chunk of the code didn’t work as intended, or at all; worst of all, this happened the night before submission. It was luckily fixed by changing the load order of the scripts as well as tweaking a few lines and values.
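For reference, script order can be pinned either through Edit > Project Settings > Script Execution Order or in code; a minimal sketch of the code route (the class name is illustrative):

```csharp
using UnityEngine;

// A negative execution order runs this script's Awake/OnEnable before
// default-ordered scripts, so dependents can rely on its state.
[DefaultExecutionOrder(-100)]
public class SceneSpawner : MonoBehaviour
{
    private void Awake()
    {
        // Initialise shared state here so other scripts can safely
        // reference it in their own Awake/Start calls.
    }
}
```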

After finishing the game, I am very satisfied with what my team and I accomplished in three months. I started the year knowing little about Unity and C#, and the transition from JavaScript to something more strictly typed was a bit of a learning curve. However, with hard work, studying, and many sleepless nights, I am now very familiar with C# and I know Unity like the back of my hand. With the knowledge I have gained during this class, I know there is more I can learn as a game developer, and that I can greatly improve my coding skill set in preparation for my thesis project in the fall and winter terms. On a more theoretical level, I was also able to learn a lot about gameplay, level, and interaction design, with a particular focus on VR.

Blog Post 5: VR Game Progress – 2020/03/03

In today’s playtesting session, the most important takeaway for me as the game’s coder was the importance of playtesting itself. While I thought I had provided a relatively simple linear experience, I was surprised when players didn’t follow the rules and worked around the game’s mechanics. While this is a relatively simple fix in code, where I can halt movement and force players to pick an option before doing anything else, today’s session taught me the value of having people try to ‘break the game’ to weed out any bugs or flaws in mechanics I might have missed. Since the main mechanics are pretty much complete, I have a lot more time to flesh out the game’s interactions and add more mechanics and actions the player can perform, giving a greater feeling of immersion and exploration of the environment.

When exploring the idea of conveying ‘world-ness’ to the player to build immersion, I experimented with several VR games to find inspiration and ideas I could incorporate into my own VR game. First, I tried Empire Soldiers VR, a story-based narrative that follows the contributions and stories of South Asian and Caribbean troops throughout World War I. In terms of building an immersive world, the game starts with an introductory scene describing its main goal before putting the player in a moving train and a trench scene. What I think this game did well in developing a feeling of ‘world-ness’ was establishing a quick story at the beginning, providing context before easing the player into the first scene.

Next, I tried Dispatch, a game wherein the player takes control of a police emergency dispatcher. Along the same lines as the previous game, it opens with an introductory scene that establishes the main character’s personality and background, as well as the visuals of the cyber-esque world representing the phone line. Moreover, I think the game does this especially well through its camera work, cinematically guiding the player through scenes wherein events transpire and unfold around them in a full 360 degrees.

Finally, I tried BBC Home, a simulation of a spacewalk on the International Space Station wherein the player navigates through space and makes repairs. I think this game tackles immersion and ‘world-ness’ well through the scale of its environment. It didn’t have a fancy introductory scene like the previous games, but it conveyed a sense of immersion and grandeur through its perceived vast environment, which is technically smaller than you’d think but seems much larger thanks to the massive emptiness of space.

In conclusion, playing these three games taught me that it’s important to establish an immersive introductory sequence that builds up and fleshes out the world and narrative of our VR game, as opposed to immediately placing players into the world to explore; I feel that a cinematic 360 introductory scene would also improve the flow of our game, and it would be more effective if events transpired around the player rather than at obvious focal points. Next, as games such as Dispatch show, directional sound helps bring the world to life and encourages the player to look around the scene instead of following an obvious linear path. Lastly, I believe it is also important to flesh out the general environment. Currently, our game consists of a house on an empty flat plane, and I feel we can achieve a higher level of immersion by building out the environment around the house so that our scene feels more like an actual world.

Blog Post 4: Contextual Review – 2020/02/11

Today in class, I was able to try out three VR games: Face Your Fears, Cosmotic Blast, and Superhot, each of a different genre with varying levels of speed and types of player input.

In regards to what worked well within a VR environment, I found that Face Your Fears further immersed the player through audio, pulling them deeper into the environment. With a slowly rising soundtrack incrementally increasing the tension, along with directional auditory cues that make the player feel uncomfortable and uneasy, the player is drawn further into the VR environment: they are encouraged to look around the scene to see what exactly made that noise, what has changed in relation to it, and what is slowly making its way towards their position. Next, Cosmotic Blast had a purposely clunky and relatively difficult way of controlling the on-screen ship, which oddly worked in the game’s favor. By splitting two critical movements, directional thrust and aiming, between the two controllers, the player is encouraged to adopt uncomfortable and interesting poses to find the right angle to destroy blocks and collect capsules. Finally, Superhot offered a similar experience to Cosmotic Blast in that the player had to use their whole body outside of VR to properly dodge bullets or throw projectiles in just the right manner.

I think what worked best in terms of meaningful interaction was how the player controls the scene in Face Your Fears through their gaze. Because of this, the player must look at some potentially frightening things in order to progress the scene. For example, I caught the demonic kid standing beside the bed in my peripheral vision, but the scene wouldn’t progress, and he would continue to stand there until I looked directly at him. I found this interesting: I knew what was coming, and even though I didn’t want to look at him, I eventually had to face my fears to progress the narrative, true to the game’s name. Next, Superhot and Cosmotic Blast share a similar sense of meaningful interaction in that the player is rewarded for aiming at the right position and pressing an input. In Superhot’s case the reward is usually instant, as the player is quickly rewarded with a visceral enemy kill, while Cosmotic Blast is slower paced: the player is rewarded for lining up the perfect shot and angle, but has more time to do so than in Superhot.

In conclusion, I feel that Face Your Fears has given me a lot to think about regarding the use of audio in my own project; I may have underestimated its importance within the VR environment as a tool to pull the player further into the scene. I will experiment with ways to use audio to set the specific mood I want the player to feel, and to include subtle auditory cues that encourage the player to explore the house even more. Furthermore, it might be useful to experiment more with button inputs in my game to make interactions feel more meaningful and impactful than the simple ‘pick up and view scene’ input currently implemented.
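If I do add directional cues, one minimal way to sketch them in Unity is a fully spatialised AudioSource placed at a point in the house; the values and names here are assumptions, not anything from the current build:

```csharp
using UnityEngine;

// A directional audio cue: a 3D-spatialised sound played at a point in
// the scene to draw the player's gaze towards it.
public class AudioCue : MonoBehaviour
{
    public AudioSource source; // AudioSource on the cue object

    private void Start()
    {
        source.spatialBlend = 1f;                          // fully 3D
        source.rolloffMode = AudioRolloffMode.Logarithmic; // natural falloff
        source.maxDistance = 15f;                          // audible range, illustrative
    }

    public void PlayCue()
    {
        source.Play();
    }
}
```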

Blog Post 3: Workshop Reflection – January 21st

Through today’s workshop, my group was able to form a clearer, more consolidated goal of what we want to achieve with our final project. Perhaps the most important thing we settled in today’s workshop was the scope and scale of the project; we needed to decide what we could realistically complete within our given time frame. Because of this, we had to scale down the project and cut a good chunk of content and ideas that we might not be able to implement in time.

Next, the idea of our project changed partially in terms of story: while the original idea was a comedic story wherein the spirit of a deceased grandpa returns to his house to haunt the family that had potentially wronged him, we decided to alter the story in a grittier direction, adding an optional backstory the player can pursue in order to discover who may have murdered him. Furthermore, we gave the grandpa character a more consolidated backstory and character profile to support this new plot branch. Aside from story elements, we also made the controls and gameplay loop of our final project clearer and more defined. The workshop concluded with us setting our first meeting for Friday, by which we hope to have a completed storyboard as well as a fully furnished test room that we can try out through VR in Unity.

Finally, we hope to take some of the valuable feedback we received and make our final project a more immersive and engaging experience. In particular, I feel the need to reconsider the black-and-white choice the player receives at the end of each flashback, and possibly add a dialogue branch leading to the final choice, wherein the player can fish out additional information that might give them more insight. Furthermore, it might be interesting to give the player a handful of final choices, each with a different value on the forgiveness scale/meter, to lend their final choice a nebulous ambiguity so that no option seems obviously correct or wrong.

Blog Post 2: 360 Video and VR Workshop (January 14 and 23)

360 Video Workshop:

For today’s exploration, we experimented with the Samsung Gear 360 video camera by recording scenes relating to the theme of small and large environments in 360 video. We began by recording the ‘large’ scene, placing the camera on a mailbox outside; we felt the outdoor area in front of the school building was one of the largest spaces in which we could capture some great footage. Next, for the ‘small’ scene, we decided to capture footage inside the school’s elevator; we felt that the small enclosed space, combined with a low ground-level angle, would accentuate the feeling of being small within a contained enclosure. Through this experiment we found that the 360 camera was perfectly adept at conveying the feeling of large and small areas in a 360-degree setting, and, given time to load the scenes onto a VR headset, we’re certain these feelings would only be amplified.

Technically, while recording video is fairly straightforward, we can foresee a few issues we might run into. First is the time it takes to render and process the raw export from the camera; since it takes a considerable amount of time and processing power, it might be wise to set up a rendering schedule early in the project so there is enough time for post-process editing of the clips. Next, we’re going to have to figure out efficient methods of editing the clips; since the raw exports are warped relative to a standard 2D plane, we might run into difficulties with things like plane detection or camera tracking within After Effects.

Conceptually, our experience with recording and processing 360-degree video has given us a lot to think about regarding our final project and how we might integrate 360 video with our VR elements. Following the example VR experiences we saw last week that integrated both in a seamless way, we will have to find creative and intuitive methods of transitioning between 360 video and VR that don’t seem jarring or clunky. We will also have to make sure both elements are equal in presence and impact; I feel we are focusing too much on the Unity/VR side over the 360-degree video element, and I want to make sure the video doesn’t feel out of place or forced in any way.



VR Workshop:

In today’s workshop, my team and I worked mainly on figuring out some of the fundamentals of VR within Unity; we managed to build a relatively simple scene with a basic floor plan of a house, similar to what we are planning to construct and flesh out for our final project. To our surprise, it was fairly easy to set up a basic, workable scene within Unity, and we were able to explore various rooms of the house through VR. What surprised me most was the difference in perspective between viewing our premade asset through VR versus in Blender; for example, you get a closer, more intimate feeling with the house in VR, compared with Blender’s top-down perspective on a 2D screen.

Moving on from what we learned today, we plan to implement object interactability through triggers, and movement through the house via teleportation. We plan on implementing basic player interactions such as the ability to open doors, open drawers, and close windows. If possible, we would like to add other elements such as a UI with on-screen text or buttons that trigger an event when selected. Finally, we also want to experiment with 360 video alongside 3D VR elements to see how the two may work side by side, one early example being a 360 recording of the night sky used as a skybox for our Unity scene.


Blog Post 1: VR Experimentation – January 9th

For my VR exploration, I decided to experiment with Google Blocks as well as Google Earth VR. When I constructed my dog model using Blocks, I was pleasantly surprised to discover that the software was more intuitive and beginner-friendly than I had anticipated; I was able to easily manoeuvre through the menus, discover what each tool did, and place the shapes I spawned with a surprising degree of accuracy. One strength that modelling in VR has over a 2D screen is the ability to move and look around the model in a 3D environment; through this, I felt closer and more attached to the object I was creating, and I felt a surge of creativity to experiment with the software and its environment. All in all, I found it much more difficult to create complex shapes in Blocks than in Blender, but the inspiration I felt in Blocks to create interesting objects was something I hadn’t felt with 2D modelling in Blender in quite a while. In the future, it would be interesting to see sculpting and more intuitive texturing features added to Blocks.

I use Google Earth on a somewhat regular basis, in a rather monotonous fashion, to find directions to places I need to go, but Google Earth VR turns this otherwise tedious and dull process into something more spectacular and exploratory. The grand scale at which the world is first presented made me feel the urge to explore, from places I already know to new and interesting places I have never seen before. Another interesting aspect was the user’s perspective in VR: the viewer takes the viewpoint of an omnipotent observer over the world, able to look around in 360 degrees at any altitude. I felt this only added to the curiosity and encouraged the user to travel to different places around the world.

In conclusion, my in-depth exploration of some of these VR examples has encouraged and inspired me to ponder the experiments and possibilities I can attempt while developing my final VR game project for Atelier. I realized that my original pitch for a VR game, presented on the first day of class, could be expanded even further than I had originally planned, and that I can possibly push the idea beyond its original frame.


Experiment 4 – Final Report

Joseph Eiles

Ermar Tanglao

Vijaei Posarajah


Project Description:

For our final project, we wanted to create a twist on the traditional two-player competitive arcade game by altering how the players interact with it. Our game mechanic centred on a goalkeeper and a striker: player one summons and unleashes ghosts along three lanes using touch-sensitive fabric on a Ouija board, while the other player banishes the summoned ghosts by using a flashlight to light up light sensors inside a candle, launching a beam of light along the corresponding lane. The overall theme of the game was inspired by “Luigi’s Mansion”, wherein the player uses a flashlight to combat ghosts in a haunted setting.

Project Inspiration:

The inspiration for our project was based around implementing non-conventional human interface devices to interact with a seemingly classic arcade game. Our original concept was to recreate an existing arcade game within P5.JS and use a variety of sensors in place of traditional controllers. This concept evolved to focus on two-player competitive play, with games such as air hockey and Pong considered for the mechanics. When discussing how the player would interact with the game, we decided that using light as a controller would be a novel, creative, and unique idea, with inputs based around light sensors. We then related this back to our original inspiration, “Luigi’s Mansion”, where the character uses light to fight ghosts; a theme of summoning and banishing ghosts suited our arcade game and allowed us to develop two unique, purpose-driven controllers.

Background Work:

For the design of our controllers, our goal was a unique and creative design relating to the overall theme of the game, as opposed to simple buttons and switches; after experimenting with the materials and parts we had, we settled on a design based on a Ouija board and candle set. The Arduino components formed a relatively simple circuit of on/off switches and light sensors; these were tried-and-true methods we knew would work, but the trick was to use them creatively in the build and operation of the controllers. With the idea set in stone, we researched other creative and unique controller designs throughout history and analyzed aspects of our past experiments, as our “Cardboard Band” project was also inspired by the idea of adding immersion through controller design; using what we learned, we worked towards taking this experiment a step further.

In our research we found many strange yet unique controllers. The article “25 Bizarre Video Game Controllers” by Aaron Birch details many creative controllers throughout history; one that stood out for us was the Dreamcast’s Sega Fishing Rod, which featured motion-sensing capabilities, a quick upward-thrust mechanism, and a reel on the side for catching fish on screen (Birch 2014). Although technically it may not have been too different from a standard controller, it was arranged in a creative manner related to the theme of the game, providing an extra level of immersion, interactability, and entertainment. Controllers of this nature don’t find much use in other games, as they are specialized; however, this type of hardware excels in a public arcade setting, as it catches attention and invites multiple people to try the game. With the design of our Ouija board and candle-set controllers, we hoped to create an intriguing design that catches attention and encourages people to investigate and try it out.

Features and Goals:

We wanted our game to be a two-player game, to have controllers that represent the feel of the game, and to be fully functional. The two-player aspect worked well: throughout the critique people enjoyed playing and competing with each other, which is what we had originally wanted. The controllers were also implemented well; since our game involved ghosts, we felt it was appropriate to design them as a Ouija board and a sort of magical circle with candles, and we felt the controllers made the player more immersed in the game. The game itself was functional; there were some problems with collision detection, but overall it worked as intended for the most part.


YouTube Link:

Github Link:

Works Cited:

Birch, A. (2014). 25 Bizarre Video Game Controllers. Retrieved from

Experiment 4 – Progress Report

Joseph Eiles

Ermar Tanglao

Vijaei Posarajah


Project Inspiration:

For Experiment 4, our goal was to simulate an arcade game with a different dimension of play, moving away from traditional controllers. The gameplay would focus on a competitive two-player mode where Player 1 plays against Player 2 on screen. As we cycled through ideas for different types of games, such as air hockey, whack-a-mole, and Tron light cycles, we contemplated a concept inspired by the game "Luigi's Mansion," wherein the player would use a flashlight to shine at a hidden photocell to make an onscreen ghost disappear. This is where we decided that our game theme would revolve around ghosts and photocells. To develop this into the two-player game we intended, we decided to model our mechanics on soccer or hockey, where one player tries to score while the other plays defense. In this case, Player 1 would use a Ouija board with touch sensors to summon ghosts that try to reach the opposite side of the screen, while Player 2 would defend by using a light source to illuminate mock candles fitted with photocells, "banishing" Player 1's ghosts. We decided to pursue this game design and theme as it allowed us to create a competitive two-player game as well as unique ways for both players to interact with the game.
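The summon/banish loop described above can be sketched as plain game logic. This is an illustrative sketch only, not our project code: names like `SCREEN_WIDTH`, `GHOST_SPEED`, the lane layout, and the scoring rules are assumptions for the example, and all drawing (done in P5.js in the real game) is omitted.

```javascript
// Minimal sketch of the ghost summon/banish loop (illustrative only).
const SCREEN_WIDTH = 800;   // assumed play-field width in pixels
const GHOST_SPEED = 4;      // assumed pixels moved per frame

let ghosts = [];                  // ghosts summoned by Player 1
let score = { p1: 0, p2: 0 };

// Player 1's Ouija-board switch summons a ghost in a given lane.
function summonGhost(lane) {
  ghosts.push({ x: 0, lane });
}

// Player 2's flashlight lights the photocell for a lane: banish its ghosts.
function banishLane(lane) {
  const before = ghosts.length;
  ghosts = ghosts.filter(g => g.lane !== lane);
  score.p2 += before - ghosts.length;   // a point per ghost banished
}

// Called once per frame: ghosts advance; any crossing the screen score for P1.
function update() {
  for (const g of ghosts) g.x += GHOST_SPEED;
  score.p1 += ghosts.filter(g => g.x >= SCREEN_WIDTH).length;
  ghosts = ghosts.filter(g => g.x < SCREEN_WIDTH);
}
```

In the real setup the calls to `summonGhost` and `banishLane` would be driven by touch-sensor and photocell readings arriving from the Arduino.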


Creating a game for our final project was something of a unanimous decision; we were all gamers to some extent and couldn't see ourselves doing anything other than a game. As our main inspiration was the creation of a fun and physically interactive two-player game, we bounced around several ideas regarding old arcade games; we settled upon creating something akin to these games, as they were relatively simple and their pixelated graphics were potentially easy to replicate and manipulate on screen using the Arduino and P5.js.

With an idea in mind we browsed through multiple examples of the Arduino being used to create arcade-styled games. One example that stood out was Kris Temmerman's storefront arcade game, wherein the creator used an Arduino DUE and a NeoPixel display to create a public arcade setup where up to two people could play a pixelated game through the storefront window (Benchoff 2013). Although Temmerman's example did not include P5.js, it displayed the potential of the Arduino for creating games and the controllers to play them. Essentially, the Arduino had the potential to create a powerful and creative controller, while P5.js could take this a step further by offering fun and interactive digital visuals and sounds.

A big aspect of what influenced our final design was what we had available in terms of parts and resources for the Arduino, as well as our combined skillsets and knowledge in utilizing those parts. Our original designs had on-screen elements controlled by the player's hand movements, but we found that this would be incredibly difficult with the Arduino. We eventually settled upon analog switches and sensors so we could build on what we had learned from our previous experiments, as opposed to trying something new that might not work and would waste time. With the valuable advice and criticism we received on our previous group project, we hoped to avoid earlier pitfalls and expand upon what we had learned in a new and creative way.

Concepts, Techniques, and Materials:

The concept we decided to build was a two-player game where each player activates their own switches made from different materials. One player controls a human carrying a flashlight, while the other player controls the summoning of the ghosts. The ghosts are summoned using a Ouija board fitted with a switch and a heart-shaped planchette. For the other player we decided to use light sensors, so that it feels more realistic when the player shines their flashlight onto the spot where the in-game character shines theirs. For our visuals we decided to create 8-bit sprites, all coded using P5.js. As for materials, we planned to construct the game board from either foam or cardboard. The defending player would have light sensors in different areas of the board that they would have to aim their light at in order to stop the approaching ghosts. The player controlling the ghosts would interact with the Ouija board, summoning ghosts by sliding the heart-shaped planchette, making this a digital switch. Both of these items would likely use conductive fabric, as tinfoil would break too easily if people slid the piece too quickly or applied too much force.
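Turning a raw photocell reading into a clean "candle lit" event needs a small amount of care, since a flashlight hovering near the trigger level would otherwise toggle rapidly. A common fix is hysteresis: trigger above one threshold, re-arm only below a lower one. The sketch below shows the idea in plain JS; the threshold values are illustrative guesses, not measurements from our sensors.

```javascript
// Hysteresis on a photocell reading (0-1023 raw analog range assumed).
const LIT_THRESHOLD = 700;    // above this, the candle counts as "lit"
const DARK_THRESHOLD = 500;   // must fall below this before re-triggering

function makeCandle() {
  return { lit: false };
}

// Feed in each raw reading; returns true exactly once per lighting event.
function updateCandle(candle, rawReading) {
  if (!candle.lit && rawReading > LIT_THRESHOLD) {
    candle.lit = true;
    return true;               // freshly lit: banish ghosts on this spot
  }
  if (candle.lit && rawReading < DARK_THRESHOLD) {
    candle.lit = false;        // light moved away: re-armed for next flash
  }
  return false;
}
```

The Ouija board's planchette contact, being a simple digital switch, would not need this; only the analog light sensors do.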

Works Cited:

Benchoff, B. (2013, November 27). Turning a Storefront into a Video Game. Retrieved from

Other Contextual Links:

Materials as Sensors – Final Prototype

Joseph Eiles

Ermar Tanglao

Vijaei Posarajah


Github Link:

Project Description:

Our final prototype consists of two instruments constructed out of cardboard: a guitar and a drum set. When a specific note is played on an instrument by activating a digital switch, the corresponding sound is played through the laptop. The drum kit is also connected to a projector aimed at its front surface; when a note is played, a red projectile travels from the top to the bottom of the screen in an erratic, thunderbolt-like manner and creates a small explosion when it reaches the bottom. The projection can show multiple thunderbolts at once, depending on how many notes are played on the drum kit.
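The multi-thunderbolt behaviour boils down to keeping a list of active bolts, advancing each one per frame, and reporting an explosion when one reaches the bottom. A minimal sketch of that logic, with invented sizes and speeds and all P5.js drawing omitted:

```javascript
// Illustrative sketch of the thunderbolt projection logic (not project code).
const SCREEN_HEIGHT = 600;    // assumed projection height in pixels

let bolts = [];

// Each drum hit spawns a bolt at the top of the screen.
function onDrumHit(x) {
  bolts.push({ x, y: 0 });
}

// Advance all bolts one frame; returns explosion positions for this frame.
// `jitter` supplies the erratic sideways offset (random in the real thing).
function stepBolts(jitter) {
  const explosions = [];
  for (const b of bolts) {
    b.y += 12;                 // assumed fall speed per frame
    b.x += jitter(b);          // erratic thunderbolt wobble
    if (b.y >= SCREEN_HEIGHT) explosions.push({ x: b.x, y: SCREEN_HEIGHT });
  }
  bolts = bolts.filter(b => b.y < SCREEN_HEIGHT);
  return explosions;
}
```

Because bolts live in a list rather than a single variable, rapid drumming naturally produces several thunderbolts on screen at once.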

Sensing Method and Materials:

Our group used digital switches for all of our instruments. The conductive material we used most was aluminum foil, because it was the easiest and most accessible material to work with.



In this diagram, the green wires are connected to the strings of the guitar and the red wires are connected to the frets. The black wires are connected to the conductive fabric on the players' fingers and on the guitar pick. Touching a fret with the fingers does not create a sound by itself; instead, when the pick contacts a string, a sound is played depending on which fret is being held down.
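The fret-and-pick behaviour just described is a small lookup: the pick/string contact triggers the event, and the held fret selects which note sounds. A sketch of that selection logic, with a made-up two-string, two-fret note table purely for illustration:

```javascript
// Hypothetical note table: NOTES[string][fret] -> note name.
// Fret 0 stands for the open string; the tuning here is invented.
const NOTES = {
  0: { 0: 'E', 1: 'F', 2: 'F#' },
  1: { 0: 'B', 1: 'C', 2: 'C#' },
};

// Fires only when the pick touches a string; the held fret (or null
// for an open string) decides which note is returned for playback.
function pickString(string, heldFret) {
  if (heldFret === null) return NOTES[string][0];
  return NOTES[string][heldFret];
}
```

The key property matches the wiring: `pickString` is only ever called on a pick/string contact, so holding a fret alone stays silent.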



In this diagram, the red wires are connected to the parts of the drum and the black wires are connected to the drumsticks as well as one part of the kick drum pedal. When the sticks hit any of the spots on the drum, the corresponding drum kit sound is played depending on where they hit.

Documentation – Process:

Documentation – Exhibition:

Video of our Final Prototype:

Project Context:

The context and inspiration behind our project was to create a virtual band setup akin to the well-known video game franchise Guitar Hero. Guitar Hero was a popular and beloved franchise which began in 2005, wherein the player would use a guitar-shaped controller to play along to a list of popular songs as if they were part of a rock band; the formula proved to be a success, with earnings reaching one billion dollars within the first week of its premiere (Venard n.d.). For our final prototype we wanted to create something akin to the virtual band represented in Guitar Hero; our first prototypes revolved around analog sensors or switches that could be placed on a table or wall, but we decided to mount them onto appropriate instruments made out of cardboard to give the user a better sense of immersion and to make the experience more entertaining. Our prototype differs from Guitar Hero in that, instead of playing predetermined notes shown on screen, we gave the user more freedom by allowing them to play any note they want, with an appropriate sound always played; our prototype offers a more freestyle and creative experience. We also hoped that the interactive visuals would encourage the user to play the drum kit more, while offering an entertaining experience for people watching.

Works Cited:

Venard, C. (n.d.). The Beats to Beat: A History of Guitar Hero. History Cooperative. Retrieved from

Koller, D. (2014). Visual: Drumset. Retrieved from