As the lead developer for “Sonophobia”, I had the job of creating the VR experience within Unity. The games I played as reference were Stifled and Face Your Fear. Both are VR horror games, which relates to the game we’re making because it relies on giving the player a scary experience through sound.
These games showed me how audio can be used to build suspense and give the player a sense of fear as they play. I implemented this in our game using the Oculus Integration spatial audio (indicated by the red orb around the object).
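As a rough illustration of the spatial audio set-up, the sketch below configures a Unity AudioSource so a cue is fully 3D and only audible within a limited radius. The actual project routes audio through the Oculus Integration spatializer; the distances here are illustrative, not our real values.

```csharp
using UnityEngine;

// Illustrative sketch: make an AudioSource fully spatialized and
// only audible within a small radius around the emitting object.
[RequireComponent(typeof(AudioSource))]
public class SpatialCue : MonoBehaviour
{
    void Start()
    {
        var src = GetComponent<AudioSource>();
        src.spatialBlend = 1f;                      // 1 = fully 3D positional audio
        src.rolloffMode  = AudioRolloffMode.Linear; // fades to silence at maxDistance
        src.minDistance  = 1f;                      // full volume inside this radius
        src.maxDistance  = 10f;                     // inaudible beyond this radius
        src.loop = true;
        src.Play();
    }
}
```

Linear rolloff is the simplest way to guarantee the sound is completely silent outside the radius, which is what makes the audio a reliable proximity cue for the player.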
The game Stifled is similar to ours in terms of mechanics because the player uses echolocation to see their surroundings. Our version of this uses shaded colours instead of outlines.
Another mechanic we share with Stifled is how our enemy AI works. In Stifled it is a free-roaming AI that responds to your sounds. In our game the enemy is either on a fixed track or actively tracking the player, depending on the level. The enemy tracks the player through sound as well, but he cannot move when no sound is being produced. In that sense the enemy’s AI is similar to the Weeping Angels from Doctor Who: in our game, whenever the player scans, the enemy gets closer.
We gave the enemy different poses for when he is scanned, so the player gets the unsettling sense that he is making scary movements as he approaches.
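The Weeping Angel behaviour described above could be sketched like this: the enemy only advances while sound is being produced, and strikes a random pose each time the scan reveals it. This is a hypothetical sketch, not our actual code; in particular, `SoundMeter.Loudness` is a placeholder for however the game measures the player’s noise.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Hypothetical sketch of the "Weeping Angel" enemy: it moves toward
// the player only while sound is being made, and snaps to a random
// scare pose whenever the echolocation scan catches it.
public class WeepingEnemy : MonoBehaviour
{
    public Transform player;
    public float hearingThreshold = 0.1f; // illustrative value

    NavMeshAgent agent;
    Animator animator;

    void Awake()
    {
        agent = GetComponent<NavMeshAgent>();
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // SoundMeter.Loudness is a placeholder for the game's noise level.
        bool heardSound = SoundMeter.Loudness > hearingThreshold;
        agent.isStopped = !heardSound;   // freeze while the world is silent
        if (heardSound)
            agent.SetDestination(player.position);
    }

    // Called by the echolocation system whenever the enemy is scanned.
    public void OnScanned()
    {
        animator.SetInteger("Pose", Random.Range(0, 4)); // one of 4 scare poses
    }
}
```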
My group decided to scope down: instead of 3 levels we would create 2, with movement handled by the analog sticks and a button press to activate the ability. Our reasoning for having 2 levels was to create a sense of progression and an increase in difficulty for the player. During this time the second level was created, representing the 3rd circle of hell, Gluttony, which is shown by the amount of food in the scene. The edited 360 videos containing clues were also placed in the scene for the player to find.
Through the workshops our group decided what to use our 360 videos for and how we are going to use VR to enhance that experience for the player. From our discussions we decided to base our levels on the real given space, so the player feels more immersed while playing the game. To add to that sense of immersion, we wanted the player to activate their ability using the mic.
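One plausible way to wire up the mic-activated ability is to loop-record from the default microphone and fire the scan when the input level crosses a threshold. This is a sketch under assumptions; the threshold, buffer size, and the `ActivateScan` hook are all illustrative, not our shipped code.

```csharp
using UnityEngine;

// Sketch: trigger the echolocation ability when the player makes a
// loud enough sound into the microphone.
public class MicTrigger : MonoBehaviour
{
    public float threshold = 0.2f;       // illustrative activation level
    AudioClip micClip;
    float[] samples = new float[256];

    void Start()
    {
        // Loop-record one second of audio from the default microphone.
        micClip = Microphone.Start(null, true, 1, 44100);
    }

    void Update()
    {
        int pos = Microphone.GetPosition(null) - samples.Length;
        if (pos < 0) return;             // not enough audio recorded yet
        micClip.GetData(samples, pos);

        // Peak level of the most recent samples.
        float level = 0f;
        foreach (float s in samples)
            level = Mathf.Max(level, Mathf.Abs(s));

        if (level > threshold)
            SendMessage("ActivateScan"); // hypothetical hook into the scan ability
    }
}
```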
With the 360 videos, we wanted to tell a story at the beginning to give the player an understanding of what they are being brought into, and we will also create visual clues within the 360 video for the player to find.
For the contextual review, I played the games Stifled and Face Your Fear. Both are VR horror games, which relates to the game we’re making because it relies on giving the player a scary experience through sound.
From these games I learned how audio can build suspense and give the player a sense of fear as they play. Stifled is very similar to our game because we share the mechanic of giving the user an echolocation-type ability to navigate the space. Where our ability differs is in how the environment appears when scanned.
In Stifled, scanning draws an outline around the environment models, and the enemy is highlighted in red.
In our game, we extended the radius of the echolocation after feedback that the effect could trigger seizures, and we made the floor, environment, enemy, and exit portal scan in different colours so the player can tell them apart.
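The colour-coding could be sketched as a simple per-category tint applied to whatever the scan hits. This is an illustration only; the tag names and colours are assumptions, and the real effect is done in the scan shader rather than by swapping material colours.

```csharp
using UnityEngine;

// Illustrative sketch of the colour-coded scan: tint each revealed
// object by category so the player can tell floor, enemy, and exit
// apart at a glance. Tag names here are assumptions.
public class ScanTint : MonoBehaviour
{
    public void OnScanned(GameObject hit)
    {
        Color tint = Color.white;                       // default: environment
        if (hit.CompareTag("Floor"))           tint = Color.grey;
        else if (hit.CompareTag("Enemy"))      tint = Color.red;
        else if (hit.CompareTag("ExitPortal")) tint = Color.green;

        var rend = hit.GetComponent<Renderer>();
        if (rend != null)
            rend.material.color = tint;
    }
}
```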
Another mechanic we share with Stifled is how our enemy AI works. In Stifled it is a free-roaming AI that responds to your sounds. In our game the enemy is either on a fixed track or tracking the player, depending on the level. He tracks the player through sound as well, but cannot move when no sound is being produced.
From Face Your Fear I learned how a game can use audio and visuals to scare the user. I implemented this in our game with spatial audio that can only be heard when the user is within a certain radius of either the enemy or the video. In addition to hearing the enemy when near it, the user will also feel vibrations from the controller, warning them that the enemy is close. For the visual aspect, we will create certain poses that the enemy strikes when scanned, similar to the Weeping Angels from Doctor Who: each time the enemy is scanned, it gets closer to the player.
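The proximity vibration can be driven with the Oculus Integration `OVRInput` API, as in the sketch below: both Touch controllers buzz while the enemy is within a warning radius. The radius and vibration strength are illustrative values, not our tuned ones.

```csharp
using UnityEngine;

// Sketch: vibrate both Touch controllers as a warning while the
// enemy is within warningRadius of the player.
public class ProximityHaptics : MonoBehaviour
{
    public Transform enemy;
    public float warningRadius = 5f; // illustrative distance

    void Update()
    {
        bool near = Vector3.Distance(transform.position, enemy.position) < warningRadius;
        float amplitude = near ? 0.5f : 0f; // 0 stops the vibration

        OVRInput.SetControllerVibration(0.5f, amplitude, OVRInput.Controller.LTouch);
        OVRInput.SetControllerVibration(0.5f, amplitude, OVRInput.Controller.RTouch);
    }
}
```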
My work on this VR project is mostly within Unity. I’m in charge of getting the game mechanics working, making sure those mechanics are usable within VR, integrating the 360 videos, and designing levels.
I first started by getting our core mechanic down, the echolocation effect. I went through many sources, found an example of the effect we were going for, and adapted their code for our game.
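A common way such an effect is driven, and a plausible sketch of what we adapted, is a script that grows a global shader float outward from the scan origin each frame, with the echolocation shader revealing anything inside that radius. The property names and speeds here are assumptions, not the exact code from the source we used.

```csharp
using UnityEngine;

// Hypothetical driver for an echolocation pulse: a radius expands
// from the scan origin over time, and the scan shader reads the
// global "_ScanDistance" / "_ScanOrigin" values to reveal geometry.
public class ScanPulse : MonoBehaviour
{
    public float speed = 15f;     // pulse expansion in units per second
    public float maxRadius = 40f; // pulse ends past this distance
    float radius = -1f;           // negative = no active pulse

    public void StartScan()
    {
        radius = 0f;
        Shader.SetGlobalVector("_ScanOrigin", transform.position);
    }

    void Update()
    {
        if (radius < 0f) return;
        radius += speed * Time.deltaTime;
        if (radius > maxRadius) radius = -1f;
        Shader.SetGlobalFloat("_ScanDistance", radius);
    }
}
```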
The next mechanic our game needed was having the enemy track the player. For the enemy AI I used a pathfinding asset from the Unity Asset Store.
Right now, for our next build, I am working on integrating the 360 videos we have recorded into the levels, as well as creating levels that are challenging and horrifying for the player.
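One way to place a recorded 360 clip inside a level is Unity’s `VideoPlayer` painting onto the material of an inward-facing sphere the player can approach. This is a sketch of that approach under assumptions; the project may wire the videos differently.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch: play a 360 clue video onto this object's material
// (e.g. an inward-facing sphere placed in the level).
public class Clue360 : MonoBehaviour
{
    public VideoClip clip; // assigned in the Inspector

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = clip;
        player.renderMode = VideoRenderMode.MaterialOverride; // render onto this object's material
        player.targetMaterialRenderer = GetComponent<Renderer>();
        player.isLooping = true;
        player.Play();
    }
}
```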
One of our levels
Game: Face Your Fear
What works in this game is its use of audio and visuals to build suspense and instill fear in the player. It doesn’t immediately throw you into the horror but slowly introduces the monster into the scene using audio and visual cues. From there the game builds up, with the monster getting progressively scarier and louder, leaving the user freaking out and wondering when and where the monster will appear. What I learned from this experience is how the game uses audio as a major tool to direct where the user looks during the game, and how audio can influence the mood of the player.
Game: Stifled
This game limits your vision to a black screen; the only way to see is to produce a sound using the controllers, which gives an echolocation-type view and highlights part of the environment for a short time. The echolocation mechanic is one my group is using, so it was a good experience to play a game with the same mechanic. Playing it taught me how we can use this mechanic, build our environment to fit our gameplay, and shape how the player will experience it.
The creation tool I used was Google Blocks within the Oculus Rift. The application felt fun and easy to use, with its simple interface and its ability to model in 3D space. In my time with the application I created a simple robot.
The next application I tried was the music creation app Modulia Studio, which let me create music using its virtual keyboard and build full songs. Since the app is free, I could not access the other forms of keyboards, so I had to combine multiple smaller keyboards into one big keyboard.
As for 360 video, I did not experience one myself, but I watched and talked to classmates who experienced S.E.N.S VR. That game relies on the user’s head movement and line of sight as the trigger, which makes for an interesting way to play. The illustrations within the 3D space made it feel like the user was inside a comic book.