Collapsing City – Immersive Sound and Augmented Spaces

Collapsing City – An AR Experience

Michael Shefer – 3155884

Andrew Ng-Lun – 3164714

Brian Nguyen – 3160984

Rosh Leynes – 3163231


Description

For our Immersive Sound and Augmented Spaces project we set out to create a narrative space built with sound and supported by visuals, all experienced within an AR environment. A set of four rooms was modeled and built in Unity, each varying in concept and layout. The rooms are all connected to one another, allowing for linear and seamless storytelling. Essentially, with a phone, the audience maneuvers through the four rooms and listens to the sounds of each one to piece together the narrative. As the user approaches certain objects in the AR space, a proximity sound plays to push the story further. The narrative of the space follows a city collapsing under global tensions. The initial room is a regular day in the city, accompanied by music, children playing, and everyday city chatter. It connects to a room with a television that cuts from a game of football to an emergency broadcast of nuclear war. The next scene is a dense city environment with a ruined building, air-raid sirens, and fire. The final scene is the city in complete ruin: buildings decimated, rubble scattered, and the ambient howl of the wind.



Process

When developing the concept, we knew we wanted to tell the story through different rooms and environments, each with its own variation of sound, so we started by organizing the rooms in a rough sketch.


Originally we wanted to lay the whole scene out on a single plane that the audience could look over, leaning in closer to details while experiencing the change in the narrative. We then decided that having the audience walk through each room individually, as if they were inside the environment, would yield a stronger reaction and connection to the narrative. When creating the rooms separately, we initially planned to teleport the audience to each room after they stepped through a door frame or entrance to the scene. We scrapped that idea because our intent was to have the audience experience the change in the environment, so we bridged all the individual rooms together to create a seamless experience. Since we were all working on one Unity project and making frequent changes, we used Unity Collaborate, which let us pull in each other's changes.


Since sound was a main component of this experiment, we worked with triggers and ambient sounds. On top of the general ambience that establishes each scene's environment, we included triggers that help tell the narrative. For instance, approaching the television plays a soccer game that cuts into an emergency broadcast. Approaching the rubble triggers the sound of explosions to convey what had happened. Although we had visuals and 3D models to build our environments, the sound was crucial to telling the story.
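
To give a sense of how the triggers worked, here is a minimal sketch of the kind of script that could sit on an object like the television. The component and tag names are illustrative, not taken from our actual project files.

```csharp
using UnityEngine;

// Minimal proximity sound trigger: plays its AudioSource once when the
// player's collider enters the trigger volume around an object (e.g. the TV).
[RequireComponent(typeof(AudioSource))]
public class ProximitySoundTrigger : MonoBehaviour
{
    [SerializeField] private string playerTag = "Player"; // tag assumed on the camera rig

    private AudioSource source;
    private bool hasPlayed;

    private void Awake()
    {
        source = GetComponent<AudioSource>();
        source.playOnAwake = false; // only play when the audience walks up to it
    }

    private void OnTriggerEnter(Collider other)
    {
        if (hasPlayed || !other.CompareTag(playerTag)) return;

        hasPlayed = true; // the story beat should only fire once
        source.Play();
    }
}
```

The collider on the object needs Is Trigger enabled for OnTriggerEnter to fire, and the AudioSource holds the clip for that room's story beat.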



Challenges

We experienced several challenges throughout the development of our experiment, most of which were eventually resolved, but two stood out: the sound triggers and the scale of our approach.

For sound, we wanted each scene to vary in atmosphere, so we used triggers to achieve this. A trigger requires something physical to pass through it in order to fire, so when the camera walks into the room with the television, something has to interact with the trigger to activate the sound. We worked around this by attaching a cube to the camera and adding a Rigidbody component to it so that it could interact with the triggers.
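
A rough sketch of that workaround is below, assuming the cube is parented under the AR camera; the script name and values are illustrative rather than the exact ones we used.

```csharp
using UnityEngine;

// Attached to the cube parented under the AR camera so the camera has a
// physical presence that can set off OnTriggerEnter on the sound triggers.
// The cube primitive already carries a BoxCollider.
public class CameraTriggerBody : MonoBehaviour
{
    private void Awake()
    {
        // Kinematic so physics never pushes the camera around,
        // but trigger events still register against the cube's collider.
        Rigidbody body = gameObject.AddComponent<Rigidbody>();
        body.isKinematic = true;
        body.useGravity = false;

        gameObject.tag = "Player"; // matches the tag the triggers check for
    }
}
```

Unity only raises trigger events when at least one of the two colliders involved has a Rigidbody, which is why the camera needed one rather than just a collider.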


The largest challenge we encountered was how we approached the experiment. We really enjoyed the idea of having variety across scenes to tell the narrative, so we focused most of our development on building these scenes and rooms, complete with sound, particles, and an assortment of sourced 3D models. Throughout development we had to regularly scale down our scene sizes and limit particles and lighting so the build could run effectively on phones. In the end, the build still ran with lag, and users weren't able to make it through the first scene due to its sheer scale.
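
Most of that scaling down happened by hand in the editor, but a script along these lines shows the kind of pass we were making; the particle cap is a placeholder value, not something we measured.

```csharp
using UnityEngine;

// Rough mobile-performance pass: cap every particle system's particle count
// and strip realtime shadows, two of the things that hurt our frame rate most.
public class MobileSceneTuner : MonoBehaviour
{
    [SerializeField] private int maxParticlesPerSystem = 100; // placeholder cap

    private void Start()
    {
        foreach (ParticleSystem ps in FindObjectsOfType<ParticleSystem>())
        {
            ParticleSystem.MainModule main = ps.main;
            main.maxParticles = Mathf.Min(main.maxParticles, maxParticlesPerSystem);
        }

        foreach (Light sceneLight in FindObjectsOfType<Light>())
        {
            sceneLight.shadows = LightShadows.None; // realtime shadows are costly on phones
        }
    }
}
```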
