Design Journals: Recording Studio / Audio Lab


When choosing a voice actor, we were able to find an incredibly talented one by networking within the Integrated Media and Digital Futures community. Our team reached out to three individuals via Instagram, requesting a short clip of them reading our script, and unanimously decided to move forward with the actor we thought had the most soothing voice.

When working in the recording studio, our team made sure to send our voice actor a draft of the audio script well in advance of our booked studio time, giving them ample time to review and edit it. This was important because we wanted to capture all the voiceover audio we needed in one three-hour session. Outside of school, recording studios can charge a pricey hourly fee, so it is good practice to come prepared with scripts finalized and memorized beforehand.

Once the voice acting audio was done, we moved on to creating an effective sound design strategy. Our team went through each step in our user journey and brainstormed a variety of sounds that we felt best fit the environment and each scene's intentions. For example, in the beach scene of our VR experience, we decided to include the sounds of seagulls, people talking, a conch shell, a boat horn, etc. For each of these audio assets, we thought about how we wanted the sound to make our user feel. Do we want the user to feel alert? Sleepy? Creative? Relaxed? Based on the intended emotion for each sound, we paired it with an alpha, theta, or delta binaural frequency to enhance the emotional aspect of the sound design. We did this because these frequencies are associated with particular brain states in listeners; for example, a delta frequency is likely to encourage a sleepy, dream-like state.



Design Journals: 360 Video Capture

When capturing 360 Video footage it is important to keep in mind the viewpoint that the footage puts the user in. For example, footage that is shot low to the ground will make the user feel very small in VR, whereas footage shot from high up will make them feel very tall in VR.

In my experience, the best way to capture the intended user viewpoint is to use the Yi360 camera and the Yi360 mobile application. When our team switched from the Samsung 360 to the Yi, we were able to optimize our capture time by previewing footage on the mobile application. This enabled us to make tripod adjustments and visual design decisions on the fly without having to connect to a laptop. The camera + app workflow worked particularly well for us when filming in Allen Gardens without a permit, especially because we had to be quick (and stealthy) in capturing the footage.

When capturing 360 footage, it is also imperative to check the weather conditions before a shoot. Our team made the unfortunate mistake of filming the Bluffs on a cold, overcast afternoon. This resulted in a miserably cold shoot and poorly lit, inconsistent footage. From this experience, we learned that the optimal time for a 360 shoot is around midday, when the sun is at its highest, so that light conditions stay relatively consistent throughout the shoot.

In the future, I would really like to take the skills and tools we acquired from this project and use them to capture 360 videos of different ecosystems. For example, I think it would be visually interesting to see the world through the eyes of a beetle, a bird, or a tropical fish while hearing interesting stories about their ecosystem.

Design Journals: User Experience Mapping

In mapping the user journey for VR experiences, I've learned that it's very important to keep in mind the physical condition of the user. For example, moving-camera 360 footage can be a challenge, as it can cause a disconnect between the user's visual input and their sense of motion. This disconnect can cause nausea, ruining the mindful experience. Additionally, it is important to remember that the user's range of motion is limited by the VR headgear. For these reasons, it is important to minimize any excessive motion on the user's part and simplify any gesture-based interactions.

When designing a mindfulness VR experience, asset design is the most important factor in creating a relaxing VR environment. The subject matter and colour grading of 360 video can alter an experience drastically. For example, in our VR experience we captured footage of a beach in the winter. With colour grading we were able to warm up the scene so that it looked more like an early spring day. Even though the colour-graded and non-colour-graded footage were identical in content, users perceived the warm-coloured footage to be significantly more relaxing than the cool-coloured winter footage.

Sound design also plays a huge role in creating a successful mindfulness VR experience. To evoke the intended reactions from users, it is best to carefully plan out the kinds of sounds you want and list the emotions you want each sound to evoke. That way, when you are sourcing your audio assets, you can ask yourself: "What emotion does this sound evoke?", "Why?", and "Is this the emotion I want my user to feel?"

Because our VR game/experience is aimed at promoting a state of relaxation, we've made sure to keep our footage as still as possible and remove any excess motion from the user experience, eliminating the yoga scene and replacing controllers with gaze- and breath-based interactions. Our team went through multiple iterations of the user experience, refining the user flow and design elements to meet only the essential objectives of the mindfulness experience.


Atelier IV Blogpost 6: Getting to Completion

UPDATE: Due to the restrictions posed by COVID-19, we have had to rescope our VR project significantly. Therefore we will only be completing one scene in our VR experience: the beach scene.

For the beach scene, sound is one of the most important components. I have focused the majority of my efforts on getting the instructional voiceover as clear and simple as possible in terms of both content and sound quality. I got my very talented voice actor, Shaughn, to help me record the voiceovers for all of the scenes in the recording studio.

I then took the audio file for the beach scene and edited out any imperfections in the sound: shakiness, hard breathing, errors in the script reading, and background noise from the voice actor moving around.

Next, I had to scour the web for free background music and foley that I could use without any copyright infringement. For the background foley this was pretty easy, as YouTube's audio library is quite extensive. However, for the sounds more niche to mindfulness and meditation practice, like binaural beats and Tibetan singing bowls, I really had to do some digging. I eventually found this site, which provides free-to-use meditation sounds and music.

Next, I took the collected audio and mixed it in Audition to create smooth transitions and more complex audio. For the voiceover, I added the faint sound of waves and wind blowing in the background. For each object in the scene, I created an audio track that combines intuitive sounds (the sound the object would make, e.g., a boat would emit a boat horn) with binaural beats.
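Audition handles this layering visually, but the underlying operation is just summing gain-scaled signals. A minimal sketch, assuming mono floating-point audio in the range [-1, 1]; the function name and gain value are illustrative, not taken from our project files:

```python
import numpy as np

def mix(voice: np.ndarray, ambience: np.ndarray,
        ambience_gain: float = 0.2) -> np.ndarray:
    """Layer a quiet ambient bed (e.g. waves and wind) under a voiceover.

    Both inputs are mono float signals in [-1, 1]; the shorter track is
    treated as ending in silence so the two line up sample-for-sample.
    """
    n = max(len(voice), len(ambience))
    out = np.zeros(n)
    out[:len(voice)] += voice
    out[:len(ambience)] += ambience_gain * ambience
    return np.clip(out, -1.0, 1.0)  # guard against clipping after summing
```

Keeping the ambience at a low gain (here 20%) lets the voiceover stay intelligible while the background bed fills the silence.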

Binaural beats or "binaural tones" are created by playing slightly different frequencies separately into each ear. The difference between the two frequencies stimulates a response in the brain that correlates with this frequency difference. They are used by a variety of practitioners and private users for improving self-confidence, stress relief, pain management, relaxation, improving concentration, and improving the quality of one's sleep. There are four main types of binaural frequencies: alpha, theta, delta, and gamma.

  1. Alpha music encourages a state of alpha relaxation. The alpha state is a pleasant state of relaxed alertness. While in a state of alpha relaxation, the mind is quite clear and receptive to information, learning is accelerated and one’s sense of creativity is enhanced [1].
  2. Theta is a state of tremendous stress relief. The benefits associated with theta level relaxation include improved concentration, reduced hyperactivity, and improved memory. While in a state of theta relaxation, one’s blood pressure, breathing and heart rate all slow to a much more restful and healthy level that promotes natural healing [1].
  3. Delta brainwaves are most prevalent during deep, dreamless sleep. The delta state is a mostly unconscious state that is essential to one’s physical, emotional, psychological and spiritual wellbeing. Delta frequency brainwave entrainment music is also a fantastic cure for insomnia [1].
  4. Gamma brain waves are believed to link and process information from all other parts of the brain. High levels of gamma brain waves have also been linked to improved memory and increased sensitivity to sensory input. Low amounts of gamma brainwave activity have been linked to learning difficulties, poor memory, and impaired mental processing [1].
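Since a binaural beat is just two sine carriers offset by the target band frequency, it can be synthesized directly. A minimal sketch, assuming NumPy; the carrier frequency and amplitude are illustrative choices, not the values used in our mix:

```python
import numpy as np

SAMPLE_RATE = 44100  # CD-quality sample rate, in Hz

def binaural_beat(carrier_hz: float, beat_hz: float, seconds: float,
                  amplitude: float = 0.3) -> np.ndarray:
    """Return a stereo signal whose left/right carriers differ by beat_hz.

    The listener's brain perceives the difference between the two ears
    as a slow pulse: a 10 Hz difference falls in the alpha band, ~6 Hz
    in theta, ~2 Hz in delta, and ~40 Hz in gamma.
    """
    t = np.linspace(0.0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
    left = amplitude * np.sin(2 * np.pi * carrier_hz * t)
    right = amplitude * np.sin(2 * np.pi * (carrier_hz + beat_hz) * t)
    return np.column_stack([left, right])  # shape: (n_samples, 2)

# A 10 Hz alpha-band beat: 200 Hz in the left ear, 210 Hz in the right.
alpha = binaural_beat(carrier_hz=200, beat_hz=10, seconds=5)
```

The stereo array can then be written to a WAV file or layered under a foley track; the effect depends on the listener wearing headphones, since each ear must receive its own carrier.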

Using this understanding of brainwave entrainment frequencies, I strategically chose to associate specific sounds with specific frequencies. For example, when the user activates the windchimes object, the audio will be the sound of windchimes accompanied by a soft alpha binaural frequency. Personally, I find the sound of windchimes very light-hearted and pleasant, so I chose the alpha frequency to further encourage a sense of positive, light relaxation and reflection.

Another example of this can be seen in the boat object. When the user activates the boat object, the audio will be the sound of a boat horn. This sound really reminded me of deep, uninterrupted sleep, so I chose to accompany it with delta frequencies. I have applied the same process of sound association to the rest of the objects in the scene. It is my hope that users will subconsciously understand and benefit from the sound design as part of their meditative practice.
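The object-to-frequency associations described above can be captured as a simple lookup table. This is a hypothetical sketch: the file names and band values are illustrative, not the project's actual asset list.

```python
# Approximate beat frequencies (Hz) for each entrainment band.
BAND_HZ = {"alpha": 10, "theta": 6, "delta": 2, "gamma": 40}

# Each interactive object pairs its intuitive sound with the band
# chosen for the emotion it should evoke.
OBJECT_DESIGN = {
    "windchimes": {"foley": "windchimes.wav", "band": "alpha"},  # light, pleasant relaxation
    "boat":       {"foley": "boat_horn.wav",  "band": "delta"},  # deep, sleep-like calm
}

def beat_for(obj_name: str) -> int:
    """Look up the binaural beat frequency paired with a scene object."""
    return BAND_HZ[OBJECT_DESIGN[obj_name]["band"]]
```

Keeping the design in one table makes it easy to audit whether every object's sound and frequency still match its intended emotion as the scene evolves.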



  1. information

Atelier IV Blogpost 5: VR Game Progress

During the development process, I have been working on shooting and editing the 360 video for the Garden and Bluffs scenes, sourcing and recording sounds for background foley, and further developing the user experience through flow charts and process documents.

360 Video 

Over the past two weeks, I have done two 360 video shoots. The first one was at the Scarborough Bluffs. Unfortunately, this shoot was very physically taxing and, in the end, not too successful. The weather on the day of the shoot was not consistent; by the time I made it up to the top of the Bluffs it had begun to snow, and the overcast lighting was very dim. Hiking up the Bluffs in the freezing snow with a tripod, camera, microphone, etc., was also challenging, and I had to do it twice because at some point my Yi360 camera died. As a result of all these challenges, the captured video was too dark, shaky, and overexposed in areas where the light reflected off the snow. Needless to say, this experience was not a pleasant one, but I did learn a lot from the process:

  1. Always check the weather before doing an outdoor shoot.
  2. Carry a portable charger for the camera.
  3. Don't wear combat boots on a hike.

The next shoot was at Allen Gardens. Luckily, the staff keep the indoor gardens warm for the plants, so I did not have to worry about the cold so much as making sure the sun was out for good lighting. I did several shots all around the gardens using the Yi360 camera and the smartphone app. The app really improved the efficiency of my workflow and made it easier to get shots without being captured in the footage myself. I did have some challenges getting the footage at an average human height because the staff at Allen Gardens do not allow tripods, or video shooting in general. Therefore, I had to sneak around the staff and hide the camera in inconspicuous places to get the footage. This experience was by far easier than the last one because I was able to get the app working (which made it easier to sneak) and the weather was much nicer that day. One lesson I learned from this experience was to check for permits before showing up to a shoot location. Allen Gardens is a free-entry community space, so I assumed that video shooting would not be a problem, and it isn't, as long as you have a permit.

Sound Sourcing 

While on site for shoots, I was also using the ambisonic sound recorder to capture the sound of the water at the Bluffs and the chirping birds and chatting pedestrians at Allen Gardens. Furthermore, I have done some research on sound design. Through this research, I found that a number of frequencies and sounds can help induce a sense of calm as well as 'mindful' brain activity. These include isochronic tones, Tibetan singing bowls, Solfeggio frequencies, sounds of nature, etc. The next step was to compile a list of sounds that fit within each scene and are associated with calm, stable brain activity.

User Experience 

As we iterate, the user experience diverges and converges with our development process. Part of my role on this team is to consistently update the documentation and user flow charts with any changes made to the overall experience and the reasons for them. My secondary role is to constantly monitor these changes and adjust our timeline and project management as needed.


Atelier IV Blogpost 4: VR Contextual Review


Breath Tech – Game

I really enjoyed the different ways that breath was used in this game; the designers really took the time to analyze the many ways breath is used in the real world, such as in windpipes, blowing out candles, melting ice, etc. The intuitiveness and creativity of these interactions made the gameplay very interesting and not too difficult to figure out. As a player, I found the process of figuring out the challenges rewarding, and the art was intriguing. I also appreciated the breath calibration and visualization; I thought it was very successful in delivering feedback so that the user understands that their breath is having an actual impact on the environment.

From this experience, I have learned that I need to expand my understanding of the different ways in which breath is used. Moving forward, I would like to brainstorm the possibilities of breath in my own VR experience and think more deeply about the ways breath can be visualized and used to change an object or environment.

Breath Peace – Meditation Experience 

This experience was really cute and aesthetically beautiful; however, I did feel a bit impatient with the slowness of the activity. The art is so well done, and honestly the little panda character made me smile; it was very fitting for the theme. I really appreciated the way the scene would glow and change hue with the breath prompts. This is definitely something I would like to recreate in my own VR experience.

However, there were points where I didn't really feel peaceful, as I was anticipating more action on the screen. At the beginning of the scene there was text to guide the meditation, but I feel that audio might have been more powerful, as the text was visually distracting. After a while the experience felt kind of repetitive. I understand it is a mindfulness experience and is meant to be that way, but I wonder if there is a way to adapt the user flow so that even an impatient user without much time can gain something from the experience.

From this experience, I quickly learned that text in VR does not contribute to a 'mindful' or 'calming' experience, so I will definitely try to limit the amount of text in my own experience. Instead, I would like to focus on improving the visual representation of the meditative experience. I also learned how important emotional design is for a mindfulness piece. The panda character genuinely gave me joy; this kind of storytelling is so simple yet powerful and can instantly be a mood changer. It reminded me of the way a compliment, a happy dog, or a kind gesture can make someone feel, and I wonder how I can recreate these emotions in my own VR experience with simple actions and animations.

Noodles – Game

I did not like this experience at all; in fact, it actually made me kind of annoyed while I was playing. The UI is way too cluttered with text, and it is not easy to read because it is all set in a cursive-style font. I feel that menus and instructions should always be set in a plain typeface for easy legibility; decorative fonts are for titles and headings only. When the game began, the text was still all around the game environment, which made the scene visually crowded. Despite all the text in the scene, I still had no idea how to play this game; I was told to pick up a stool, and when I did, nothing happened. Personally, I felt very sick very quickly when moving from room to room. After 30 seconds of trying to navigate I felt dizzy and had to exit the game.

From this experience, I have learned that using the controls to move around is a tricky feature and will probably take a long time to develop so that the user doesn't feel sick. If I do decide to let users navigate the VR space in my experience, I will definitely incorporate a teleporting feature rather than using the joystick on the controller.

Atelier IV Blogpost 3: 360 Video Color Grading

This week I had to learn how to edit and color grade the 360 videos I captured last week using Premiere Pro and After Effects. I started by enabling the necessary supports in Premiere Pro that allow users to edit videos for VR. Then, using After Effects, I edited the tripod out of the scene by clone stamping the sand background over it. I temporarily added some text to the scene to see what it would look like to have the audio instructions appear as subtitles. However, I removed it because it did not look very good and ultimately will not be used in the final version of our project.

To view the tutorial I used to do this, Click Here.

Because the footage was shot in the winter, my aesthetic goal for this edit was to change the lighting to make the beach scene look warmer. I also wanted to increase the saturation to make the pink of the beach umbrellas and the blue of the water stand out more. To do this, I tested out a variety of LUTs, but I didn't find any one in particular that helped achieve my aesthetic goals, so I adjusted the Lumetri properties myself, using this video tutorial as a guide.
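Lumetri does this interactively, but the basic idea of warming a frame (pushing reds up and blues down) can be sketched numerically. This is a crude stand-in for a temperature slider, assuming RGB frames as floats in [0, 1]; the function name and amount are illustrative:

```python
import numpy as np

def warm(frame: np.ndarray, amount: float = 0.1) -> np.ndarray:
    """Warm an RGB frame by boosting the red channel and trimming blue.

    `frame` has shape (height, width, 3) with float values in [0, 1].
    This is a very rough approximation of a colour-temperature shift.
    """
    out = frame.copy()
    out[..., 0] = np.clip(out[..., 0] * (1 + amount), 0.0, 1.0)  # red up
    out[..., 2] = np.clip(out[..., 2] * (1 - amount), 0.0, 1.0)  # blue down
    return out
```

A real grade would also touch contrast, saturation, and midtone balance, which is why the Lumetri panel's dedicated controls were the better tool for the actual footage.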

Lumetri Color Changes


Old Video Footage

New Edited Video Footage


Atelier IV Blogpost 2: Unity + VR Set Up

9:00 – During class today we went over the history of virtual reality. I was very surprised to find out that this timeline stretches back to the 1600s–1800s. We looked at some early VR projects; one I found particularly interesting aimed to fully engage the user's senses through an immersive visual reality experience. The artists used smell as an element of their project by building a dome that released scents as the user experienced the virtual reality.

10:00 – Next we learned how to set up VR with Unity through a quick tutorial. We had to make sure we had the updated version of Unity and turn on XR capabilities as well as virtual reality supports. Through an XR example scene, we were able to learn about the different types of VR interactions and experience them for ourselves.

11:00 – During the work session, Melissa and I worked on collecting yoga motion capture data with the Perception Neuron motion capture suit. We recorded about 4–5 different positions, but after multiple attempts we were not able to get the perfect motion capture we wanted. So we decided to search the web for yoga motion capture data that we could source for free or at low cost. We ended up finding a very comprehensive motion capture library that had all of the data we needed, for free!

12:30 – Next I began to work on creating a 360 'starry night' world in Blender. With the help of a classmate and a tutorial, I was able to create a star-filled world that we intend to use in our final project.

Starry night video

Atelier IV Blogpost 1: Creating Assets in VR

10:00am – For this lab I attempted to design my own 3D assets in a VR environment using an application called VRCAT. The learning curve was pretty minimal; however, it did take me a solid 10 minutes to figure out how to place and arrange things in the proper orientation. I made a few abstract 3D objects (a house, a game map, a dome, etc.), but I was not able to export them as .OBJ files. I tried to find a quick fix online, but none of the given suggestions worked for me; I am still trying to find the saved objects on the desktop!

I also used PaintLab, which I really liked because it allows you to create objects with preloaded textures and materials. I made a few objects, including a window, a tree, and some crystal particles. However, I ran into the same problem as with VRCAT: I was not able to export the files. I just took some pictures with the VR camera instead.


11:00am – During this portion of the lab I explored a 360 video experience called Coral Compass. The experience was incredibly disorienting for me, and the video itself was not as high resolution as I would have liked. However, I did really admire the use of 360 video to show the impact of climate change, as this is something I want to do in an upcoming project. While this 360 experience was lacking for me in many ways, it did give me an idea of what not to do when designing for VR.

12:00pm – Next, I played an interactive VR music video called Show It To Me, which I kind of thought was a game at first. I really enjoyed the incorporation of the visuals and the music in a VR environment. The interactivity of the experience was pretty minimal and could use a bit more work, but overall I really enjoyed this one!