Experiment 4: Final Report
Francisco Samayoa, Tyra D’Costa, Shiloh Light-Barnes
December 5, 2018
- User starts at any location on the map
- They put on the headphones
- Depending on the orientation of the user's head, the headphones play a particular sound.
- While a particular sound is playing, the screen displays an associated image or video.
- The user must experience all of the audio and visual data and try to connect the media to a place that exists in the real world before their opponent does.
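The interaction loop above can be sketched as a simple orientation-to-media mapping. This is a minimal illustration, assuming hypothetical zone boundaries and placeholder file names rather than the project's actual assets:

```python
# Map a head yaw reading (degrees) to one of several hypothetical audio
# "zones"; each zone pairs a sound with its associated visual.
ZONES = [
    (0, 90, "beach.wav", "beach.jpg"),
    (90, 180, "temple.wav", "temple.jpg"),
    (180, 270, "rainforest.wav", "rainforest.jpg"),
    (270, 360, "city.wav", "city.jpg"),
]

def media_for_yaw(yaw_degrees):
    """Return the (sound, image) pair for the zone containing this yaw."""
    yaw = yaw_degrees % 360  # normalize into [0, 360)
    for lo, hi, sound, image in ZONES:
        if lo <= yaw < hi:
            return sound, image
    return ZONES[0][2], ZONES[0][3]  # fallback (unreachable after normalization)

print(media_for_yaw(45))   # -> ('beach.wav', 'beach.jpg')
print(media_for_yaw(-30))  # wraps to 330, the city zone
```

In practice the zone lookup would run once per sensor reading, swapping the playing sound and displayed image whenever the zone changes.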
Our project seeks to encompass elements of sound, sensory perception, and nature in an installation-based artwork. Our process began by discussing the types of sensors we wanted to explore; eventually we decided on the MPU-9250 accelerometer and gyroscope. Our proposed idea is to integrate the sensor into headphones, which can then change the sound and visual experiences of the user. As the user navigates the room and moves their head about, the world in which they are digitally immersed will enhance the real world. Essentially, it is a 4D sound experience. If we had more time, we would add visual elements such as 360° video or a 3D Unity-based environment.
In the end, we were able to get the MPU sensor working with Arduino so that we receive X, Y, and Z coordinates. This required scouring the web for solutions to library-related issues and calibrating the code in order to receive accurate data readings. Additionally, we were able to take these coordinates and map them to a camera in Unity, so that the sensor's orientation changes the perspective of the viewer. This meant we had the pitch, yaw, and roll of the gyroscope component functioning. For the purposes of this experiment we disabled roll, since the user wouldn't be rolling their head per se. However, there were a few "blind spots" the camera couldn't pick up, such as the 180-degree mark. The interface was, for the most part, fully functional. The main issue was that data overload kept causing Unity to freeze; our solution was to reset the Arduino.
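Two of the problems mentioned above have standard software workarounds. The 180-degree "blind spot" is the discontinuity where yaw jumps between +180° and −180°, which can be smoothed by unwrapping successive readings; the Unity freeze from data overload can be mitigated by throttling how often readings are forwarded. The sketch below is an illustration of both ideas, not the code we actually ran:

```python
import time

def unwrap(prev, current):
    """Adjust `current` (degrees) so it is continuous with `prev`,
    avoiding the sudden jump when a reading crosses the +/-180 mark."""
    delta = (current - prev + 180) % 360 - 180
    return prev + delta

class Throttle:
    """Drop readings that arrive faster than `max_hz`, so the consumer
    (e.g. a Unity listener on the serial port) is not flooded."""
    def __init__(self, max_hz):
        self.min_interval = 1.0 / max_hz
        self.last = -float("inf")

    def allow(self, now=None):
        now = time.monotonic() if now is None else now
        if now - self.last >= self.min_interval:
            self.last = now
            return True
        return False
```

For example, `unwrap(179, -179)` returns 181, so the camera rotates 2° forward instead of snapping 358° backward.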
Links To Contextual Work
How to play multiple audio files
Research on Arduino + Gyroscope
Good Link explaining how IMU sensor works
Tutorial for IMU and Unity set up
How to set up MPU and Arduino Library
Unity + Arduino Library
Goals & Features
- Engaging UX Design: We want to solidify the conceptual ideas of the soundscape experience to create an immersive UI/UX design. We want this to include elements of storytelling and gamification.
- Sound-Sensory Experience: Moving forward, we will have to test that the sensor allows smooth transitions between sound files. The media assets will consist of ethereal soundtracks, data visualizations (natural algorithms and patterns), and nature-related images.
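One common way to get smooth transitions between sound files is an equal-power crossfade, where the outgoing and incoming tracks trade volume along a cosine/sine curve so the combined loudness stays roughly constant. A minimal sketch of the gain computation, assuming the playback engine exposes a per-track volume control:

```python
import math

def crossfade_gains(t):
    """Equal-power crossfade weights for transition progress t in [0, 1].
    Returns (gain_out, gain_in) for the outgoing and incoming tracks;
    gain_out**2 + gain_in**2 == 1 at every point, keeping power constant."""
    t = min(max(t, 0.0), 1.0)  # clamp progress
    return math.cos(t * math.pi / 2), math.sin(t * math.pi / 2)
```

At t = 0 the old track plays at full volume, at t = 1 the new one does, and at the midpoint both sit near 0.707 rather than the dip a linear fade would produce.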
- Integrate Wearables: We also need to design a way to integrate the sensors into the wearable technology (the headphones).
The MPU unit was housed in a sewn leather pouch with velcro attached underneath so it could stick to the top of the headphones. This way, the wires were completely out of sight, since they hung from above. For debugging purposes we wanted the MPU unit to be detachable from the headphones. In the end, we were successful.
- Discuss concepts of nature, ecosystems and natural algorithms: Lastly, we want to think about how these concepts can work together to create a narrative and game play element.
Using a blindfold we acquired, we were able to gamify the experience. With the blindfold on, the user would have to guess which environment they were placed in. We would randomly select 1 of 9 unique ecosystems, including 2 songs created by Francisco and Shiloh. These included a war scene, temple, beach, rainforest, and city intersection.
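The guessing game itself reduces to a small piece of logic: pick an ecosystem at random, then compare the blindfolded player's guess against it. A sketch using only the five environments named above (the full game drew from nine):

```python
import random

# Only the five environments named in this report; the full game used nine.
ECOSYSTEMS = ["war scene", "temple", "beach", "rainforest", "city intersection"]

def pick_environment(rng=random):
    """Randomly choose the ecosystem the blindfolded player must identify."""
    return rng.choice(ECOSYSTEMS)

def check_guess(guess, answer):
    """Case- and whitespace-insensitive comparison of the player's guess."""
    return guess.strip().lower() == answer.strip().lower()
```

Whichever player's `check_guess` succeeds first would win the round.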
Pictures & Videos