(Creation & Computation DIGF-6037-001)
Team: Jignesh Gharat & Nadine Valcin
Mentors: Kate Hartman & Nick Puckett
Eggxercise is a hybrid digital and physical version of the traditional egg-and-spoon race where participants balance an egg on a spoon while racing to a finish line. It replaces the typical raw egg used in the classic version with a digital egg on a mobile screen. If the egg touches the side of the screen, it breaks, displaying the contents of a cracked egg with the message “Game over”.
Because of space and time constraints, a relay race format was used for the in-class demonstration. The participants were divided into 3 teams whose members had to go through an obstacle course made out of a simple row of chairs. When they finished their leg, they had to pass the phone to the next member of their team without breaking the egg. If the egg broke at any moment, the participant holding the phone had to reload the game and wait for a 5-second countdown to expire before resuming the race.
We both shared a strong desire to explore human-computer interaction (HCI) by creating an experience with an interface that forced participants to use their bodies in order to complete a task. That physical interaction had to be sustained (unlike a momentary touch on the screen or click of a mouse) and had to be different from the many daily interactions people have with their smart devices such as reading, writing, tapping, and scrolling. In other words, we were searching for an experience that would momentarily disrupt the way people use their phones in a surprising way that simultaneously made them more aware of their body movements.
We also wanted to produce something that was engaging in the simple manner of childhood games that elicit a sense of abandon and joy. It had to have an intuitive interface and an immediacy that didn’t require complex explanations or a high level of skill, but it simultaneously had to provide enough of a challenge to require a high level of engagement. As Eva Hornecker remarks:
“ We are most happy when we feel we perform an activity skillfully […] Tangible and embodied interaction can thus be a mindful activity that builds upon the innate intelligence of human bodies.” (23)
poseNet() Library (ML)
We explored the different ways in which the body could be used as a controller, mainly through sound and movement. PoseNet – a machine learning model that allows for real-time human pose estimation (https://storage.googleapis.com/tfjs-models/demos/posenet/camera.html) – offered the possibility of interacting with a live video image of a person’s face. We envisioned an experience that would attach a virtual object to a person’s nose and allow the person to move that object along a virtual path on a computer screen. This led to the idea of a mouse following its nose to find a piece of cheese, which would force users to move their entire upper body in unusual ways in order to complete the task.
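The nose-tracking idea can be sketched in a few lines with ml5.js, which exposes PoseNet keypoints by name. This is a minimal illustration, not the project's actual code; names like `cheeseX`/`cheeseY` and the target radius are our own assumptions.

```javascript
// Sketch of the nose-as-cursor idea using ml5.js PoseNet.
// cheeseX/cheeseY and the 25 px target radius are illustrative values.
let video;
let noseX = 0, noseY = 0;
const cheeseX = 560, cheeseY = 60;   // target the "mouse" must reach

// Pure helper: has the nose come within `radius` pixels of the target?
function reachedTarget(nx, ny, tx, ty, radius) {
  return Math.hypot(tx - nx, ty - ny) <= radius;
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  const poseNet = ml5.poseNet(video);   // load the model
  poseNet.on('pose', (poses) => {       // fires on every detection
    if (poses.length > 0) {
      noseX = poses[0].pose.nose.x;     // ml5 exposes named keypoints
      noseY = poses[0].pose.nose.y;
    }
  });
}

function draw() {
  image(video, 0, 0);
  ellipse(cheeseX, cheeseY, 30);        // the "cheese"
  ellipse(noseX, noseY, 20);            // object pinned to the nose
  if (reachedTarget(noseX, noseY, cheeseX, cheeseY, 25)) {
    text('Found the cheese!', 20, 30);
  }
}
```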
We then moved on to the idea of controlling an object on a mobile device through voice and gestures. Building on our desire to make something inspired by childhood games, we decided to transpose the egg-and-spoon game. We didn’t want a traditional touch interaction, so we used the accelerometer and gyroscope data from the phone to sense tilting, rotation, and acceleration and control the movements of a virtual egg on the screen. This allowed for immediate and unmediated feedback to the user, who could quickly gauge the acceptable range of motion required not to break the egg. This can be seen as an application of a direct manipulation interface (Hutchins, 315) where the represented object, in this case the virtual egg, behaves in a similar fashion to a real egg placed on a moving flat surface. The interface also feels more direct because the user’s intention to balance the egg, as expressed by their hand movements, produces the expected results that follow the normal rules of physics.
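The core loop can be sketched with p5.js, which exposes device tilt through the `rotationX`/`rotationY` globals (in degrees) on mobile browsers. This is a minimal sketch under our own assumptions: the `GAIN` constant and the edge-breaking rule are illustrative, not the project's exact values.

```javascript
// Minimal sketch of the tilt-controlled egg, assuming p5.js's rotationX /
// rotationY globals (device tilt in degrees). GAIN is an illustrative value.
let eggX, eggY;
const EGG_R = 25;   // egg radius in pixels
const GAIN = 0.5;   // pixels of movement per degree of tilt per frame

// Pure physics step: tilt pushes the egg like gravity on a tilting surface.
function tiltToStep(tiltDegrees, gain) {
  return tiltDegrees * gain;
}

// Pure boundary check: the egg breaks when it touches any screen edge.
function eggBroken(x, y, r, w, h) {
  return x - r <= 0 || x + r >= w || y - r <= 0 || y + r >= h;
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  eggX = width / 2;
  eggY = height / 2;
}

function draw() {
  background(230);
  // rotationY tilts left/right, rotationX tilts forward/back
  eggX += tiltToStep(rotationY, GAIN);
  eggY += tiltToStep(rotationX, GAIN);
  if (eggBroken(eggX, eggY, EGG_R, width, height)) {
    textSize(32);
    text('Game over', width / 2, height / 2);
    noLoop();   // stop until the player reloads the page
  } else {
    ellipse(eggX, eggY, EGG_R * 2);
  }
}
```

Mapping tilt directly to on-screen displacement is what makes the interaction feel direct: small wrist movements produce immediately visible, physically plausible results.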
Eggxercise aims to investigate a natural user interface (NUI), where the interaction with the smart device is direct, tangible and consistent with our natural behaviors. Within that paradigm, it aligns with the multi-disciplinary field of tangible embodied interaction (TEI), which explores the implications and rich potential of interacting with computational objects within the physical world, and with the projects and experiments of MIT’s Tangible Media Group, led by Hiroshi Ishii, that continuously search for new ways “to seamlessly couple the dual worlds of bits and atoms.” (tangible.media.mit.edu)
Mobile phone game play demo
Our project integrates a virtual object on a physical device that responds to movement in the physical world in a realistic way. In that sense, it is related to the motion controllers used in gaming devices such as the Nintendo Wii and the Sony PlayStation. It also has the embodied interaction of the Xbox Kinect while maintaining a connection to a real-world object.
The game was played between 3 teams of 6 participants on a 10 m relay track. The layout is illustrated in the image below.
The code for our p5.js experiment can be viewed on GitHub:
The sound for Eggxercise launched automatically on Android devices. We spent a lot of time trying to get the sound to work on the iPhone only to accidentally discover that users had to touch the screen to activate the sound on those devices.
- Android devices: tested in Firefox; sound launched automatically.
- iPhone: tested in Safari; sound only starts after the user’s first touch on the screen.
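The iPhone behaviour comes from iOS browsers suspending the Web Audio context until a user gesture. A sketch of the workaround, assuming the p5.sound API (recent p5.js versions provide `userStartAudio()` for exactly this; the `needsUnlock` helper is our own illustration):

```javascript
// Sketch of the iOS audio workaround: Safari suspends the AudioContext
// until a user gesture, so the hen sound must be started from a touch
// handler. userStartAudio() is provided by recent p5.js versions.
let henSound;
let audioReady = false;

function preload() {
  henSound = loadSound('Hen.mp3');   // asset name from the project
}

// Pure helper: should we still try to start audio on this touch?
function needsUnlock(ready) {
  return !ready;
}

function touchStarted() {
  if (needsUnlock(audioReady)) {
    userStartAudio();                // resumes the suspended AudioContext
    henSound.loop();
    audioReady = true;
  }
  return false;                      // prevent default touch behaviour
}
```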
Improvements we would like to explore:
- A background with lerp color to have the screen turn red as the egg approached the edges.
- More levels: increasing the speed of the egg, adding a second egg or obstacles on the screen.
- A timer indicating how long the users had managed to balance the egg.
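The lerp-color idea could be sketched as follows with p5.js's `lerpColor()`: a pure 0-to-1 proximity value drives a blend from white to red. The `edgeProximity` helper and its constants are our own illustration, not existing project code.

```javascript
// Sketch of the proposed warning background: the screen blends toward red
// as the egg nears an edge, using p5.js's lerpColor(). Values illustrative.
let eggX, eggY;
const EGG_R = 25;

// Pure helper: 0 when the egg is centred, 1 when its rim touches an edge.
function edgeProximity(x, y, r, w, h) {
  const clearance = Math.min(x - r, w - r - x, y - r, h - r - y);
  const maxClearance = Math.min(w, h) / 2 - r;
  return 1 - Math.max(0, Math.min(1, clearance / maxClearance));
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  eggX = width / 2;
  eggY = height / 2;
}

function draw() {
  const t = edgeProximity(eggX, eggY, EGG_R, width, height);
  background(lerpColor(color(255), color(255, 0, 0), t));  // white -> red
  eggX += rotationY * 0.5;   // same tilt control as the main sketch
  eggY += rotationX * 0.5;
  ellipse(eggX, eggY, EGG_R * 2);
}
```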
Observations / User test
- Participants were using different browsers on mobile phones with different operating systems and specifications, and all the phones loaded the code from p5.js over wifi. Because loading times varied, it was difficult to start the game at the same time.
- Participants with larger screens had an advantage over the others.
- Adding obstacles on the path made the game more challenging and fun.
- The background sound (Hen.mp3) did enhance the experience as the motion of the egg changed the speed and the amplitude of the sound.
- Participants were having a good time using the app themselves as well as watching others play.
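The motion-driven sound from the observations above could be sketched by mapping the egg's frame-to-frame speed onto the playback rate and volume of the hen sound, using p5.sound's `rate()` and `setVolume()`. The mapping ranges below are illustrative assumptions, not the project's exact values.

```javascript
// Sketch of motion-driven sound: faster egg movement raises the pitch and
// volume of Hen.mp3 via p5.sound's rate() / setVolume(). Ranges illustrative.

// Pure mapping: clamp speed into 0..maxSpeed, map to playback rate 1.0..2.0.
function speedToRate(speed, maxSpeed) {
  const s = Math.max(0, Math.min(maxSpeed, speed));
  return 1 + s / maxSpeed;
}

// Pure mapping: louder as the egg moves faster, from 0.2 up to 1.0.
function speedToVolume(speed, maxSpeed) {
  const s = Math.max(0, Math.min(maxSpeed, speed));
  return 0.2 + 0.8 * (s / maxSpeed);
}

// In draw(), after updating the egg position (prevX/prevY from last frame):
//   const speed = Math.hypot(eggX - prevX, eggY - prevY);
//   henSound.rate(speedToRate(speed, 20));
//   henSound.setVolume(speedToVolume(speed, 20));
```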
What we learned:
- Getting started with creative coding and understanding the basic workflow of p5.js and JavaScript.
- Integrating graphics with code.
- Making the screen adaptive to any screen size (Laptop & mobile phones).
- Exploring various interaction and interface patterns using code, e.g. touch, voice, motion tracking, swipe and shake.
Hornecker, Eva, “The Role of Physicality in Tangible and Embodied Interactions”, Interactions, March-April 2011, pp.19-23.
Hutchins, Edwin L., James D. Hollan, and Donald A. Norman, “Direct Manipulation Interfaces”, Human-Computer Interaction, vol. 1, 1985, pp. 311-338.
Tangible Media Group, tangible.media.mit.edu/vision/. Accessed September 28, 2019.
PoseNet library https://ml5js.org/reference/api-PoseNet/
Amplitude Modulation https://p5js.org/examples/sound-amplitude-modulation.html
Frequency Modulation https://p5js.org/examples/sound-frequency-modulation.html
The Coding Train https://www.youtube.com/channel/UCvjgXvBlbQiydffZU7m1_aw
Hen Sound Effect https://www.youtube.com/watch?v=7ogWsIYJyGE