Experiment 1 – Body As Controller

Project Description

In this collection of four studies, I explored using computer vision and motion tracking to recreate interactive moments such as clicks and scrolls. My goal was to create a sense of playfulness through engaging visuals and audio output. My main success was learning how to translate mouse interactions into body movements, along with seeing the benefits of creating content with programming. When users interact with on-screen elements using a mouse, they generally have four categories of interaction: hover, click, scroll, and drag; when the body becomes the controller, however, these interactions are translated into analyzing the absolute and relative positions of tracked keypoints. This pushed me into a different mindset: a “click” becomes a movement or position that either satisfies (true) or fails to satisfy (false) a threshold, and a “scroll” becomes a continual movement feeding into an accumulating change. I also made progress learning p5.js across all three models. My main challenge was the limited range of body movements available for completing different interactions, and figuring out how to switch from one type of interaction to another.
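Below is a minimal p5.js sketch of these two patterns. The variable names, the 30 px threshold, and the mapping ranges are illustrative placeholders; in the actual studies the values come from the tracking models described below.

    // Placeholder values; in a real sketch these would be updated
    // every frame from the tracking model's keypoints.
    let keypointDistance = 100; // e.g. thumb-to-index distance, in pixels
    let wristX = 320;           // e.g. right wrist x position

    let angle = 0; // accumulator driven by the "scroll" movement

    function setup() {
      createCanvas(640, 480);
    }

    function draw() {
      background(220);

      // "Click": a movement or position either satisfies a threshold or it doesn't.
      const clicked = keypointDistance < 30; // 30 px is an arbitrary example
      if (clicked) {
        // one-shot response: show an emoji, play a sound, ...
      }

      // "Scroll": a continual movement feeds an accumulative change.
      const speed = map(wristX, 0, width, -0.05, 0.05);
      angle += speed; // the accumulated value can drive rotation, size, volume, ...
    }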

  • Click 1
    • click-1
    • present link
    • edit link
    • click-1-emojis
    • demo video
    • This is the first one I completed. I used the Handpose TensorFlow model implemented in ml5 and tracked the distance between my thumb and index finger to realize a “click”: once the distance passes a certain threshold, an emoji appears. There is one emoji, each with its own sound effect, for each corner of the screen. My original plan was to pick the emoji at random, but I ran into the difficulty of random emojis flickering nonstop as the draw loop runs. A simplified sketch of this pinch “click” is included after this list.
  • Click 2
    • click-2
    • present link
    • edit link
    • click-2-particles
    • demo video link
    • I used the PoseNet model to track the left shoulder, the nose, and the distance between the eyes, plus a separate particles JavaScript file for the particle animation. The “click” here is realized by placing the left shoulder inside the circled area. When that condition is satisfied, the playful elements appear over the user’s face on screen and track the respective areas of the face. Background music was added to make the interaction more playful. The googly eyes are something I’d like to reflect on: their position is relative to the center point between the eyes, so when the user tilts their head, the position drifts off. Improving this would be one way to make the piece more playable. A simplified sketch of the shoulder “click” is included after this list.
  • Scroll 1
    • scroll-1
    • present link
    • edit link
    • scroll-1-rotate_1
    • demo video link
    • I used the PoseNet model to track both wrists’ positions and the distance between them. The “scroll” is realized through two movements: the position of the right wrist controls the rotation direction and speed, and the distance between the two wrists determines the size. I explored using push() and pop() with rotation, which allowed for an interesting and unpredictable visual effect. The blue and white circles in the background are lines that also rotate responsively, which ended up creating a pattern I didn’t anticipate when I added them as an experiment. A simplified sketch of this wrist-driven “scroll” is included after this list.
  • Scroll 2
    • scroll-2
    • present link (see the edit link if this link is not working)
    • edit link
    • scroll-2-amp-audio
    • demo video link
    • I used clmtrackr to track the distance between the eyes and the midpoint between the eyes. The “scroll” is realized through two movements: the X position of the midpoint between the user’s eyes determines which stereo channel the audio plays from, and the distance between the eyes determines the playback speed of the audio. I also mapped the volume of the audio to the size and position of the graphics to create a “dancing to the beat” effect. The audio is an EDM track, which works well with this kind of audio manipulation and with the aesthetics of the graphics. A simplified sketch of this head-driven “scroll” is included after this list.
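Click 1: a minimal sketch of the pinch “click”, assuming the ml5 and p5.sound libraries are loaded. The 40 px threshold, the single emoji, and the pop.mp3 file name are illustrative placeholders rather than the exact values from my sketch, which uses a different emoji and sound for each corner of the screen.

    let video, handpose;
    let predictions = [];
    let popSound; // placeholder sound effect

    function preload() {
      popSound = loadSound('pop.mp3');
    }

    function setup() {
      createCanvas(640, 480);
      video = createCapture(VIDEO);
      video.size(width, height);
      video.hide();
      handpose = ml5.handpose(video, () => console.log('Handpose ready'));
      handpose.on('predict', results => { predictions = results; });
      textSize(64);
      textAlign(CENTER, CENTER);
    }

    function draw() {
      image(video, 0, 0, width, height);
      if (predictions.length > 0) {
        const hand = predictions[0].landmarks;
        const thumbTip = hand[4];
        const indexTip = hand[8];
        const d = dist(thumbTip[0], thumbTip[1], indexTip[0], indexTip[1]);

        // "Click": the pinch closes past a threshold (40 px is illustrative).
        if (d < 40) {
          text('🎉', indexTip[0], indexTip[1]);
          if (!popSound.isPlaying()) popSound.play();
        }
      }
    }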
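Click 2: a minimal sketch of the shoulder “click”, assuming the ml5 library is loaded. The target circle’s position and size, and the plain circles standing in for the googly eyes and nose decoration, are placeholders for the actual assets.

    let video, poseNet;
    let pose = null;

    // The "button" the left shoulder has to enter (position/size are illustrative).
    const target = { x: 120, y: 360, r: 60 };

    function setup() {
      createCanvas(640, 480);
      video = createCapture(VIDEO);
      video.size(width, height);
      video.hide();
      poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
      poseNet.on('pose', results => {
        if (results.length > 0) pose = results[0].pose;
      });
    }

    function draw() {
      image(video, 0, 0, width, height);
      noFill();
      circle(target.x, target.y, target.r * 2);

      if (pose) {
        const s = pose.leftShoulder;
        // "Click": the shoulder keypoint is inside the circled area.
        if (dist(s.x, s.y, target.x, target.y) < target.r) {
          // Playful elements anchored to face keypoints.
          const midX = (pose.leftEye.x + pose.rightEye.x) / 2;
          const midY = (pose.leftEye.y + pose.rightEye.y) / 2;
          fill(255);
          circle(midX - 20, midY, 30); // stand-ins for the googly eyes
          circle(midX + 20, midY, 30);
          circle(pose.nose.x, pose.nose.y, 20); // stand-in for a nose decoration
        }
      }
    }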
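Scroll 1: a minimal sketch of the wrist-driven “scroll”, assuming the ml5 library is loaded. A single rotating square stands in for the actual graphics, and the mapping ranges are illustrative.

    let video, poseNet;
    let pose = null;
    let angle = 0; // rotation accumulated from the right wrist's position

    function setup() {
      createCanvas(640, 480);
      rectMode(CENTER);
      video = createCapture(VIDEO);
      video.size(width, height);
      video.hide();
      poseNet = ml5.poseNet(video);
      poseNet.on('pose', results => {
        if (results.length > 0) pose = results[0].pose;
      });
    }

    function draw() {
      background(20);
      if (pose) {
        // Right wrist x position sets rotation direction and speed.
        const speed = map(pose.rightWrist.x, 0, width, -0.1, 0.1);
        angle += speed;

        // Wrist-to-wrist distance sets the size.
        const d = dist(pose.leftWrist.x, pose.leftWrist.y,
                       pose.rightWrist.x, pose.rightWrist.y);
        const size = map(d, 0, width, 20, 300, true);

        // push()/pop() keep the rotation local to this shape.
        push();
        translate(width / 2, height / 2);
        rotate(angle);
        rect(0, 0, size, size);
        pop();
      }
    }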
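Scroll 2: a minimal sketch of the head-driven “scroll”, assuming the clmtrackr and p5.sound libraries are loaded. The edm-track.mp3 file name, the eye-distance range, and the single circle standing in for the graphics are placeholders; points 27 and 32 are the eye centers in clmtrackr’s face model.

    let video, ctracker, song, amp;

    function preload() {
      song = loadSound('edm-track.mp3'); // placeholder file name
    }

    function setup() {
      createCanvas(640, 480);
      video = createCapture(VIDEO);
      video.size(width, height);
      video.hide();

      ctracker = new clm.tracker();
      ctracker.init();
      ctracker.start(video.elt); // track the underlying <video> element

      amp = new p5.Amplitude();
      amp.setInput(song);
      song.loop();
    }

    function draw() {
      image(video, 0, 0, width, height);
      const positions = ctracker.getCurrentPosition();
      if (positions) {
        const leftEye = positions[27];
        const rightEye = positions[32];
        const midX = (leftEye[0] + rightEye[0]) / 2;
        const eyeDist = dist(leftEye[0], leftEye[1], rightEye[0], rightEye[1]);

        // Horizontal head position pans the audio between the left/right channels.
        song.pan(map(midX, 0, width, -1, 1));

        // Eye distance (head closer or farther) sets the playback speed.
        song.rate(map(eyeDist, 30, 120, 0.5, 2, true));

        // Loudness drives the size of the graphics ("dancing to the beat").
        circle(width / 2, height / 2, map(amp.getLevel(), 0, 0.5, 10, 300, true));
      }
    }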