Sketch 1 – Head Controller – Tyler

headcrop

Full Screen: https://editor.p5js.org/tbeattyOCAD/full/7cYwInKpR

Editor: https://editor.p5js.org/tbeattyOCAD/sketches/7cYwInKpR

This sketch has two parts:

  1. Cropping only the head of one person on the camera (using PoseNet nose and eye locations). Once the x and y position of the nose is known and the distance between the eyes is measured, the image.get() function can capture only the desired pixels.
    – This was inspired by Rafael Lozano-Hemmer's Zoom Pavilion. Getting the position and size of the head could be used for interaction that deals with the perspective of the audience (localizing where they are). In our screen space project we are considering using machine learning to identify whether people are wearing masks, so this method will be important for isolating parts of faces.
  2. Measuring head twist and tilt. Given the nose position relative to the eyes, the twist amount is a basic map calculation. Tilt is estimated by comparing the nose-to-eye-line distance with the eye spacing; it works, but not well. I think it would be better to compare the nose y position with the ear position. I show this data by moving the cropped head, so that you are always looking at yourself, but the result looks like the person is simply translating their head. A better effect could likely be found for this input.
    – This turns your head into an (ineffective) mouse. But it could be used to estimate where on a big wall someone is looking.
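The cropping math in part 1 can be sketched as plain functions, assuming PoseNet-style keypoints as `{x, y}` objects. The function names and the `scale` factor are my own illustration, not taken from the sketch; the resulting box would be passed to p5's image.get(x, y, w, h).

```javascript
// Distance between the two eye keypoints, used as a proxy for head size.
function eyeSpacing(leftEye, rightEye) {
  return Math.hypot(rightEye.x - leftEye.x, rightEye.y - leftEye.y);
}

// Bounding box centred on the nose. `scale` (an assumed value) sets how many
// eye-spacings wide and tall the crop is; feed the result to image.get().
function headCropBox(nose, leftEye, rightEye, scale = 3) {
  const d = eyeSpacing(leftEye, rightEye);
  const w = d * scale;
  const h = d * scale;
  return { x: nose.x - w / 2, y: nose.y - h / 2, w, h };
}
```

Because eye spacing shrinks as the person moves away from the camera, the crop stays roughly proportional to the head at any distance.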
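The twist and tilt estimates in part 2 can likewise be sketched as normalised offsets of the nose from the eye midpoint, divided by the eye spacing so the values are distance-independent. These helper names and the output range are my own assumptions, not the sketch's code; the normalised value would then go through p5's map() to screen coordinates.

```javascript
// Twist: horizontal offset of the nose from the eye midpoint, normalised by
// eye spacing. Roughly 0 when facing the camera, +/- when turned left/right.
function headTwist(nose, leftEye, rightEye) {
  const midX = (leftEye.x + rightEye.x) / 2;
  const spacing = Math.hypot(rightEye.x - leftEye.x, rightEye.y - leftEye.y);
  return (nose.x - midX) / spacing;
}

// Tilt: vertical offset of the nose below the eye line, normalised the same
// way. This is the weak estimate the post mentions; comparing nose y against
// ear y would likely be more robust.
function headTilt(nose, leftEye, rightEye) {
  const midY = (leftEye.y + rightEye.y) / 2;
  const spacing = Math.hypot(rightEye.x - leftEye.x, rightEye.y - leftEye.y);
  return (nose.y - midY) / spacing;
}
```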

Resources used: