Play Along

This series focuses on the user’s body and facial movements, tracked through machine learning (the ml5.js PoseNet model, running in p5.js). The key idea is to interact with the user with the help of sound and gestures, to make regular movements more fun. The experiments were designed keeping in mind the ‘Click’ and ‘Scroll’ actions. While everybody has their own interpretation of these actions, ‘Play Along’ has been designed to help the user interact with the computer through body postures and gestures, without the use of a mouse or keyboard. I have tried to combine these actions with playful movements and a touch of music.

Experiment 1: Coloured Claps

With every clap the colour of the ellipse changes

[Screenshot: the Coloured Claps sketch]

The idea behind the experiment is to change the colour of the ellipse with every clap. Once the user stands or sits in front of the webcam, the system recognises his/her wrists, and each time both wrists come together in a ‘clap’, it switches the ellipse to a new colour.
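
Below is a minimal sketch of how this clap trigger can be wired up with ml5’s PoseNet in p5.js, assuming the ml5 library is included in the sketch’s index.html. The 50-pixel threshold and the colour list are illustrative choices, not necessarily the values in the linked sketch.

let video, poseNet, pose;
let colours = ['red', 'orange', 'yellow', 'green', 'blue', 'purple'];
let colourIndex = 0;
let wasClapped = false; // so one clap advances the colour only once

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  poseNet = ml5.poseNet(video, modelLoaded);
  // Keep the most recent pose detected in the webcam feed
  poseNet.on('pose', (poses) => {
    if (poses.length > 0) pose = poses[0].pose;
  });
}

function modelLoaded() {
  console.log('PoseNet ready!');
}

function draw() {
  image(video, 0, 0, width, height);
  if (pose) {
    // Distance between the two wrist keypoints
    const d = dist(pose.leftWrist.x, pose.leftWrist.y,
                   pose.rightWrist.x, pose.rightWrist.y);
    const clapped = d < 50; // illustrative clap threshold, in pixels
    if (clapped && !wasClapped) {
      colourIndex = (colourIndex + 1) % colours.length; // new colour per clap
    }
    wasClapped = clapped;
  }
  noStroke();
  fill(colours[colourIndex]);
  ellipse(width / 2, height / 2, 150, 150);
}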


Links:

Present link: https://preview.p5js.org/preetimahajan11/present/qlorcGTkF

Edit link: https://editor.p5js.org/preetimahajan11/sketches/qlorcGTkF

Video link: https://ocadu.techsmithrelay.com/2deV

Experiment 2: Move it

Pushing the text by lifting your right hand

[Screenshot: the Move it sketch]

This experiment focuses on the scroll gesture. The user can interact by raising his/her right hand and pushing the text. The text moves to the end of the screen and then bounces back to its initial position. This idea can be adapted for several ‘swipe’ features.
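
Below is a minimal reconstruction of the push, assuming ml5’s PoseNet keypoints; the shoulder-height trigger, speed, and positions are illustrative. The confidence check on the wrist keypoint is also one way around the left-wrist misdetection described in the learnings below.

let video, poseNet, pose;
let textX = 50; // starting x position of the text
let speed = 0;  // 0 while idle, positive while pushed, negative while bouncing back

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  poseNet = ml5.poseNet(video, modelLoaded);
  poseNet.on('pose', (poses) => {
    if (poses.length > 0) pose = poses[0].pose;
  });
  textSize(32);
}

function modelLoaded() {
  console.log('PoseNet ready!');
}

function draw() {
  image(video, 0, 0, width, height);
  // Raising the right wrist above the right shoulder starts the push;
  // the confidence check ignores weak (possibly misidentified) keypoints
  if (pose && speed === 0 && textX === 50 &&
      pose.rightWrist.confidence > 0.5 &&
      pose.rightWrist.y < pose.rightShoulder.y) {
    speed = 8;
  }
  textX += speed;
  if (textX > width - 100) speed = -8; // reached the end: bounce back
  if (speed < 0 && textX <= 50) {      // back at the start: stop and wait
    textX = 50;
    speed = 0;
  }
  fill(255);
  text('Move it', textX, height / 2);
}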

Learnings: PoseNet was identifying my left wrist as well, and the text was bouncing back before reaching the end of the x-axis.

Present link: https://preview.p5js.org/preetimahajan11/present/9GYuaqm6E

Edit link: https://editor.p5js.org/preetimahajan11/sketches/9GYuaqm6E

Video link: https://ocadu.techsmithrelay.com/AJyT

Experiment 3: Pause and Play

Using the nose to play and pause music

Free music: https://freemusicarchive.org/genre/Instrumental

[Screenshot: the Pause and Play sketch]

With a defined y-axis, the user can play and pause music with his/her nose: moving the head up and down past that line controls the music.
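
Below is a minimal sketch of the nose-driven toggle, assuming the p5.sound library and a track downloaded from the Free Music Archive link above; the file name and the half-canvas threshold are illustrative. (Most browsers also require one click on the page before audio is allowed to start.)

let video, poseNet, pose, song;

function preload() {
  song = loadSound('instrumental.mp3'); // hypothetical local file
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  // Passing a callback so we know when the model has actually loaded
  poseNet = ml5.poseNet(video, modelLoaded);
  poseNet.on('pose', (poses) => {
    if (poses.length > 0) pose = poses[0].pose;
  });
}

function modelLoaded() {
  console.log('PoseNet ready!');
}

function draw() {
  image(video, 0, 0, width, height);
  const threshold = height / 2; // the defined y axis
  stroke(255);
  line(0, threshold, width, threshold);
  if (pose) {
    if (pose.nose.y < threshold && !song.isPlaying()) {
      song.play();  // nose above the line: play (resumes if paused)
    } else if (pose.nose.y > threshold && song.isPlaying()) {
      song.pause(); // nose below the line: pause
    }
  }
}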

Learnings: I faced an issue with the music loading slowly, even though it was loaded inside preload(). I was able to fix it with a slight change to my command, passing a callback: poseNet = ml5.poseNet(video, modelLoaded);

Present link: https://preview.p5js.org/preetimahajan11/present/20MolhoPN

Edit link: https://editor.p5js.org/preetimahajan11/sketches/20MolhoPN

Video link: https://ocadu.techsmithrelay.com/ObFh

Experiment 4: Laser eyes

With the user’s eyes as the source, the laser beams move with the user’s eye movement.

[Screenshot: the Laser eyes sketch]

This fun-filled experiment works in coordination with the user’s eye movement: the lasers track the eyes and move along with them.
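
Below is a minimal version of the beams, assuming PoseNet’s leftEye and rightEye keypoints; the outward slope of each beam is a fixed illustrative offset. Making the lines squiggly, as mentioned in the glitches below, would mean replacing line() with a vertex path jittered by noise() or random().

let video, poseNet, pose;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  poseNet = ml5.poseNet(video, modelLoaded);
  poseNet.on('pose', (poses) => {
    if (poses.length > 0) pose = poses[0].pose;
  });
}

function modelLoaded() {
  console.log('PoseNet ready!');
}

function draw() {
  image(video, 0, 0, width, height);
  if (pose) {
    stroke(255, 0, 0);
    strokeWeight(4);
    // Each beam starts at an eye keypoint, so it follows the eyes as they move
    line(pose.leftEye.x, pose.leftEye.y, pose.leftEye.x - 200, height);
    line(pose.rightEye.x, pose.rightEye.y, pose.rightEye.x + 200, height);
  }
}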

Glitches: I intended to make the lines squiggly, but they were not capturing the x and y coordinates properly.

Present link: https://preview.p5js.org/preetimahajan11/present/tfO6EaVtp

Edit link: https://editor.p5js.org/preetimahajan11/sketches/tfO6EaVtp

Video link: https://ocadu.techsmithrelay.com/W6ru

Bibliography

https://www.youtube.com/watch?v=bkGf4fEHKak

https://p5js.org/examples/

https://www.youtube.com/watch?v=FYgYyq-xqAw

https://www.youtube.com/watch?v=ISkrBJ9YqCs

https://www.youtube.com/watch?v=LO3Awjn_gyU

https://github.com/tensorflow/tfjs-models/tree/master/posenet