FaceTracking


Context

In a world full of interactive devices, we find ourselves surrounded by sensors, joysticks, screens, and more. For this assignment, we decided to explore a different kind of input that does not require the user to press any buttons or touch any screens: instead, we wanted to use the camera and the user’s face to send commands to a computer. FaceTracking is a p5.js application that uses the computer’s camera and a tracking algorithm to read and interpret the user’s face. The application is currently set up to play a sound in response to the user’s head movement, but any function could be bound to it.


Objective

In this project, we explore using the user’s face as a controller to send commands to a device. This tool is a basic prototype, but it has the potential to be scaled to include any number of functions triggered by the user’s facial movement and expression.

This tool is based on Kyle McDonald’s FaceTracking example, which uses a tracking algorithm that contours the user’s eyes, nose, mouth, eyebrows, and chin. Each landmark on the face is given a number, and vectors are drawn connecting the numbered points. Using this tool, we were able to make a simple beat player that lets the user play simple music.

 

Link to Code

 

Design Process

High-level computer vision focuses on complex analysis of images. When talking about CV and faces, there are three major tasks:

1) Detection: spotting the difference between a face and a non-face.

2) Recognition: distinguishing different faces.

3) Tracking: a combination of detection and recognition over time.

We wanted to explore the face tracking option and create a controller using our faces. We started with Kyle McDonald’s Face Tracking Example.


We found these Class Notes from McDonald, which explain all you need to know about CV and faces. OpenCV uses the Haar detection technique, developed by Paul Viola and Michael Jones in 2001. Haar detection can be used for any two-dimensional object, but it cannot handle significant rotation or skew, and it is very sensitive to the contrast and colour variation in the image. There is a video about HP computers whose webcams could not follow Black faces.

The face tracking example identifies 70 points on the user’s face.
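As a point of reference, each tracked point can be read back as an [x, y] pair and drawn on the canvas. The fragment below is a sketch of that idea, assuming a clmtrackr-style tracker object that we have named ctracker (our name, not necessarily the example’s):

// Inside draw(): plot and number every tracked landmark.
// Assumes ctracker is a running face tracker whose
// getCurrentPosition() returns an array of [x, y] pairs (or false).
const positions = ctracker.getCurrentPosition();
if (positions) {
  fill(0, 255, 0);
  noStroke();
  for (let i = 0; i < positions.length; i++) {
    ellipse(positions[i][0], positions[i][1], 4, 4);
    text(i, positions[i][0] + 5, positions[i][1]); // label each point with its index
  }
}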

We took the key points that lay out the face’s features and drew contours around them. We didn’t notice much change in the eyes, eyebrows, or nose, but we were able to track the rotation of the overall face contour. So we took the two points that define the edges of the face and compared them with each other: by comparing their Y positions, we could tell whether the face was tilting in either direction. After that, we attached a sound to each direction so that the user could play music by moving their head.
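A minimal p5.js sketch of the whole idea might look like the following. It assumes the clmtrackr library (which McDonald’s example builds on), p5.sound, and two placeholder files beat1.mp3 and beat2.mp3; the point indices and threshold are our assumptions here, so treat this as a sketch rather than our exact code:

let capture, ctracker, beatLeft, beatRight;

function preload() {
  // Hypothetical file names for the two beats
  beatLeft = loadSound('beat1.mp3');
  beatRight = loadSound('beat2.mp3');
}

function setup() {
  createCanvas(640, 480);
  capture = createCapture(VIDEO);
  capture.size(640, 480);
  capture.hide();
  ctracker = new clm.tracker(); // clmtrackr face tracker
  ctracker.init();
  ctracker.start(capture.elt);
}

function draw() {
  image(capture, 0, 0);
  const positions = ctracker.getCurrentPosition();
  if (!positions) return; // no face found yet

  // Points 0 and 14 sit on the left and right edges of the jaw
  // in clmtrackr's default model (assumed indices).
  const leftY = positions[0][1];
  const rightY = positions[14][1];
  const tilt = rightY - leftY;

  const threshold = 20; // pixels of Y difference before a beat triggers
  if (tilt > threshold && !beatLeft.isPlaying()) {
    beatLeft.play(); // head tilted one way
  } else if (tilt < -threshold && !beatRight.isPlaying()) {
    beatRight.play(); // head tilted the other way
  }
}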

 

Tools & Materials Used

p5.js online editor

GitHub

Two MP3 files

Laptop


Challenges

Trying to determine whether the face was angled up or down was slightly confusing. We initially used a calculation to find the exact point at which the controller would activate, but after rethinking our logic we realized the exact point was not needed.
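For illustration, the simplified version of that up/down check could be as small as comparing a single landmark against a neutral position captured on the first frame; the chin index and the 15-pixel margin here are assumptions, not the exact values we used:

// Simplified pitch check: no exact activation point needed, just a
// margin around a neutral Y calibrated when the face first appears.
const CHIN = 7; // assumed index of the chin in clmtrackr's jaw outline (points 0-14)
let neutralY = null;

function facePitch(positions) {
  const chinY = positions[CHIN][1];
  if (neutralY === null) neutralY = chinY; // calibrate on first sighting
  if (chinY < neutralY - 15) return 'up';
  if (chinY > neutralY + 15) return 'down';
  return 'neutral';
}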

 

Future Steps

Future iterations could load a new beat every time the user opens the page (a sketch of this follows), so that others could make a variety of beats together, each in front of the camera on their own device. The tracker could also support more than one face, since it currently recognizes only one, and again assign a random set of beats to each face as its own controller.
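A per-visit shuffle would be enough for the first of these ideas; a sketch, assuming a pool of hypothetical beat files:

// Pick two different beats at random on every page load.
const BEAT_FILES = ['beat1.mp3', 'beat2.mp3', 'beat3.mp3', 'beat4.mp3'];
let beatLeft, beatRight;

function preload() {
  const picks = shuffle(BEAT_FILES); // p5.js array shuffle
  beatLeft = loadSound(picks[0]);
  beatRight = loadSound(picks[1]);
}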

 

Useful links to look into:

 

OpenCV Website: https://opencv.org/

Kyle McDonald’s Class Notes: https://github.com/kylemcdonald/AppropriatingNewTechnologies/wiki/Week-2

Kyle McDonald’s CV Examples: https://kylemcdonald.github.io/cv-examples/

OpenCV Face Detection Visualized: https://vimeo.com/12774628

How To Avoid Facial Recognition: https://vimeo.com/41861212

Face Detection For Beginners: https://towardsdatascience.com/face-detection-for-beginners-e58e8f21aad9