Technologies and short projects
Slit Scan with ml5.js
Over the last couple of weeks I have been exploring different techniques. One of the things I tried was slit scanning, which I wanted to combine with ml5.js. With this experiment I wanted to see whether the pre-trained model could still recognize a face if the participant stood still.
This did not work so well. The slit scan itself worked, but the way the frame was sliced by p5.js in the code made it hard for the ml5.js model to detect a face.
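The core of a slit scan comes down to copying one vertical slice of each incoming frame into an accumulating image, so every column of the output shows the same slit at a later moment in time. A minimal sketch of that copy step in plain JavaScript, operating on flat RGBA arrays like p5.js's pixels array (the function name and parameters are my own):

```javascript
// Slit scan core: copy one vertical column of pixels from an
// incoming frame into a chosen column of an output buffer.
// frame and out are flat RGBA arrays (like p5.js's pixels array).
function copySlit(frame, frameWidth, height, srcX, out, outWidth, destX) {
  for (let y = 0; y < height; y++) {
    const src = 4 * (y * frameWidth + srcX); // RGBA index in the frame
    const dst = 4 * (y * outWidth + destX);  // RGBA index in the output
    for (let c = 0; c < 4; c++) {
      out[dst + c] = frame[src + c];
    }
  }
}

// In a p5.js draw() loop, destX would advance by one each frame,
// turning the output image into a timeline of the chosen slit.
```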
This week we also started looking at CLM tracker.
CLM tracker is better at tracking the face because it fits a model to many key points across it. Thanks to this detailed mapping, CLM tracker can do something distinctive: overlay shapes onto specific parts of the face. The code was released in 2016, and it feels almost like a precursor to the face-filter technology that exists now.
Its key feature is this mapping of shapes onto the face.
Sometimes the tracker works well, but most of the time it takes a while to locate the face. It isn't as steady as ml5.js, which made it harder to code with, as we found while trying to map shapes onto each key point.
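clmtrackr exposes the fitted key points as an array of [x, y] pairs via getCurrentPosition(), so mapping a shape onto a facial feature reduces to averaging the points of that feature and drawing at the centroid. A sketch of that helper (the specific point indices for a feature are defined by the library's model; the names below are my own):

```javascript
// Given clmtrackr-style positions ([[x, y], ...]) and a list of
// key-point indices, return the centroid where a shape could be drawn.
function featureCentroid(positions, indices) {
  let sx = 0, sy = 0;
  for (const i of indices) {
    sx += positions[i][0];
    sy += positions[i][1];
  }
  return [sx / indices.length, sy / indices.length];
}

// In a p5.js draw() loop one might then do (noseIndices is hypothetical):
//   const [x, y] = featureCentroid(tracker.getCurrentPosition(), noseIndices);
//   ellipse(x, y, 30, 30); // draw a circle over that feature
```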
One of the unique examples built on CLM tracker is the emotion reader, which recognizes an emotion based on where each key point sits on a particular face. The interesting insight for me was that having my facial expressions recognized encouraged me to contort my face into different configurations to trigger different emotions. I also wanted to make some notes on what the potential for this kind of technology could be in the future.
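The idea behind the emotion reader can be sketched simply: the tracker describes the current face as a vector of shape parameters, and each emotion gets a small trained model that scores that vector; the highest score wins. This toy version uses invented weights purely for illustration (the real classifier and its trained coefficients ship with clmtrackr's examples):

```javascript
// Toy emotion scorer: each emotion is a logistic model over the
// tracker's shape-parameter vector. All weights here are invented.
function scoreEmotions(params, models) {
  const scores = {};
  for (const [emotion, model] of Object.entries(models)) {
    let z = model.bias;
    for (let i = 0; i < params.length; i++) {
      z += model.weights[i] * params[i];
    }
    scores[emotion] = 1 / (1 + Math.exp(-z)); // squash to 0..1
  }
  return scores;
}

// Pick the emotion with the highest score.
function topEmotion(scores) {
  return Object.entries(scores).sort((a, b) => b[1] - a[1])[0][0];
}
```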
This led me to research the topic further, to see what is out there in terms of the pros and cons of this type of technology. CLM tracker is open source and compatible with p5.js, but it isn't as sophisticated as the commercial systems out there. Emotion recognition is a subfield of the facial recognition industry, which is projected to grow from 19 billion to 37 billion USD. (https://neurodatalab.com/blog/how-do-technologies-recognize-our-emotions-and-why-it-is-so-promising/)
One of our goals throughout this process was also to look at openFrameworks, a creative coding toolkit written in C++. Earlier in the course we had watched some of Zach Lieberman's videos, which gave us interesting insights into the openFrameworks platform. We downloaded openFrameworks and Xcode, the IDE used to build the code. The advantage of starting to use openFrameworks is that we move away from the browser, which is the main limitation of p5.js.
We tried out the examples. The first one was the blob example, which is great for hand tracking.
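The grouping step behind blob detection is worth seeing on its own: after the camera image is thresholded to black and white, connected foreground pixels are grouped into "blobs" (openFrameworks does the real work in C++ with OpenCV; this plain-JavaScript flood fill over a binary grid is just a sketch of the idea):

```javascript
// Count "blobs" (4-connected regions of 1s) in a binary grid —
// the grouping step behind contour/blob detection.
function countBlobs(grid) {
  const h = grid.length, w = grid[0].length;
  const seen = grid.map(row => row.map(() => false));
  let blobs = 0;
  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      if (grid[y][x] === 1 && !seen[y][x]) {
        blobs++;
        const stack = [[y, x]]; // iterative flood fill from this seed
        while (stack.length) {
          const [cy, cx] = stack.pop();
          if (cy < 0 || cy >= h || cx < 0 || cx >= w) continue;
          if (grid[cy][cx] !== 1 || seen[cy][cx]) continue;
          seen[cy][cx] = true;
          stack.push([cy + 1, cx], [cy - 1, cx], [cy, cx + 1], [cy, cx - 1]);
        }
      }
    }
  }
  return blobs;
}
```

A raised hand against a dark background would show up as one large blob, which is why the example works so well for hand tracking.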