Sign Language with AI

Code for this project: https://github.com/imaginere/UbiComp_Exp6

After looking through all the examples I felt a bit stumped, as most of them seemed like finished products, and repurposing their code seemed beyond my current skills.

What appealed to me was using the AI engine as an input device and teaching it different hand gestures; the idea was to recognize the sign alphabet shown below.

[Image: sign language alphabet chart]
Copyright: Cavallini & Co. Sign Language Chart Poster

The KNNClassification example included a rock-paper-scissors demo that was already doing hand recognition, which made it the perfect starting point.

It took some tinkering with the source code to figure out what it was doing and how the HTML was rendering things from the .js file. This took a fair amount of time, as most of the code looked very alien and used some very concise shortcut methods to keep it optimized.

I had to stop and go back to Nick's video to understand what TensorFlow and ml5 were doing and how it all fits together, as my code kept breaking and only the initial three values kept showing. A little mind mapping of the TensorFlow and ml5 ecosystem cleared the fog, and I could finally get the engine working with five values. It was understanding how the two libraries connect that helped me see what I was trying to do with the code, since most of it was copy-pasted snippets.

[Image: mind map of the TensorFlow and ml5 ecosystem]
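To make that mental model concrete, here is a minimal sketch of the pipeline as I understand it, loosely following the ml5 KNNClassification_Video example: TensorFlow.js runs MobileNet under the hood, ml5's feature extractor turns each video frame into a feature vector, and the KNN classifier matches that vector against the trained examples. The gesture labels and helper names here are my own, not the example's exact code.

```js
let video;
let featureExtractor;
let knnClassifier;

function setup() {
  createCanvas(320, 240);
  video = createCapture(VIDEO);
  video.size(320, 240);
  video.hide();
  // TensorFlow.js loads MobileNet; ml5 wraps it as a feature extractor
  featureExtractor = ml5.featureExtractor('MobileNet', () => console.log('model ready'));
  knnClassifier = ml5.KNNClassifier();
}

// Call this while holding a gesture to add a training example for a label
function addExample(label) {
  const features = featureExtractor.infer(video); // frame -> feature vector
  knnClassifier.addExample(features, label);
}

// Classify the current frame against everything trained so far
function classify() {
  if (knnClassifier.getNumLabels() > 0) {
    const features = featureExtractor.infer(video);
    knnClassifier.classify(features, gotResults);
  }
}

function gotResults(err, result) {
  if (err) { console.error(err); return; }
  // confidencesByLabel maps each trained label to a 0-1 confidence
  console.log(result.label, result.confidencesByLabel);
}
```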

The idea for using ml5 and p5:
I now had the framework I wanted to use. I did not go with the traditional alphabet chart, as my idea was to use everyday hand gestures instead: Stop, Attention, Thumbs Up, and so on. This would allow me to make a game of it, where two players train their own sets and ping-pong signs back and forth; the winner would be the one best at getting precise predictions while switching gestures before the timer ran out.

These are the charts I used to get the icons from:

I mapped these in the window:

[Image: gesture icons mapped in the p5 window]
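A minimal sketch of how the icons can be laid out in the p5 window; the labels and file paths are placeholders, not the actual assets.

```js
const labels = ['Stop', 'Attention', 'ThumbsUp', 'Peace', 'Fist'];
let icons = [];

function preload() {
  // Placeholder paths; each icon stands for one trainable gesture
  for (const label of labels) {
    icons.push(loadImage(`icons/${label}.png`));
  }
}

function setup() {
  createCanvas(640, 120);
}

function draw() {
  background(255);
  // Lay the icons out in a row so every gesture class stays visible
  icons.forEach((img, i) => image(img, 10 + i * 125, 10, 100, 100));
}
```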

The next part was training the different symbols so the classifier would recognize the hand gestures. This was fairly straightforward, but I could not get the AI to load the data set I had saved: I tried changing the file permissions, but clicking load still did nothing, and I kept having to retrain the data sets to get them to function.
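For reference, a minimal sketch of the save and load calls the ml5 KNNClassifier exposes; the dataset name is a placeholder. One possible cause of the load problem (an assumption on my part, not something I verified) is opening the page straight from the filesystem, where the browser blocks the request for the JSON file, so serving it from a local HTTP server is worth trying.

```js
// Save the trained examples; ml5 downloads them as a JSON file.
function saveDataset() {
  knnClassifier.save('myGestures'); // produces myGestures.json
}

// Load a previously saved dataset back into the classifier.
// The page should be served over HTTP (e.g. a local server), or the
// browser may silently block the request for the JSON file.
function loadDataset() {
  knnClassifier.load('./myGestures.json', () => {
    console.log('dataset loaded');
  });
}
```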

The next thing I wanted to try was getting p5 to trigger an animation based on the gesture that reached 100% confidence. I tested this with a very simple example, a rotating cube that would turn based on the number that hit 100%. This proved the concept, but the ultimate goal was a meter that would act as a gauge showing where the most accurate recognition was.
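A minimal sketch of that trigger logic, assuming the gotResults callback shape from the earlier snippet; the spinning square is a flat stand-in for the cube, and the mapping is illustrative rather than the project's actual code.

```js
let angle = 0;
let activeGesture = null;

function setup() {
  createCanvas(320, 240);
}

// Classification callback: only react when a label is fully confident
function gotResults(err, result) {
  if (err) { console.error(err); return; }
  const confidences = result.confidencesByLabel;
  activeGesture =
    Object.keys(confidences).find((label) => confidences[label] === 1) || null;
}

function draw() {
  background(240);
  if (activeGesture) {
    angle += 0.05; // spin only while a gesture is held at 100%
  }
  translate(width / 2, height / 2);
  rotate(angle);
  rectMode(CENTER);
  rect(0, 0, 80, 80); // flat stand-in for the rotating cube
}
```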

[Image: proof-of-concept animation reacting to a recognized gesture]

This was a proof of concept in which the circle would later be replaced with a meter. I tried making the meter purely in p5, but it was far too much math for such a simple shape, so I made an image in Illustrator and imported it instead.

[Image: gauge meter image created in Illustrator]
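A minimal sketch of how an imported gauge image can be combined with a p5-drawn needle, assuming a hypothetical gauge.png export and a confidence value fed in from the classifier:

```js
let gaugeImg;
let confidence = 0; // updated from the classifier's results callback

function preload() {
  gaugeImg = loadImage('gauge.png'); // hypothetical Illustrator export
}

function setup() {
  createCanvas(400, 300);
}

function draw() {
  background(255);
  image(gaugeImg, 0, 0, 400, 250);
  // Sweep the needle from -90 degrees (0%) to +90 degrees (100% confidence)
  const needleAngle = map(confidence, 0, 1, -HALF_PI, HALF_PI);
  push();
  translate(width / 2, 250); // pivot at the gauge's base
  rotate(needleAngle);
  stroke(200, 0, 0);
  strokeWeight(4);
  line(0, 0, 0, -180); // the needle itself
  pop();
}
```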

The final result does not have a working meter or a game, but both are edging towards the final outcome: a game where two users return the gesture they are given and then trigger a new one for their opponent. If a player is unable to return the gesture in time, they lose a point. Each time a gesture is returned, the timer gets faster for the next return, until it is too hard to keep up and one player loses a point. The game ends at 10 points.
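A minimal sketch of that round logic, written purely from the description above; every name and starting value is an assumption, not code from the project.

```js
// Hypothetical round logic for the gesture ping-pong game described above.
let timeLimit = 5000;            // ms allowed to return a gesture
let deadline = null;             // when the current round expires
const lostPoints = { p1: 0, p2: 0 };
let currentPlayer = 'p1';

function startRound() {
  deadline = millis() + timeLimit;
}

// Call when the current player matches the target gesture at 100%
function gestureReturned() {
  timeLimit = max(1000, timeLimit * 0.9); // each return speeds the timer up
  currentPlayer = currentPlayer === 'p1' ? 'p2' : 'p1';
  startRound();
}

function setup() {
  createCanvas(400, 100);
  startRound();
}

function draw() {
  background(255);
  // Missing the deadline costs the current player a point;
  // the game ends when someone reaches 10 lost points
  if (deadline !== null && millis() > deadline) {
    lostPoints[currentPlayer] += 1;
    if (lostPoints[currentPlayer] >= 10) {
      noLoop(); // game over
    } else {
      startRound();
    }
  }
}
```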

The concept for the game:

[Image: mock-up of the two-player gesture game]

The above is a mock-up of what I would like to create with the AI engine. I still have a long way to go to realize it, but I have some of the pieces in place. It would use ml5 -> p5 -> PubNub -> p5.
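A minimal sketch of the PubNub hop in that chain, assuming the PubNub JavaScript SDK is included on the page; the keys, channel name, and message shape are all placeholders.

```js
// Placeholder keys; real ones come from a PubNub account
const pubnub = new PubNub({
  publishKey: 'pub-key-here',
  subscribeKey: 'sub-key-here',
  uuid: 'player-1',
});

// Listen for the opponent's gestures on a shared channel
pubnub.addListener({
  message: (event) => {
    console.log('opponent sent:', event.message.label);
    // here p5 would display the gesture this player must return
  },
});
pubnub.subscribe({ channels: ['gestures'] });

// Send our recognized gesture to the other player
function sendGesture(label) {
  pubnub.publish({
    channel: 'gestures',
    message: { label, from: 'player-1' },
  });
}
```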

 

References:

https://github.com/ml5js/ml5-examples/tree/release/p5js/KNNClassification/KNNClassification_Video

Special Thanks to Omid Ettehadi for help with understanding the code.
Icons Designed by Freepik