Sign Language with AI

Code for this project:

After looking through all the examples I felt a bit stumped, as most of them seemed like already finished products, and repurposing their code seemed beyond my current skills.

What appealed to me was using the AI engine as an input device and teaching it different hand gestures; the idea was to recognize the sign alphabet shown below.

Copyright: Cavallini & Co. Sign Language Chart Poster

The KNNClassification example included a rock-paper-scissors demo which was doing hand recognition, and it was the perfect start.
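To make sense of what that example was doing, it helped to reduce the KNN idea to its core: store labelled feature vectors and let the nearest ones vote. The sketch below is my own plain-JavaScript illustration of that idea, not ml5's actual implementation (which classifies feature vectors produced by TensorFlow.js).

```javascript
// Minimal sketch of the k-nearest-neighbour idea behind ml5's KNNClassifier.
// Each training example is a feature vector plus a label; a new sample is
// labelled by a majority vote among its k closest examples.

function distance(a, b) {
  // Euclidean distance between two feature vectors
  return Math.sqrt(a.reduce((sum, v, i) => sum + (v - b[i]) ** 2, 0));
}

function knnClassify(examples, sample, k) {
  const votes = examples
    .map(ex => ({ label: ex.label, d: distance(ex.features, sample) }))
    .sort((p, q) => p.d - q.d)
    .slice(0, k)
    .reduce((tally, n) => {
      tally[n.label] = (tally[n.label] || 0) + 1;
      return tally;
    }, {});
  // Return the label with the most votes
  return Object.keys(votes).reduce((a, b) => (votes[a] >= votes[b] ? a : b));
}

// Toy example with 1-D "features"
const examples = [
  { features: [0.1], label: 'rock' },
  { features: [0.2], label: 'rock' },
  { features: [0.9], label: 'paper' },
];
console.log(knnClassify(examples, [0.15], 3)); // prints 'rock' (2 votes to 1)
```

In the real example the features come from a webcam frame rather than a hand-typed array, but the voting step is the same.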

It took some tinkering with the source code to figure out what it was doing and how the HTML was rendering things from the .js file. This took a fair amount of time, as most of the code looked very alien and used some very concise shortcut methods to keep it optimized.

I had to stop and go back to Nick's video to understand what TensorFlow and ml5 were doing and how it all fits together, as my code kept breaking and only the initial 3 values kept showing. A little mind mapping of the TensorFlow and ml5 ecosystem cleared the fog, and I could finally get the engine working with 5 values. It was understanding how the two connect that helped me see what I was trying to do with the code, as most of it was copy-pasting snippets.

mind map

The Idea for using ml5 and p5:
I now had the framework I wanted to use. I did not do the traditional chart, as my idea was to use hand gestures instead, like Stop, Attention and Thumbs Up. This would allow me to make a game of it, where two players could train their own sets and ping-pong signs back and forth; the winner would be the one best at getting precise predictions and at switching gestures before the timer ran out.

These were the charts I used to get the icons from:

I mapped these in the window:


The next part was training the different symbols to recognize the hand gestures. This was fairly straightforward, but I could not get the AI to load the data set I had saved; I tried changing the permissions, but it did not work when I clicked Load, and I kept having to retrain the data sets to get them to function.

The next thing I wanted to try was getting p5 to trigger an animation based on the gesture that was at 100%. I did this with a very simple example: a rotating cube which would turn based on the number that was at 100%. This tested the concept, but the ultimate goal was to make a meter which would act like a gauge for where the most accurate recognition was.
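The trigger logic for the test cube boils down to a small helper. This is a hedged sketch: ml5's KNNClassifier reports per-label confidences, but the names `gestureAt100`, `cubeAngle` and the gesture-to-angle map below are my own inventions for illustration, not part of the library.

```javascript
// Decide which animation to trigger from a set of per-label confidences.

function gestureAt100(confidencesByLabel) {
  // Return the label whose confidence is 1.0 (100%), or null if none is
  for (const [label, confidence] of Object.entries(confidencesByLabel)) {
    if (confidence === 1) return label;
  }
  return null;
}

// Map each gesture to a rotation for the test cube (my own mapping)
const gestureToAngle = { Stop: 0, Attention: 90, ThumbsUp: 180 };

function cubeAngle(confidencesByLabel, currentAngle) {
  const g = gestureAt100(confidencesByLabel);
  // Keep the current angle if no gesture is fully confident
  return g !== null ? gestureToAngle[g] : currentAngle;
}

console.log(cubeAngle({ Stop: 1, Attention: 0, ThumbsUp: 0 }, 45));   // 0
console.log(cubeAngle({ Stop: 0.5, Attention: 0.5, ThumbsUp: 0 }, 45)); // 45
```

In the p5 draw loop, the returned angle would feed a `rotateY()` call on the cube.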


This was a proof of concept where the circle would later be replaced with a meter. I tried making the meter purely in p5, but it was way too much math for such a simple shape, so I made an image in Illustrator and imported it.


The final result does not have a working meter or a game, but both are edging towards that final outcome: a game where two users return the gesture they are given and then trigger a new one. If a player is unable to return the gesture in time, they lose a point. Each time a gesture is returned, the timer gets faster on the next return, until it is too hard to keep up and one person loses a point. The game ends at 10 points.
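The round logic for that planned game can be sketched in a few lines. The exact numbers here are my assumptions filling in the rules described above: a 5-second starting timer, a 10% speed-up per successful return, one point conceded per miss, game over at 10 points.

```javascript
// Sketch of the planned game's round logic (rule details are assumptions).

function makeGame() {
  return { timerMs: 5000, points: { p1: 0, p2: 0 }, over: false };
}

function playRound(game, player, returnedInTime) {
  if (game.over) return game;
  if (returnedInTime) {
    // Successful return: speed up the next round
    game.timerMs = Math.round(game.timerMs * 0.9);
  } else {
    // Missed the gesture in time: concede a point
    game.points[player] += 1;
    if (game.points[player] >= 10) game.over = true;
  }
  return game;
}

const game = makeGame();
playRound(game, 'p1', true);  // timer drops to 4500 ms
playRound(game, 'p2', false); // p2 concedes a point
console.log(game.timerMs, game.points); // 4500 { p1: 0, p2: 1 }
```

The gesture recognition from ml5 would supply `returnedInTime`, and PubNub would carry each round between the two players' browsers.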

The concept for the game:


The above is a mock-up of what I would like to create with the AI engine. I still have a long way to go to realize this, but I have some of the pieces in place. It would use ml5 -> p5 -> PubNub -> p5.


References: examples/tree/release/p5js/KNNClassification/KNNClassification_Video

Special Thanks to Omid Ettehadi for help with understanding the code.
Icons Designed by Freepik








Mom finally gets a Call!




I wanted to explore the notification protocol I used in my Creation and Computation project. It was rushed, and I used code from a third party to get the notification working with IFTTT; it wasn't the best implementation, as it lacked flexibility and there was no way to see any logs of when the data was sent. This new way, through Adafruit IO, is ideal for the implementation of that IoT (Internet of Things) product.

I took this opportunity to rebuild the project with the new notification protocol through Adafruit IO and IFTTT as a whole. It uses some of the original code for getting the sensor information.

The Process:

In this write-up I will focus mainly on the communication protocol, as the Creation and Computation project mainly focused on the product design and the implementation of the circuit.

This time around I already had a proof of concept of how the project worked, and now it was a matter of getting it to send that data via Adafruit IO.

Before I could tackle that, I had to figure out how Adafruit IO worked and all the libraries involved in IoT. For this I re-created one of Adafruit's ESP32 Feather projects: a mailbox that has its flag go up when an email is received in Gmail. This got me familiar with all the libraries, and the sample code for the Feather was very useful to study for setting up the WiFi configuration. I got this working with my Gmail account and was flooding my phone with IFTTT notifications.

Here is the LINK to the project.

Once I had that working, it was a matter of reverse engineering what I had learnt and getting it to work the other way around. Here another one of the code examples helped: the most basic one, called Adafruit IO Publish.

The basic premise is that I had to set up a function to send Adafruit IO a variable once the light-ON condition had triggered it; Adafruit IO would then have IFTTT send the people I had set up an email telling them the lamp was ON. This is exactly what I had before, but the old version was not saving any data on how many times it was triggered, and there was no way of controlling how many people received the email; it was a bit bootstrapped to fit the needs I had at the time.
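Separated from the Adafruit IO transport, the trigger logic is just edge detection: publish once when the lamp state changes to ON, rather than on every sensor read. The sketch below is my own illustration of that idea; `publish` is a stand-in for the real Adafruit IO send call, and the threshold value is arbitrary.

```javascript
// Publish only on the OFF->ON transition so each trigger is recorded once.

function makeLampNotifier(publish) {
  let wasOn = false;
  return function onSensorReading(lightLevel, threshold) {
    const isOn = lightLevel >= threshold;
    if (isOn && !wasOn) {
      // Rising edge: the lamp just turned on, send one trigger
      publish('lamp', 'ON');
    }
    wasOn = isOn;
    return isOn;
  };
}

const sent = [];
const notify = makeLampNotifier((feed, value) => sent.push(`${feed}:${value}`));
notify(10, 100);  // lamp off, nothing sent
notify(150, 100); // lamp turns on -> one publish
notify(160, 100); // still on -> no duplicate
console.log(sent); // [ 'lamp:ON' ]
```

On the Feather, the same pattern would live in the loop that reads the light sensor, with the publish call going to the Adafruit IO feed.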

Bits and Bobs:

The code is all on GitHub, but it is good to highlight some of the main parts of the process. These do not have to be done in sequential order to work.

  • Adafruit IO receives data on a certain channel and then triggers an event.
  • Set up the Feather to send that variable to Adafruit IO in a function which is triggered by an event: a sensor value, remote data, etc. Anything that says "OK, this happened, DO that", where "that" sends the trigger.
  • Set up IFTTT to react to the trigger in Adafruit IO; make sure you have the right settings in the


The project was a very interesting exploration of the Adafruit IO ecosystem and all its different components. There is a limitation on the number of data points you can send, but for small projects you are fine. You can send direct email triggers through Adafruit IO, but that is not available to free accounts.

You can also have Adafruit IO send you a notification when a value is received in the trigger feed.


Adafruit External Services:

Adafruit Mailbox:

Adafruit IoT Library:

Working with APIs – Tag Cloud



In this experiment, we had to explore connecting to APIs and making use of the data we received. I explored various APIs other than the one we worked on in class, called Wolfram. The data we were working with was mainly text, and the challenge was to take the text data received from different queries and turn it into something visual on the screen. I used p5.js for this experiment, though we were not limited to this framework. I had several ideas for using other data sets and frameworks and tried a few, but they were not realtime: I could not send a query and get back an interactive response like I could with the PubNub and Wolfram setup, and setting the others up required hosting your own servers, which I did not have time to explore. I settled on using the data from Wolfram and creating a tag cloud that displays in different colours.


GitHub Link:

I explored using node.js, the desktop version of Processing, and plain HTML5. My main idea was to download stock data and convert it to a chart. This was possible in Processing after I downloaded a CSV file from ; they gave me an API key and the stock data. This I could accomplish, but the data would not be real time, and there was no interactive loop to call the data with a query from the browser.

I used the code samples from to explore using the data set and visualizing it in Processing.

The Skillshare class was also very good for exploring how to visualize data in the form of JSON or CSV files.

The API:

I settled on using the API we set up in class: it had an interactive component, and it sent back text data from the different queries we sent it through PubNub.

One of the ideas was to convert the colour text string received into its corresponding RGB value and show it on the screen as a palette. I thought this would be a relatively straightforward process, but as with anything, simple does not mean easy. For one, the text I was getting back was a string with many words, and I would have to evaluate each word in the string by making it an array. Then there was the problem that sometimes what I got back were things like "Light Blue", which needed my code to understand that pair of words as one colour and convert it accordingly. I was able to split the text into an array and check for words, but I got stumped using a dictionary to map the words to RGB codes and put the result back on the screen; the fact that RGB values are in brackets and are integers seemed to be the biggest problem. I scrapped that idea for this assignment, as I need to brush up my coding to get past that challenge.
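For the record, here is roughly how the dictionary approach could look. The colour map below is a tiny hand-picked sample, not Wolfram's vocabulary, and the trick is to check two-word names like "light blue" before single words.

```javascript
// Map colour names (including two-word names) found in a text to RGB triples.

const COLOUR_MAP = {
  'red': [255, 0, 0],
  'blue': [0, 0, 255],
  'light blue': [173, 216, 230],
};

function findColours(text) {
  const words = text.toLowerCase().split(/\s+/);
  const found = [];
  for (let i = 0; i < words.length; i++) {
    // Check two-word names first so "light blue" beats plain "blue"
    const pair = words.slice(i, i + 2).join(' ');
    if (COLOUR_MAP[pair]) {
      found.push({ name: pair, rgb: COLOUR_MAP[pair] });
      i++; // skip the second word of the pair
    } else if (COLOUR_MAP[words[i]]) {
      found.push({ name: words[i], rgb: COLOUR_MAP[words[i]] });
    }
  }
  return found;
}

console.log(findColours('a light blue sky with red accents'));
// -> light blue [173,216,230] and red [255,0,0]
```

In a p5 sketch, each returned `rgb` triple could be passed straight to `fill()` to paint the palette swatches.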

In the process I did find that I could display visual data based on the input I received, and I came to understand how an array can be broken down and how PubNub sends back a data string.

I used this to inform my next experiment. I made a new sketch and started from scratch, making use of the text I was receiving and trying to break it up into single words. For this I went down the rabbit hole of working with text in p5.js, and explored a .js add-on for p5 called:

This JavaScript library is compatible with p5.js and allows for sophisticated functions for working with text. I would like to say it was needed, but it was not; it is a good resource I now have for future projects, but I could not use it for this one.

It took only a few lines of code to create a tag cloud with the text I received. This proved very satisfying and also very effective, as the communication was real-time and it always created a new visual pattern based on the query. I would like to add a bit more finesse to the final file, but for now I have a working prototype which uses the data received and does something interesting with it.
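The core of a tag cloud is just counting word frequencies and scaling font sizes; the p5 drawing part is left out here so the logic stands alone. The function names and the pixel range are my own choices, not from the original sketch.

```javascript
// Count word frequencies, then map counts to font sizes for a tag cloud.

function wordCounts(text) {
  const counts = {};
  for (const word of text.toLowerCase().match(/[a-z]+/g) || []) {
    counts[word] = (counts[word] || 0) + 1;
  }
  return counts;
}

function fontSize(count, maxCount, minPx = 12, maxPx = 48) {
  // Scale linearly between a minimum and maximum pixel size
  return minPx + (maxPx - minPx) * (count / maxCount);
}

const counts = wordCounts('blue sky blue sea');
console.log(counts);                          // { blue: 2, sky: 1, sea: 1 }
console.log(fontSize(2, 2), fontSize(1, 2));  // 48 30
```

In p5, each word would then be drawn with `textSize(fontSize(...))` at a random position and colour.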

This was the Result:


References: (Finance API) (Processing code for data viz) (JavaScript library for working with text)

Code from Nick



Github Link:
The Project:

The game Showdown!! is a twist on the showdown (gunfight) of classic western movies. The idea was to use the Nudgeables as game controllers, with vibration as a tactile indicator of being shot or injured. It is a very short game, around 2-3 minutes, and allows 2 players to duel to see who is the fastest gun in the west.


We used p5 to display the animations and interaction on screen. It has two components: the p5 serial controller for the serial port connection with the Arduino 'Listener', and the p5 code in the web browser which makes the interaction visible. The purpose of the p5 display is to track when a 'gunshot' is fired, providing participants with an accurate gauge of who shot first, as well as a scoring mechanism. We reused characters from one of Frank's previous projects and imagined that, as players scored points, the characters might lose limbs or otherwise act out.

Development was halted when we discovered technical problems with the listener device, but we made a proof-of-concept version to demonstrate how it would have worked.

The Nudgeable Mitts:


The hardware provided was the Nudgeables, a custom device created by Kate Hartman for body-centric design experiments. Our intended circuit featured two XBees listening in on the Nudgeable devices; when either Nudgeable device was activated, the listening device would send a signal to the microcontroller and our p5 interface. Due to technical challenges, we were unable to realize this with the Nudgeable devices. Perhaps using a microcontroller for the job would have been more intuitive.
We sourced two pairs of mitts from a dollar store and affixed the Nudgeables to the garments with conductive thread. To activate the device, we designed a 'switch' that would close whenever a thimble came into contact with the palm of the glove, which had a layer of conductive fabric. The activation of the device was very intuitive and effective.

The Listener (+ Challenges with the Nudgeables Hardware)


In order to create this project, we had to determine exactly how the Nudgeable XBees were communicating with each other. The Nudgeables user manual doesn't explicitly define the configuration settings for the devices, but we determined that the boards were likely using wireless communication to activate paired pins between the devices' modes A and B. We used serial communication with the Nudgeable XBees to collect their channel, identity, target and digital pin pairings in order to create 'listeners' for our projected interface that could eavesdrop on their communication and trigger events for our microcontroller and software. Based on our testing, we determined that the ATD0 and ATD1 values for the Nudgeable XBees were paired with each other and that they communicated on channel 12.
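Conceptually, the settings we collected amount to a handful of AT registers, and a 'listener' is just a verbatim copy of them. The sketch below illustrates this; apart from the channel (12, as measured above), the register values are stand-ins, not the real Nudgeable settings.

```javascript
// Illustration only: an XBee's key AT registers represented as an object.
const nudgeableA = {
  CH: 0x0C,   // channel 12, as we measured
  ID: 0x3332, // PAN ID (stand-in value)
  MY: 0x0A,   // own 16-bit address (stand-in)
  DL: 0x0B,   // destination address (stand-in)
  D0: 3,      // digital pin modes paired with the other device
  D1: 3,
};

// A listener "clone" copies every register so it hears the same traffic
function cloneConfig(config) {
  return { ...config };
}

const listener = cloneConfig(nudgeableA);
console.log(listener.CH === nudgeableA.CH); // true
```

In practice each of these values is written over serial with an AT command (e.g. setting the channel, PAN ID and addresses in command mode), which is what XCTU's clone feature automates.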

With this in mind, we manually copied the configuration information we collected from the Nudgeable XBees and applied it to our own XBees in an attempt to create 'clones' that would listen in on the conversation of the originals. We tried to detect when the digital pins were activated on a breadboard by using LEDs, then serial from an Arduino Micro. The results were indeterminate, and when we later tried plugging our clones into the Nudgeable devices, they didn't behave appropriately. In essence, something was still different between our clones and the original XBees.

We discovered that a piece of software called XCTU has a means of properly cloning the configuration of one XBee to another, and decided to give that a go. The clones created this way worked in the Nudgeable devices, although we were still unable to use them for our listening device.



What worked:

The Nudgeables are designed to be very modular and worked great as part of the gloves; we were able to get the gloves to act as triggers with conductive fabric and conductive thread.

  • Each glove vibrates when you press against the conductive fabric.
  • We were able to clone the XBees and listen in on the Nudgeables.
  • The p5 sketch works with test data to trigger and show the winner.


What didn’t work but we would like to have:

Although we failed to realize our intended project, we did learn more about the XBees and their communication protocol in the Nudgeable devices, and we used the opportunity to create some fun controllers. If we were to take up the project again, we would probably use an Arduino instead of the Nudgeable devices for the communication, as we would have more control over the communication protocol and over the Listener's capacity to capture the communication between the devices. It would also be informative to ask Kate directly how the communication protocol works and what the benefits are of having the devices communicate this way; one novelty we noticed was that a device would receive illegible serial communication from the other whenever both were active (and one was connected to a computer via an FTDI chip).


Experiment 1: XBee Talking

In this experiment, we explored the XBee communication protocols. The XBee is a wireless radio device which can be used to communicate with other XBees, with or without microcontrollers.

Mind Map

You can download the entire mind map as a PDF from the image above.

The above is the basic ecosystem of how the XBee works, based on Kate Hartman's slides and the Digi.com website.

The Main Concepts:

  • The basic idea is the same as a walkie-talkie: each device can both send and receive signals; they are transceivers.
  • The device runs on 3.3V; using a higher voltage can damage it.
  • The XBee is configured, not programmed; you cannot add new commands to it, only configure the properties set in the data sheet.
  • The pin spacing does not fit a breadboard, so you have to use a breakout adapter.
  • It operates in two modes, Transparent vs Command: transparent is the data mode for sending and receiving, while command mode, entered with +++, is where you configure the device.

Experiment 01

For this experiment, we had to receive a signal from one XBee in the form of a PULSE: H would turn the power to the pin ON and L would turn the pin to Low/OFF.
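In transparent mode the XBee simply hands the received bytes to the microcontroller, so the receiving program only has to map H and L to pin states. Here is a sketch of that logic in plain JavaScript, with `setPin` standing in for the real pin write on the microcontroller:

```javascript
// Map incoming serial characters to pin states: 'H' -> HIGH, 'L' -> LOW.

function makePulseHandler(setPin) {
  return function onByte(ch) {
    if (ch === 'H') setPin(true);        // power the pin ON
    else if (ch === 'L') setPin(false);  // pin to Low/OFF
    // any other byte is ignored
  };
}

const states = [];
const handle = makePulseHandler(on => states.push(on));
for (const ch of 'HLxHL') handle(ch);
console.log(states); // [ true, false, true, false ]
```

On the actual hardware the same branch would wrap a digital pin write inside the serial-read loop, driving the fan (or, in the end, the buzzer).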

My idea was to use the signal to drive a fan inside a tube and levitate a ball. I got the pieces together, but the fan was not strong enough to drive the ball into the air.


What can I do to change this?
I could get this to work if I used a MOSFET with a higher-powered fan and a lighter ball, but I did not have time to go back for the parts and change my design.


I went back to the drawing board and went with the simplest solution: programming a buzzer to receive the signal and create a simple pulse.