Workshop Notes #5
Ubiquitous Computing
Olivia Prior

Web Application 

Web application “Eavesdropper” home page.


Eavesdropper is a web application that actively listens for a user’s name spoken in conversation. The application uses a voice-to-text API to transcribe conversations within listening range. The transcriptions are analyzed, and if someone’s name is said, the application sounds an alert noting that that person is going to be notified by text message. Simultaneously, the clip of what was being said around the user is saved. The user then receives a text message and can go see what was being said about them.


Eavesdropper is an exploration into creating an applet on the platform If This Then That (IFTTT) using Adafruit IO. The purpose of IFTTT is to pair a trigger with a response. I wanted to create an accessible and customizable web page that anyone could use as an applet. The JavaScript voice-to-text API analyzes what is being said in the space where the application is open. If the transcribed text contains the name, the page sends two pieces of data to Adafruit IO: the snippet of conversation containing the user’s name and a “true” statement. IFTTT is linked with Adafruit IO; if the channel data matches “true”, the applet sends a text message to the associated user letting them know that someone is mentioning them in conversation. The web application simultaneously uses the text-to-voice API to announce a message to the party that set off the trigger. This applet is simple to set up, allowing anyone to create a transcription analyzer that can notify them of anything they wish.


Building upon my previous voice-to-text “DIY Siri” project, I wanted to play with the concept “what if my computer could notify me if it heard something specific?”. I initially thought it would be interesting to build directly off the Wolfram Alpha API from the DIY Siri project and notify me when something specific was searched. From there I decided to isolate the working parts and start with the concept “the application hears a specific word, the user gets a notification”. I chose names as a trigger because they are said rarely enough that the trigger would not fire frequently. This matters because both IFTTT and Adafruit IO have data sending and receiving limits: IFTTT can send up to 100 text messages a month, and Adafruit IO allows channel updates 30 times a minute.

I started off by using my existing code from DIY Siri and removing the PubNub server integration. I then changed the code to analyze the transcripts of what was being said: if my name was mentioned, log this information.

Iterating through the transcript to detect if the string “Olivia” was picked up
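A minimal sketch of that detection step, assuming the browser’s prefixed webkitSpeechRecognition API — the function and variable names here are illustrative, not the project’s exact code:

```javascript
// Case-insensitive check so "olivia", "Olivia!", etc. all match.
function transcriptMentions(transcript, name) {
  return transcript.toLowerCase().includes(name.toLowerCase());
}

// Hypothetical wiring to the Web Speech API (browser-only, Chrome-prefixed).
function startListening(name, onMention) {
  const recognition = new webkitSpeechRecognition();
  recognition.continuous = true;
  recognition.interimResults = false;
  recognition.onresult = (event) => {
    // Iterate through the new results and check each transcript.
    for (let i = event.resultIndex; i < event.results.length; i++) {
      const transcript = event.results[i][0].transcript;
      if (transcriptMentions(transcript, name)) onMention(transcript);
    }
  };
  recognition.start();
}
```

In a page this would be started with something like `startListening('Olivia', snippet => console.log(snippet))` after the user grants microphone permission.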

My next step was to connect my Adafruit IO channel to the page. I created a new feed titled “overheard” with two channels: listening, and transcripts. Listening would indicate whether or not my name was overheard, and transcripts would save whatever was being said about me.

After creating those two channels, I connected my voice to text API to Adafruit to see if I would be able to save the value “true” and the transcript of the conversation. I tested with “if my name is included in this transcript, send the data to Adafruit”. This was successful.
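A sketch of that step over Adafruit IO’s HTTP API — the username and AIO key below are placeholders, and the feed keys assume the “listening” and “transcripts” channels named above:

```javascript
// Placeholders: substitute your own Adafruit IO username and key.
const AIO_USER = 'YOUR_USERNAME';
const AIO_KEY = 'YOUR_AIO_KEY';

function buildFeedUrl(user, feedKey) {
  return `https://io.adafruit.com/api/v2/${user}/feeds/${feedKey}/data`;
}

function sendToFeed(feedKey, value) {
  // Note: IFTTT's "equal to" compares strings, so send "true", not a boolean.
  return fetch(buildFeedUrl(AIO_USER, feedKey), {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', 'X-AIO-Key': AIO_KEY },
    body: JSON.stringify({ value: String(value) }),
  });
}

// On a name match, the sketch would do something like:
// sendToFeed('listening', true); sendToFeed('transcripts', snippet);
```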

Upon the guidance from Adafruit, I started to create an applet of my own to connect this feed to my phone. I chose the if “this” (trigger) to be Adafruit IO, and the “then that” (action) to be an SMS text message. On the Adafruit side, I selected to monitor the feed “overheard” and the channel “listening”. If “listening” was equal to the data “true” then send a text message. The UX of IFTTT made it simple to connect the two platforms together.


Step 1 in IFTTT applet, find the “This” platform Adafruit
Monitor the channel listening – overheard. If the channel has the value true, send an SMS trigger
Message that will be sent in the text message.


I started testing my application with all of the parts now connected. At first, I was not receiving text messages. This was because I was sending Adafruit a boolean value and not a string. The “equal to” on the IFTTT side of the platform was comparing the channel value to the string “true”. I changed the value of what I was sending to Adafruit to a string and was able to receive a text message.

Screenshot of receiving multiple notifications in a row from the recursive loop.

Once I received a text message, I ended up receiving six in a row. I realized that the text-to-voice alert that played upon hearing my name was vocalizing my name through the speakers, which the application was then picking up. This created an infinite loop of alerts: “Alert alert, Olivia has been notified that you mentioned her and received a text message”. I attempted to stop the recursive loop by turning off the voice recognition and restarting it, but each time a new voice recognition object is instantiated, the user must explicitly grant microphone permission again. A quick fix, so that I could continue development, was to avoid my name in the text-to-voice alert from my speakers: I chose “O Prior has been notified” rather than using my name, Olivia.

Screenshot of the recursive loop picking up the text to voice message.

For the UX/UI of the application, I chose a simple button. When the application is not actively listening, a button appears that says “Shhhhhhh!”. If the button is clicked, a microphone permissions prompt requests access. Once the application is listening to the room, the entire screen turns black to be “inconspicuous”. The stop button is styled black and only appears when the cursor hovers over it. If the name Olivia is heard in conversation, a .gif file plays showing someone pressing an alert button. The video and message loop twice before returning to a black screen.

Video demo of Eavesdropper


One challenge I faced was attempting to connect two channels to the IFTTT applet. I wanted to additionally send the transcript as data in the SMS notification, but the applet connected to Adafruit only allowed the data of one channel to be used in the SMS. Due to the setup of the applet, I could only compare direct values (such as greater than, is not equal to, etc.). This prevented me from using the transcript channel as a trigger to send the message. Alternatively, I could have set up the applet so that it sent a message any time the transcript channel was updated. With this method, I would have to worry about character length and truncate the message to ensure the data would not exceed the SMS character limit. I did not want to cut the transcript short, so I chose the boolean method. If users want to see what was being said about them, they can investigate the transcript channel and use the time the text message was sent as an indicator of what was being said at that moment.

The other challenge I noted was with the voice-to-text API. I had to use a function that checked many different iterations of “Olivia”, including different capitalizations and punctuation. This was only an issue once, so all of the variations may not be necessary. The function I used would, however, be incredibly useful if this program were adapted to listen for multiple keywords: the program loops through the transcript and checks the strings against a list of words kept in an array. The array can be modified to store whatever the user wants to be notified of in a conversation.
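That loop-and-check approach might look something like this sketch — the watch-list contents are just examples:

```javascript
// Example watch list; swap in whatever words should trigger a notification.
const keywords = ['olivia', 'prior'];

// Returns the first matched keyword, or null if nothing matched.
function matchedKeyword(transcript, words) {
  // Strip punctuation and lowercase so "Olivia!", "OLIVIA," etc. all match.
  const tokens = transcript.toLowerCase().split(/\W+/);
  return words.find((w) => tokens.includes(w.toLowerCase())) || null;
}
```

Tokenizing on non-word characters sidesteps checking each punctuation variant by hand.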

Next steps & conclusion

The next step for this project would be to find a way to use the data from both channels. Customized messages drawn from the triggering conversation would, I think, provide a better experience for the user and stop the application from being redundant.

This applet is an initial exploration into connecting a web application to IFTTT. Adafruit IO is a useful technology for networking Internet of Things (IoT) devices. In another iteration, it would be interesting to see how this applet could be applied to a bespoke IoT device, such as an Arduino voice-to-text sensor. The JavaScript voice-to-text API is still in beta, and even though it is improving, issues such as the repeated permission prompts become a problem if the goal is continuous audio listening in a space. The JavaScript API is not a replacement for tools such as a Google Home or Alexa for this reason.

Overall, IFTTT is an intuitive platform that uses simple, accessible logic to let many people create bespoke triggers and responses. Some limitations may be an issue at a grander scale, such as the SMS limit of 100 messages per month. This web application is a simple demo of what can be achieved, with plenty of potential to be adapted into more bespoke applications.

References & Research 

Configuring IFTTT using Adafruit to SMS

Function for finding multiple strings


ValenClick – Jingpo and April

1st Idea:


Code: https://github.com/jli115/GoogleHome


Design Concept:


This time we considered doing something fun with the IFTTT functions. After researching current projects online, we found it is possible to link Google Home to “this” using Google Assistant, and then, by linking “that” to the webhook service in IFTTT, we can control the output by speaking directly to the Google Home.


The set-up of Google Home:

In “this”:



Webhook Configuration:




The URL should be the IP address+index.php?state={{Text Field}}
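For instance, if the server’s local IP address were 192.168.0.10 (a made-up value for illustration), the full webhook URL would read:

```text
http://192.168.0.10/index.php?state={{Text Field}}
```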

We were inspired by the project “Google Home – Control DIY Devices”, which shows how to control multiple IoT devices with Google Home using PHP. However, considering we were planning to use the Arduino Feather as the output, the configuration might be a little different. We changed the “that” service to Adafruit and expect to control the result using the toggle block in Adafruit IO.


So far, the logic of the data flow we are planning is:

Google Home > Google Assistant > IFTTT > Webhook > PHP > Turn on/off the light


Google Home > Google Assistant > IFTTT > Adafruit Send Data > Adafruit > Arduino Feather Board > Turn on/off the light



1. Arduino code

After we went over the Adafruit IO feed and dashboard basics guides we learnt in class, we all agreed that the most challenging part of this project would be getting the Arduino code working with the Adafruit platform. We found an online resource under the Adafruit Learn section. It covers the basics of using Adafruit IO, showing us how to turn an LED on and off from Adafruit IO using any modern web browser.

2. Google Home

We failed to connect to the Google Home SSID from the school’s Wi-Fi settings. We suspect Google Home cannot connect to public Wi-Fi.

Step 1: Adafruit IO Setup:




Step 2: Arduino Wiring


Step 3: Arduino Setup

Have the Adafruit IO Arduino library installed and open the Arduino example code.

We followed the tutorial step by step. When we compiled the code, it didn’t work, so we kept adding the libraries indicated by the Arduino IDE.



2nd Idea:

Code: https://github.com/jli115/ValenClick

Design Concept:

Considering the timeframe of this homework, we had to pivot to something easier to approach and more manageable. As Valentine’s Day is around the corner, we thought of relating the project to it. We believe that for some people, telling someone they love them can be very hard, but breaking up with someone they no longer feel anything for can be even harder. Hence our project “ValenClick”: users can send their love, or not, to anyone with just one click… in a funny way.


The interface is super clear and simple: users just need to click the right or the left side of the screen to send different emails to their receivers.

IFTTT configuration:

The centre of the image is at about 680px.
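As a hypothetical sketch of the click handling, assuming the 680px centre is what separates a “left” click from a “right” one (names are illustrative):

```javascript
// Assumed horizontal centre of the image, per the note above.
const CENTRE_X = 680;

// Decide which half of the screen a click landed on.
function clickSide(x, centre = CENTRE_X) {
  return x < centre ? 'left' : 'right';
}

// In the p5.js sketch, mouseClicked() could call clickSide(mouseX) and send
// a different value to the Adafruit IO feed for 'left' vs 'right'; the two
// IFTTT applets then turn those values into the two different emails.
```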


Configure the left IFTTT applets:

Configure the right IFTTT applets:

Test:






Code: https://webspace.ocad.ca/~3170557/UbiquitousComputing/Week5/SOS.rar

For this project, I wanted to explore how communication between the Feather ESP32 and p5.js using Adafruit IO can work. I wanted to learn more about the process and see if I could implement it in my other projects.

SOS is a simple device that allows users to contact Omid if they ever need him. The goal of the project is to notify Omid if anyone needs him while he is sitting at his desk, listening to music and working on his own projects, without badly disturbing him. The system involves two different points of interaction. One is the SOS device that Omid installs on his table. This device has an LED, a vibration motor and a button.


The other point of interaction with this system is the website. The website is a simple button that allows you to contact Omid. When the button is pressed, the LED on Omid’s device turns on with a small vibration, indicating someone is looking for Omid.


When Omid notices this LED, he can press the button on his device, which then notifies the user that Omid is on his way.


In addition, using a custom IFTTT applet, whenever a request is sent, a text message is also sent to Omid’s phone in case he is not sitting at his desk.

Setting up the Feather was not very challenging. I used the two examples provided by the Adafruit IO library, Digital In and Digital Out, and mixed the programs together to communicate both ways. I also ran the Analog examples and used the block options in Adafruit IO, which was very fun to do. During my exploration, I noticed that sometimes, because of the message rate limitation in Adafruit IO, messages were not sent from the SOS device, causing some minor errors in the program. But this happened very rarely, so not a lot of problems there. What was troublesome was the delay that occurred when dealing with IFTTT. In some cases the message took more than 3 minutes to reach my phone. I explored IFTTT, set up a bunch of different applets and tested them out, but the same problem was there. I tried designing a circuit for my Ring doorbell, so that every time someone rang the bell an LED would go off, but the delay in the system made it really ineffective.

What I did find useful was how easy it was to log all the activities in my program to a Google Sheet. This would be ideal when you want to track everything that happens in your program but don’t need the data live.



Adafruit IO API Docs – https://io.adafruit.com/api/docs/

Adafruit IO Basics Digital Input Example – https://learn.adafruit.com/adafruit-io-basics-digital-input/overview

Adafruit IO Basics Digital Output Example – https://learn.adafruit.com/adafruit-io-basics-digital-output



Button of Fun




Button of Fun is a p5.js sketch used as an interface with an IFTTT applet, meant to be a creative way to take a break and laugh. The p5.js sketch and the IFTTT applet are both meant to be used as buttons. You can interact with the happy face in the p5.js sketch from the IFTTT applet on your mobile phone. “The Smiley has travelled far from its early 1960s origins, changing like a constantly mutating virus: from early-70s fad to late-80s acid house culture, from millennial txt option and ubiquitous emoticon” (John Savage, 2009, The Guardian). Through Adafruit IO you can communicate from IFTTT to p5.js to create an interaction. Adafruit IO allows us to transfer data and use libraries to interact with the server. The application portrays the relationship between devices and personal customization for an intended interaction.


The most intricate aspect of this project was creating statements to get the Adafruit applet, through IO, to communicate with error-free code in the p5.js sketch. I wanted a better understanding of how to create effects in p5.js through IFTTT. I felt this would give me a strong foundation for connecting simple applets through p5 in the future, by focusing on the connection between IFTTT, IO and p5 rather than on a small aspect of p5 and two applets. As my first experience working with IO, I chose the Adafruit button applet because I wanted to practise incorporating IO more seamlessly and using data in and out through this platform to trigger interactions.


  • jQuery – JavaScript library that can be used to simplify event handling.
  • Adafruit IO – data connection service.
  • p5.js – library for graphics and interactive design.
  • IFTTT – connects applets that work with your services like Twitter, Facebook or smart devices.



The code I expanded on was provided by Nick Puckett and Kate Hartman to connect p5.js to Adafruit IO. Getting the value from the data feed into p5.js initially triggered error after error. Through IO you can push data to communicate with an IFTTT applet to design an interaction. IFTTT (If This Then That) creates a two-part system: if this occurs, then that will happen, and it works with services like Instagram and digital devices in your home like Hue lights. Some of the Adafruit IO documentation was limited; I needed to incorporate jQuery and learned about including a promise. A promise is an object representing the eventual completion or failure of an asynchronous operation. Essentially, a promise is a returned object to which you attach callbacks, instead of passing callbacks into a function. When I passed callbacks into a function I kept getting errors.
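As a sketch, wrapping the request in a promise — so callbacks are attached with .then() rather than passed in — might look like this; the URL and field names are placeholders:

```javascript
// Wrap XMLHttpRequest in a Promise (XMLHttpRequest is browser-only).
function getFeed(url) {
  return new Promise((resolve, reject) => {
    const req = new XMLHttpRequest();
    req.open('GET', url);
    req.onload = () => resolve(JSON.parse(req.responseText));
    req.onerror = () => reject(new Error('request failed'));
    req.send();
  });
}

// Pure helper: pull out the fields the sketch cares about.
function feedSummary(feed) {
  return { name: feed.name, description: feed.description };
}

// Usage: getFeed(feedUrl).then((feed) => console.log(feedSummary(feed)));
```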



I used two Adafruit applets and programmed them. The first applet sends data to the fun button feed; the second sends an alert notification based on the data from p5. Whenever the fun button is pressed a certain number of times, an automated mobile notification is sent that says “You are so fun for creating a fun button, a fun button is a button that is fun, fun, fun, fun”. The second applet initiates this text based on a data parameter; when that parameter is reached, I am sent a notification.


Working with IFTTT, it’s interesting to see the different services you can connect to each other. Exploring triggers, actions and their relationships is a longer process now that I have a basic understanding of how to connect everything. I think there are some other fascinating combinations of this process, such as networked automation. I’m really inspired to use this in future iterations, gathering data from simultaneous places and converting it into one data stream through IFTTT and p5.







IFTTT – Shake, you’re Late!!!

Because announcing your late arrival to those early 8:30 classes is very important.




Shake, I’m late!!! (no censoring needed!)

The main idea: if you’re running late, shake your device to send a message to your colleagues on Slack, instead of typing it out as you run or bustle through your commute. But why shake? I hoped to convey the frantic state of being late through shaking your mobile device, instead of focusing on the screen to type a notification while walking or driving. I equated this experience to rushing to class, with your device either in your hand or in your pocket, shaking as you hastily move. With all that effort to make sure you’re not too late to class, why not let it produce a useful result? 🙂

My goal with this experiment was to see if there can be a more dynamic experience with IFTTT than simply pressing a button or sending a string through p5.js. I realized that if I had opened my scope to using the Adafruit Feather, there would have been other possibilities, but I narrowed it down to what can be used on our mobile devices, as they are always available to us. Of course, the webpage needs to be displayed first for it to work.


This applet sends a message to a Slack channel, or a direct message to anyone in your Slack workspace, by monitoring an Adafruit IO feed. Once the feed reaches a certain threshold, it sends the data through IFTTT; the user can customize their own message beforehand in the applet itself, before turning it on. This can be used to post messages to the user’s Slack workspace, depending on the situation.



I was mostly inspired by the “Shake It” feature from the LINE chat application. It is a popular app in East/Southeast Asia, as opposed to WhatsApp or iMessage. I also thought about how shaking the device has only been used for “sharing” or spreading contacts, and has not been used in other contexts.

Steps to adding a new contact via “Shake It” feature

I began by identifying what type of input I want to implement as a trigger to sendData(). I looked at keyPressed by perhaps creating a specific key combination for a shortcut that would send data somewhere as a notification and I also considered touchMoved for mobile devices. After looking at the examples for deviceShaken, I realized that I can really explore this for the experiment and see how it can effectively be used for an applet.

Using a default sample from the p5 deviceShaken page and creating a button to activate new channel.

I could have easily used a previous channel made from the example in class (the xPos or yPos, or even the message channel) but I wanted to test if merely shaking the device will create a new channel on the Adafruit IO feed itself. When that did not work, I had to initialize a new channel by creating a button on desktop, and once that was clicked, a new channel appeared by the name “colors”, since when the device is shaken, the background colors would change. Once this was done, I was able to see the values on the channel feed, confirming that the deviceShaken trigger was successful.

‘colors’ Channel activated
Feed after deviceShaken

The next thing I had to decide on was the “then that” part, where I tried different search terms, such as “recommend a song”, “recommend random”, “tweet random”, etc. My initial idea involved shaking the device to receive a text message or an email with a song recommendation from Spotify, but that was not an available action on the Spotify service. I ended up suggesting the action to IFTTT; hopefully it can be realized in the future to expand the potential of IFTTT!

Available actions on the Spotify service
Applets I was Interested in

I narrowed my concept down to using the Slack service and decided that the shaking mechanism suits the idea best as it “quickly” notifies someone that you are running late. The Slack service also only provides 1 action under the “then that” section.

Only 1 action available under the Slack service

After finalizing the applet, I tested a few thresholds while monitoring the Adafruit IO feed. The first version of the code increased the value by +5 every time the device shook, which made for a rigorous effort on my part, shaking and checking to make sure I had reached a value of 255. Since I wanted it to be quicker, I tested thresholds of 200 vs 250, and changed the increment from +5 to +10 per shake. For this experiment I ended up with +10 per shake to hasten getting the notification on Slack. I tested this on email first to see if the shaking mechanism could trigger sendData, and it proved successful. The example below was triggered when the values reached 200.
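The accumulate-and-compare logic described above can be sketched as follows — the step and threshold match the final test, and sendData() stands in for the sketch’s own send function:

```javascript
const STEP = 10;       // value added per shake (was +5 in the first version)
const THRESHOLD = 200; // feed value at which the Slack message fires

// Each shake bumps the value, capped at 255.
function nextShakeValue(value) {
  return Math.min(value + STEP, 255);
}

function shouldSend(value) {
  return value >= THRESHOLD;
}

// In the p5.js sketch, deviceShaken() would do roughly:
// value = nextShakeValue(value);
// if (shouldSend(value)) sendData(value);
```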

Email test

When I finally tested on Slack, there was a significant delay before the message posted. While the email was almost instantaneous, it took around 1 minute to see if it had posted. In the applet, I had set the message to post as a private message to Slackbot for testing, and it appeared either after some time or when I “checked” the applet. This was an interesting finding, as the applet would work differently for other people, and could also be affected by the number of people using the same applet on the same Slack workspace… that would be one to test.

Testing on Slack

After testing frequently on Slack, the posting ended up becoming instantaneous, which shows that the applet’s run time fluctuates quite a lot. I also alternated between a channel and a private message to see if there were any differences, but it proved to be quite consistent.

Testing on a channel
Final test on Slack, Notification pops up as values trigger sendData()

Another aspect I wanted to explore was the visualization on the p5 webpage itself and thought to implement the example with the bouncing ball on the p5 example page with some slight color tweaks and changes. However, since it required the use of classes, I decided to go with the example from the Creation and Computation github called Color Shake, and mapped the values from the shaking action to the background color.


I believe there is still a lot of potential in creating IFTTT projects once more actions are added to each service. There is still so much that can be done to enhance the way we use our devices and the applications on our phones, and IFTTT is something that could help realize that. It would also be nice to see other ways in which deviceShaken can be implemented in other projects – it is still rather gimmicky as opposed to contributing to an experience. I can see it being used in multi-sensory experiences and in games, but the sensitivity of the shaking itself is something to consider. Overall, its potential might depend on the limitations of our handheld devices.


  • p5 device shaken
  • p5 mapping
  • p5 device shaken threshold
  • slack applet idea
  • line shake it

High Scores

This exercise was fairly simple, thankfully, as I didn’t have too much time for it what with the thesis deadline on Friday.

The hardest part was coming up with a reason for the data to come from a webpage at all. Often when I think about Adafruit IO, it’s in relation to hardware, gathering passive sensor input, and my first instinct was to do something with this. However, I saw the applet on IFTTT which lets you save records into a Google Sheet, and this intrigued me. I came up with the idea of recording data into a leaderboard for a game.

This was trivial to set up. I didn’t want to reinvent the wheel, so I simply used p5’s snake game tutorial. Everything stayed relatively the same, except that I changed the controls to work with WASD instead of IJKL, and added a congratulatory message for high scores.

function checkGameStatus() {
 if (xCor[xCor.length - 1] > width ||
   xCor[xCor.length - 1] < 0 ||
   yCor[yCor.length - 1] > height ||
   yCor[yCor.length - 1] < 0 ||
   checkSnakeCollision()) {
      var scoreVal = parseInt(scoreElem.html().substring(8));
      scoreElem.html('Game ended! Your score was : ' + scoreVal);
      if (scoreVal >= 10) {
        gratsElem = createDiv('Score = 0');
        gratsElem.position(20, 40);
        gratsElem.id = 'grats';
        gratsElem.style('color', 'white');
        gratsElem.html('Congrats! Your high score has been recorded!');
      }
 }
}
By the way, this felt redundant AF because IFTTT will check the score’s validity too. This could be salvaged if IFTTT let you compare values to other values, instead of forcing you to hard-code them, but whatever. Once the game ends, it sends the score to Adafruit IO, which in turn sends its data to IFTTT. I have two applets recording data there.


The first is that Google Drive applet I mentioned earlier. This one is a bit slow to start up, so I made the second one to test with faster. The conditions are the same for both, though: when a player reaches the “high score” of 10, save the data to the drive and send me a notif.


These work really well.



Future work:

Now that I have this data, it would make sense to add a leaderboard to the game page itself. I started looking into this, but the Google API for fetching data from Sheets is… A Lot. It will take more time/effort than I have to spare. It makes sense tho: that could be sensitive data, and there are a lot of layers of authentication going on. Anyways, this would involve setting up PubNub to take the API, and modifying the p5 code to call PubNub, which would call Google’s Sheets API and retrieve the scores sheet. Then, once the sheet’s data is received, it would be sorted from highest to lowest and displayed on the game upon completion.



Mom finally gets a Call!




I wanted to revisit the notification protocol I used in my Creation and Computation project. That version was rushed: I used code from a third party to get the notification working with IFTTT, and it wasn’t the best implementation, as it lacked flexibility and there was no way to see data logs of when the data was sent. This new way, through AdaIO, is ideal for the implementation of that IoT (Internet of Things) product.

I took this opportunity to rebuild the project with the new notification protocol through AdaIO and IFTTT as a whole. It uses some of the original code for getting the sensor information.

The Process:

In this write-up I will focus mainly on the communication protocol, as the Creation and Computation project already covers the product design and the implementation of the circuit.

This time around I already had a proof of concept of how the project worked, and now it was a matter of getting it to send data via AdaIO.

Before I could tackle that I had to figure out how AdaIO worked and all the libraries involved in IoT. For this, I re-created one of Adafruit’s ESP32 Feather projects: a mailbox that has its flag go up when an email is received in Gmail. This got me familiar with all the libraries, and the sample code for the Feather was very useful to study for setting the Wi-Fi configuration. I got this working with my Gmail account and was soon flooding my phone with IFTTT notifications.

Here is the LINK to the project.

Once I had that working, it was a matter of reverse engineering what I had learnt and getting it to work the other way around. Here another one of the code examples helped: the most basic one, called Adafruit IO Publish.

The basic premise is that I had to set up a function that sends AdaIO a variable once the light-ON statement had triggered; AdaIO would then have IFTTT send the people I had set an email telling them the lamp was ON. This is the same behaviour I had before, but the old version was not saving any data on how many times it was triggered, and had no way of controlling how many people received the email; it was a bit bootstrapped to fit the needs I had at the time.

Bits and Bops:

The code is all on GitHub, but it is good to highlight some of the main parts of the process. These do not have to be done in sequential order to work.

  • Adafruit IO receives data on a certain channel and then triggers an event.
  • Set up the Feather to send that variable to Adafruit IO in a function triggered by an event: a sensor value, remote data, etc. Anything that says "OK, this happened, do that," where "that" sends the trigger.
  • Set up IFTTT to react to the trigger in Adafruit IO, and make sure you have the right settings in the
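The publish step above can be sketched in JavaScript against Adafruit IO's REST API. This is a minimal sketch, assuming the v2 feed-data endpoint; the username, feed name, and key below are placeholders, not real credentials:

```javascript
// Build the HTTP request for publishing one value to an Adafruit IO feed.
// Username, feed key, and AIO key are placeholders for illustration.
function buildPublishRequest(username, feedKey, value, aioKey) {
  return {
    url: "https://io.adafruit.com/api/v2/" + username + "/feeds/" + feedKey + "/data",
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "X-AIO-Key": aioKey,
      },
      body: JSON.stringify({ value: value }),
    },
  };
}

// Called whenever the sensor reports the lamp has switched on; IFTTT
// watches the feed and emails everyone configured in the applet.
function onLampOn() {
  var req = buildPublishRequest("my-user", "lamp", "ON", "my-aio-key");
  fetch(req.url, req.options); // fire-and-forget trigger
}
```

On the Feather itself the same POST happens through the Adafruit IO Arduino library rather than `fetch`, but the shape of the request is the same.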


The project was a very interesting exploration of the Adafruit IO ecosystem and all its different components. There are some limits on the number of data points you can send, but for small projects you are fine. You can send direct email triggers through Adafruit IO, but that is not available to free accounts.

You can also have Adafruit IO send you a notification when a value is received on the trigger feed.


Adafruit External Services: https://io.adafruit.com/imaginere/services

Adafruit Mailbox: https://learn.adafruit.com/gmailbox/3d-printing-the-mailbox

Adafruit IoT Library: https://learn.adafruit.com/category/internet-of-things-iot

Word of the Day

For this project, I wanted to explore fetching data from Adafruit IO to use in a p5.js sketch. While exploring IFTTT I noticed that most of the services in the "that" section were either very restricted in their feature offerings or tied to a particular IoT home device. I decided to try receiving data from IFTTT via my Adafruit IO account.

My project displays the word of the day from Wikipedia's Wiktionary site. The received word is then displayed in a p5.js sketch, showing its definition and

From this project I learned how to use XMLHttpRequest and parse the response data returned from Adafruit IO.

  • Testing saving multiple values

I created a new applet using Gmail and Adafruit IO to collect the email sender and the email subject line. When assembling the data in the "Add ingredient" tab of the applet creation, I realized that I needed to add delimiters to my data so that I could send multiple values in one applet trigger. This is shown in the data below:



Results from testing showing data with a delimiter and data without
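Because one IFTTT ingredient arrives as a single string, a simple join/split around a delimiter recovers the separate values on the receiving side. A sketch, assuming a `|` delimiter that never appears in the data itself:

```javascript
// Pack multiple IFTTT ingredient values into one feed value.
function packValues(sender, subject) {
  return sender + "|" + subject;
}

// Unpack on the receiving side; returns the original fields.
function unpackValues(feedValue) {
  var parts = feedValue.split("|");
  return { sender: parts[0], subject: parts[1] };
}
```

For example, `unpackValues("alice@example.com|Meeting moved")` gives back both the sender and the subject as separate fields.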

  • Sending values back to p5.js script

To get values from Adafruit IO I made a GET XMLHttpRequest() to the following endpoint: https://io.adafruit.com/api/feeds


Note: I had trouble accessing the returned data when trying to pass the incoming data to the reqListener function as a parameter; nothing printed to the console. When I instead referred to the current object as this.responseText, I was able to access the returned data by referring to the JSON keys of each element, i.e. feed.name and feed.description.
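That pattern can be sketched as below, with `this.responseText` read inside the listener rather than passed in. The `name`/`description` fields match what I saw in the response, but treat the exact response shape as an assumption; `AIO_KEY` is a placeholder:

```javascript
// Parse the /api/feeds response and pull out the fields of interest.
function listFeeds(responseText) {
  var feeds = JSON.parse(responseText);
  return feeds.map(function (feed) {
    return feed.name + ": " + feed.description;
  });
}

// Browser-only wiring: inside the load listener, `this` is the XHR object,
// so this.responseText is available without passing anything as a parameter.
if (typeof XMLHttpRequest !== "undefined") {
  var AIO_KEY = "my-aio-key"; // placeholder
  var req = new XMLHttpRequest();
  req.addEventListener("load", function reqListener() {
    console.log(listFeeds(this.responseText));
  });
  req.open("GET", "https://io.adafruit.com/api/feeds?x-aio-key=" + AIO_KEY);
  req.send();
}
```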


The results printed to the console are shown below.


  • To get a specific feed I use the following URL:

var url = "https://io.adafruit.com/api/feeds/emails" + "?x-aio-key=" + AIO_KEY;

The JSON response was then parsed and printed to the screen. Below is the result of the test showing the sender and email subject of the last received email in my school email account.
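The parsing step might look like the sketch below, splitting the feed's most recent value back into sender and subject. The `last_value` field name is an assumption about the response shape; adjust it to whatever the API actually returns:

```javascript
// Given the JSON for a single feed, recover sender and subject from the
// delimited value. "last_value" is an assumed field name for illustration.
function lastEmail(feedJson) {
  var feed = JSON.parse(feedJson);
  var parts = feed.last_value.split("|");
  return { sender: parts[0], subject: parts[1] };
}
```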



Once I had my proof of concept, I switched to my Wikipedia applet that returns the word of the day.



Adafruit IO API Docs : here

XMLHttpRequest : here

Link to code: here

Text Me!



Text Me! is a simple interaction that allows web visitors to send text messages to the website's owner without the use of a phone number.

The interaction starts on a p5-based web page with three input boxes asking for name, phone number, and message.


IFTTT only watches the third input, the "messageIs" feed, to decide when to activate. Name and phone number are included, and captured by Adafruit IO, for follow-up purposes.


When data is received by Adafruit IO, a custom IFTTT applet fires that sends a text message to my phone.
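The sending side can be sketched as three POSTs, one per feed. This assumes the v2 REST endpoint and hypothetical feed names (`nameIs`, `phoneIs`, `messageIs`); username and key are placeholders:

```javascript
var AIO_USER = "my-user";   // placeholder
var AIO_KEY = "my-aio-key"; // placeholder

// Build the data endpoint for a named Adafruit IO feed.
function feedUrl(user, feedKey) {
  return "https://io.adafruit.com/api/v2/" + user + "/feeds/" + feedKey + "/data";
}

// Send one value to a feed.
function sendToFeed(feedKey, value) {
  return fetch(feedUrl(AIO_USER, feedKey), {
    method: "POST",
    headers: { "Content-Type": "application/json", "X-AIO-Key": AIO_KEY },
    body: JSON.stringify({ value: value }),
  });
}

// Wired to the submit button in the p5 sketch: name and phone are captured
// for follow-up, but only messageIs triggers the IFTTT applet.
function submitMessage(name, phone, message) {
  sendToFeed("nameIs", name);
  sendToFeed("phoneIs", phone);
  sendToFeed("messageIs", message);
}
```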



The process of arriving at this was straightforward. After reverse-engineering the demo code provided by Nick Puckett and referring to the p5 Reference, I was able to put together several working inputs. It took a little trial and error to get them arranged nicely on the screen.


Note in the above screenshots that I had the inputs and channels backwards: the intended name channel was taking the message and vice versa. The next step was to label the inputs and test to make sure there were no more mix-ups.

A bit of testing confirmed that all worked well, although there continues to be an unpredictable delay between entering information on the website and receiving the text.


I uploaded the project to my website for ease of access. Going forward I’d like to make it look a little nicer and match the visual design of the rest of my website. It’s also limited by IFTTT’s SMS service to 100 messages per month – which I doubt would ever be a problem.

I envisioned having all the information entered by the user included in the text, but Adafruit's applets don't seem to be able to include multiple feeds in a single applet. So, for now, until I figure out how to write applets from scratch, the information from the website's visitors will be stored on Adafruit IO.

Blow, Wind, Blow: A Windy Tweet Machine

Tyson Moll


I hooked up a Twitter account with p5 and the Weather Underground. It tweets whenever the wind picks up speed, or whenever I have something spicy to say on the account. Whoop de doo!

Follow it at @TorontoWindy on twitter!

Thanks to a handy-dandy example provided by Nick Puckett, our presiding Ubiq prof, we were able to connect p5 / JavaScript to io.adafruit.com, which in turn can be accessed by a website called IFTTT (If-This-Then-That) to perform actions with whatever value you provide it through the p5 environment. IFTTT works in terms of 'applets', which perform the "do this whenever that happens" you dictate to them. It supports a variety of services; I used the Twitter, Weather Underground, and Adafruit functionality.

IO.adafruit.com accomplishes this by capturing data sent to its API, then sharing the information with any integrated services observing the channel the data was sent to. The p5 environment simply has to collect information, then pass it on to IO.adafruit.com.

feeds applets

The first applet allows you to post a Wolfram Alpha response to Twitter using a p5 interface. I crudely merged last week’s example code on retrieving Wolfram Alpha API messages through PubNub with this week’s example code on sending information to io.adafruit.com in order to have the applet post some fun facts about wind on the Twitter account. Essentially, whenever the Wolfram API sends back information to the p5 interface, it triggers the io.adafruit.com function and sends the string to be forwarded to the Twitter account.


The second applet takes wind data from Weather Underground whenever it passes certain thresholds (in KPH) and posts a tweet about it on Twitter. By following the IFTTT site's step-by-step instructions I quickly got the applet running.
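If you were to reproduce the threshold logic on the p5 side instead of in the applet, it amounts to a single mapping from wind speed to message. A sketch, with the KPH cut-offs and wording chosen arbitrarily for illustration:

```javascript
// Map a wind speed in KPH to a tweet, or null when no threshold is crossed.
// The cut-offs and phrasing here are made up, not Weather Underground's.
function windTweet(kph) {
  if (kph >= 60) return "Hold onto your hats: " + kph + " KPH winds in Toronto!";
  if (kph >= 40) return "It's getting blustery: " + kph + " KPH winds in Toronto.";
  if (kph >= 25) return "A stiff breeze today: " + kph + " KPH in Toronto.";
  return null; // below every threshold, stay quiet
}
```

Anything non-null would then be published to the Adafruit IO feed that the Twitter applet watches.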

I never really paid any attention to the wind reports from weather providers, but it seemed cool in concept to be alerted whenever it gets really windy in Toronto. So why not? Having it available as a separate service makes it more noticeable to me, instead of the information being buried among the weather details everyone else wants to know about (e.g. temperature).

It's likely that the IFTTT service could be circumvented altogether with a solid understanding of each of its services' APIs, as I presume many of them are publicly available, but the process of setting this connection up was considerably quick and simple.

The concept of having this access to web-hosted APIs also interests me in terms of public engagement with data and its distribution and uses. Maybe the p5 context could be a corporate sharing site for employee photos; maybe it's a handy way to centralize all your social media sharing activity.