
Workshop Notes #5
Eavesdropper
Ubiquitous Computing
Olivia Prior

Github
Web Application 

Web application “Eavesdropper” home page.

Concept

Eavesdropper is a web application that actively listens for a user’s name spoken in conversation. The application uses a voice-to-text API to transcribe any conversation within listening range. The transcriptions are analyzed, and if the user’s name is said, the application sounds an alert noting that that person is going to be notified through text message. Simultaneously, the snippet of what was being said around the user is saved. The user then receives a text message and can go see what was being said about them.

Objective

Eavesdropper is an exploration into creating an applet on the platform If This Then That (IFTTT) using Adafruit IO. The purpose of IFTTT is to pair a trigger with a response. I wanted to create an accessible and customizable web page that anyone could use as an applet. The JavaScript voice-to-text API analyzes what is being said in the space where the application is open. If the transcribed text contains the name, the page sends two pieces of data to Adafruit IO: the snippet of conversation containing the user’s name, and a “true” value. IFTTT is linked with Adafruit IO; if the channel data matches “true”, the applet sends a text message to the associated user letting them know that someone is mentioning them in conversation. The web application simultaneously uses the text-to-voice API to announce a message to the party that set off the trigger. This applet is simple to set up, allowing anyone to create a transcription analyzer that can notify them of anything they wish.

Process

Building upon my previous voice-to-text “DIY Siri” project, I wanted to play with the concept “what if my computer could notify me if it heard something specific?”. I initially thought it would be interesting to build directly off of the Wolfram Alpha API from the DIY Siri project and be notified if something specific was searched. From there I decided to isolate the working parts and start with the concept “the application hears a specific word, the user gets a notification”. I chose to use names as a trigger because they are rare enough that the trigger would not fire frequently. This is important because both IFTTT and Adafruit IO have data sending and receiving limits: IFTTT allows sending up to 100 text messages a month, and Adafruit IO allows updating channels 30 times a minute.

I started off by using my existing code from DIY Siri and removing the PubNub server integration. I then changed the code to analyze the transcripts of what was being said: if my name was mentioned, log that information.

Iterating through the transcript to detect if the string “Olivia” was picked up
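A minimal sketch of that check, assuming the browser’s Web Speech API (Chrome exposes it as webkitSpeechRecognition); the variable names here are stand-ins, not the exact code in the screenshot:

// Listen continuously and scan each new transcript for the name.
const recognition = new webkitSpeechRecognition();
recognition.continuous = true;
recognition.interimResults = false;

recognition.onresult = (event) => {
  for (let i = event.resultIndex; i < event.results.length; i++) {
    const transcript = event.results[i][0].transcript;
    if (transcript.includes("Olivia")) {
      console.log("Name overheard:", transcript);
    }
  }
};

recognition.start();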

My next step was to connect my Adafruit IO channel to the page. I created a new feed titled “overheard” with two channels: listening and transcripts. Listening would indicate whether or not my name was overheard, and transcripts would save whatever was being said about me.

After creating those two channels, I connected my voice-to-text API to Adafruit to see if I could save the value “true” and the transcript of the conversation. I tested with “if my name is included in this transcript, send the data to Adafruit”. This was successful.
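Assuming Adafruit IO’s HTTP API (the credentials below are placeholders, and in Adafruit IO’s own terms the two “channels” are feeds addressed as group.feed), the test looked roughly like this:

// Placeholders for the Adafruit IO account credentials.
const AIO_USERNAME = "your-username";
const AIO_KEY = "your-aio-key";

// POST one value to an Adafruit IO feed.
function sendToAdafruit(feedKey, value) {
  return fetch(`https://io.adafruit.com/api/v2/${AIO_USERNAME}/feeds/${feedKey}/data`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-AIO-Key": AIO_KEY,
    },
    body: JSON.stringify({ value: value }),
  });
}

// If the name was heard, update both channels.
sendToAdafruit("overheard.listening", true);
sendToAdafruit("overheard.transcripts", transcript);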

Following Adafruit’s guidance, I started to create an applet of my own to connect this feed to my phone. I chose the “this” (trigger) to be Adafruit IO, and the “then that” (action) to be an SMS text message. On the Adafruit side, I selected the feed “overheard” and the channel “listening” to monitor. If “listening” was equal to the data “true”, then send a text message. The UX of IFTTT made it simple to connect the two platforms together.

 

Step 1 in IFTTT applet, find the “This” platform Adafruit
Monitor the channel listening – overheard. If the channel has the value true, send an SMS trigger
Message that will be sent in the text message.

 

I started testing my application with all of the parts now connected. At first, I was not receiving text messages. This was because I was sending Adafruit a boolean value and not a string: the “equal to” on the IFTTT side of the platform was comparing the channel value to the string “true”. I changed the value I was sending to Adafruit to a string and was able to receive a text message.
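With the hypothetical sendToAdafruit() helper sketched earlier, the fix is one line:

// Before: IFTTT’s “equal to” comparison never matched the raw boolean.
sendToAdafruit("overheard.listening", true);

// After: send the string "true" so it matches the comparison value in IFTTT.
sendToAdafruit("overheard.listening", "true");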

Screenshot of receiving multiple notifications in a row from the recursive loop.

Once I received one text message, I ended up receiving six in a row. I realized that the text-to-voice alert that played upon hearing my name was vocalizing my name out of the speakers, which my application was then picking up. This created an infinite loop of alerts: “Alert alert, Olivia has been notified that you mentioned her and received a text message”. I attempted to stop the recursive loop by turning off the voice recognition and restarting it, but each time a new voice recognition object is instantiated, the user has to explicitly grant microphone permission again. A quick fix, so that I could continue development, was to not use my name in the text-to-voice alert from my speakers: I chose “O Prior has been notified” rather than my name, Olivia.

Screenshot of the recursive loop picking up the text to voice message.
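The spoken alert itself can come from the browser’s speechSynthesis API; a sketch of the workaround, with the trigger name swapped out of the spoken string:

// Announce the alert without saying "Olivia", so the recognizer
// does not re-trigger on the application's own speaker output.
function announceAlert() {
  const msg = new SpeechSynthesisUtterance(
    "Alert alert. O Prior has been notified that you mentioned her."
  );
  window.speechSynthesis.speak(msg);
}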

For the UX/UI of the application, I chose to use a simple button. When the application is not actively listening, a button appears that says “Shhhhhhh!”. If the button is clicked, a microphone permissions prompt requests access. Once the application is listening to the room, the entire screen turns black to be “inconspicuous”. The stop button is styled black and appears when the cursor hovers over the element. If the name Olivia is heard in conversation, a .gif file plays showing someone pressing an alert button. The video and message loop twice before returning to a black screen.
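A rough sketch of those UI states (the element id and the recognition object are stand-ins for whatever the page actually uses):

// The "Shhhhhhh!" button starts listening and turns the page "inconspicuous".
const listenButton = document.getElementById("shhh-button");

listenButton.addEventListener("click", () => {
  recognition.start();                      // triggers the microphone permissions prompt
  document.body.style.background = "black"; // black screen while listening
  listenButton.hidden = true;               // the stop button appears only on hover
});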

Video demo of Eavesdropper

Challenges

One challenge I faced was attempting to connect two channels to the IFTTT applet. I wanted to additionally send the transcript as data in the SMS notification, but the applet connected to Adafruit only allowed the data of one channel to be used in the SMS. Due to the setup of the applet, I could only compare direct values (such as greater than, is not equal to, etc.). This kept me from using the transcript channel as a trigger to send the message. Alternatively, I could have set up the applet so that it sent a message any time the transcript channel was updated. With this method, I would have to be concerned with character length and substring the message to ensure that the data would not exceed the character limit for the SMS. I did not want to cut the transcript short, so I chose the boolean method. If users want to see what was being said about them, they can investigate the transcript channel and use the time the text message was sent as an indicator of what was being said about them at that moment.

The other challenge I noted was with the voice-to-text API. I had to use a function that checked many different iterations of “Olivia”: all the different capitalizations, with different punctuation. This was only an issue once, so all of the iterations may not be necessary, but the function I used is incredibly useful if this program were adapted to listen for multiple keywords. The program loops through the transcript and checks the strings against a list of words kept in an array. The array can be modified to store whatever the user wants to be notified of in a conversation.
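A sketch of that lookup; the keyword array is a stand-in and can hold whatever words the user cares about:

// Variants are listed explicitly because the transcript is split into
// word tokens, so a token like "Olivia." will not equal "Olivia".
const keywords = ["Olivia", "olivia", "Olivia.", "Olivia,", "Olivia?"];

// Returns true if any token in the transcript matches a keyword.
function containsKeyword(transcript) {
  return transcript.split(" ").some((word) => keywords.includes(word));
}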

Next steps & conclusion

The next step for this project would be to find a way to use the data from both channels. Having customized messages drawn from the triggering conversation would, I think, provide a better experience for the user and stop the application from being redundant.

This applet is an initial exploration into connecting a web application to IFTTT. Adafruit IO is a useful technology for networking Internet of Things (IoT) devices. In another iteration, it would be interesting to see how this applet could be applied to a bespoke IoT device, such as an Arduino voice-to-text sensor. The JavaScript voice-to-text API is still in beta, and even though its development is improving, issues such as the frequent permission prompts become a problem if the desired goal is continuous audio listening in a space. For this reason, the JavaScript API is not a replacement for tools such as a Google Home or Alexa.

Overall, IFTTT is an intuitive platform built on simple and accessible logic that allows many people to create bespoke triggers and responses. Some limitations may be an issue if one were to run an applet on a grander scale, such as the SMS limit of 100 messages per month. This web application is a simple demo of what can be achieved, with lots of potential to adapt it to more bespoke applications.

References & Research 

Configuring IFTTT using Adafruit to SMS

Function for finding multiple strings

 

High Scores

This exercise was fairly simple, thankfully, as I didn’t have too much time for it what with the thesis deadline on Friday.

The hardest part was coming up with a reason for the data to come from a webpage at all. Often when I think about Adafruit IO, it’s in relation to hardware, gathering passive sensor input, and my first instinct was to do something with this. However, I saw the applet on IFTTT which allowed you to save records into a Google Sheet, and this intrigued me. I came up with the idea of recording data into a leaderboard for a game.

This was trivial to set up. I didn’t want to reinvent the wheel, so I simply used p5’s snake game tutorial. Everything stayed relatively the same, except that I changed the controls to work with WASD instead of IJKL, and added a congratulatory message for high scores.

function checkGameStatus() {
  // End the game when the snake leaves the canvas or collides with itself,
  // then report the final score to Adafruit IO via sendData().
  if (xCor[xCor.length - 1] > width ||
      xCor[xCor.length - 1] < 0 ||
      yCor[yCor.length - 1] > height ||
      yCor[yCor.length - 1] < 0 ||
      checkSnakeCollision()) {
    noLoop();
    // The score element reads "Score = N"; strip the label to get the number.
    var scoreVal = parseInt(scoreElem.html().substring(8));
    scoreElem.html('Game ended! Your score was : ' + scoreVal);
    if (scoreVal >= 10) {
      gratsElem = createDiv('Congrats! Your high score has been recorded!');
      gratsElem.position(20, 40);
      gratsElem.id('grats'); // p5 sets the element id via a method, not a property
      gratsElem.style('color', 'white');
    }
    sendData(scoreVal);
  }
}

By the way, this felt redundant AF because IFTTT will check for the score’s validity too. This could be salvaged if IFTTT let you compare values to other values, instead of forcing you to hard-code them, but whatever. Once the game ends, it sends the score to Adafruit IO, which in turn sends its data to IFTTT. I have two applets recording data there.
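sendData() isn’t part of the p5 tutorial; here’s a hedged sketch of what it could look like against Adafruit IO’s HTTP API (the username, key, and feed name are placeholders):

// Hypothetical sendData(): POSTs the final score to an Adafruit IO feed.
function sendData(scoreVal) {
  fetch('https://io.adafruit.com/api/v2/AIO_USERNAME/feeds/snake-scores/data', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-AIO-Key': 'AIO_KEY'
    },
    body: JSON.stringify({ value: scoreVal })
  });
}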

two-applets

The first is the Google Drive applet I mentioned earlier. This one is a bit slow to start up, so I made the second one to test with faster. The conditions are the same for them both, though: when a player reaches the “high score” of 10, save the data to the Drive and send me a notif.

highschore-notif

These work really well.

screenshot_20190212-003405_ifttt

scores

Future work:

Now that I have this data, it would make sense to add a leaderboard to the game page itself. I started looking into this, but the Google API for fetching data from Sheets is… A Lot. It will take more time/effort than I have to spare for this. It makes sense tho: that could be sensitive data, and there are a lot of layers of authentication going on. Anyways, this would involve getting PubNub set up to take the API, and modifying the p5 code to make a call to PubNub, which would call Google’s Sheets API and retrieve the scores sheet. Then, once the sheet’s data is received, it would be sorted from highest to lowest and displayed on the game upon game completion.

LINK

CODE

CountrySize

Concept:

I am sure you know the biggest country in the world. I think you also know the top 5 biggest countries, but I doubt you know the top 15 largest countries. Which country is bigger, and how much bigger? When a question is asked, Wolfram Alpha’s response is displayed.

People are bored of numbers and sentences. Instead of representing numbers, represent through aesthetic visuals. Instead of designing for affect-as-information, design for affect-as-interaction. Our assignment is to explore a new way to ask questions and get the results by using a specific type of data.

I changed the way the question is asked. Instead of asking or typing the whole question (What is the size of Canada?), you just type the country name.

screen-shot-2019-02-05-at-12-01-46-am

Instead of giving you the answer (The total area of Canada is about 3.86 million square miles), I wanted to use circles of different sizes to represent the size of each country.

screen-shot-2019-02-05-at-12-11-47-am

Process book:

screen-shot-2019-02-05-at-12-55-23-am

Above is my first successful attempt at splitting Wolfram Alpha’s response and displaying it on a p5 canvas. The response is split into an array so that on every loop, the values are drawn separately, side by side.

In order to get some visualization, I needed to extract the number out of the answer (text). I found out these answers all have the same format: “The total area of (country name) is about (certain number) square miles.” That is to say, I need the number right after “about”.

I used a forEach statement to find “about” in the array. The forEach() method executes a provided function once for each array element.
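A sketch of that scan, assuming the answer has already been split into an array of words:

// Scan the answer for "about" and grab the number that follows it.
// 'words' is assumed to be the Wolfram Alpha answer split on spaces.
let areaValue;
words.forEach((word, index) => {
  if (word === "about") {
    areaValue = parseFloat(words[index + 1].replace(/,/g, ""));
  }
});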

screen-shot-2019-02-05-at-1-00-10-am

Challenge:

I still cannot find a way to solve this problem. What my code does is extract the number after “about” and make a circle based on that number. The thing is, the answers have different units, and the range of those numbers is too big.

For example:

“The total area of Canada is about 3.86 million square miles.”

“The total area of Thailand is about 513,120 square miles.”

So if I just used the number after “about”, Thailand would appear larger than Canada, and the circle could not fit in the canvas.

I also tried to ask the question in a different way, such as “The size of Canada in million square miles”. They don’t have an answer for that question.
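One possible way out, which I have not implemented, would be to check for the word “million” after the number and normalize both formats before mapping the value to a circle:

// Untested sketch: normalize both answer formats to plain square miles,
// then take the square root so the circle's area tracks the country's area.
function areaFromWords(words) {
  let value = 0;
  words.forEach((word, index) => {
    if (word === "about") {
      value = parseFloat(words[index + 1].replace(/,/g, ""));
      if (words[index + 2] === "million") {
        value *= 1e6; // "3.86 million" -> 3,860,000
      }
    }
  });
  return value;
}

// Map an area to a circle diameter that fits the canvas, relative to
// the largest area you expect to draw.
function diameterFor(area, maxArea, maxDiameter) {
  return Math.sqrt(area / maxArea) * maxDiameter;
}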

 

Colour Radio

 

Untitled-1

 

 

img_3312

 

Overview:

Colour Radio is a communications project that conceptualizes an interaction through sensors and analog radio communication. The project uses a colour sensor to pick up an interactive transition through the colours in your surroundings. With NeoPixel LED lights we have created a visual output that acts as an abstract representation of environments. Our intention was to create a beautiful visual that represents analog communication through XBee components.

 

Ideation:

Phase 1 of brainstorming incorporated the knudgeables: we explored the idea of creating a wearable built around an interaction that let the wearer express their state of mental health.

Phase 2 was discussing other forms of communication with the XBee radios. At this phase of discovery, we decided that we wanted to delve further into an aspect of networking. We wanted to learn how to handshake between the XBees, and we formed our project around the mode of learning we wanted to explore.

image-from-ios

img_3278

Phase 3 was when we decided that we wanted to make a representation of radio communication.
We discussed analog technology and older electrical components like the vacuum tubes in televisions. While contemplating the circuit and its components, we decided to use a sensor that could grasp aspects of the surroundings without obvious communication like words, letters, or noises. We discussed infrared sensors and other distance sensors, but decided to incorporate a colour sensor into the project and use LEDs as an output of this data.

photo-on-2019-01-25-at-7-26-pm
screen-shot-2019-01-29-at-11-18-12-am

Process:

We initiated the project’s first steps by calibrating the colour sensor. This was fun for the group because it was our first time working with colour sensors. The sensor works by shining a white light at an object and then recording the reflected colour; it can also record the intensity of the reflection. Through red, green, and blue colour filters, the photodiode converts the amount of light to current. A converter then turns the current into a voltage which our Arduino can read. We weren’t able to get good absolute RGB readings from the sensor. What we were able to get is what I would call relative RGB: the colour it is sensing is clearly visible as one value spiking, but it’s not within the 0-255 range. So a red object might cause the red value to rise by 30, not to 255. This is clearly visible when connecting an RGB LED: it will change colour, but the change might not be visible. Another issue is that blue LEDs generally have a higher energy output, so blue might overpower the other channels.

While exploring the networking of XBees with sensors, and the variations of input and output across various nodes, we had some issues transferring the RGB values. We would get some data readings to communicate as an output through the XBees, but the communication would continually stop working.
screen-shot-2019-01-30-at-9-26-54-am
Our goal was to establish a call/response handshake between two XBee radios; however, we had many issues executing this. We set up a transmitting XBee (A) connected to an Arduino that would send RGB values, and a receiving XBee (B) that would get those values and use them to light a NeoPixel LED strip. We kept getting strange values in the buffer on the receiving end, and data would sometimes be sent and sometimes not. We weren’t able to get that working; however, we were able to get the handshake to work when the receiving XBee (B) was connected to a laptop directly, without an Arduino. With this setup we were able to initiate the handshake call/response action, whereby XBee (A) would send “hello” until we sent a character over the serial monitor, and it would respond with the RGB reading from the colour sensor. The image below shows a screenshot of this (the values returned are shown twice because we were printing out both Serial and Serial1 values).

screen-shot-2019-01-30-at-9-21-23-am

In the end we modified our code so that XBee (A), transmitting the colour sensor values, and the NeoPixel strip ran on the same breadboard, so that the strip would change colours according to the colour sensor reading. The receiving XBee (B) was connected to Processing, and the values received were used to change the background of a sketch; we envisioned this as an additional representation of the colour sensor’s readings.

Conclusion :

For a future iteration, we would create a mesh network with colour sensors and more intricate variances in the colour changes of the LEDs, like initiating different patterns, or perhaps a blending of colours depending on what the light sensor picks up. Exploring communication with the XBees further and advancing the foundation of this project, we would like to add colour temperature sensors alongside the basic colour sensors.

 

References:

http://www.robotsforfun.com/webpages/colorsensor.html

How to use the Color Sensor with Arduino board (TCS3200 & TCS3210)

https://learn.adafruit.com/adafruit-neopixel-uberguide/arduino-library-use

 

A “Silent” Alarm / An XBee x Arduino Process Journal

silentalarm_blog

This little device is a simple desk companion that acts as a visual silent alarm for the hearing impaired. In theory it would react to a doorbell and spin, indicating that someone was at the door or that mail had been left. Alternatively, it also acts as a distracting toy that reacts to notifications.

XBee Chat Exercise:

During the chat exercise, my partner and I didn’t have much trouble configuring our XBees and were soon able to send messages back and forth. However, we didn’t really do much chatting, as we noticed that we weren’t able to send long messages because packets would get dropped. One of us would send “Hello” and the other would receive “Hell”. It was interesting to see how this led to funny miscommunications. This led me to the conclusion that the XBee wasn’t really a communication device in the traditional sense of the word; I would have to think of communication beyond words. We found that the most effective way was to send characters.

XBee Radio Receiver / Transmitter:

Tip: Use the edge of the breakout board, not the XBee itself, to determine where to put the power, TX, and RX pins for the XBee connection.

While testing the XBee using the Arduino Physical Pixel example, I was able to control an LED bulb using the serial monitor. However, when trying to control the Arduino x XBee setup using another XBee, we ran into some issues: we were only able to achieve one-way communication. My LED would light up or go off when receiving signals from my partner’s XBee, but I was not able to light their LED using my radio. This was also happening with another group.

Troubleshooting (Tested with 3 different partners as the one-way communication issue occurred each time I tried to connect to a new XBee)

We noticed that:

  1. The radios would work when both were configured on the same laptop.
  2. (Best troubleshooting method) The radios would work after sending messages back and forth over chat.
  3. The radios would work when brought closer together.

XBee Metronome Receiver

For my device’s design, I was thinking of the instructions we got in class: to think of an interaction that would happen around 20 or more other devices. Initially, I had wanted to use sound as an output, but I figured something more visual would be a better choice, as it would still be noticeable around other devices reacting to the metronome. Removing sound from the equation and focusing mainly on visuals made me think of the hearing impaired, and I thought, “What if you could have a tiny visual desk alarm that spins when someone rings your doorbell?”. I also wanted to learn how to work with a servo, as I had never used one before.

Issues:

When conceptualizing my design, I had envisioned a rotating cylinder or disk inspired by spinning tops and wheels. However, I realized that the micro-servo can only make 180-degree rotations, not the 360-degree rotations I had imagined. I didn’t have the knowledge to hack my servo or the time to get another one, so I improvised the rotations to still create an optical illusion effect. Below are some images from my prototyping.

Future expansion:

I would like to continue to explore making desktop companions thinking along the themes of accessibility and self-care toys. I’d also like to work with more servos.

Github link to code : here

Papillon

Project Name: Papillon

Project Members: Omid Ettehadi

Github: https://github.com/Omid-Ettehadi/Papillon

Process Journal:

The XBee Chat Exercise was a fascinating opportunity to explore the XBee devices and to test out the capabilities of the XBee and the CoolTerm Software. Wireless communication in its simplest form can lead to a much simpler connection between different devices in a project.

While doing the exercise, I could not stop thinking about how this system could replace the complicated, internet-based systems I had used for my previous projects, and how much more reliable things would become with a much shorter lag time.

To make a metronome receiver device, I decided to start with a simple example first, to test the limits of the XBee and find out the range and the kind of packaging I could use in my project. I ran the “Physical Pixel” example and placed the device in different locations to see how different materials affect the range.

For my next step, I changed the Serial in the example to Serial1 and used another XBee with the CoolTerm software to send data to the device.

mvimg_20190113_132413

For this project, I wanted to play with something that I hadn’t had the chance to play with before. I decided to create a simple piano by playing different frequencies for each note and allowing the metronome device to set the tempo for the music.

To create the different frequencies, I used the “Melody” example: I turned the speaker on, waited for the period of the note, then turned the speaker off and waited for the same period again. I repeated this procedure 20 times so that the note is created. For the music, I chose the soundtrack of the movie Papillon and wrote down the notes for the song in an array. Every time the device receives an H or an L, it plays a note from the array.

To add more to the project, I added a servo motor installed on a drum piece, so that after every 4 notes, the drum will make a sound as well, keeping the beat of the music.

mvimg_20190114_162206mvimg_20190114_163832

mvimg_20190115_092041

Component List:

components-list

Final Schematic:

circuit_bb

 

 

 

Process Journal #1

1. Chat

Worked mostly as expected. A loose USB cable caused minor problems. We possibly received interference from other radios communicating, as we would occasionally receive input neither of us had sent, despite having set each other as addressees.

2. Metronome

I wanted to explore working with a servo for this. Plugging the servo in and seeing the regular beat of its movement reminded me of the dull rhythm of working out. At first, I was just using the servo arm to push the cutout up and down, but found I could give it more life by tying a string to the cutout’s wrist and attaching it to the servo arm. This way, as the servo arm moves away to push the cutout up, it pulls the cutout’s arm in. The string relaxes as the servo arm resets, and lets the cutout “relax” its own arm.

The metronome itself was easy enough to coordinate with, except that I was too far to receive its signal the first time the attempt was made. There is also an obvious delay between the transmitting radio and the reception of its data on the Arduino, noticeable when it takes a few beats for the speed to change after adjusting the potentiometer.

Process Journal 1- Jingpo

XBee Chat Exercise:

Step 1: Download the software (CoolTerm) and the driver (FTDI USB).

Step 2: Change the configuration of the radio in command mode.

We need to configure the XBees to enable communication. These radios are configured with AT commands, so we learnt some basic AT commands from the AT command set:

ATRE = factory reset – this will wipe any previous settings
ATMY = my address – the label on your radio
ATDL = destination low – the label on your partner’s radio
ATID = pan ID – the agreed upon “channel” both radios are using to communicate
ATWR = write to firmware – this will make sure the configurations are saved even when the radio loses power
ATCN = end command mode

Open CoolTerm:

Go to Options → Serial Port

Go to Terminal → Select Local Echo

Step 3: Let’s chat!

screen-shot-2019-01-14-at-3-50-36-pm

img_0164


Metronome Receiver Device:

Arduino Code:

Use “Serial” to test the code. Change “Serial” to “Serial1” when the Arduino is connected to the XBee, and send the “H” and “L” commands via the serial monitor.

The sample code is from the Arduino example Physical Pixel, which demonstrates using the Arduino board to receive data from the computer.

screen-shot-2019-01-14-at-3-57-39-pm

Add a servo and change the Arduino code:

screen-shot-2019-01-14-at-4-00-09-pm

img_2832-2

Fabrication:

img_3528

I wanted to have a fan connected to the servo, so I went to the dollar store and got this candy toy for 2 dollars, along with an unexpected free motor, a button, and a battery.

img_3289img_2411

img_2597