Text Me!

nickalexander.ca/getintouch

GitHub

docu3
Text Me! is a simple interaction that allows web visitors to send text messages to the website’s owner without the use of a phone number.

The interaction starts on a p5-based web page. It uses three input boxes – asking for name, phone number, and message.

docu6

IFTTT watches only the third input, the “messageIs” feed, in order to trigger. The name and phone number are also captured by Adafruit IO for purposes of follow-up.
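For illustration, the web side can be sketched roughly like this. The feed names for name and phone (“nameIs”, “phoneIs”), the account placeholders, and the layout are assumptions rather than the exact project code; the sketch simply shows three p5 inputs posting their values to Adafruit IO feeds over the REST API.

// Hypothetical p5 sketch: three inputs, each posted to its own Adafruit IO feed.
let nameInput, phoneInput, messageInput, sendButton;

const AIO_USER = "your-adafruit-username"; // placeholder
const AIO_KEY = "your-aio-key";            // placeholder

function setup() {
  noCanvas();
  nameInput = createInput("");
  nameInput.attribute("placeholder", "Name");
  phoneInput = createInput("");
  phoneInput.attribute("placeholder", "Phone number");
  messageInput = createInput("");
  messageInput.attribute("placeholder", "Message");
  sendButton = createButton("Send");
  sendButton.mousePressed(sendToAdafruit);
}

// Post each value to its own feed; IFTTT only watches "messageIs".
function sendToAdafruit() {
  postToFeed("nameIs", nameInput.value());
  postToFeed("phoneIs", phoneInput.value());
  postToFeed("messageIs", messageInput.value());
}

function postToFeed(feed, value) {
  fetch("https://io.adafruit.com/api/v2/" + AIO_USER + "/feeds/" + feed + "/data", {
    method: "POST",
    headers: { "Content-Type": "application/json", "X-AIO-Key": AIO_KEY },
    body: JSON.stringify({ value: value })
  });
}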

docu7

When data is received by Adafruit IO, a custom IFTTT applet fires that sends a text message to my phone.

docu4

docu5

The process of arriving at this was straightforward. After reverse-engineering the demo code provided by Nick Puckett and referring to the p5 Reference, I was able to put together several working inputs. It took a little trial and error to get them arranged nicely on the screen.

docu2

docu1

Note in the above screenshots that I had the inputs and channels backwards – the intended name channel was taking the message and vice versa. The next step was to put a label on the inputs and test to make sure there were no more mix-ups.

A bit of testing confirmed that everything worked, although there is a persistent, unpredictable delay between entering information on the website and receiving the text.

docu8

I uploaded the project to my website for ease of access. Going forward I’d like to make it look a little nicer and match the visual design of the rest of my website. It’s also limited by IFTTT’s SMS service to 100 messages per month – which I doubt would ever be a problem.

I envisioned having all the information entered by the user included in the text, but Adafruit’s IFTTT applets don’t seem to be able to include multiple feeds in a single applet. So, for now, until I figure out how to write applets from scratch, the information visitors enter on the website will be stored on Adafruit IO.

Art is the Answer

docufinal

GitHub

Art is the Answer is an exploration of the way we interpret information. When a question is asked, Wolfram Alpha’s response is displayed as a procedurally generated artistic abstraction. Art is the Answer disrupts the succinct question/answer call/response of Wolfram Alpha’s usual use and asks the user to interpret what the answer to their question might be, and how the shapes displayed affect that interpretation.

Concept

I started, as I often do when I encounter new technology, by examining each component and breaking them down into their simplest terms in an attempt to develop a mental model. I took a close look at the demo code and copied it over into my sketch. Having not worked with PubNub in any depth before (in the Networking assignment of Creation & Computation I spent most of the time working on physical builds), I knew I didn’t want to get too complicated with regards to APIs – I left PubNub and Wolfram Alpha alone after connecting them. Since we were working with p5, a tool designed originally for making art, I decided that I would break down the response from Wolfram Alpha into data that p5 could use for art’s sake.

I was sure there must be a way to turn letters into numbers – after all, in code, everything is a number eventually. I did a little research into letter/number encoding. The first hit I got was for A1Z26 code, a simple cipher in which A=1 and Z=26. I doubted this would suit. Eventually I remembered ASCII and hexadecimal code, and while I was researching these I came across the function charCodeAt(). This function returns the UTF-16 code at the referenced location. A few tests showed that this would be perfect. It was time to get into arrays and for loops.
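For example, calling charCodeAt() on a short string in the p5 (JavaScript) console returns exactly the kind of numbers I needed:

let s = "Art";
print(s.charCodeAt(0)); // 65  ('A')
print(s.charCodeAt(1)); // 114 ('r')
print(s.charCodeAt(2)); // 116 ('t')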

I had never worked with arrays with any depth before, nor for loops outside of simple limited mathematical situations. I knew that in order to get this to work I would have to store every letter of Wolfram Alpha’s response in an array, and run that array as a for loop so the variables could be used to draw shapes.

Process Journal

docu1

Above is my first successful attempt at translating the letters of Wolfram Alpha’s response into shapes on a p5 canvas. Printed in the console is the UTF-16 value of each letter. Those values are stored in an array, which is read on every pass of a for loop, and the values are used as the fill, size, and coordinates of the shapes.

The problem was that since every value was identical within each loop, I always got a print out of a series of greyscale shapes along a diagonal.

After consulting with some classmates, I adjusted the parameters of the shapes so that they would not all be identical: each shape would use the character code of the letter at its step ( p[i] ) as well as the character codes of the letters one or two places ahead of its step ( p[i+1] and p[i+2] ).
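A minimal sketch of this stage looks something like the following, with the Wolfram Alpha response stubbed in as a plain string; the variable names, canvas size, and shape parameters here are illustrative rather than the exact project code.

// Stub: in the real sketch the answer arrives from Wolfram Alpha via PubNub.
let answer = "The speed of light is about 299792458 metres per second";
let p = []; // UTF-16 code of each character

function setup() {
  createCanvas(900, 900);
  background(255);
  noStroke();

  // Store every character code in the array and print it to the console.
  for (let i = 0; i < answer.length; i++) {
    p[i] = answer.charCodeAt(i);
    print(p[i]);
  }

  // Draw one shape per step, using neighbouring codes so the values differ:
  // p[i] and p[i+1] place the shape, the later codes colour and size it.
  for (let i = 0; i < p.length - 2; i++) {
    fill(p[i] % 256, p[i + 1] % 256, p[i + 2] % 256);
    rect(p[i], p[i + 1], p[i + 2] / 4, p[i + 2] / 4);
  }
}

Because the character codes of ordinary text all fall between roughly 32 and 122, the x and y positions land near the top-left corner of the canvas, which led to the next problem.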

docu2

So now we had variance in shape size and colour, but the numbers being returned from Wolfram Alpha were too small to make a pleasant scatter effect on the canvas – the x and y coordinates were all under 200 on a canvas 900 units wide. I decided that mixing in a few random numbers to serve as multipliers would help get that effect.

I played around with including the random number generator from Wolfram Alpha as well, but that proved to be a little too complicated to work into the existing for loop. It also proved unnecessary: the random() function included in p5 was sufficient.

docu3

Once random numbers were included as multipliers for the values returned from Wolfram Alpha, I had the kind of results you see above. Rather than scattering the shapes around the canvas, though, the multipliers distributed them fairly evenly.

I set the random number to be between 0 and 10, and set translate() to move the origin by this random amount each loop. Thus, the relative point of origin of every shape was changing in a random direction by a random amount each loop. Finally I had the kind of imagery I had imagined.
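An illustrative rewrite of the drawing loop from the sketch above captures this final arrangement; the exact ranges and shapes in the project differ, but the structure is the same.

// Replaces the drawing loop above: random multipliers scale the Wolfram
// values, and translate() shifts the relative origin a little each pass.
for (let i = 0; i < p.length - 2; i++) {
  let r = random(0, 10);        // random multiplier / step size
  translate(r, random(0, 10));  // the point of origin drifts each loop
  fill(p[i] % 256, p[i + 1] % 256, p[i + 2] % 256);
  rect((p[i] * r) % width, (p[i + 1] * r) % height, p[i + 2] / 3, p[i + 2] / 3);
}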

docufinal

Next Steps

This project is mechanically similar to my mindPainter project, and it would be interesting to combine them in some way.

I would like to polish the art that this produces so the shapes it paints are not consistently primitives like rectangles and squares.

I would like to add a mechanism wherein a user can click or press a key to see the answer to their question in text – although, this might undermine what is in my opinion the strongest aspect of the piece: that the user has no way of getting to their answer without interpreting the artwork.

References
Demo code by Nick Puckett & Kate Hartman

p5 Reference by Processing Foundation

MDN Web Tools by Mozilla

ASCII Table by asciitable.com

UTF-16 by fileformat.info

 

The Queue Manager

By Cartsa Antariksa, Nick Alexander, Omid Ettehadi, Olivia Prior

Github: https://github.com/npyalex/Ubiq-Class-Manager

Overview

The Queue Manager is the result of an exploration of connectivity using XBee radios and microcontrollers to organize, store, and utilize inputs from a human user. The Queue Manager envisions a classroom in which every attendee has a node XBee/microcontroller apparatus (which we have called a Student Node, or just Student) and the instructor has a master XBee/microcontroller (the Professor). When they want to speak, a user presses a button on their Student node to reserve a place in the queue. The Professor presses a button to advance the queue. To let a student know it is their turn to speak, an LED will light up in that student’s circuit.

Ideation

We were excited to work with unusual functionalities of the XBee. Looking at the suggestions we decided we wanted to try to combine:

  • an on-the-fly rewrite of the XBee
  • logging and re-using data
  • a handshake between XBees

We considered designing a system of haptic walkie-talkies but decided that would be too close to the existing Nudgeables. We thought of modifying them to have different channels that could be scanned through with a potentiometer, but discarded that idea too. It was this line of thought that brought us to our idea for The Queue Manager.

Initially, our idea was ambitious. We envisioned a system wherein the queue would be managed, would signal with LEDs when you were up next, and give priority to those who had not yet spoken. It would automatically adjust a user’s place in line based on how often they spoke, meaning those who entered the queue less were more likely to be placed highly in it.

We imagined we’d log incoming data on the Professor node, use the XBee rewrite to determine where transmissions from the Professor were going, implement a handshake to confirm with each Student node, and repeat. The idea was that while the Professor node would listen to the entire group, the group would only listen for the Professor, and the Professor would only transmit to one node at a time.

In practice, as we will see, this was needlessly complicated.

img_7606

img_20190122_144247

Process

Step 1: Initiate handshake between professor XBee and student node

Our first step was for the professor XBee to acknowledge the incoming information from any of the student nodes in class. The student XBees all had their ATDL set to “C”, the professor’s address. When a student pressed their button, that XBee would send the professor its ATMY. In theory, the professor XBee would switch its ATDL to the incoming ATMY and send back a signal that would be interpreted as “This student is registered in the class queue.”

We considered pairing the professor XBee with the registered student XBees using XBee I/O, but we needed to manage LEDs on the student XBees. Since the simplest way to manage LEDs would be with Arduino, we decided to eschew using XBee I/O and stick to using an Arduino with every node.

Upon implementing the Arduino rewrite we found a multitude of issues. The first was mixed and interfering data flooding our professor XBee channel. We found that this was a miscommunication of the type of data being sent between the student nodes and the professor: the student nodes were sending characters, but the professor was parsing the incoming messages for integers. Getting this data type protocol correct was challenging because of the XBees’ inconsistent transmitting and receiving. While debugging, it was hard to tell whether the problem was the physical hardware setup, the settings on the XBees, or the code processing the information between the nodes.

When we were able to get clean data sent from the student nodes to the professor node, we attempted to implement the ATDL rewrite for the professor node. This was challenging due to delays and inconsistent rewrites on the Arduino side. Each rewrite took over a second, which was problematic for the workflow of our code. Once again, we were unsure if the cause was the hardware setup, the rewrite library, or the XBee devices. One solution that worked was to move the XBees closer together, which allowed for more consistent sending and receiving of data.

Step 2: Confirm handshake and enter student node in the queue

Once we were able to send consistently clean data to the professor, we wrote each of the student nodes to send its ATMY to the professor upon a button press. We chose to still send this data because it is a unique identifier for the XBee, even though we had decided not to use the Arduino rewrite for the professor node. We took inspiration from the initial Metronome assignment: rather than the professor rewriting its ATDL, the professor would transmit back the ATMY it had just received. This method was more of a call and response than a handshake. The professor would hear “6” coming in and then broadcast “6” back immediately. When a student node processed the incoming data and it matched its own ATMY, an LED would light up.
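A simplified Arduino sketch of a Student node under this call-and-response scheme looks roughly like the following; the pin numbers and the hard-coded ATMY are assumptions, not our exact code.

// Student node: announce our ATMY on a button press, light the LED when
// the professor echoes that ATMY back.
const int buttonPin = 2;
const int ledPin = 4;
const char MY_ID = '6'; // this node's ATMY, hard-coded for the sketch

void setup() {
  Serial.begin(9600); // XBee on the hardware serial port
  pinMode(buttonPin, INPUT_PULLUP);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // On a button press, send this node's ATMY to the professor.
  if (digitalRead(buttonPin) == LOW) {
    Serial.print(MY_ID); // print, not println: a stray newline confuses the receiver
    delay(250);          // crude debounce
  }

  // If the professor broadcasts our ATMY back, it's our turn to speak.
  if (Serial.available() > 0) {
    char incoming = Serial.read();
    digitalWrite(ledPin, incoming == MY_ID ? HIGH : LOW);
  }
}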

We found this step necessary for our next iteration of the project. In our original scope, we had anticipated having another LED that would indicate when the student pressed the button; this second light would turn on to show that the student was now in The Queue.

We found this worked decently but not consistently. Sometimes the student node would send the ATMY twice, causing the LED to flicker rather than light up. Sometimes the student node would also miss the response from the professor node. We attempted to send the ATMY back to the student three times in a row in case their XBee dropped the first call; we found this did not work and was unnecessary. We also tried to have a general message broadcast whenever someone registered in the queue. This was the area where we saw the most timing delays between the XBee radios: one student would press their button and, in theory, all of the lights should turn on, but the lights would turn on arbitrarily depending on when each node received the message. One thing that worked was keeping the XBee radios close to the professor node. Because this was unrealistic for the actual purpose of the project, we decided to move on to the next step.

Step 3: Professor logs a question queue and cycles through it with a button press

The final step to accomplish our minimum viable product was for the professor to log the incoming button presses and store a queue. The professor would then be able to press a button to cycle through the queue, which changed the message the professor node was transmitting. The professor would transmit the ATMY of the next person in the queue, and that person’s node would process the information and turn on its green LED. Our goal for this step was “to only have one LED on at a time”.

We created an array that stored an entry every time a student button was pressed. The professor would then press their button and broadcast the first ATMY in the array, which also shifted all of the data in the array one spot closer to the beginning. Our main issue with this step was figuring out how to transmit the data. As with our initial issues, conflicting data types were not being processed properly and were sending garbage characters between the radios. Once we organized all the information into the same data type, we broadcast the ATMY to the channel. The LED of the student who was first in the queue would light up, but it also flickered. We realized this was because we were sending each command as a println statement rather than a print; the extra newline character was being read and disrupting the LED.
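The professor’s side of this step can be sketched as follows; again the queue size, pins, and debounce delay are assumptions rather than our exact code.

// Professor node: log incoming ATMYs in an array, and on a button press
// broadcast the first one and shift the rest toward the front.
const int buttonPin = 2;
const int QUEUE_SIZE = 20;
char queue[QUEUE_SIZE];
int queueLength = 0;

void setup() {
  Serial.begin(9600); // XBee on the hardware serial port
  pinMode(buttonPin, INPUT_PULLUP);
}

void loop() {
  // Register every incoming ATMY at the end of the queue.
  if (Serial.available() > 0) {
    char id = Serial.read();
    if (queueLength < QUEUE_SIZE) {
      queue[queueLength++] = id;
    }
  }

  // On a button press, broadcast the first ATMY and advance the queue.
  if (digitalRead(buttonPin) == LOW && queueLength > 0) {
    Serial.print(queue[0]); // print, not println, to avoid the stray character
    for (int i = 1; i < queueLength; i++) {
      queue[i - 1] = queue[i]; // shift everyone one spot closer to the front
    }
    queueLength--;
    delay(300); // crude debounce
  }
}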

If the same student appeared in two or more adjacent spots in the array, we included a “quick blink” delay for the LED before transmitting the next ATMY. This gave feedback to the professor and the student; otherwise, it looked like the professor was clicking the button and nothing was changing or happening amongst the student nodes.

We placed our nodes together and created a queue through button presses. The professor node was able to cycle through the queue successfully, and to register more ATMY entries in the array as they came in from the class.

img_20190127_185046

Step 4: Reintroducing the handshake

After finishing our minimum viable product, we wanted to experiment with different mechanisms that would allow a smoother experience with more reliable communication. The most effective approach we found was to create a handshake between the Student and the Teacher, so that every time the Student pressed the key, it would send its ATMY as a message to the Teacher until the Teacher responded with a message confirming the addition to the queue.

Another issue we found was that when the Teacher pressed the key, sending the ATMY of the user first in the queue, it would also act as if it had received a request to add a person to the queue. To get around that, when the Teacher sends data to the students, it waits to hear its own message on the channel and then continues functioning as normal.

In addition, every time a student requests a place in the queue, the Teacher checks whether they are already the last person in the queue; if so, it ignores the request.
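Both refinements amount to small changes in how the Teacher handles incoming data. Sketched against the professor example above (illustrative only; the helper name is mine):

// Tweaks to the professor sketch above:
// 1) skip the echo of the Teacher's own broadcast;
// 2) ignore a request from a student who is already last in the queue.
char lastBroadcast = 0; // the ATMY the Teacher most recently sent, if any

void handleIncoming(char id) {
  if (id != 0 && id == lastBroadcast) {
    lastBroadcast = 0; // our own message coming back: resume as normal
    return;
  }
  bool alreadyLast = (queueLength > 0 && queue[queueLength - 1] == id);
  if (!alreadyLast && queueLength < QUEUE_SIZE) {
    queue[queueLength++] = id; // register the request
  }
}

// ...and when advancing the queue:
//   lastBroadcast = queue[0];
//   Serial.print(queue[0]);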

student_node_diagram_image
Student Node

teacher_node_diagram_bb
Teacher Node

img_7803

Findings

As noted in our process, the hurdles we encountered in developing this project included messy data received by the nodes and the master sending unreadable data. One thing we tried as a way of monitoring the incoming data was adding a “ghost” XBee to the channel. This XBee would listen to the channel and give us a serial monitor of all the data being distributed. Our development was stalled because this device was transmitting mystery data that we frequently mistook for data sent from our own nodes. Once we isolated the issue and continued development without the extra XBee, our team noticed that there was still messy data being sent amongst the XBees. We traced the issue to mismatched data types: because we were using numbers, our data was often confused between integers and characters. We solved this by declaring all of our transmitted and received data as characters, and placing our numbers in quotes so there would be no confusion on the Arduino side.

We found the XBee communication was not as reliable as we had previously experienced. There were delays in sending and receiving data, and sometimes data would be missed entirely. The only practical solution we found, besides frequently resetting our XBee settings, was to situate the devices close together. We suspected that there may have been soldering or wiring issues that limited the distance the XBees could be from each other. Once the XBees were beside each other, communication greatly improved. This was useful for debugging but not a long-term solution if we were to expand our project beyond four users.

Through what we achieved in this experiment, we believe there is a lot of room to expand the scope to reach our initial concept. The Queue is particularly beneficial for students or participants who lack the confidence to voice their thoughts or raise their hand in many classroom contexts. The setup is also quite simple: it exists as a kit that can supplement lectures in classrooms that lack an easy feedback mechanism.

In terms of future expansion, each of the student and professor nodes could be placed in a 3D-printed case to hold the circuits in place and be labelled accordingly. We could also explore the possibility of multiple LEDs to indicate different types of feedback on the student node itself. For example, we imagined a blue LED turning on when the student was entered into the queue, a yellow light when they were next in line, and a green one when it was their turn. Perhaps a red light could turn on when a student had entered the queue too many times. We also imagined a system wherein students who had not yet entered the list would be given primacy and placed higher in the queue than those who had. There is also potential for a screen interface so the professor could see the queue in real time, rather than its current view through the Arduino serial monitor.

In real-life applications, this manager could expand to contexts outside the classroom, especially restaurants or cafes. It would act as a “smart” pager that caters to customers who request service earlier, or indicates which customer needs service most urgently. It could also function well in locations where internet access is limited, such as remote conferences, and in countries where radio communication is more accessible.

References

https://docs.google.com/presentation/d/1qiehq3F99ZQz_Phf2RWTu0y4yj3hSDWgFGePThFjPxc/edit#slide=id.g4d74fb7caa_0_107

Inspired in part by the server-pagers at Mo’Ramyun Restaurant at 1 Baldwin St, Toronto ON, M5T1L1

 

Metronome Gong

What’s an orchestra without a percussion section? This little robot bangs a gong with aplomb, filling the air with an irritating ring.

The XBee Chat exercise in class was, for me, an opportunity for play. Being able to communicate directly, free of the constraints of networking via the internet, felt like discovering the magic of chatting with a friend over walkie-talkie. Omid and I goofed around like kids, sending jokes and ASCII images back and forth through the XBees.

The freedom of being able to communicate relatively code-free, without having to worry about packets or JSON (which I must admit is a concept I still have trouble wrapping my head around), is revelatory and I’m excited to explore it further.

The Metronome Receiver device was an opportunity to explore an idea I’ve had since early in the first semester. I had an idea for exploring sensor-controlled percussion instruments way back in the This + That experiment of Creation & Computation that I had to shelve at the time. I had conceived of an Arduino-powered device that struck a singing bowl when certain criteria were met. When we were given this assignment, I was excited for the chance to try it out.

I started by considering exactly what I wanted to make: a small arm that would bang a gong. Having worked with solenoids last semester I felt comfortable with their coding and operation. I bought a cheap one and tested it out.

Here was my first pass on the code, for testing the solenoid:

int incomingByte;    // variable for incoming serial data
int solenoidPin = 2; // controls the solenoid

void setup() {
    Serial.begin(9600);
    pinMode(solenoidPin, OUTPUT);
}

void loop() {
    digitalWrite(solenoidPin, HIGH);
    if (Serial.available() > 0) {
        incomingByte = Serial.read();
        if (incomingByte == 'H') {
            //strike the gong then recede
            digitalWrite(solenoidPin, LOW);
            delay(100);
            digitalWrite(solenoidPin, HIGH);
        }
    }
}

[Animated GIF: testing the solenoid]

This worked nicely, though I was alarmed at how hot the solenoid got as it was required to stay engaged at HIGH for the majority of the time the circuit was active. I checked my wiring repeatedly and found no mistakes. I looked it up and found multiple sources reporting that this was normal, as long as the heat stayed uniform and did not continually increase. I resolved to keep an eye on it and not keep the circuit engaged for extended periods.

Rather than seek out a drum or gong, I decided to use the singing bowl I own. However, I noticed that the bowl had a high-pitched and long ringing peal, which I thought might be irritating. So I rewrote the code so the solenoid only fired on the third beat:

int incomingByte; // variable for incoming serial data
int solenoidPin = 2; //control the solenoid
int beat = 0; //an integer for monitoring the beat
void setup() {
   Serial.begin(9600);
   pinMode(solenoidPin, OUTPUT);
   }

void loop() {
   digitalWrite(solenoidPin,HIGH);
   if (beat >= 4) {
     beat = 0;
     }

   if (Serial.available() > 0) {
     incomingByte = Serial.read();
     if (incomingByte == 'H') {
       beat++;
     if (beat == 3){
//strike the gong then recede
       digitalWrite(solenoidPin, LOW);
       delay(100);
       digitalWrite(solenoidPin, HIGH);
       }
     }
   }
}

ubiqgong-fritz_bb

Finally, to keep the solenoid in place, I made a little stand out of a coffee cup.

Some issues still to explore and solve:

The XBee only gets reliable reception when it is physically close to the transmitter.

Even when close, it seems to miss or drop beats occasionally.

Parts:

  • Arduino Micro
  • XBee and XBee Breakout Board
  • Barrel Jack
  • 12V 2A AC/DC Adaptor
  • TIP120 Darlington Transistor
  • 1K Ohm Resistor
  • 1N4001 Diode
  • Solenoid
  • Alligator clips x2
  • Wiring