Art is the Answer



Art is the Answer is an exploration of the way we interpret information. When a question is asked, Wolfram Alpha’s response is displayed as a procedurally generated artistic abstraction. Art is the Answer disrupts the succinct call-and-response of Wolfram Alpha’s usual question/answer use, asking the user to interpret what the answer to their question might be and to consider how the shapes displayed affect that interpretation.


I started, as I often do when I encounter new technology, by examining each component and breaking it down into its simplest terms in an attempt to develop a mental model. I took a close look at the demo code and copied it over into my sketch. Having not worked with PubNub in any depth previously (in the Networking assignment of Creation & Computation I spent most of my time on physical builds), I knew I didn’t want to get too complicated with regards to APIs – I left PubNub and Wolfram Alpha alone after connecting them. Since we were working with p5, a tool originally designed for making art, I decided that I would break down the response from Wolfram Alpha into data that p5 could use for art’s sake.

I was sure there must be a way to turn letters into numbers – after all, in code, everything is a number eventually. I did a little research into letter/number encoding. The first hit I got was A1Z26, a simple cipher in which A=1 and Z=26. I doubted this would suit. Eventually I remembered ASCII and hexadecimal encoding, and while researching these I came across the function charCodeAt(), which returns the UTF-16 code unit at a given index of a string. A few tests showed that this would be perfect. It was time to get into arrays and for loops.

I had never worked with arrays with any depth before, nor for loops outside of simple limited mathematical situations. I knew that in order to get this to work I would have to store every letter of Wolfram Alpha’s response in an array, and run that array as a for loop so the variables could be used to draw shapes.
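The conversion step can be sketched in a few lines of plain JavaScript (the variable names here are mine, not from the demo code):

```javascript
// Turn a response string into an array of UTF-16 code units.
// String.prototype.charCodeAt(i) returns the code unit at index i.
const answer = "42";
const codes = [];
for (let i = 0; i < answer.length; i++) {
  codes.push(answer.charCodeAt(i));
}
console.log(codes); // [ 52, 50 ]  ("4" is 52, "2" is 50, since "0" starts at 48)
```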

Process Journal


Above is my first successful attempt at translating Wolfram Alpha’s response letters into shapes on a p5 canvas. Printed in the console is the UTF-16 value of each letter. Those values are entered into an array, which is read on every iteration of a for loop, and the values are used as the fill, size, and coordinates of the shapes.

The problem was that since every parameter used the same value within each iteration, I always got a printout of a series of greyscale shapes along a diagonal.

After consulting with some classmates, I adjusted the parameters of the shapes so that they would not all be identical: each would use the character code of the letter at its step (p[i]) as well as the character codes of the letters one or two places ahead of its step (p[i+1]).
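A minimal sketch of that offset indexing, assuming p is the array of character codes; the modulo wrap-around is my own addition so the final letters don’t index past the end of the array:

```javascript
// p holds the character codes for "Hello": H=72, e=101, l=108, l=108, o=111
const p = [72, 101, 108, 108, 111];
const shapes = [];

for (let i = 0; i < p.length; i++) {
  shapes.push({
    fill: p[i],                  // greyscale fill from this letter
    size: p[(i + 1) % p.length], // shape size from the next letter
    x: p[(i + 2) % p.length],    // x coordinate from two letters ahead
  });
}
console.log(shapes[0]); // { fill: 72, size: 101, x: 108 }
```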


So now we had variance in shape size and colour, but the numbers returned from Wolfram Alpha were too small to make a pleasant scatter effect on the canvas – the x and y coordinates were all under 200 on a canvas 900 units wide. I decided that a few random numbers in the mix, serving as multipliers, would help get that effect.

I played around with including the random number generator from Wolfram Alpha as well, but that proved a little too complicated to work into the existing for loop. It also proved unnecessary: the random() function included in p5 was sufficient.


Once random numbers were included as multipliers on the numbers returned from Wolfram Alpha, I had the kind of results you see above. Rather than scattering the shapes around the canvas, the multipliers distributed them fairly evenly.

I set the random number to be between 0 and 10, and set translate() to move the origin by this random amount each loop. Thus, the relative point of origin of every shape changed in a random direction by a random amount on each iteration. Finally I had the kind of imagery I had imagined.
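A rough reconstruction of that moving-origin idea in plain JavaScript (not the actual sketch; random() here stands in for p5’s function of the same name, and in p5 the move itself would be translate(random(10), random(10))):

```javascript
// Each loop iteration moves the relative origin by a random amount
// between 0 and 10, so every shape is drawn from a new origin.
function random(max) {
  return Math.random() * max;
}

let originX = 0;
let originY = 0;
for (let i = 0; i < 5; i++) {
  originX += random(10);
  originY += random(10);
  // ...draw the shape for letter i relative to (originX, originY)...
}
console.log(originX, originY); // each somewhere between 0 and 50
```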


Next Steps

This project is mechanically similar to my mindPainter project, and it would be interesting to combine them in some way.

I would like to polish the art that this produces so the shapes it paints are not consistently primitives like rectangles and squares.

I would like to add a mechanism wherein a user can click or press a key to see the answer to their question as text – although this might undermine what is, in my opinion, the strongest aspect of the piece: that the user has no way of getting to their answer without interpreting the artwork.

Demo code by Nick Puckett & Kate Hartman

p5 Reference by Processing Foundation

MDN Web Tools by Mozilla

ASCII Table by

UTF-16 by


Hand Shaking

The most frustrating thing about working with XBee radios is how hard it is to actually see what’s going on between them. The code is fairly simple when you follow the tutorial available from the slides, but when a hiccup happens and you’re not sure where the bug is, it becomes a whole production to implement debug statements across the Arduino, the radio hooked up to the Arduino, and the Processing script. In the future, I should look into making the onboard LED blink upon transmission, as well as sending debug messages for Processing to print without counting them in its “received” data.

Another unexpected issue was that the whole thing didn’t work unless the server radio was plugged into port 5. I thought it might be the Processing script failing to pick up the active port, but even CoolTerm refused to connect to anything that wasn’t port 5. I have yet to identify the cause.

One of the things about this handshake method that I’m not fond of is that the order in which the programs start seems to matter. It shouldn’t. The radio hooked up to the Arduino should send nothing but the message to establish contact, which should be picked up by the radio hooked up to the Processing sketch. But for some reason, if the Arduino radio is plugged in first, the Processing radio never establishes contact with it. It doesn’t even try. I suspect it’s because the radio being plugged in at all is what Serial1.available() is looking for, so the Processing sketch is already five steps behind (i.e., the Arduino “established contact” already) by the time Processing fires up.

Update: I was right, and I had actually forgotten to check for the request from Processing before I stopped sending hellos. The loop now makes sure to check for a question from the Processing radio, but it seems to stall after two or three questions. Will update with further debugging.



Update II: It’s probably because the Processing radio only sends a request after it receives a packet from the Arduino radio. If that request gets lost in transfer, the Arduino radio will never know it was requested.

Further work: implement an easy debug script. Please, this will save you so much time.

Further work II: implement a timer to send another request if no further packets are received from the Arduino radio after a set amount of time.
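That retry timer could look something like this; a logic sketch in plain JavaScript rather than Arduino code, with illustrative names and a fake clock so the behaviour can be traced:

```javascript
// If no packet has arrived within TIMEOUT_MS of the last activity,
// send the request again instead of waiting forever.
const TIMEOUT_MS = 1000;

function makeRequester(sendRequest, now) {
  let lastActivity = -Infinity;
  return {
    tick() {
      if (now() - lastActivity > TIMEOUT_MS) {
        sendRequest();
        lastActivity = now();
      }
    },
    onPacket() {
      lastActivity = now(); // a received packet resets the timer
    },
  };
}

// Simulated run with a fake clock:
let t = 0;
let sent = 0;
const requester = makeRequester(() => sent++, () => t);
requester.tick();           // nothing received yet: sends a request
t = 500; requester.tick();  // within the timeout: waits
t = 1600; requester.tick(); // timed out: sends again
console.log(sent); // 2
```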

The Queue Manager

By Cartsa Antariksa, Nick Alexander, Omid Ettehadi, Olivia Prior



The Queue Manager is the result of an exploration of connectivity using XBee radios and microcontrollers to organize, store, and utilize inputs from a human user. The Queue Manager envisions a classroom in which every attendee has a node XBee/microcontroller apparatus (which we have called a Student Node or just Student) and the instructor has a master XBee/microcontroller (the Professor). When they want to speak, a user presses a button on their Student to reserve a place in the queue. The Professor presses a button to advance the queue. To let a student know it is time to speak, an LED will light up in the student’s circuit.


We were excited to work with unusual functionalities of the XBee. Looking at the suggestions we decided we wanted to try to combine:

  • an on-the-fly rewrite of the XBee
  • logging and re-using data
  • a handshake between XBees

We considered designing a system of haptic walkie-talkies but decided that would be too close to the existing Nudgeables. We thought of modifying them to have different channels that could be scanned through with a potentiometer but discarded it. It was this line of thought that brought us to our idea for The Queue Manager.

Initially, our idea was ambitious. We envisioned a system wherein the queue would be managed, would signal with LEDs when you were up next, and give priority to those who had not yet spoken. It would automatically adjust a user’s place in line based on how often they spoke, meaning those who entered the queue less were more likely to be placed highly in it.

We imagined we’d be logging incoming data on the Professor node, using the XBee rewrite to determine where transmissions from the Professor were going, implementing a handshake to confirm with each Student node, and repeating. The idea was that while the Professor node would listen to the entire group, the group would only listen for the Professor, and the Professor would only transmit to one Student at a time.

In practice, as we will see, this was needlessly complicated.




Step 1: Initiate handshake between professor XBee and student node

Our first step was for the professor XBee to acknowledge incoming information from any of the student nodes in class. The student XBees all had their ATDL set to “C”, the professor’s ID. When a student pressed their button, that XBee would send the professor its ATMY. In theory, the professor XBee would switch its ATDL to the incoming ATMY, sending a signal that would be interpreted as “this student is registered in the class queue.”

We considered pairing the professor XBee with the registered student XBees using XBee I/O, but we needed to manage LEDs on the student XBees. Since the simplest way to manage LEDs would be with Arduino, we decided to eschew using XBee I/O and stick to using an Arduino with every node.

Upon implementing the Arduino rewrite we found a multitude of issues. The first was mixed and interfering data flooding our professor XBee channel. We found that this was a miscommunication of the type of data being sent between the student nodes and the professor: the student nodes were sending characters, but the professor was parsing the incoming messages for integers. Getting this data-type protocol correct was challenging due to the inconsistent transmitting and receiving of the XBees. While debugging, it was hard to tell whether the problem was the physical hardware setup, the settings on the XBees, or the code processing the information between the nodes.

When we were able to get clean data sent from the student nodes to the professor node, we attempted to implement the ATDL rewrite for the professor node. This was challenging due to delays and inconsistent rewrites on the Arduino side. Each rewrite would take over a second, which was problematic for the workflow of our code. Once again, we were unsure whether the culprit was the hardware setup, the rewrite library, or the XBee devices. One solution that worked was to move the XBees closer together, which allowed for more consistent sending and receiving of data.

Step 2: Confirm handshake and enter student node in the queue

Once we were able to send consistently clean data to the professor, we wrote each of the student nodes to send its ATMY to the professor upon a button press. We chose to still send this data because it is a unique identifier for the XBee, even though we had decided not to use the Arduino rewrite for the professor node. We took inspiration from the initial assignment Metronome: rather than rewriting the protocol, the professor would transmit back the ATMY it had just received. This method was more of a call and response than a handshake. The professor would hear “6” coming in and then broadcast “6” back immediately. When the student node processed the incoming data and it matched their ATMY, an LED would light up.
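The call and response reduces to two tiny rules, sketched here in plain JavaScript with illustrative names (the real logic lives in the Arduino sketches):

```javascript
// Professor: echo back whatever ATMY it hears.
function professorRespond(incomingAtmy) {
  return incomingAtmy; // broadcast the same ID back immediately
}

// Student: light the LED only if the broadcast matches its own ATMY.
function studentShouldLight(ownAtmy, broadcast) {
  return broadcast === ownAtmy;
}

const broadcast = professorRespond("6");
console.log(studentShouldLight("6", broadcast)); // true: this student lights up
console.log(studentShouldLight("7", broadcast)); // false: other students ignore it
```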

This step we found was necessary for our next iteration of the project. In our original scope, we had anticipated having another LED light that would indicate when the student pressed the button. The second light would turn on indicating that the student was now in The Queue.

We found this worked decently but not consistently. Sometimes the student node would send the ATMY twice, causing the LED to flicker rather than light up. As well, sometimes the student node would miss the response from the professor node. We attempted to send the ATMY three times in a row back to the student in case their XBee dropped the first call; we found this did not work and was unnecessary. We also tried to have a general message broadcast whenever someone registered on the queue. This was the area where we saw the most timing delay between the XBee radios: one student would press their button and, in theory, all of the lights should turn on, but the lights would turn on arbitrarily depending on when each node received the message. One thing that worked was keeping the XBee radios close to the professor node. Because this was unrealistic for the actual purpose of the project, we decided to move on to the next step.

Step 3: Professor logs a question queue and cycles through with a button press

The final step to accomplish our minimum viable product was for the professor to log the incoming button presses and to store a queue. The professor would then have the ability to press a button that cycles through the queue and this would change the message that the node was transmitting. The professor would then transmit the ATMY of the next person in the queue and their node would process this information and have their green LED turn on. Our goal for this step was “to only have one LED on at a time”.

We created an array that would store an entry every time a student button was pressed. The professor would then press their button and broadcast the first ATMY in the array; this would also move all of the data in the array one spot closer to the beginning. Our main issue with this step was figuring out how to transmit the data. As with our initial issues in development, there were conflicting data types that were not being processed properly, sending strange characters between the radios. Once we organized the information to all be the same data type, we broadcast the ATMY to the channel. The LED of the student who was first in the queue would light up, but it would also flicker. We realized this was because we were sending each command as a println statement rather than a print: the extra newline character was being read and disrupting the LED turning on.
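The queue itself boils down to push-and-shift behaviour, sketched here in plain JavaScript (the actual implementation was an Arduino array; the function names are illustrative):

```javascript
const queue = [];

function studentPressed(atmy) {
  queue.push(atmy); // a button press appends that student's ATMY
}

function professorPressed() {
  // shift() removes the first entry and moves everything else one
  // spot closer to the beginning of the array, as described above
  return queue.shift(); // the ATMY to broadcast, or undefined if empty
}

studentPressed("6");
studentPressed("3");
const next = professorPressed();
console.log(next);  // "6"
console.log(queue); // [ '3' ]
```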

If the same student was written into two or more adjacent spots in the array, we included a “quick blink” delay for the LED before transmitting the next ATMY. This gave feedback to the professor and the student; otherwise it looked like the professor was clicking the button and nothing was changing or happening amongst the student nodes.

We placed our nodes together and created a queue through button presses; the professor node was able to cycle through the queue successfully, as well as register more ATMY entries in the array as they came in from the class.


Step 4: Reintroducing the handshakes

After finishing our minimum viable product, we wanted to experiment with different mechanisms that would allow a much smoother experience with more reliable communications. The most effective approach we found was to create a handshake between the Student and the Teacher, so that every time the Student pressed the key, it would send its ATMY as a message to the Teacher until the Teacher responded with a message confirming the addition to the queue.

Another issue we found was that when the Teacher pressed the key, sending the ATMY of the user first in the queue, it would also act as if the Teacher had received a request to add a person to the queue. To get around that, when the Teacher sends data to the students, it waits to hear its own message on the channel and then continues functioning as normal.

In addition, every time a student requested a place in the queue, the Teacher would check whether they were already the last person in the queue and, if so, ignore the request.
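That guard can be sketched as follows (plain JavaScript, illustrative names; the real check ran on the Teacher’s Arduino):

```javascript
// Ignore a request if the student is already the last person in the queue.
function tryEnqueue(queue, atmy) {
  if (queue.length > 0 && queue[queue.length - 1] === atmy) {
    return false; // already last in line: ignore the request
  }
  queue.push(atmy);
  return true;
}

const q = [];
tryEnqueue(q, "6"); // added
tryEnqueue(q, "6"); // ignored: "6" is already last
tryEnqueue(q, "3"); // added
tryEnqueue(q, "6"); // added: "3" is now last, so "6" may rejoin
console.log(q); // [ '6', '3', '6' ]
```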

Student Node

Teacher Node



As noted in our process, the hurdles we encountered in the development of this project included the messy data received by nodes and the master sending unreadable data. One thing we attempted as a way of monitoring the incoming data was adding a “ghost” XBee to the channel. This XBee would listen to the channel and give us a serial monitor of all the data being distributed. Our development stalled because this device transmitted mystery data that we frequently mistook for data sent from our own nodes. Once we isolated the issue and continued development without the extra XBee, our team noticed that there was still messy data being sent amongst the XBees. We isolated the issue down to mismatched data types: because we were using numbers, our data was often confused between integers and characters. We solved this by declaring all of our transmitted and received data as characters, and placing our numbers in string quotes so there would be no confusion on the Arduino side.

We found XBee communication less reliable than we had previously experienced. There were delays in sending and receiving data, and sometimes data would be missed entirely. The only practical solution we found, besides frequently resetting our XBee settings, was to situate the devices close together. We suspected there may have been soldering or wiring issues that limited the range the XBees could achieve. Once the XBees were beside each other, communication greatly improved. This was useful for debugging but not a long-term solution if we were to expand our project to more than four users.

Through what we achieved in this experiment, we believe there is a lot of room to expand the scope to reach our initial concept. The Queue is particularly beneficial for students or participants who struggle with the confidence to voice their thoughts or raise their hand in many classroom contexts. The setup is also quite simple, given that it exists as a kit that can supplement lectures in classrooms that lack this ease of feedback.

In terms of future expansion, each of the student and professor nodes could be placed in a 3D-printed case to hold the circuits in place, labelled accordingly. We could also explore multiple LEDs to indicate different types of feedback on the student node itself. For example, we imagined a blue LED turning on when the student entered the queue, a yellow light when they were next in line, and a green when it was their turn. Perhaps a red light could turn on when a student had entered the queue too many times. We also imagined a system wherein students who had not yet entered the list would be given primacy and placed higher in the queue than those who had. There was also potential for a screen interface for the professor to see the queue in real time, rather than its current view through the Arduino serial monitor.

In real-life applications, this manager could expand to contexts outside the classroom, especially restaurants or cafes. It would act as a “smart” pager that can cater to customers who requested service earliest, or indicate which customer needs service most urgently. It could also function well where internet access is limited, such as at remote conferences, and in countries where radio communication is more accessible.


Inspired in part by the server-pagers at Mo’Ramyun Restaurant at 1 Baldwin St, Toronto ON, M5T1L1


Colour Radio








Colour Radio is a communications project conceptualizing an interaction through sensors and analog radio communication. The project uses a colour sensor to create an interactive transition through the colours in your surroundings. With NeoPixel LED lights we created a visual output that acts as an abstract representation of the environment. Our intention was to create a beautiful visual that represents analog communication through XBee components.



Phase 1 of brainstorming incorporated the Nudgeables: we explored the idea of creating a wearable centred on an interaction for expressing your state of mental health.

Phase 2 was discussing other forms of communication with the Xbee radios. At this phase of discovery, we decided that we wanted to delve further into an aspect of networking. We wanted to learn how to handshake between the Xbees and formed our project around the mode of learning we wanted to explore.



Phase 3 was when we settled on making a representation of radio communication.
We discussed analog technology and the older forms of electrical components like the old vacuum tubes in televisions. While contemplating the circuit and its components we decided to use a sensor that could grasp aspects of surroundings without obvious communication like words, letters or noises. We discussed Infrared sensors and other distance sensors but decided to incorporate a colour sensor into the project and use LEDs as an output of this data.



We initiated the project’s first steps by calibrating the colour sensor. This was fun for the group because it was our first time working with colour sensors. The sensor works by shining a white light at an object and recording the reflected colour; it can also record the intensity of the reflection. Through red, green, and blue colour filters, the photodiode converts the amount of light to current, and the converter then converts the current to a voltage our Arduino can read. We weren’t able to get good absolute RGB readings from the sensor. What we have been able to get is what I would call relative RGB: the colour it is sensing is clearly visible as one value spiking, but it’s not within the 0-255 range. So a red object might cause the red value to rise by 30, not to 255. This is clearly visible when connecting an RGB LED: it will change colour, but the change might not be visible. Another issue is that blue LEDs generally have a higher energy output, so the blue channel might overpower the others.
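One way to work with such relative readings is to rescale each channel against calibrated minimum and maximum values. This is a sketch in plain JavaScript with made-up calibration numbers, not something we implemented:

```javascript
// Rescale a relative sensor reading into the 0-255 range an RGB LED
// expects, given calibrated min/max values for that channel.
// The calibration numbers below are illustrative, not measured.
function toLedRange(reading, minReading, maxReading) {
  const clamped = Math.min(Math.max(reading, minReading), maxReading);
  return Math.round(((clamped - minReading) / (maxReading - minReading)) * 255);
}

console.log(toLedRange(30, 0, 30)); // 255: a strong red spike maps to full brightness
console.log(toLedRange(15, 0, 30)); // 128: roughly halfway
```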

With further exploration into networking XBees with sensors, and the variations of input and output across various nodes, we originally had some issues transferring the RGB values. We would get some data readings to communicate as an output through the XBees, but they would continually stop working.
Our goal was to establish a call/response handshake between two XBee radios; however, we had many issues executing this. We set up a transmitting XBee (A) connected to an Arduino that would send RGB values, and a receiving XBee (B) that would get those values and use them to light a NeoPixel LED strip. We kept getting strange values in the buffer on the receiving end, and data would sometimes be sent and sometimes not. We weren’t able to get this working, but we were able to get the handshake to work when the receiving XBee (B) was connected directly to a laptop without an Arduino. With this setup we could initiate the handshake call/response, whereby XBee (A) would send “hello” until we sent a character over the serial monitor, and it would respond with the RGB reading from the colour sensor. The image below shows a screenshot of this (the values are shown twice because we were printing out both Serial and Serial1).

In the end we modified our code so that XBee (A), transmitting colour sensor values, and the NeoPixel strip ran on the same breadboard, so the strip would change colours according to the colour sensor reading. The receiving XBee (B) was connected to Processing, and the values received were used to change the background of a sketch; we envisioned this as an additional representation of the colour sensor’s readings.

Conclusion :

For a future iteration, we would create a mesh network with colour sensors and more intricate variance in the colour changes of the LEDs, such as initiating different patterns or perhaps blending colours depending on what the sensor picks up. Exploring communication with the XBees further and building on the foundation of this project, we would like to add colour temperature sensors alongside basic colour sensors.



How to use the Color Sensor with Arduino board (TCS3200 & TCS3210)




Github Link:
The Project:

The game Showdown!! is a twist on the classic western-movie showdown (gunfight). The idea was to use the Nudgeables as game controllers, with vibration as a tactile indicator of being shot or injured. It is a very short game, around 2-3 minutes, and allows 2 players to duel it out to see who is the fastest gun in the west.


We used p5 to display the animations and interaction on screen. It has two components: the p5 serial controller for the serial port connection with the Arduino ‘Listener’ and the p5 code in the web browser which makes the interaction visible. The purpose of the p5 display is to track when a ‘gunshot’ is fired in order to provide to participants an accurate gauge of who shot first, as well as a scoring mechanism. We reused characters from one of Frank’s previous projects and imagined that, as players scored points, the characters might lose limbs or otherwise act out.

Development was halted when we discovered technical problems with the listener device, but we made a proof-of-concept version to demonstrate how it would have worked.

The Nudgeable Mitts:


The hardware provided was Nudgeables, a custom device created by Kate Hartman for body-centric design experiments. Our intended circuit featured two XBees listening in on the Nudgeable devices; when either Nudgeable was activated, the listening device would send a signal to the microcontroller and our p5 interface. Due to technical challenges, we were unable to realize this with the Nudgeable devices; perhaps using a microcontroller for the job would have been more intuitive.
We sourced two pairs of mitts from a dollar store and affixed the Nudgeables to the garments with conductive thread. To activate the device, we designed a ‘switch’ that triggered whenever a thimble came into contact with the palm of the glove, which had a layer of conductive fabric. The activation of the device was very intuitive and effective.

The Listener (+ Challenges with the Nudgeables Hardware)


In order to create this project, we had to determine exactly how the Nudgeable XBees were communicating with each other. The Nudgeables user manual doesn’t explicitly define the configuration settings for the devices, but we determined that the boards were likely using wireless communication to activate paired pins between the devices’ modes A and B. We used serial communication with the Nudgeable XBees to collect their channel, identity, target, and digital pin pairings in order to create ‘listeners’ for our projected interface that could eavesdrop on their communication and trigger events for our microcontroller and software. Based on our testing we determined that the ATD0 and ATD1 values for the Nudgeable XBees were paired with each other and that they communicated on channel 12.

With this in mind, we manually copied the configuration information we collected from the Nudgeable xBees and applied them to our own xBees in an attempt to create ‘clones’ that would listen in on the conversation of the originals. We tried to detect when the digital pins were activated on a breadboard by using LEDs, then serial from an Arduino Micro. The results were indeterminate, and when we later tried plugging in our clones into the Nudgeable devices they didn’t behave appropriately. In essence, something was still different between our clones and the original xBees.

We discovered that a piece of software known as XCTU has a means of properly cloning the configuration of one XBee to another and decided to give that a go. The clones made this way worked when tested in the Nudgeable devices, although we were still unable to use them for our listening device.



What worked:

The Nudgeables are designed to be very modular and worked great as part of the gloves; we were able to get the gloves to act as triggers with conductive fabric and conductive thread.

  • Each glove vibrates when you press against the conductive fabric.
  • We were able to clone the XBees and listen in on the Nudgeables.
  • The p5 sketch works with test data to trigger and show the winner


What didn’t work but we would have liked:

Although we failed to realize our intended project, we did manage to learn more about the XBees and their communication protocol in Nudgeable devices, as well as use the opportunity to create some fun controllers. If we were to take up the project again, we would probably resort to using an Arduino instead of the Nudgeable devices for the communication, as we would have more control over the communication protocol and the capacity of the Listener to capture the communication between the devices. On the other hand, it would probably be informative to ask Kate directly about how the communication protocol works and the benefits of having the devices perform such communication; one of the novelties we noticed was that each device would receive illegible serial communication from the other whenever both were active (and one was connected to a computer via an FTDI chip).


Week 3 Process Journal – Jingpo and April

Intelligent remote control system for the baby/child room.


Based on our existing hardware, we designed a two-way transceiver device, with one unit placed in the master bedroom and one in the baby’s room. The idea is to help parents and infants/children communicate remotely from separate rooms. We made use of the characteristics of the XBee and our existing electronic components to simulate the interaction between the parents’ bedroom and the baby’s room. To make this simple interactive device, we edited XBee commands and used Arduinos as the power supply.



It is challenging for young children to turn off the light and then go to bed in a dark room by themselves; also, if they wake up at night needing their parents, they have no choice but to cry to wake them. Keeping this scenario in mind, we aimed to create a two-way transceiver system that is 1) easy for the baby/child to use, 2) able to comfort the baby/child, and 3) not too disturbing for the grown-ups. During the design process we tried out different buttons/switches, buzzers/speakers, and lights, and finally decided that for the baby/child’s room the input is a simple button: all the baby/child has to do is push it to sound the buzzer in the parents’ room. The output is a light that can be adjusted to different brightnesses; controlled by the input in the parents’ room (a potentiometer), it can softly light itself from dim to bright.


  • Parent bedroom:

Input: light controller (potentiometer)

Output: sound prompt (buzzer)


  • Baby room:

Input: call button (button)

Output: light (led)
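The brightness control is a simple mapping from the potentiometer reading to the LED output. Sketched here in plain JavaScript (on an Arduino, analogRead gives 0-1023, analogWrite takes 0-255, and map(value, 0, 1023, 0, 255) performs the same arithmetic):

```javascript
// Map a potentiometer reading (0-1023) to an LED brightness (0-255).
function potToBrightness(potValue) {
  return Math.floor((potValue * 255) / 1023);
}

console.log(potToBrightness(0));    // 0: fully dim
console.log(potToBrightness(512));  // 127: about half brightness
console.log(potToBrightness(1023)); // 255: fully bright
```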



Parent Room:


Baby Room:



Day one:

  1. Connect two XBees.
  2. Test the code with one led and one potentiometer.

The first thing to do was to make sure the two XBees were connected, so we planned to start with the simplest LED and potentiometer. We hoped that the light on one side would turn on when the button was pressed on the other side. It didn’t work.

We checked the code and the circuits for both XBees and finally got the LED and potentiometer working. We saw the LED flicker as the potentiometer changed, which told us the two radios could communicate. However, it did not achieve the expected result: the LED's brightness did not change along with the potentiometer.
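One likely culprit, in hindsight: analogRead() returns 0-1023 while analogWrite() expects 0-255, so the reading has to be rescaled (and the LED has to sit on a PWM-capable pin). A minimal sketch of that scaling, with the receiving-side usage shown as an assumed comment:

```cpp
// Re-implementation of Arduino's map() arithmetic for the values we
// care about: rescale x from [inMin, inMax] into [outMin, outMax].
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// On the receiving side, loop() would then do something like
// (pin number and variable names are assumptions):
//   int potValue = readPotFromRadio();  // value relayed by the XBee
//   analogWrite(ledPin, mapRange(potValue, 0, 1023, 0, 255));
```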

Day two:

  1. Write the code for two distant controllers.
  2. Test the code with two LEDs, one button and one potentiometer.

We wrote code for the two transceivers: one side with the button and an LED, the other with the potentiometer and an LED (standing in for the parents' buzzer at this stage). The experiment was very successful.


3. Test the code with one led, one button, one potentiometer and one speaker.

Since that experiment was very successful, we swapped the LED for the speaker. It didn't work very well: it only made some faint noises.


4. We replaced the speaker with a buzzer instead. We did it!!!






XBee Exploration




In this experiment we utilized XBee radios: radio transceivers that can be incorporated into a mesh network. The experiment explores XBees and uses them to initiate an interaction with an Arduino component. Communication happens through the characters 'L' and 'H' (low or high), which send and receive data and control pins on the Arduino. XBees can talk to each other once their channel and IDs are set. Since this project incorporates tones, I was curious to see how to program music and how it would coincide with the sounds from other projects.



My first step was configuring the XBees through the software CoolTerm to confirm they were communicating. I did this by making sure ATID, ATMY and ATDL were properly configured in CoolTerm.

ATID: This is the ID of the channel both XBees are talking on.

ATMY: This is the ID that identifies the XBee you are using.

ATDL: This is the ID of the XBee you are transmitting to.
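For reference, a configuration session in command mode looks roughly like this; the ID values below are made-up examples:

```
+++            (enter command mode; the radio replies OK)
ATID 3001      (the channel ID both radios must share)
ATMY 1         (this radio's own ID)
ATDL 2         (the ID of the radio it transmits to)
ATWR           (write the settings to memory)
ATCN           (exit command mode)
```

On the second radio, ATMY and ATDL are swapped (ATMY 2, ATDL 1) so each one addresses the other.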

Using just an LED with an Arduino Micro and an XBee, I tested to make sure the 'L' and 'H' commands were working. Uploading the sample Arduino file Physical Pixel to the controller, I was able to turn my LED on and off through the serial monitor. Then I connected my XBee and changed the Arduino code from 'Serial' to 'Serial1'. I could see my LED turning on and off, so I was ready to progress and attach new components to my breadboard.

While setting up my XBee on the breadboard, it's important to note that the XBee cannot run on 5V; it has to be connected to the 3.3V power pin on the controller.

Physical Pixel Code

const int ledPin = 13;  // the pin that the LED is attached to
int incomingByte;       // a variable to read incoming serial data into

void setup() {
  // initialize serial communication:
  Serial.begin(9600);
  // initialize the LED pin as an output:
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // see if there's incoming serial data:
  if (Serial.available() > 0) {
    // read the oldest byte in the serial buffer:
    incomingByte = Serial.read();
    // if it's a capital H (ASCII 72), turn on the LED:
    if (incomingByte == 'H') {
      digitalWrite(ledPin, HIGH);
    }
    // if it's an L (ASCII 76), turn off the LED:
    if (incomingByte == 'L') {
      digitalWrite(ledPin, LOW);
    }
  }
}

Never having used a speaker with Arduino before, I started to research how to use tones. On the Arduino website I found pitches.h, a file that contains the pitch values for typical notes: for example, NOTE_C4 is middle C and NOTE_FS4 is F sharp. So instead of writing the frequency in the tone() function, we just have to write the name of the note. This note table was originally written by Brett Hagman. I wanted to use the piezo speaker to play melodies; it is driven with PWM-style square-wave signals to produce music. While testing with the piezo speaker I learned that you can't use tone() while also using analogWrite() on pins 3 or 11, because the tone() function uses the same built-in timer that analogWrite() does for those pins.
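A minimal melody sketch in the style of the Arduino toneMelody example; the speaker pin, the three-note melody and the one-second whole note are assumptions, not the melody I ended up using:

```cpp
// Pitch values copied from pitches.h:
#define NOTE_C4  262   // middle C
#define NOTE_FS4 370   // F sharp
#define NOTE_G4  392

// One entry per note, paired with a note type: 4 = quarter, 8 = eighth.
int melody[]        = { NOTE_C4, NOTE_G4, NOTE_FS4 };
int noteDurations[] = { 4, 8, 8 };

// Duration of a note in milliseconds, assuming one second per whole note.
int noteLengthMs(int noteType) {
    return 1000 / noteType;
}

// In loop(), each note i would then play roughly like (pin 8 assumed):
//   tone(8, melody[i], noteLengthMs(noteDurations[i]));
//   delay(noteLengthMs(noteDurations[i]) * 1.30);  // pause between notes
//   noTone(8);
```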



With time being a factor, I realized creating my own music wasn't viable, but I did discover a lot of pre-existing music. I decided to use music from the video game Super Mario Bros. I tied the music tones to the 'L' and 'H' commands sent through the XBee, and added an LED to light up along with the speaker. While testing this version of the breadboard, the 'L' function would only sometimes work; I think this has to do with the sequencing of the tones.



With this collaborative project, I learned about the different affordances of creating an abstract orchestra from different sounds with others. Testing alongside others and seeing their projects was a positive experience and provided a good foundation in radio communication. Experiencing each project as a collective with these components is something that could be expanded upon later, and with a larger network, now that we know how to test and troubleshoot the XBees.


Play simple melodies with an Arduino and a piezo buzzer


Explorations In Light Mapping






The Project:

This project is an interactive exploration of radio communication: multiple sensors communicate between XBee modules to create an ambient experience in Processing. The intention is to create an expressive representation with formula-based shapes by integrating multiple nodes with light mapping. We chose concentric squares to conceptualize this perceptual field, with incoming light values arriving through the networked modules. The interaction is transmitted through XBee radio transceivers and interpreted by Processing, creating a geometric array and sequencing a colour experience that reflects art, light, data and cooperative interaction: the output of a network portrayed through a creative interface.


Explorations in light mapping ideation


As a group, our main goal was to create a collaborative ambient experience, and our ideas gravitated towards active, body-focused experiences. The initial idea we arrived at was to use temperature sensors to create a heat map of everyone's body temperature in the room. We tested the TMP36 temperature sensor, which can detect a wide range of temperatures from -40 to +125 °C, but we noticed that there wasn't much difference between our body temperatures; the variation would barely be noticeable.

Our ideation continued forward using the inspiration of a map as our guidance. We discussed the possibilities of mapping different parts of a room through different inputs. The group consensus was to use photoresistors to map the light in the room. Mapping the light could be taken on in many different ways: we could place XBees in either active/inactive areas or place the radios in the various window sills to map out the light of the building. Eventually, we decided that we wanted an active participatory experience over an ambient passive project.

Creating the participatory experience was interesting, as everyone had distinct images of how the differing light values should be reflected on the screen. The first discussion was informed by our conversations about mapping: could the Processing screen be split into a grid that reflects where each person is in the room? We decided that this idea was a bit too rigid and wouldn't easily adapt to a varying number of XBees. Through this discussion we realized we wanted a project that was collaborative, generative, and not restricted by numbers.

We came to a consensus: display the data as concentric squares. The squares would be generated based on how many radios were connected, and the fill of each square would be mapped to the values of the incoming light sensors. Each participant would have a square that their radio controlled. We liked this idea because the concentric squares had connotations of optical illusions; it was playful, adaptable, and generative, so we went forward with it.

Parallel to this decision, our group was still considering the concept of a light map. We realized that the code for mapping light to the concentric squares would be almost identical to the code for mapping light in the room. We decided that if we had time, we could create a floor map of the Digital Futures lounge in Processing and map the light in its different areas.

Our group decided these two ideas were manageable, interesting, and could be played with in many different ways depending on how many radios were connected, or how many people were actively participating at any given time. Both had variables with the opportunity to change and produce new data visualizations.

Exploration light mapping testing


Design + Testing:

Through experimentation, we designed three generative outputs that all use photoresistors to control light or colour values on the screen. The first is a set of concentric squares whose gray-scale values are mapped to the analog readings (lower values are closer to black, higher values closer to white). The second is the same concentric-square layout, but each square is assigned a different hue that lightens or darkens with the analog value of its individual sensor, rather than only black-and-white values. For the third experiment, we mapped out a floor plan of the Digital Futures lounge, where the team envisioned placing the XBee radios in different locations to get a "light" map that would change throughout the day.

The responsive concentric-square layout was designed to be collaborative. We designed the Processing side of the project to, in theory, take in as many radios as possible to contribute to the generative scene: a new radio could be added, and the squares would adjust in size to accommodate it. The concentric squares let a group playfully collaborate to create different illusions. Each operator of a photoresistor could alter the gray-scale or hue of their square to create different compositions as a team. Something to note is that we did not label the squares according to participants' individual radios.
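The core of the Processing-side math can be sketched as two small functions (shown here as plain C-style code; the actual Processing sketch differed in the details):

```cpp
// With n radios on a square canvas of width w, square i (0 = outermost)
// shrinks evenly toward the centre so every radio gets a visible band.
float squareSide(float canvasWidth, int radioCount, int i) {
    return canvasWidth * (radioCount - i) / radioCount;
}

// Map a photoresistor reading (0-1023) to a gray level (0-255):
// low light -> near black, bright light -> near white.
int grayLevel(int sensorValue) {
    return sensorValue * 255 / 1023;
}
```

Adding a radio just increments the count, and every square's side length recomputes, which is what makes the layout adapt to the number of participants.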

This allows for investigation and communication among the participants. In our own testing, it was interesting to try to create different depth illusions by controlling the photoresistor's value. Since the sensor is quite sensitive to any amount of light, we had a little trouble achieving a darker colour at the low end, so the composition tended toward gray whenever the room lights were on. The ideal setup for this experiment would therefore be a dark room, where each participant controls the light value with a smartphone flashlight or by fully cupping their hands around the sensor.

The intention of the floor-plan design was to place the radios around the room and remove active participation from the group. The light sensors would calibrate to the spaces where they were placed and reflect changes or movement within the space. This would allow for an ambient light map able to change along with different parts of the day: the automatic lights turning on or off, the sun setting, or someone in the room blocking the light directly. This passive design has connotations of surveillance; the photoresistors act as light monitors of the space, so if someone came in the middle of the night and turned on the lights, the sensors would react wirelessly to the sudden change. Overall, we did not have time to fully test this idea. As an early iteration, we took the mapping of values from Arduino to Processing from the concentric-squares code, designed the floor plan in Processing, and tested it similarly to the previous experiments. We hope to test this one at some point to see the results.

The Hardware and Circuit


The Software:

We gained some valuable insights through the process of testing and connecting the nodes and XBees to the network. Along the way we solved some communication issues between the XBees, such as removing the RX and TX connections before uploading code (otherwise Arduino will give an error). If you find yourself in the scenario where your XBee just won't communicate, try resetting it to the default values.

Although many of us had worked with the photoresistor in previous projects, we learned that calibrating the sensor is important to ensure it does not take in values that are negative or beyond the maximum set in the Arduino code. Our initial goal for the project's code was a dynamic network in Processing: a process by which any number of radio nodes could be networked automatically as they were detected. Due to time constraints we opted instead, for this iteration, to program a static array of radio nodes into Processing, one for each node that would be at the presentation. In the next iteration we intend to make the code dynamic, so a potentially endless number of XBees can be automatically detected and added to the visualization.
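A sketch of what that static array of nodes might look like; the field names and node count here are illustrative assumptions, not our actual Processing code:

```cpp
// Each radio at the presentation gets one fixed entry; incoming packets
// update the matching node's latest sensor reading.
struct RadioNode {
    int id;           // the radio's ATMY address
    int sensorValue;  // latest photoresistor reading, 0-1023
};

const int NODE_COUNT = 4;
RadioNode nodes[NODE_COUNT] = { {1, 0}, {2, 0}, {3, 0}, {4, 0} };

// Store a reading for the node with the given id; returns true if found.
bool updateNode(int id, int value) {
    for (int i = 0; i < NODE_COUNT; i++) {
        if (nodes[i].id == id) {
            nodes[i].sensorValue = value;
            return true;
        }
    }
    return false;  // unknown radio; a dynamic version would add it here
}
```

The dynamic version we have in mind would replace the fixed array with a growable list and append a new entry whenever an unrecognized id arrives.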


Demonstrating our light mapping project was highly successful. The outcome was a fun collaboration that sparked some game-like interaction, stemming from the curiosity of trying to discover which square of the concentric grid your XBee was controlling.

As a future iteration, were the project to be expanded upon, we wanted to explore a meshed node-network theremin: the theremin would create reflective colours, represented through values from distance sensors. This project taught us aspects of the Processing code and how to develop a larger, more complex network, with ZigBee mesh networking as the next step; our exploration of a small network of radio communication proved successful. It also taught us many things about working with XBees: an XBee cannot itself process the data it sends or receives, but it can communicate with intelligent devices via its serial interface.






Experiment 1 Xbee Talking

In this experiment, we explored the XBee communication protocols. The XBee is a wireless radio device; XBees can communicate with each other with or without microcontrollers.

Mind Map

You can download the entire mind map as a PDF from the image above.

The above is the basic ecosystem of how the XBee works, based on Kate Hartman's slides and the Digi.com website.

The Main Concepts:

  • The basic idea is the same as a walkie-talkie: each device can send and receive signals; they are transceivers.
  • The device runs on 3.3V; using a higher voltage can damage it.
  • The XBee is configured, not programmed: you cannot add new commands to it, only configure the properties set out in the data sheet.
  • The pin spacing is different from a breadboard's, so you have to use a breakout adapter.
  • It operates in two modes, transparent vs. command mode: transparent is the data mode for sending and receiving, and command mode, activated with +++, is where you configure the device.

The Experiment 01

For this experiment, we had to receive a signal from one XBee in the form of a pulse: 'H' would turn the pin's power on, and 'L' would turn the pin to LOW/off.

My idea was to use the signal to drive a fan mounted in a tube to levitate a ball. I got the pieces together, but the fan was not strong enough to drive the ball into the air.


What can I do to change this up?
I could get this to work with a MOSFET, a higher-powered fan and a lighter ball, but I did not have time to go back for the parts and change my design.


I went back to the drawing board and went with the simplest solution: programming a buzzer to receive the signal and create a simple pulse.





A “Silent” Alarm / An XBee x Arduino Process Journal


This little device is a simple desk companion that acts as a visual silent alarm for the hearing impaired. In theory it would react to a doorbell and spin indicating that someone was at the door or that mail had been left. Alternatively, it also acts as a distracting toy that reacts to notifications.

XBee Chat Exercise:

During the chat exercise, my partner and I didn't have much trouble configuring our XBees and were soon able to send messages back and forth. However, we didn't really do much chatting, as we noticed that we couldn't send long messages because packets would get dropped: one of us would send "Hello" and the other would receive "Hell". It was interesting to see how this led to funny miscommunications. This led me to conclude that the XBee isn't really a communication device in the traditional sense of the word; I would have to think of communication beyond words. We found that the most effective way was to send single characters.

XBee Radio Receiver / Transmitter:

Tip: Use the edge of the breakout board, not the XBee itself, to determine where to put the power, TX, and RX pins for the XBee connection.

While testing the XBee with the Arduino Physical Pixel example, I was able to control an LED bulb using the serial monitor. However, when trying to control the Arduino-plus-XBee setup from another XBee, we ran into issues: we could only achieve one-way communication. My LED would light up or go off when receiving signals from my partner's XBee, but I was not able to light their LED with my radio. This was also happening with another group.

Troubleshooting (Tested with 3 different partners as the one-way communication issue occurred each time I tried to connect to a new XBee)

We noticed that:

  1. The radios would work when both were configured on the same laptop.
  2. (Best troubleshooting method) The radios would work after sending messages back and forth over chat.
  3. The radios would work when brought closer together.

XBee Metronome Receiver

For my device's design, I was thinking of the instructions we got in class: to design an interaction that would sit among 20 or more others. Initially I had wanted to use sound as an output, but I figured something more visual would be a better choice, since it would still be noticeable among other devices reacting to the metronome. Removing sound from the equation and focusing mainly on visuals made me think of the hearing impaired, and I thought, "What if you could have a tiny visual desk alarm that spins when someone rings your doorbell?". I also wanted to learn how to work with a servo, as I had never used one before.


When conceptualizing my design I had envisioned a rotating cylinder or disk inspired by spinning tops and wheels; however, I realized that the micro-servo can only make 180-degree rotations, not the 360-degree rotations I had imagined. I didn't have the knowledge to hack my servo or the time to get another one, so I improvised the rotation to still create an optical-illusion effect. Below are some images from my prototyping.
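The improvised rotation boils down to sweeping the angle up and back down within the servo's 0-180 range; a sketch of that logic, with an assumed step size and update interval:

```cpp
// Sweep state for the improvised "spin": the angle climbs in fixed
// steps and bounces back at either end of the servo's range, which at
// speed reads as continuous rotation.
int angle = 0;
int stepSize = 5;  // degrees per update (an assumption)

// Advance one step, reversing direction at the limits of travel.
int nextAngle() {
    angle += stepSize;
    if (angle >= 180 || angle <= 0) {
        stepSize = -stepSize;  // bounce at the end stops
    }
    return angle;
}

// In loop(), with a Servo attached (assumed names):
//   myServo.write(nextAngle());
//   delay(20);
```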

Future expansion:

I would like to continue to explore making desktop companions thinking along the themes of accessibility and self-care toys. I’d also like to work with more servos.

GitHub link to code: here