The Queue Manager

By Cartsa Antariksa, Nick Alexander, Omid Ettehadi, Olivia Prior

Github: https://github.com/npyalex/Ubiq-Class-Manager

Overview

The Queue Manager is the result of an exploration of connectivity using XBee radios and microcontrollers to organize, store, and use input from human users. The Queue Manager envisions a classroom in which every attendee has an XBee/microcontroller node (which we have called a Student Node, or just Student) and the instructor has a master XBee/microcontroller (the Professor). When they want to speak, users press a button on their Student to reserve a place in the queue. The Professor presses a button to advance the queue. To let a student know it is their turn to speak, an LED lights up in their circuit.

Ideation

We were excited to work with unusual functionalities of the XBee. Looking at the suggestions, we decided we wanted to try to combine:

  • an on-the-fly rewrite of the XBee
  • logging and re-using data
  • a handshake between XBees

We considered designing a system of haptic walkie-talkies but decided that would be too close to the existing Nudgeables. We thought of modifying them to have different channels that could be scanned through with a potentiometer, but discarded that too. It was this line of thought that brought us to our idea for The Queue Manager.

Initially, our idea was ambitious. We envisioned a system wherein the queue would be managed, would signal with LEDs when you were up next, and give priority to those who had not yet spoken. It would automatically adjust a user’s place in line based on how often they spoke, meaning those who entered the queue less were more likely to be placed highly in it.

We imagined we’d be logging incoming data on the Professor node, using the XBee rewrite to determine where transmissions from the Professor were going, implementing a handshake to confirm with each Student node, and repeating. The idea was that while the Professor node would be listening to the entire group, the group would only listen for the Professor, and the Professor would only transmit to one Student at a time.

In practice, as we will see, this was needlessly complicated.

img_7606

img_20190122_144247

Process

Step 1: Initiate handshake between professor XBee and student node

Our first step was for the professor XBee to acknowledge incoming information from any of the student nodes in class. The student XBees all had their ATDL set to “C” for the professor. When a student pressed their button, that XBee would send the professor its ATMY. In theory, the professor XBee would switch its ATDL to the incoming ATMY and send a signal that would be interpreted as “this student is registered in the class queue.”

We considered pairing the professor XBee with the registered student XBees using XBee I/O, but we needed to manage LEDs on the student XBees. Since the simplest way to manage LEDs would be with Arduino, we decided to eschew using XBee I/O and stick to using an Arduino with every node.

Upon implementing the Arduino rewrite we found a multitude of issues. The first was mixed, interfering data flooding our professor XBee’s channel. This turned out to be a mismatch in the type of data being sent between the student nodes and the professor: the students were sending characters, but the professor was parsing the incoming messages for integers. Getting this data-type protocol correct was challenging because of the XBees’ inconsistent transmitting and receiving. While debugging, it was hard to tell whether the problem was the physical hardware setup, the settings on the XBees, or the code processing the information between the nodes.

When we were able to get clean data sent from the student nodes to the professor node, we attempted to implement the ATDL rewrite for the professor node. This was challenging due to delays and inconsistent rewrites on the Arduino side. Each rewrite would take over a second, which was problematic for the workflow of our code. Once again, we were unsure whether the cause was the hardware setup, the rewrite library, or the XBee devices. One solution that worked was to move the XBees closer together, which allowed for more consistent sending and receiving of data.

Step 2: Confirm handshake and enter student node in the queue

Once we were able to send consistently clean data to the professor, we wrote each student node to send its ATMY to the professor on a button press. Even though we decided not to use the Arduino rewrite for the professor node, we kept sending the ATMY because it is a unique identifier for the XBee. Taking inspiration from the earlier Metronome assignment, rather than rewriting its protocol, the professor would simply transmit back the ATMY it had just received. This was more of a call and response than a handshake: the professor would hear “6” coming in and immediately broadcast “6” back. When a student node processed incoming data that matched its ATMY, an LED would light up.

We found this step necessary for our next iteration of the project. In our original scope, we had anticipated a second LED that would turn on when the student pressed the button, indicating that the student was now in The Queue.

We found this worked decently but not consistently. Sometimes a student node would send its ATMY twice, causing the LED to flicker rather than light up, and sometimes a student node would miss the response from the professor node. We attempted to send the ATMY back three times in a row in case the student’s XBee dropped the first call, but found this ineffective and unnecessary. We also tried broadcasting a general message whenever someone registered in the queue; this was where we saw the most timing delays between the XBee radios. One student would press their button and, in theory, all of the lights should turn on, but the lights turned on arbitrarily depending on when each node received the message. Keeping the radios close to the professor node worked, but because that was unrealistic for the actual purpose of the project, we moved on to the next step.

Step 3: Professor logs a question queue and cycles through it with a button press

The final step to reach our minimum viable product was for the professor to log the incoming button presses and store a queue. The professor would then have the ability to press a button that cycles through the queue, changing the message the node was transmitting. The professor would transmit the ATMY of the next person in the queue, and their node would process this information and turn on its green LED. Our goal for this step was “to only have one LED on at a time”.

We created an array that stored an entry every time a student button was pressed. When the professor pressed the button, their node broadcast the first ATMY in the array and shifted all of the remaining data one spot closer to the beginning of the array. Our main issue with this step was figuring out how to transmit the data; as with our earlier issues in development, conflicting data types were not being processed properly and sent strange characters between the radios. Once we made all of the information the same data type, we broadcast the ATMY to the channel. The LED of the student first in the queue would light up, but also flicker. We realized this was because we were sending each command with a println statement rather than a print; the extra newline character was being read and disrupting the LED.

If the same student occupied two or more adjacent spots in the array, we included a “quick blink” delay on the LED before transmitting the next ATMY. This gave feedback to the professor and the student; otherwise, it looked like the professor was clicking the button and nothing was changing or happening amongst the student nodes.

We placed our nodes together and created a queue through button presses. The professor node successfully cycled through the queue while registering new ATMY entries in the array as they came in from the class.

img_20190127_185046

Step 4: Reintroducing the handshake

After finishing our minimum viable product, we wanted to experiment with mechanisms that would give us a smoother experience and more reliable communication. The most effective approach we found was a handshake between the Student and the Teacher: every time the Student pressed the key, it would send its ATMY as a message to the Teacher repeatedly until the Teacher responded with a message confirming the addition to the queue.

Another issue we found was that when the Teacher pressed the key and sent the ATMY of the first user in the queue, it would also act as if it had received a request to add a person to the queue. To get around this, when the Teacher sends data to the students, it waits to hear its own message on the channel and then continues functioning as normal.

In addition, every time a student requests a place in the queue, the Teacher checks whether they are already the last person in the queue; if so, it ignores the request.

student_node_diagram_image
Student Node

teacher_node_diagram_bbTeacher Node

img_7803

Findings

As noted in our process, the hurdles we encountered in the development of this project included messy data received by the nodes and the master sending unreadable data. As a way of monitoring the incoming data, we tried adding a “ghost” XBee to the channel that would listen and give us a serial monitor of all the data being distributed. Our development stalled because this device was transmitting mystery data that we frequently mistook for data sent from our own nodes. Once we isolated the issue and continued development without the extra XBee, we noticed there was still messy data being sent amongst the XBees. We traced it to mismatched data types: because we were using numbers, our data was often confused between integers and characters. We solved this by declaring all of our transmitted and received data as characters and placing our numbers in quotes so there would be no confusion on the Arduino side.

The XBee communication was not as reliable as we had previously experienced. We found there were delays in sending and receiving data, and sometimes data would be missed entirely. The only workable solution we found, besides frequently resetting our XBee settings, was to situate the devices close together. We suspected soldering or wiring issues may have inhibited the range the XBees had between each other. Once the XBees were beside each other, communication greatly improved. This was useful for debugging but not a long-term solution if we were to expand our project to include more than four users.

Through what we achieved in this experiment, we believe there is a lot of room to expand the scope to reach our initial concept. The Queue is particularly beneficial for students or participants who lack the confidence to voice their thoughts or raise their hand in many classroom contexts. The setup is also quite simple, given that it exists as a kit that can supplement lectures in classrooms that lack this ease of feedback.

In terms of future expansion, each of the student and professor nodes could be placed within a 3D-printed case to hold the circuits in place and labelled accordingly. We could also explore multiple LEDs to indicate different types of feedback on the student nodes themselves. For example, we imagined a blue LED turning on when the student entered the queue, a yellow light when they were next in line, and a green when it was their turn. Perhaps a red light could turn on when a student had entered the queue too many times. We also imagined a system wherein students who had not yet entered the list would be given primacy and placed higher in the queue than those who had. There is also potential for a screen interface for the professor to see the queue in real time, rather than its current view through the Arduino serial monitor.

In real-life applications, this manager could expand to contexts outside the classroom, especially restaurants or cafes. It could act as a “smart” pager that caters to customers who requested service earlier, or indicate urgency in terms of which customer needs service the most. It could also function well in locations where internet access is limited, such as remote conferences, and even in countries where radio communication is more accessible.

References

https://docs.google.com/presentation/d/1qiehq3F99ZQz_Phf2RWTu0y4yj3hSDWgFGePThFjPxc/edit#slide=id.g4d74fb7caa_0_107

Inspired in part by the server-pagers at Mo’Ramyun Restaurant at 1 Baldwin St, Toronto ON, M5T1L1

 

Colour Radio

 

Untitled-1

 

 

img_3312

 

Overview:

Colour Radio is a communications project exploring analog signals and conceptualizing an interaction through sensors and analog radio communication. The project uses a colour sensor to create an interactive transition through the colours of your surroundings. With NeoPixel LEDs we have created a visual output that acts as an abstract representation of environments. Our intention was to create a beautiful visual that represents analog communication through XBee components.

 

Ideation:

Phase 1 of brainstorming incorporated the Nudgeables: we explored the idea of creating a wearable built around an interaction for expressing your state of mental health.

Phase 2 was discussing other forms of communication with the XBee radios. At this phase of discovery, we decided we wanted to delve further into an aspect of networking. We wanted to learn how to handshake between the XBees, and formed our project around the mode of learning we wanted to explore.

image-from-ios

img_3278

Phase 3: Throughout this stage we decided we wanted to make a representation of radio communication.
We discussed analog technology and older electrical components, like the vacuum tubes in old televisions. While contemplating the circuit and its components, we decided to use a sensor that could grasp aspects of the surroundings without obvious communication like words, letters, or noises. We discussed infrared and other distance sensors, but decided to incorporate a colour sensor into the project and use LEDs as the output of its data.

photo-on-2019-01-25-at-7-26-pm
screen-shot-2019-01-29-at-11-18-12-am

Process:

We initiated the project’s first steps by calibrating the colour sensor. This was fun for the group because it was our first time working with colour sensors. The sensor works by shining a white light at an object and recording the reflected colour; it can also record the intensity of the reflection. Through red, green, and blue colour filters, the photodiode converts the amount of light to a current, and a converter turns that current into a voltage our Arduino can read. We weren’t able to get good absolute RGB readings from the sensor. What we did get is what we would call relative RGB: the colour being sensed is clearly visible as one value spiking, but it is not within the 0-255 range. A red object might cause the red value to rise by 30, not to 255. This is clearly visible when connecting an RGB LED: it will change colour, but the change might not be visible. Another issue is that blue LEDs generally have a higher energy output, so blue can overpower the other channels.

With further exploration into the networking of XBees with sensors, and the variations of input and output while exploring various nodes, we originally had some issues transferring the RGB values. We would get some data readings to communicate as an output through the XBees, but they would continually stop working.
screen-shot-2019-01-30-at-9-26-54-am
Our goal was to establish a call/response handshake between two XBee radios; however, we had many issues executing this. We set up a transmitting XBee (A) connected to an Arduino that would send RGB values, and a receiving XBee (B) that would take those values and use them to light a NeoPixel LED strip. We kept getting strange values in the buffer on the receiving end, and data would sometimes be sent and sometimes not. We never got that configuration working, but the handshake did work when the receiving XBee (B) was connected directly to a laptop without an Arduino. With this setup we were able to initiate the call/response: XBee (A) would send “hello” until we sent a character over the serial monitor, and it would respond with the RGB reading from the colour sensor. The image below shows a screenshot of this (the values are shown twice because we were printing both Serial and Serial1 values).

screen-shot-2019-01-30-at-9-21-23-am

In the end, we modified our code so that the transmitting XBee (A), the colour sensor, and the NeoPixel strip ran on the same breadboard, so the strip would change colours according to the colour sensor reading. The receiving XBee (B) was connected to Processing, and the received values were used to change the background of a sketch; we envisioned this as an additional representation of the colour sensor’s readings.

Conclusion:

For a future iteration, we would create a mesh network with colour sensors and more intricate variations in the colour changes of the LEDs, like initiating different patterns or perhaps a blending of colours depending on what the sensor picks up. Exploring communication with the XBees further and building on the foundation of this project, we would like to add colour temperature sensors alongside basic colour sensors.

 

References:

http://www.robotsforfun.com/webpages/colorsensor.html

How to use the Color Sensor with Arduino board (TCS3200 & TCS3210)

https://learn.adafruit.com/adafruit-neopixel-uberguide/arduino-library-use

 

Showdown!!

045

Github Link: https://github.com/vulture-boy/showdown
The Project:

The game Showdown!! is a twist on the classic western-movie showdown (gunfight). The idea was to use the Nudgeables as game controllers, with vibration as a tactile indicator of being shot or injured. It is a very short game, around 2-3 minutes, and allows 2 players to duel to see who is the fastest gun in the west.

Software

We used p5 to display the animations and interaction on screen. It has two components: the p5 serial controller for the serial port connection with the Arduino ‘Listener’, and the p5 code in the web browser which makes the interaction visible. The purpose of the p5 display is to track when a ‘gunshot’ is fired in order to provide participants with an accurate gauge of who shot first, as well as a scoring mechanism. We reused characters from one of Frank’s previous projects and imagined that, as players scored points, the characters might lose limbs or otherwise act out.

Development was halted when we discovered technical problems with the listener device, but we made a proof-of-concept version to demonstrate how it would have worked.

The Nudgeable Mitts:

img_362820190128_174840

The hardware provided was Nudgeables, a custom device created by Kate Hartman for body-centric design experiments. Our intended circuit featured two XBees listening in on the Nudgeable devices; when either Nudgeable was activated, the listening device would send a signal to the microcontroller and our p5 interface. Due to technical challenges, we were unable to realize this with the Nudgeable devices; perhaps using a microcontroller for the job would have been more intuitive.
We sourced two pairs of mitts from a dollar store and affixed the Nudgeables to the garments with conductive thread. In order to activate the device, we designed a ‘switch’ that would activate whenever a thimble came into contact with the palm of the glove, which had a layer of conductive fabric. The activation of the device was very intuitive and effective.

The Listener (+ Challenges with the Nudgeables Hardware)

img_3633

In order to create this project, we had to determine exactly how the Nudgeable XBees were communicating with each other. The Nudgeables user manual doesn’t explicitly define the devices’ configuration settings, but we determined that the boards were likely using wireless communication to activate paired pins between the device’s modes A and B. We used serial communication with the Nudgeable XBees to collect their channel, identity, target, and digital pin pairings in order to create ‘listeners’ for our projected interface that could eavesdrop on their communication and trigger events for our microcontroller and software. Based on our testing, we determined that the ATD0 and ATD1 values for the Nudgeable XBees were paired with each other and that they communicated on channel 12.

With this in mind, we manually copied the configuration information we collected from the Nudgeable XBees and applied it to our own XBees in an attempt to create ‘clones’ that would listen in on the conversation of the originals. We tried to detect when the digital pins were activated on a breadboard by using LEDs, then serial from an Arduino Micro. The results were indeterminate, and when we later tried plugging our clones into the Nudgeable devices, they didn’t behave appropriately. In essence, something was still different between our clones and the original XBees.

We discovered that the XCTU software has a means of properly cloning the configuration of one XBee to another and decided to give that a go. The clones worked when tested in the Nudgeable devices, although we were still unable to use them for our listening device.

Outcomes

img_3637

What worked:

The Nudgeables are designed to be very modular and worked great as part of the gloves; we were able to get the gloves to act as triggers with conductive fabric and conductive thread.

  • Each glove vibrates when you press against the conductive fabric.
  • We were able to clone the XBees and listen in on the Nudgeables.
  • The p5 sketch works with test data to trigger and show the winner.

046

What didn’t work but we would like to have:

Although we failed to realize our intended project, we did manage to learn more about the XBees and their communication protocol in Nudgeable devices, as well as use the opportunity to create some fun controllers. If we were to take up the project again, we would probably resort to using an Arduino instead of the Nudgeable devices for the communication, as we would have more control over the communication protocol and the Listener’s capacity to capture the communication between the devices. On the other hand, it would probably be informative to ask Kate directly how the communication protocol works and what the benefits are of having the devices perform such communication; one of the novelties we noticed was that they would receive illegible serial communication from the other device whenever both were active (and one was connected to a computer via an FTDI chip).

asset-73

Week 3 Process Journal – Jingpo and April

Intelligent remote control system for the baby/child room.

 

Concept:
Based on our existing hardware, we designed a two-way transceiver device that can be placed in the master bedroom and the baby’s room respectively. The idea is to help parents and infants/children communicate remotely from separate rooms. We made use of the characteristics of the XBee and our existing electronic components to simulate the interaction between the parents’ bedroom and the baby’s room. To make this simple interactive device, we edited XBee commands and used an Arduino as the power supply.

 

Ideation:

It is challenging for young children to turn off the light and then go to bed in a dark room by themselves; likewise, if they wake up at night needing their parents, they have no choice but to cry to wake them. Keeping this scenario in mind, we aimed to create a two-way transceiver system that is 1) easy for the baby/child to use, 2) able to comfort the baby/child, and 3) not too disturbing for the grown-ups. During the design process, we tried out different buttons/switches, buzzers/speakers, and lights, and finally decided that for the baby/child’s room the input is a simple button: all the baby/child has to do is push it to sound the buzzer in the parents’ room. The output is a light that can be adjusted to different brightnesses; controlled by the input in the parents’ room (a potentiometer), it can softly light itself from dim to bright.

Design:

  • Parent bedroom:

Input: light controller (potentiometer)

Output: sound prompt(buzzer)

parent

  • Baby room:

Input: call button (button)

Output: light (led)

baby

Code:

Parent Room:

parentcode

Baby Room:

babycode

Testing:

Day one:

  1. Connect two XBees.
  2. Test the code with one led and one potentiometer.

The first thing to do was to make sure the two XBees were connected, so we planned to start with the simplest LED and potentiometer. We hoped the light on one side would turn on when the button was pressed on the other side. It didn’t work.

We checked the code for the two XBees and their circuits respectively, and finally got the LED and potentiometer working. We saw the LED flicker as the potentiometer changed, which meant the two radios could communicate. However, it did not achieve the expected result: the light’s brightness changing with the potentiometer.

Day two:

  1. Write the code for two distant controllers.
  2. Test the code with two LEDs, one button and one potentiometer.

We wrote code for two transceivers: one for the parent room with one potentiometer and one LED, and one for the baby room with one button and one LED. The experiment was very successful.

photo1

3. Test the code with one led, one button, one potentiometer and one speaker.

Since the last experiment was very successful, we exchanged an LED for the speaker. It didn’t work very well; it only made some slight noises.

photo2

4. Instead of the speaker, we used a buzzer. We did it!!!

 

Video:

https://vimeo.com/user83802499/review/313924989/85921ef3e2

References:  

https://www.digi.com/blog/802-15-4-pwm-output-with-an-led/

https://www.digi.com/blog/802-15-4-analog-input-with-a-potentiometer/

 

 

XBee Exploration

 

 

Summary

In this experiment we utilized XBee radios: analog radio transmitters and receivers that can be incorporated into a mesh network. This experiment explores XBees and initiates an interaction with an Arduino component, communicating ‘L’ (low) or ‘H’ (high) to send and receive data and control pins in Arduino. XBees can talk to each other by setting the channel and IDs. Incorporating tones into this project, I was curious to see how to program music and how it would coincide with sounds from other projects.

 

Process

Configuring the XBees through the software CoolTerm to confirm they were communicating was my first step. I did this by making sure ATID, ATMY, and ATDL were properly configured in CoolTerm.

ATID: This is the ID of the channel both XBees are talking on.

ATMY: This is the ID that defines the XBee you are using.

ATDL: This is the ID of the XBee you are transmitting to.
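A typical CoolTerm configuration session might look like the following; the ID values here are placeholders, not the ones actually used:

```text
+++          enter command mode (the radio replies OK)
ATID 3001    ID of the channel shared by both radios
ATMY 1       this radio's own ID
ATDL 2       ID of the radio being transmitted to
ATWR         save the settings so they survive a power cycle
ATCN         exit command mode
```

Each radio gets a mirrored ATMY/ATDL pair, so that what one transmits the other listens for.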

Using just an LED with an Arduino Micro and XBee, I tested to make sure the ‘L’ and ‘H’ commands were working. Using the sample Arduino file Physical Pixel and uploading it to the controller, I was able to turn my LED on and off through the serial monitor. Then I connected my XBee and changed the Arduino code from ‘Serial’ to ‘Serial1’. I could see my LED turning on and off, so I was ready to progress and attach new components to my breadboard.

While setting up my XBee on my breadboard, it’s important to note that the XBee cannot run on 5V; it has to be connected to the 3V power pin on the controller.

Physical Pixel Code

const int ledPin = 13; // the pin that the LED is attached to
int incomingByte;      // a variable to read incoming serial data into

void setup() {
  // initialize serial communication:
  Serial.begin(9600);
  // initialize the LED pin as an output:
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // see if there's incoming serial data:
  if (Serial.available() > 0) {
    // read the oldest byte in the serial buffer:
    incomingByte = Serial.read();
    // if it's a capital H (ASCII 72), turn on the LED:
    if (incomingByte == 'H') {
      digitalWrite(ledPin, HIGH);
    }
    // if it's an L (ASCII 76) turn off the LED:
    if (incomingByte == 'L') {
      digitalWrite(ledPin, LOW);
    }
  }
}

 

Never having used a speaker with Arduino before, I started to research how to use tones. From the Arduino website I found pitches.h, a file containing the pitch values for typical notes; for example, NOTE_C4 is middle C and NOTE_FS4 is F sharp. So instead of writing the frequency in the tone() function, we just write the name of the note. This note table was originally written by Brett Hagman. I wanted to use the Piezo speaker to play melodies; it produces PWM signals in order to play music. While testing with the Piezo speaker, I learned you can’t use tone() while also using analogWrite() on pins 3 or 11, because the tone() function uses the same built-in timer that analogWrite() does for those pins.

 

 

With time being a factor, I realized creating my own music wasn’t viable, but I discovered a lot of pre-existing music. I decided to use music from the video game Super Mario Brothers. I correlated the music tones with the ‘L’ and ‘H’ commands through the XBee and added an LED to initiate with the speaker. While testing this version of the breadboard, the ‘L’ function would only sometimes work; I think this has to do with the sequencing of the tones.

 

Conclusion

With this collaborative project, I learned about the different affordances of creating an abstract orchestra through different sounds with others. The aspect of testing with others and seeing their projects was a positive experience and provided a good foundation in radio analog communication. Experiencing each project as a collective with these components is something that could be expanded upon later and with a larger network now that we know how to test and troubleshoot with the XBees.

https://github.com/aliciablakey/speakerxbee.git

References

http://www.arduino.cc/en/Tutorial/PhysicalPixel

https://learn.adafruit.com/adafruit-arduino-lesson-10-making-sounds/playing-a-scale

Play simple melodies with an Arduino and a piezo buzzer

 

Explorations In Light Mapping

 

 

 

explorations

 

The Project:

This project is an interactive exploration of analog radio communication through the collaboration of multiple sensors communicating between XBee modules, creating an ambient experience using Processing. The intention is to create an expressive representation with formula-based shapes by integrating multiple nodes with light mapping. We chose concentric squares to conceptualize this perceptual field as incoming data values of light through networked modules. This interaction is transmitted through XBee radio transceivers and interpreted by Processing, creating a geometric array and sequencing a colour experience that reflects art, light, data, and cooperative interaction; it portrays the output of a network through a creative interface.

 


Ideation:

As a group, our main goal was to create a collaborative ambient experience. Our ideas gravitated towards active, body-focused experiences. The initial idea we arrived at was to use temperature sensors to create a heat map of everyone’s body temperature in the room. We tested the TMP36 temperature sensor, which can detect a wide range of temperatures (-40°C to +125°C), but we noticed that the differences between our body temperatures were so small as to be barely noticeable.

Our ideation continued forward using the inspiration of a map as our guidance. We discussed the possibilities of mapping different parts of a room through different inputs. The group consensus was to use photoresistors to map the light in the room. Mapping the light could be taken on in many different ways: we could place XBees in either active/inactive areas or place the radios in the various window sills to map out the light of the building. Eventually, we decided that we wanted an active participatory experience over an ambient passive project.

Creating the participatory experience was interesting, as everyone had distinct images of how the differing light values should be reflected on the screen. The first discussion was informed by our conversations about mapping: could the Processing screen be split into a grid reflecting where each person was in the room? We decided that this idea was a bit too rigid and would not adapt easily to a varying number of XBees. Through this discussion we realized that we wanted a project that was collaborative, generative, and not restricted by numbers.

We came to the consensus of displaying the data in concentric squares. The squares would generate based on how many radios we had connected, and the inner values of the squares would be mapped to the values of the incoming light sensors. Each person participating would have a square that their radio would control. We liked this idea because the shapes of the concentric squares had connotations to optical illusions. It was playful, adaptable, and generative so we went forth with this idea.
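The sizing rule described above, where the squares generate based on how many radios are connected, can be sketched as follows. The helper name and the evenly-spaced layout are illustrative assumptions, not the project’s actual Processing code.

```cpp
#include <cassert>

// Side length of the i-th concentric square (i = 0 is the outermost)
// when n radios share a canvas of the given width. Squares shrink in
// equal steps so that each participant controls one ring.
float squareSide(int i, int n, float canvasWidth) {
    return canvasWidth * (n - i) / n;
}
```

With four radios on an 800 px canvas the sides come out to 800, 600, 400, and 200 px; adding a fifth radio simply re-divides the canvas into five steps, which is what makes the layout adaptable.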

Parallel to this decision, our group was still considering the concept of a light map. We realized that the code for mapping the light to the concentric squares would be almost identical to the code for mapping light in the room. We decided that if we had time, we could create a floor map of the Digital Futures lounge in Processing and map the light in the different areas.

Our group decided that these two ideas were manageable, interesting, and could be played with in many different ways depending on how many radios were connected or how many people were actively participating at any given time. Both ideas had variables with the opportunity to change and produce new data visualizations.


 

Design + Testing:

Through experimentation, we designed three generative outputs that all use photoresistors to control light or colour values on the screen. The first is a set of concentric squares mapped to the analog value by changing each square’s displayed gray-scale (lower values are closer to black, higher values closer to white). The second is the same concentric squares, but each square is assigned a different hue that lightens or darkens to reflect the analog value of its individual sensor, rather than solely black-and-white values. For the third experiment, we mapped out a floor plan of the Digital Futures lounge, where the team envisioned placing the XBee radios in different locations to get a “light” map that would change throughout the day.
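The gray-scale mapping in the first experiment boils down to rescaling the 10-bit analog reading to an 8-bit fill value, roughly as follows. This is a sketch; the actual code presumably uses Arduino’s or Processing’s `map()` function.

```cpp
#include <cassert>

// Rescale a 10-bit photoresistor reading (0-1023) to an 8-bit gray
// value (0-255): low light maps toward black, bright light toward
// white. The hue variant applies the same scaling to the lightness
// of a fixed hue instead of to a gray fill.
int toGray(int analogValue) {
    return (int)(analogValue * 255L / 1023);
}
```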

The responsive concentric square layout was designed to be collaborative. We designed the Processing side of the project to, in theory, take in as many radios as possible to contribute to the generative scene. A new radio could be added, and the squares would adjust in size to accommodate. The concentric squares allow a group to playfully collaborate to create different illusions. Each operator of a photoresistor would be able to alter the values by changing the gray-scale or the hue of the squares to create different compositions as a team. Something to note is that we did not label the squares according to the participant’s individual radios.

This allows for investigation and communication amongst those participating in the group. Through our own testing, it was interesting to try to create different depth illusions by controlling the value of the photoresistor. Since the sensor is quite sensitive to any amount of light in a room, we had a little trouble achieving a darker colour for the low values, so the composition is more often gray when the room lights are on. The ideal setup for this experiment would therefore be a dark room, where individuals can control the light values with a smartphone flashlight or by fully enclosing the sensor with their hands.

The intention of the floor-plan design was to place the radios around the room and remove the active participation of the group. The light sensors would calibrate to the spaces where they were placed and would reflect changes or movement within the space. This would allow for an ambient light map of the space that changes with the different parts of the day, such as when the automatic lights turn on or off, when the sun sets, or when someone in the room blocks the light directly. This passive design has connotations of surveillance: the photoresistors act as light monitors of the space, and if someone were to come in the middle of the night and turn on the lights, the sensors would automatically and wirelessly react to the sudden change. Overall, we did not have time to fully test this idea. As an early iteration of this experiment, we reused the mapping of values from Arduino to Processing from the concentric-squares code. We also designed the floor plan in Processing and tested it similarly to the previous experiments. We hope to be able to test this one at some point to see the results.

The Hardware and Circuit

[Image: screenshot of the hardware and circuit]

The Software:

We gained some valuable insights through the process of testing and connecting the nodes and XBees to the network. Along the way we solved several communication issues between the XBees, such as removing the RX and TX connections before uploading code (otherwise Arduino gives an upload error). If you find yourself in a scenario where your XBee just won’t communicate, try resetting it to the default values.

Although many of us had worked with photoresistors in previous projects, we learned that calibrating the sensor is important to ensure it does not produce values that are negative or beyond the maximum set in the Arduino code. In creating the code for this project, our initial goal was a dynamic network in Processing, a process by which an unlimited number of radio nodes could be networked dynamically as they were detected. Due to time constraints we opted instead, for this iteration, to program a static array of radio nodes into Processing, one for each node present at the presentation. In the next iteration we intend to make the code dynamic, so a potentially endless number of XBees can be automatically detected and added to the visualization.
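The calibration point above amounts to mapping the photoresistor’s observed range onto the expected one and clamping the result. A minimal sketch, with illustrative bounds:

```cpp
#include <cassert>

// Map a raw reading from the observed range [minRaw, maxRaw] onto
// 0-255, clamping so the value can never go negative or past the
// maximum, which are the two failure cases noted above.
int calibrate(int raw, int minRaw, int maxRaw) {
    long v = (long)(raw - minRaw) * 255 / (maxRaw - minRaw);
    if (v < 0) v = 0;
    if (v > 255) v = 255;
    return (int)v;
}
```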

Conclusion:

Demonstrating our light mapping project was highly successful. The outcome was a fun collaboration that invited game-like interaction, stemming from the curiosity of trying to discover which square of the concentric grid your XBee controls.

As a future iteration, were this project to be expanded upon, we wanted to explore a meshed node-network theremin that would create reflective colours represented through values from distance sensors. This project taught us about the Processing code and how a larger, more complex network might be developed, with ZigBee mesh networking as a possible next step; the exploration of a small network of radio communication proved successful. It also taught us many things about working with XBees: XBees cannot process the data they receive or send on their own, but they can communicate with intelligent devices via the serial interface.

https://github.com/npyalex/Ubiquitous-Connectivity-Project.git

References:

https://www.digi.com/resources/documentation/digidocs/pdfs/90000976.pdf

http://www.ardumotive.com/how-to-use-xbee-modules-as-transmitter–receiver-en.html

https://core-electronics.com.au/tutorials/how-to-network-xbee-and-arduino.html

 

 

 

 

Experiment 1: XBee Talking

In this experiment, we explored the XBee communication protocols. The XBee is a wireless radio device that can communicate with other XBees, with or without microcontrollers.

Mind Map

You can download the entire mind map as a PDF from the image above.

The above is the basic ecosystem of how the XBee works, based on Kate Hartman’s slides and the Digi.com website.

The Main Concepts:

  • The basic idea is the same as a walkie-talkie: each device can both send and receive signals; they are transceivers.
  • The device runs on 3.3V; a higher voltage can damage it.
  • The XBee is configured, not programmed: you cannot add new commands to it, only configure the properties set out in the datasheet.
  • The pin spacing is different from a breadboard’s, so you have to use a breakout adapter.
  • It operates in two modes, transparent vs. command: transparent is the data mode for sending and receiving, while command mode, entered with +++, is where you configure the device.
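For reference, a typical command-mode session looks like the transcript below, typed into a serial terminal such as CoolTerm. The ID and address values are examples only.

```
+++            (pause ~1 s before and after; the radio replies OK)
ATID 3001      (PAN ID - must match on both radios)
ATMY 1         (this radio's own address)
ATDL 2         (destination: the other radio's ATMY)
ATWR           (write the settings to flash)
ATCN           (leave command mode, back to transparent mode)
```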

The Experiment 01

For this experiment, we had to receive a signal from another XBee in the form of a pulse: “H” would turn the pin on (HIGH) and “L” would turn the pin off (LOW).
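The receiving logic is essentially the Arduino Physical Pixel example; stripped of the hardware calls it reduces to the helper below. The name is illustrative, and on the real circuit the byte would come from `Serial1.read()`.

```cpp
#include <cassert>

// 'H' drives the output pin HIGH, 'L' drives it LOW; any other
// byte leaves the pin in its current state.
bool pinStateFor(char incoming, bool current) {
    if (incoming == 'H') return true;   // digitalWrite(pin, HIGH)
    if (incoming == 'L') return false;  // digitalWrite(pin, LOW)
    return current;
}
```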

My idea was to use the signal to drive a fan at the bottom of a tube and levitate a ball. I got the pieces together, but the fan was not strong enough to lift the ball into the air.


What can I do to change this up?
I could get this to work with a MOSFET, a higher-powered fan, and a lighter ball, but I did not have time to go back, get the parts, and change my design.


I went back to the drawing board and went with the simplest solution: programming a buzzer to receive the signal and create a simple pulse.


GitHub:

https://github.com/imaginere/Ubiquitous-Computing

References:

Digi.Com: https://www.digi.com/resources/documentation/Digidocs/90001456-13/Default.htm

A “Silent” Alarm / An XBee x Arduino Process Journal


This little device is a simple desk companion that acts as a visual silent alarm for the hearing impaired. In theory, it would react to a doorbell and spin, indicating that someone was at the door or that mail had been left. Alternatively, it acts as a distracting toy that reacts to notifications.

XBee Chat Exercise:

During the chat exercise, my partner and I didn’t have much trouble configuring our XBees and were soon able to send messages back and forth. However, we didn’t really do much chatting, as we noticed we weren’t able to send long messages because packets would get dropped: one of us would send “Hello” and the other would receive “Hell”. It was interesting to see how this led to funny miscommunications, and it brought me to the conclusion that the XBee isn’t really a communication device in the traditional sense of the word; I would have to think of communication beyond words. We found that the most effective approach was to send single characters.

XBee Radio Receiver / Transmitter:

Tip: Use the edge of the breakout board, not the XBee, to determine where to put the power, TX, and RX pins for the XBee connection.

While testing the XBee with the Arduino Physical Pixel example, I was able to control an LED bulb using the serial monitor. However, when trying to control the Arduino x XBee setup from another XBee, we ran into some issues: we were only able to achieve one-way communication. My LED would light up or go off when receiving signals from my partner’s XBee, but I was not able to light their LED using my radio. This was also happening with another group.

Troubleshooting (tested with three different partners, as the one-way communication issue occurred each time I tried to connect to a new XBee)

We noticed that:

  1. The radios would work when both were configured on the same laptop.
  2. (Best troubleshooting method) The radios would work after sending messages back and forth over chat.
  3. The radios would work when brought closer together.

XBee Metronome Receiver

For my device’s design, I was thinking of the instructions we got in class: to design an interaction that would take place around 20 or more other devices. Initially I had wanted to use sound as an output, but I figured something more visual would be a better choice, as it would still be noticeable around other devices reacting to the metronome. Removing sound from the equation and focusing mainly on visuals made me think of the hearing impaired, and I thought, “What if you could have a tiny visual desk alarm that spins when someone rings your doorbell?” I also wanted to learn how to work with a servo, as I had never used one before.

Issues:

When conceptualizing my design, I had envisioned a rotating cylinder or disk inspired by spinning tops and wheels; however, I realized that the micro-servo can only make 180-degree rotations, not the 360-degree rotations I had imagined. I didn’t have the knowledge to hack my servo or the time to get another one, so I improvised the rotations to still create an optical-illusion effect. Below are some images from my prototyping.

Future expansion:

I would like to continue to explore making desktop companions thinking along the themes of accessibility and self-care toys. I’d also like to work with more servos.

GitHub link to code: here

Process Journal #1: XBee + Metronomes

XBee Metronome
Process Journal #1
Olivia Prior


 

In this class, our assignment was to explore XBee radio communication through the concept of high (“H”) and low (“L”) commands as incoming data. We were each given an XBee radio and an XBee breakout board to experiment with transmitting and receiving radio commands. Our final assignment was to set up our XBee radios to control an actuator of our choice on an Arduino, responding to a “metronome” count transmitted from an accompanying XBee.

Part 1: Testing with the XBee Radios

Step 1: Internally sending and receiving

The first step was to connect my XBee radio to CoolTerm to test whether my Arduino Micro would respond to high and low commands. I opened CoolTerm, connected to the serial port my Arduino was hooked up to, and tested typing “H” and “L” commands to turn an LED on and off. This worked without a problem.

Using CoolTerm as serial communication, I typed “H” to turn the LED on and “L” to turn it off.
Step 2: Connecting to a friend
Upon initial testing, many students in the class had issues connecting to another XBee radio. I paired with a classmate in the studio; we set our ATIDs to the same PAN ID, and I changed my ATDL to match their ATMY. At first my LED was not responding immediately, so we were unsure whether the connection was working. We then realized there was a small lag in the communication, but were satisfied knowing that we could transmit and receive commands to each other’s Arduinos.
To ensure that this was completely wireless, I changed the Serial library in the code to use “Serial1”, allowing my Arduino to be disconnected from the machine.

A classmate and I testing out our transmitting and receiving with each other. They are transmitting “H” and “L” to control the LED light on the Arduino. 

 Step 3: Playing with new actuators
I removed the single LED and connected a small strip of addressable LEDs to my circuit. I jumpered 5V to one side of the circuit so as not to overpower the XBee breakout board on the other side, which requires 3.3V. Using the same “H” and “L” commands as before, I turned the strip on and off, using the Adafruit NeoPixel library to control the LEDs.

Turning the LED lights on and off using “H” and “L” commands. 

Step 4: Changing the code

I was inspired by the realization that Philips Hue lights use a type of radio communication for control. I have a few in my own possession and wanted to see if I could create a very simplified version. I copied the “H” and “L” command code, but rather than simply turning the lights on and off, I used different keys to change the colour of the strip. In the video below, I use the commands “R” for red, “G” for green, “B” for blue, “P” for pink, “Y” for yellow, “O” for orange, and “W” for white.

Creating a simplified version of Philips Hue lights by transmitting simple letter commands to control the hue of the LED strip.
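A sketch of the key-to-colour lookup this describes. The RGB values here are my guesses at reasonable colours, not the values in the actual sketch, where something like the NeoPixel library’s `strip.setPixelColor()` would apply the result.

```cpp
#include <cassert>

struct Rgb { int r, g, b; };

// Map a single-letter command to an RGB colour for the LED strip.
// Unknown commands turn the strip off.
Rgb colourFor(char command) {
    switch (command) {
        case 'R': return {255, 0, 0};      // red
        case 'G': return {0, 255, 0};      // green
        case 'B': return {0, 0, 255};      // blue
        case 'P': return {255, 105, 180};  // pink
        case 'Y': return {255, 255, 0};    // yellow
        case 'O': return {255, 165, 0};    // orange
        case 'W': return {255, 255, 255};  // white
        default:  return {0, 0, 0};        // off
    }
}
```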

Part 1 overall outcome

At this point in my radio experimentation journey, the most exciting part is the ability to control one or multiple other devices by transmitting commands. The mock “Philips Hue” test has me feeling quite inspired to create small bespoke lamps for my own bedroom.

When testing by myself, the simple commands for turning actuators on and off do not feel that different from what I can already do with an Arduino alone.

When testing with others, I found it interesting to see the differences in lag depending on the radio. The classmate I was testing with had about a second of delay on their commands, which led to confusion when attempting a proof of concept. The lag only went one way: when I sent commands, their LED would turn on and off almost instantly. I wonder if this has anything to do with the machine speed on either side.

Part 2: Metronome

For this part of the experiment, I wanted to count how many beats per minute the metronome was outputting. I decided on this after choosing my output, an LCD display.

Step 1: Choosing an output

For this experiment, I rifled through my toolkit to see what I had that would be new and interesting to play with. I found an LCD display that I had inherited from my dad’s electronics supply and decided to use it.

LCD LiquidDisplay

I found readily available documentation for the LCD screen on AdaFruit. I followed the tutorial and connected the screen to my Arduino Uno. I was able to get a “hello world” screen up and counting the seconds that have passed.

LCD screen displaying the LiquidCrystal example which prints out “hello, world!” and on the second line counts how many seconds the program has been on.

Step 2: Connecting the XBee and LED lights

I then moved the connections to my Arduino Micro, using the same code from the initial experimentation that turned the addressable LEDs on and off. Rather than switching the LEDs fully on and off, though, I changed their brightness: turning them fully on and off drained too much power from the circuit and caused the LCD screen to flicker. As well, on the high command I printed “high” to the screen, and on the low command “low”.

LCD screen connected to CoolTerm, and receiving “High” and “Low” commands to change the input of the screen, and as well the brightness of the LED strip. 

Step 3: Counting the beats

Because I was testing this at home, I wrote a pseudo metronome in my code to mimic different speeds of a count.

Mock metronome for testing purposes

I would change the value of the metronome to get varying results. I counted the milliseconds between beats, divided 60000 by the result, and multiplied by two to take the offbeat into account. I printed this count to the LCD screen.

LCD screen displaying a rudimentary BPM of the program
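As I read the arithmetic described above, it reduces to the following. The helper name is illustrative; on the Arduino the interval would be measured with `millis()` between received “H” commands.

```cpp
#include <cassert>

// BPM from the milliseconds between successive 'H' commands.
// 60000 ms per minute divided by the interval gives counts per
// minute; doubling accounts for the offbeat ('L') in between.
int bpmFromInterval(unsigned long msBetweenBeats) {
    if (msBetweenBeats == 0) return 0;  // guard against divide-by-zero
    return (int)(60000UL / msBetweenBeats * 2);
}
```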

Step 4: What are you dancing to?

I took the BPM of the mock metronome and created statements that print out what type of music correlates to the detected BPM. If the BPM is lower than 60, the message “slow is also good” comes up; if it is higher than 200, a warning saying “Don’t dance too fast!” appears.

 

LCD screen showing the bpm of the program & what type of dance music is associated with that beat.
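A sketch of the message lookup. Only the under-60 and over-200 messages come from the description above; the genre cutoffs in between are placeholder assumptions based loosely on the tempo chart linked in the references.

```cpp
#include <cassert>

// Pick a message for the LCD based on the measured BPM. The edge
// messages match the ones described above; the genre ranges in
// between are illustrative assumptions, not the actual code's.
const char* danceMessage(int bpm) {
    if (bpm < 60)  return "slow is also good";
    if (bpm > 200) return "Don't dance too fast!";
    if (bpm < 90)  return "hip hop?";        // assumed range
    if (bpm < 130) return "house?";          // assumed range
    if (bpm < 160) return "dubstep?";        // assumed range
    return "drum and bass?";                 // assumed range
}
```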

The one bug I have yet to fix is that for the first couple of counts the program detects a BPM of half the true value. It quickly fixes itself upon the second round of metronome counts.

Part 2 overall

I found this program interesting because it was not simply turning something “on” or “off”; it dynamically takes in data and displays it accordingly.

Testing the screen with the real metronome did not give data as specific as I expected. The metronome’s range was 100-5000 milliseconds, while my code is optimized for a range of 200-1500 milliseconds. This is not necessarily a bad thing, as it requires someone to move slowly through the range rather than just cranking the potentiometer in either direction.

Overall the experiment was an interesting exercise in communication. It was interesting to see what other folks did with the metronome data in the class. Some interpreted the data to count every fourth beat, while others used the data to control servo-motors. The work altogether was interesting to see, and because of the nature of the rhythm of the metronome, it seemed to all connect in a collage-like way.

Liquid Crystal: https://www.arduino.cc/en/Reference/LiquidCrystal

Tutorial for LCD screen: https://learn.adafruit.com/adafruit-arduino-lesson-11-lcd-displays-1

Different music types: https://accusonus.com/sites/default/files/media/blog/Electronic_music_tempo-.pdf

 

Papillon

Project Name: Papillon

Project Members: Omid Ettehadi

Code: https://webspace.ocad.ca/~3170557/UbiquitousComputing/Week1/Papillon.rar

Process Journal:

The XBee Chat Exercise was a fascinating opportunity to explore the XBee devices and to test out the capabilities of the XBee and the CoolTerm Software. Wireless communication in its simplest form can lead to a much simpler connection between different devices in a project.

While doing the exercise, I could not stop thinking about how this system could replace the complicated, internet-based systems I had used in my previous projects, and how much more reliable things would become with a much shorter lag time.

To make a metronome receiver device, I decided to start with a simple example first, to test the limits of the XBee and find out the range and the kind of packaging I could use in my project. I ran the “Physical Pixel” example and placed the device in different locations to see how different materials affect the range.

For my next step, I changed the Serial in the example to Serial1 and used another XBee with the CoolTerm software to send data to the device.

For this project, I wanted to play with something that I hadn’t had the chance to play with before. I decided to create a simple piano by playing a different frequency for each note, allowing the metronome device to set the tempo for the music.

To create the different frequencies, I used the “Melody” example: I turned the speaker on, waited for the note’s period, then turned the speaker off and waited for the note’s period again, repeating the procedure 20 times so that the note is produced. For the music, I chose the soundtrack of the movie Papillon and wrote down the notes of the song in an array. Every time the device receives an H or an L, it plays the next note from the array.
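A sketch of the square-wave timing math, with an illustrative helper name; on the Arduino the waiting would be done with `delayMicroseconds()`. Note that toggling every half period yields the target frequency, whereas waiting a full period per half-cycle, as literally described above, would sound an octave lower.

```cpp
#include <cassert>

// Half of a note's period in microseconds: the time to hold the
// speaker pin HIGH (then LOW) to produce the given frequency.
long halfPeriodMicros(int frequencyHz) {
    return 1000000L / frequencyHz / 2;
}
```

Repeating the HIGH/LOW pair 20 times, as described, sounds the note for 20 full cycles, so higher notes come out proportionally shorter.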

To add more to the project, I added a servo motor installed on a drum piece, so that after every 4 notes, the drum will make a sound as well, keeping the beat of the music.



Component List:

[Image: component list]

Final Schematic:

[Image: final breadboard schematic]