Moon Gaze


A wearable interactive partner finder. A project by Yiyi Shao and Finlay Braithwaite for DIGF6037, Creation and Computation, Digital Futures, OCAD University.

人有悲欢离合,月有阴晴圆缺,此事古难全。但愿人长久,千里共婵娟 — 苏轼 《水调歌头》

Translation: The moon does wax, the moon does wane, and so men meet and say goodbye. May we all be blessed with longevity; though far apart, we are still able to share the beauty of the moon together.

Our idea was inspired by the well-known classical Chinese poem Shui Diao Ge Tou by Su Shi (also known as Su Tungpo, 1037–1101). The poem describes the poet travelling long distances and missing his family. The moon is a comfort to him because, no matter how far apart people are, they still watch and share the beauty of the same moon. Chinese people carry on this tradition as part of the Mid-Autumn Festival.

Moon Gaze is a contemporary take on this desire for long-distance connectedness through facing one another. With Moon Gaze, one can find, face, and connect with a partner regardless of physical distance.

Moon Gaze creators Yiyi Shao and Finlay Braithwaite face one another at a short distance.

User Experience

With a concept taking shape, we began thinking about how we could create a meaningful experience for users. A primary consideration was making the experience as natural as possible, requiring little to no input from users.
We focused on the experience of two partners facing one another, albeit at a great distance. One output of our system would allow users to determine whether they were facing one another. Interacting with this output would also let a user home in on their partner’s bearing. We also saw meaningful interaction in knowing whether your partner was facing you. This feedback would be simpler, only revealing whether your partner was facing towards you or not.

With Moon Gaze’s summary objective of partners facing one another, we resolved that the interaction should be connected to the users’ bodies. The interaction would not be with a screen or input device; rather, it would respond to the users’ movements. With this in mind, we proceeded to develop Moon Gaze as a wearable interactive technology.

In making this determination, we also investigated Moon Gaze as an object detached from the body. We thought it could be nice to have a small arrow that pointed towards your partner. We also flirted with the idea of implementing Moon Gaze in a chair, a middle ground between object and wearable.

Proceeding as a wearable, we weighed our options for physical notification. We went back and forth between using a vibration motor or an LED. We felt a vibration motor would be discreet for the user and create an intimate haptic connection between partners. However, we felt that an LED would make for a stronger presentation of our first iteration, and we felt confident in our ability to implement and finesse the behaviour of LEDs.

Moving forward with LEDs, we concluded that the blink rate of LEDs in the wearable would help orient partners to face one another. The faster the blinking, the closer the match. While we felt it could be interesting to know which direction, right or left, you needed to turn in order to find your partner, the math in determining left and right in a 360 degree bearing system was beyond the scope of the first iteration.
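As a sketch of how such a mapping might work, consider the following (in JavaScript for brevity; the interval bounds and the linear mapping are our illustration here, not the exact values used in Moon Gaze):

```javascript
// Hypothetical mapping from angular difference (0-180 degrees) to LED blink
// interval: the closer the match, the faster the blink. The millisecond
// bounds below are assumptions for illustration.
const FASTEST_MS = 100;  // blink interval at a perfect bearing match
const SLOWEST_MS = 2000; // blink interval when facing directly away

function blinkInterval(diffDegrees) {
  // Linear interpolation between the two bounds.
  const t = diffDegrees / 180;
  return Math.round(FASTEST_MS + t * (SLOWEST_MS - FASTEST_MS));
}

console.log(blinkInterval(0));   // 100 - fastest blink, bearings match
console.log(blinkInterval(180)); // 2000 - slowest blink, facing away
```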

A second set of LEDs would turn on or off depending on whether your partner is facing you.

Design

Physical

Design Sketches

To match our concept, we attempted to get as close as possible to the theme of the moon. In deciding to make wearables, we faced a lot of options. There are so many possibilities for wearables; it could be a hat, a t-shirt, a jumper, a pair of trousers. Which one is better? Which kind of material would be the best to play with? What colour should we choose? Where should we put our LED moon? Where should we put our hardware? Imagining, designing, and creating our wearables was a significant portion of the overall project.

Sweaters on the rack at Uniqlo.

We chose to hack knit sweaters for our project, as they would allow us to weave a string of LEDs into the knit itself. We went to Uniqlo to find suitable sweaters. In the men’s clothing section we found dark grey knits that perfectly mimic the dark night sky, a perfect backdrop for our moons.

LED string woven into sweater.

We began by embedding and sewing the LED string from the sweater’s inside. However, bare LEDs on the outside of the sweater didn’t look good, and the light appeared harsh without diffusion. After ruminating on this for some time, we designed a fabric cover for the LEDs, but it still didn’t have the look we wanted. Then we found a bag of feathers in the Digital Futures studio space, and they immediately found a home in our design. After playing around with them for a while, feathers proved a perfect material for diffusing the light!

LED feather covering.

Tip: Adding a fabric backing helped cover the wires, making the sweater more comfortable to wear. It also keeps the LED string from snagging when the user removes the sweater.

LED protective backing.
Moon Gaze wearables ready for presentation.

We wanted the LEDs to be highly visible and felt that individual LEDs would be difficult to implement. At this point we weren’t up for the challenge of working with NeoPixels, although they are the natural progression for our next iteration. We took a trip to Michaels and found a fancy LED string designed to decorate Christmas trees. We checked the voltage of the string (4.5V) and found it would work with the Feather M0 board if we drew 5V power from the USB pin. We cut the built-in battery pack off the LED string to gain access to the string’s positive and negative leads. As we were using 5V from the USB pin, we switched the LEDs using transistors connected to a digital output pin on the Feather.

Coding

Code on GitHub

The main coding challenge we faced was determining a vector between two locations on Earth. In investigating geographic calculations, we determined that we needed to resolve a rhumb line bearing between the two coordinates. A rhumb line bearing is a single, constant bearing that will take you to a destination on Earth; on a flat map, a rhumb line appears as a straight line. However, the Earth is not flat like a map, so a rhumb line is not the shortest path to a destination. If you consider the Earth as a sphere, the shortest distance between two points is a ‘great circle’ route, and the haversine formula allows you to calculate the route between two points on a sphere. Despite the benefit of being shorter, a great circle route cannot be described by a single bearing: your compass bearing changes over the course of the route. Sailors often used rhumb lines for that very reason; with a rhumb line you can maintain a constant compass bearing for the entire voyage (‘Keep north on your left and sail till dawn!’). We resolved to use a rhumb line to calculate a single bearing between Moon Gaze partners.

The math for calculating a rhumb line bearing is the stuff of geographical math textbooks and fairly easy to find on the internet if you know what you’re looking for. We found a goldmine in Chris Veness’s ‘Movable Type’ site, Calculate distance, bearing and more between Latitude/Longitude points. It is a live portfolio piece demonstrating his capacity for interpreting complex systems and creating interactive online tools to study them. The material on the site is free to use as long as his copyright is cited in any resulting works.

In addition to providing an interface to calculate rhumb line bearings, Veness breaks down both the math and the JavaScript code.

Rhumb line bearing calculations from ‘Movable Type Scripts’ © 2002-2017 Chris Veness

While all freely available, it was a formidable challenge as relatively inexperienced coders to make sense of the math, evaluate the code, and translate it for the Arduino IDE’s C/C++ compiler. Here’s our translation:

// rLAT1, rLAT2, and deltaLONG are the two latitudes and the longitude
// difference, already converted to radians.
double rhumbDIFF = log(abs(tan(rLAT2 / 2 + M_PI / 4) / tan(rLAT1 / 2 + M_PI / 4)));
double rhumbBRNGrad = atan2(deltaLONG, rhumbDIFF); // bearing in radians
double rhumbBRNGdeg = rhumbBRNGrad * 180.0 / M_PI; // convert to degrees
headingToPartner = fmod((rhumbBRNGdeg + 360), 360); // wrap into 0-360

Our favourite part of this code is the final modulus operation, which corrects for a negative bearing. It was an effective introduction to the power of modulus for placing a value within a range.

Not knowing what our GPS data would look like, we also created functions to convert between degrees.minutes.seconds, decimal degrees, and radians (required for the rhumb bearing calculation). This math is less specialized and available from a number of sources; ours was Steven Dutch’s site Converting UTM to Latitude and Longitude (Or Vice Versa).
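A minimal JavaScript sketch of those conversions (the function names here are illustrative, not the names used in our Arduino code):

```javascript
// Degrees.minutes.seconds -> decimal degrees. The sign of the degrees
// argument carries the hemisphere (negative for S latitude / W longitude).
function dmsToDecimal(degrees, minutes, seconds) {
  const sign = degrees < 0 ? -1 : 1;
  return sign * (Math.abs(degrees) + minutes / 60 + seconds / 3600);
}

// Decimal degrees -> radians, as required by the rhumb bearing calculation.
function degToRad(degrees) {
  return degrees * Math.PI / 180;
}

// Toronto's latitude, roughly 43 degrees 39 minutes north:
const lat = dmsToDecimal(43, 39, 0); // 43.65
const rLAT = degToRad(lat);          // ~0.7618 radians
```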

With the rhumb line bearing calculation in hand, we created functions to compare the local heading to the bearing pointing towards the remote partner. We wanted to determine the absolute difference between these values: the lower the difference, the faster the LED blink rate. The main challenge here is the 360-degree system of headings and bearings, where 360 degrees equals 0 degrees. With this in mind, a heading of 10 degrees and a bearing of 350 degrees are only 20 degrees apart, yet with simple math they are 330 degrees apart. Both statements are true, but if you move from 10 degrees to 350 by rotating 330, you’re going the long way around the circle. With this logic in mind, we created a correction for when the absolute difference between heading and bearing is greater than 180 degrees (the long way around the circle).

double blinkCalc = abs(bearingToPartner - localHeadingDegrees);
if (blinkCalc > 180) { blinkCalc = 360 - blinkCalc; } // take the short way around
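Translated to standalone JavaScript for a quick sanity check, the correction behaves as described: a heading of 10 degrees against a bearing of 350 degrees yields 20 degrees, not 330.

```javascript
// Shortest angular difference between a bearing and a heading, in degrees.
// Differences over 180 would go the long way around the circle, so they
// are folded back, mirroring the Arduino correction above.
function angularDifference(bearing, heading) {
  let diff = Math.abs(bearing - heading);
  if (diff > 180) diff = 360 - diff;
  return diff;
}

console.log(angularDifference(350, 10)); // 20, not 330
```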

The next challenge was connecting our paired devices to one another over the internet. For this we employed PubNub (pubnub.com) to publish live data from each device. PubNub’s history function was crucial, allowing interaction even when both devices were not online at the same time, as it stores the last published messages for each channel in a buffer.

For our interaction, each device publishes its local GPS coordinates as well as a local match status boolean describing whether its user is facing the other. Publishing was fairly straightforward using PubNub’s Arduino API code as massaged by our professors Nick Puckett and Kate Hartman.
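For illustration, the published message might look something like the following; the field names here are hypothetical, not the actual keys used in our sketch:

```javascript
// Illustrative shape of the message each device publishes on its PubNub
// channel. Field names are our own for illustration only.
const message = {
  lat: 43.6532,        // last GPS latitude, decimal degrees
  lon: -79.3832,       // last GPS longitude, decimal degrees
  facingPartner: true  // local match status: is this user facing the other?
};

// Serialized for publishing:
const payload = JSON.stringify(message);
```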

Appreciating that a local user might not have a GPS fix during a session, we created a ‘self read’ function that pulls back the last known GPS coordinates of the device for use in the bearing calculations. If a GPS fix is achieved, the current data is published and used in updated bearing calculations.

A key challenge was the ‘blocking’ element of the Arduino PubNub code. While reading data from a subscribed PubNub channel, everything else in the loop function temporarily stops, including sensing magnetometer data and blinking our LEDs. This behaviour is so irksome that we are looking into other data publishing solutions for our second iteration, including using the Particle Photon instead of the Adafruit Feather M0.

Another requirement of our coding was integrating the Adafruit GPS and SparkFun magnetometer (compass) sensors. Fortunately, both companies provide expansive documentation with example code, making the integration of these devices fairly straightforward.

Privacy and Safety

It would be reckless to publish one’s GPS coordinates without being sure of the identity and motives of that data’s audience. In future versions of this project, all bearing calculations would happen server-side, negating the need to share coordinates between users.

Electronics

Moon Gaze provided an exciting opportunity to explore GPS and magnetometer sensors, which neither of us had used before.

Moon Gaze electronics overview.
Adafruit Ultimate GPS Featherwing

The FeatherWing made for easy integration with our Adafruit Feather M0 microcontroller; it is designed to stack on top of the Feather, connecting directly to the default pins.

The GPS unit connects via a serial connection, and the example code makes it easy to pull a GPS coordinate from the device. However, we struggled most with getting a GPS fix in the downtown Toronto core and, as mentioned, adapted our code to operate without a fix by pulling the last known coordinates from PubNub on boot.

Link: Adafruit Ultimate GPS Featherwing

SparkFun HMC5883L Magnetometer

This little sensor board provides the local compass heading required to determine if you are indeed facing your partner. This device connects via I2C and the provided documentation and examples made this fairly straightforward to integrate.

Link: SparkFun HMC5883L Magnetometer

Future Developments

We see a lot of potential for future iterations of this project.

As a two-week crash project, we’re incredibly happy with our results, but are already planning next steps based on our team’s observations as well as feedback from our colleagues.

The first refinements would come in developing the physical wearable further. Scaling the electronics down and housing them in a wearable enclosure would make Moon Gaze a more natural experience. We are open to exploring different types of wearable notifications for Moon Gaze. We are interested in testing haptic feedback such as vibration motors for a future iteration.

Unhappy with the problems inherent in PubNub’s Arduino implementation, we are looking at other data publish/subscribe options. Adafruit IO tops the list, as it could be more tightly integrated with the Feather M0. Another alternative would be moving to the Particle Photon, which features built-in network functions for device-to-device variable sharing.

Moon Gaze could be further developed by adding a web component that displays the connections for each relationship. On a meta level, this could become an aesthetic data visualization of all the connections on Earth. As our cohort mentioned during our presentation, it could also be a useful product for finding friends at large events such as music festivals. We are mindful of our users’ privacy and will treat this aspect with great care in any future developments.

Video Demo

 

Harmonizer

 

Finlay Braithwaite


This project realizes an interactive audible tone and interval reference. Think of it as a modern-day pitch pipe that lets users explore complex chords instead of a single note. The Harmonizer can reproduce any frequency in the audible spectrum. With a fundamental ‘root’ frequency established, users can explore three additional harmonic intervals to build any four-part chord of their choosing. The Harmonizer allows for either equal or just intonation, illuminating a difficult-to-demonstrate yet fundamental aspect of music theory. Tactile controls allow modification of each voice’s volume, waveform, and pan. In addition, a filter can be applied to shape the tone of the Harmonizer’s output. The Harmonizer system is the combination of a web application and a USB peripheral.

Demo Video

Harmonizer DEMO

Code – Harmonizer v0.0.1

HTML

https://github.com/braithw8/OCAD_webspace/blob/master/Harmonizer/index.html

CSS

https://github.com/braithw8/OCAD_webspace/blob/master/Harmonizer/style.css

Code for Adafruit Feather M0

https://github.com/braithw8/OCAD_webspace/blob/master/Harmonizer/arduino/multiIO/multiIO.ino

p5.js JavaScript

https://github.com/braithw8/OCAD_webspace/blob/master/Harmonizer/sketch.js

Breadboard Layout

harmonizer_bb

Wiring Schematic

harmonizer_schem

Process Journal

As this experiment was a rare occasion to focus on oneself and make a device solely for one’s own use, I naturally wanted to create a sound peripheral and application of some sort. My artistic and professional background is in music and sound design as well as audio operations and engineering.

At first, I wanted to make a tool to analyze real-time acoustic information and create a fun yet useful visual display. I thought it would be interesting to design my own device that was equal parts sound pressure meter, real-time spectrum analyzer, and waveform monitor. I began investigating the feasibility of such a project: I would connect an Arduino-compatible microphone to the Adafruit Feather M0, and from there connect to a p5.js platform that would translate my incoming audio data into various visualizations. The beginning and end seemed very realistic, but I began worrying about the serial data connection between the Feather and p5.js. Our in-class examples ran at 9600 baud, but I felt I might need a much higher data rate. Furthermore, in probing the audio possibilities of p5.js with my laptop’s on-board microphone, I felt that an Arduino microphone was overly complicated on one hand and redundant on the other.

Moving on, I became intrigued with p5.sound’s sequencer functions, which allow a sequence of media files or musical notes to be triggered in a repeatable pattern. Following this exploration, I resolved to create a step sequencer. Users would be able to define the parameters of the sequence using a physical controller. A physical peripheral becomes so important in this context: a computer really only has two variable inputs that can be controlled simultaneously, the X and Y positions of the mouse. I wanted to open this up so one could physically control eight variables at once. To get the most mileage out of this, I thought of using potentiometers as my input. It appeared that the Feather would accommodate eight analog inputs, so I began dreaming about what I could do with eight pots. My first design had eight knobs controlling the pitch of eight notes in an eight-note sequence. If I could build that, I would think about adding additional functionality. This would resemble the Doepfer Musikelektronik ‘Dark Time’ analog sequencer (http://www.doepfer.de/Dark_Time_d.htm), where every note is set using a potentiometer.

The sequencer was a nightmare to work with in the context of my design priorities. Once the sequencer pattern was set, there didn’t seem to be a fluid way to insert new note pitches into the pattern from the potentiometer values. Also, as the sequence progressed, the CPU demand on the host browser escalated until the app choked. While this sometimes made an interesting or comical sound, it highlighted that the sequencer was turning oscillators on but not turning them off. Finding the p5.sound code slightly impenetrable with my coding experience, I did not find a workable solution to this problem.

With this left-turn in my design plans, I switched directions and began developing a tone and interval reference system. I’m fascinated by the concept of intonation: the division of the audible spectrum into musical notes. There are many different systems of musical tuning stemming from the world’s diverse musical cultures. The spectrum is first divided proportionally into octaves; each octave is a doubling in frequency. The difference between 240 Hz and 480 Hz, for example, is an octave. Within each octave we can apply simple fractions to derive tones and semitones. In western music, each octave is divided into 12 semitones. Simple fractions create intervals known as a third, fifth, seventh, and so on. This system is beautiful in its simplicity, and the tones it produces have a pure relationship with one another. This type of intonation is known as just intonation. While theoretically ideal, these ratio relationships between notes don’t transpose well: instruments tuned in just intonation can only play in one scale.

To get around this limitation of just intonation, a system of equal intonation was created. Equal intonation spaces each note equally, in frequency ratio, from its surrounding notes. It’s a beautiful system that allows an instrument to play music in every key. However, the simple relationships of just intonation are lost, making all combinations of notes slightly dissonant. It’s a tradeoff required to create instruments for all keys and, as a result, it is part of the foundation of western music. Interestingly, we’ve become so accustomed to the slightly dissonant intervals of equal intonation that music in just intonation can seem antiquated and quaint, medieval in quality.
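The difference is easy to quantify. For example (a quick JavaScript sketch), a just major third above A 440 Hz sits at a pure 5:4 ratio, while the equal-intonation third, four equal semitones up, lands about 14 cents sharp:

```javascript
// Comparing a just and an equal-intonation major third above A (440 Hz).
const root = 440;
const justThird  = root * 5 / 4;               // 550 Hz - pure 5:4 ratio
const equalThird = root * Math.pow(2, 4 / 12); // ~554.37 Hz - four equal semitones

// The gap in cents (1200 cents per octave); equal is ~14 cents sharp:
const cents = 1200 * Math.log2(equalThird / justThird);
console.log(justThird, equalThird.toFixed(2), cents.toFixed(1)); // 550 554.37 13.7
```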

Here’s a video that does well in demonstrating the differences between equal and just intonation. You can see the challenge in this demonstration as he has to manually create each individual frequency for each interval he creates.

https://www.youtube.com/watch?v=VRlp-OH0OEA

The concepts of intonation are hard to illustrate acoustically, as there are few instruments to demonstrate with. This inspired me to create such a reference, so that educators and scholars could explore complex intervals in both intonations.

My design was to create a four-oscillator tone generator that takes a fundamental frequency and oscillates at intervals related to that root. A user would set a four-part chord and then be able to switch intonations.

In my previous experiment, I explored the concept of a two-state application: ‘Frame It Up’ had a launch-screen state and a gameplay state. I wanted to take this further and create a dynamic multi-page experience for the user. I didn’t want the number of physical controls to limit the user’s exploration, so I endeavoured to create a six-page experience: four pages for the oscillators and their parameters, one page for effects, and one page for global parameters such as global intonation.

I first created three arrays of intervals: one for equal intonation, one for just intonation, and one with labels common to both. Each array entry contains the calculation that takes a root frequency and derives an interval. The arrays allow me to apply a frequency to each of my four oscillators.

var justScale = [1, 25/24, 9/8, 6/5, 5/4, 4/3, 45/32, 3/2, 8/5, 5/3, 9/5, 15/8, 2];

//calculation of just intonation. Each interval is a simple ratio to the root.

var equalScale = [1, Math.pow(2, 1/12), Math.pow(2, 2/12), Math.pow(2, 1/4), Math.pow(2, 1/3), Math.pow(2, 5/12), Math.pow(2, 1/2), Math.pow(2, 7/12), Math.pow(2, 2/3), Math.pow(2, 3/4), Math.pow(2, 5/6), Math.pow(2, 11/12), 2];

//calculation of equal intonation. Each jump from semitone to semitone is a calculated exponential increment.

var scaleLang = ['Unison', 'Minor\nSecond', 'Major\nSecond', 'Minor\nThird', 'Major\nThird', 'Fourth', 'Diminished\nFifth', 'Fifth', 'Minor\nSixth', 'Major\nSixth', 'Minor\nSeventh', 'Major\nSeventh', 'Octave'];

//names of intervals, regardless of intonation
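Applying these arrays is then a matter of multiplying a root frequency by the chosen entries. A small illustrative example (the chordFrequencies helper is my sketch here, not a function from the Harmonizer code; the arrays are repeated so the example runs on its own):

```javascript
// The interval arrays from above, reproduced so this sketch is self-contained.
var justScale  = [1, 25/24, 9/8, 6/5, 5/4, 4/3, 45/32, 3/2, 8/5, 5/3, 9/5, 15/8, 2];
var equalScale = [1, Math.pow(2, 1/12), Math.pow(2, 2/12), Math.pow(2, 1/4),
                  Math.pow(2, 1/3), Math.pow(2, 5/12), Math.pow(2, 1/2),
                  Math.pow(2, 7/12), Math.pow(2, 2/3), Math.pow(2, 3/4),
                  Math.pow(2, 5/6), Math.pow(2, 11/12), 2];

// Multiply a root frequency by the scale entries at the chosen indices.
function chordFrequencies(root, intervals, scale) {
  return intervals.map(i => root * scale[i]);
}

// Indices 0, 4, 7, 11 build a major-seventh chord in just intonation:
var chord = chordFrequencies(220, [0, 4, 7, 11], justScale);
// [220, 275, 330, 412.5] - root, major third, fifth, major seventh
```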

I thoroughly enjoyed writing the code for this project. I pushed myself to become more comfortable with functions and objects, and was able to make multipurpose functions that satisfy a number of scenarios without writing the same code over a dozen times. I feel I could take this further and make my code tighter. With smaller code, I’d get a better overview of its global logic while still being able to make quick changes with global effect.

The function I’m most proud of is latchKnob. With latchKnob, a potentiometer needs to move through the current value of a parameter before it takes control of that parameter. This makes the dynamic switching of pages possible; without it, all parameters on a page would snap to the current potentiometer values when the page loaded. latchValue is an absolute (+ or -) range that widens the function’s tolerance.

function latchKnob(value, input) {
  if (Math.abs(value - input) < latchValue) {
    return input;
  } else {
    return value;
  }
}
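A standalone sketch of the latching behaviour (latchValue is chosen arbitrarily here, and latchKnob is repeated so the example runs on its own):

```javascript
// How close the knob must get to the stored value before it latches.
// This value is an arbitrary choice for illustration.
var latchValue = 3;

function latchKnob(value, input) {
  if (Math.abs(value - input) < latchValue) {
    return input; // knob is near the stored value: it takes control
  } else {
    return value; // knob is far away: keep the stored value
  }
}

// A page loads with a stored volume of 50 while the physical knob sits at 90:
console.log(latchKnob(50, 90)); // 50 - the knob hasn't latched yet
// The user sweeps the knob through the stored value:
console.log(latchKnob(50, 51)); // 51 - within latchValue, the knob takes over
```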


I am very happy with the physical peripheral. It’s housed in a sturdy plastic container, which as a prototype allowed me to flexibly configure and troubleshoot the electronics and the Feather. Its open base made accessing the electronics easy. It was fairly straightforward to make a symmetrical design with my eight faders. Instead of two banks of four, I wanted to separate the functions of the two dials while maintaining the symmetry of the device: the dial on the left would control the page and the dial on the right would control the root note.


Overall, I’m very happy with how this project evolved. I still feel it’s a work-in-progress and I’m excited to continue working on it. As a first version, I learned a lot about the system’s strengths and weaknesses and found direction for future versions. For the next version, I’m going to make a page that displays and controls all intervals for all oscillators in one space. I also want to explore the sequencer again, allowing you to play chords in a rhythm or possibly as an arpeggio.