Warm Thoughts

By Quinn Rockliff & Emma Brito

Project Description

Warm Thoughts allows friends to offer comfort to one another through the warmth of a remote hug. One friend pushes an inconspicuous button in their wallet, which then warms their friend’s matching wallet. It is a subtle way to offer support and warm wishes while letting the other person know that they aren’t alone. The idea is that this can be done from anywhere, so that the warmth can be a spontaneous surprise. This project facilitates communication without the need for words, even at a distance.

GitHub Code


Process Journal

Brainstorming and Beginnings – Initially we thought this project would be a good way to expand on the ideas Quinn was working with in experiment 3. We would be able to create a supportive interaction between two people. We liked the idea of a warm and friendly interaction that could offer support remotely. We quickly settled on using a heating pad because of the therapeutic functions heat can offer, both physically and mentally. We played with the idea of pressure points but ruled them out because of the problems they posed with regard to wear. We still wanted to maintain the calming quality of the project and decided to try using the heating pad to warm up essential oils. Once heated, these oils would create a relaxing environment with their aroma, something that would be instigated by one friend for another.

Initial Idea – The full idea that we decided to pursue is one where both people would be receiving and publishing signals through PubNub. The whole interaction would be initiated by the first person pushing a button because they are in need of comfort and support from their friend. This push of a button would light up an LED on the friend’s device so that they could in turn push a button to heat up the pad. This would then warm the first person’s essential oils, making their environment more comforting thanks to the thought and action of their friend. The idea was that the LED and button circuit would be portable, so that it could be used wherever they are, while the essential oils and heating pad would have to be stationary.

Wiring and Code – We decided to first get started with wiring each individual circuit and testing it with sample code before having the devices speak to each other. The buttons and LED we were able to get up and running without too much issue.


Button we used in the beginning

We tested the heating pad’s circuit using a fan wired with sample motor code. It required more voltage, so we had to add an additional power source. Once the fan was running we replaced it with the heating pad.


The two circuit boards

Once all of the pieces were working separately it was time to get them to speak to each other. We used PubNub examples as a starting point for our code. We had to alter them according to whether a light or a motor was being triggered, and different code was used on either side of the interaction. We got the button-to-LED code working as well as the button-to-motor code, but had not yet written them together.
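The publish/subscribe pattern we borrowed from the PubNub examples can be sketched with a minimal in-memory stand-in. (The channel class and message fields here are illustrative; they are not the actual PubNub API or our real channel names.)

```javascript
// Minimal in-memory stand-in for the publish/subscribe pattern:
// one side publishes a button state, the other reacts to it.
class Channel {
  constructor() { this.listeners = []; }
  subscribe(fn) { this.listeners.push(fn); }
  publish(message) { this.listeners.forEach((fn) => fn(message)); }
}

const channel = new Channel();
let ledOn = false;

// The "heating pad" side listens for the friend's button press.
channel.subscribe((msg) => {
  ledOn = msg.button === 1; // light the LED / run the motor on 1
});

// The "button" side publishes its current state.
channel.publish({ button: 1 });
console.log(ledOn); // true
```

With the real service, each Feather would both subscribe to its friend’s channel and publish to its own, which is the two-way setup described above.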


LED being lit through PubNub

Our next goal would be combining the two.

Button Changes – One issue that we encountered with the button is that while it triggered the reaction on the other end, it did not hold it. For instance, in order to properly heat the pad, the button would have to be held down for a long period of time. Needless to say, this isn’t very user friendly. We decided to look at switches and other options that would be able to hold an “on” value. We looked at traditional switches but decided on self-locking switches that stay down when pushed and are released when pushed again. This way the user does not have to physically hold the button down for the entire interaction. This meant that we had to rewire the button slightly, but we were able to use the same code. There are few references for this online, but eventually we found that there is a specific prong to draw the power from.


Our new self-locking switch

Issues and Challenges – The big issues began to arise when it came time to merge the code for the two PubNub interactions. We wrote the “readFromPubNub” and “publishtoPubNub” calls in the void loop() section of the code so that they are called continuously and there is a steady stream of information when reading. We included the value information and if-statements in this section as well. We found a number of syntax errors after this point and had to spend some time cleaning up the code.

At office hours it was suggested that we not constantly print a value to PubNub, because it can overwhelm the system; instead we should trigger a response in each other’s device. We liked this idea a lot, but it turned out to be easier said than done. We adjusted the code accordingly for each breadboard, but they were not responding to each other properly. The buttons would get stuck on a “1” or “0” value and not switch when pushed the way they had before. The LED and heating pad would also turn on occasionally, but not from the buttons; we could not figure out what was triggering them. We tried putting in the old code as well, with the printed values at a delay. The buttons became more reliable with this, but still were not triggering the motors or LEDs. We also re-wrote the if-statements slightly to include an “else,” but there were no changes.
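The change suggested at office hours, publishing only when the button value changes rather than printing it constantly, can be sketched in plain JavaScript. (The function name and message format are hypothetical stand-ins for the Arduino/PubNub code.)

```javascript
// Publish only on a change in the button value, instead of sending the
// current value on every pass through the loop. `publish` stands in for
// the PubNub call.
let lastButton = 0;

function onLoop(buttonValue, publish) {
  if (buttonValue !== lastButton) {
    publish(buttonValue);   // send 1 on press, 0 on release
    lastButton = buttonValue;
  }
}

const sent = [];
[0, 1, 1, 1, 0, 0, 1].forEach((v) => onLoop(v, (m) => sent.push(m)));
console.log(sent); // [1, 0, 1] — three messages instead of seven
```

Edge-triggering like this keeps the channel quiet between presses, which is what avoids overwhelming the system.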

Final Iteration & Prototyping – Given the interaction this project encourages, there were a couple of features we wanted to pursue in our final prototype. The first was that it could be easily carried around on a daily basis, so that it can be used anywhere. Ideally it is useful even when not triggered. For instance, we thought that the button would be great as part of a pencil case or wallet. This way it is handy and inconspicuous. For the heating pad, we played around with the idea of a scarf, a shirt, and even an eye mask. Ultimately we chose a matching little purse because it is one of a set and can be easily carried around while still offering warmth and support. We chose smaller breadboards to facilitate this as well.


One of the matching change purses


Breadboard for prototyping with trimmed wires

We thought that a rechargeable battery pack would be ideal, but they were sold out everywhere we checked. We decided that a AA battery pack would work as an alternative. We tested them during prototyping, but they started to spark and melt the wires. Because of this we decided to use our computer cables instead.


The “X” to mark the button on one. The warm one has a heart to distinguish it.

Sketches, Design Files, and Photographs


Brainstorming how to wear the items – t-shirt, eye mask, and necklace


Stationary items – bedside aids

Project Context

As mentioned above, our initial inspiration was a companion piece to Quinn’s experiment 3 project, in which she was able to push a button when she felt triggered by content seen online. Here we wanted to create an interaction that offers support to an individual who is going through a stressful period. It was significant that the heat is offered by another person, in order to emphasize that they are not a burden and that they are not alone in their experience. The heat is supposed to reference the warmth of a hug, even at a distance.

Fritzing Diagrams


Button Breadboard


Heating Pad Breadboard

Video of the Device Being Used


The two final prototypes on display.


This heated pillow helped us explore different forms the heating pad could take. The pillow could be used to comfort a kid, as well as keep them warm. https://www.smokonow.com/collections/pillows/products/foxyl



Comforting Items

PubNub Code: Creation and Computation GitHub Examples – Feather

Button Code: https://www.arduino.cc/en/Tutorial/Button


By: Emma Brito

Project Description

SleepyTime is a laptop peripheral that aims to assist in lulling the user to sleep. Every element of the project seeks to be a relaxing addition when falling asleep. To use SleepyTime, the web page hosting it must be open. One of three calming instrumental songs will then begin to play. The volume of the song is controlled by a light sensor: as the light gets dimmer, the music also gets quieter, with the goal of gradually lulling the participant to sleep. The light sensor is hidden within a teddy bear so that nothing seems out of place within a “sleeping” environment. Finally, if the participant chooses to leave the laptop screen illuminated, they will see an animation of twinkling stars in the night sky.

GitHub Code


Process journal

Brainstorming and Beginnings

Initially I was excited to do this idea with a pulse sensor as the input, and create a relaxing activity that involved a visualization of pulse rates on the laptop screen. I felt that it suited the prompt of “making something for the computer it doesn’t have” very well. After all, laptops and computers often are a source of stress because of their relationship to work. Since we bring them home with us it seems like we can’t get away from stress. This is where a project that focuses on relaxation would come into play. It emphasizes de-stressing.

To create this idea I went to Creatron and purchased a pulse sensor. I worked with it for a while, but found it difficult to use. I initially had a hard time getting access to the values. Once I did, the sensor itself was unreliable, and when I was able to get readings they were inconsistent. I tried a couple of different sketches I found on GitHub to see if they made any difference (there were no examples like we have for some of the other sensors, because pulse sensors were not included in our initial kits). After playing around with the sensor for a number of hours I decided it was time to modify the project slightly.

New Idea

I still wanted to pursue the idea of relaxation within the project when I had the idea of taking it to its fullest extent: sleep. Once I settled on this as the ultimate goal, I quickly decided on a light sensor as the input. Initially I was planning on having it serve more as a prompt for sleep, with the lullaby turning on once the room hits a certain level of dimness. I altered the intention slightly once I realized that light levels can change drastically from room to room, regardless of the time. Having the web page opened intentionally is the best way to ensure that it is most effective. This also meant that the project would become more of a sleep assistant.

First Issues

I was easily able to wire the light sensor into the breadboard, upload the code, and watch the values in the serial monitor. This was the first thing that I did. Unfortunately, getting started with P5 was not as smooth. Taking things one step at a time, I made sure my libraries were in order and tried loading an .mp3 into my sketch. I then put it all on my webspace. Unfortunately, this led to a number of different errors.


I couldn’t figure out what the issue was, especially since the code was lifted straight from the P5 examples. It was working for other people, but not for me. I deleted everything and re-downloaded the P5 libraries and sketch. It still didn’t work. After going through the index and code once more, the only thing I could think to do was download the libraries yet again and try the same code and file. Luckily, this time it worked! It loaded successfully on my webspace.

This meant that I could finally get to work connecting it to the Arduino. I added the port, console.logs, etc. Once they were both up and running I could begin making my project. I started with the basic sound code.


Adding Variations

I added in my “if” statements for the volume to adjust to the light value ranges. The music gets quieter as the light lowers. I also added in a few more song options and a “random” feature so that there was more variety in the music. This was done in an effort to keep the sound from becoming too repetitive. This appeared on the webspace.
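The volume logic can be sketched as a plain function. (The thresholds and volume levels here are illustrative, not the exact values from my sketch; in practice they had to be retuned as the room’s light levels changed.)

```javascript
// Map a light sensor reading to a playback volume with if-statement
// ranges: brighter room, louder music; dark room, silence.
function volumeForLight(light) {
  if (light > 200) return 1.0;  // bright room: full volume
  if (light > 120) return 0.6;
  if (light > 60)  return 0.3;
  return 0.0;                   // dark: fade to silence
}

console.log(volumeForLight(250)); // 1
console.log(volumeForLight(80));  // 0.3
```

In p5.js the returned value would be passed to something like the sound object’s setVolume() each frame as new sensor readings arrive.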


Issues with the Light Sensor and Sound

Once I tested the whole thing out, I found that the “random” song selection was the only aspect of the previous stage that worked without issue. The volume of the songs was not changing. I tried shining light on the sensor as well as covering it, but there was no difference. I went back to look at the values listed in the “if” statement, compared them to the sensor readings, and found that the light values within the room had changed. I adjusted the ranges but still couldn’t see a difference. Since everything else was working, I had a difficult time figuring out what the issue was.


Ultimately the code in Arduino had to be changed from Serial.print to Serial.write, which sends the reading as raw byte values rather than ASCII text. The volume would then change as less light reached the sensor, but would not increase again once the sensor was uncovered. This led to another readjustment of the if-statement ranges.



Once the sensor and sound system were running properly, I was able to focus on the visuals of the webpage. This had not been my priority, because visuals seemed less important in creating a “sleepy” environment (closed eyes, and all). Making the web page more appealing to look at was still ideal, though. Originally I was going to work with visuals of water, but ultimately settled on the night sky. While I wanted to avoid this because of my previous project dealing with the stars, I decided it was more relevant, and water would be better suited to different audio. Originally I uploaded an image onto the canvas, but it did not enhance the experience very much. I then decided on a GIF, because it could easily be looped and its repetition could be hypnotic.

This meant that I had to download the P5 play library and place it in my file. I separated a series of still images of twinkling stars from a video in Photoshop in order to create a loop of five frames for the GIF. I followed the sample code and was able to load them onto the webpage as an animation, although they appear as a GIF. The main issue I encountered with the animation was that it did not fill the entire screen. I found this distracting, so I re-exported the frames in a larger format, which created a more immersive experience.


Final Touch

Once everything else was completed, it became time to hide the breadboard. I didn’t want to just place it in a box, because this didn’t seem to suit the overall intent of the project. Something related to comfort seemed like the best idea, especially something that could serve a purpose of its own. I landed on the idea of a pillow or teddy bear that would have the light sensor exposed but could still be cuddled. This is exactly what I did, with the breadboard in the teddy bear. Moving forward I would like to make this wireless, with more variety in music and visuals.


Sketches, Design files, and Photographs


Above: Two early sketches of how it might be used and set up


The Fritzing diagram followed for the light sensor

Video of the device being used

The music changing sound early on

Project Context

Many of us joke about how our laptops keep us up late at night. They often serve as a tool for insomnia rather than sleep. Additionally, technology is often seen as stress-inducing rather than relaxing because of its association with work. All to say, I was interested in figuring out a way to create the opposite effect with this project. I wanted to create something that would assist in relaxation and have a calming effect on the user. I also didn’t want them to have to think about using the system, because that would potentially cause more stress as well.


Sleep Machine App: The connection between this app and my project is obvious. Both incorporate digital technology and the internet to create a relaxing sleep environment. I like how this one can be personalized for each user and offers a wide range of nature sounds. If I were to pursue other iterations of SleepyTime, I would like to incorporate more options for the user. http://www.sleepsoftllc.com/

Sleep Machines: These machines are updated versions of the traditional white noise machine. Rather than playing music, they offer a variety of other soothing sounds. Most involve natural phenomena, while settings include white noise and pink noise. They can now be controlled from devices. https://www.soundofsleep.com/soundsleep/

Sound and Sleep Studies: There have been a lot of studies about how sound can be disruptive to the sleep cycle and the negative health impacts this can have on people. It can be beneficial, however, depending on the kind of noise used. Soft, repetitive sounds can drown out other disruptive noises, like outside traffic or even a snoring partner. This regulates the sound environment within the space, which is crucial to getting a good night of sleep. http://www.huffingtonpost.ca/entry/white-noise-sleep-tips_us_5707e35ce4b063f854df7b5d

P5: I used p5js.org (http://p5js.org) and Make: Getting Started with p5.js for many portions of the code

Arduino Code: I used this code for the light sensor -https://www.arduino.cc/en/Tutorial/AnalogReadSerial

Songs used as lullaby

  • Andy McKee – Rylynn
  • Andy Williams (Acoustic) – Moon River
  • Iron & Wine (Instrumental) – Each Coming Night

Written in the Stars


By Kylie Caraway and Emma Brito

Written in the Stars operates like a digital puzzle that requires teamwork between 20 participants and their phone screens in order to view the entire night sky. It begins with a physical printed map of the sky showing the constellations’ names, but devoid of their images. To see a constellation, participants must go online on their phone, click on a link for a specific constellation, and then raise and tilt their phone slightly, as though they are viewing the sky through it. Once the phone is tilted to a specific degree, the image of the constellation appears. To see all 20 constellations at once, there must be 20 people participating to piece the map and its proper constellations together.

The fact that each screen only displays one constellation at a time is an important feature. Used alone, a screen offers only a small fragment of the night sky. This means that the screens, and the people holding them, rely on interaction with others in order to complete the puzzle and the entire image of the night sky.

Github Code https://github.com/kyliedcaraway/Written_in_the_Stars


  1. Andromeda
  2. Aquarius
  3. Aries
  4. Cancer 
  5. Capricorn 
  6. Cassiopeia 
  7. Centaurus 
  8. Draco 
  9. Gemini 
  10. Leo
  11. Libra
  12. Orion
  13. Pegasus 
  14. Pisces 
  15. Sagittarius
  16. Scorpio 
  17. Taurus
  18. Ursa Major
  19. Ursa Minor
  20. Virgo

Process Journal



When we first received the assignment, we quickly decided on using stars and constellations as the focus of the project. This backdrop could utilize simple shapes in complex ways, which we found to be both doable and effective in p5.js.

Initially we liked the idea of having all of the constellations in a single 3-dimensional space, so that as a device turned, the sky-scape would change as well. We liked the idea of people having their own experience and perspective within the same space. (We later realized that this would rule out interaction between participants, therefore eliminating the need for 20 phones in a particular space.)

Beginnings/Trial and Error:

We found a p5.js example called “orbit” that created a 3-dimensional space and would allow us to hang shapes within it. When used on a laptop, the canvas would orient to the mouse as it was dragged, yet it would snap back to the original view when the mouse button was released. This made creating a realistic night atmosphere a problem. We decided we would instead use phones and devices with an internal compass, so that the change in position was registered based on the rotation of the device. Laptops were ruled out as a result.

Unfortunately, we also quickly found it difficult to manipulate the orbit code. We couldn’t randomize the spheres within the code in order to mimic a starry sky, and it was difficult to pinpoint new shapes where we wanted them to go. 2D planes were also very difficult to place in the 3D view. The 3-dimensional space itself was limiting in size on our phones, which would make all of our 20 constellations impossible to include.


The New Plan:

We scrapped 3D orbit after we realized it wasn’t going to work well for us, and instead decided on a 2D iteration of the night sky, as Kate suggested. We decided to give each user one constellation, as a piece of the larger puzzle of the universe surrounding us. Using p5.js, we would create a 2D landscape, a constellation, and an interaction comprised of tilting the phone to create an interactive experience that relies on the participation and interaction of 20 users.

Atmosphere / Arrays:


At first, we searched for code or examples of astronomical atmospheres that created linkages between the stars as you clicked. (This idea can be visualized through particles.js.) Unfortunately, we could not get the particles.js library and code to work within our canvas. There were issues in the javascript console between pieces of code within the particles library, which were too daunting to problem-solve. Next, we looked at parallax effects using arrays. These seemed to work best on laptops, but would not translate well onto a phone without a mouse hover function. They also felt more appropriate for a video game (such as Asteroids) than for an observant experience. Finally, we found a star array code that did not rely on interaction or extra libraries. This began to serve as our basis, creating an atmospheric background to surround our constellations. We changed portions of the code, because the stars were too slow and not visible on our phones. We adjusted the frames per second, the ellipses’ colours and sizes, as well as the orbit’s location.
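The star array idea reduces to something like the sketch below. (The star count, canvas size, and radius range are illustrative; the original array code we adapted used its own values.)

```javascript
// Generate a fixed set of stars with random positions and small random
// sizes. In a p5.js draw() loop, each star would be rendered as an
// ellipse(star.x, star.y, star.r) over the dark background.
function makeStars(count, width, height) {
  const stars = [];
  for (let i = 0; i < count; i++) {
    stars.push({
      x: Math.random() * width,
      y: Math.random() * height,
      r: 1 + Math.random() * 3,  // ellipse radius, kept small
    });
  }
  return stars;
}

const stars = makeStars(200, 375, 667); // roughly a phone-sized canvas
console.log(stars.length); // 200
```

Tuning the frame rate and ellipse sizes, as described above, is then just a matter of changing these parameters and the draw loop’s frameRate().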


Kylie incorporated a gyroscope measurement code into our project so that the motion of tilting the phone would result in the appearance of a constellation, in order to mimic the act of looking up to stargaze. The video of the constellation would then play on loop until the phone was lowered and no longer tilted. We only focused on the variable “beta” in our phone, which measured how much the phone was tilted on the X-axis. At first, we told the program to only draw the constellation when it was greater than 120. While this angle is more aligned to how users actually look up into the sky, we realized this would create problems with our map on a flat wall. We changed the code to draw the constellation when it is greater than 80, so people could view the constellations as they hold their phones against the map on the wall.
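The gyroscope check, reduced to its core, looks something like this. The threshold change from 120 to 80 is the one described above; the function name is ours for illustration.

```javascript
// Draw the constellation only while the phone's beta value (front-to-
// back tilt reported by the device orientation event) passes the
// threshold. We lowered it from 120 to 80 so phones could be held flat
// against the wall map.
const TILT_THRESHOLD = 80;

function shouldDrawConstellation(beta) {
  return beta > TILT_THRESHOLD;
}

console.log(shouldDrawConstellation(120)); // true — phone raised
console.log(shouldDrawConstellation(10));  // false — phone lowered
```

In the p5.js sketch, this boolean gates whether the constellation image is drawn over the star array on each frame.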


Once we got a grasp on the new kind of code we would be pursuing, we started making the constellations themselves. We chose to include the 12 astrology signs, as well as 8 of the more well-known constellations. Astrology signs were important to us, because they are commonly a subject that prompts conversation and interaction between users. We decided to create the constellations as simple animations in After Effects, and then place the video to play over the array. Each constellation would follow the same basic format with a few variations (colour, effect, shapes) in order to keep them stylistically aligned without becoming repetitive. We also toyed with the idea of animating the myths behind the constellations, which would only be seen if users pressed the constellation, but we decided against this for three reasons: 1) users pressing an image that is tilted above their head would result in an uncomfortable, non-intuitive experience; 2) this feat of ultimately 40 animations was beyond our scope and could not be accomplished in our timeframe; 3) getting one video to work was proving impossible, and two videos that rely on multiple, sequential interactions was asking for disaster. Although we tried numerous ways to get the videos/GIFs to work, our code did not want to embed the frames within the canvas. In the end, we used 1920 x 1080 JPEG photos, so we could keep the quality of the design without large file sizes.

Challenges and Issues:

Our biggest challenges in this project revolved around code issues. We had slight issues at first with WEBGL and 3D scapes. Pieces were difficult to move around, the code was very sensitive to any changes, the orbit control did not move the way we visualized it would on a phone, and the 3D space it constructed felt too confined for our project. Through these issues, we opted for a 2D space instead.

We also had issues with star arrays and background visuals (as mentioned before). After trial and error, we researched different arrays that depicted constellations, and ultimately found one that was easy to understand, implement and edit, and fit nicely within our project.

Our largest obstacles were displaying video files and GIFs of our animated constellations. The first signs of trouble began when we rendered the constellations. We wanted to keep their alpha channel so users could view the stars behind them, but the video files were huge (200 MB or more), and both Atom and Sublime crashed when we used them as assets in the code. We then tried to take the video and create PNG sequences out of it. Atom and Sublime didn’t like this either, because our animations were anywhere from 5 to 10 seconds long (looking back on it now, I believe this could be why we were unable to play the videos or sequences). We downloaded the P5 Play library and attempted to run the PNG sequence, but the animation would never load. We finally decided that we had to scrap the alpha channel and plan for a background colour behind the constellations. This realization forced us to change the location of the code, so a black box would not appear on top of our array. Ultimately, we had to draw the star array after the constellation, so that the canvas background and constellation background would blend seamlessly.

We also tried GIFs, with no optimal results. The GIFs would not load as videos. They would either hold the first frame (creating a still image), draw on a weird spot on the screen that was not within the canvas whatsoever, or not draw at all (the most common scenario). We attempted to download a GIF p5.js library, but there were issues with the library, and the GIF would never play. We also attempted to use the P5 DOM library and elements code to run the GIFs. By using “create image” rather than “load image”, the GIFs would finally appear… except they wrote over the array, regardless of the location of the code, and their location would change based on the device. In the end, the GIFs never operated how we wanted them to.

Going back to video, we attempted to make smaller videos that would load more easily onto the phones. Unfortunately, the videos would either crash the site, never load, load only the first frame, or load outside of the canvas and ask you to press play, which would open a new tab with the video. After multiple attempts to implement the code, from the p5.js book and website to other tutorials online, we could never get the video to load in our code. After meeting with Kate and Nick on Wednesday, we attempted to take apart the pieces of the code. After separating the various portions, we could not get our videos to load within a canvas on either the iPhone or Android phones. In the end, we decided to use images rather than video. The images loaded quickly, were placed in the right location, and were reliable, working on both types of phones.


Code Issue Examples:

In this example, we attempted to load an MP4 file. While this code would respond to the gyroscope and place the video in the proper location, it would only show the first frame of the video.

In this example, we used the P5 GIF library in an attempt to load a GIF. The GIF would not display in the proper location, it would not play, and it would not respond to the gyroscope code. This attempt was the least successful, as nothing worked properly.

In this example, we used p5 Element code. This was our closest success story. The GIF responded to the gyroscope code, it played, and it was centered for iOS. Unfortunately, it also drew over our array, creating a black box around the GIF, even though the code was beneath the array. Additionally, when we attempted this on an Android phone, the GIF would not center, and there were issues with the canvas fitting the phone’s screen size.


Our attempts to deconstruct code. Removing all other code, we tried to load a MP4 video by itself. No success. I assume this is the result of our video file sizes or the video length.


In our first iteration, our map consisted of both the constellations’ names and diagrams. We were relying on the map to help users place their constellation in the larger image, but we realized that the devices would be redundant if the information was already on the map, so we removed the constellation graphics to create a game element. Removing the constellations leaves the user with an incomplete visual without the assistance of their devices.

Final Iteration:


Regardless of the iterations this project went through, we are very happy with the final incarnation of Written in the Stars. It differs from our original plan, since an image is presented rather than a video, but the other features are present. The gyroscope prompts the image to appear, while the array is a constant. We incorporated the physical map in order to encourage interaction between people and devices; after all, stargazing has long been a social activity, with stories and mythologies associated with each constellation. We also provided information on zodiac signs so participants unfamiliar with astrology could learn their sign, as well as horoscopes for a fun read and a conversation piece to connect with our interactive installation. As each participant has their own constellation, they can participate with others to create a full atmosphere of the night sky.

Vimeo link here : https://vimeo.com/240378307

Sketches, Designs, and Photographs


Sketch of our initial brainstorming ideas. While we scrapped the 3D atmosphere and 2D game, we implemented our original user experience, complete with animation, gyroscope, and our revised sky map.

We considered incorporating written information, such as a constellation’s history, science, or mythology, but we decided it clogged up the screen and detracted from the overall image of the night sky.

This depicts our colour palette, as well as the aesthetic style we implemented in our project. We strove for clean and simple lines, ellipses and stars, with limited colour options, in order to remain cohesive, yet have enough variety to be visually appealing.


This is a process image of the creation of the Cancer constellation in After Effects. This is one of the smaller of the 20 constellations included in Written in the Stars.


Gemini was our test constellation within the code. This is an image of the visual that appears after the gyroscope is activated. It was with this image that we first realized the video wasn’t playing and launched a series of trial and errors in order to attempt to make the animation play.

Graphic of all of our constellations


First design of our map. Kylie made the mistake of making it in Photoshop at 72 by 48 inches with a resolution of 300 pixels per inch. The file was huge, wouldn’t save, and kept crashing. She finally was able to save it as a PDF. It was 1.64 GB, and the print shop she sent it to would not accept a file that large. She then recreated it in Illustrator. While she couldn’t get the faint nebulae texture she had used in Photoshop, Illustrator was overall the preferable tool: the map could be resized to any desired amount, the print shop preferred Illustrator files, and the file size was less than 1 MB. Lesson learned: use Illustrator for large prints. Another lesson learned: don’t wait until 2 days before the project is due to get your map printed. Print shops love to charge you somewhere around a 500% markup for a rush order…
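A quick back-of-the-envelope calculation shows why the Photoshop file ballooned: 72 by 48 inches at 300 ppi is an enormous raster. (The 4 bytes per pixel figure assumes uncompressed RGBA; Photoshop’s actual on-disk size varies with layers and compression.)

```javascript
// Pixel dimensions and rough uncompressed size of the 72 x 48 inch,
// 300 ppi map.
const widthPx = 72 * 300;              // 21600
const heightPx = 48 * 300;             // 14400
const bytes = widthPx * heightPx * 4;  // 4 bytes per pixel (RGBA)
const gigabytes = bytes / (1024 ** 3);

console.log(widthPx, heightPx);    // 21600 14400
console.log(gigabytes.toFixed(2)); // 1.16 — over a gigabyte before compression
```

A vector Illustrator file sidesteps this entirely, which is why it came out under 1 MB.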


Our poster rolled out for the first time!


Final Presentation Day!

Written in the Stars in action:

The presentation was successful. The coding and images worked properly and we got the desired reaction from the class. We also turned out the lights for added effect.



Our project can be contextualized through a couple of different avenues. As touched on previously, it was important to us to include the astrological constellations because of the personal connection and sense of ownership people feel toward their sign. We need look no further than the fact that horoscopes are a staple of nearly every newspaper. This attachment even caused a stir in 2011, when astronomers said the moon’s gravitational pull on the Earth had changed the Earth’s axis, resulting in different astrological signs for each month. After the public commotion, NASA had to put out a statement reminding the community that astrology is, in fact, not science.

Stars, and the night sky in general, have been a popular subject throughout history in various forms of art, and later in media. From their initial use by the Babylonians as a storytelling technique, to their representation in artwork such as Salvador Dali’s illustrations of the signs and Vincent Van Gogh’s iconic “The Starry Night,” to astronomy’s current popularity as both a marketing and social tool, there is no question regarding the human affinity for the stars. Given the ubiquity of astrology and the love of stargazing, this project is relatable to a wide audience. We wanted to capitalize on the social aspect of this activity as well, and the 20-screen requirement allowed us to do this.

While Written in the Stars currently serves as an installation that encourages communication and interaction, it could be further developed as an educational tool to teach astronomy. The project could additionally be used for data visualization. Both NASA’s Kepler Space Telescope, which monitors astronomical phenomena, and the European Space Agency’s Gaia telescope, which has produced a revolutionary catalogue of the structure of stars in the Milky Way galaxy, serve as models for this project.

In the end, Written in the Stars has the potential to be used for discussions about physical sciences as well as social sciences, as a digital puzzle that can be used for entertainment, group participation, and to illustrate “the unique cognitive-emotional link that makes us the intelligent creatures we are” as we sort through pieces of “randomness” and “information” in order to create a full, comprehensive picture of our surroundings (Mutalik).

References and Influences

It’s impossible to talk about our Written in the Stars project without mentioning the Sky Map (https://play.google.com/store/apps/details?id=com.google.android.stardroid&hl=en) app. It serves as both inspiration and aspiration for this project. While our project differs from Sky Map in its focus on the interaction of people working together rather than an individual experience, Sky Map is thorough and places all the constellations within one space. We would like to move this project forward to include geolocation of the constellations, as Sky Map has effectively implemented throughout its app.

When we searched online for images of constellations and maps, we noticed wide variation in constellation forms, numbers of stars, and constellation locations. We decided to use a reputable source, National Geographic (https://www.nationalgeographic-maps.com/media/catalog/product/cache/7/image/8ecabcfb697832bc77ac7e2547ded39f/x/n/xng195712a_90.jpg), as our resource for constellation formations, locations, and our map iteration.

Kelsey Oseid’s book, What We See in the Stars: An Illustrated Tour of the Night Sky (https://www.penguinrandomhouse.com/books/553191/what-we-see-in-the-stars-by-kelsey-oseid/9780399579530/) provided inspiration for visual aesthetic, as well as information about constellations. Although we were unable to get animations to run in this prototype, Oseid’s book will continue to be a great reference through the development of this project into an interactive storytelling tool about constellations, astrology, and the science behind our universe.

Astrology.com (www.astrology.com) provided us with the horoscopes and dates for each of the zodiac signs. During the installation, we handed out slips of paper with the constellation, dates of the zodiac, our website link, and their horoscope. This provided an extra entertaining detail to get participants engaged with their constellations before the installation began.

This was the initial code for the array we used. We altered both the size of the ellipses and the colour of the background to better suit our phone screens. Other code was gradually simplified, altered, and extended, from changes in frame rate, to the position and flow of the orbit, to the number of stars.
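A minimal sketch of the kind of star array described above, in the p5.js style we worked in. The names (`makeStars`, `NUM_STARS`) and the specific values are illustrative, not the originals:

```javascript
// Hypothetical reconstruction of a p5.js-style star array.
// Star positions and sizes live in a plain array; draw() loops over it.
// The p5.js globals (createCanvas, background, ellipse, ...) are only
// used inside setup()/draw(), which the p5 runtime calls in the browser.

const NUM_STARS = 200; // tuned by hand for our phone screens

function makeStars(w, h) {
  const stars = [];
  for (let i = 0; i < NUM_STARS; i++) {
    stars.push({
      x: Math.random() * w,
      y: Math.random() * h,
      r: 1 + Math.random() * 3, // ellipse size, shrunk for small screens
    });
  }
  return stars;
}

let stars;

function setup() {
  createCanvas(windowWidth, windowHeight);
  frameRate(30);          // lowered from the default for a calmer drift
  stars = makeStars(width, height);
}

function draw() {
  background(10, 10, 40); // darkened to match the night-sky palette
  noStroke();
  fill(255);
  for (const s of stars) {
    s.x = (s.x + 0.3) % width; // slow horizontal drift, our "orbit"
    ellipse(s.x, s.y, s.r, s.r);
  }
}
```

Adjusting the constant, the drift increment, and the `background()` colour reproduces the kinds of changes described above.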

This is where we found the code for the gyroscope/accelerometer. We used this code to measure the phone’s position as it was moved and tilted. Since we would only be using the beta variable, we removed alpha and gamma, and then deleted the rectangle and the code that displayed the value of each axis. In the end, we used a simple “if” statement: when the beta value rose above 80, the code would draw the constellation.
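The tilt check boils down to a single threshold comparison. A hedged sketch of that logic, assuming p5.js’s `rotationX` global (which reports the device-orientation beta angle, i.e. front-to-back tilt, in degrees); the image variable name is a stand-in:

```javascript
// Beta is the front-to-back tilt of the phone in degrees.
// Above 80 the phone is held roughly upright, "pointed at the sky",
// so the constellation is shown.
function constellationVisible(beta) {
  return beta > 80;
}

function draw() {
  // rotationX and image() are supplied by p5.js at runtime;
  // constellationImg is a hypothetical preloaded image.
  background(10, 10, 40);
  if (constellationVisible(rotationX)) {
    image(constellationImg, 0, 0, width, height);
  }
}
```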

We also referenced p5js.org (http://p5js.org) and Make: Getting Started with p5.js for multiple portions of our code.


Alessio, Devin. “The Cocktail You Should Be Drinking Based on Your Zodiac Sign.” Elle Decor, 21 Dec. 2016, www.elledecor.com/life-culture/food-drink/g2889/the-cocktail-you-should-be-drinking-based-on-your-zodiac-sign/. Accessed 25 Oct. 2017.

“Daily Horoscopes.” Astrology.com, www.astrology.com. Accessed 26 Oct. 2017.

Darley, James. “A Map of the Heavens.” National Geographic, Dec. 1957, www.nationalgeographic-maps.com/media/catalog/product/cache/7/image/8ecabcfb697832bc77ac7e2547ded39f/x/n/xng195712a_90.jpg. Accessed 25 Oct. 2017. Map.

Garreau, Vincent. Particles.js. www.vincentgarreau.com/particles.js/. Accessed 20 Oct. 2017.

Guarino, Ben. “Chaos in the Zodiac: Some Virgos Are Leos Now (But NASA Couldn’t Care Less).” The Washington Post, 26 Sept. 2016, www.washingtonpost.com/news/morning-mix/wp/2016/09/26/. Accessed 25 Oct. 2017.

Johnson, Michele, editor. “What Does Kepler Have Its Eye On?” NASA, 31 Aug. 2017, www.nasa.gov/image-feature/what-does-kepler-have-its-eye-on. Accessed 25 Oct. 2017.

Kuiphoff, John. “Gyroscope with P5js.” Coursescript, edited by John Kuiphoff, 2017, www.coursescript.com/notes/interactivecomputing/mobile/gyroscope/. Accessed 25 Oct. 2017.

Max. “[p5.js] Starfield.” Codepen, 9 Oct. 2016, www.codepen.io/maxpowa/pen/VKXmrW. Accessed 25 Oct. 2017.

McCarthy, Lauren, editor. P5.js. p5js.org/. Accessed 25 Oct. 2017.

McCarthy, Lauren, et al. Make: Getting Started with P5.js. San Francisco, Maker Media, 2016.

Mutalik, Pradeep. “Can Information Rise from Randomness?” Quanta Magazine, 7 July 2015, www.quantamagazine.org/information-from-randomness-puzzle-20150707/. Accessed 25 Oct. 2017.

Oseid, Kelsey. What We See in the Stars: An Illustrated Tour of the Night Sky. Ten Speed Press, 2017.

Popova, Maria. “Salvador Dali Illustrates the Twelve Signs of the Zodiac.” Brain Pickings, 19 Aug. 2013, www.brainpickings.org/2013/08/19/salvador-dali-signs-of-the-zodiac-1967/. Accessed 25 Oct. 2017.

Sky Map. Android and iPhone app, Mobius Entertainment, 2016.

Wolchover, Natalie. “From Gaia, a Twinkling Treasure Trove.” Quanta Magazine, 14 Sept. 2016, www.quantamagazine.org/gaia-telescopes-first-data-set-released-20160914/. Accessed 25 Oct. 2017.


Adult Sky Dance Off

Ramona Caprariu

Emma Brito

Finlay Braithwaite



The Adult Sky Dance Off is a set of “sky dancers” that operate at two speeds once the power is turned on by the button sensor. It features figures made from rubber condoms, chosen to comply with the assignment requirements, which are inflated by fan motors driven by an Arduino microcontroller. The assignment called for four main components: a button as a sensor, a fan motor, rubber as a main material, and that the result be “funny.” The participant stands behind the set of sky dancers and starts the program; by pushing the button, they create a dance that others can watch for entertainment.


When figuring out how to create our sky dancers, we watched videos to find the best way to mimic the movements.







Process Journal

We began this project with the idea of creating a whoopee cushion, and then moved to the idea of a phone accessory producing a wind-blowing-selfie effect. Both ideas were discarded for the same reason: the strength and speed of the fan would not be able to power them. We readjusted to our limitations and developed the sky dancer idea.

We began by writing the code to power the 5V fan, and quickly found that its speed would still not be sufficient for our idea. So we bought three 12V fans to replace the ones already in our kits, along with a 12V power supply. Our prototype ran better with the new equipment, but there was still some difficulty: the condoms weren’t ‘dancing’. To account for the slow fill-up time, we decided to use the button not just as a power switch, but as a way to cycle the fans between two speeds, simulating a kind of ‘dancing’.
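The two-speed button behaviour can be modelled as a small state machine: each press advances the fans from off to slow to fast and back. A minimal sketch of that logic in plain C++; the `FanSpeed` enum, `nextSpeed` name, and PWM duty values are illustrative, not our actual code, and on the Arduino the chosen value would be sent to the fan driver with `analogWrite()`:

```cpp
// Models only the button-press logic described above.
// The duty-cycle values (0, 128, 255) are illustrative stand-ins
// for whatever PWM levels the 12V fans actually need.
enum FanSpeed { FAN_OFF = 0, FAN_SLOW = 128, FAN_FAST = 255 };

// Each button press advances the fans: off -> slow -> fast -> off.
FanSpeed nextSpeed(FanSpeed current) {
  switch (current) {
    case FAN_OFF:  return FAN_SLOW;  // slow fill-up of the dancers
    case FAN_SLOW: return FAN_FAST;  // full "dance" speed
    default:       return FAN_OFF;   // third press switches back off
  }
}
```

On the board, `loop()` would debounce the button, call a function like `nextSpeed()` on each press, and write the resulting value out to the fan pin.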


We then had to find the condoms which would best fit the nature of our project. We sought out both lightweight and colourful options.



When it came time to construct the project, it became clear that the decision to use higher-voltage fans was the right call. Even these fans were not powerful enough at their highest setting to fully inflate the condoms, and this became the largest challenge we faced. We experimented with cutting off the tops and poking holes for air to escape, to give the dancers more dramatic movement, but these ideas proved fruitless. Ultimately we decided that cutting them shorter was necessary.

To further aid inflation, we discovered that the fans needed access to more air beneath their bases. Using screws as legs was effective in this regard and led to better results.


Video and Final Prototype

Design Context

We had to successfully embody our four variables, and initially the most difficult one to include was “funny.” By the end, it was this characteristic that the project was built on. We strove to attain it through the materials we chose and the element of interactivity. The piece at once references the innocent entertainment of a puppet show while also lending a level of cheekiness and humour to the final sky dance.




DojoDave. “DigitalRead Serial.” Arduino example code, 2005. Modified 30 Aug. 2011 by Tom Igoe.

Dancing Inflatables. “Smallest Skydancers in the World Dancing Inflatables.” YouTube, YouTube, 4 Nov. 2008, www.youtube.com/watch?v=Fsq9QX6TsuM.

Greenspan, S. (2014). Inflatable Men. [podcast] 99% Invisible. Available at: https://99percentinvisible.org/episode/inflatable-men/.

Hartman, Kate. “Analog Input Circuit + Analog Output Circuit.” Code example, 25 Sept. 2017. Arduino.

skydancereurope. “SkyDancer AirPuppets Dancing for You.” YouTube, YouTube, 11 July 2008, www.youtube.com/watch?v=LOAfGKcLJUY.