
Experiment 4: The Stress Bull


By Ania Medrek

Link to code on GitHub:
Link to data spreadsheet:
Link to Video:

The Stress Bull is a toy that is squeezed in the hand and massaged by the fingers. It is intended to help relieve stress and muscle tension: squeezing a ball (or bull) lifts tension from the muscles and distracts the mind from anxieties and concerns.

The bull was stationed on the 6th floor of the Graduate Studies building at OCAD University for 8 hours straight on November 22, 2016. Anyone who stepped off the elevator or came out of the stairwell couldn’t miss the Stress Bull booth, and more than 50 people stopped and gave it a squeeze. Students and faculty from the IAMD and Digital Futures programs reported that it was a fun and welcome distraction from all the school work piling up as the end of the term nears.

A pressure sensor embedded in the handmade stress toy collected data throughout the 8-hour period. Every second, if a new value was registered, it was sent to Adafruit IO. Using IFTTT, the readings traveled from the Adafruit feed to a Google Drive spreadsheet. In total, almost 1000 values were registered. The data was measured on a scale from 0–800 but rarely went below 50 because the sensor is highly sensitive and even the slightest touch triggers a reading.
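The filtering described above can be sketched as a small function. This is a model of the behaviour, not the actual Arduino sketch (which is linked at the top of the post): send a reading only when the value has changed, and at most once per second.

```javascript
// Model of the "send only new values, at most once per second" rule.
// Returns a function that decides whether a reading would be published
// to the Adafruit IO feed.
function makeReadingFilter() {
  let lastValue = null;
  let lastSentAt = -Infinity;
  return function shouldSend(value, timestampMs) {
    const changed = value !== lastValue;
    const dueForSend = timestampMs - lastSentAt >= 1000;
    if (changed && dueForSend) {
      lastValue = value;
      lastSentAt = timestampMs;
      return true; // this reading would be sent
    }
    return false; // duplicate or too soon — skipped
  };
}
```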

Only two participants squeezed hard enough to trigger a reading above 750. The majority of squeezes were of light-to-medium strength. Below is a chart with estimated percentages of how hard participants squeezed altogether. ‘Dubious data’ acknowledges the readings that are impossible to categorize: low values that could have been triggered by a variety of things, such as the table the Stress Bull sat on, the fabric enveloping the sensor, or someone simply holding the bull in their hands without squeezing. Because of these possible factors, I considered any readings under 100 to be ‘dubious’.
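Given those cut-offs, the bucketing behind the chart could be re-created along these lines. Only the under-100 and over-750 thresholds come from the data; the middle label is just my wording for everything in between.

```javascript
// Bucket a raw sensor value (0–800 scale) into the chart categories.
// Thresholds: <100 dubious, >750 hard, everything else light-to-medium.
function classifySqueeze(value) {
  if (value < 100) return 'dubious';
  if (value > 750) return 'hard';
  return 'light-to-medium';
}
```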




Check out the video (at top of post) for more on process and results.



How does the device blend in or stand out in the environment?

I made every effort to make the device stand out. I wanted to attract as many participants as possible with funny signs (inspired by Honest Ed’s) and a colourful booth table. I made the stress toy a cute and cuddly bull to make it more inviting than a plain, old stress ball.

How does this device respond to the data it is collecting? How does this behaviour evolve as it collects more data about or for this person/place?

This device responded exactly as intended. It sent all the values as they came in and survived the hardest of squeezes. I am particularly happy that the Stress Bull stayed in one piece because I sewed it all together myself, mostly out of a reindeer-shaped hat from the dollar store. The behaviour of the Stress Bull did not change as it collected more data.

What is the overall size and form of the object?

The object is a hand-crafted sphere with horns. It fits in the palm of the average hand.

Does the object encourage interaction from others?

Encouraging interaction from others was the main goal of the Stress Bull’s size, form and signage. From design to execution, the silly puns and bull shape were intended to be engaging and draw in participants, stressed or not. I may have taken the pun too far, but I decided to go full force with it because I believe it encourages interaction and at the very least — a laugh or two.


As many people stop and reflect on our fast-paced, high-stress society, they are reaching for their smartphones for apps that can help alleviate the negative health consequences of anxiety. According to Statistics Canada, daily stress rates are highest in the core working ages (35 to 54), peaking at about 30% in the 35 to 44 and 45 to 54 age groups. In 2014, 23.0% of Canadians aged 15 and older (6.7 million people) reported that most days were ‘quite a bit’ or ‘extremely stressful’.

Stress balls are available everywhere, but not many are rigged with sensors that collect data.

Stress Bull fits into the larger phenomenon of products and apps that track personal and public health data and make it readily available for analysis. Anxiety and stress are addressed by hundreds of apps teaching methods like acupressure, meditation, and hypnosis — there’s even an app called Inner Balance that hooks up to your earlobe and monitors heart rhythm.

Devices such as Fitbit and Apple Watch that measure heartbeat and exercise data are very popular. A future iteration of Stress Bull could be a more polished and accurate product that uses sound and lights to tell users how hard they are squeezing — and encourage even harder squeezing. It has the potential to be incorporated into a phone app. Squeezing the Stress Bull could be a fun break-time activity that charts squeezes throughout the workweek.






Nick’s Fritzing example:


Experiment 3: Lost in Space



by Ania Medrek and Orlando Bascunan

Link to code:

Link to process video:

Link to circuit diagram:

Lost in Space is an interactive shape and colour recognition game. The game has physical and digital components: a game box that is placed in front of players on a table and a web interface players use to input responses. Up to 10 players can join.





Designate who will be the Game Master. The Game Master controls the server, which allows them to trigger the game to start. The game box is placed in front of the players (everyone must have a clear view of it). Each player opens the game interface in a browser on their laptop.

When all players have signed up to play by inputting their names, the Game Master will trigger the game to start.


Inside the game box, there are five cards that are programmed to pop up one at a time. Each card shows one colour and one object (ex. green alien). When the first card is triggered, players have five seconds to determine its colour and type and compare it with the four cards that have appeared on their screens. The first player to click on the card that DOES NOT have the colour or object on it gains the most points. The players have five seconds to click a card. After ten rounds, the player with the most points wins.
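As a sketch of that rule (the names here are illustrative, not taken from our actual code, which is linked above): the winning card is the one that shares neither the colour nor the object of the card that popped up in the box.

```javascript
// Find the winning card for a round: the one option that has neither
// the colour nor the object of the card shown by the game box.
function findWinningCard(boxCard, options) {
  return options.find(
    (card) => card.colour !== boxCard.colour && card.object !== boxCard.object
  );
}
```

For example, if the box shows a green alien, a blue planet wins, while a green robot (shared colour) or a purple alien (shared object) does not.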






As a group, we agreed early on that a game would be a fun way to incorporate networking, messaging and notifications. We were inspired by games like Ghost Blitz and Anomia which are simple and easy to learn. Our first challenge was to pick a theme for the game and work out the game structure. The space idea had a wide potential for objects and colour, so we decided to use a green alien, a purple robot, a blue planet, an orange spaceship and a red UFO as the basis of the game. We drew from other card-matching games to come up with the idea of mixing objects and colours that the player will have to sort out.

There are only eight different cards that show up on a player’s screen throughout the whole game, but only four appear per round. We worked out that eight options (Cards A–H) are enough, using the chart below.



There is nothing random about the game. There is only one winning card in each of the 10 rounds. There are five pre-programmed rounds that are played twice. We designed the game to be as straightforward to code and design as possible. The trick of the game is that it is so fast-paced that it feels like there are dozens of randomised cards and options.

The programming part of this project consists of three different programs: the client, the server and the Arduino sketch. The server controls the game flow, registers the clients and calculates the score. It also controls the Arduino by sending it messages to trigger whichever flag is needed for each game round.

The client part is what each individual player sees on their screen. It displays the instructions and gameplay details, including the cards. It registers the users’ card choices.

The server and clients communicate using PubNub. We created different ‘types’ of messages, such as ‘join’, for when a user joins the game and ‘pick’ for when they choose a card. Other messages like ‘start’, ‘nextRound’ and ‘gameover’ were also used to control the game flow.
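The real server and client code are linked at the top of the post; as a rough illustration of how those message types drive the game state (the function and field names here are ours for the example, not from the actual code), the server-side handling might look like:

```javascript
// Illustrative server-side handler for the PubNub message types named
// above. Mutates and returns a simple game-state object.
function handleMessage(state, msg) {
  switch (msg.type) {
    case 'join':
      state.players[msg.name] = 0; // register player with zero points
      break;
    case 'pick':
      if (msg.correct) state.players[msg.name] += msg.points;
      break;
    case 'gameover':
      state.finished = true;
      break;
  }
  return state;
}
```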

We also needed a bit of HTML to allow players to input their names.

Once the code structure was figured out and the graphics were complete, we moved on to creating the physical game box. We followed a simple circuit diagram to learn what parts we needed to regulate the power and not fry all our servos. We used a transistor and two electrolytic capacitors, and connected everything to ground to make sure this would not happen.

We found the servos very temperamental. They would reset and start twitching around whenever a new Arduino code was uploaded, or just seemingly for no reason at all. Every time this happened, we had to unplug them from the power source and re-calibrate them so that the cards were at exactly a 90-degree angle.

We had to build a few different prototypes of the game box. The first box was too small, and tape was not enough to keep the servos steady. We ended up hot-glue-gunning the servos to a piece of foam. We tried a version made with cardboard, but foam proved to be sturdier and looked more polished.


Project Context

Our game is part of the ongoing trend to update the humble card game. Some of the most popular examples of this trend are Pokémon’s recent AR creation and Dungeons and Dragons’ multi-platform empire. It’s crazy to think Pokémon first became famous as a card-collecting activity. Even classics like Monopoly and Solitaire now have high-tech reincarnations. Board games have included electronic physical ‘game boxes’ for decades now (for example, Monopoly’s battery-operated cash counter). Lost in Space was inspired by these games, and took them one step further by creating a web interface that works in sync with the physical game box.

CASE STUDY – BreakfastNY Thread Screen



Presented by Ania Medrek and Mahsa Karimi

Link to class presentation:

Link to BreakfastNY’s promotional video of the project:


The Forever 21 Thread Screen is an 11-foot-tall, 2,000 lbs. custom machine that takes Instagram photos and recreates them out of spools of thread. While the machine was up and running in a New York City Forever 21 store, shoppers could tag Instagram photos with the hashtag #F21ThreadScreen and prompt the spools to start spinning. It was available to the public for six days in July 2015, then moved to Breakfast New York headquarters for the duration of the F21 marketing campaign. While it was in the store, the screen was broadcast on a dedicated YouTube page so that people could participate from home.

The F21 Thread Screen took a small team of Breakfast New York employees a year and a half to make. It could only live in the store for six days because of how many people it took to run. A webcam live-streamed the installation on YouTube to help programmers and technicians monitor the spools. When a fabric pixel went out of alignment, technicians needed to be present to figure out which spool was out of sync and reset it. Another reason the exhibit was temporary was that it gave off a nearly hazardous amount of heat. Half a dozen fans kept the back of the machine cool, and reports say the spinning threads also created an “impressive amount of static electricity” that, without grounding, could have caused the whole thing to go up in flames.



The Thread Screen was built out of more than 200,000 custom parts. Breakfast New York co-founder Andrew Zolty told media that it would have been “easier to build a car”.

When the screen is up and running, there are 80×80 ‘pixels’ of threaded fabric, with 36 different colours on each strip, which rotates similarly to a conveyor belt. Each spool has a motor that drives the fabric to its appropriate colour. Each strip also has a reflective marker which is scanned by an infrared sensor, telling the machine which colour each spool is currently showing and allowing for corrections.
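As a rough model of the spool movement (the names here are illustrative; only the 36-colour loop comes from the description above), each motor would rotate its strip the shortest way around to the target colour:

```javascript
// Hypothetical helper: shortest rotation, in colour steps, from a
// spool's current colour index to a target index on a 36-colour loop.
// Positive means forward, negative means backward.
function shortestRotation(current, target, colours = 36) {
  let delta = (target - current) % colours;
  if (delta > colours / 2) delta -= colours;  // shorter to go backward
  if (delta < -colours / 2) delta += colours; // shorter to go forward
  return delta;
}
```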

The render server software that BreakfastNY created for this project takes Instagram photos tagged with certain hashtags and automatically optimizes them for the Thread Screen’s 80×80 resolution. Next, it calculates what the best visual representation will be based on the available colours on the fabric strips.
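The colour-matching step can be sketched as a nearest-colour search. This is a simplification; BreakfastNY’s actual renderer is surely more sophisticated, and the palette format here is our assumption.

```javascript
// For one downscaled pixel, pick the closest colour available on a
// fabric strip, using squared distance in RGB space.
function nearestColour(pixel, palette) {
  let best = palette[0];
  let bestDist = Infinity;
  for (const c of palette) {
    const d =
      (pixel.r - c.r) ** 2 + (pixel.g - c.g) ** 2 + (pixel.b - c.b) ** 2;
    if (d < bestDist) {
      bestDist = d;
      best = c;
    }
  }
  return best;
}
```

Running this for each of the 80×80 pixels would give every spool a target colour.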

The commands are sent to the motor controllers over 10 RS485 buses (interconnecting cables). The software used includes Ruby, Node.js, firmware written in C and C++, Bash, Linux, WebGL, FFmpeg, and SQL.

Technical information courtesy and

Images of the project from every angle available here: and here:


Clearly, the full force of Forever 21’s public relations department was in effect to get the word out about this project. Dozens of news outlets were invited to come and try it out first-hand, and BreakfastNY gave a lot of interviews in which they revealed a lot about the Thread Screen’s technology and process. Media fawned over the project, commenting on its originality, complexity and beauty. Harder to find is the public’s reaction to the in-store experience. Questions we have for the makers would be: Did you do usability testing? Were there long line-ups? How many people participated?

In June, the Thread Screen won a silver medal prize for Innovation from the global Facebook Awards, and another silver medal in the Tangible tech category from the Cannes Cyber Lions.




Breakfast is an American rapid-prototyping and product company that designs intellectual and technological properties. Focusing on engineering, design and innovation, they call themselves “the modern-day inventors.” BreakfastNY has done mass-produced products along with one-of-a-kind custom designs.

They develop and design all their products and the technology behind them from scratch. They believe that “technology doesn’t need to stand out and look like technology; It can blend in and hide the complexity behind great design.”

Some of the projects that they have worked on are:

  1. Thread Screen – a custom-designed piece for Forever 21 in NY that portrays photos tagged #F21ThreadScreen in pixel-art format, using 6,400 threads.
  2. Points – the most advanced robotic sign on earth.
  3. Open Band – an open-source watch band for Apple Watch.
  4. Lester Hashtag Printer – the world’s most popular social-driven product for events.
  5. MLB Mission Control – a NASA-inspired mission control for Major League Baseball.
  6. Google Verbalizer – an open-source dev board for Google voice search.
  7. B.LINE – a Cold War-inspired direct line to BREAKFAST for their customers.
  8. CONAN BLIMP – the world’s first blimp with artificial intelligence.
  9. Electromagnetic Dot Screen – the first super-speed, real-time flip-dot display.

The Electromagnetic Dot Screen is similar to the F21 Thread Screen. The dot screen is an interactive design by BreakfastNY that can display both static and moving images. The screen is connected to an infrared projector and a camera that can find objects and people in front of it and depict the silhouette of the moving object up to 15 times faster than similar products.


Daan Roosegaarde:



Cuppetelli and Mendoza



HYBE – Hive for hybrid environment:




Cannes Cyber Lions award winners:

Global Facebook Awards winners:

Experiment 2: 100 Futures Lane

by Ania Medrek and Bijun Chen

Project Description









100 Futures Lane is an interactive model of an apartment building that allows the user to play the part of the nosey neighbour. There are 20 different window narratives users can tap and play with, from a couple fighting to a unicorn vomiting rainbows. The project uses audio, animation, and P5.js to bring simple digital illustrations to life.

The apartment building structure is a laser-cut piece of masonite, attached by magnet to a 5-foot plank of fibreboard. The plank allows 100 Futures Lane to be portable and self-supporting. When propped up against a wall, the project sits at a slight angle, making it easier to view and interact with.


Links to Videos

Process Video:

Presentation Day Video:



GitHub link:

Our code was written with phone screens in mind. On a computer screen, some images will appear bigger than others because we experimented with sizing. The larger files work best on 6-inch smartphones, while the smaller ones are better for 4-inch iPhones. In the end, pinching the screen to zoom in and out was the simplest solution.

Link to Interactive Windows

100 Futures Lane


Process Journal

We were inspired to create an interactive apartment building by looking across the street and imagining how fun it would be to see different scenarios in each window. We knew it would be too simple to apply one code to 20 phones, so we set out to create 20 different ‘window’ narratives. During the first week, we came up with the initial 10 illustrations and used those to test different code examples from the P5.js reference guide. Originally, we wanted to use shake, flip, tap and swipe so that there would be a large variety of interactions, and the apartment building would be like a digital ‘dollhouse’.

By trial and error, we learned that shake, flip and swipe worked — but not very well. Flip was particularly glitchy, and it was hard to figure out what angle the phone needed to be at to trigger an action. We decided it would be too confusing for the user to need to figure out what to shake and what to tap, so we stuck with the most intuitive input and made all interactions tap-based. To simulate movement, most of the window scenarios are slightly changed PNG files appearing in a sequence, for example the Cat Sequence (which also meows):

We were still hoping to incorporate swipe into some windows, but when we tried it on the phones, it jerked the whole browser around and was overall less effective than simple taps. The Plants and Crime windows would have been nice as ‘swipe’ in particular; maybe a later iteration could use bigger screens that would allow a smoother interaction.
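The tap-driven PNG sequences described above can be modelled as a simple wrap-around frame counter. The real sketches run in p5.js in the browser; this is just the underlying logic, with illustrative names.

```javascript
// A tap-driven frame counter for a PNG sequence: each tap advances to
// the next frame, wrapping back to the first at the end.
function makeSequence(totalFrames) {
  let frame = 0;
  return {
    current: () => frame,
    tap: () => {
      // would be called from p5.js mousePressed() / touchStarted()
      frame = (frame + 1) % totalFrames;
      return frame;
    },
  };
}
```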

After we had most of our drawings and coding finished, building the structure was the next step. We decided the common denominator of screen size would be 3×2 inches. We mapped out the apartment’s face in Illustrator, then again in a separate file the laser cutter could register. We planned to build an actual 3D model of a building, with walls, a roof and all. But when we spoke to Reza in the maker lab, he pointed out that we had forgotten to leave enough space for the little shelves the phones would sit on. We adjusted our blueprint and had a second apartment face laser-cut. We decided to ditch the 3D model and create a self-supporting wall. This solved the problem of keeping the phones in place and made the project more user-friendly.

On critique day, we presented first hoping to get as much set up as possible done before class started. We asked those with iPhones to line up and scan QR codes we prepared to match up with specific window spots. This process took longer than we thought, and in a future iteration, we would try to make the ‘loading’ time shorter. The main challenge was accommodating all the different shapes and sizes of phones. We created little blocks to prop up smaller phones, but putting it all together in class was a longer process than we would have liked.

To make lining up a little more fun, we had binoculars for participants to use, so that they could get excited to play the part of the ‘nosey neighbour’. In future iterations of the project, we would explore using iPads and iPad minis to make the windows larger and test out longer and more complex interactions using P5.js.

References and Resources

P5.js Reference page:

Ted Talk by Aparna Rao: Art that craves your attention:

We were inspired by Aparna Rao’s Ted Talk about enticing art, in particular, her ‘Framework’ example. In ‘Framework’ little care-free characters run around the frame of a window. This is a light-hearted piece of work that has a strong impact, similar to what we were going for in 100 Futures Lane.

Material Madlibs – SONIC FISHBOWL


Group members: Ginger Guo, Katie Micak and Ania Medrek

VIDEO: Sonic Fishbowl on Vimeo.


Project Description

Don’t want to bother taking care of a real fish? There’s no longer any need to tap on a bowl to try to make your goldfish move — the Sonic Fishbowl provides an energetic pet, no food flakes required!

The Sonic Fishbowl is a decoration for your home or office that turns on when triggered by an ultrasonic sensor. When a person is within 30cm of the fishbowl, the ultrasonic sensor tells the Arduino to turn on the small fan inside the base of the bowl. The fan causes the felt fish to spin around in a circle, creating an interactive experience. The Arduino hidden in the base is programmed with a simple sketch that tells the bowl to turn off when a person moves out of range.
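The decision logic can be modelled like this. The real code runs on the Arduino; only the 30cm threshold comes from our project, while the 0.034 cm/µs speed-of-sound conversion is the standard one for an HC-SR04.

```javascript
// HC-SR04 echo time (microseconds) to distance (cm): sound travels
// ~0.034 cm per microsecond, and the echo covers the round trip, so
// the distance is halved.
function echoToCm(durationUs) {
  return (durationUs * 0.034) / 2;
}

// The fan runs while someone is within 30 cm of the bowl.
function fanShouldRun(durationUs) {
  return echoToCm(durationUs) <= 30;
}
```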

Process Journal

Our early ideas for the project included a scent diffuser, a snowglobe, a spinning wheel, and even a farting dog. We researched different kinds of felt and learned that felt comes in a very light, fluffy form, perfect for a low-voltage fan. We settled on a snowglobe at first, and created a wintery scene out of felt to place at the base of the globe. We planned that air from the fan would move little pieces of felt around the globe — what could go wrong? (A lot, it turns out.)

We began assembling the snowglobe and looking online for example code using an ultrasonic sensor. We found a digitalWrite example and plugged the fan into power and ground on the breadboard. This first try worked, but we found that 5V was nowhere near the amount of power we needed to make something move.

Next, we tried to harness 9V using a transistor and an analog code example — still, this barely moved the snow. We finally recruited a second breadboard and tried the fan at its full 12V capacity. This made the snow move in a circular ‘tornado’ motion.

The snow was also getting stuck to the mesh sheet that we used to keep the snow from falling into the fan. This was our first indication that we might need to change the snowglobe concept. A challenge we faced on the aesthetic side was that felt is not strong enough to hold up a glass globe, so we needed to find some sort of box.

A lot of trial and error was involved in the building process. The minute we got the ultrasonic sensor working, the power source stopped. Over the course of two weeks, we needed to continually learn how to ‘upgrade’ our voltage and code. With the help of classmates and the internet, we finally settled on The Sonic Fishbowl.

Parts List

Arduino MICRO

One HC-SR04 Ultrasonic Sensor

One Cooling Fan

One Plug

One 12V Power Adapter

One 560 ohm (Green, Blue, Brown, Gold) Resistor

One Transistor

Two Full Breadboards

Eleven Male/Male hookup wires


(High-res of circuit in GitHub link)

Project Context

Everyone in our group is new to Arduino coding. We searched for ‘ultrasonic sensor’ and found some code, which turned out to be out of date. With Nick’s help, we figured out that we needed to download a new library to use the ultrasonic sensor. In class, we found out the analog code would help us with the speed of the fan, and adjusted the code accordingly. We needed to Google each component to find out what it is and what it does. It was a huge learning curve for all of us, but a rewarding experience.

References and Resources


