by Yuxi, Hudson, and Maz.
The goal of this project is to create a connection between individuals revolving around sharing past experiences. The system uses a hood that records the experiences of the wearer (light, sound, and steps) at a random time while the hood is worn down. This recording is transmitted to a base station whenever the hood comes into proximity with it, triggering a data visualization of that recording. A previous recording, from another hood wearer, is then transferred back down to the hood, creating a network effect of shared experiences. The downloaded experience is played back by the act of wearing the hood up, connecting two individuals across time, location, and experience.
Experience recording (Data 1)
Single user experience with data transmission, sending (data 1) and receiving (data 2)
Following the path of a single recorded experience (data 1) over time between two hoods
Playback of sent experience (data 1)
The Object: Hood
Wearing a hood can serve many purposes, weather protection foremost among them. It also serves social purposes, such as providing social cues. A hood can communicate that a person does not want to connect with other people: it creates a personal space that only the hood wearer occupies.
We've flipped these perceptions and experiences around. Instead of a withdrawal, the hood becomes a social experience to be tapped into: two people occupy the same hood space. Pulling up the hood removes wearers from the situation they're currently in and projects them into a new space created by the sense experience of another hood wearer.
A tweet, a post, or a profile picture allows an individual to project a self-image that they choose for their social network to interact with and experience. We want to challenge this: what would the interaction with and experience of another person be without knowing their name, demographic, or likeness, all the things we're used to having to identify that person? How would this experience be interpreted by each user?
As this is an interpretive experience, we don't believe it to be a more "real" or "authentic" means of conveying a person or experience. What we are providing is a means of experiencing another person in a unique, unfamiliar way. New experiences have the power to open our eyes to other possibilities; perhaps onemile can provide such opportunities.
When worn down, our Arduino-equipped hood randomly initiates a recording of the hood wearer's experience. This recording consists of light values, via a photo sensor; steps taken, via an accelerometer; and sound values, via a microphone. The hood wearer is alerted that recording is taking place by the rhythmic toggling of a fan near the hood wearer's neck, simulating breathing. This creates the illusion of another person closely following the hood wearer, which in turn acts as a reminder that their current experience will eventually be occupied by another person.
Our hood is equipped with an Xbee for wireless communication of stored data. When a hood wearer approaches our data visualization base station, the hood's Xbee transmits its stored data to the base station's Xbee. At this point, the Processing sketch running on our base station triggers a data visualization of that recording. In addition to viewing their own data visualizations, users are presented with averages of past hood wearers' data. This allows users to connect on a deeper level with their own experience by quantifying it alongside other users' experiences.
After a hood’s recorded data is transmitted to a base station a previous recording, from another hood wearer, is then transferred back down to the hood and any existing recordings are deleted/overwritten. When the hood is worn up this shared recording from another hood wearer is played back: light values via an LED strip around the hood edge, steps taken via a vibration motor at the chest, and sound values via speakers at ear level.
We began this project with three wholly different directions: linking two places digitally and letting inputs filter across, wearable technologies such as transformable clothing, and creating hardware that acts as a driver for a new form of social network. Brainstorming sessions distilled out the idea that resonated most with each team member: creating shared experiences through wearable technology.
Developing this idea began by first understanding what it would be like to use our project, and why one would use it. Discussion revolved around individuals who may currently be having particularly bad experiences. Our thought was that perhaps we could offer a means of escapism: a wearable device that would project oneself into another person's experience, a better experience. We also discussed the possibility of the opposite being true. Perhaps individuals having a bad experience could be comforted by the presence of people piggybacking off their experience and offering some form of solace. This would have taken the form of a simulated hug.
The form factor of the hood came out of discussions of devices and clothing that act to remove people from the present space, in particular, reflections on people wearing hoods and headphones on public transit. In our interpretation, these individuals seem detached from the reality around them.
We also had to take stock of which inputs and outputs could be shared. We explored sound, light, breath, hugs, pulse, vibration, GPS coordinates, and social media feeds. Weighing cost, complexity, and feasibility quickly led us to our final three inputs/outputs. Transmitting a hug was far too complex and potentially heavy, GPS data was costly, and, as with social media feeds, the form its output would take was unclear.
Tasks and Roles
Development on the hood generally broke down into three main categories and three sub-categories. The main categories consisted of the hood, the hood code, and the visualization code. The sub-categories consisted of the hood hardware, data, and branding. Each team member was charged with leading one main category while also contributing to the two related sub-categories. The following diagram illustrates the categories, their relationships, and the team members assigned to them:
Flow of ideas and work between Yuxi, Hudson, and Maz
Hardware / Software – Prototype Unit
Initial development focused primarily on the hardware, developing a prototype of the sensors that would eventually find their way into the hood. This prototype guided decisions made in other aspects of the project. For instance, it was from this prototype that we discovered we needed space for two battery packs in the hood (see power limitations). There were cases, however, where external influences affected the layout of the prototype; our initial idea to use a flex sensor to detect the state of the hood proved impractical, so the prototype design was changed to a switch.
The Arduino can provide power out in three ways: 5V at 40 mA from each I/O pin, 3.3V at 50 mA from the dedicated 3.3V pin, and 5V safely up to about 500 mA from the dedicated 5V pin. The devices underpinning our hood require widely varying levels of voltage and current. Powering directly off the I/O pins and the 3.3V pin, and using the 5V pin in combination with TIP120s to regulate the power flow, seemed to be the power solution we required. Sadly, I found that drawing current from the 5V pin with our breathing-simulation fan would cause our sensor data, acceleration and frequency, to fluctuate, making high-precision data logging impossible. A low-pass filter to regulate the voltage was a likely solution, but my implementation of one seemed to eliminate all sensor variation altogether. The only present solution was to power the fan off a separate power supply, despite the added weight.
Prototyping / development unit
Fritzing Diagram of the hood’s circuitry
Storing and Recalling Data
Arduino code flow / if statement decision tree.
Recorded experience data is saved into arrays in the internal memory of the Arduino, with one array for each of the following: step time, light change time, light change value, sound change time, and sound change value. The two arrays for light, and likewise the two for sound, are linked via a shared counter value used for incrementing through both recording and replaying. Step also has a counter used to increment through step times, but no associated value array (such as step intensity).
The data in these arrays is what is transferred whenever there is communication between our Arduino-equipped hood and our data-visualizing and handling Processing sketch. An extra byte is transmitted first to identify the data type, then the array is stepped through and transmitted with comma delimiters.
To Arduino and Processing, this data looks like…
A stylized, human readable (used for debugging), version of that same data…
Step # : 0 , at time: 11
Light # : 0 , at time: 5 , with brightness: 128
Sound # : 0 , at time: 5 , with frequency: 0
Step # : 1 , at time: 944
Light # : 1 , at time: 870 , with brightness: 127
Sound # : 1 , at time: 956 , with frequency: 1069
Step # : 2 , at time: 1815
Light # : 2 , at time: 1763 , with brightness: 125
Sound # : 2 , at time: 1208 , with frequency: 0
With this system of arrays, memory capacity was an ever-present concern. Recording any more than 50 data points per sensor was a surefire way to crash our Arduino Uno test system, because the hood is wholly reliant on the Arduino's built-in memory. Suffice it to say, that does not provide much space for data storage. Time constraints prevented any deep exploration into using an SD-card reader to expand the memory. To be functional without the expanded memory an SD card would provide, I had to be very strict about how often data was logged, or else run the risk of running out of space in seconds.
Hardware – The Hood
Pattern, Fabric, Sewing and adding details
Concept: providing a private mental space anywhere.
Solution: a large-sized hood, covered eyes, and thick but soft fabric.
Technical requirements: the speakers, fan, and vibration motor have to be placed in the right spots in order to guarantee the experience.
Solution: nylon fastener tape, snaps, and fabric pocket boxes.
Hood component pockets
Design requirements – To hide the wires
Solution: two layers, covers and pockets.
Hood sensor layout sketch
Hood circuitry and control system (Arduino)
Hooking up circuits
Due to lack of experience, we decided to use a breadboard, which caused us a lot of trouble at first. We later realized it would have been better to solder everything onto a protoboard from the start. We decided to make the change on the day of the presentation. Unfortunately, we did not have enough time to test everything, and the system broke before the presentation. It was a hard lesson for the team. I realized that it is very important to fully understand the specific features of different circuit boards and parts before implementing them in a project, as well as to test their suitability for wearable gadgets, so that in future projects we can choose the appropriate parts and minimize the chance of malfunctions.
Hood breadboard circuit – initial sensors’ connection to Arduino
Hood protoboard circuit – final sensors’ connection to Arduino
Up/Down detecting sensor
In order to realize our concept, hood down for recording and hood up for replaying, we needed to find the appropriate sensor to detect the hood's movements. We tested a number of sensors, including a tilt sensor, a flex sensor, a force sensor, magnets, snaps, and a homemade sensor (nuts). Although all of them worked, we were still not satisfied because the fabric could not hold them very well. In the end we decided to use clasps, which were not as good as the other sensors tested, but worked consistently.
1. The hood could not be worn fully down, because the Arduino was right above the fan; when the hood was down, the fabric could not bend around the Arduino.
Solution: I moved the Arduino to the spot where the battery pack was held (relocating the battery pack elsewhere), and the problem was solved.
2. Due to the technical requirements, we had to add one more battery pack. The problem was that placing both battery packs at the back of the hood made it too heavy and unbalanced.
Solution: I used the extra fabric to make two straps and placed one battery pack at the bottom of each, hanging them in front of the user's body. This balanced the weight between the front and the back of the hood.
3. The most difficult thing was keeping the fan facing the back of the hood wearer's neck at all times; the fabric was too soft to hold the fan at this exact spot.
Solution: I ran the two straps (mentioned earlier for the battery packs) across the fan area. When the hood is worn, the weight of the battery packs stretches the straps, which in turn pulls the back of the hood up while holding the fan in the right place. In addition, I created a "scarf" at the bottom of the hood, which wraps around the user's shoulders and steadies the entire hood.
Hood internal layer (inside out) and fan with cover
Reflection from the project
From this project I realized that making clothes for wearable technology is tricky; sometimes the concept of the project can greatly influence the design. However, I really enjoyed the process and loved solving the challenges through better design solutions. I have a strong passion for wearable technology, not only because I enjoy making clothes, but also because I want to create things that are more practical to use. Working out the conflicts between design and technical needs is my favorite part of the process. Starting from this project, I want to pay more attention to the current state of wearable technology, as well as the future trends of this industry.
Inside of the hood
Outside of the hood
Sensor locations on hood
Software – Visualization
Data visualization design, hand sketches
As a general rule, we opted for iconic visuals to represent the user's input data. Our frame of influence was web icons, software navigation icons, and video game symbols. We thought to include a pulsing heart or lung in the corner of the data viz display to reflect the "breathing fan" embedded in the hood, but the visualization got a bit cluttered, so it was excluded. Obvious choices for the sound input were speakers and EQ waves, but a speaker icon emitting pulsing waves was finally chosen for its simplicity and universal recognition. A light bulb was chosen to represent the light input for similar reasons. Lastly, for the movement input, we played around with a few whimsical angles, such as GIFs of the Beatles walking across Abbey Road, and the stop hand and walking man found on street lights. Eventually, we all fell in love with a simple walking-man GIF that also matched the style of the other two icons. Consistency in design choices was key to maintaining the aesthetic quality of the visualization.
Final Processing data visualization software
I received tremendous technical support from Hudson regarding the engineering of the code: how the data viz receives input from the Arduino, and how the average data sets for each user experience are switched at the command of the viewer. The Processing library GifAnimation provided some support for integrating the GIF files into the display, and from that I learned to adapt them for our specific requirements.
It was important to me that the data viz display be very much a part of the onemile experience and not simply something patched on. To address this, I kept the branding of the hood and the data viz display consistent and unified. This involved continual coordination of colour, shape, and texture with Yuxi.
Additionally, I wished to engage the hood wearer with the visualizations of their data beyond just looking at it. My intent was to have them engage directly with the data. Specifically, regions of the display would respond dynamically when moused over them, providing additional information about their data in a look not unlike the rest of the visualization so as not to be too jarring.
Ideally we would augment the memory of the Arduino with an SD-card reader; time constraints prevented this from being an option for this iteration. Additionally, our current data storage and recall methods, both on the hood and in the visualization system, limit our ability to store large amounts of data or to recall any data after power-off. We explored the possibility of storing data in a text file on the computer for later recall, but again found time to be a limiting factor.
Our team explored the possibility of recording light colour in addition to light intensity, but found the hardware solutions available to us lacking. The distance at which colour could be perceived by the colour sensor was limited to only a few centimetres. As such, we chose not to implement a colour sensor in this project, though the thought of providing such data is still appealing and worth further investigation.
Sound detection is another area where hardware proved to be a limiting factor. The microphone in our current setup reports frequency but fails to report amplitude. As a result, we are not able to get a true reading of loudness around our hood as we had intended. Instead we are only able to report on sound activity (changes in frequency).
Control Unit and Power
The hood is very heavy at the moment. This is partly a result of the two power supplies and partly because of the Arduino we used.
Regarding the batteries, initially our design had called for a single power supply to power the Arduino which, in turn, would power the rest of the hood. We had to scrap this due to interference created in our sensor data by the fan powering on and off. A low-pass filter to regulate the voltage is an ideal solution and we’d like to explore this more.
Similarly, our choice of Arduino was not our first selection. Our Arduino Uno test system, a unit with 32KB of flash program memory (and just 2KB of SRAM for variables), would lock up from lack of memory when recording more than 50 sensor events. For comparison, the Arduino LilyPad, a better Arduino for wearables, has only 16KB of flash, only 14KB of which is accessible, while our sketch alone is 13KB. The Arduino LilyPad, at least in our project's current configuration, does not work.
Unlike our original concept, the hood doesn't transmit or play back data in real time. Our concept had originally been to stream data in real time over wifi networks at a high bit rate, but there did not appear to be much precedent in this area. As such, we scaled back to a better documented technological solution: using Xbees to transmit recorded data logs. Ideally, given enough time, we would explore and implement our initial concept of wifi data streaming.
The following is a link to a Dropbox folder containing our code:
Sources / Inspiration
Hood and Wearable Technology
A vest that gives you a squeeze for every Facebook “Like” you get
By Melissa Kit Chow
Scarf hood trend
As found on Google image search.
We explored many devices that log daily activity, such as step count and sleep patterns. This research proved vital when we began to visualize our logged data. Of particular help was the way the Jawbone Up logs step data, as it cannot be graphed in the same manner as light or sound.
Nike+ Fuel Band
Nicholas Felton’s Personal Annual Reports that weave numerous measurements into a tapestry of graphs, maps and statistics that reflect the year’s activities.