Category: Experiment 4

Buzzervator


Code: https://github.com/Shreeya2017/C-C-Project-Four/commit/f95958c1e8cde07f2e3cc1de9d2c9b72142f4ab1

Loudness-Tracker [EXPERIMENT 4]

LOUDNESS TRACKER
April Xie

loudness-tracker

https://tinyurl.com/loungeloudness

CIRCUIT & CODE

https://github.com/apexies/C-C4/tree/master/microphone4
https://github.com/apexies/C-C4/tree/master/P5.JS%20LoungeLoudness

DATA

https://docs.google.com/spreadsheets/d/1uzDBl6x5vmKJvUTr3oiw81ATKyA8yIOmrQI_Ud7pc6A/edit?usp=sharing

Loudness Tracker is an ambient noise-level monitor that hangs discreetly from the ceiling of a public study space and sends noise levels to a web-interface dashboard in real(ish) time, so people can decide in advance whether they want to study there. This prototype is made for and installed in the Digital Futures graduate lounge. Loudness Tracker explores ideas of monitoring noise pollution, “smart” buildings, and the implications of making ambient data available to users remotely.

************************************PHASE 1: IDEATION***********************************

1A) EARLY IDEAS

I began ideation by thinking about various sensors I wanted to explore, and their capabilities.

img_3609

  • Early ideas included:
    • GPS Garden Gnome – Where will kind strangers take him?
    • Monitoring paper towel dispensing and usage in bathroom, showing how many trees have been used in a day
    • Punching bag
    • Pulse monitor for passersby’s heart rates over the day in the 100 McCaul foyer

I settled on wanting to create an ambient monitoring tool that would 1. be of use to students in an everyday context, and 2. explore the idea of ‘smart’ environments, and how such a device would change relationships to an interior, for better or worse.

I also wanted a simple concept, as I had basic coding and hardware skills. As I mulled over these ideas, I happened to be in the Digital Futures graduate lounge when it was particularly bustling and loud. I was distracted, grumpy, and not very productive. I thought, “I wish I knew where the quieter spots on campus were right now”. Bingo.

I sought to make a “smart interior” device that would allow students to track noise levels in various rooms, monitor how loud a particular room was at a given time, and decide in advance if that is where they’d like to study.

Many things could be mined from the data: assessment for ‘noise pollution’ levels within various study spaces, correlation with surveyed productivity levels, surveyed perception of the noise level tracker, and whether it influenced perception of the space, etc.

1B) CONTEXT & RESEARCH

Themes to investigate: Sound levels and productivity; surveillance and monitoring; “smart” spaces and ambient data collection

Some questions:
How do other ambient data projects execute their designs?
Do they blend their hardware into the environment, or make them conspicuous? Why?
Where does the data go? Is it made public, is it used by the public?
What are the implications around the rhetoric of “smart objects/environments” – a sense of control over the environment; linear, predictable, safe, quantified, “conquered” information?
Do smart environmental sensors encourage participation and thinking about the sources of trends shown in data, and about whether the status quo is desired? Does making invisible phenomena visible incite behaviour change?

Sound levels and Productivity

  • Research shows sound levels have a profound effect on our mood and well-being. In particular, productivity is heavily affected by the levels of ambient noise around us. There is a “sweet spot” of ambient noise for creative thinking – a low buzz, like in a coffee shop. An environment that is either too loud or too quiet decreases productivity to varying degrees, depending on the person.
    • “Office workers … have chosen sound levels between 48 and 52 dBA for ideal work settings” Link
    • “Sound affects us physiologically, psychologically, cognitively and behaviourally, even though we’re not aware of it” Link
    • “Moderate (70 dB) vs low (50 dB) level of ambient noise enhances performance on creative tasks, and increases the buying likelihood of innovative products. A high level of noise (85 dB), on the other hand, hurts creativity” (http://www.jstor.org/stable/10.1086/665048)
  • There is a flurry of apps and sites on the market that recreate the “sweet spot” of ambient noise for enhancing creative output:
    • Coffivity: “The Sounds of Productivity” – continuous loop of coffee shop noise
      coffitivity.com
    • Ambient Mixer: lots of white noise loops, can combine, adjust, mix noise tracks
      itunes.apple.com/us/app/ambient-mixer-free/id731882796?mt=8
    • Focus@Will: “scientifically optimized music to help you focus” – music in phases sequenced to follow natural attention span
      www.focusatwill.com/app/pages/v7
    • Raining.fm
    • Rainymood.com

“Smart” buildings, cities

  • Temboo.com: creates IoT objects for smart-city monitoring, e.g. displaying district noise levels against the number of restaurants and bars.

  • Urbiotica – “U-Sound noise monitoring sensor: wireless acoustic sensor designed to measure noise in the city. It makes continuous measurements and every minute, it sends a new equivalent continuous sound level value (LAeq1’) through the communication network. This information is available in the cloud and can be consulted in real time remotely”

  • Smart cities
    • “Smart cities are places where information technology is combined with infrastructure, architecture, everyday objects, and even our bodies to address social, economic, and environmental problems” Anthony B. Townsend. (2014) Smart Cities, W.W.Norton & Company Link
    • “Projects of smart cities have an impact on the quality of life of citizens and aim to foster more informed, educated, and participatory citizens. Additionally, smart cities initiatives allow members of the city to participate in the governance and management of the city and become active users” Link
    • Dr. Guy Newsham, National Research Council Canada, 2009: project to develop and link arrays of indoor environment sensors to improve buildings’ environmental health, energy efficiency, maintenance, and occupant comfort. Link
    • “The ICT sector has not lived up to the expectations of the media, city governments and other organisations with respect to its role in intelligent urbanisation…failure in the industry to look holistically at the smart city, to clearly articulate in a meaningful way a vision of how ICTs could enable a different and better life, and to bring citizens, politicians and others along on that journey.” Link

1C) EARLY TESTING & IDEA REFINING

  • First concept: make a deliberately conspicuous object, one each for two to three rooms, that would visualize the current noise levels in the room for people to take notice of. With this object, there would be a sign inviting passersby to follow @loudnesstracker on Twitter to see which of the rooms was the quietest every 30 mins.
    • Object would be a box that sits on a desk or windowsill, and features a dynamic VU reader with LEDs.

      f1vhbc2gow3yka1-rect2100

      Example VU reader

      img_3610

      Ideation sketch

    • Challenges:
      • microphone placed on table may not pick up ideal ambient noise levels. Object should be installed on the ceiling.
      • Doing multiple rooms simultaneously is ambitious. Start prototype with one room.
      • Attempted to make prototype of VU reader with LEDs – was too complicated. Need to get Feather printing to Adafruit.io and calibrate microphone first.
  • Second concept: 1 room, tweet noise level of room every 30 mins. OR, tweet only if it’s been low for a certain amount of time (certain amount below average). 
  • Design decision: “news” of the data (past 20 minutes have been quiet), or live data?
    • News of data – Twitter (tweet only every 30 mins or so, otherwise overwhelming and not a useful account)
    • If live(-ish), Twitter is not the best platform. Would have to make a dynamic dashboard that people can open and see movement in real time
      • Live(-ish) because data can’t ever be truly live – a certain number of readings must be taken and averaged before the number is useful.

Final decisions

  • Make one reader for one room
  • No VU reader – focus on getting the microphone up and running only.
  • Make a live(ish) dashboard with P5.js; no Twitter

************************************PHASE 2: BUILDING***********************************

2A) SETTING UP PARAMETERS FOR MIC

  • Electret Microphone with breakout board –  records amplitude level for measuring sound levels.
  • Amplitude: size of the vibration that determines how loud the sound is
  • Not scientifically accurate, but good enough for proof of concept for prototype
  • In future, would need to use a sound pressure level meter and read decibels

2B) CODE AND CIRCUITS

  • Step 1: Got basic input reading onto Adafruit.io
  • Step 2: Set up circuit
    • Voltage divider – use 3 resistors of the same ohm value, one in series, two in parallel. The parallel pair acts as R/2, giving a ratio of (R/2)/(R + R/2) = 1/3, which divides the 3.3 V output down to roughly 1 V (≈1.1 V) for the analog pin.
      img_3512
  • Step 3: Captured first readings of microphone to Adafruit.io 
  • Step 4: Digital smoothing – taking averages (see the sketch after this list)
  • Step 5: Created thresholds for loud, medium, quiet
    • Considered coding thresholds as ratios calculated against dynamic average sound level over time. Ended up being too complicated. Decided to make static thresholds.
    • Experimented with different sound levels at different distances
    • At approximation:
      • Quiet ambient sound sat at below 100, low murmur far away 100-200
      • Talking with mic 2 feet away – 500
      • Talking close into mic 500 and up
      • Blasting music next to mic – 700, 800, 900
    • Made the following thresholds:
      • 500 and up: loud
      • 200-500: medium
      • 0-200: quiet
  • Step 6: P5.js Sketch for live(ish) dashboard
    • Used Nick’s example code to make the dashboard. It draws a circle whose diameter is half the average peak-to-peak level printed to Adafruit.io
    • Modified code as follows:
      • Loud threshold turned red, caption would say “loud”
      • Medium turned orange, caption “medium loudness”
      • Quiet turned green, caption “quiet”
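
To make Steps 4 and 5 concrete, here is a minimal sketch of the smoothing-and-thresholds loop, assuming the Adafruit IO Arduino library, the mic on A0 behind the voltage divider, and an illustrative feed name; the actual code lives in the GitHub repo linked above.

```cpp
// Sketch of Steps 4-5: peak-to-peak sampling, smoothing, static thresholds.
// Credentials, pin, and feed name are placeholders, not the project's code.
#include "AdafruitIO_WiFi.h"

AdafruitIO_WiFi io("IO_USERNAME", "IO_KEY", "WIFI_SSID", "WIFI_PASS");
AdafruitIO_Feed *loudness = io.feed("loudness");   // hypothetical feed

const int MIC_PIN = A0;               // mic signal after the voltage divider
const unsigned long WINDOW_MS = 50;   // one peak-to-peak sampling window
const int NUM_WINDOWS = 20;           // windows averaged per posted reading

// Measure peak-to-peak amplitude over one 50 ms window.
int peakToPeak() {
  unsigned long start = millis();
  int sigMax = 0, sigMin = 1023;
  while (millis() - start < WINDOW_MS) {
    int sample = analogRead(MIC_PIN);
    if (sample > sigMax) sigMax = sample;
    if (sample < sigMin) sigMin = sample;
  }
  return sigMax - sigMin;
}

void setup() {
  Serial.begin(115200);
  io.connect();
  while (io.status() < AIO_CONNECTED) delay(500);
}

void loop() {
  io.run();                           // keep the Adafruit IO link alive

  // Step 4: digital smoothing -- average several windows (~1 s of sound).
  long sum = 0;
  for (int i = 0; i < NUM_WINDOWS; i++) sum += peakToPeak();
  int avg = sum / NUM_WINDOWS;

  // Step 5: the static thresholds found by experiment.
  const char *label = (avg >= 500) ? "loud"
                    : (avg >= 200) ? "medium" : "quiet";
  Serial.print(avg); Serial.print(" -> "); Serial.println(label);

  loudness->save(avg);                // dashboard derives colour from value
  delay(5000);                        // throttle posts to respect IO limits
}
```

The P5.js dashboard then only has to read the latest feed value, scale it to a circle diameter, and pick red/orange/green from the same 200/500 cut-offs.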

giphy

2C) DESIGN

Decided to not put the breadboard in a container – the circuit was small, and I thought it would be interesting to have an obvious prototyped microphone hung from the ceiling. It would make people aware they were being monitored, when most ambient monitoring devices try very hard to blend into the environment. I wanted to gauge people’s reactions.

Bought a two-pronged coat hook that can be hung off a door. The curve and size of the hooks fit the breadboard perfectly. The back of the breadboard was duct taped to the hooks.

img_3522

Played around with encasing the breadboard in a homemade parabolic curve, to amplify the sound waves picked up by the microphone. DIY solutions were found on the internet.

A small metal salad bowl was bought for the parabolic curve; the bowl and the metal hooks could potentially be soldered together:

img_3515

In the end, the metal bowl was not incorporated into the design, due to lack of time, and the final prototype was left as is. A 5 V wall adapter was used for power.

********************PHASE 3: RUNTIME – NOV 23 12:25-8:25PM ************************

3A Setup

  • Hung from the ceiling, with a sign on the door inviting students to visit the web-interface dashboard

3B Last test before run

  • Decent responsiveness
  • Lag in Adafruit.io data – readings would often drop to zero for 30 seconds and freeze (too much data?). The circle dashboard was therefore laggy, and neither accurate nor very real-time
  • E.g., while playing a loud song next to the mic, the dashboard kept going back down to “quiet”

3C Runtime data

Red zone = loud
Orange zone = medium
Green zone = quiet
graph-withlevels

4787 total readings, 598 readings printed / hr

readings

Feather freezing: There was a lot of freezing in the middle of the 8-hour period; the Feather had to be restarted. It was difficult to extract patterns from the data because the freezing made the readings inconsistent.

  • 12:25-3:25 – no restarting, minimal freezing
  • 3:25-4:25 – froze for 30 minutes, half the hour
  • 5:25-6:25 – stopped working at 5:02; noticed and restarted the Feather at 6:11 PM
  • 6:25-7:25 – stopped working again at 6:28; unplugged it for a while, as the Feather was very hot. Restarted at 6:38; seemed to work OK again after
  • 7:25-8:25 – stopped working for 17 mins

Other bugs

  • General – readings would often drop to zero for 10-20 seconds
  • Some very high readings – 900 (not sure how)

People’s general perception of object during 8 hours

  • Most people didn’t notice microphone strung up on ceiling
  • Those who did notice were not compelled to change any behaviour. They were however curious to see the dataset to reaffirm what they already knew – it gets really loud in the lounge.
  • Four people said they looked at the URL dashboard; all said they would like a service that showed noise levels across several rooms in real time. The utility of Loudness Tracker in only one room is questionable

3D Future Iterations

  • Freezing: Need to send even less data through to Adafruit next time. Averages over longer periods of time.
  • Sound level calculation: refine, use decibel reader
  • Calculate averages over a certain amount of time rather than a certain number of readings
  • Make decibel thresholds for different levels of productivity, best levels for creativity, best levels for different tasks (memorization, focus)

CANDY LOVE

Project Title :
Candy Love

Project Description :
The objective of my project was to collect data over an 8-hour duration (over 2 days) while keeping the concept simple, effective, engaging, fun and interactive. ‘Candy Love’ emerged after much brainstorming and a process of elimination. The initial idea was to work with a sensor that detected the opening of a door, using the Adafruit HUZZAH ESP8266 WiFi microcontroller board to notify the owner by email/tweet/text when their door was opened – a concept that would be way more cost effective than a guard dog, or an alarm system for that matter. However, I wanted to turn this into an exciting and fun project, so I decided to use a personal locker in the Digital Futures lounge with a similar setup and a CTA (the big surprise, the bait… candy!) that would draw people to engage with it by opening/shutting the locker door. The result was very effective. Not only did the locker’s active and resting stages result in a lot of data, but the candy vanished as well… as you can imagine! Everyone had fun, including me 🙂

Video Presentation :


Contextual Images : 

img_7472        img_7471

20161121_160032        20161121_160111        img_7473


Photographs / diagrams that describe what you learned from the testing :

screen-shot-2016-11-21-at-2-36-21-pm   screen-shot-2016-11-22-at-3-27-55-pm    screen-shot-2016-11-22-at-5-15-26-pm
graphscreen-shot-2016-11-22-at-8-53-02-pm


Circuit Diagrams :

circuit-board circuit-board


Code & Overall Process :
https://drive.google.com/drive/folders/0B0v0C2NFCXOiYU9NV0FqLUMyYXc?usp=sharing
https://drive.google.com/drive/folders/0B0v0C2NFCXOiUWN6WVpjSkVzcE0?usp=sharing

The process consisted of 3 primary stages: an Arduino sketch uploaded to the Feather Huzzah/breadboard, which was in turn connected to Adafruit IO for data collection, and IFTTT for data visualization through an automated Excel spreadsheet. The resulting sheet was then converted into a graph in Adobe Illustrator representing the number of times the locker door was opened during the period (the spikes in the diagram) and the time it remained in its resting stage (the plateaus). The circuit board (with the Feather Huzzah) was also connected to a charged portable power supply, allowing it to work continuously for 4 hours at a time over 2 days. The candy jar was placed in the middle of the locker. One challenge was ensuring that the candy could be taken without touching the setup. Perhaps the next iteration could have a better execution, so that the circuit board and power supply are hidden and less precarious. Another challenge was the time code in the Excel sheet, which seems to show Indian Standard Time even though I changed the settings to our current time zone.
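
As a sketch of how the sensing stage can work, here is a minimal door-switch loop modelled on the “Adafruit IO Basics: Digital Input” guide cited under Project Context below; the pin, feed name, and credentials are illustrative, not the project’s actual code.

```cpp
// Minimal locker-door sensor: post 1/0 to an Adafruit IO feed on each
// open/close, so IFTTT can append a timestamped row to a spreadsheet.
#include "AdafruitIO_WiFi.h"

AdafruitIO_WiFi io("IO_USERNAME", "IO_KEY", "WIFI_SSID", "WIFI_PASS");
AdafruitIO_Feed *door = io.feed("locker-door");   // hypothetical feed

const int DOOR_PIN = 13;      // switch that changes state when door opens
bool lastOpen = false;

void setup() {
  pinMode(DOOR_PIN, INPUT_PULLUP);
  io.connect();
  while (io.status() < AIO_CONNECTED) delay(500);
}

void loop() {
  io.run();
  bool open = (digitalRead(DOOR_PIN) == LOW);   // active-low with pullup
  if (open != lastOpen) {       // send only on change: each change is one
    door->save(open ? 1 : 0);   // spike or plateau edge in the final graph
    lastOpen = open;
    delay(50);                  // crude debounce
  }
}
```

IFTTT then watches the feed and appends each value with a timestamp to the spreadsheet, which is what produces the spikes and plateaus graphed in Illustrator.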

Project Context :
https://learn.adafruit.com/using-ifttt-with-adafruit-io/overview
https://learn.adafruit.com/adafruit-io-basics-digital-input?view=all
https://www.youtube.com/watch?v=kpEON8j2rEI
https://io.adafruit.com
https://ifttt.com/create/if?sid=0

The reference links above inspired me and directly contributed towards my project. The use of a sensor with the opening and shutting of a door to collect data that informed us of how often it was opened was the primary goal.

#hashFeed

#hashFeed is an attempt at having ‘close to real time’ hashtags trigger a microcontroller, and having those triggers do interesting things with light and sound. The web service If This Then That (IFTTT) routes Twitter to adafruit.io, which then hands it off to a Feather Huzzah. The proof-of-concept video shows a frequent series of #trump and #thanksgiving tags triggering alternate LEDs on the Huzzah at one-second intervals.
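
Here is a minimal sketch of that trigger path, assuming IFTTT applets that append each #trump or #thanksgiving tweet to its own Adafruit IO feed, and using the Feather Huzzah’s two onboard LEDs; feed names and credentials are illustrative, not the actual setup.

```cpp
// Blink one LED per hashtag feed as messages arrive from Twitter via
// IFTTT -> Adafruit IO. Feed names and credentials are placeholders.
#include "AdafruitIO_WiFi.h"

AdafruitIO_WiFi io("IO_USERNAME", "IO_KEY", "WIFI_SSID", "WIFI_PASS");
AdafruitIO_Feed *trump        = io.feed("hash-trump");         // hypothetical
AdafruitIO_Feed *thanksgiving = io.feed("hash-thanksgiving");  // hypothetical

const int LED_RED  = 0;   // onboard red LED on the Huzzah (active-low)
const int LED_BLUE = 2;   // onboard blue LED (active-low)

// Light a pin for one second when its feed receives a message.
void pulse(int pin) {
  digitalWrite(pin, LOW);
  delay(1000);
  digitalWrite(pin, HIGH);
}

void handleTrump(AdafruitIO_Data *data)        { pulse(LED_RED); }
void handleThanksgiving(AdafruitIO_Data *data) { pulse(LED_BLUE); }

void setup() {
  pinMode(LED_RED, OUTPUT);
  pinMode(LED_BLUE, OUTPUT);
  digitalWrite(LED_RED, HIGH);    // start with both LEDs off
  digitalWrite(LED_BLUE, HIGH);
  io.connect();
  trump->onMessage(handleTrump);
  thanksgiving->onMessage(handleThanksgiving);
  while (io.status() < AIO_CONNECTED) delay(500);
}

void loop() {
  io.run();   // polls Adafruit IO; the handlers fire on new feed data
}
```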

The project is submitted as a graduate school assignment, and a key part of the assignment was to make an untethered object running on our local network, a la IoT. That thing was then to collect or display data on the local network, running autonomously for at least eight hours. On this requirement my submission falls far short, as it displays little staging, zero build and was unable to run for more than thirty minutes without crashing.

Though the actual physical device is unfinished and laughably simple, the underlying network tools and data collected through Twitter, IFTTT and Adafruit fill up quite a few cells on a spreadsheet. The total hashtag count as of this writing is over fifty thousand, and counting!

 

Here is the proof of concept video (including crash):

 

Next up is a visualization of data collected through Twitter > IFTTT > io.adafruit, animated as an interactive graph in Keynote, then exported as a video. Initial hashtags like #kitten, #puppy, #taylorswift and #drake were put aside for some tags with a little more action. Notice the huge spike of #gilmoregirls around the 0:04 mark, I believe just a few hours before it was released on Netflix.

 

 

The same spike can also be viewed in the following graph, made in WPS Spreadsheets (an Excel-like program).

 

fifty-hours-trends

 

Here is the ratio of tweets, surprisingly quite even.

 

tweet-ratio-breakdown_large

 

An earlier working idea for the output of the microcontroller involved a ‘climbing robot’ that would move up and down a rope or string, untethered from a power outlet. Though the plan wasn’t executed (for a number of reasons, including a lack of justification for the build), considerable time was spent conceptualizing and planning for the robot (all components have been purchased).

Here is an animation representing how a modular ‘climbing robot’ array could work.

 

 

Context

Big Data and the Internet of Things are both colossal agents of change. Data is torrential and objects are being connected, making the internet(s) at once both very ephemeral and very real. The possibilities opened up by this exercise are awe-inspiring, pointing towards the mountain of constant daily data and my brain’s inability to process more than a fraction of it.

Much is missing from this submission, including documentation of considerable research into the integration of an audio board. There was also little to no exploration of other artists or similar projects, of which there are undoubtedly many. The plan is to continue with this project in the near future, making this a beginning of sorts.

Much thanks to Professors Hartman & Puckett, as well as to Junjun Zhu, for key help with coding and spreadsheet calculations.

 

KatieCam: Single Pixel Live Data Performance

Project Description:
KatieCam is an experiment that examines whether or not movement and narrative can be communicated through minimalist colour changes over time.

For this performance I wear an ultrasonic sensor on my neck which captures and records distance from objects or people as I move through physical space. This information is uploaded live to a website which translates proximity into shades of grey – the closer I am to an object or person, the darker the grey (or black – the least amount of light in between) the image becomes, while the further away the object or person, the lighter the grey becomes.

Greyscale is assigned to distance as follows:
0–30 cm → 20 (RGB grey value)
31–60 cm → 60
61–90 cm → 100
91–120 cm → 150
121–150 cm → 200
151–200 cm → 225

The image on the website is a simple flashing square – this is my data visualization.
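
A minimal sketch of that mapping, assuming an HC-SR04-style ultrasonic sensor and an illustrative feed name; the actual code is in the gist linked below.

```cpp
// Read distance from an ultrasonic sensor and post the corresponding grey
// value from the table above. Pins, feed, and credentials are placeholders.
#include "AdafruitIO_WiFi.h"

AdafruitIO_WiFi io("IO_USERNAME", "IO_KEY", "WIFI_SSID", "WIFI_PASS");
AdafruitIO_Feed *grey = io.feed("katiecam");   // hypothetical feed

const int TRIG_PIN = 12;
const int ECHO_PIN = 14;

// Distance in centimetres from a trigger/echo ultrasonic sensor.
long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // echo time, ~5 m timeout
  return duration / 58;                            // microseconds -> cm
}

// The greyscale bands from the write-up: closer = darker.
int toGrey(long cm) {
  if (cm <= 30)  return 20;
  if (cm <= 60)  return 60;
  if (cm <= 90)  return 100;
  if (cm <= 120) return 150;
  if (cm <= 150) return 200;
  return 225;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  io.connect();
  while (io.status() < AIO_CONNECTED) delay(500);
}

void loop() {
  io.run();
  grey->save(toGrey(readDistanceCm()));  // website paints its square this grey
  delay(2000);                           // throttle posts to Adafruit IO
}
```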

Reactions:
The reactions to this project vary based on who my audience is. Audience members who are in my physical space are more interested in the object of the ultrasonic sensor on my neck – it is a confrontational device that looks similar to a camera, and is out of place on my body, causing people to question its presence or laugh at its silliness.
Audience members who only experienced this performance via the website often reported that it ‘wasn’t working’, that it was boring, and that I was lazy (not moving enough). Interestingly, other audience members reported that it ‘made them feel closer to me’, that ‘it was nice to see me’ and that it was an actual connection.

Project Context:
When presented with ‘Experiment 4’ it seemed plain to me that this was a surveillance assignment. I believe this is an important territory to investigate in the Digital Futures program, because it seems that many students are interested in how to create effective designs based on live input, causing a reaction. I believe that any sensor or monitoring system with the intention of ‘correcting’ a ‘problem’ is surveillance. And I believe the implications of creating digital objects that provide feedback are only moving deeper into private spheres for the sake of perceived efficiency, without considering how these inventions are just extensions of systemic control. If every object in my home is reacting to my body, that information is collected, analyzed, eventually monetized, and inevitably used to enforce an ideal situation.

It was important for me to show the sensor, so I chose to wear it. The reason for this was to turn myself into a monitoring device; even though I’m monitoring my own distance from objects and people, those objects and people are participants in the experiment without consent.
I did not want to create a scenario where people had to ‘do work’ for my benefit, so I chose a passive interaction and integrated the sensor into my routine.
And, as mentioned above, I was most interested in seeing if any type of narrative or interaction could be interpreted by simply changing a colour on a screen.

Art History and other influences:
Minimalism and Abstract Expressionism influenced my output aesthetic: I kept the elements on the canvas only to what was necessary to communicate what was important in the project. This, of course, is also a performance. I am also interested in the concept of the ‘cam girl’, who performs through the Internet on live video. I am interested in aligning myself with net-based performances, as they expand audience possibilities, offer new ways of representing the self or body online, and have an embedded archive.

Links:
KatieCam: https://webspace.ocad.ca/~3159294/katiecam/ 
Video: https://vimeo.com/193080829
Collected data via Adafruit IO (Googledoc): https://docs.google.com/spreadsheets/d/1sndZhDYitX74XOI2uPjxQdP8ZDzELPuDFrwHQxOKuEc/edit?usp=sharing
Github: https://gist.github.com/katiemicak/ef47b786486a6af0b8651554d3e2b6c4

Process images:

Initial drawings of concept:

-shows how I would wear the ultrasonic sensor and the range it would cover

-describes possible online images as a result of performance

-charts distance and grey value

img_3676                                   img_3677

 

First tests:

screen-shot-2016-11-27-at-11-13-14-pm screen-shot-2016-11-27-at-11-13-31-pm

 

Final Image: not sure why it’s so blurry.

screen-shot-2016-11-27-at-11-17-12-pm

 

 

Board Diagram:

screen-shot-2016-11-27-at-10-54-00-pm

Language of Love – Orlando

Language of Love is a collaborative project between Rana Zandi and Orlando Bascunan. It features an 8-hour experiment in which we tracked our heart and respiratory rates to find patterns of affinity in a romantic couple.

Code: https://github.com/obascunan/BreathingTracker

The study behind this concept was performed by the University of California; it tested 32 couples, examining the interdependence of their physiological signals, and concluded that there is a detectable association in every task they performed.

Study: https://www.ncbi.nlm.nih.gov/pubmed/21910541

To execute this experiment we designed a wearable device that consists of a battery powering two Feather Huzzah boards: one connected to a stretch sensor to measure breathing, the other connected to a pulse sensor to obtain the BPM.

battery

The battery and boards rest in a pocket at the bottom of the shirt, so the weight and swinging of these components are less unpleasant for the users.

Thinking about the documentation and recording of this experience we chose to leave the sensors uncovered so the audience could have a better and faster visual understanding of how the device works.

 

Breathing Tracking

screen-shot-2016-11-27-at-6-58-07-pm

Measuring breathing rate is quite straightforward. A stretch sensor, made with conductive rubber, is attached to the chest and connected to the Feather. High and low points of the stretch are recorded with a timestamp to obtain the breathing rate information.
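
As a sketch of that timestamped peak detection (thresholds and pin are assumptions; the project’s actual code is at the GitHub link above):

```cpp
// Derive a breathing rate from a stretch sensor: each rise above a high
// threshold is an inhale peak; hysteresis avoids counting noise as breaths.
const int STRETCH_PIN = A0;
const int HIGH_THRESH = 600;   // chest expanded (inhale peak) -- assumed
const int LOW_THRESH  = 400;   // chest relaxed (exhale trough) -- assumed

bool inhaling = false;
unsigned long lastPeakMs = 0;

void setup() {
  Serial.begin(115200);
}

void loop() {
  int stretch = analogRead(STRETCH_PIN);

  if (!inhaling && stretch > HIGH_THRESH) {
    inhaling = true;                    // high point: record a timestamp
    unsigned long now = millis();
    if (lastPeakMs > 0) {
      float bpm = 60000.0 / (now - lastPeakMs);  // breaths per minute
      Serial.print("peak at ");  Serial.print(now);
      Serial.print(" ms, ~");    Serial.print(bpm);
      Serial.println(" breaths/min");
    }
    lastPeakMs = now;
  } else if (inhaling && stretch < LOW_THRESH) {
    inhaling = false;                   // low point: ready for next peak
  }

  delay(20);   // ~50 Hz sampling is plenty for breathing
}
```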

devices

With the graph of this information we can visually detect the patterns that emerge.

separate-readings

Video Production

rana-pic-2

Equipment:

  1. Canon T5i.
  2. Two white fluorescent light stands.
  3. A black backdrop.
  4. Two wireless chest microphones.
  5. One professional stationary microphone.

Editing software:

  1. Adobe Premiere & After Effects.
  2. GarageBand.

Audio:

  1. A downloaded theme song (open source).
  2. Phone-call conversation recorded and edited in GarageBand.
  3. Narration by Rana, recorded and edited in GarageBand.
  4. A downloaded heartbeat soundtrack (open source).
  5. Breathing sounds that were produced by us.

Storyboard:

  • Consists of 30 scenes.

Wearables:

 

Our main objective for the wearables was comfort. We had to decide where to place the hardware so that we would remain comfortable throughout the 8 hours. Also, we didn’t want to hide any of the hardware within the wearable (as part of the concept of our project, “displaying the language of love”).

Future Iterations:

  • Wearables: use thicker material for the shirts. It seemed that the hardware was too heavy for the cheap shirts we had purchased
  • Figure out a way to attach the hardware to the wearable properly, so that the wires don’t detach throughout the experiment
  • Run the experiment for a longer period of time, to allow behaviour to adapt to the idea of being analyzed.
  • Expand the team behind the work in order to get more precise data.

Questions Answered:

How does the device blend in or stand out in the environment?

The device is displayed on top of the shirts worn by the couple. “Displaying the mechanical language of love”

How does this device respond to the data it is collecting?

It just expands with its user

How does this behaviour evolve as it collects more data about or for this person/place?

It gives a more tangible and visual feedback regarding a very intangible feeling such as love.

What is the overall size and form of the object?

Shirts. (M size)

Does the object encourage interaction from others?

Yes. You and your partner would want to know how your breathing and heart change when you interact with one another.

References:

http://www.businessinsider.com/the-science-behind-zestx-the-bachelor-love-lab-2016-2

https://www.zestxlabs.com

http://www.dailymail.co.uk/health/article-2277586/Two-hearts-really-DO-beat-youre-love-Scientists-couples-vital-signs-mimic-other.html

https://www.ncbi.nlm.nih.gov/pubmed/21910541

me-working  rana-pic shirt-2 shirt-3 shirt storyboard-2 storyboard-3 streetch stretch together-white-shiet whatsapp-image-2016-11-27-at-19-27-10

Experiment 4: The Heart Beat

This project is about acquiring information about my heartbeat rate during regular activities. Our bodies are constantly working around the clock to keep us alive and healthy, and it’s important to know how to prevent any risks that could damage this cycle. I created a heartbeat monitor device that collects data from your heart and relays readings about your whole day.

VIDEO LINK

Why is monitoring your heart rate so important?

Did you know that your heart rate can be an important health gauge? And so, even if you’re not an athlete, or not even remotely considering becoming one, knowing your heart rate can help you monitor your fitness level and can potentially help spot developing health problems.

Heart rate, or pulse, refers to the number of times the heart beats per minute. Resting heart rate refers to the heart pumping the lowest amount of blood needed. The resting heart rate for most individuals is usually between 60 and 100 beats per minute. You will be able to take your pulse on your wrist, inside your elbow, side of your neck or top of your foot.

Richard Stein, M.D., professor of medicine and cardiology at the New York University School of Medicine in New York City, said, “As you age, changes in the rate and regularity of your pulse can change and may signify a heart condition or other condition that needs to be addressed.”

Active people tend to have lower heart rates as their heart muscle is in better condition and does not need to work as hard to maintain a steady beat. Lower heart rates are more commonly seen in people who get a lot of physical activity or are very athletic.

“If you’re very fit, it (resting heart rate) could change to 40. A less active person might have a heart rate between 60 and 100 (beats per minute). That’s because the heart muscle has to work harder to maintain bodily functions, making it higher,” Stein said.

Your pulse is a tool to help get a picture of your health, and very low or frequent episodes of unexplained fast heart rates can be a signal of potential health issues, especially if they cause you to feel weak or dizzy.

heart_diagram-en-svg

How Other Factors Affect Heart Rate

  • Air temperature: When temperatures (and the humidity) soar, the heart pumps a little more blood, so your pulse rate may increase, but usually no more than five to 10 beats a minute.
  • Body position: Resting, sitting or standing, your pulse is usually the same. Sometimes as you stand for the first 15 to 20 seconds, your pulse may go up a little bit, but after a couple of minutes it should settle down.
  • Emotions: If you’re stressed, anxious or “extraordinarily happy or sad”, your emotions can raise your pulse.
  • Body size: Body size usually doesn’t change pulse. If you’re very obese, you might see a higher resting pulse than normal, but usually not more than 100.
  • Medication use: Meds that block your adrenaline (beta blockers) tend to slow your pulse, while too much thyroid medication or too high of a dosage will raise it.

 

normal-heart-rate

The following is a chart of heart rate readings in different situations.

heart_rate_variability_hrv

CONCEPT AND IDEATION:

cetain-amount

screen-shot-2016-11-27-at-8-10-57-pm

 

PROCESS:

For my project I used the AD8232, a chip that measures the electrical activity of the heart. This electrical activity can be charted as an ECG, or electrocardiogram. Electrocardiography is used to help diagnose various heart conditions.

heartrateboardiso_small flx4lsiilqtbu1p-medium

Typical Sensor Placements

body

img_3375

After connecting the circuit and placing the electrodes, I wrote the code; you can find it at the following link:

https://github.com/afaqahmedkaradia/Heart_Rate
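
As a sketch, here is a minimal AD8232 read loop in the spirit of the standard hookup examples; the pin choices are assumptions, and my actual code is at the link above.

```cpp
// Read the AD8232's analog ECG output, checking the leads-off pins first:
// if an electrode detaches, the output is meaningless (a flat line).
const int ECG_PIN  = A0;   // AD8232 OUTPUT -> analog input
const int LO_PLUS  = 12;   // leads-off detect + (assumed pin)
const int LO_MINUS = 14;   // leads-off detect - (assumed pin)

void setup() {
  Serial.begin(115200);
  pinMode(LO_PLUS, INPUT);
  pinMode(LO_MINUS, INPUT);
}

void loop() {
  if (digitalRead(LO_PLUS) == HIGH || digitalRead(LO_MINUS) == HIGH) {
    Serial.println('!');                  // an electrode has come off
  } else {
    Serial.println(analogRead(ECG_PIN));  // raw ECG sample for plotting
  }
  delay(4);   // ~250 Hz, enough to see the QRS peaks in a serial plotter
}
```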

 

How does this device respond to the data it is collecting? 

At first I needed to understand how the heart rate sensor works. I noticed it would usually read 60-100 bpm, where 60 bpm is an average rate and 100 is above average. There were other cases of 40 bpm, which could be described as either calm or lethargic. There were a few technical issues, as my device would disconnect from my portable battery; my readings would then show a flat line, much as if I were dead.

How does this behaviour evolve as it collects more data about or for this person/place?

The sensor does not change its behaviour, nor does it adapt or evolve. It is just accurate in reading my beats per minute.

What is the overall size and form of the object?

The overall size is approximately the size of my fist, or the size of my ‘heart’, although with a few extensions from wires and the like.

Does the object encourage interaction from others?

No, it does not; at the moment it involves only a single individual.

ecg_principle_slow__1_-1

These are a few examples of how I read the data: I used Adafruit.io to collect readings and connected it to IFTTT to get the readings as an Excel file of numeric values by time and date.

 

DATA AND VALUES :

reading-01 reading-02 reading-03 reading-04 reading-05

screen-shot-2016-11-27-at-8-02-14-pm screen-shot-2016-11-27-at-8-02-32-pm

final-visiali

Future visualizations of the data could take these forms: we could illustrate the factors driving heart rate levels, and show how to counteract them by taking action around such activities.

2016-02-16-11-43-35_thumb

 

 

 

Experiment 4: HugMeee

hugmeee1

Description

This project is called HugMeee. It’s a cuddly 3-foot penguin with a hidden ultrasonic sensor which measures and records the distance between it and the people walking towards it. A WiFi module connected to the sensor allows the data to be sent to the Adafruit server in real time.

The goal of this experiment is to bring some extra love and joy to the lounge, and also to find out when people are more likely to hug it, and how long they like to hug it.

I chose a huge huggable penguin rather than a smaller one because I wanted to grab people’s attention; I also believe people will spend more time with it if it’s cute and cuddly like this, and the gigantic stuffed toy can make them feel like a kid again.

I used an ultrasonic sensor instead of a flex sensor or a pressure sensor because of the large surface area this penguin has: it’s hard to have people hug just one place, and different people hug in different ways and areas.

The data was collected for 10 hours. I received more than twenty-one thousand distance readings, and I used many methods to visualize the data.

The longest hug lasted for twelve minutes, and the shortest ones were only a few seconds.

Also, I found out that people hug it a lot more in the evening than in the afternoon.
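
A sketch of how hug lengths can be derived from that distance stream: a reading below a “hugging” threshold starts a hug, and the hug ends once the distance opens up again (threshold and pins are assumptions, not the project’s actual values).

```cpp
// Time hugs from an ultrasonic distance stream: close = hugging.
const int TRIG_PIN = 12;
const int ECHO_PIN = 14;
const long HUG_CM  = 15;        // closer than this counts as a hug (assumed)

bool hugging = false;
unsigned long hugStartMs = 0;

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  return pulseIn(ECHO_PIN, HIGH, 30000) / 58;   // microseconds -> cm
}

void setup() {
  Serial.begin(115200);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  long cm = readDistanceCm();
  if (!hugging && cm > 0 && cm < HUG_CM) {
    hugging = true;             // hug begins
    hugStartMs = millis();
  } else if (hugging && cm >= HUG_CM) {
    hugging = false;            // hug ends: log how long it lasted
    Serial.print("hug lasted ");
    Serial.print((millis() - hugStartMs) / 1000);
    Serial.println(" s");
  }
  delay(500);                   // two readings per second
}
```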

 

Video

Process Images

Things learned during process:

  • Using ultrasonic sensor (also shock and pressure sensor which I did not end up using)
  • Using Feather Huzzah ESP 8266 board
  • Using adafruit io
  • Using IFTTT
  • Collecting (a lot of) data from above things
  • Hiding electronics and wires inside of an object without breaking them
  • Understanding spreadsheets
  • Analyzing and visualizing data

Circuit diagram

ultrasonicsensor

Code

https://github.com/bijunchen/HugMeee

Data Sheets

https://drive.google.com/drive/folders/0B8TIIfhx7ftLSG9heHdzemViN1E?usp=sharing

Data Visualization

Project Context

The eCLOUD Project is a permanent artwork installed between gates 22 and 23 at the San Jose International Airport, created by artists Dan Goods, Nik Hafermaas, and Aaron Koblin. Similar to the HugMeee project, it collects real-time weather conditions from all over the world and reflects them in a huge array of polycarbonate tiles that can fade between transparent and opaque states. The eCloud has 100 custom-designed circuit boards that control the liquid crystal pixels. As information is sent from the master computer, it goes to the circuit boards, which then tell each pixel whether to turn on or off.

Comparing eCLOUD to HugMeee: eCLOUD not only collects real-time data, its visualization also happens in real time, whereas for HugMeee the data is used afterward, once collecting is completed. The eCLOUD’s output is a form of art as well.

 

References and Resources

Nick’s Fritzing example

“Vivacity” Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/

http://www.ecloudproject.com/

https://github.com/densitydesign/raw
