Category: General Posts

Blog Post 1 – Review 3 related projects

 

Project 1 – The Haystack Project [1]

What:

The Haystack Project is an academic initiative led by independent researchers at ICSI / UC Berkeley and IMDEA Networks, in collaboration with UMass and Stony Brook University. At the core of the project is Lumen, an Android app (not yet available for iOS) that captures data right at the source: it analyzes your mobile traffic and helps you identify privacy leaks caused by your apps and the organizations collecting this information.

 

The Lumen app monitors what Android apps do with your data

 


 

How/Why:

The app reports back fully anonymized pieces of information, allowing researchers to understand the kind of personal information that’s being extracted and transmitted. “We’re seeing tons of things like some applications linking the MAC address of the Wi-Fi access point as a proxy for location,” says Vallina-Rodriguez. (A base station’s MAC address identifies it uniquely, and is used by Wi-Fi location databases run by Apple, Google, and other firms.)

 

So what:

Soon after turning on Lumen, you will learn interesting facts about the apps you run on your phone. You can use Lumen to see where your apps connect, which data they share with third parties, and even how much traffic they waste on advertising and tracking, so you can decide whether to uninstall those that strike you as too intrusive. Note that not all devices provide the features Lumen needs to operate.

Project 2 – Smart mirror for ambient home environment [2]

What:

This project describes the design and development of a futuristic smart mirror that represents an unobtrusive interface for the ambient home environment. The mirror provides a natural means of interaction through which the residents can control the household smart appliances and access personalized services.


Why/How:

A service-oriented architecture has been adopted to develop and deploy the various services, where the mirror interface, the appliances, and the news and data feeds all use Web service communication mechanisms. The smart mirror functionalities have been demonstrated by developing an easily extendable home automation system that facilitates the integration of household appliances and various customized information services.



 


 

Project 3 – MobileSync Web [3]

A study of user experiences for webpage interactions on computers working with mobile technology by Chen Ji

What:

With the development of mobile technology, smartphones have become a necessity in our daily lives. The various sensors and multi-touch screens of smartphones have enabled a large number of functional and excellent mobile applications and games. However, support for interactive webpages is not as strong. Since smartphones are usually at hand when people use computers to browse webpages, the author considers whether mobile technology might be effective in enhancing the user experience of browsing webpages on computers, and whether it has the potential to become a new mode of web interaction. Through research, mainly user testing and analysis, a project in the form of an interactive webpage integrating mobile technology shows the potential need for this combination. This project proposes a new way for people to browse interactive webpages, one that can take user experiences to a new place through the use of mobile technology.

 

Why/How:

This thesis project makes use of various mobile sensors, but the camera is left out, since iOS does not allow web browsers to use the iPhone's camera; the computer's or laptop's web camera was likewise unavailable in the latest versions of Chrome, Firefox, and Safari. Software engineering is a crucial limitation. In terms of mobile websites, although in theory they run on any hardware and any operating system, in practice this has not worked out so well (Banga & Weinhold, 2014). The platforms (iOS, Android, Windows 8, etc.) have different permission models. For instance, on iOS, web apps are not allowed to access the iPhone's camera, which means that for QR code scanning, iPhone users need to install a native app with a scan function.

So what:

I personally haven't figured out the "So what" of this project, but it shares similar concepts to mine regarding mobile app sensors.

 

[1] https://www.fastcompany.com/40407424/smartphone-apps-are-tracking-you-and-heres-how-to-monitor-what-they-know; https://haystack.mobi/

[2] https://www.researchgate.net/publication/4317313_Smart_mirror_for_ambient_home_environment

[3] http://openresearch.ocadu.ca/id/eprint/585/1/Ji_Chen_2016_MDES_DIGF_THESIS.pdf

From Data to Perception – Research Blog

Research on what kind of visualization we should use

We spent a few days researching the different kinds of visualizations we could use to communicate the data we were collecting.

1. Radar Charts

 


Radar Charts are a way of comparing multiple quantitative variables. This makes them useful for seeing which variables have similar values or if there are any outliers amongst each variable. Radar Charts are also useful for seeing which variables are scoring high or low within a dataset, making them ideal for displaying performance.

2. Parallel Coordinates Plot


 

This type of visualization is used for plotting multivariate, numerical data. Parallel Coordinates Plots are ideal for comparing many variables together and seeing the relationships between them; for example, comparing an array of products that share the same attributes, such as computer or car specs across different models.

3. Parallel Set

Parallel Set charts are similar to Sankey Diagrams in the way they show flow and proportions; however, Parallel Sets don't use arrows, and they divide the flow path at each displayed line set.


 

We finally decided to go with the following visualization:

[Images: the final visualization design]

 

 

List of activities the participant needed to do

1. Play 10 games of Speed Chess
2. Workout for 30 – 45 minutes
3. Watch a horror movie
4. Play a sport/board game for 30 – 45 minutes
5. Play with an animal/dog
6. Control Day (participant goes about their normal day)
7. Learning a new instrument
8. Watch family videos/photos
9. Watch a comedy show
10. Wordplay, quizzing, and other literary games
11. Learning a new language
12. Meditation
13. Playing a video game
14. Memory Test
15. Sleep

We tried out the Muse headset and then found out that the user had to be still while the Muse was active for us to get accurate readings, so many of our planned activities had to be replaced with ones that did not involve a lot of movement.

We planned how every session would take place –

Each session includes a participant and an observer who serves as a moderator.

While the session takes place, the observer takes notes about events, times, and external responses. These notes could be written or recorded as audio. We will also take 3 photos for documentation purposes: one at the beginning of the session, one in the middle, and one at the end.

At the end of each session with the participant, we record the quantitative data we get from the Muse and submit it to a Google Doc. We also note down descriptions of the sessions so that we can trace activities to particular times and events.

Muse Activity Recording


 

Mobile App and Website Design

We designed the Muse Monitor app using our findings while staying true to the Muse brand. We conceptualized this as what our final product would look like, and where we would display our final visualization.

 


We also designed a website based on these designs


 

Digital Games – Weekly Blogs

Link to Weekly Blog 1

https://drive.google.com/a/ocadu.ca/file/d/0B0-O5G42CfqEZ1MtU20zY2ZqZUk/view?usp=sharing

Link to Weekly Blog 2

https://drive.google.com/a/ocadu.ca/file/d/0B0-O5G42CfqEdDdBM1owd3N2ZTg/view?usp=sharing

Link to Weekly Blog 3

https://drive.google.com/a/ocadu.ca/file/d/0B0-O5G42CfqEMm9obFJQbkVpOEU/view?usp=sharing

Link to Weekly Blog 4

https://drive.google.com/a/ocadu.ca/file/d/0B0-O5G42CfqEb1BrV2w2ckFBQVE/view?usp=sharing

Link to Weekly Blog 5

https://drive.google.com/a/ocadu.ca/file/d/0B0-O5G42CfqEbE9ILUwzWWZVdUk/view?usp=sharing

Link to Weekly Blog 6

https://drive.google.com/a/ocadu.ca/file/d/0B0-O5G42CfqESmduZ29YbF8tc00/view?usp=sharing

PokeBunny

Title: PokeBunny
Video: https://vimeo.com/193089329
Code: https://github.com/Gingerguo/PokeBunny

 

Intro:

Since the assignment asked me to create something, put it in a public space, and collect a series of data, I decided to use a soft toy like a teddy bear and put 5 sensors in different parts of its body, for example one in the left hand, one in the right hand, and the same with the legs. That way I could learn which spot on the teddy bear people most want to interact with.

But, there is always a BUT. The Feather board I used has only one analogue input, so it can actually grab data from one sensor only. So I GAVE UP on the five-sensor idea. Instead, I put one force sensor in the Teddy Bunny's belly, just to test whether passers-by would like to press the toy. And that is it~~~ Can you feel the breeze on your face when you decide to give up on some big idea and you are still alive!!!

Working theory:
There is a force-sensitive resistor connected to an Adafruit Feather. The Feather polls the sensor every minute, and whenever the pressure reading changes by more than 200, it sends the newest value to Adafruit IO, where it shows up as a feed.
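If it helps to see the polling loop as code, here is a minimal sketch of the send-on-delta idea. It is my reconstruction, not the project's actual code (that is on GitHub, linked above), and it assumes a WiFi-capable Feather such as the HUZZAH ESP8266 plus the Adafruit MQTT library; the pin, feed name, and credentials are placeholders:

```cpp
// Minimal sketch of "send only on a big change" (a reconstruction, not the
// project's code). Assumes a WiFi Feather (e.g. HUZZAH ESP8266) and the
// Adafruit MQTT library; feed name and credentials are placeholders.
#include <ESP8266WiFi.h>
#include "Adafruit_MQTT.h"
#include "Adafruit_MQTT_Client.h"

#define WLAN_SSID    "...ssid..."
#define WLAN_PASS    "...password..."
#define AIO_USERNAME "...adafruit io username..."
#define AIO_KEY      "...adafruit io key..."

const int FSR_PIN   = A0;    // force-sensitive resistor via a voltage divider
const int THRESHOLD = 200;   // only report changes bigger than this

WiFiClient client;
Adafruit_MQTT_Client mqtt(&client, "io.adafruit.com", 1883, AIO_USERNAME, AIO_KEY);
Adafruit_MQTT_Publish fsrFeed(&mqtt, AIO_USERNAME "/feeds/pokebunny");

int lastSent = 0;

void setup() {
  Serial.begin(115200);
  WiFi.begin(WLAN_SSID, WLAN_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(500);  // wait for WiFi
}

void loop() {
  if (!mqtt.connected()) mqtt.connect();  // (re)connect to Adafruit IO

  int reading = analogRead(FSR_PIN);      // 0-1023 raw pressure value
  if (abs(reading - lastSent) > THRESHOLD) {
    if (fsrFeed.publish((int32_t)reading)) lastSent = reading;
  }
  delay(60000);                           // poll once a minute
}
```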

Materials:

  1. ROUND FORCE-SENSITIVE RESISTOR (FSR) or SQUARE FORCE-SENSITIVE RESISTOR (FSR)
  2. ADAFRUIT FEATHER 32U4 BASIC PROTO
  3. CIRCUITS
  4. RESISTORS (1K, 10K, 330)

Circuit diagram:


Process:


Challenges:
Putting the final work in a public space and hoping people will play with it is very difficult. People who know about your project will contribute some data, but I hoped that people who don't know what's going on could also interact with it. So I put the Bunny beside the refrigerator, at the entrance of the building, and on the third floor of the building, where the students have classes all day. The final data file (171 values in all) shows that interactions happened once or twice per hour during the daytime.

Future Iterations:
The Teddy Bunny I got is made for pets, and when you press on it, it makes noises. So I am thinking of making the sensor more durable and powering it by a battery. Then I would add a recording of the pet owner's voice, so that in the future, once a pet bites or presses the Teddy Bunny, the recording plays. It would be a silly interaction between pets and the toy.

Pooparray

Title: Pooparray
Video: https://vimeo.com/193088975
Code: https://github.com/sharkwheels/CC_BigishData
Box Files: http://www.projects.nadinelessio.com/CC_BigishData/

Idea:
Make an IoT-connected sign that tells you how stinky a bathroom is.


One In Context Scenario

How it Works:
There is a methane sensor hooked up to an Adafruit Feather. The Feather polls the sensor every minute, and then after 5 minutes takes the highest reading and sends it to Adafruit IO. On the other side of the divide is the Particle Photon. It quietly controls a string of 16 NeoPixels that pulse faster or slower depending on the methane level it reads from the IO feed.
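As a rough illustration of the Feather half, the poll-every-minute / send-the-five-minute-max pattern could look like this. This is my sketch, not the project's actual code (that is in the repo linked above); the pin assignment is assumed and the Adafruit IO publish is stubbed out with Serial:

```cpp
// Sketch of "poll every minute, send the 5-minute peak" (a reconstruction,
// not the project's code; the publish step is stubbed out with Serial).
const int GAS_PIN = A0;   // methane sensor analog out (assumed wiring)

int windowMax = 0;        // highest reading seen in the current window
int samples   = 0;        // how many one-minute polls so far

void sendToAdafruitIO(int value) {
  Serial.print("send: ");  // placeholder for an Adafruit IO publish
  Serial.println(value);
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  int reading = analogRead(GAS_PIN);
  if (reading > windowMax) windowMax = reading;  // track the peak
  if (++samples >= 5) {                          // after 5 polls (~5 minutes)
    sendToAdafruitIO(windowMax);                 // report the highest reading
    windowMax = 0;
    samples   = 0;
  }
  delay(60000);                                  // poll once a minute
}
```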

Does this thing actually run for 8 hours?

Yup!

Project Context:
I made this mostly as a humour break, and because I wanted to learn how to make different hardware boards talk to each other via the internet. I also wanted to port some of my existing processes and libraries to different platforms, so this was a bit more of a technically focused exercise for me. That said, there's something really fun about making a somewhat fancy box that is a shit and crossbones. It riffs off our current weird obsession with IoT notifiers, a.k.a. nice-looking objects that tell us somewhat readily available things. Plus, when you think about the tire fire 2016 has been, this is a box that could have so many notification applications outside of just "sense these poops!" It's The Literal Internet Of Shit.

Challenges:
One thing I ran into a lot on this project was yak shaving. If you are not sure what yak shaving is, this post has a very good breakdown. For starters, I had to compile and upload using two different methods, because Particle can be somewhat closed and Feather is pretty open. I sifted through a bunch of libraries and tried different message brokers, because again, the implementation of the APIs was different and performed differently on each board.

Voltage was also a challenge. The Feather's 1v analog pin really frosts my goat, but on the flip side, it can support a 5v circuit. The Particle has stronger voltage tolerance on its analog pins, but it won't output 5v anywhere. Part way through one of my at-home up-time tests, I noticed the sensor would start having a bird and only send 1024. This ended up being a combo of the sensor getting too hot and the voltage divider not being adjusted for current draw or spikes. I had to fiddle around w/ resistor values, and also figure out a way to put the Feather into Deep Sleep to give the sensor a cool-down period every hour. Bonus: extended battery life.

I sometimes really enjoy yak shaving type challenges. IoT is a fiddly space, and part of navigating it is figuring out a tool chain that is going to work.
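For reference, the cool-down trick is roughly this (a sketch under the assumption that the board is a HUZZAH ESP8266, where self-waking deep sleep requires GPIO16 wired to RST; the sensing and publishing steps are elided):

```cpp
// Periodic cool-down via deep sleep (assumes a HUZZAH ESP8266 with
// GPIO16 wired to RST so the chip can wake itself; also saves battery).
void setup() {
  Serial.begin(115200);
  // ... normal sensing / publishing would happen here ...
  // Then power everything down for 10 minutes (argument is microseconds).
  ESP.deepSleep(10ULL * 60ULL * 1000000ULL);
}

void loop() {
  // Never reached: waking from deep sleep resets the board into setup().
}
```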

Future Iterations? 
Probably ditch the methane sensor and make it more focused around news / Twitter hashtags about shitty things…like Drumpf. Or climate change…or 2016 in general. Or, if sticking w/ methane, find a public source or data sheet for overall methane in the city. Make a swankier sign, maybe out of cedar, with an acrylic inlay. Just something very lifestyle slick that is still very obviously a shit and crossbones.

 

Experiment – Eaves

Eaves

Project Description

Eaves is a data-gathering prototype that captures audio levels in a room and sends the collected values over the internet. Project Eaves converts this data into artifacts that can be put on display to raise awareness about sound levels.

Video Link

Eaves – Prototype


Prototype in place at three locations

Inside Eaves

Eaves has a Feather that transmits data wirelessly to the cloud, where the data can later be retrieved. It also has a sound-detection sensor module that grabs sound values from the environment. The rig is powered by a rechargeable USB battery pack.
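A minimal version of the sampling idea, modeled on the Adafruit "measuring sound levels" tutorial listed in the references below, takes the peak-to-peak amplitude over a 50 ms window as the loudness value. This is my sketch, not the project's code (see the GitHub link for that), and the pin and cloud send are placeholders:

```cpp
// Peak-to-peak sound sampling over a 50 ms window, modeled on the
// Adafruit "measuring sound levels" tutorial referenced below.
// A reconstruction; the cloud send is stubbed out with Serial.
const int MIC_PIN = A0;              // sensor module analog out (assumed)
const unsigned long WINDOW_MS = 50;  // length of one sample window

void setup() {
  Serial.begin(115200);
}

void loop() {
  unsigned long start = millis();
  int signalMax = 0;
  int signalMin = 1023;

  while (millis() - start < WINDOW_MS) {   // collect one 50 ms window
    int sample = analogRead(MIC_PIN);
    if (sample > signalMax) signalMax = sample;
    if (sample < signalMin) signalMin = sample;
  }

  int peakToPeak = signalMax - signalMin;  // rough loudness measure
  Serial.println(peakToPeak);              // placeholder for the wireless send
  delay(1000);
}
```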

What the Data showed us


Comparison of Sound Values between three rooms.

 


Visualization of sound levels on a line graph

 

Artistic Visualization of sound levels

The data showed us the sound values of each of our three areas, letting us find out which area was the loudest and which was the quietest.

Code

https://github.com/afroozsamaei/Eaves-Mudit

Circuit Diagram


Credit: April Xie

Q&A

1. How does this device respond to the data it is collecting?

Eaves does not respond to the data it collects. It is simply an input that collects data.

2. How does this behaviour evolve as it collects more data about or for this person/place?

Eaves was meant to be inconspicuous and hidden. It's meant to be an objective observer only.

3. What is the overall size and form of the object?

The housing is 7 inches tall, and the wires that allow it to hang from objects increase its height to 1 foot.

4. Does the object encourage interaction from others?

The object avoids interaction from others. If one touches the microphone then the sensor values shoot up. Eaves was designed to be hidden and blend in with the white ceilings.

Concept Sketches

Project Context

Silence in the classroom can boost children’s exam results, improve their self-esteem and cut down on bad behaviour, according to new research.

http://www.telegraph.co.uk/education/educationnews/8841649/Silence-is-golden-how-keeping-quiet-in-the-classroom-can-boost-results.html

http://www.huffingtonpost.com/michael-taft/noise-pollution-health-effects_b_905860.html

 

References

https://learn.adafruit.com/adafruit-microphone-amplifier-breakout/measuring-sound-levels

http://www.digikey.com/en/articles/techzone/2011/jul/the-five-senses-of-sensors—sound

 

Experiment 4: The Stress Bull


By Ania Medrek

Link to code on GitHub: https://github.com/aniaelizabethm/StressBullcode
Link to data spreadsheet: http://tinyurl.com/stressbullvalues
Link to Video: vimeo.com/193067698

The Stress Bull is a toy that is squeezed in the hand and massaged by fingers. Stress Bull is intended to help relieve stress and muscle tension. Squeezing a ball (or bull) lifts tension from the muscles and distracts the mind from anxieties and concerns.

The bull was stationed on the 6th floor of the Graduate Studies building at OCAD University for 8 hours straight on November 22, 2016. Anyone who stepped off the elevator or came out of the stairwell couldn’t miss the Stress Bull booth, and more than 50 people stopped and gave it a squeeze. Students and faculty from the IAMD and Digital Futures programs reported that it was a fun and welcome distraction from all the school work piling up as the end of the term nears.

A pressure sensor embedded in the handmade stress toy collected data throughout the 8-hour period. Every second, if a new value was registered, it was sent to Adafruit IO. Using IFTTT, the readings traveled from the Adafruit feed to a Google Drive spreadsheet. In total, almost 1000 values were registered. The data was measured on a scale from 0-800 but rarely went below 50, because the sensor is highly sensitive and even the slightest touch triggers a reading.
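The once-a-second, send-only-new-values behaviour boils down to something like this (my sketch of the logic, not the real code, which is in the GitHub repo linked above; the pin is assumed and the Adafruit IO publish is stubbed out):

```cpp
// Sketch of "every second, send the value only if it changed"
// (a reconstruction; the actual code is in the linked repo).
const int FSR_PIN = A0;   // pressure sensor inside the toy (assumed pin)
int lastValue = -1;       // last value we sent

void sendToAdafruitIO(int value) {
  Serial.println(value);  // placeholder for the Adafruit IO publish
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  int value = analogRead(FSR_PIN);  // raw squeeze reading
  if (value != lastValue) {         // only new values get sent
    sendToAdafruitIO(value);        // IFTTT then copies the feed to a sheet
    lastValue = value;
  }
  delay(1000);                      // check once per second
}
```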

Only two participants squeezed hard enough to trigger a reading above 750. The majority of squeezes were of light-to-medium strength. Below is a chart with estimated percentages of how hard participants squeezed altogether. 'Dubious data' acknowledges the readings that are impossible to categorize: the low-ish values that could have been triggered by a variety of things, such as the table the Stress Bull sat on, the fabric enveloping the sensor, or someone simply holding the bull in their hands without squeezing. Because of these possible factors, I considered any readings under 100 to be 'dubious'.

 

[Chart: estimated squeeze-strength percentages]
 

PHOTOS OF PROCESS

Check out the video (at top of post) for more on process and results.

 

QUESTIONS AND ANSWERS

How does the device blend in or stand out in the environment?

I made every effort to make the device stand out on purpose. I wanted to attract as many participants as possible with funny signs (inspired by Honest Ed's) and a colourful booth table. I made the stress toy a cute and cuddly bull to make it more inviting than a plain, old stress ball.

How does this device respond to the data it is collecting? How does this behaviour evolve as it collects more data about or for this person/place?

This device responded exactly as intended. It sent all the values as they came in and survived the hardest of squeezes. I am particularly happy that the Stress Bull stayed in one piece because I sewed it all together myself, mostly out of a reindeer-shaped hat from the dollar store. The behaviour of the Stress Bull did not change as it collected more data.

What is the overall size and form of the object?

The object is a hand-crafted sphere with horns. It fits in the palm of the average hand.

Does the object encourage interaction from others?

Encouraging interaction from others was the main goal of the Stress Bull's size, form and signage. From design to execution, the silly puns and bull shape were intended to be engaging and draw in participants, stressed or not. I may have taken the pun too far, but I decided to go full force with it because I believe it encourages interaction and at the very least — a laugh or two.
 

PROJECT CONTEXT

As many people stop and reflect on our fast-paced, high-stress society, they are reaching for smartphone apps that can help alleviate the negative health consequences of anxiety. According to Statistics Canada, daily stress rates are highest in the core working ages (35 to 54), peaking at about 30% in the 35 to 44 and 45 to 54 age groups. In 2014, 23.0% of Canadians aged 15 and older (6.7 million people) reported that most days were 'quite a bit' or 'extremely stressful'.

Stress balls are available everywhere, but not many are rigged with sensors and collecting data.

Stress Bull fits into the larger phenomenon of products and apps that track personal and public health data and make it readily available for analysis. Anxiety and stress are addressed by hundreds of apps teaching methods like acupressure, meditation, and hypnosis — there’s even an app called Inner Balance that hooks up to your earlobe and monitors heart rhythm.

Devices such as Fitbit and Apple Watch that measure heartbeat and exercise data are very popular. A future iteration of Stress Bull could be a more polished and accurate product that uses sound and lights to tell users how hard they are squeezing — and encourage even harder squeezing. It has the potential of being incorporated into a phone app. Squeezing stress bull could be a fun break time activity that charts squeezes throughout the workweek.

 

CIRCUIT DIAGRAM


 

REFERENCES AND RESOURCES

http://www.statcan.gc.ca/pub/82-625-x/2015001/article/14188-eng.htm

https://learn.adafruit.com/force-sensitive-resistor-fsr?view=all

https://learn.adafruit.com/adafruit-feather-huzzah-esp8266

http://www.healthline.com/health/anxiety/top-iphone-android-apps

Nick’s Fritzing example: https://canvas.ocadu.ca/courses/22381/pages/experiment-4-resources-pt1

 

The Observer Part II

PROJECT 4


The Observer Part II by Sara Gazzaz

DESCRIPTION

This project collects data over 8 hours on how many people approach art, along with the proximity between the observer and the art piece. The concept was chosen because of my interest in how people interact with art pieces in different ways and in different environments. This art piece was placed at the entrance of 205 Richmond Street West, and what I was exploring was the way people at OCAD interacted with it.

 

How It Started

Several informative posters at OCAD are put up on the walls of hallways and in different rooms to inform or remind us about workshops, deadlines, etc. I've noticed that people sometimes don't give these posters much attention; there are so many of them that people often don't even notice them, because they are used to them being part of the environment. Rarely have I seen someone approach one and stand to read it.

From here, I wanted to move in another direction: how people deal with art pieces.
As an artist, I find the sense of touch an important element when viewing art. I prefer touching the piece and feeling the texture and the layers of paint.

Not all galleries permit people to touch artwork, and when I asked several people, not all cared about touching the pieces; some felt that they needed permission first.

Realizing that this is not a preferred interaction for everyone, and that people have different ways of observing art, I decided to proceed with collecting data over an 8-hour period of one day to see how close people got to my painting, as well as the number of people who approached it and the time of day they did. It was even more interesting looking into the different preferred ways of viewing art.

I tried several approaches in deciding what kind of art to use for this project.
Here are some images of the first attempts:

[Photo: first poster attempt]

This idea came out of observing whether people read the informative posters around campus. I listed out ordinary phrases such as
"Come CLOSER-this is just another stoopid poster!" and thought about how people would approach words on a wall-mounted art piece similar to a poster. The double "o" in 'stoopid' was also a play on words: a spot to build the ultrasonic sensor into the art piece.
I then chose the wording in the image above because I wanted a more subtle, interesting phrase that I thought people could relate to, and it still used the ".." as a place to add the ultrasonic sensor.

 

[Photo: touch-based attempt]

This was my piece when I wanted to play around with the idea of touching a painting. The dotted hand print was to invite the individual to place their hand in that area. For this, my sensor would have been a pressure pad.

 

[Photo: final piece]

This was the final chosen piece, and what I believe was best suited to my concept of simply seeing how people interact differently with an art piece.

 

Watch “The Observer Part II” on Vimeo:

https://vimeo.com/193088364?ref=em-share

Music: Starover Blue – “A Flower In Space”

Technology

Hardware

List of components and materials used:

Mixed Media Art Piece on Canvas
1 Adafruit Feather HUZZAH ESP8266
Breadboard
Small Cardboard Box
Portable USB Power Pack
Resistor (10k)
Conductive Wires
Ultrasonic Sensor
Velcro
Shoe Print Signs (Way-Finding Signs)

 

How It Works

LOCATION: I hung this art piece at the entrance of 205 Richmond Street West because I wanted a place with high traffic from all the people who access the building. It was placed on a wall perpendicular to a big mirror at street level. The floor area where a person would stand to observe the piece was small, which suited the project well: it meant there would be no passers-by in front of the sensor unless they intended to see the piece, which was a way to avoid misreadings. This position, and the use of the mirror, let people view the piece from a distance, whether they were coming through the entrance doors, out of the elevator, or up and down the stairs, and would then invite them over to view it directly if they wanted to.

whatsapp-image-2016-11-23-at-7-30-36-pm-6            whatsapp-image-2016-11-23-at-7-32-14-pm

 

I also placed cut-outs of shoe prints on the floor as way-finding signs, adding to the way people were invited over to look at the art piece.

whatsapp-image-2016-11-23-at-7-30-36-pm-5

 

One ultrasonic sensor was attached to the top of the labeled cardboard box underneath the painting. Every time a person approached to observe the art, a new value was registered and sent to Adafruit IO.
Using IFTTT, the readings traveled from the Adafruit feed to a Google Drive spreadsheet on my account. 92 readings were registered from people who approached the piece on the 23rd of November, between 14:27 and 22:30. The data collected reflected people's proximity to the painting; I set a value to register when someone was between 0 and 30 cm away.


The cardboard box was covered with a label showing details of the painting. It was the housing for all the wiring and power. The power, microprocessor (Feather HUZZAH ESP8266), and data wires for the ultrasonic sensor were secured to the breadboard on the back of this box.

 

View of Project

 

 

Circuit Diagram

 


Software

Code available on GitHub 

https://github.com/saragazz/theobserverII.git

 

 Coding + Challenges

I began with the process of getting my MAC address so that Nick could give me access to OCAD's Wi-Fi. After that, I set up an Adafruit account and used a reference code example to connect to my account and publish my readings to my feed.

I started out working on my code for the ultrasonic sensor using a reference example. I was trying to adjust the code to decide when the distance between the observer and the art piece should be registered. I first experimented at home with different thresholds. I started with a threshold of 2 between movements in front of the sensor, then realized I was getting too many readings per person, so I increased the threshold to 10.
I also used an "if" statement to read movements between 0-30 cm as "close" and 30-100 cm as "far".
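Put together, the threshold plus the close/far ranges look roughly like this (my reconstruction, assuming an HC-SR04-style sensor with hypothetical pin numbers; the actual code is in the GitHub repo above, and the Adafruit IO publish is stubbed out with Serial):

```cpp
// Sketch of the threshold + close/far logic (a reconstruction, assuming
// an HC-SR04-style ultrasonic sensor; actual code is in the linked repo).
const int TRIG_PIN  = 12;
const int ECHO_PIN  = 13;
const int THRESHOLD = 10;   // ignore changes smaller than this (cm)

long lastDistance = -100;   // last registered distance

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);  // 10 us trigger pulse
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // echo time in microseconds
  return duration / 58;                            // convert to centimetres
}

void setup() {
  Serial.begin(115200);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  long d = readDistanceCm();
  if (abs(d - lastDistance) > THRESHOLD) {     // skip near-duplicate readings
    if (d > 0 && d <= 30)        Serial.println("close");  // register a value
    else if (d > 30 && d <= 100) Serial.println("far");
    lastDistance = d;
  }
  delay(250);
}
```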


 

The Conclusion:

The data collected on the spreadsheet was visualized in Adobe Illustrator CS6 in the form of a line graph. The X-axis of this graph represents the time period and the Y-axis represents the distance. In total, 92 people approached the piece; each appears as a vertical line.

I took that data and visualized it in the abstract painting below. The width of the canvas was divided into 1-hour intervals, with clusters of circles floating vertically. The circles represent the people who approached, and the size of each circle represents that person's distance from the piece: the bigger the circle, the closer the person was to the artwork.

[Photo: the data painting]

 

 

Future Iterations:

I would take this project further by testing out and looking more into eye tracking systems. It would definitely be more interesting to further develop it and see how people actually view the art piece when looking at it.

References + Case Studies:

How people observe art.

The Art of Looking: How Eleven Different Perspectives Illuminate the Multiple Realities of Our Everyday Wonderland

http://www.huffingtonpost.com/james-elkins/how-long-does-it-take-to-_b_779946.html

 
