Experiment 5 – OK to Touch!?

[Image: portfolio-image-5]

Project Title
OK to Touch!?

Team members
Priya Bandodkar & Manisha Laroia

Mentors
Kate Hartman & Nick Puckett

Project Description
Code | Computer Vision | p5js
OK to Touch!? is an interactive experience that brings inconspicuous tracking technology into the spotlight, making it visible to users through their interactions with everyday objects. The concept uses experience to convey how users’ private data could be tracked, without their consent, in the digital future.

A variety of popular scripts are invisibly embedded in many web pages, harvesting a snapshot of your computer’s configuration to build a digital fingerprint that can be used to track you across the web, even if you clear your cookies. It is only a matter of time before these tracking technologies take over the ‘Internet of Things’ we are starting to surround ourselves with. From billboards to books and from cars to coffeemakers, physical computing and smart devices are becoming more ubiquitous than we can fathom. As users of smart devices, our very first click or touch on a smart object signs us up to be tracked online and to become another data point in the web of ‘device fingerprinting’, with no conspicuous privacy policies and no apparent warnings.
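To make that mechanism concrete, here is a deliberately minimal, illustrative sketch of how such a fingerprinting script can work. It is not taken from any real tracker; actual scripts combine dozens more signals (canvas rendering, installed fonts, audio stack) and stronger digests, but the principle is the same: stable browser properties are hashed into an identifier that survives cookie clearing.

```javascript
// Illustrative only: a toy browser fingerprint built from a few
// stable properties. Real trackers use many more signals.
function buildFingerprint() {
  const signals = [
    navigator.userAgent,                                      // browser + OS
    navigator.language,                                       // locale
    `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display
    Intl.DateTimeFormat().resolvedOptions().timeZone,         // timezone
    navigator.hardwareConcurrency,                            // CPU cores
  ].join('||');

  // Toy 32-bit rolling hash; real scripts use cryptographic digests.
  let hash = 0;
  for (let i = 0; i < signals.length; i++) {
    hash = (hash * 31 + signals.charCodeAt(i)) | 0;
  }
  return (hash >>> 0).toString(16);
}

// Prints the same value on every visit, even after cookies are cleared.
console.log(buildFingerprint());
```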

With this interactive experience, the designers are attempting to ask the question:
How might we create awareness about invisible data-tracking methods as the ‘Internet of Things’ expands into our everyday objects?

Background
In mid-2018, when our inboxes were full of ‘updated privacy policy’ emails, it was not a chance event that all companies decided to update their policies at the same time, but an after-effect of the enforcement of the GDPR, the General Data Protection Regulation. The General Data Protection Regulation (EU) 2016/679 is a regulation in EU law on data protection and privacy for all individual citizens of the European Union (EU) and the European Economic Area (EEA); it also addresses the transfer of personal data outside the EU and EEA. The effects of data breaches on political, marketing, and technological practices are evident in the Facebook–Cambridge Analytica scandal, the Aadhaar login breach, and the Google Plus bugs that exposed the data of 500,000 people, then 52.5 million, to name a few.

[Image: datasecurity-paper]
News articles about recent data scandals. Image source: https://kipuemr.com/security-compliance/security-statement/

When the topic of data privacy is brought up in discussion circles, some get agitated about their freedom being affected, some take the fifth, and some say that ‘we have nothing to hide.’ Data privacy is not about hiding but about being ethical. A lot of the data shared across the web is used by select corporations to make profits at the cost of the individual’s digital labour; that is why no ‘free’ software is truly free: it comes at the cost of your labour in using it and of allowing the data you generate to be used. Most people do not know what is running in the background of the webpages they visit or the voice interactions they have with their devices, and if they don’t see it, they don’t believe it. With more and more conversation happening around data privacy and ethical design, we believed it would help if we could make this invisible background data transmission visible to the user and initiate a discourse.

[Image: exp-5-proposal-question]

Inspiration

[Image: immaterials]

The Touch Project
The designers of the Touch project, which explores near-field communication (NFC), or close-range wireless connections between devices, set out to make the immaterial visible, specifically one such technology: radio-frequency identification (RFID), currently used for financial transactions, transportation, and tracking anything from live animals to library books. “Many aspects of RFID interaction are fundamentally invisible,” explains Timo Arnall. “As users we experience two objects communicating through the ‘magic’ of radio waves.” Using an RFID tag (a label containing a microchip and an antenna) equipped with an LED probe that lights up whenever it senses an RFID reader, the designers recorded the interaction between reader and tag over time and created a map of the space in which they engaged. Jack Schulze notes that alongside the new materials used in contemporary design products, “service layers, video, animation, subscription models, customization, interface, software, behaviors, places, radio, data, APIs (application programming interfaces) and connectivity are amongst the immaterials.”
See the detailed project here.

[Image: digital-fingerprint]

This is Your Digital Fingerprint
Because data is the lifeblood for developing the systems of the future, companies are continuously working to ensure they can harvest data from every aspect of our lives. As you read this, companies are actively developing new code and technologies that seek to exploit our data at the physical level. Good examples of this include the quantified-self movement (or “lifelogging”) and the Internet of Things. These initiatives expand data collection beyond our web activity and into our physical lives by creating a network of connected appliances and devices which, if current circumstances persist, will probably have their own trackable fingerprints. From these initiatives, Ben Tarnoff of Logic Magazine concludes that “because any moment may be valuable, every moment must be made into data. This is the logical conclusion of our current trajectory: the total enclosure of reality by capital.” More data, more profit, more exploitation, less privacy. See the detailed article here.

[Image: paper-phone_special-project]

Paper Phone
Paper Phone is an experimental app, developed by the London-based studio Special Projects as part of Google’s Digital Wellbeing Experiments, which helps you take a little break away from your digital world by printing a personal booklet of the information you’ll need that day. Printed versions of the functions you use the most, such as contacts, calendars, and maps, let you get things done in a calmer way and help you concentrate on the things that matter most. See the detailed project here.

[Image: irl-podcast]

IRL: Online Life is Real Life
Our online life is real life. We walk, talk, work, LOL, and even love on the Internet, but we don’t always treat it like real life. Host Manoush Zomorodi explores this disconnect with stories from the wilds of the Web and gets to the bottom of online issues that affect us all. Whether it’s privacy breaches, closed platforms, hacking, fake news, or cyberbullying, we the people have the power to change the course of the Internet, keeping it ethical, safe, weird, and wonderful for everyone. IRL is an original podcast from Firefox, the tech company that believes privacy isn’t a policy; it’s a right. Hear the podcast here.

These sources helped define the question we were asking and inspired us to show the connection between the physical and the digital: to make the invisible visible and tangible.

The Process

The interactive experience grew out of the ‘How Might We’ question we raised after our research on data privacy, and we began sketching out the details of the interaction:

  • Which interactions we wanted: touch, sound, voice, or tapping into user behaviour
  • What tangible objects we should use: daily objects, a new product designed with affordances to interact with, or digital products like mobile phones and laptops
  • Which programming platform to use, and
  • How the setup and user experience would work

[Image: ideation-comp]

While proposing the project we intended to make tangible interactions using Arduino boards embedded in desk objects, with Processing creating visuals that would illustrate the data tracking. We wanted the interactions to be seamless and the setup to look as normal, intuitive, and inconspicuous as possible, reflecting the hidden, creepy nature of data-tracking techniques. Here is the initial setup we had planned to design:

[Image: installation]

Interestingly, in our early proposal discussion we raised concerns about having too many wires in the display if we used Arduino, and our mentors proposed we look at ml5, a machine learning library that works with p5.js and can recognize objects using computer vision. We tried ml5’s YOLO object-detection model and iterated with the code, trying to recognize objects like remotes, mobile phones, pens, or books. The challenge with this particular code lay in creating the visuals we wanted to accompany each recognized object, tracking multiple interactions, and overlaying the captured video with the computer-vision output. Using this library was very exciting for us: we did not have to depend on hardware interactions and could build a setup with no wires and no visible digital interactions, a mundane scene that could then spring the surprise of the tracking visuals, aiding the concept. A rough sketch of this stage follows below.
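Our actual sketch is on GitHub (linked at the end of this post); as a rough illustration of this stage, the detection loop with ml5’s YOLO model looks something like the following, based on the official ml5 example (API as of the ml5 version available in late 2019; later versions may differ):

```javascript
// Minimal p5.js + ml5 sketch: detect objects in the webcam feed
// and draw a labelled box around each one.
let video;
let yolo;
let detections = [];

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  yolo = ml5.YOLO(video, modelReady); // load the model, then start detecting
}

function modelReady() {
  yolo.detect(gotResults);
}

function gotResults(err, results) {
  if (err) {
    console.error(err);
    return;
  }
  detections = results;
  yolo.detect(gotResults); // keep the detection loop running
}

function draw() {
  image(video, 0, 0);
  for (const d of detections) {
    // YOLO returns coordinates normalized to 0–1
    noFill();
    stroke(0, 255, 0);
    rect(d.x * width, d.y * height, d.w * width, d.h * height);
    noStroke();
    fill(0, 255, 0);
    text(`${d.label} (${nf(d.confidence, 0, 2)})`, d.x * width, d.y * height - 5);
  }
}
```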

[Image: track-remote-with-rectangle]

[Image: ml5-recognize-remote]
Using the ml5 YOLO model to track and recognize a remote.

[Image: data-points-map]

[Image: motion-tracking_code]
Using the openCV library for p5.js and Processing.

While working with ml5 we also came across the openCV libraries for Processing and p5.js, and iterated with their pixel-change (frame-difference) function. We created overlay visuals on the video capture, and also without it, producing a data-tracking map of sorts. Eventually we used the optical-flow library example and built the final visual output on top of it. For input we used a webcam, capturing the video feed and running it through p5.js.
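Here is a minimal sketch of the frame-difference idea (our final sketch built on the optical-flow example instead, but the principle of detecting pixel change between frames is the same; the grid step and threshold below are illustrative):

```javascript
// Minimal frame-difference motion detection in p5.js:
// compare the current webcam frame with the previous one and
// draw a "data point" wherever enough pixels changed.
let video;
let prevFrame;
const THRESHOLD = 120; // total RGB difference that counts as motion (illustrative)

function setup() {
  createCanvas(640, 480);
  pixelDensity(1); // keep canvas pixels aligned with video pixels
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  prevFrame = createImage(width, height);
}

function draw() {
  image(video, 0, 0);
  video.loadPixels();
  prevFrame.loadPixels();

  fill(255, 0, 0);
  noStroke();
  // Sample a coarse grid rather than every pixel, for speed
  for (let y = 0; y < height; y += 8) {
    for (let x = 0; x < width; x += 8) {
      const i = 4 * (x + y * width);
      const diff =
        abs(video.pixels[i]     - prevFrame.pixels[i]) +
        abs(video.pixels[i + 1] - prevFrame.pixels[i + 1]) +
        abs(video.pixels[i + 2] - prevFrame.pixels[i + 2]);
      if (diff > THRESHOLD) {
        ellipse(x, y, 6, 6); // mark where the pixel field changed
      }
    }
  }

  // Remember this frame for the next comparison
  prevFrame.copy(video, 0, 0, width, height, 0, 0, width, height);
}
```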

Challenges & Learnings:
Our biggest learning came from prototyping and creating setups to user-test and to understand the nuances of creating an engaging experience.
The final setup was to have a webcam on top, tracking any interaction with the products on the table; the input video feed would be processed to give a digital output of data-tracking visualizations. For the output we tried various combinations: using a projector to throw visuals as the user interacted with the objects, using an LCD display to overlay the visuals on the video feed, and using a screen display in the background to form a map of the data points collected through the interaction.

Top projection was something we felt would be a very interesting output method, since we would be able to throw a projection onto the products as the user interacted with them, creating the visual layer of awareness we wanted. Unfortunately, each time we tried top projection the computer-vision code picked up a lot of visual noise: each projection was added back into the video capture as input, generating a feedback loop of unnecessary overlapping projections that worked against the discreet experience we wanted to create. Projections looked best in dark spaces, but darkness would compromise the effectiveness of the webcam, and computer vision was the backbone of the project. Eventually we settled on an LCD screen and a top-mounted webcam.

[Image: proposal-image-1]
Test with the video overlay, which looked like an infographic.
[Image: process-2]
Testing the top projections. This projection method generated a lot of visual noise for the webcam and had to be dropped.
[Image: tracking]
Showing the data-point tracking without the video capture.
[Image: process-1]
Setting up the top webcam and hiding it within a paper lamp for the final setup.

Choice of Aesthetics:
The final video feed with the data-tracking visuals looked more like an infographic, subtle in nature compared to the strange, surveillance-like experience we wanted to create. So we decided to apply a video filter as an additional visual layer on the capture, to show that the feed had undergone some processing and was being watched or tracked. The video was displayed on a large screen placed adjacent to a mundane desk with typical desk objects: books, lamps, plants, stationery, stamps, a cup, and blocks.
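As a rough illustration of that filter layer (not our exact code; the specific filter and colour values are placeholders), the webcam feed can be post-processed directly in draw():

```javascript
// Illustrative "processed feed" look: posterize the webcam image and
// wash it with a translucent green, surveillance-footage style.
let video;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
}

function draw() {
  image(video, 0, 0);
  filter(POSTERIZE, 4); // crush the colour range so the feed reads as processed
  noStroke();
  fill(0, 255, 100, 30); // translucent green wash (illustrative values)
  rect(0, 0, width, height);
}
```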

[Image: setup]

Having a bare webcam during the critique made the kind of interaction evident to users; learning from that, we hid the webcam inside a paper lamp for the final setup. This added another cryptic layer to the interaction, reinforcing the concept.

[Image: setup-in-use]

These objects were chosen and displayed so as to create a desk workspace where people could come, sit, and start interacting with the objects through the affordances created: semi-opened books, bookmarks inside books, an open notepad with stamps and ink pads, a small semi-opened wooden box, a half-filled cup of tea with a tea bag, a wooden block, stationery, and a magnifying glass. All of it hinted at a simple desk that could just as easily be a smart desk, tracking each move of the user and transmitting data, without consent, on every interaction made with the objects. The webcam hung over the table, discreetly covered by a paper lamp, to add to the everyday-ness of the desk setup.

Each time a user interacted with the setup, the webcam would track the motion and the changes in the pixel field and generate data-capturing visuals, indicating that something strange was happening and sparking the question: was it OK to Touch!?

[Image: user1]

Workplan:

Dates | Activities
23rd November – 25th November | Material procurement and quick prototyping to select the final objects
26th November – 28th November | Writing the code and digital fabrication
28th November – 30th November | Testing and bug fixing
1st December – 2nd December | Installation and final touches
3rd December – 4th December | Presentation

[Image: portfolio-image-2]

[Image: portfolio-image-1]

The project code is available on GitHub here.

__________________
References

Briz, Nick. “This Is Your Digital Fingerprint.” Internet Citizen, 26 July 2018, www.blog.mozilla.org/internetcitizen/2018/07/26/this-is-your-digital-fingerprint/.

Chen, Brian X. “’Fingerprinting’ to Track Us Online Is on the Rise. Here’s What to Do.” The New York Times, The New York Times, 3 July 2019, www.nytimes.com/2019/07/03/technology/personaltech/fingerprinting-track-devices-what-to-do.html.

Groote, Tim. “Triangles Camera.” OpenProcessing, www.openprocessing.org/sketch/479114.

Grothaus, Michael. “How Our Data Got Hacked, Scandalized, and Abused in 2018.” Fast Company, 13 December 2018, www.fastcompany.com/90272858/how-our-data-got-hacked-scandalized-and-abused-in-2018.

Hall, Rachel. “Terror and the Female Grotesque: Introducing Full-Body Scanners to U.S. Airports.” In Feminist Surveillance Studies, edited by Rachel E. Dubrofsky and Shoshana Amielle Magnet, 127–149. Durham: Duke University Press, 2015.

Khan, Arif. “Data as Labor.” SingularityNET, Medium, 19 November 2018, blog.singularitynet.io/data-as-labour-cfed2e2dc0d4.

Szymielewicz, Katarzyna, and Bill Budington. “The GDPR and Browser Fingerprinting: How It Changes the Game for the Sneakiest Web Trackers.” Electronic Frontier Foundation, 21 June 2018, www.eff.org/deeplinks/2018/06/gdpr-and-browser-fingerprinting-how-it-changes-game-sneakiest-web-trackers.

Antonelli, Paola. “Talk to Me: Immaterials: Ghost in the Field.” MoMA, www.moma.org/interactives/exhibitions/2011/talktome/objects/145463/.

Shiffman, Daniel. “Computer Vision: Motion Detection – Processing Tutorial.” The Coding Train, YouTube, 6 July 2016, www.youtube.com/watch?v=QLHMtE5XsMs.

 
