Monstrous Anonymity

  • Strategy:

This week the goal was to take a step back and work not with technology, but against it (obviously still with it, but, you know). My background in photography has always made me slightly fascinated by facial recognition, particularly since the Liquify tool in Photoshop started to incorporate facial recognition in order to streamline beauty retouching. As a tool, this generally promotes problematic and harmful ideas around beauty, but it can also be used to create somewhat monstrous manipulations of the face. I wanted to explore where this technology starts to break down – when does the manipulation stop registering as a face? When does my face stop being my face to something like Google? The ultimate goal here is to make a Photoshop action that would take a normal photo of a face and make it either not recognizable as human or not identifiable as a specific person.

  • Documentation:

I started off by turning on the facial recognition in my Google Photos. It took the better part of a day to scrub through all of my photos (26,260), but once set up, Google auto-configures a series of albums that are groupings of the same face. The first photos Google has of me are from 2010, which, as an interesting side note, is 3-4 years before I started transitioning, but you can't fool Google!

Or can you???

I have been told in passing that the forehead and the general symmetry of the face are the things to manipulate to try and confuse Google's computer vision, so I made a few different Liquify presets, then reapplied each to the same photo and uploaded the results until Google stopped recognizing the face as me. I liked the idea of using Photoshop's facial recognition to confuse Google's facial recognition, so I wanted to keep the parameters to the parts of the face that can be directly targeted by Photoshop.

(As an aside, the rediscovery of many photos from the last 10 years was not always pleasant, so I wouldn't necessarily recommend going into it in an unconsidered way if you're a person who may experience *feelings*.)

 

Starting Photo!

img_20190306_115817

Experiment ONE:

Run Once (still me)

onceimg_20190306_115817

Run Twice  (STILL ME)

twiceimg_20190306_115817

Run Three Times (NOT ME)

thriceimg_20190306_115817

Experiment TWO

It took 5 rounds of this effect to get to a place where Google wouldn't see me.

twoone

twotwo

twothree

twofour

twofive

NOT ME:

twosix

A disturbing GIF:

https://photos.app.goo.gl/RdWcDvu3QyjP82oa6

Experiment THREE

threeone

threetwo

threethree

threefour

threefive

ALL OF THEM REGISTERED AND THAT’S JUST BANANAS.

Another GIF:

https://photos.app.goo.gl/6W5bXdGCWK4Rd84f8

Here are the Photoshop actions for people to play with themselves!

https://drive.google.com/file/d/1LJcp3wPgzZgdbVS5H-xr_s01EO56TgVu/view?usp=sharing

  • Insights:

This experiment felt less insightful and more inspiring of further questions. I'm curious what the results would be if I were to play with colour editing, or noise, or transparencies. It was surprisingly difficult – or rather, the warping effects felt as though they needed to be quite extreme in order to be effective, which was unexpected. It feels as though there should be more errors if the parameters for what registers as my face are so broad; other people should be getting caught in that net, but they are not. This is part of why I am curious about manipulating colour or noise in a photograph for potential further tests. When I was sharing the results with some friends, one of them mentioned that the eyebrow bridge between your eyes is very crucial in how our faces get read, but that part of the face is not targetable by the Liquify panel, so it would be much harder to incorporate into an action.

  • Information sources:

Photoshop 2015.5's new Face Aware Liquify for Portrait Retouching – https://www.youtube.com/watch?v=vyBGGuJhESU

  • Next Steps:

It would be nice to make a site where people upload photos of themselves and get back a series of unidentifiable results. Or, maybe even simpler, just a gallery to upload other people's photos to after they have run the actions in Photoshop. I envisioned this primarily as a weird little art project, so it would be interesting to display the results together. If I were going to make it much larger, I would try to incorporate some of the colour testing to get more interesting photo outputs, and I would be interested in doing more precise testing to see if I can discover exactly where the line for identifiability is.

Massage Spike Chair

* Strategy:

The testing that interested me the most was the array of motors. The potential to move a vibration across a surface or body part felt like a fun area of exploration, and I was reminded of an idea I had last year for a feedback chair. Designing, refinishing and rebuilding furniture has always been a hobby of mine, and I have been waiting for an opportunity to incorporate some furniture into the electronics explorations I have been doing. Naturally, since my concerns are often with weird pleasures and BDSM aesthetics, I envision this chair as a combination massage chair/nail bed.

* Documentation:

For the small prototype version of the chair I wanted to test how the vibration would feel through a pattern of clothing spikes (like the ones we put in our coats when we were young punks), so I decided to make a small pad with just a few motors. The final goal would be to take an old wooden chair and replace the cushion with a custom-made pad like the one made for this test.

img_20190227_172403

the important pieces.

img_20190227_174735

extending the wires

img_20190227_182147

gluing everything way too much so that the wires don't pull out, as I had been warned that this often happens

img_20190227_190635

I ended up putting an extra layer of leather around the motors as the upper level with the spikes was not laying flat and needed some more support.

img_20190227_190603

the final pile

img_20190227_191019

hooking it up to the Arduino

img_20190227_192014

I realized there was no good way to make a video of it, as the vibrations are not strong enough to show up, but it was pleasant, less painful than anticipated, and much more distinct in the area of vibration than I expected. The vibration pattern was simple and just cycled through the three motors. Moving forward, it would be nice to experiment with writing more complex patterns (a rough sketch of one follows the code below).

video of testing that shows the code working:

https://photos.app.goo.gl/MRXSETV575SupQzt9

very simple code:

int VIB1 = 9;
int VIB2 = 5;
int VIB3 = 10;

// the setup function runs once when you press reset or power the board
void setup() {
  // initialize the three vibration motor pins as outputs
  pinMode(VIB1, OUTPUT);
  pinMode(VIB2, OUTPUT);
  pinMode(VIB3, OUTPUT);
}

// the loop function runs over and over again forever,
// cycling the vibration through the three motors in turn
void loop() {
  analogWrite(VIB1, 255);
  delay(1000);
  analogWrite(VIB1, 0);
  analogWrite(VIB2, 255);
  delay(1000);
  analogWrite(VIB2, 0);
  analogWrite(VIB3, 255);
  delay(1000);
  analogWrite(VIB3, 0);
}
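As a first pass at those more complex patterns, here is a minimal sketch (untested on the pad – just a rough idea, assuming the same three pins as above) that ramps each motor up while the previous one fades out, so the vibration feels like it travels across the pad instead of hopping:

int MOTORS[] = {9, 5, 10};   // same pins as the test pad above
const int NUM_MOTORS = 3;

void setup() {
  // set every motor pin as an output
  for (int i = 0; i < NUM_MOTORS; i++) {
    pinMode(MOTORS[i], OUTPUT);
  }
}

void loop() {
  for (int i = 0; i < NUM_MOTORS; i++) {
    // index of the previous motor in the cycle
    int prev = (i + NUM_MOTORS - 1) % NUM_MOTORS;
    // crossfade: ramp this motor up while the previous one ramps down
    for (int level = 0; level <= 255; level += 5) {
      analogWrite(MOTORS[i], level);
      analogWrite(MOTORS[prev], 255 - level);
      delay(20);
    }
  }
}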

* Insights:

I think one of the big takeaways from this process is that the placement of the motors and the placement of the spikes will be the most important elements to finesse in the chair version. If the spikes are too far apart they won't form enough of a surface to support weight and will just become painful. They will also be very difficult to align if they have space between them to fall over, since this thin leather is quite stretchy; it's possible that a much stiffer leather will be needed to let them sit flat. Originally, I thought the motors would need to be placed quite close together, but in order for them to be distinct they need quite a bit of space from one another. This is very practical, both because the motors are expensive and because of the pin limit of the board I have been working with (Feather M0). I will probably look into working with a Mega for the final chair, as I envision the spikes and motors lining not only the seat but also the back, and even with the allowance of space between motors, the surface area of a chair will probably require more than the 13 pins on the Feather. The Mega is also a 5V board, which produces much more pleasing vibrations than the 3V Feather, which in this version is quite weak.

 

* Information sources:  n/a

* Next Steps:

The next major step is to start working it into chair form. Sourcing good wooden chairs has always been a little tricky, but I have found a few that I may be picking up soon. I think the best course of action would be to make a similar pad, but larger – or maybe even a few to swap out – and get to testing how it feels to sit on them. It would be interesting to get different spikes/nails and see how their shapes change the feeling of the vibrations. The whole sensation will most likely change dramatically when a person's whole body weight is pressing into the spikes. Further to that, I would like to incorporate some type of user controls on the chair. Aesthetically, I like the idea of a row of potentiometers along the arms that change the speed, intensity, and possibly patterns. Inputs are even more reason to move over to working with a Mega – gonna need those pins!
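As a very rough sketch of that potentiometer idea (the pin choices here are assumptions, not the final chair wiring), one pot on an analog pin could scale the cycle speed like this:

int MOTORS[] = {9, 5, 10};
const int POT_PIN = A0;   // hypothetical speed potentiometer

void setup() {
  for (int i = 0; i < 3; i++) {
    pinMode(MOTORS[i], OUTPUT);
  }
}

void loop() {
  // map the pot reading (0-1023) to a per-motor on-time of 100-1000 ms
  int onTime = map(analogRead(POT_PIN), 0, 1023, 100, 1000);
  for (int i = 0; i < 3; i++) {
    analogWrite(MOTORS[i], 255);
    delay(onTime);
    analogWrite(MOTORS[i], 0);
  }
}

A second pot could scale the analogWrite value the same way for intensity.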

Welcome Mat

Strategy:

I live in a three-story house. Often when I or my partner arrive home, the other is on the third floor (it's where the TVs live). It is near impossible to hear the door opening when you are on the third floor. With this in mind I set out to make a doormat that would notify the third floor when someone (or something – we do have a 100 lb rottweiler) entered the house. This would provide a soft hello, replacing the screaming hello that often carries up the stairs.

 

workshop-4-worksheet1

Documentation:

Originally I envisioned a little LED or speaker on the third floor that would need to be wired to the sensor at the entrance, but our house has Hue lights, and that seemed like a much more elegant solution. Triggering the Hue lights would require connecting the Feather to Adafruit IO and Adafruit IO to If This Then That (IFTTT), so my initial prototype goal was to get an analogue sensor speaking to Adafruit IO.

Including the Adafruit IO library in the Arduino IDE provided me with an example analog sketch to base the Arduino end on. This sketch is in constant contact with Adafruit IO, which I would need to change later on, as I only wanted it to push information when the sensor detects a body, but it was a good starting place.

My Adafruit IO account was set up from a previous project, so all I needed was a new feed. I used the Adafruit guide for connecting to Adafruit IO, found here – https://learn.adafruit.com/welcome-to-adafruit-io/libraries

And then I got an error. Even though I had installed the Adafruit IO library in the IDE, I was getting an MQTT file-not-found error. It took some googling to discover that a number of additional libraries need to be installed for the Adafruit IO library to work, which you would think would be in the Adafruit documentation somewhere… These additional standalone libraries include the Adafruit MQTT library, the Adafruit HTTP Client library, and the WiFi101 library. After these were installed I stopped receiving the compiler error and could move on to adding the SSL certificate to the onboard wifi of the Feather, as instructed in the guide.

A guide for this can be found here: https://learn.adafruit.com/adafruit-feather-m0-wifi-atwinc1500/updating-ssl-certificates

One important note for this process: I was using a Feather M0 WiFi, which to me was different from the standard Feather M0, so when I read the line "If you are using a Feather M0 or WINC1500 breakout, don't forget to update the pins as necessary with setPins()!" I did not think it applied to me. WRONG. Update the sketch with the wifi pins, otherwise the firmware updater will receive errors.
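For anyone hitting the same wall, the fix is one line at the top of setup() – these are the standard ATWINC1500 pin assignments from the Adafruit guide for the Feather M0 WiFi, so double-check them against your own board:

#include <SPI.h>
#include <WiFi101.h>

void setup() {
  // Feather M0 WiFi: point the WiFi101 library at the onboard ATWINC1500
  // (chip select 8, IRQ 7, reset 4, enable 2)
  WiFi.setPins(8, 7, 4, 2);

  Serial.begin(115200);
  while (!Serial);

  // quick sanity check that the module now responds
  if (WiFi.status() == WL_NO_SHIELD) {
    Serial.println("WINC1500 not found - check setPins()!");
  } else {
    Serial.println("WINC1500 found!");
  }
}

void loop() {}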
Finally I was able to upload the sketch and confirm that the Feather was talking to Adafruit. I used a photocell as a test analogue input, and conveniently the analog-in example sketch was already set up for this.

photocell

AND TADAAAAA!!! Contact!

screenshot-23

From here it was a simple matter of adjusting the code to only send information when the photocell detected a low enough value, and of creating an applet in IFTTT. At first I tried an applet within IFTTT that handled the logic, but I didn't want to send information constantly, so I changed to an "any time the feed is updated" trigger.

Final code:

// Based on the Adafruit IO Analog In Example
// Tutorial Link: https://learn.adafruit.com/adafruit-io-basics-analog-input
// Written by Todd Treece for Adafruit Industries
// Copyright (c) 2016 Adafruit Industries
// Licensed under the MIT license.
//
// All text above must be included in any redistribution.

/************************** Configuration ***********************************/

// edit the config.h tab and enter your Adafruit IO credentials
// and any additional configuration needed for WiFi, cellular,
// or ethernet clients.
#include "config.h"

/************************ Example Starts Here *******************************/

// analog pin 0
#define PHOTOCELL_PIN A0

// photocell state
int current = 0;
int last = -1;

// set up the ‘analog’ feed
AdafruitIO_Feed *analog = io.feed("DoorMatFeed");

void setup() {

  // start the serial connection
  Serial.begin(115200);

  // wait for serial monitor to open
  while(! Serial);

  // connect to io.adafruit.com
  Serial.print("Connecting to Adafruit IO");
  io.connect();

  // wait for a connection
  while(io.status() < AIO_CONNECTED) {
    Serial.print(".");
    delay(500);
  }

  // we are connected
  Serial.println();
  Serial.println(io.statusText());

}

void loop() {

  // io.run(); is required for all sketches.
  // it should always be present at the top of your loop
  // function. it keeps the client connected to
  // io.adafruit.com, and processes any incoming data.
  io.run();

  // grab the current state of the photocell
  current = analogRead(PHOTOCELL_PIN);

  // return unless the reading drops below the threshold,
  // i.e. something is blocking light to the photocell
  if(current > 300)
    return;

  // save the current state to the analog feed
  Serial.print("sending -> ");
  Serial.println(current);
  analog->save(current);

  // store last photocell state (left over from the original example)
  last = current;

  // wait three seconds (1000 milliseconds == 1 second)
  //
  // because there are no active subscriptions, we can use delay()
  // instead of tracking millis()
  delay(3000);
}

It lives!

https://photos.app.goo.gl/zgYdCUz7GNq4aZmC9

Insights:

This project was a good refresher on hooking the Feather up to external systems and a good reminder of the ease of extending its functionality outside of its circuit. It was also a good reminder not to trust documentation all the time, as the specifics of connecting the wifi and Adafruit IO cost a large amount of time in making the prototype. I have been enjoying thinking about ways to interact with the body that don't revolve around hands and eyes, as those are so often the site of tech, so trying to think about different body parts and what the potential could be when targeting them has provided some interesting lines of thought. I think the tendency with technology, and specifically the bodily-notification kind, is to improve our behaviour, which I'm not interested in; but interpersonal communication, and the potential for technological interference there, is something I would like to continue to explore.

Information sources:

Discussed in documentation – adafruit tutorials.

https://learn.adafruit.com/welcome-to-adafruit-io/libraries

https://learn.adafruit.com/adafruit-feather-m0-wifi-atwinc1500/updating-ssl-certificates

Next Steps:

The next step would be to change the sensor to a velostat button, similar to the earlier button I made but larger, to function as the doormat. The code would then have to be adjusted based on the sensor values when tested, but otherwise should be the same. There is a lag in the IFTTT applet because it does not constantly check the feed – as it stands, it is 30 seconds to a minute between sensor activation and light flickering – so I would consider having the Arduino speak directly to the Hue system to increase speed.
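Based on the velostat numbers from the T-Rex Arms sensor testing (active ~970, passive ~300 with a 1K resistor), the adjustment is probably just a flipped threshold in loop() – the exact cutoff is a guess until the actual mat is built and tested:

// hypothetical drop-in replacement for loop() in the sketch above:
// a velostat mat reads HIGH under pressure, so the check inverts
void loop() {
  io.run();

  current = analogRead(PHOTOCELL_PIN);

  // return unless someone (or the dog) is standing on the mat;
  // 900 is an assumption based on the velostat swatch readings
  if (current < 900)
    return;

  Serial.print("sending -> ");
  Serial.println(current);
  analog->save(current);

  delay(3000);
}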

 

T-Rex Arms

Results from Voltmeter testing

img_20190129_163832

Results from Arduino testing (all with 1K resistor) 

Velostat – active: 970, passive: 300

Big swatch – active: 1000, passive: 350-400

Eonyx pressure-sensing fabric – active: 950, passive: 200-250

Eonyx stretch-sensing fabric – active: 15, passive: 8-9

Eonyx StaTex conductive fibre – active: 370, passive: 230

 

  • Strategy:

My plan was to make a button in the middle of the wearer's armpit using the velostat. The button would register as pressed when the wearer makes "t-rex" arms. I will be sewing the neoprene into a pouch in a jersey tube that can be slipped over an arm. Conductive thread will run from either side of the neoprene to connect to the Arduino. For testing purposes, the Arduino LED will blink when the sensor reaches a number reflecting the active state – over 900.
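For reference, a minimal version of that test behaviour might look like this (the analog pin is an assumption – the actual code is in the screenshot further down):

const int SENSOR_PIN = A0;   // assumed analog pin for the velostat button

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  // with the 1K resistor sewn in, a pressed button reads over 900
  if (analogRead(SENSOR_PIN) > 900) {
    // blink while the button is held
    digitalWrite(LED_BUILTIN, HIGH);
    delay(100);
    digitalWrite(LED_BUILTIN, LOW);
    delay(100);
  } else {
    digitalWrite(LED_BUILTIN, LOW);
  }
}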

 

  • Documentation:

img_20190129_152922

Materials

img_20190129_153942

attach button half 1

img_20190129_154722

make button conductive

img_20190129_160421

img_20190129_160425

make the other conductive side with the resistor sewn in

 

img_20190129_160505

And then close it up and attach the Arduino and the code!

screenshot-18
Video of it working can be found here:

https://photos.app.goo.gl/m4HioLSPcNDPgTJg9

  • Insights:

Following up on last week and the making of something that has little to do with my work, I wanted to take the opportunity this week to make something that could be incorporated more into my work at large. A tendency I fall into when considering sensors is to go straight to buttons. Everything is a button! Press it on and off! In contrast, the way I intend to use this is as something that needs to be held down. Still a button, but a little different. If I were thinking about this as a way to interact with a game, pressing a button and holding down a button open up very different possibilities, especially when holding down the button requires maintaining an awkward body position – entirely my jam. Placing the button in the armpit means that pressing it mimics the position of an arm binder (a piece of fetish gear), which is where it became interesting for me. Thinking about different ways to accomplish bondage-like experiences without force is a fun avenue of thinking and could potentially work its way into thesis.

 

  • Information sources: n/a

 

  • Next Steps:

Integrating and making the circuit permanent would definitely help to polish the look, which I would want to do if I were going to develop it further. I would also consider the necessity of the Arduino and whether or not the piece could be done without it, as it adds a lot of bulk – if I could make a game without it, I would.

 

knitting is hard – Max

Strategy:

The prompt I chose from the 8 was Softly Clapping Scarf. I envisioned one of those children's scarves that have little hand pockets on the ends. In theory this wouldn't even need a circuit to accomplish the prompt – a simple clapping scarf would do it – but since that avoids the point, I was thinking of putting some LEDs into it to translate the clapping action into a visual output instead of audio.

I chose knitting as the medium, both because it seemed the most applicable and because it seemed like the most personally interesting skill to learn. The plan was to knit conductive thread into the ends of a scarf, connected by a single thread running the length of the scarf, so that when the two ends touch the circuit is complete.

Documentation:

Turns out I'm not very good at knitting. The first step was getting through a single line and moving on to the next. For whatever reason I got to the end of the first line and kept losing it trying to transition to the second, but after three or four tries the hang of it was got!

00100dportrait_00100_burst20190122182701064_cover

Turns out the addition of the conductive thread makes the process of knitting significantly harder. Adding something that has no stretch slowed my progress a lot, and it took an unreasonable amount of time to get this little piece done.

img_20190122_184502

00000img_00000_burst20190122211305976_cover

As the conductive thread slowed the pace so dramatically, the planned length of the scarf shortened and shortened as time went on. After making a little square of conductive area, I moved on to pulling the thread through at only one of the ends.

00100dportrait_00100_burst20190122213834286_cover

Things got wonkier and wonkier as the knitting went on, and since the goal was a working circuit, I decided a full scarf was too far out of scope and went back to knitting the thread in for another square at the other end.

img_20190123_100535

00100dportrait_00100_burst20190123100542604_cover

Again, VERY bad at knitting. I don't even know how you would intentionally change the number of stitches, but I did a very good job of unintentionally changing the number of stitches. But hey, it conducts.

00100dportrait_00100_burst20190123110128317_cover

 

A scarf only a mother could love.

Insights:

First off, knitting is hard. Ok, not hard, but tricky, and irritating. It's interesting to think about the potential uses of creating custom textiles to integrate into other projects. Within the context of my own work the applications seem limited, as the aesthetics of these different materials are something I think a lot about. With knitting, and weaving, and felting, it becomes hard to escape the colloquial associations of their materiality. This may in and of itself be an interesting area of exploration – the possible inversion or subversion of the expectations of the materials. Knits and weaves (and even more so felts) are not fabrics I generally enjoy wearing, so there is already an expectation of discomfort when considering them – I find them scratchy and unpleasant. They carry, in this elementary DIY context, an association with warmth and comfort and craft, and these things are all great, but again, not my area of interest. When thinking about the relationship to the body, I think this material consideration becomes extra important. What is the statement being made by choosing one of these materials? What kind of bodily experiences do they afford? What is the expectation of someone putting on a knit scarf? Going forward I would like to incorporate more of this thought process into development, thinking more about the choice of materials and what statement they make in and of themselves. There is the interest in how technology can be incorporated into them, but that is only interesting if considered in the broader context of how that technology makes a person feel.

Again, knitting is hard.

Information sources: none

Next Steps:

If this were going to be made in actuality, pockets would be needed at the ends, as well as the inclusion of the LEDs and a break in the circuit somewhere for a battery. It should also probably be long enough to actually be worn around a neck. Using a colour or weight of yarn that would hide the conductive thread would also be nice, as in this version it just looks muddy yellow.