Attendance

img_20190213_120554

GitHub: https://github.com/npyalex/OnCampus 

Attendance is a speculative, ambient, body-centric design project in which the relative locations of a roster of people are loosely tracked. Each member of a family (or a cohort of grad students, perhaps) has an entry on a fixture – which could be expressed in multiple ways; for this project I imagined working with shape-memory alloys – that highlights when that person is nearby.

Above is a rough sketch of the fixture realized with shape-memory alloys: lengths of wire twist into an approximation of the person’s name when they are close, and unwind into nothingness when they are away. Below is the same concept rendered with LEDs on a flat clock face. This project was researched and coded with the theoretical understanding that it would eventually be realized with lengths of shape-memory alloy wire.

1-1

I’ve worked with If This Then That (IFTTT) and Adafruit IO quite often recently and have been enjoying them, so they were the first tools my mind went to when considering how to realize this project.

I started by setting up feeds in Adafruit IO and a pair of IFTTT applets to track my location and push updates to those feeds. When I enter a radius around campus, IFTTT sends a “1” to the “arrived” feed, and when I leave the radius it sends a “1” to the “gone” feed.

docu2

docu1

docu3

A little walking demonstrated that IFTTT and Adafruit IO were interfacing correctly: below you can see that the feeds successfully tracked the instances when I left for lunch and when I returned.

docu2

Without shape-memory alloys to play with, I had to get speculative with the code. I did some research and learned that SMAs require careful power control, usually tuned by trial and error depending on the size of the wire. I set up my code to drive a transistor with pulse-width modulation so that, when it is eventually hooked up to SMA, I can find the ideal power level for it.
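To illustrate the idea – this is a rough sketch, not the project’s actual code, and the pin number and duty-cycle values are placeholders – the snippet below drives a transistor with PWM and slowly ramps the duty cycle so the right power level for a given piece of SMA wire (or the stand-in LED) can be found by observation:

```cpp
// Rough illustration only: pin and duty-cycle values are placeholders.
const int SMA_PIN = 5;   // PWM pin driving the transistor that switches the SMA
int dutyCycle = 64;      // start gently: 64/255 is roughly a 25% duty cycle

void setup() {
  pinMode(SMA_PIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  // Ramp the duty cycle up slowly while watching the wire (or stand-in LED),
  // and note the value at which the SMA fully contracts without overheating.
  analogWrite(SMA_PIN, dutyCycle);
  Serial.print("duty cycle: ");
  Serial.println(dutyCycle);

  dutyCycle = min(dutyCycle + 8, 255);
  delay(2000);
}
```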

In the photo below the LED is in the place of the shape-memory alloy.

img_20190213_170550

docu5

docu4

It took me a fair bit of digging to figure out how to monitor multiple Adafruit IO feeds in one sketch, and I continue to have some trouble with the syntax of the handler functions in the Gone/Arrived sections of the code.
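For reference, here is a minimal version of the pattern that eventually worked for me, modelled on the Adafruit IO Arduino library’s multiple-feeds example (listed under Works Consulted). It assumes the standard config.h from the library’s examples (Wi-Fi and Adafruit IO credentials plus the io object), and the handler bodies are placeholders for whatever the fixture eventually does:

```cpp
// Minimal sketch of listening to the "arrived" and "gone" feeds, based on the
// Adafruit IO Arduino library's multiple-feeds example. Assumes the standard
// config.h (credentials + the io object) from the library's examples.
#include "config.h"

AdafruitIO_Feed *arrived = io.feed("arrived");
AdafruitIO_Feed *gone    = io.feed("gone");

void handleArrived(AdafruitIO_Data *data) {
  // Placeholder: this is where the SMA (or stand-in LED) would be energized.
  Serial.print("arrived -> ");
  Serial.println(data->value());
}

void handleGone(AdafruitIO_Data *data) {
  // Placeholder: this is where the SMA would be allowed to relax.
  Serial.print("gone -> ");
  Serial.println(data->value());
}

void setup() {
  Serial.begin(115200);
  io.connect();                      // connect to io.adafruit.com

  // Each feed gets its own message handler.
  arrived->onMessage(handleArrived);
  gone->onMessage(handleGone);

  while (io.status() < AIO_CONNECTED) {
    delay(500);
  }

  // Ask for the latest value of each feed on startup.
  arrived->get();
  gone->get();
}

void loop() {
  io.run();   // keep the connection alive and process incoming messages
}
```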

I’d like the chance to work with shape-memory alloys properly and expand this concept – until then, an LED stands in to prove it.

Works Consulted

https://learn.adafruit.com/adafruit-feather-m0-basic-proto/adapting-sketches-to-m0

https://learn.adafruit.com/adafruit-io-basics-analog-output/arduino-code

https://github.com/adafruit/Adafruit_IO_Arduino/blob/master/examples/adafruitio_12_group_sub/adafruitio_12_group_sub.ino

https://www.makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/

https://www.arduino.cc/en/Tutorial/TransistorMotorControl

https://github.com/adafruit/Adafruit_IO_Arduino/blob/master/examples/adafruitio_03_multiple_feeds/adafruitio_03_multiple_feeds.ino

Muscle Manager

img_20190207_083929 docu3 screenshot_20190207-085348

GitHub: https://github.com/npyalex/Muscle-Sensor/

Concept

I get headaches often. I clench my jaw when I’m stressed, when I’m focusing, or when I’m nervous. Discussions with many professionals throughout my life have convinced me that habitual jaw clenching is bad for my teeth, bones, and muscles, and is a major factor in my headaches.

Mindfulness practice has helped somewhat. With increased body awareness I have been better able to notice when my jaw is clenched and adjust. I can then ask myself: “Why might I have been clenching my jaw? What here is making me stressed, or anxious, or focused?” With the symptom noticed, I am able to look outward for the cause.

1

I imagined a simple wearable (a hat or band?) that could conceal the sensor stickers of an EMG muscle sensor. I’ve been enjoying playing around with If This Then That (IFTTT) in another class, and I thought this might be a fun way to integrate it as an unobtrusive opportunity for self-reflection.

My concept, then, was a wearable that detects jaw tension. When the tension is sustained the wearable sends a notification to the wearer’s phone, reminding them that their jaw is tense. No judgement is implied; it is simply a statement of fact. The wearer can self-reflect and adjust as required.

A high sensor reading is sent to Adafruit IO, which triggers an IFTTT applet, which sends a notification to the user’s phone.

Process

After playing with and testing the sensor on various muscles with the help of my friend and colleague Amreen, I wrote some code interfacing with Adafruit IO and commented it for clarity, viewable here.

The code runs on an Adafruit Feather microcontroller and uses Adafruit IO over Wi-Fi to connect the board to the internet. I used the Adafruit IO Arduino library and the guide here. Note that the Adafruit Feather won’t connect to 5 GHz Wi-Fi networks!
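For reference, the library’s examples expect a config.h that holds the credentials and constructs the io object; a minimal version (every value below is a placeholder) looks roughly like this:

```cpp
// config.h — Adafruit IO and Wi-Fi credentials, following the pattern in the
// Adafruit IO Arduino library's examples. All values here are placeholders.
#define IO_USERNAME  "your_adafruit_io_username"
#define IO_KEY       "your_adafruit_io_key"

#define WIFI_SSID    "your_2.4GHz_network"   // the Feather won't see 5 GHz networks
#define WIFI_PASS    "your_wifi_password"

#include "AdafruitIO_WiFi.h"
AdafruitIO_WiFi io(IO_USERNAME, IO_KEY, WIFI_SSID, WIFI_PASS);
```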

In summary, the code checks every ten seconds to see if your tension is above a threshold. If it’s above the threshold twice in a row (signifying extended tension) it sends a “1” value to Adafruit IO. An IFTTT applet, listening to the feed, sends a notification to the user’s phone with a non-judgemental reminder that they are experiencing jaw tension.
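A stripped-down sketch of that logic looks roughly like the following. The pin, threshold, feed name, and timing constants are placeholders chosen for illustration; the commented code in the repo is the real thing.

```cpp
// Sketch of the check-every-ten-seconds logic. Pin, threshold, feed name, and
// timing are placeholders; assumes the config.h shown above.
#include "config.h"

AdafruitIO_Feed *tension = io.feed("jaw-tension");   // hypothetical feed name

const int SENSOR_PIN = A0;                 // MyoWare SIG output
const int LED_PIN    = 13;                 // built-in LED mirrors the reading
const int THRESHOLD  = 600;                // tune by watching the serial monitor
const unsigned long INTERVAL_MS = 10000;   // evaluate every ten seconds

int consecutiveHighs = 0;
unsigned long lastCheck = 0;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(115200);
  io.connect();
  while (io.status() < AIO_CONNECTED) delay(500);
}

void loop() {
  io.run();   // must run often to keep the Adafruit IO connection alive

  int reading = analogRead(SENSOR_PIN);
  digitalWrite(LED_PIN, reading > THRESHOLD ? HIGH : LOW);   // real-time visual feedback

  if (millis() - lastCheck < INTERVAL_MS) return;   // only evaluate every ten seconds
  lastCheck = millis();
  Serial.println(reading);

  if (reading > THRESHOLD) {
    consecutiveHighs++;
  } else {
    consecutiveHighs = 0;
  }

  // Two high readings in a row = sustained tension; a "1" on the feed
  // triggers the IFTTT applet that notifies the phone.
  if (consecutiveHighs >= 2) {
    tension->save(1);
    consecutiveHighs = 0;
  }
}
```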

img_20190207_091230

Wiring is simple. The diagram below is from the Getting Started with MyoWare Muscle Sensor guide on the Adafruit website:

docu4

My code includes an LED so you can have a real-time visual representation of the sensor’s readings. That LED is currently assigned to pin 13, the built-in LED on the microcontroller, so no wiring is necessary.

docu2

The Adafruit IO feed is created automatically when the Feather sends data to it.

docu3

screenshot_20190207-085348

Stumbling Blocks

Everything works in theory, but the fact that the electrodes on the EMG sensor are only good for two or three placements has been an impediment to testing. Early in testing I was able to get consistent values from the sensor, but by the end it was unresponsive and delivered only the same, very high, reading.

docu1

The upshot of this is that the infrastructure of the project – Adafruit IO to IFTTT – can still be tested, and that it works consistently.

Next Steps

More tuning of sensor placement and adjustment of thresholds and timing is required. If this project were to become a wearable it would have to move away from the MyoWare EMG sensor, as the sticker-based electrodes are not feasible for long-term use or for integration into a piece of clothing.

Thoughts

This is ultimately a very personal self-reflection project, as I have lived most of my life with physical issues caused by or related to jaw tension. I imagine that this system, were it used by other people, would be adapted into a system that manages whatever tension points they hold.

I chose to send a smartphone notification rather than some other form of feedback because the smartphone is ubiquitous, and notifications tend not to draw undue notice. As monitoring one’s body is a personal experience, I would prefer to have the feedback presented in a relatively subtle and unobtrusive way.

The other option for feedback I considered was haptics, but this still felt more obtrusive than I would have liked. A smartphone notification can be ignored or forgotten, while a physical sensation cannot. The intention with this piece is to gently remind the user that they’re carrying tension, and to communicate that information when the user is ready for it, not to force them to confront it.

References

Welcome to Adafruit IO. (n.d.). Retrieved from https://learn.adafruit.com/welcome-to-adafruit-io/arduino-and-adafruit-io

Getting Started with MyoWare Muscle Sensor. (n.d.). Retrieved from https://learn.adafruit.com/getting-started-with-myoware-muscle-sensor/placing-electrodes

IFTTT. (n.d.). IFTTT helps your apps and devices work together. Retrieved from https://ifttt.com/

Range of Motion Visualizer

img_20190130_145055

GitHub: https://github.com/npyalex/Knee-Stretch-Sensor

Overview

The Range of Motion Visualizer is an exploration of the capabilities of stretch-sensing conductive fabric. The Visualizer uses a stretch sensor to measure a bending motion rather than a stretching one, and visualizes the output as a multicoloured band. In concept, the Visualizer is imagined as a rehab tool: a wearer prescribed a limited range of motion as part of physical therapy would calibrate the Visualizer to that range and see their motion represented. Safe motion that would not harm their recovery is (at this stage, anyway) represented by a small green bar; the bar elongates and turns yellow as they approach the edge of their prescribed range, and turns red when they move outside it.

Process

After exploring the functionalities and properties of the fabrics we had been given in class, I decided to work with the stretch sensor. I had worked with pressure sensors briefly in the first semester, so I preferred not to revisit them, and I had been knocking around the idea of a wearable that visualizes motion for a while, inspired by the Motex Project for smart textiles. The stretch sensor was an opportunity to realize that idea.

img_20190124_154352

knee-fritz

I put together a circuit using the diagram provided in class as an example, and ran it while watching the Arduino serial monitor to see what kind of readings it generated.
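The reading side of that circuit is just a voltage divider into an analog pin; a minimal sketch (pin and baud rate are assumptions, not the project’s actual code) looks like this:

```cpp
// Minimal sketch of reading the stretch-sensor voltage divider and printing
// the raw value to the serial monitor. Pin and baud rate are assumptions.
const int SENSOR_PIN = A0;   // junction between the stretch sensor and the fixed resistor

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(SENSOR_PIN);   // 0-1023 on a 10-bit ADC
  Serial.println(reading);                // Processing reads this over serial
  delay(50);
}
```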

Then I took some code from the Ubiquitous Computing class and used it as the basis for moving the sensor readings into Processing.

Once that confirmed the serial connection to Processing was working, I wrote some code to adjust the length of a displayed rectangle based on the sensor reading.

After a few tests I settled on what felt like a good range for the changing colours – yellow for approaching the danger zone, and red for beyond it.
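The colour logic itself lives in the Processing sketch; purely for illustration, the same zone test is sketched below in Arduino-style C++. The threshold values are invented – the real ones came from calibration.

```cpp
// Illustration of the zone logic only; the actual colour mapping is done in
// Processing. Thresholds here are invented placeholders.
const int SENSOR_PIN    = A0;
const int YELLOW_THRESH = 400;   // approaching the edge of the prescribed range
const int RED_THRESH    = 550;   // outside the prescribed range

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(SENSOR_PIN);

  // Map the reading to a bar length; Processing draws the equivalent
  // rectangle and colours it green, yellow, or red.
  int barLength = map(reading, 0, 1023, 0, 400);

  const char *zone = "green";
  if (reading >= RED_THRESH)         zone = "red";
  else if (reading >= YELLOW_THRESH) zone = "yellow";

  Serial.print(zone);
  Serial.print(" ");
  Serial.println(barLength);
  delay(100);
}
```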

Then I put together the sleeve for holding the sensor. I took an old sock and cut away the foot. I sewed a hem into the sock, then turned it inside out and sewed in the length of stretch sensor fabric with conductive thread.

img_20190129_185352

img_20190129_192905

I left long lengths of conductive thread to attach alligator cables to while testing.

In testing this was uncomfortable and unwieldy, so, given the simplicity of the circuit, I decided to minimize it. I moved to a small breadboard, removed the extraneous wires, and ran the power side of the sensor directly from the 5V pin. The junction between the stretch sensor (the variable resistor) and the fixed resistor connects to the A0 pin, and the fixed resistor runs directly to ground.

img_20190129_195818

I removed the extra threads and sewed patches for alligator clips to attach to. I had to recalibrate the code at this time, as the sensor had begun returning lower readings. As a nice bonus, touching the two pads completed the circuit and served as a default “max” for the range-of-motion tracking function.

img_20190130_145055

I continued to explore visualization. I wanted to create a curve that mirrored the bend of the user’s arm, so I used the curve() function and also explored using curveVertex() within the beginShape()/endShape() functions. I did get a reactive curve going, but I decided it was not as strong a visualization as the bar.

silly-boy

Next Steps

This could easily be made wireless – perhaps with XBees? I would have to do more sewing, including a pocket for the microcontroller. I also considered exploring haptic feedback in addition to – or perhaps instead of – visual feedback. I would like to include a vibrating motor that would buzz lightly when in the yellow zone and strongly when in the red. Beyond that, I would want to create a means for quickly and simply re-calibrating the sensor on the fly, and continue working on using a curved image as a visualization.

References

https://github.com/npyalex/Ubiquitous-Connectivity-Project

https://docs.google.com/presentation/d/1xHIjrmXHmO3N-q6QnVvZ4OYTXpfT2C0L8y68Nwbwj5U/edit#slide=id.g4e5141135c_0_67

http://www.motex-research.eu/about-motex.html

Deceptive Jumping Necklace

After a creative elicitation exercise involving mix-and-matching verbs, adverbs, and feelings, I sketched out a series of goofy designs.

1

Many of them were so goofy or so obtuse that they had to be discarded when it came time to select one to pursue for this project. The one idea that I thought would be achievable within the parameters of the assignment, and not so complex as to necessarily require a microcontroller, was the so-called Deceptive Jumping Necklace.

1

The Necklace would sit clasped on its wearer’s neck until, at the moment it was least expected, it would unclasp and leap off. When expanding the design I imagined it held fast by a set of electromagnets controlled by a microcontroller hidden in the central pendant. The pendant would also hold springs that would push the necklace away when it was activated. It was goofy, but it could be read as a piece of critical or dark design, which are design avenues I am interested in.

1-1

I had no practical experience with knitting. I had done some simple weaving before, but I wanted to learn to knit. Even at the time I felt that weaving would be more appropriate than knitting for this object, but I wanted to take the opportunity to push myself and learn something new. I planned to knit the body of the necklace and weave a small patch to serve as the mounting for the magnetic clasp.

img_20190117_142336

It took me several false starts to get the hang of knitting. The first round of stitches that set up the first needle was simple enough, but the process and movements for the core stitching did not come easily. Furthermore, in my hubris, I had asked my instructor for small needles, as I wished to knit something with the same stitch density as a weave. She warned me that larger needles make larger loops, which are easier to knit, and she was right. The small loops were difficult to keep ordered and occasionally got very tight.

I had to stop and restart several times, but eventually, thanks to a very helpful YouTube video, I got it going.

While knitting, I decided that a necklace was the wrong form for the project. A bracelet would maintain the same kind of affordance as the necklace with respect to the critical design aspects, and would be a little simpler and faster to make. Also, I had by now decided to try to realize the project without a microcontroller, and a bracelet would be a better fit for an object that was just a swatch of knitted cloth.

As I knit, I attempted to include two lengths of conductive thread – one on the fourth stitch from the beginning, and one on the fourth stitch from the end. These will eventually become the wiring that keeps the clasp engaged.

img_20190123_164546

The bracelet turned out well enough considering it was my first serious foray into knitting. For some reason – probably through missing or fouling up stitches – the finished knit has a distinct curvature to it, which works for a bracelet!

img_20190123_164542

For the next step, I wove a swatch to serve as a place to anchor the clasp mechanism. I had done some weaving in workshops previously so this was familiar to me, and a YouTube video was a good refresher.

img_20190123_165848

I tied off the cut portions of the weft and trimmed them down.

img_20190123_170837

img_20190124_092253

There is much more work to do. Having never knitted before, I spent the majority of the week getting comfortable with the process through trial and error. I now understand how to recognize a mistake and fix it right away, which I did not when I began; mistakes I made early in the knit were deeply worked into the piece before I recognized what they were.

Furthermore, I settled on the initial design of the project before I truly understood the needs of it. Before this piece is completed I intend to re-imagine it so it can function without a microcontroller, and to utilize one of the fabric-based sensors. Perhaps I will eschew magnets altogether?

While I’m disappointed to not have a completed product I am excited to have discovered knitting, which I find fun and relaxing. Now that the hurdle of learning to knit has been overcome I’m looking forward to continuing exploring, and perhaps knitting myself a big fluffy scarf.

References & Resources

RJ Knits (2018, November 24). How to Knit: Easy for Beginners. Retrieved from https://www.youtube.com/watch?v=p_R1UDsNOMk&feature=youtu.be

The Met (2016, March 11). #MetKids-Weave on a Mini Loom. Retrieved from https://www.youtube.com/watch?v=AWLIy-Um7_0