Touch Music – Work from Home Edition

Project Description

Since the beginning of the COVID-19 pandemic, working from home has become the new norm, which led to monotony during the lockdown period. Despite the slight return to normalcy of being able to go to work or school in person, we have slowly adopted working from home as it has been popularised over the past two years. Touch Music – Work from Home Edition is an extension of a previous project I did in the Creation and Computation studio class back in 2020. That project explored capacitive sensing using an Arduino Nano 33 IoT connected to a capacitive sensor, which was attached to conductive material wrapped around a wine bottle. Once connected, it was activated by touch and would display a visualizer that reacted to the beat of a song.

touch-music-2touch-music

In this version of Touch Music – Work from Home Edition, I further explore the concept by designing a wearable sweater that is worn while working from home. The sweater has two capacitive sensors, one on the left cuff and the other on the back, and the idea is for the sensors to play different types of music when someone is studying or working and when they are taking a break. The sensor on the cuff is activated when the wearer rests their hand on the computer while typing, and plays focus music. The sensor on the back is activated when the user takes a break and leans back fully in their chair, at which point the music genre changes to the user’s preference. The wearable is intended for people who work or study from home and enjoy playing music while they work.
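To make the interaction concrete, here is a minimal sketch of the sensing side (the project’s actual code is linked in the Link To Code section below). The electrode numbers and serial messages are assumptions for illustration; a companion program on the computer would listen for them and switch playlists.

#include <Wire.h>
#include "Adafruit_MPR121.h"

Adafruit_MPR121 cap = Adafruit_MPR121();

const uint8_t CUFF_PAD = 0;   // assumed: MPR121 electrode wired to the cuff patch
const uint8_t BACK_PAD = 1;   // assumed: MPR121 electrode wired to the back patch
uint16_t lastTouched = 0;

void setup() {
  Serial.begin(9600);
  while (!Serial);
  if (!cap.begin(0x5A)) {          // 0x5A is the MPR121's default I2C address
    Serial.println("MPR121 not found, check wiring");
    while (1);
  }
}

void loop() {
  uint16_t touched = cap.touched();   // one bit per electrode

  // Rising edge on the cuff pad: ask the host program to play focus music.
  if (bitRead(touched, CUFF_PAD) && !bitRead(lastTouched, CUFF_PAD)) {
    Serial.println("FOCUS");
  }
  // Rising edge on the back pad: ask the host program to play break music.
  if (bitRead(touched, BACK_PAD) && !bitRead(lastTouched, BACK_PAD)) {
    Serial.println("BREAK");
  }

  lastTouched = touched;
  delay(50);
}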

Project Context

The Bose SoundWear Companion Wearable Speaker (Bose n.d.) is one of the inspirations for the Touch Music project. The wearable speaker is designed to be worn on the neck and offers the best of both worlds: portability and great sound output. It is well designed, sits comfortably on the neck, and is great for people who work from home. It has two upward-facing speakers that point toward the ears with good sound quality, offering an alternative for hands-free use such as answering calls while driving. While it is not exactly designed for use in public spaces, as it is a speaker and can get loud, it has other potential uses including use at home or in the backyard, taking a walk, speakerphone calls, virtual reality, and driving. Touch Music is designed for use at home, specifically for people who work or study from home and spend most of the day at their computers. Like the Bose speaker, the wearable sweater’s microcontroller is designed to sit on the shoulders of the user, where it is unobtrusive and not distracting to the wearer.

Liquid MIDI is an experimental modular textile interface for sonic interactions, exploring aesthetics and morphology in contemporary interaction design (‘Liquid MIDI’ n.d.). It uses experimental textiles and conductive ink as a tangible interface for sonic interaction. The piece consists of textiles screen-printed with conductive paint. The paint creates a network of intersecting lines with pronounced circles at specific junctions. The lines are connected to an Arduino through alligator cables, which lets the piece communicate with the desired software using the MIDI protocol. The trigger pads and fader board are screen-printed modules focused on AV performances, allowing performers to build their setup according to their needs. Sound is a medium that has been increasingly gaining ground in the visual arts during recent decades, despite this seeming contradictory (‘Liquid MIDI’ n.d.). Liquid MIDI relates to Touch Music, which uses the same concept of fusing conductive material onto a fabric, in this case a sweater, to trigger a sound output when touched. The unique interaction of Liquid MIDI is that it is foldable and morphable, which allows for interesting uses and interactions where the interface becomes a part of the process of creation itself.

liquid-midi_002

Photo Credit: Liquid MIDI

The Woojer Vest delivers one of the most powerful haptic experiences: high-fidelity tactile sensations that reproduce the rich emotion of sound (‘Vest’ n.d.). The Vest Edge gives 360 degrees of immersion, delivering powerful, accurate, and detailed sensations. The Woojer Vest works by pumping low frequencies of sound into the body. It is best experienced when playing games, watching movies, or listening to music, giving a one-of-a-kind audio experience that cannot be achieved with headphones alone. It makes watching movies and playing games, especially with VR, much better (Sutton 2021). The vest is designed to be worn and plugged into a gaming console, headphones, or any other device you want to use with it. It is the perfect companion for at-home gaming, movies, VR, and music, pumping low frequencies through your body to deliver a unique and mesmerizing audio experience. The idea of Touch Music is to introduce a different experience of working from home by adding an interesting interaction to clothing that allows the wearer to control music using the sense of touch. As it can be monotonous to see the same four walls of a workspace, Touch Music offers a way to liven up the work-from-home experience, with the potential to explore other interesting interactions.

Parts List

Electronics

  1. (1) Arduino Nano 33 IoT – ARDNN-032333 (Arduino)
  2. (1) MPR121 12 Key Capacitive Touch Sensor – PROTS-001982 (Adafruit)
  3. (2) 6″ (M-F) Jumper Wire – CONJU-062319
  4. (2) Ribbon Cable 22AWG 40 way – 1 Meter
  5. (2) Mini Breadboard Red – PCBBA-120442
  6. (1) USB (A) to Micro (B) Cable – 5ft – ZCABL-010215
  7. (4) 22AWG Hookup Wire – WIREJ-000140

Materials

  1. Copper Tape – COPER-010567
  2. Sweater
  3. Velcro
  4. Thread
  5. Aluminium Foil

Circuit Diagram on Fritzing

touch-music-circuit-diagram

Link To Code

Code on GitHub

Wearability Assessment

The wearable sweater was comfortable for its intended purpose: while seated it was comfortable and unobtrusive. While moving around or walking, the microcontroller felt slightly heavy but was not very noticeable; the wires sewn through the sweater were not felt and stayed in place even while moving around. Gemperle argues that “the weight of a wearable should not hinder the body’s movement or balance” (1998). The construction of the wearable sweater is not bulky or heavy; however, it would be beneficial to use more compact electronic components in the next iteration.

Final Photos

img_3423img_3424img_3426img_3436img_3425img_3431

Construction Photos

img_3413img_3420 img_3414 img_3415img_3419 img_3416img_3417img_3418img_3455 img_3458img_3457img_3456

Video

Link to Demo Video – https://ocadu.techsmithrelay.com/b1LE?tab=Details

Link to Process Video – https://ocadu.techsmithrelay.com/apDM?tab=Details

Challenges and Successes

The wearable sweater was a fun project to work on. Some of the successful aspects were that the code worked well and the wearable was easy to assemble. It was comfortable to wear and unobtrusive, to the point that I sometimes forgot I was wearing the device. The progress made from the first Touch Music project, which focused more on physical computing, led to further exploration of wearable technology in this iteration.

Some of the challenges faced during this project were time constraints while trying to balance thesis work. As the wearable sweater incorporated capacitive touch, it sometimes lagged in responding or would cut off the music midway and reconnect after some time. The capacitive sensors are hidden from the wearer’s line of sight, which could potentially be jarring if the user forgets about them and music automatically starts playing when a sensor is activated.

Next Steps

The next step for the project would be to design a more compact wearable that can be worn and removed by the wearer as they please. Presently the device is fixed to the sweater, which is not feasible if it is intended to be worn over an extended period, as not everyone enjoys listening to music as they work. Different form factors and different ways of incorporating the device into a wearable could be further developed, along with concept ideas that go beyond playing music while working from home.

 

Bibliography

Bose. n.d. ‘SoundWear Companion Wearable Speaker – Bose Product Support’. Accessed 2 May 2022. https://www.bose.ca/en_ca/support/products/bose_wearables_support/soundwear-companion.html.

Gemperle, F., C. Kasabach, J. Stivoric, Malcolm Bauer, and R. Martin. 1998. ‘Design for Wearability’. In Digest of Papers. Second International Symposium on Wearable Computers, 116–22. https://doi.org/10.1109/ISWC.1998.729537.

‘Liquid MIDI’. n.d. E J T E C H. Accessed 2 May 2022. http://ejtech.cc/?page_id=790.

Sutton, Robert. 2021. ‘Woojer Vest – Everything You Need To Know’. Teckers® Tech (blog). 31 August 2021. https://teckers.com/woojer-vest/.

‘Vest’. n.d. Woojer. Accessed 2 May 2022. https://www.woojer.com/pages/vest.

Wait!

wechatimg710

Project title: Wait!

Project Description: 

This project focuses on facilitating walking for users with vision impairment by incorporating a TENS (transcutaneous electrical nerve stimulation) device as a form of haptics. People with blindsight (the ability to detect things in the environment without being aware of seeing them) face obstacles in performing daily activities without bumping into upright objects or surfaces. They need to use a cane to track obstacles or require guidance from another person when walking around buildings.

Wait! is a wearable that addresses the needs of users with vision issues by providing audio and haptic feedback to alert them when they are too close to impediments. It consists of an ultrasonic sensor that detects the real-time distance from the user to the object or surface in front of them. When the user walks up too close to the object or surface, the ultrasonic sensor triggers the switch in the relay, generating the sound of a click and sending low-voltage electrical pulses to the body via the TENS device. Only when the user steps back are the electrical pulses halted. In this way, the user is better supported and assisted in navigating indoor or outdoor spaces.
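A minimal sketch of this distance-to-relay logic could look like the following; the pin numbers and the 60 cm threshold are assumptions for illustration, and the relay is assumed to be wired in place of the TENS unit’s switch (the project’s actual code is linked below).

const int TRIG_PIN  = 9;      // HC-SR04 trigger (assumed pin)
const int ECHO_PIN  = 10;     // HC-SR04 echo (assumed pin)
const int RELAY_PIN = 7;      // relay in series with the TENS unit's switch (assumed pin)
const float ALERT_CM = 60.0;  // "too close" threshold, tuned by testing

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, LOW);   // pulses off by default
  Serial.begin(9600);
}

float readDistanceCm() {
  // Send a 10 microsecond ping and time the echo.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // timeout of roughly 5 m
  return duration * 0.034 / 2.0;                   // convert echo time to centimetres
}

void loop() {
  float distance = readDistanceCm();
  // Close the relay (click + TENS pulses) only while an obstacle is near.
  if (distance > 0 && distance < ALERT_CM) {
    digitalWrite(RELAY_PIN, HIGH);
  } else {
    digitalWrite(RELAY_PIN, LOW);
  }
  delay(100);
}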

By incorporating an established TENS device, users are able to adjust the strength and frequency of the pulse signals according to their own preference and comfort level. The values can be set beforehand and saved for future use as a customization.

Project Context:  

EMS (electrical muscle stimulation) and TENS have been widely employed in the field of medicine. They can be considered supplements to conventional muscle training, particularly for therapeutic treatment and physical rehabilitation. TENS is used more specifically in pain treatment, in that it stimulates the nerves exclusively by delivering electrical signals that do not trigger muscle movement. Applying TENS signals to painful spots on the body can reduce discomfort and relieve pain (Gibson et al., 2019).

Despite the emphasis on medicine, there are emerging studies using EMS as a form of haptics in the creative industry. Studies have been conducted around the intersection of wearable devices, mixed reality, human–computer interaction and experience design. EMS has been used as a haptic input and output technology in wearable and textile-based computing by crafting comfortable textile electrodes (Pfeiffer & Rohs, 2017). EMS has also been incorporated into haptic interfaces to simulate the force-feedback effect caused by a collision in a mixed reality tennis game (Farbiz et al., 2007). For all the applications of EMS in non-traditional settings, TENS has yet to be explored and applied in other disciplines from an innovative perspective.

One case study that informed this project is Using Electrical Muscle Stimulation Haptics for VR. While traditional approaches in VR focus on lightweight objects via skin receptors, such as vibro-tactile gloves, simulating heavy objects is still limited by traditional methods of physical props or hand tethering (Burdea, 2000). This study explores how to better render heavy objects in VR via EMS in the form of wearables. Its main concept is to prevent the user’s hands from penetrating virtual objects by means of EMS. Tension is created in the user’s bicep, tricep, pectoralis, and shoulder muscles; the system stimulates up to four muscle groups to generate the desired tension, thereby constructing a realistic experience of touching a wall or lifting heavy objects.

The system is encapsulated into a small backpack, which could be worn by the user. The backpack contains a medical compliant 8-channel muscle stimulator, which is controlled from within the VR simulators via USB. Other components include a typical VR system consisting of a head-worn display (Samsung/Oculus GearVR) and a motion capture system (eight OptiTrack 17W cameras). 

This project informs the design of Wait! in that, in a VR environment, users are less capable of sensing their surroundings while moving around – similar to being in a vision-impaired circumstance. Using EMS to generate muscle tension and thereby simulate the weight of an object is intriguing and very informative in terms of possible applications of muscle stimulation.

Other related works include vibration-based tactile haptics such as CyberTouch by Virtual Technologies (Burdea, 2000), pneumatic gloves using air pockets (Tarvainen & Yu, 2015), and tethers and exoskeletons for fingers or upper-body muscles. Passive haptics are also often used in VR to simulate the existence of objects, such as still props and props placed by a human or robot. Even though tactile haptics could potentially support better texture rendering, they do not deliver a directional force that acts upon the user’s hands or muscle groups. Pneumatic gloves face a similar challenge in representing heavy objects. Hence the use of EMS/TENS to activate muscles is of great value and importance to investigate further.

Another study I looked into is the Remote Controlled Human project, which uses a Spark Core, a TENS unit, and a relay to remotely control a human minion over WiFi. The Spark Core connects the TENS unit, and the user it is attached to, with the Internet of Things. With the TENS unit stimulating involuntary muscle movements, the relay acts as a bridge between the signal provider and the receiver. This project is a great reference for circuit construction, specifically how to bridge a commercialized healthcare product with computer software via a simple relay. It also opens the future possibility of hooking up sensors and writing one’s own sketches via the Spark IDE to generate outputs based on all kinds of conditions, such as rhythmic music input, light levels, and so on.

Parts List:

  • 1 Arduino UNO
  • 1 TENS unit (TENS 3000)
  • 1 Ultrasonic Sensor HC-SR04
  • 1 relay unit
  • 1 Felt hat
  • Breakout cables
  • 3 Male-to-Male jumper cables
  • 2 Male headers
  • Wire cutters/strippers
  • Sewing kit
  • Small slotted screwdriver

Circuit Diagram:

circuit-diagram

Github Code | Link to Video

Wearability Assessment/Priorities:

  1. Unobtrusive placement. (The hat sits on an appropriate area of the human body without impeding dynamic body movements.)
  2. Design for human perception of size. (This wearable is designed to minimize thickness and weight as much as possible. The sensor is within the range of a normal hat’s weight, keeping it unobtrusive and transparent.)
  3. Containment. (This wearable contains all of the electronic components, wires, electrodes, etc. While some of these are malleable in form, many have a fixed volume, and one needs to consider what the ‘insides’ bring to the outer form.)

Final Photos:

wechatimg714 wechatimg711 wechatimg712

Challenges & Successes:

Figuring out a way to incorporate TENS signals that are non-harmful yet effective on the human body was one of the biggest challenges during the early stage of this project. As electrical muscle stimulation has a safe range of usage and a comfort level that varies across users of different demographics, utilizing TENS requires thorough preliminary preparation and physiological backup. Out of safety concerns, I decided to proceed with a relatively safe, minimal-side-effect TENS device that has been clinically tested and commercially deployed.

Referencing the Remote Controlled Human project, I was able to hack into the TENS 3000 unit and control its digital switch using a relay and an Arduino UNO. Analog control of the TENS unit would be much more complicated with an established device.

The construction of this project was also a big challenge in terms of fitting the whole circuit into the interior of a hat while maintaining enough space for the user’s head. I mapped the circuit onto the inner surface and tried to design an optimal route for the wires to connect each component without taking up extra space. The ultrasonic sensor was placed at the front of the hat, with the rest of the sensor hidden in the hat brim through two poked holes. The wires were attached underneath, with another cut to hide the connection. However, this still leaves visible wires connected to the battery and electrodes, which users might find complicated and troublesome when attaching the TENS pads to the body.

Next Steps:

Furthering this project, I would like to experiment with controlling TENS/EMS signals in an analog way rather than digitally. The wearable could be elevated if it could sense body signals as inputs and adjust the output automatically with the assistance of Arduino programming. However, figuring out the connection and ensuring a safe range of pulse stimulation will be the two biggest challenges.

To achieve that, it might require abandoning established TENS units and figuring out a way to manually set the frequency and strength of the electrical stimulation. This adds a level of insecurity and danger, as it would not be monitored in a controlled environment.

Incorporating EMS into other creative disciplines as an innovative medium would also be interesting to explore, for example at its intersection with music, performance art, writing, education, and so on.

Bibliography:

Farbiz, F., Yu, Z., Manders, C., & Ahmad, W. (2007). An electrical muscle stimulation haptic feedback for mixed reality tennis game. SIGGRAPH ’07.

Follmer, S., Leithinger, D., Olwal, A., Cheng, N., & Ishii, H. (2012). Jamming user interfaces. Proceedings Of The 25Th Annual ACM Symposium On User Interface Software And Technology – UIST ’12. doi: 10.1145/2380116.2380181

Gibson, W., Wand, B., Meads, C., Catley, M., & O’Connell, N. (2019). Transcutaneous electrical nerve stimulation (TENS) for chronic pain – an overview of Cochrane Reviews. Cochrane Database Of Systematic Reviews. doi: 10.1002/14651858.cd011890.pub3

Janczyk, M., Skirde, S., Weigelt, M., and Kunde, W. (2009). Visual and tactile action effects determine bimanual coordination performance. Hum. Mov. Sci. 28, 437–449. doi: 10.1016/j.humov.2009.02.006

Jones, S., Man, W., Gao, W., Higginson, I., Wilcock, A., & Maddocks, M. (2016). Neuromuscular electrical stimulation for muscle weakness in adults with advanced disease. Cochrane Database Of Systematic Reviews, 2016(10). doi: 10.1002/14651858.cd009419.pub3 

Lee, J., Kim, Y., & Jung, H. (2020). Electrically Elicited Force Response Characteristics of Forearm Extensor Muscles for Electrical Muscle Stimulation-Based Haptic Rendering. Sensors (Basel, Switzerland), 20(19), 5669. https://doi.org/10.3390/s20195669

Lopes, P., You, S., Cheng, L.-P., Marwecki, S., & Baudisch, P. (2017). Providing Haptics to Walls & Heavy Objects in Virtual Reality by Means of Electrical Muscle Stimulation. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 1471–1482). ACM. ISBN: 978-1-4503-4655-9

Pavani, F., Spence, C., and Driver, J. (1999). Visual capture of touch (tactile ventriloquism); out-of-the-body experiences with rubber gloves. J. Cogn. Neurosci. 11:14.

Pfeiffer, M., Schneegass, S., & Alt, F. (2013). Supporting interaction in public space with electrical muscle stimulation. Proceedings of the 2013 ACM conference on Pervasive and ubiquitous computing adjunct publication.

Robertson, A. (2021). Meta’s sci-fi haptic glove prototype lets you feel VR objects using air pockets. Retrieved 21 April 2022, from https://www.theverge.com/2021/11/16/22782860/meta-facebook-reality-labs-soft-robotics-haptic-glove-prototype

Stretch Controller

Project Description
“Stretch Controller” is a wearable controller that interfaces with Unity. It takes the form of a glove and has two stretch sensors that work as the inputs for the project. The inputs are used alongside other modifiers in the Unity Engine. In this way users can interact with the project by flexing and relaxing their fingers. I chose to use stretch sensors for the inputs because I thought it would be interesting to take a simple concept and play around with it to make something interesting and functional. Additionally, I chose to create a wearable controller because I like creating video games and feel that creating my own version of a controller allows new and interesting design conventions and affordances that differ from an ordinary controller. While I do enjoy the tactility of ordinary controllers and the physical feedback they provide with buttons, I feel that a wearable controller can offer more immersion when playing a game. This project serves more as an exploration of alternative controllers rather than a replacement for ordinary ones.
screenshot_20220426-191744_gallery20220426_19544520220426_19545420220426_195057

Interactions/ Controls
The main interactions of the piece are the stretch sensors. They are placed along the pinky and index fingers, so that when the fingers bend or ball up, the sensor is stretched and the input value changes. The value is then run through a C# script in Unity so that it is remapped to be more suitable for interaction. The stretch sensor controls the rotation of a cube. The screen also displays the maximum and minimum values (important for calibration), the mapped value, and the raw value. (The analog input range is 0–1023; the stretch sensor never covers that full range and must be calibrated to something more suitable. The cube rotation also only has a range of 360 degrees.)
The other interaction is holding down the on-board button (pin 4). While the button is held down, the minimum and maximum values will be recalibrated.
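The remapping itself lives in the Unity C# scripts linked below; for illustration, this is a rough Arduino-side sketch of the same calibrate-and-map idea, assuming the two voltage dividers are read on pads #9 and #10 and using the board’s left button (pin 4) for recalibration.

#include <Adafruit_CircuitPlayground.h>

const int PINKY_PIN = A9;    // assumed: pad #9 reads the pinky's voltage divider
const int INDEX_PIN = A10;   // assumed: pad #10 reads the index finger's divider

int minVal[2] = {1023, 1023};
int maxVal[2] = {0, 0};

void setup() {
  CircuitPlayground.begin();
  Serial.begin(9600);
}

int remapped(int raw, int i) {
  // Clamp to the calibrated window, then map to the cube's 0-360 rotation range.
  raw = constrain(raw, minVal[i], maxVal[i]);
  return map(raw, minVal[i], maxVal[i], 0, 360);
}

void loop() {
  int raw[2] = {analogRead(PINKY_PIN), analogRead(INDEX_PIN)};

  // While the left on-board button (pin 4) is held, stretch and relax the fingers
  // so the sketch re-learns each sensor's minimum and maximum.
  if (CircuitPlayground.leftButton()) {
    for (int i = 0; i < 2; i++) {
      minVal[i] = min(minVal[i], raw[i]);
      maxVal[i] = max(maxVal[i], raw[i]);
    }
  }

  // The mapped values are only meaningful after a calibration pass.
  Serial.print(remapped(raw[0], 0));
  Serial.print(",");
  Serial.println(remapped(raw[1], 1));
  delay(20);
}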

Videos
Video1(Shows the glove and hand flexing): https://drive.google.com/file/d/1WSHZCvftgVV4atS4N2ZegyDyHGLOKVX0/view?usp=sharing

Video2(Shows the computer inputs in the computer screen):https://drive.google.com/file/d/1v6HMORIKVE0vf1LTWKLBslA4rtk5ZEKm/view?usp=sharing

Materials:
-Adafruit Circuit Playground Classic
-Conductive Rubber (Stretch Sensor), 350 Ω per inch when relaxed (From Creatron)
-Black Spandex (From Kings Textiles)
-1 kΩ Resistor × 2
-Conductive Thread

Fritzing Schematics

untitled-sketch_2

Code
Here is the link to the code on GitHub.
The repository features 4 files: 2 Unity files and 2 Arduino files.
-The Unity files respectively remap the input for the pinky and index. They also use the remapped data to change the x (pinky) and y (index) rotation values of a cube.
-The Arduino file titled “Final_Test” was a test I used at the beginning of the process to make sure all the analog inputs were correct.
-The Arduino file titled “Uduino_Final” is the example code from the Uduino library that lets the Arduino (Circuit Playground Classic in my case) and Unity interface with each other. The code is only altered at the beginning to allow the Circuit Playground to be read:
#include <Adafruit_CircuitPlayground.h> is the only addition. The rest of the code is by Marc Teyssier.

Project Inspiration
The project was inspired by the Nintendo Power Glove and its predecessors, the VPL DataGlove, and Z-Glove made by Thomas Zimmerman.

The Z-Glove inspired me because of its optical flex sensors. I really like how they use a basic physics concept, that light travels in a straight line, to detect how much a finger bends. Similarly, I wanted to use the simple concept of stretching and flexing for my own glove-based controller.

Process: Electronics
The electronics were rather difficult to construct and wire up. Most of the issues came from the conductive thread. Because of how stiff it is, it tended to come undone quite often, and there was often the risk of the loose thread coming into contact with the rest of the circuit. This was especially a problem at the end of the circuit, when connecting everything to the Circuit Playground. To fix this, I laid the loose ends down and placed a small piece of conductive fabric on top of them, since the fabric has a heat-activated adhesive.
screenshot_20220426-191858_gallery
Besides that, the circuit features two voltage dividers, with one stretch sensor assigned to each of them.
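For reference, each divider follows Vout = Vin × Rfixed / (Rfixed + Rstretch), assuming the 1 kΩ resistor sits between the analog pad and ground with the conductive rubber on the supply side; stretching the rubber raises its resistance, so the voltage at the pad drops (if the two are swapped, the reading rises instead).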
screenshot_20220426-191925_gallery
Process Video: https://drive.google.com/file/d/1WaJUJ78CmuAfbBZdRBp_SrrjEqhqX_it/view?usp=sharing

 Challenges and Successes
Overall, I really liked how tidy the wiring is for the project. I’m also really glad that it works and that the fabric isn’t pulled by the conductive rubber. I also learnt from testing that adding another strip of the conductive rubber averages out the values a little bit. I’m really glad that I was able to interface the Circuit Playground with Unity. This knowledge will serve me very well in the future.
There were a lot of challenges for the project. Though I wanted it to be modular with 3D printed pieces, I could not get them printed in time, so I sewed the whole thing onto the glove. The Circuit Playground also tended to overheat, so I’m a bit wary about keeping it plugged in for too long. The Playground also seems to be able to read the analog inputs only one at a time, so that will be a difficult challenge to overcome in the future. Another challenge was that everything came off a lot, as mentioned earlier with the wires often coming undone.

Next Steps:
There is a whole lot I want to improve on with the project. I probably want to use a different material or make a bigger glove, as the spandex is very tight; while this is good for detecting the stretch, it makes taking the glove off rather hard, especially considering how fragile it is and how easily the components can come off it. I also really want to focus on making it modular. Most of all, I really want to make a better game to interface with the glove. This works well as a demo, but there are so many other things I want to make with it.

 

Web Futures

img_6326

 

img_6273

 

My art style, especially in the context of technology, is heavily influenced by juxtaposing seemingly opposite realms: physical-digital, technological-organic. By uniting these realms, technology departs from being a mere utility or novelty and becomes a dynamic and stimulating extension of our own dimension.

Web Futures is a conceptual project that further explores this artistic interpretation through a variety of techniques. The project features a glove that connects via Bluetooth to a TouchDesigner animation, endeavouring to bridge digital and physical realities with a seamless and dynamic interaction. As the user moves their hand, an animation responds to these motions, serving an experience akin to physically reaching into the digital world.
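As a sketch of how the glove side of this could work, the following assumes the Nano 33 BLE board streams its onboard accelerometer over a custom BLE characteristic that the TouchDesigner patch subscribes to; the UUIDs, device name, and update rate are placeholders rather than the project’s actual values (the real code is linked below).

#include <ArduinoBLE.h>
#include <Arduino_LSM9DS1.h>

// Hypothetical service/characteristic UUIDs for streaming hand motion.
BLEService motionService("19B10000-E8F2-537E-4F6C-D104768A1214");
BLECharacteristic accelChar("19B10001-E8F2-537E-4F6C-D104768A1214",
                            BLERead | BLENotify, 3 * sizeof(float));

void setup() {
  Serial.begin(9600);
  if (!IMU.begin() || !BLE.begin()) {
    while (1);  // halt if the IMU or BLE radio fails to start
  }
  BLE.setLocalName("WebFuturesGlove");      // placeholder device name
  BLE.setAdvertisedService(motionService);
  motionService.addCharacteristic(accelChar);
  BLE.addService(motionService);
  BLE.advertise();
}

void loop() {
  BLEDevice central = BLE.central();        // wait for the host running TouchDesigner
  while (central && central.connected()) {
    float xyz[3];
    if (IMU.accelerationAvailable()) {
      IMU.readAcceleration(xyz[0], xyz[1], xyz[2]);
      accelChar.writeValue((uint8_t *)xyz, sizeof(xyz));  // notify new hand motion
    }
    delay(20);
  }
}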

Additionally, the physical makeup of the glove is an exploration of demonstrating technology through a seemingly opposing medium. The glove was crocheted very loosely from raw cotton twine, delicately covering the hand in a very organic, weblike structure. The BLE board is plainly displayed on top and is strikingly beautiful against the organic cotton, with its wires and battery connection neatly hidden in the user’s sleeve. Technology is often considered something highly engineered, so I found it particularly interesting and even futuristic to move beyond that and display technology in a way that appears to be falling apart.

img_6330 img_6312

 

Wearability:

The wearability of my final project diverged drastically from the wearable criteria I outlined in my project proposal. Initially, I imagined an installation piece available for all sorts of people to try, which therefore required a simple and easily adjustable glove. However, when it actually came to creating the glove, I felt more driven to create an artistic piece that explored my interest in smart textiles.

Materials:

  • 1x Arduino Nano 33 BLE Sense
  • 1x Arduino Nano 33 BLE
  • 1x D Battery
  • 1x D Battery Clip
  • Conductive Thread
  • Conductive Yarn
  • Raw Cotton Twine
  • Regular Thread

Circuit Diagram:

screen-shot-2022-04-25-at-8-31-50-pm

Code:

https://github.com/kahaniploessl/web-future

Final Video:

https://youtu.be/wqDIlZn0i2g

Challenges/Successes:

Overall this project was a success and I was able to create what I set out to do. Incorporating the Bluetooth was the biggest challenge, and I’m so pleased it worked, because the project would not have felt as cool without it.

Next Steps:

I’m very excited for what comes next in this project, which entails adding finger switches crocheted from conductive yarn. I think introducing this element will really elevate the project, as it will serve as an excellent bridge between the circuit board and the crocheted glove, excellently displaying the idea of “loose/deconstructed technology”. While it’s unfortunate I was not able to include that this semester, it’s something I’d like to work on right away, so I should have an update relatively soon.

Fairy Dust – Angelina Do & Valeria Suing

Project Title: “Fairy Dust”

Project Description

“Fairy Dust” is a wearable electronic device in the form of a vest mounted with moving wings controlled by the movement of the body. It also has an installation/interactive piece where movement triggers on-screen animated fairy dust. “Fairy Dust” was an exploration of wearable technology as a platform for self-expression through fashion and style. This wearable was created for those who dare to dream and imagine!

Project Context

Parts, Materials, Technical Assets

  • Circuit Playground Express 
  • Micro USB Cable 
  • Micro Servo Motors (with 3D-printed 90-degree servo attachment)
  • Needle and Thread 
  • Fabric (Flannel, Cotton, Tulle) 
  • Fairy Wings 
  • Decorative Elements (Flower Trim, Rose Embellishments etc.) 
  • Alligator Clips 
  • TouchDesigner 
  • Power Source (Powerbank used)

diagram_servomotors

Link To Code

Link To MakeCode

Wearability Assessment

“Fairy Dust” was designed to create an engaging experience. It’s meant to be presented as an interactive art piece that would impress, surprise and delight the user and audience. Throughout the process, we followed the Design Framework For Social Wearables (2019) as we considered how and why we would create our desired outcome. 

  • Sensing 

Our wearables project uses an accelerometer to sense body movement. The placement of the accelerometer was decided using Clint Zeagler’s Movement Sensor Placement Body Map. After taking into consideration garment manufacturing and wire placement, we decided to place the sensor on the back. We also considered that, depending on the accelerometer’s height, there would be different values for the y and z axes that would have to be addressed via calibration. After several tests, we saw that the z-axis values gave us consistent results in relation to the wearable’s behaviour and environment. 

  • Actuating 

“Fairy Dust” has two expressive visual outputs that are meant to capture the attention of the public. As a kinetic wearable project, we made use of servo motors to activate movement. Motors were attached to fairy wings in the back to create a flapping motion. As part of the immersive experience, the other actuator was interactive fairy dust visuals using a particle system. 

  • Sensing-Actuating Interplay 

Since fairies are mythical creatures, our perception of their wing behaviour is heavily influenced by childhood characters and stories. In our interpretation, we decided to use an upright position to make the wings move. When the accelerometer senses a change away from this position, the wings stop moving (a rough code sketch of this logic follows this list). 

At first, this interaction may be a surprise for the user, creating a sense of magic. As the user keeps interacting with the wearable, the input may become more apparent. At this point, the user can start expressing through their body movement while changing the behavior of the outcomes. 

  • Personal and Social Requirements 

This immersive experience requires attention from the wearer. A big decision to incorporate visuals was to keep the user’s engagement and focus. This physical immersion allows participants to step into a unique interaction and let their imaginations flow. We wanted to create an experience where the user feels like part of the story. This wearable is meant to create an audience and allows for different interpretations. 
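The behaviour itself was built in MakeCode (linked above); as a rough Arduino-C++ equivalent of the upright-position logic, the sketch below is only an illustration, with the servo pads and the z-axis threshold as assumptions to be tuned by calibration on the body.

#include <Adafruit_CircuitPlayground.h>
#include <Servo.h>

Servo leftWing, rightWing;
bool flapUp = false;

void setup() {
  CircuitPlayground.begin();
  leftWing.attach(A2);    // assumed servo pads on the Circuit Playground Express
  rightWing.attach(A3);
}

void loop() {
  // The z-axis reading gave the most consistent sense of posture in testing;
  // the 7 m/s^2 threshold here is a placeholder to be tuned by calibration.
  float z = CircuitPlayground.motionZ();
  bool upright = (z > 7.0);

  if (upright) {
    flapUp = !flapUp;                 // alternate between two wing positions
    int angle = flapUp ? 30 : 120;
    leftWing.write(angle);
    rightWing.write(180 - angle);     // mirror the angle so both wings flap together
  }
  delay(400);
}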

Photos

  • Being Worn & Displayed

img_0763img_0764img_0783img_0761img_0770

Video of Interaction

Interaction of body movement (CPX accelerometer values attached to serial monitor) controlling fairy dust animation on TouchDesigner:

finalvisual

Supporting Sketches, Diagrams, Models, Renderings

  • Visuals Process

To decide the look of the visuals we made a moodboard, which helped us choose a colour palette and aesthetics.

visualsmoodboard

The next step was to allow serial communication between MakeCode and TouchDesigner. After some research we realized that this connection was only possible on Windows. We were able to make it work by switching computers and using functions to reference these values. 

To make the visuals reactive we used a sphere geometry function. The position of the sphere referenced the serial monitor. We then used the sphere as an input in the particlesGPU and customized the behavior of the particles.

After testing the behaviour of the particle system, we customized the look to match the desired aesthetics of our immersive experience.

finalvisual

  • Wings Process

As this was our first time working with micro servo motors, or any motors for that matter, it was important to run many trial-and-error sessions. In this first video, one half of the wings was roughly attached to view the potential motions the motor could create.

After seeing how the motor on its own made more of a “wiggle” motion rather than “flapping”, we were lent a 3D-printed 90-degree servo attachment to allow for a more seamless flapping motion. At this point in time, the wings were moving separately rather than together.

The final step of the motor mechanics was placement inside the vest. We had to ensure they were sewn in at an appropriate height and distance from each other for a more accurate emulation of “flapping” wings. At this height, the wings do not obstruct the wearer’s movement or cause any discomfort, which is exactly what we were aiming for!

  • Vest Process

Since this project was more of an exploration of self-expression, it was important that we created a wearable that captured the majestic style and mood that was envisioned. We decided to create a vest from scratch using materials sourced from FabricLand. I started out by using a form-fitted shirt as a guide, since the wearable would be fitted to my (Angelina’s) body. The final wearable ended up being a hand-sewn, double-lined vest made of flannel fabric, with tulle sleeves and a tulle trim, finished off with rose embellishments. The wings were the only pre-made element (purchased from Dollarama). This was my first time doing something like this and I am very proud of the outcome!

Challenges & Successes

Being a collaborative project, we depended on meetings and constant communication. As a safety precaution due to one of us contracting COVID-19, we were forced to work remotely in the last week before the due date. This was challenging since only one of us had access to the wings and vest. We also needed access to Windows to connect the serial monitor from MakeCode, and, unfortunately, this meant we did not get to see both finished pieces working together. 

In spite of all circumstances we are both very proud of the outcome. We followed a schedule and we communicated really well throughout the process. We set goals for each team meeting and we assigned responsibilities to work on our own. We were able to successfully work as a team and individually. 

Next Steps

We would definitely want to keep working on this project as a collaboration. Something that we want to incorporate is another output, such as lighting. We had some trouble incorporating NeoPixels into our design; we also discussed the possibility of adding fibre optics and the design possibilities that this would bring. 

Our goal was to see body movement interact with the wings and visuals. Nonetheless, another exploration that we are curious about is making the visuals react to the wings’ movement. We’re both curious to keep exploring this little world that we created and the endless opportunities for different unique interactions. 

References:

Dagan, E., Márquez Segura, E., Altarriba Bertran, F., Flores, M., Mitchell, R., & Isbister, K. (2019). Design Framework for Social Wearables. Proceedings of the 2019 on Designing Interactive Systems Conference. https://doi.org/10.1145/3322276.3322291

Zeagler, C. (2017). Where to wear it. Proceedings of the 2017 ACM International Symposium on Wearable Computers. https://doi.org/10.1145/3123021.3123042

Flaming Skirt

Flaming Skirt

By Nooshin Mohtashami

flaming skirt

Project Description

Self-expression is a vital part of our social life, and what we wear daily (our clothing, hairstyle, accessories, and cosmetics) is an integral part of that self-expression. The main concept behind this project is using wearable technology to create a new way of self-expression.

Flaming Skirt is inspired by Katniss Everdeen’s wedding dress made with feathers and pearls, based on the book and movie The Hunger Games. When Katniss spins around to show her white wedding dress, the dress turns into flames and burns away to reveal another dress underneath: a black dress with small feathers in the shape of a mockingjay. Revealing the mockingjay dress was an act of rebellion in the story.

The dress worn by Katniss in the movie was designed by Tex Saverio, an Indonesian-born fashion designer based in Jakarta. The effect of the dress turning into flames was done using computer-generated imagery (CGI).

With this project, I plan to create a fake fire illusion using a combination of fabric, programmable light-emitting diodes (LEDs), and fiber optic cables connected to LEDs, all driven by a microcontroller that activates and lights them. The fabric and LEDs are sewn into an existing skirt. The concept behind this project is to change the appearance of clothing (a skirt in this case) as a way of self-expression, to communicate different moods.

flaming_skirt

Concept & Related Works

Our behaviour and mental state shift depending on the clothing and accessories we choose to wear. Research shows that wearing specific articles of clothing impacts our psychology and performance and that clothing has symbolic meaning (Hajo and Galinsky). As sustainability becomes a major focus of many fashion brands, and as they move away from fast fashion, the concept of dynamic fashion becomes an interesting area to explore and experiment with.

Dynamic fashion can be defined as “fashion garments with transformable styles and animated colors or textile patterns that visibly change from the garments’ underlying colors or patterns, and even details, to others and then return to the initial condition after a period” (Kyung-Hee). These dynamic textiles can communicate and interact wirelessly with other technology systems and create colorful moods and settings using LEDs and fiber optic cables. An example is Philips’ Emotional Dress (2006), a prototype garment created by incorporating electronics into the fabric to express the emotions and personality of the wearer. In addition, there is digitalized dynamic fashion that combines images from computer graphics software, such as augmented reality applications, to create the illusion that the garment has other moving graphics or objects when the viewer looks at it through a mobile phone. An example of this concept is “Audrey”, created by Yuchen Zhang: a cropped black top made with neoprene that uses augmented reality to display a “digital aura” of the wearer when viewed through an app.

Recent developments in dynamic fashion have been in creating accessories and jewelry that respond to the wearer’s mood or the surrounding environment, as well as clothes and fabrics that can change color based on external conditions or the wearer’s choice. Some examples include:

‘Air’, a wind-reactive ink, changes color with fluctuations in the air around the body. Created by the London-based artist Lauren Bowker and her material exploration studio THE UNSEEN, it is integrated into the layers of fabric and morphs its RGB values in response to the air pressure around the fabric (Bowker).

Air
Air

Another creation by Lauren Bowker is a luxury backpack and wallet that react to the air pressure, sunlight and wind surrounding the accessory by changing color. For example, when the bag is in a warm environment its color turns blue, or black in cold environments, and so on (The Unseen for Selfridges).

Unseen Backpack

ChroMorphous, developed by researchers at the University of Central Florida, is an e-textile that can be visually altered using a smartphone app. The ChroMorphous fibers contain small conductive micro-wires within the fabric that allow electric current to flow. When current passes through the micro-wires, the color and pattern of the fabric change (http://www.chromorphous.com/).

ChroMorphous Fabric
ChroMorphous Fabric

The relationship between my project and these references is the desire to change the color of a garment, or the appearance of that color, based on the wearer’s mood and specific movement, as opposed to the garment changing color automatically based on external factors. A combination of fabric, fiber optic cables, and LEDs is used to create the illusion of flames, rather than using only LEDs to light up the skirt in different colors.

Parts, Materials, Technical Assets

Parts list

  • Arduino Nano BLE Sense – using its IMU Accelerometer to detect motion
  • Flora Neopixel v2 Addressable LEDs connected in series
  • AZIMOM PMMA Plastic End Glow Fiber Optic Cable 0.75mm
  • Wires
  • Battery pack
  • Skirt (Second hand black skirt found at Value Village)
  • Fabric and thread to sew the LEDs to the skirt
  • Orange and red leaves that look like flames
  • Velcro tape
Flaming Skirt Flowchart
Flaming Skirt Flowchart
Flaming skirt circuit diagram
Flaming skirt circuit diagram

Link to Github code
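For illustration, here is a minimal sketch of how motion could drive a flame flicker on the Neopixels (the actual code is in the repository linked above); the data pin, pixel count, and motion threshold below are assumptions.

#include <Arduino_LSM9DS1.h>
#include <Adafruit_NeoPixel.h>

const int PIXEL_PIN = 6;     // assumed data pin for the chain of Flora Neopixels
const int NUM_PIXELS = 12;   // assumed number of pixels sewn into the skirt
Adafruit_NeoPixel strip(NUM_PIXELS, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  strip.begin();
  strip.clear();
  strip.show();
  IMU.begin();
}

void loop() {
  float ax, ay, az;
  bool moving = false;
  if (IMU.accelerationAvailable()) {
    IMU.readAcceleration(ax, ay, az);          // values are in g
    float magnitude = sqrt(ax * ax + ay * ay + az * az);
    // At rest the magnitude sits near 1 g; spinning or twirling pushes it away.
    moving = abs(magnitude - 1.0) > 0.35;      // threshold is a guess, tune on the body
  }

  if (moving) {
    // Random warm colours approximate a flame flicker through the fiber optics.
    for (int i = 0; i < NUM_PIXELS; i++) {
      strip.setPixelColor(i, strip.Color(255, random(30, 120), 0));
    }
  } else {
    strip.clear();
  }
  strip.show();
  delay(40);
}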

Wearability Assessment

This project takes the form of a skirt worn around the waist. The wearability of this skirt is very important, as too many LEDs can make it feel heavy and might make the skirt slide off the wearer’s waist. Too many LEDs also require a higher electric current, meaning multiple battery packs or a larger one, which would make the electronics worn (in addition to the LEDs) too large. It is important to make the skirt feel comfortable to wear.

Photos of Work in Progress

I tried to connect the Neopixels using different types of connections so that they could easily be incorporated within the skirt. At first, I used copper tape; it was easy to set up, however the connections were not sturdy enough and the LEDs were not lighting up consistently when incorporated within the fabric. I then re-connected the LEDs using wires soldered to the Neopixels. This was sturdier; however, disconnections were still frequent when the garment was worn and moved.

Connecting Neopixels using copper tape
Connecting Neopixels using copper tape
connecting Neopixels using wires soldered to each LED
connecting Neopixels using wires soldered to each LED

 

 

 

 

 

 

 

 

Connecting the Fiber Optic cable to LEDs

picture8picture9 picture10

picture11  picture12

 

Link to the Video

Challenges

  • Sewing LEDs connected in series and Neopixels into fabric was a challenging process for me. Many of the connections kept disconnecting after they were sewn to the fabric, and I had to re-solder them. Flexible Neopixel strands are probably a better choice of lights for this purpose.
  • This made debugging why a series of Neopixels was not working a challenge. I ended up removing two series of Neopixels and staying with the pre-made strip.
  • Using too many Neopixels and LEDs requires a higher current from the power source, and adding a bigger battery made the skirt too heavy to wear so that it kept coming off. I removed the LEDs and kept only the Neopixels for this prototype.

Next Steps

  1. I would like to create a modular and removable system that can be easily connected and disconnected from clothing. This can make debugging the LEDs easier, as well as making the wearable washable without damaging the electronics.
  2. Once the lighting system works, I would like to let the wearer decide how the lighting should change. This could be by creating a simple mobile app to allow the user to select the lighting sequence.

 

Bibliography

Bowker, Lauren. The Unseen. https://www.creativereview.co.uk/creativeleaders50/leader/lauren-bowker/. 2016.

Epp, Felix. “Expressive Wearables.” International Journal of Mobile Human Computer Interaction (2019).

Hajo, Adam and Adam Galinsky. Enclothed Cognition. Evanston: Elsevier, Northwestern University, 2012.

http://www.chromorphous.com/. ChroMorphous. n.d.

Kyung-Hee, Choi. “3D dynamic fashion design development using digital technology and its potential in online platforms.” https://fashionandtextiles.springeropen.com/articles/10.1186/s40691-021-00286-1. 2022.

Sonification of 42 – Ivy Sun (3183268)

42_1_IvySun

PROJECT description

Has anyone ever heard the sound of space?

Ever wondered what the music of the spheres would sound like?

As we all know, sound waves can only travel through a medium. Since there is almost no matter in interstellar space, sound cannot travel through it. Our Universe gives the impression of a profound, eternal silence. But our Universe is not all quiet. Although there is no air in outer space, it is not without sound; human ears simply cannot hear these “sounds” directly. NASA has built machinery that can record electromagnetic vibrations in outer space and convert them into “sounds” that humans can hear.

I accidentally clicked on NASA’s sonification playlist during a sleepless night. It worked surprisingly well: not only did I get a good night’s sleep, I also dreamt of myself floating through space in a spaceship. Thus, this project was born.

NASA Voyager Recordings - Symphonies Of The Planets 3 (1992)

Much of our Universe is too distant for anyone to visit in person, but we can still explore it. Telescopes give us a chance to understand what objects in our Universe are like in different types of light. By translating the inherently digital data captured by telescopes in space into images, astronomers can create visual representations of what would otherwise be invisible to us. But what about experiencing these data with other senses, like hearing? Sonification is the process that translates data into sound. It is the use of non-speech audio to represent information: you take data of some kind and create sound with it. The information is translated into pitch, volume, stereo position, brightness, etc. I like to think of it as finding the music in science.

My final project, Sonification of 42, brings parts of our galaxy, and of the greater Universe beyond it, to listeners. In this project, sound is generated based on the ambient brightness around the wearer. Brighter light is converted into higher-frequency sound, and the tempo varies across six different modes. In Douglas Adams’ The Hitchhiker’s Guide to the Galaxy, 42 is the number from which all meaning could be derived: it is the answer given by a supercomputer to “the Ultimate Question of Life, the Universe, and Everything.”

Sonification of 42 is a headscarf that converts ambient brightness into sound. It has the appearance of galaxies and nebulae, with elements of black holes and air currents. It also contains planets, comets, and asteroids, some with embedded LEDs. An orb over the wearer’s left shoulder acts as a switch that turns the sonification on and changes the rate or rhythm of the sound.

NASA’s Data Sonification research inspired the project. As a beautiful vision of the Universe and everything in the world, I call it Sonification of 42. We human beings are small in our Universe, but imagination is infinite, endless, boundless. Our Universe is always dynamic, just like those childhood nights when we looked up at the stars and were filled with hope and expectations for the future. A project like this takes one back to one’s childhood fantasy of the things beyond the sky. Listening to the mystery, it takes one on a journey through our Universe using sound. The aim of this project is to give people a moment to recover that childhood emotion. Every one of us feels constrained by life and weary of society, and soon forgets that childhood longing for the vast Universe. We should believe in the mysteries of our Universe, look up at the starry sky, and listen to its voice.


PROJECT CONTEXT

Initial Aural & Visual Inspiration:

NASA Voyager Recordings - Symphonies Of The Planets 3 (1992)
What does Saturn sound like from space?
A Universe of Sound
Photo Album - Sonification Collection
Our Solar System
Data Sonification: A New Cosmic Triad of Sound
Data Sonification: Sounds from Around the Milky Way
Jingle, Pluck, and Hum: Sounds from Space
Explore - From Space to Sound
NASA Image of the Day
Miranda - sounds from space
Uranus rings - Sounds from space
The sounds of the Uranus Rings
Uranus - Space sounds

The Meaning of 42:

The answer to life, the universe, and everything

Headscarf Inspiration:

Emotional Beasts Parte Dos: NeuroKnit

Material Inspiration:

A friend of mine, Fiona Sun, from OCAD Material Design, inspired my initial material choices, access to materials, and approaches.

PARTS, MATERIALS, technical assets

The main body of Sonification of 42 is made of felting wool using wet felting. And the diffuser is made of polyester fibre.

Techniques Tutorials:

Wet Felting Ball

How to Make Felt Balls
HOW TO MAKE FELT BALLS (WET FELTING 101)
How to make a lot of felt beads and felt balls fast
Watch this before Wet Felting (Wet Felting 101 - Easy Tutorial)
How to make felt balls with Rachael Greenland

Wet Felting Fabric

Technique Focus Felting - Making Felted Fabric
How To Make Flat Felt With Wool Roving - Basic shapes in wet felting
Tutorial: How to make a flat sheet of wool felt

Parts List:

Circuit Diagram:

42_CD_IvySun

Thoughts on Code:

In my sketch, the measured brightness is mapped to a range between a minimum and a maximum tone frequency. An additional button switches between six different modes, one after the other. When the program starts, a 550 Hz tone plays for 5 seconds. During this time the LDR is calibrated, and the minimum and maximum incidence of light are measured. As soon as the calibration process is finished, the silent mode starts, and you only hear sounds once you have pressed the button.
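The real sketch is linked below; as a simplified illustration of the logic just described, the following assumes an Uno-style board with the LDR on A0, the button on pin 2, and a small speaker on pin 9, and treats the silent state as one of the modes (the exact mode behaviours are assumptions).

const int LDR_PIN = A0;        // assumed light sensor pin
const int BUTTON_PIN = 2;      // assumed mode button pin
const int SPEAKER_PIN = 9;     // assumed speaker/piezo pin
const int NUM_MODES = 6;       // mode 0 is treated here as the silent mode
const int MODE_DELAY_MS[NUM_MODES] = {0, 400, 300, 200, 120, 60};

int lightMin = 1023, lightMax = 0;
int mode = 0;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  // Play a 550 Hz tone for 5 seconds while learning the LDR's range.
  tone(SPEAKER_PIN, 550);
  unsigned long start = millis();
  while (millis() - start < 5000) {
    int reading = analogRead(LDR_PIN);
    lightMin = min(lightMin, reading);
    lightMax = max(lightMax, reading);
  }
  if (lightMax <= lightMin) lightMax = lightMin + 1;  // avoid a zero-width range
  noTone(SPEAKER_PIN);         // start silent until the button is pressed
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {   // each press advances to the next mode
    mode = (mode + 1) % NUM_MODES;
    delay(250);                           // crude debounce
  }
  if (mode == 0) {
    noTone(SPEAKER_PIN);
    return;
  }
  // Brighter light maps to a higher frequency inside an assumed 120-1500 Hz window.
  int reading = constrain(analogRead(LDR_PIN), lightMin, lightMax);
  int freq = map(reading, lightMin, lightMax, 120, 1500);
  tone(SPEAKER_PIN, freq, MODE_DELAY_MS[mode]);
  delay(MODE_DELAY_MS[mode]);
}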

Link to Code on GitHub

WEARABILITY ASSESSMENT

Comfort:
Sonification of 42 is made of soft, skin-friendly materials with a plush feel, so the user has an excellent wearing and tactile experience. It is also very safe to use. Its surface is undulating: the globes are three-dimensional, the black holes are hollow, and every part has a soft texture that invites people to touch it. Its hanging orbs can also be rolled.

Durability:
It is very resistant to dirt and can be spot-washed to keep it clean. It is also very stable thanks to firm stitching, and is not afraid of friction or exposure.

Usability:
I demonstrate it in both day and night usage scenarios; in practice its uses would be broader. It can be worn on or used to decorate the body. Moreover, it fits most daily scenes or sites; the only thing you need is a portable power bank to wake it up. Meanwhile, based on its interactive features and design elements, it is very easy to manipulate or trigger.

Aesthetics:
Everyone thinks of beauty differently. I think this project is truly beautiful with its own personality. It is what the Universe looks like in my mind. Its appearance is solid and flexible. Its design is very simple but childlike. To some extent, it is cute and iconic.

Sources:

Francés-Morcillo, Leire, et al. “Wearable Design Requirements Identification and Evaluation.” Sensors, vol. 20, no. 9, 2 May 2020, pp. 1–28., https://doi.org/10.3390/s20092599.

Gemperle, F., et al. “Design for Wearability.” Digest of Papers. Second International Symposium on Wearable Computers (Cat. No.98EX215), Nov. 1998, pp. 116–122., https://doi.org/10.1109/iswc.1998.729537.

FINAL PHOTOS

42_FP1_IvySun 42_FP3_IvySun 42_FP4_IvySun 42_FP6_IvySun 42_FP7_IvySun 42_FP8_IvySun42_FP11_IvySun 42_FP9_IvySun 42_FP10_IvySun 42_FP11_IvySun


FINAL VIDEO

Sonification of 42 - Final Demo. Please watch the video demo.

PROCESS

42_01_IvySun 42_02_IvySun 42_03_IvySun 42_04_IvySun 42_05_IvySun 42_06_IvySun 42_07_IvySun 42_08_IvySun 42_09_IvySun 42_10_IvySun 42_11_IvySun 42_12_IvySun 42_13_IvySun 42_14_IvySun42_28_IvySun 42_15_IvySun 42_16_IvySun 42_17_IvySun 42_18_IvySun 42_19_IvySun 42_20_IvySun 42_21_IvySun 42_22_IvySun 42_23_IvySun 42_24_IvySun 42_25_IvySun 42_26_IvySun 42_27_IvySun

Sonification of 42 - Test Process. Please watch the test video.

Challenges & Successes

The nebula, the main part of this project, was long and difficult to make. Making the wool felt required great patience and careful attention. I needed to dip and knead it again and again to make the different colours of wool blend together well and to make sure that the hollow part did not make the whole piece fall apart. This process tested my patience and concentration. I actually made a prototype first, but it did not work out very well, and I ended up using a piece of material that I remade a second time. The wet felting of the spheres was not simple either; they were easily deformed and came apart. So, after another failure, I kept only two spheres suspended from my shoulders. The balls on the nebula are foam balls wrapped in wool. But the overall effect turned out to be pretty good, and I am so glad. Besides, the coding part was another challenge for me. I spent a great amount of time debugging and modifying the code and finally got the result I expected. The sewing was a bit tough as well; I pricked my thumb twice in the process. But in the end, I made the project solid and stable.

The production of it makes me really proud. It is like my child, going from intangible to tangible. I have witnessed its growth, and at the same time I have also grown during this process. Indeed, it is very fulfilling to realize a certain vision in your heart step by step. My concentration and patience withstood the challenge of building it. Over these few days, I have not stopped thinking or self-reflecting. Through this project, my sewing skills have been trained. More importantly, not only the concepts taught in this course but also my understanding of the wearable field have risen to a higher level. When I first got started in this field, I was in a panic. I felt that I knew nothing and had to learn a huge amount immediately. However, through several practices, I produced work that expresses what I want to say, which made me feel confident about the future. I started to trust myself and to hold a certain understanding of this field. Now, when I see an amazing wearable project, I will no longer just be envious; I will use the theoretical frameworks I have absorbed to analyze and explore it, and finally internalize it into my own insight. Thanks to this class, I no longer resist using needles or programming, and I have had the chance to appreciate various materials, experience the meaning of interaction, and so on. It always lets me feel the charm of electronic technology together with aesthetics, exploring futuristic bodies and clothing.


Next Steps

My final project is the result of a better understanding of wearability. For a perfectionist like me, it still needs improvement in its details, such as a more rigorous choice of materials, deeper consideration and logic in the design, and neater sewing. But overall, compared to the previous assignments, I have improved a lot. For instance, using more suitable materials to diffuse the light, a more accessible interactive experience, finer stitches, etc., have all made it more complete while basically reaching my preset goal. Actually, I cannot wait to share it with my friends. I also want to take it back to China and show it to my family. My younger brother will absolutely love it.

This project is a test for this semester. The selection of materials and the process of making them was long but enjoyable and fun. As a next step, I will continue to strengthen the concept of data sonification. I might add another input to change the pitch of the sound, not just the frequency and rhythm. I would also like to add more advanced sound filters or multiple waveforms to make the output sound more interesting and lively. In addition, the project uses only the most common LEDs for light, and while the results are good, there is still room for improvement. If I continue to iterate on it, I will choose a light strip or RGB LEDs that are more convenient to control and have a better effect. If possible, the number of spheres hanging from the bottom of the nebula would increase to seven, representing the seven planets other than Earth, and each would control the output of sound and light. Moreover, a sound-visualization component using a projector could be added to create a more immersive scene. I hope to add this project to my future portfolio. I think there is still a way to go.

In the next stage, I will apply the skills I mastered this semester to various fields and combine them with more complex projects. Through the course, I learned that wearable technology is not only thriving in the fashion industry but is also inclusive and can even change the world. All the experience in this class has also opened up new ideas for me, whether for my thesis year or the research direction for graduate studies. At this moment, I feel I am closer to my goal. Everything is laying the foundation for the future. I will keep exploring and researching, designing projects that can address more complicated issues, and be an expresser with connotations. Currently, I define myself as a raw artist; we will see, and you will know.

In the end, I really appreciate Kate and my peers for making me feel so inspired in a group full of creativity and ideas. You all are awesome!

Hope all is well. See you in my future studies!

 

The Veil

Video:

Idea/Inspiration:

This piece is inspired by the veil from the TV show ‘Raised By Wolves’, which allows android robots to screen their emotions while dealing with the outside world. My concept draws from that: one wears the Veil to ‘screen’ their facial expressions. It is a conceptual wearable that could exist in a dystopian setting where, perhaps, workers wear the veil to communicate with customers without any sense of emotion coming in between.

raised1a

Image 1: veil from Raised by Wolves

Wearing it makes the wearer part of a kinetic performance where the movement of the LEDs on the veil takes over the wearer’s expressions, and the interaction between the wearer and the outside world suddenly becomes linear.

Description:

It is a helmet with a translucent face-covering veil attached to the front. The veil has flexible Neopixels on the back side. The Neopixels move and change colour organically according to the accelerometer data. As you speak, the Neopixels become multicoloured, thanks to the data from the microphone. The veil with the Neopixels is quite flexible.

Parts:

  • Bike Helmet
  • Chiffon Cloth
  • Adafruit Circuit Playground Express
  • Adafruit soft flexible wire neopixels strand
  • Alligator clips
  • Battery Pack
  • Tape
  • Velcro straps

img_20220425_142101

Image 2: Parts

Sourcing: I really wanted to recreate the look that Ridley Scott’s costume department had created for the veil in the show Raised by Wolves. I tried to source a sufficiently thick translucent latex sheet from everywhere possible (yup). I even looked at other similar materials like rubber. After wasting a lot of days on this, I decided to use a translucent cloth instead. I went to Queens Textiles in downtown Toronto and decided on a chiffon cloth. It actually worked much better than expected, and the additional flexibility made it easy to fix it to my helmet using a velcro strap.

Link to code: 

https://github.com/AtharvaJ3110/Advance-Wearables/blob/main/The%20Veil:%20Code

Code Description:

For this project I decided to try CircuitPython to program my Adafruit Circuit Playground Express. Since it uses Python, it is much more intuitive and easier to use than the C++ used in the Arduino IDE.

In the first part of the code I set all the variables up, initialize the board, and write functions for the NeoPixel animations. I set up the microphone to read data and normalize it.

In the while loop, I run the microphone and accelerometer to get values in real time. In the if-else statement I put a condition: if the microphone value crosses a certain threshold, all the Neopixels become multicoloured; otherwise the Neopixels run a colour-chase animation where the colour changes according to the accelerometer data.
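The project itself runs the CircuitPython file linked above; purely for illustration, here is a rough Arduino-C++ equivalent of that if-else logic using the board’s onboard pixels via the Adafruit_CircuitPlayground library. The sound threshold and colours are assumptions, and the external NeoPixel strand would be driven the same way.

#include <Adafruit_CircuitPlayground.h>

const float SOUND_THRESHOLD = 68.0;   // assumed dB level that counts as "speaking"
int chaseOffset = 0;

void setup() {
  CircuitPlayground.begin();
  CircuitPlayground.setBrightness(40);
}

void loop() {
  float level = CircuitPlayground.mic.soundPressureLevel(10);   // 10 ms sample
  float x = CircuitPlayground.motionX();                        // tilt drives the hue

  if (level > SOUND_THRESHOLD) {
    // Speaking: give every pixel its own colour (the "multicoloured" state).
    for (int i = 0; i < 10; i++) {
      CircuitPlayground.setPixelColor(i, CircuitPlayground.colorWheel(i * 25));
    }
  } else {
    // Quiet: run a colour chase whose hue follows the accelerometer.
    uint8_t hue = map(constrain((int)(x * 10), -98, 98), -98, 98, 0, 255);
    CircuitPlayground.clearPixels();
    CircuitPlayground.setPixelColor(chaseOffset % 10, CircuitPlayground.colorWheel(hue));
    chaseOffset++;
  }
  delay(60);
}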

Circuit Diagram:

finalproject-sketch

Challenges:

Firstly, I wanted to use a translucent latex sheet to create the veil, to give it a more alien sci-fi appearance with a beautiful diffusion, but latex was really hard and expensive to source in Toronto. I also wanted to stitch the Neopixels onto the veil, but my lack of stitching experience kept me from doing so. (Shout out to velcro straps and tape, hahaha!)