4.3 Capacitive Touch Control Neopixel Jacket

Video: https://youtu.be/4uVq600y3D0

Materials: conductive tape, paper or non-conductive fabric, conductive fabric, copper pins, jumper cables, 5V NeoPixel LED strip, an old jacket, 9V battery

img_20201213_113218

Description: It’s more of a prototype than a swatch: a jacket with capacitive touch sensors on the sleeves. Touching the sensors changes the pattern on the LED strip, which is perfect for techno events or concerts. The battery connection kept coming loose, so in the future I would make sure to have a proper connector for the battery. I would also have loved to use an Adafruit wearable board instead of the Arduino, which would have made it lighter and better suited to being a proper wearable.

Development:
  • I had made a capacitive touch wheel for assignment 4.2; I used the same technique to create three touchpads using conductive fabric and copper wire.
  • I bought a 3-pin 5V LED strip and used the Adafruit NeoPixel library for the code.
  • Connect the capacitive sensors to analog-in pins; the LED strip connects to 5V, GND, and a data pin.
  • Sew the LED strip into the jacket and attach the conductive touchpads to the sleeves.
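The control code boils down to reading each touchpad and switching the strip’s animation. Here is a minimal sketch of that idea, assuming the pads are read as raw analog values against a threshold; the pin numbers, LED count, colours, and threshold are placeholder values, and the actual code is linked below.

```cpp
// Minimal sketch: three fabric touchpads select patterns on a NeoPixel strip.
// Placeholder pins/values; tune TOUCH_THRESHOLD by watching the raw readings.
#include <Adafruit_NeoPixel.h>

const int LED_PIN = 6;                 // strip data pin (placeholder)
const int NUM_LEDS = 30;               // strip length (placeholder)
const int PAD_PINS[3] = {A0, A1, A2};  // one analog-in per touchpad
const int TOUCH_THRESHOLD = 600;

Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);
int pattern = 0;

void setup() {
  strip.begin();
  strip.show();  // start with all pixels off
}

void loop() {
  // Touching a pad couples the wearer's body to the pin and
  // shifts the analog reading past the threshold.
  for (int i = 0; i < 3; i++) {
    if (analogRead(PAD_PINS[i]) > TOUCH_THRESHOLD) {
      pattern = i;
    }
  }
  // One solid colour per pad here; the real project plays animations.
  uint32_t colours[3] = {
    strip.Color(255, 0, 80),
    strip.Color(0, 180, 255),
    strip.Color(80, 255, 0)
  };
  strip.fill(colours[pattern]);
  strip.show();
  delay(20);
}
```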

img_20201213_120712
Link to code: https://github.com/AtharvaJ3110/sensorsandactuators/blob/main/capactitive%20ledstripcontrol

Circuit Diagram:

screenshot-2020-12-14-at-10-26-11-am

Influence:
I am really interested in using conductive fabrics, and I want to do something with them in fashion. I even thought of a brand name: 3021 Cosmic Couture. I wanted to make something cyberpunk. I was also inspired by the actuator ‘Fibre Optic Poetry’ on KOBAKANT (link: https://www.kobakant.at/DIY/?p=7031).

img_20201213_113322

Tools/Techniques:
The main tools were the homemade capacitive sensors made from conductive fabric (https://www.kobakant.at/DIY/?p=6607) and the NeoPixel strip.

I used the Adafruit NeoPixel library to program the NeoPixel strip; it makes controlling NeoPixels very easy.

Review: Parallel World: Camellia (2020) by Tae Nim

Tae Nim’s enchanting AR art installation, Parallel World: Camellia (2020), is one of the most unique explorations of AR as an installation piece, as well as a representation of Tae’s identity. The observer’s background changes to Tae’s beautiful, enchanting artwork Camellia, and we become part of this tree with blooming flowers. A chilly winter soundtrack plays in the background once you press the mouse, a beautiful reminder of the hardships of life. Our outside world is reflected inside as part of a new identity.

capture3-768x432

This piece is heavily based on your identity. What is the relationship between your piece Camellia and your identity?

Camellia is the strong side of me. I had to help my mom’s business; I live alone and pay my own rent, for which I work two jobs. Currently, I am focusing on using Camellia as a symbol in my work to express my willingness to overcome hardships and bloom my potential. Significantly, the camellia is a flower that blooms after it survives the harsh winter, so I found a lot of similarities: I feel like my season is still winter, the time when I have to practice patiently so that my beautiful flower can bloom. In conclusion, Camellia = my strong identity.

What was the main theme for your piece?

“Inside/outside”: people compare the outside world that they can see through the mirror with the inside world that they can see through the camera of the computer in front of them. The computer-generated images from the AR make them feel like they are in a different world; that is why I chose the title Parallel World, a world people can transform into my identity through a visual representation of the camellia trees and flowers that I drew. I thought this work would be an exciting form of communication where people can explore their outside world and my inside world at the same time.

I find the piece very enchanting; I love how my room gets completely transformed into the beautiful artwork of Camellia. How did you integrate the background into the piece? Did you use any other software?

For the background, I added a two-dimensional picture on paper. When you directly apply a two-dimensional photo in the program, it sits on the Orthographic Camera. First, I had to erase the background of the image using Photoshop. Second, I applied it in the program, but the photo appears in front of your face. To put it behind you, you have to use Portrait Background Segmentation on the Mask texture; you can easily find this feature by pressing the plus button in the left corner.

Why the winter theme with the soundtrack?

I decided to use Play Sound since I wanted to express the setting of my inner parallel world through winter, since there are more hardships that I have to go through. In conclusion, I had to go through many technical difficulties, and it took time to get used to this program, but I had a clear goal for the project, and that became my motivation to spend the time solving the technical issues one by one through research and trial.

Normally AR is part of an application, or the viewfinder is the main stage, but you went for an art installation instead. Why?

In order to bloom my flower, I tried so hard. After getting into university, however, I felt a kind of emptiness in my heart. I started to question why I am here and what my goals are, but my life was too busy to think about these things. This assignment gave me an opportunity to delve deep and ask the question, “What does Tae want to do?”, and I wanted to express my exploration effectively. I had witnessed art installations at the AGO, and I knew this was how I wanted to express my art.

Conclusion

I think Tae Nim’s piece is a very interesting study of the relationship between the environment, inner identity, and the outside observer. The addition of AR to an art installation adds a whole new layer of digital space that transforms the outside environment into an enchanting interior of the installation. The mirrors on the sides of the frame add to the overall reflective quality of the work. Unfortunately, I could not witness this installation with my own eyes because of the COVID-19 pandemic, but I would have loved to explore the piece in its entirety. I did get to explore the AR aspects of the work, and I am very impressed with the implementation and immersiveness of Tae’s artwork Camellia as the background, and with the subtle symbolism that you slowly come to realise.

{Cellular Shadow}

CELL DIVISION

screenshot-2020-10-27-at-7-57-12-pm

Screenshot of a single cell dividing from my project ^

We know that cells are the building blocks of life. A lot of complex processes go on inside them that we cannot see in our daily lives, yet most of these processes are the result of simple rules. For this project I focused on one such process: cytokinesis, or cell division, where after meiosis or mitosis (in which genetic information is copied or redistributed) the cell divides into two identical cells. It is a fascinating process that allows for wonderful things like self-healing and reproduction. During zygote development, a whole variety of the complex 3D geometry of life emerges from simple rules, with internal parameters (chromosomes) governing the division of the base zygote, which in humans is spherical. Our bodies emerged from a simple sphere! The specific rules that give rise to complexities like the life we see (and are) remain obscure and are still being researched, but we have been successful in finding our own algorithms (Conway’s Game of Life) and even some that reproduce nature’s own geometry (Turing patterns). This is the main inspiration behind my project.


RULES

My project is an interactive piece made in TouchDesigner entirely using TOPs (Texture Operators). The rules of my system are a combination of two different automata systems: Turing patterns and the edge-detection algorithm used in TouchDesigner’s Edge TOP. Here’s a brief explanation of how these two work:

Turing Patterns:

Alan Turing, the legend of computing himself, proposed that shapes and patterns found in nature can arise naturally and autonomously from a uniform state. They arise from simple rules of reaction and diffusion, where two substances (mostly solutions) first diffuse (mix into each other) and then react due to local differences. The reaction is governed by certain weights and parameters, or is simply chemical in nature.

screenshot-2020-10-27-at-7-36-38-pm

Edge TOP

The Edge TOP finds edges in an image and highlights them. For each pixel, it looks at the values of neighbouring pixels; where the differences are greater than a threshold, the output value is higher. Add a feedback loop to this operation and you create an automata system.
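As a rough illustration of what that per-pixel operation amounts to (a sketch in plain C++ rather than TOPs; TouchDesigner’s actual kernel may differ), here is neighbour-difference edge detection on a grayscale buffer:

```cpp
#include <cmath>
#include <vector>

// Neighbour-difference edge detection on a grayscale image: where local
// differences exceed the threshold, the output pixel gets a higher value.
std::vector<float> edges(const std::vector<float>& img, int w, int h,
                         float threshold) {
  std::vector<float> out(img.size(), 0.0f);
  for (int y = 1; y < h - 1; ++y) {
    for (int x = 1; x < w - 1; ++x) {
      float c  = img[y * w + x];
      float dx = img[y * w + x + 1] - c;    // horizontal difference
      float dy = img[(y + 1) * w + x] - c;  // vertical difference
      float mag = std::sqrt(dx * dx + dy * dy);
      out[y * w + x] = (mag > threshold) ? mag : 0.0f;
    }
  }
  return out;
}
```

Feeding this output back into the next frame’s input is what turns a one-shot filter into an evolving automata system.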

The combination of these two operations gives rise to the cell-division-like phenomena of my project. This type of phenomenon was first observed in the Gray-Scott model of reaction-diffusion, but in this version the strength of the Edge TOP acts as a control for the cell division, which allows us to control the overall chaos of the system.
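For reference, the Gray-Scott model mentioned above is usually written as the pair of reaction-diffusion equations

$$\frac{\partial u}{\partial t} = D_u \nabla^2 u - uv^2 + F(1 - u)$$

$$\frac{\partial v}{\partial t} = D_v \nabla^2 v + uv^2 - (F + k)v$$

where u and v are the concentrations of the two substances, D_u and D_v their diffusion rates, F the feed rate, and k the kill rate; the self-replicating “cell division” spots appear only in a narrow band of F and k values. My system swaps the explicit reaction term for the Edge TOP’s response, so these equations are the reference model rather than what runs in the network.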

PROCESS

I started by creating a simple reaction-diffusion program, simulating the process of reaction and diffusion with a technique that involves blurring and then sharpening the image on every iteration; this can be done using feedback loops. You can read more about this technique here: https://www.researchgate.net/profile/Andrew_Werth/publication/280626953_Turing_Patterns_in_Photoshop/links/55bfef9908ae092e9666a3ce.pdf
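Here is a minimal sketch of that blur-then-sharpen feedback loop in plain C++ on a float grid (rather than TOPs); the grid size, gain, and iteration count are placeholder values to tune:

```cpp
#include <algorithm>
#include <cstdlib>
#include <vector>

// One reaction-diffusion step via the "Turing patterns in Photoshop" trick:
// blur the grid (diffusion), then over-sharpen it (reaction), then clamp.
void blurSharpenStep(std::vector<float>& g, int w, int h, float gain) {
  std::vector<float> blur(g.size(), 0.0f);
  for (int y = 1; y < h - 1; ++y)
    for (int x = 1; x < w - 1; ++x) {
      float s = 0.0f;
      for (int dy = -1; dy <= 1; ++dy)      // 3x3 box blur = diffusion
        for (int dx = -1; dx <= 1; ++dx)
          s += g[(y + dy) * w + (x + dx)];
      blur[y * w + x] = s / 9.0f;
    }
  for (size_t i = 0; i < g.size(); ++i) {
    // Unsharp mask: push each cell away from its neighbourhood average.
    // Iterating this with clamping settles into spots and stripes.
    float sharpened = blur[i] + gain * (g[i] - blur[i]);
    g[i] = std::clamp(sharpened, 0.0f, 1.0f);
  }
}

int main() {
  const int w = 128, h = 128;
  std::vector<float> grid(w * h);
  for (auto& v : grid) v = rand() / float(RAND_MAX);  // noisy seed
  for (int i = 0; i < 200; ++i) blurSharpenStep(grid, w, h, 4.0f);
}
```

In TouchDesigner the same loop can be built from a Blur TOP and a sharpening step inside a feedback chain.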

screenshot-2020-10-27-at-7-23-38-pm


Then I added an Edge TOP, plus a few other gradients and control parameters using Ramp and Level TOPs, and eventually got a system that distributes itself uniformly to stabilise.

screenshot-2020-10-27-at-7-45-25-pm

screenshot-2020-10-01-at-10-46-33-pm

Video: https://ocadu.techsmithrelay.com/Zgzg

Exploring the different parameters of the system reveals different forms and properties, like the ability to replicate, to send oscillatory information as waves, and to form a central structure like a nucleus. Here are examples of three behaviours:

screenshot-2020-10-27-at-7-49-14-pm

Video: https://ocadu.techsmithrelay.com/0qG5

Forming hexagonal arrangements and realistic division resembling plant and animal cells.

screenshot-2020-10-27-at-8-10-06-pm

Video: https://ocadu.techsmithrelay.com/wMyx

Individual oscillations seem to create their own automata reactions, producing collective wave-like oscillations throughout the system that make the cells look like they are communicating.

screenshot-2020-10-27-at-8-09-20-pm

Video: https://ocadu.techsmithrelay.com/F5W8

A single cell oscillates unpredictably and looks like it is visibly struggling to divide, like a living organism.

Interactivity

I do not have a Kinect or any depth sensors, so I just used a videoFileIn + Level TOP to map out the edges using contrast and other adjustments, and overlaid the cells directly on this input.
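Roughly speaking, the Level TOP adjustment amounts to a per-pixel remap of brightness so that only the strong contrast edges survive as a mask; a sketch, with placeholder break points:

```cpp
#include <algorithm>

// Levels-style remap: brightness below `low` maps to 0, above `high` to 1,
// and values in between stretch linearly, turning camera input into a mask.
float levels(float v, float low, float high) {
  return std::clamp((v - low) / (high - low), 0.0f, 1.0f);
}
```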

screenshot-2020-10-27-at-9-32-18-pm

The cells take the shape of the organism, human, or object in front of the camera. While this is not exactly how cell division works in real life, seeing these cells arrange themselves to fit any given geometry is fascinating. These cells really do behave like very simple organisms. What. Is. Life.

I have included the TouchDesigner file below for anyone who wants to explore this piece.

https://drive.google.com/file/d/1NDxK0fnxDg5_Uz1Xc5iZzGgIHX96LTyw/view?usp=sharing

THANK YOU 🙂

Audio-Visual Orchestra

DIGF-2014-001 Atelier I: Discovery

Atharva Jadhav

Kanav Arora 

Dharini Kapoor 

screenshot-2020-09-28-at-11-42-08-pm


Final Codes: 

KeyboardController: https://editor.p5js.org/jatharva2000/sketches/cgBrRB
Shared Space: https://editor.p5js.org/jatharva2000/sketches/CJsnUuvvD

Concept and Description of the Project: We spend most of our time chatting in groups and on video calls, sharing visuals in the form of images, filters, and emoticons. Something that can be equally fun and creative, especially when you have a lot of people together in a shared space (online or offline), is to have everyone take part in an impromptu orchestra. Everyone is communicating from home, mainly over group video calls, with a screen in front of them. Just like the textbook shows how to program sounds on a Commodore 64 console, the idea is to program a musical instrument/soundboard and a visual shared space using p5.js.

It has different sounds linked to different button inputs or different areas on the screen, and it also lets users play with other variables, like pitch, to produce unique sounds. The sounds and the types of sounds trigger a visual on the screen; the visuals and audio are seen and heard by everyone taking part, and all the sounds and visuals from the different computers together create an audio-visual orchestra. The sounds can be customised in p5.js, so arrangements with custom sounds can also be made.

The major task was to have a foundation for the music and the visuals so that the whole thing does not seem overly ‘random’, even if everyone is pressing random buttons. This problem was solved by mapping the position of the visual object to the keys played by the player. The instruments act like controllers, whereas the visuals are part of a shared space.
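To give a flavour of the kind of mapping involved, here is a small, hypothetical key-to-pitch helper (not the project’s actual code, which is linked above): each key index maps to a note frequency through the standard equal-temperament formula.

```cpp
#include <cmath>

// Hypothetical helper: map a key index (0 = middle C) to a frequency in Hz
// using 12-tone equal temperament: f = 440 * 2^((n - 69) / 12),
// where n is the MIDI note number and note 69 is A4 = 440 Hz.
double keyToFrequency(int keyIndex) {
  int midiNote = 60 + keyIndex;  // MIDI note 60 is middle C
  return 440.0 * std::pow(2.0, (midiNote - 69) / 12.0);
}
```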

The project consists of one working instrument (player 1) and a shared space that receives and plays the notes. People can play on the piano controller and listen to everyone’s audio being played collectively in the shared space. The shared space also features a visual game where the aim is to play music with friends in a way that does not let the floating shape go out of bounds. If the shape goes out of bounds, it is game over, though there is still a chance of recovery; if the shape is lost out of bounds, the game is over for real.

The shared space and piano are programmed using p5.js. 

Player Interaction and Screenshots:

untitled_artwork-13


screenshot-2020-09-28-at-11-37-16-pm


screenshot-2020-09-29-at-12-56-06-am

Final Project Video:

https://ocadu.techsmithrelay.com/rYiJ 

In the video, the shape is being controlled by one player as he plays notes to keep the shape in bounds. Another player joins mid-game (indicated by the change in shape and speed), and it becomes a fight to keep the shape in bounds. Subsequent players have two roles: they can help keep the shape in bounds (working together) or try to force the shape out of bounds (working against each other).

Development and UI

The first process: creating two basic working piano prototypes, one of which would be finalised as the controller for the shared space. The prototypes were built in p5.js.

Source code prototype1: https://pastebin.com/HDS9XECw 

Source code prototype2: https://pastebin.com/HDS9XECw 

Screenshots:

 Prototype 1: 

Prototype 2:

The second process: one working instrument (player 1) and a shared space that receives and plays the notes. People can play on the piano controller and listen to everyone’s audio being played collectively in the shared space. The shared space and piano are programmed using p5.js. The final instrument was based on prototype 1.

Source code:

Piano Controller Code:

https://github.com/AtharvaJ3110/programs/tree/master/PianoController1_2020_09_25_01_50_32 

P5.js Code:

https://editor.p5js.org/jatharva2000/sketches/cgBryZfRB 

Screenshot:

The third process: creating the orchestra’s shared space, where the sound is played in the shared space after input is given on the controller, and where the key value of each note is shared and used to generate the visuals.

 Source Code:

Shared Space code:

https://github.com/AtharvaJ3110/programs/tree/master/OrchestraSharedSpace_2020_09_25_01_17_59 

P5.js Code:

P5.js: https://editor.p5js.org/jatharva2000/sketches/SLN4AdibU 

Shared Space Video:

https://ocadu.techsmithrelay.com/wy0H

The fourth process: the last and final process consisted of producing meaningful visual interaction in our little audio-focused project. There were problems getting multiple shapes to correspond to different players, so we went with one shape controlled by multiple players and based the game on how hard it is to steer a single shape across the canvas with multiple controllers while simultaneously trying to create music. The final game consists of controlling one shape together with the other players and making meaningful music out of the chaotic struggle of keeping the shape within the canvas bounds. When a new player joins, the shape changes, and the more players there are, the faster and harder the challenge. The game is really fun when a conscious effort is made to keep the music… “musical”. Players assume one of two roles in this game (a sketch of the control loop follows the list below):

  • Working together to keep the shape in bound 
  • Working against each other to throw the shape out of bounds
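Here is a rough sketch of that shared-shape control loop (the real implementation is the linked p5.js code; the names, push values, and bounds here are hypothetical):

```cpp
#include <cstdlib>

// Hypothetical sketch of the shared-shape game: every note played nudges the
// shape, nudges scale with the player count, and the game ends when the shape
// drifts out of the canvas bounds.
struct Shape { float x = 400, y = 300, vx = 0, vy = 0; };

const float CANVAS_W = 800, CANVAS_H = 600;

void onNotePlayed(Shape& s, int keyIndex, int playerCount) {
  // Map the key to a horizontal push: low notes push left, high notes right.
  float push = (keyIndex - 6) * 0.4f;  // assumes a 12-key controller, centred
  s.vx += push;
  // More players make the shape livelier and harder to keep in bounds.
  s.vy += ((rand() % 100) / 100.0f - 0.5f) * playerCount;
}

// Returns false once the shape has escaped the canvas (game over).
bool update(Shape& s) {
  s.x += s.vx;
  s.y += s.vy;
  return s.x >= 0 && s.x <= CANVAS_W && s.y >= 0 && s.y <= CANVAS_H;
}
```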

Wireframe for planned game:

untitled_artwork-1

The planned game was going to be a complete orchestra of audio-reactive shapes, with new shapes arriving as new players joined and reacting to their sounds, and with the different sounds from each controller setup interacting with a corresponding shape in a group visual party session in different arrangements.

Textbook used:


Make Your Commodore 64 Sing by Ed Bogas

Link: https://archive.org/details/Make_your_Commodore_64_Sing/page/n45/mode/2up