The higher you fly, the harder you fall!

Project Description:

This is a kinetic art installation that evokes the feeling of being elevated by pride. It extends non-verbal communication from the wearer to their surroundings. Three capacitive touchpads and a do-it-yourself switch are coded to activate the NeoPixels and the wings respectively. The project is a surreal mix of fantasy and feeling in the forms it is built from. The golden wings represent ‘a feeling of flight’ (ambition), while the braided garland, with the Adafruit Circuit Playground framed in golden fabric, is made to resemble a medal, conveying someone who floats on cloud nine, proud of their achievements, snobbish, and carrying a sense of superiority. The DIY switch (golden steel wool) can be used to pause the wings, which are driven by two servo motors attached to the back. The three capacitive touchpads are coded in rising levels of intensity of audio and light: A1 represents pride at the lowest intensity and A3 the highest. An Arduino Nano 33 IoT activates the golden wings, and the Adafruit Circuit Playground drives the music and light play. An indigo, golden yellow, cream white and magenta colour palette was chosen to make the installation look aristocratic, grand and royal. Glass jewel stickers and string lights are included for additional aesthetic appeal.

Final photos: 


Parts List: 

For the Garland (Front) 

  1. Adafruit Circuit Playground, Creatron Inc.
  2. String Lights, Michaels
  3. Steel Wool, tools cupboard, Digital Futures Studio
  4. Conductive Fabric, distributed in class
  5. Net Fabric, Michaels (shared by a friend)
  6. Felt Fabric, Dollarama
  7. Fur Shawl, Dollarama
  8. Jewel Stickers, Dollarama

For the Wings (Back)

  1. Servo Motors, Creatron Inc.
  2. Arduino Nano 33 IoT, Creatron Inc.
  3. Alligator Clips, Creatron Inc.
  4. Conductive Thread, Creatron Inc.
  5. USB Cord
  6. Breadboard, Creatron Inc.
  7. Golden Sheets, Dollarama
  8. Acrylics, scraps from past projects
  9. 3D-printed mounts to attach the servos

Schematic Diagram:

Adafruit Circuit Playground

Arduino Nano 33 IoT

Coding:

Activating The NeoPixels – Adafruit Circuit Playground

Activating the Golden Wings with Servo

https://makecode.com/_erj9PpPXWVa9

Process photos and notes:


I chose classy, dark colours for an aristocratic and rich look. The fur shawl is stitched together in the middle to give it a majestic form. The wings are made using origami, the Japanese art of paper folding. I used steel wool for the first time because of its conductivity and because its appearance complements the project’s overall aesthetic; otherwise I could have used any other conductive material. The net fabric is very flimsy, so I had to layer it up and braid the cloth to add weight and keep it from flying off. I had a difficult time pasting and mounting the servos on the back. To attach them to the wings, I had to add hard acrylic to support the edges. The acrylic was double-taped on every side for stability and easy movement of the wings.

Lessons learnt:

  1. Steel wool: I used it for the first time, for its aesthetic value and conductivity. But it is difficult to shape because it breaks easily and needs careful handling.
  2. Microsoft MakeCode: block-based programming that is very good for quick lighting interactions, easy to navigate and fun to use. I used it for its flexibility in creating musical notes and lighting animations. But it is very limited in terms of library support, servo motor coding and defining specific colours outside its given gamut. Also, relatively few projects have been built with it; it seems to be fairly new software.
  3. Use a power bank; it is more reliable than batteries.

 

Video:

Flapping Wings, Does not Fly (Close-Up)

Flapping Wings, Does not Fly (Establishing Shot)

Project Context

Ideation: 

The idea of humans flying has fascinated people throughout history, from Leonardo da Vinci to Daedalus, the craftsman of Greek mythology.

When I was a child, my mom told me the story of Icarus, whose father made him wings from wax and feathers to escape from prison. His father, Daedalus, warned him not to fly towards the sun, as it would melt the wax and the feathers would come apart. But Icarus did not listen; he grew ambitious and flew towards the Sun to possess it. His father could not stop him, the wax melted, and Icarus fell to his death in the sea. The story is a perfect illustration of the proverb ‘Pride comes before the fall’. I wanted to represent this concept: an individual who is overproud and blinded by his achievements is headed for a great fall. The higher one rises, the harder they fall. Seeing Monarch V2 gave me the idea of expressing this through a kinetic wearable with wings on the shoulders.

Exploration:

I tried making the wings with different weights of paper. With 70 gsm (copier paper), the wings fell off and tore easily, and with a higher gsm the paper would not fold and the servos were unable to move it. Finally, I found the right gsm and created my golden wings. To place them, I needed to tape them on both sides for stability and firmness. The servos were mounted on the 3D-printed mounts, but I still had to stitch them on to make sure they would not fall off while the wings flapped.

For the garland, I wanted to use the net fabric, but since it spreads and creases easily, I braided it to keep it from slipping off the chest of the Judy mannequin. To secure the Adafruit Circuit Playground, I stitched it on top of a circular yellow felt patch so that it resembles a golden medal. The garland symbolically represents the achievement of the wearer.

Coding:

I tried coding the servos to 180 degrees at first, but they got stuck; at 90 degrees the movement did not look aesthetically pleasing. So I coded the sweep between 120 and 140 degrees, which served the purpose very well and visually communicated my concept. For the Adafruit Circuit Playground, I used Microsoft MakeCode for Adafruit, which is fun and easy but limited when it comes to specifying exact colours. I composed some of my own musical notes. I used the jewel stickers on the fur shawl for a royal touch.
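For reference, below is a minimal Arduino-style sketch of the wing behaviour described above (the wings were driven by the Arduino Nano 33 IoT; the NeoPixel and sound side was built in MakeCode, linked earlier). The pin numbers and timing here are illustrative assumptions, not the values from the actual build.

```cpp
// Hypothetical sketch approximating the wing behaviour described above:
// two servos sweep between 120 and 140 degrees, and a DIY steel-wool
// switch (wired as a simple digital input) pauses the flapping.
// Pin numbers are assumptions, not taken from the original build.
#include <Servo.h>

const int LEFT_WING_PIN  = 9;
const int RIGHT_WING_PIN = 10;
const int STEEL_WOOL_PIN = 2;   // DIY switch to ground, using the internal pull-up

Servo leftWing;
Servo rightWing;

void setup() {
  leftWing.attach(LEFT_WING_PIN);
  rightWing.attach(RIGHT_WING_PIN);
  pinMode(STEEL_WOOL_PIN, INPUT_PULLUP);
}

void loop() {
  // Pause the wings while the steel-wool switch is pressed (reads LOW).
  if (digitalRead(STEEL_WOOL_PIN) == LOW) {
    return;
  }
  // Sweep up from 120 to 140 degrees and back down.
  for (int angle = 120; angle <= 140; angle++) {
    leftWing.write(angle);
    rightWing.write(180 - angle);  // mirror the opposite wing
    delay(15);
  }
  for (int angle = 140; angle >= 120; angle--) {
    leftWing.write(angle);
    rightWing.write(180 - angle);
    delay(15);
  }
}
```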

I used steel wool for the first time as a conductive switch, since it can be shaped like a physical button to turn the wings on and off. But I do not suggest using it: it is highly unreliable and breaks easily, and it needs to be handled with the utmost care.

Execution:


For the final assemblage, I tied string lights around the Judy mannequin for additional beauty and intensity. I tried using a darker room, since I had incorporated a play of NeoPixels, but then the wings were invisible. Since a darker room defeated the purpose of communicating the message, I decided to show my work in a well-lit room.

References:


  1. Hartman, K., Knipping, A., Kourtoukov, B., Colpitts-Campbell, I. and Lewis, E. (n.d.). Monarch V2. [online] SOCIAL BODY LAB. Available at: http://socialbodylab.com/monarch-v2/ [Accessed 22 Feb. 2020]. An interesting project that gave me the idea of exploring kinetic wearables and of how wings could be attached to the shoulders. Monarch V2 expands and contracts according to the muscle movements of the wearer.
  2. Wipprecht, A. (2020). Anouk Wipprecht FashionTech. [online] Anouk Wipprecht FashionTech. Available at: http://www.anoukwipprecht.nl/#intro-1 [Accessed 2 Mar. 2020]. I read about the famous ‘robotic Spider Dress’ in the ‘Design Framework For Social Wearables.pdf’ shared by Kate, and researched further how it marks the personal space of the user. It is fitted with proximity and respiration sensors and extends its six legs when another person walks towards the wearer.

 

Digi-Cart 3.0

Experiment 4 – Katlin Walsh

Project Description

While interactive media content displayed within galleries has been updated within the last 5-10 years, presentation formats for tradeshows have not. Digi-Cart brings an adaptive presentation style to the classic concept of a tool cart. Robust building materials and pegboard construction allow corporations to adapt their layout and presentation style to reflect their current corporate event.

Digi-Cart features a basic controller layout which can be overlaid with a company’s vinyl poster cutout to create an interactive presentation that can be facilitated by an expert or self-guided. Corporations are encouraged to update their digital materials and create animated graphics to capture audience attention.


Digi-Cart 2.0

Experiment 5: Katlin Walsh

Project Description 

While interactive media content displayed within galleries has been updated within the last 5-10 years, presentation formats for tradeshows have not. Digi-Cart brings an adaptive presentation style to the classic concept of a tool cart. Robust building materials and pegboard construction allow corporations to adapt their layout and presentation style to reflect their current corporate event.

Digi-Cart features a basic controller layout which can be overlaid with a company’s vinyl poster cutout to create an interactive presentation that can be facilitated by an expert or self-guided. Corporations are encouraged to update their digital materials and create animated graphics to capture audience attention.

Digi-Cart 1.0

Experiment 3 – Katlin Walsh

Project Description 

While interactive media content displayed within galleries has been updated within the last 5-10 years, presentation formats for tradeshows have not. Digi-Cart brings an adaptive presentation style to the classic concept of a tool cart. Robust building materials and pegboard construction allow corporations to adapt their layout and presentation style to reflect their current corporate event.

Digi-Cart features a basic controller layout which can be overlaid with a company’s vinyl poster cutout to create an interactive presentation that can be facilitated by an expert or self-guided. Corporations are encouraged to update their digital materials and create animated graphics to capture audience attention.


OCULAR


Ocular
Animated Robotics – Interactive Installation

By Jignesh Gharat

Project Description

An animated robotic motion with a life-like quality, bringing emotive behaviour into the physical dimension: an object whose movement suggests living qualities such as sentiment, emotion and awareness, revealing a complex inner state, expressions and behavioural patterns.

He is excited to peek outside the box and explore his surroundings, but as soon as he sees a person nearby he panics and hides back in the box, as if he is shy or feels unsafe. He doesn’t like attention but enjoys staring at others.

Design Process & Aesthetics

It is an attempt to create an artificial personality. I wanted to work on an installation that encourages participation rather than spectatorship, a physical installation that lets people experience familiar objects and interactions in refreshingly humorous ways.

The exploration started with objects and living organisms that could be used as part of the installation, where I could implement curious interactions and a moment of surprise and unpredictability. Gophers, crabs, snails and robots were a few of the candidates. I finally settled on the periscope as an object that could have behaviour and emotions, and a perfect object to play hide-and-seek with visitors.

What is a periscope?
An apparatus consisting of a tube attached to a set of mirrors or prisms, by which an observer (typically in a submerged submarine or behind a high obstacle) can see things that are otherwise out of sight. I started thinking about ideas, object behaviours and the setup of the installation to come up with engaging and playful interactions.

Step 1 – Designing the bot

The model was developed in Rhino, keeping in mind the moving parts: the rotating head and the body, which is lifted from the bottom.


Step 2 – The mechanism.

I tested a stepper motor (28BYJ-48 5V 4-phase DC gear stepper motor with ULN2003 driver board, for the head) and servos (an SG90 micro servo for the head and a 3001 HB analog servo for the arm). The stepper’s RPM was too low to give the bot the quickness and reflex I wanted, so I decided to go with the servo motors, which had better torque.


The 3D-printed body was too heavy for the analog servo motor to lift, so I finally decided to model the body in paper to reduce the weight and the load on the motor. The surface development was done in AutoCAD, producing three to four options based on the mechanism and aesthetics, of which I decided to work with the design shown in the image below. The arm was laser-cut in acrylic, and two options were made to reduce the friction between the paper and the lifting arm’s contact surface.


 

How does it Work?

A micro servo (motor 1) controls the oscillation of the head. A distance sensor at the front of the wooden box controls motor 2. When a visitor comes within the sensor’s set threshold, motor 2 pulls the lever arm down and motor 1 (the head) stops rotating.
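A rough sketch of this behaviour is shown below. It is not the actual Ocular code (see the GitHub repository linked further down); it assumes an HC-SR04-style ultrasonic sensor and uses placeholder pins and angles.

```cpp
// Rough sketch of the behaviour described above, not the actual Ocular code.
// Assumes an HC-SR04-style ultrasonic distance sensor; pins, threshold and
// angles are placeholders for illustration only.
#include <Servo.h>

const int TRIG_PIN = 7;
const int ECHO_PIN = 8;
const int HEAD_PIN = 9;          // motor 1: oscillates the head
const int ARM_PIN  = 10;         // motor 2: lifts/lowers the lever arm
const long THRESHOLD_CM = 60;

Servo head;
Servo arm;
int headAngle = 0;
int headStep = 2;

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // timeout: no echo = far away
  return duration == 0 ? 999 : duration / 58;      // convert microseconds to cm
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  head.attach(HEAD_PIN);
  arm.attach(ARM_PIN);
}

void loop() {
  if (readDistanceCm() < THRESHOLD_CM) {
    arm.write(0);                // visitor nearby: pull the lever arm down...
    // ...and stop oscillating the head (simply skip the sweep).
  } else {
    arm.write(90);               // no one close: lift the body back up
    headAngle += headStep;       // keep the head scanning back and forth
    if (headAngle >= 180 || headAngle <= 0) headStep = -headStep;
    head.write(headAngle);
  }
  delay(20);
}
```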


The installation is minimal and simple, using just two materials, wood and paper, which give the piece a clean finish. The head is framed with a black sheet and a white border, as if the sensors that control the oscillation were installed inside the head. In further explorations I will make sure the sensor is not directly visible to visitors, as this sometimes leads to visitors interacting with the sensor rather than actually experiencing the installation.


 

Code :  https://github.com/jigneshgharat/Ocular

 

Project Context

Opto-Isolator (2007), interactive eye robot by Golan Levin with Greg Baltus, inverts the condition of spectatorship by exploring the questions: “What if artworks could know how we were looking at them? And, given this knowledge, how might they respond to us?” The sculpture presents a solitary mechatronic blinking eye, at human scale, which responds to the gaze of visitors with a variety of psychosocial eye-contact behaviours that are at once familiar and unnerving.

The Illusion of Life in Robotic Characters – Principles of Animation Applied to Robot Animation. Robot animation is a new form of animation with significant impact on human-robot interaction; the video relates to and extends a paper the authors published at the ACM/IEEE Human-Robot Interaction Conference.

Exhibition Reflection

Observations were made and recorded at the Open Show at the Graduate Gallery and the Centre for Emerging Artists & Designers, which provided good insights and lessons about changes needed to develop the concept further. It was fun to see people interacting with and playing around a nonliving object as if it had emotions and feelings, and to watch the way Ocular reacted to viewers’ actions. The interaction was supposed to happen only from the front, but because the wall card was placed at the side of the installation, on a high plinth, people tended to read it and start interacting from the side, so they did not experience the installation as intended; most of them, however, figured out how it worked once they read the wall card.
Some interesting comments from the visitors:

  • The Installations is making everyone Smile.
  • I have 3 cats and all 3 cats will go crazy if they see this.
  • One visitor said all the guys react the same way Ocular does, so she doesn’t want to go near him.
  • What are the next steps?
  • What was the inspiration?
  • How does this work?
  • Why does he ignore me?

A visitor (Steventon, Mike) referred to Norman White’s interactive robotic project, The Helpless Robot.

Observations and visitors reactions


Reference

Interactive Canvas – Bring It To Life


 

Project Overview

 

Today there is a great deal of competition, and advertisers are continuously looking for new interactive and ambient mediums for advertising. This project explores a more traditional yet interactive approach, using an ordinary canvas to bring three abstract icons to life upon touch, around the idea of how a healing hand could protect wildlife such as a butterfly and a tree. Touch isn’t limited to flat, shiny and expensive digital screens; it can be brought to any tactile surface, as in this project. The use of conductive ink and animation lets visitors experience and feel the dotted paper and watch flat objects change into animation. Moreover, such an advertising medium is economical and could be placed almost anywhere.

This object is part of my exploration of conductive materials and how they could be adapted to advertising media that tell interactive stories.


Ideation:

The idea behind this project was to explore ways of bringing art pieces to life in interactive and experiential ways. Although there are many ways to do this today, such as VR and AR, I was keen to discover something more traditional and familiar. For this project, I therefore explored conductive ink and how capacitive sensors could make a tactile surface interactive.

As a result, I came across a variety of work in this field. This included the work at London’s Retail Design Expo [1], which used conductive ink to draw flat icons for storytelling. Another interesting concept, an interactive wallpaper, was introduced by High-Low Tech [4], adding light and communication within the canvas. Similarly, the Dublin Interactive Wall [2] explored conductive ink with more advanced animation. Conductive ink has also been used in combination with holograms and AR, such as the work from Raonsquare [3], an agency in Seoul.

 

Design Concept

To bring commonly known symbols and icons to life, I chose a hand, a butterfly and a tree. The story is about how energy from a hand can trigger life in the butterfly and the tree. It is inspired by the BBC Earth documentary series, where human intervention is shown impacting wildlife.


Animation

To trigger an animation for each of the three objects, I made a straight-path animation using three colours for each one.


 

Conductive Ink

I used a 12 ml pot of conductive ink to trace my sketch onto the canvas and let it dry for an hour, as it is a better conductor when fully dry.

 


 

Software

Arduino

I downloaded the CapacitiveSensor library for Arduino to begin testing the code. I started by putting some conductive ink on a sample sheet of paper to check the values in the console and gauge what values were triggered upon touch.

I used three 1 MΩ resistors, each placed between the send pin (pin 4) and a receive pin; this allowed the capacitive sensing to work. I learned that when the conductive ink was touched, the values jumped above 1000.
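A short sketch along these lines, using the CapacitiveSensor library, is shown below. Pin 4 is the shared send pin, as described above; the three receive pins (2, 6 and 8) are assumptions for illustration.

```cpp
// Sketch along the lines described above, using the CapacitiveSensor library:
// pin 4 is the shared send pin, with a 1 MΩ resistor to each receive pin.
// Receive pin numbers (2, 6, 8) are assumptions, not the actual wiring.
#include <CapacitiveSensor.h>

CapacitiveSensor hand      = CapacitiveSensor(4, 2);
CapacitiveSensor butterfly = CapacitiveSensor(4, 6);
CapacitiveSensor tree      = CapacitiveSensor(4, 8);

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Read each inked shape; values jump well above 1000 when touched.
  long h = hand.capacitiveSensor(30);
  long b = butterfly.capacitiveSensor(30);
  long t = tree.capacitiveSensor(30);

  // Send the three readings to Processing as a comma-separated line.
  Serial.print(h); Serial.print(",");
  Serial.print(b); Serial.print(",");
  Serial.println(t);

  delay(50);
}
```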


Processing

Using the serial data from the Arduino, I then wrote ‘if’/‘else’ statements in Processing to play a movie for each sensor when its value rises above 1000. A visitor presses and releases any of the inked areas to play the corresponding animation.


 

Production and Bringing Everything Together

On the back of the canvas, I attached all the wires, which left the electronics visible.

 


Installation


 

The USB extender helped hide the wires. However, it would have helped further to use an HDMI extender and an overhead projector, so that the projector itself could be hidden and the laptop moved to a different room.

 

Reflection:

The canvas received appreciation: art is usually not to be touched, but accessing it through a direct tap on the canvas brought new ideas to life. A few artists who interacted with the canvas seemed delighted by the idea of using conductive ink on their own canvases.

Moreover, during the critique various points were suggested that would help improve the project. These included making the narrative stronger, mounting the canvas at a greater height, hiding most of the wires, and exploring rear projection.

Supplies

  1. 18”x24” Canvas
  2. Conductive Ink
  3. Arduino Micro
  4. Pico Projector
  5. Jumper Wires
  6. 3x Resistor 1.0M Ohm

 

Large-Scale Adaptation

This technique could be used at conferences as a backdrop to engage visitors, telling stories about products, interactive exhibitions and more.

 

References

 

  1. Ayres, C. (2015). How Dalziel and Pow Realized This Awesome Interactive Touch Wall. Retrieved from: https://www.core77.com/posts/35697/How-Dalziel-and-Pow-Realized-This-Awesome-Interactive-Touch-Wall
  2. Lightscape Studio. (March 2019). Dublin Interactive Mural. Retrieved from: https://lightscape.ie/dublin-interactive-mural-interactive-projection-mapping-motion-design
  3. Raonsquare. (January 2019). 365 Safe Town (Korea Youth Safety Experience Center). Retrieved from: http://raonsquare.com/
  4. High-Low Tech Group. Living Wall. Retrieved from: http://highlowtech.org/?p=27

 

 

GitHub

https://github.com/arsalan-akhtar-digtal/Experiment5

 

Invisible ‘Kelk’

By
Arshia Sobhan Sarbandi

 

Invisible ‘Kelk’* is an interactive installation inspired by the transformation of Persian script, from calligraphy to digital typefaces that almost all the current publications are based on. This project is a visual extension of my previous project, an exploration to design an interface to interact with Persian calligraphy.

*’Kelk’ is the Persian word for calligraphy reed.

From Calligraphy to Typography

Reza Abedini is one of the most famous Iranian graphic designers, whose main source of inspiration is visual arts and Persian calligraphy. What is original about his work is that he was the first to recognize the creative potential of Persian calligraphy and to transform it into contemporary graphic typography.[1][4] In an interview, Abedini says: “After many years, because we have repeatedly seen and got used to typography in newspapers, it has become the reference of readability, and not calligraphy anymore. And now, everything written in that form is considered readable, which is one hundred percent wrong in my opinion.”[2]

 

Left: Vaqaye-Ettefaghiye, the second Iranian newspaper published in 1851, written in Nastaliq script (image: Wikipedia)
Right: Hamshahri, one of the current newspapers in Iran, using glyph-based typography (image: hamshahrionline.ir)

 

Abedini argues that we have lost the potential of Persian calligraphy as a result of adapting to a technology that was not created for Persian script – movable type.[3] Below, you can see Abedini’s original words written in calligraphy (Nastaliq, the prominent style in Persian calligraphy) and in typography using one of the most common typefaces currently found in books and newspapers:

Despite the obvious visual difference between these two, the important fact is that both of these writings are completely readable for an Iranian person.

Designing the Visuals

Similar to my previous project, there are two different things happening when pushing or pulling the hanging piece of fabric – the canvas – from its state of balance. All the words from the short paragraph of what Abedini says about calligraphy and typography are scattered on the canvas in a random pattern that refreshes every minute. You see the same words on both sides, in calligraphy and typography.

The composition of the calligraphic words is inspired by an important style in Nastaliq script called Siah-Mashgh. This style is usually defined by tilted words written on multiple baselines all over the canvas.

 

Calligraphy pieces by Mirza Gholamreza Isfahani in Siah-Mashgh style

 

What I found really interesting is that although the words are randomly positioned in my project, and the random pattern changes every minute, the result retains its visual identity in terms of calligraphic style. The words are in harmony in all the different random arrangements, which, in my opinion, is a result of the great visual potential of Persian calligraphy.

 

Sample screenshots of how calligraphy words appear in different random patterns


 

Technical Details

Almost all the technical details are the same as in the previous project, except that I used a slightly larger fabric in the exhibition and a cleaner physical setup. A larger piece of hanging fabric results in slower movement, both at rest and during interaction, which I found more suitable for the overall experience I wanted.

Exhibition Reflection

From observing and talking with the people interacting with the project, two major points emerged. The first was that, when confronting the hanging fabric, many people hesitated to physically interact with it, and even after being told it was OK to touch the fabric, they touched it very gently unless they were assured that they were supposed to ‘push’ it. People were also much less likely to pull the fabric than to push it. However, after getting comfortable with the interaction, they usually spent several minutes with the piece and found it pleasing.

One possible solution to the hesitation could be installing the piece where people have to push the fabric to pass through it. Another could be adding some airflow so the fabric moves slightly in both directions from its resting state, providing a clue that moving the fabric in the two directions gives different visual feedback.

Code Repository

https://github.com/arshsob/Experiment5

References

1. https://www.mashallahnews.com/reza-abedini-persian-calligraphy/
2. http://honargardi.com/portfolio/%D8%B1%D8%B6%D8%A7-%D8%B9%D8%A7%D8%A8%D8%AF%DB%8C%D9%86%DB%8C/
3. https://www.youtube.com/watch?v=XfI3grQDVyY
4. https://kabk.github.io/go-theses-18-saber-javanmard-gain-loss/thoughtsChapters.html

Antimaterial


By Jessie, Liam and Masha

Project Description: 

Antimaterial is an interactive installation that explores abiogenesis and kinesis to articulate a relationship between the human and nonhuman world. Just as we are able to affect our surroundings, the ability of magnetic materials to move matter indicated the presence of life to pre-Socratic philosophers. The rocks beneath our feet were not only essential catalysts for life; microbial life in turn helped give birth to the minerals we know and depend on today. In Antimaterial, these concepts are explored as primitive life emerges within the topography of the material in response to human proximity, demonstrating a connection between animate and inanimate matter.

Project Context:

Drawn to the freedom and possibility of exploration, from the start of the project we all agreed to use a creative material to make a tangible, experimental interactive art piece. With some team members’ previous experience using water in their personal projects, we thought of using water, as it is a very versatile medium due to its free-flowing form. Using a solenoid to tap the water to create ripples, or using speakers under water to visualize sound waves, were some of the many initial ideas. However, with water being a difficult medium to control, we were worried that the final piece would end up looking like a jacuzzi, which led us to fine-tune our ideas further.

After several sessions of brainstorming, we came up with the idea of mixing water with a magnetic material to magnetize it, which would also give us more control over its form. During our research, we found quite a few interesting projects that guided us in new directions. Sachiko Kodama is a Japanese artist who has dedicated the majority of her practice to installations using ferrofluid and kinetic sculpture. By combining ferrofluid, which can be a messy medium, with kinetic sculpture, which is inherently neatly structured, her works usually create order and harmony amongst chaos and come off as intriguing as well as energetic.


Sachiko Kodama Studio Website

Inspired by this kind of harmony, generated by the juxtaposition of structure and chaos through ferrofluid and kinetic sculpture, we wished to demonstrate it with ferrofluid in this project too. We purchased a bag of iron(III) oxide and started playing with it, mixing it with different types of solvents including motor oil and washing-up liquid, while researching how it has been incorporated into other artists’ work.

 

 

One use of iron(III) oxide that drew our attention was to first mix it with oil and then encase the oily mixture in a container filled with water. For example, the project Ferrolic by Zelf Koelman uses the immiscibility of oil and water to create versatile movement of ferrofluid within water, with magnets behind the display producing different patterns in different modes of operation.

 

We wanted the magnetic material to generate different patterns when visitors interact with it. The mechanism behind the project SnOil inspired us to place an array of electromagnets behind the magnetic material, with different patterns generated by activating electromagnets at certain positions during visitors’ interactions.

 

However, after talking to Kate and Nick during the proposal meeting, they shared their concerns about whether we would be able to manage an array of more than ten electromagnets within the two-week production period, as well as about the cleanliness of the piece on display, since ferrofluid tends to get messy. Having re-evaluated these essential suggestions, we decided to drop the ferrofluid and use only iron(III) oxide powder, as it produces a fur-like texture when magnetized and still creates a very aesthetically intriguing look. For example, the following project uses furry pom poms as part of the interactive piece to produce a very engaging experience.

 

We were also inspired by the mechanism behind this pom pom piece, as it uses servos to turn a circular movement into a linear one, which we later used in our project to simplify the movement of the magnets.

 

Process:

This project underwent a series of experiments, evaluations and adaptations. Having decided to explore a creative material, we chose magnets to experiment with. We first started making electromagnets by wrapping conductive wire around a core, hoping to create patterns in the magnetic material by activating an array of electromagnets; the patterns would change and react as the electromagnets switched on and off.

 

However, the electromagnets were not strong enough, and they generated a lot of heat when activated for longer periods. We worried that they would cause safety issues during the open show because of overheating, and that the visual effect might not be desirable because of the weak magnetism. Eventually, we decided to use permanent magnets instead. This change led us to rethink the behaviour of the magnets, as they cannot be turned on and off and always have magnetism.

Therefore, instead of activating and deactivating magnets, we decided to change their position, actuating the movement with servo motors through a slider.

It turned out the micro servo motors we had were too weak: one slider weighs around 6 kg and the micro servo could only move up to about 1.5 kg. We tried different solutions. First, we changed the micro servo to a more powerful servo, the FS5106R, which handles up to 6 kg. Second, we mounted the slider vertically instead of horizontally, using gravity to our advantage and attaching the slider to the servo with a piece of fishing wire.

 

 

 

However, the slider was still too heavy, especially with all the weight now resting on the servo motor, and the fishing wire became less durable after the servo ran for longer periods. Finally we landed on the idea of laying the slider horizontally and using gears to achieve linear movement of the magnets through the rotation of the servos.


 

Even then, the mechanism seemed inadequate, because the full-rotation servo did not have enough torque to move the gear plus the slider. We again had to change to an even more powerful motor, which led us to stepper motors. We also favoured the stepper motors over servos since they are simpler to code, counting steps instead of time.
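As a rough illustration of this final mechanism, the sketch below uses the AccelStepper library to move a magnet slider to a target position when a sensor reads above a threshold. The pins, the sensor and the step counts are assumptions; the actual project code is in the repository linked below.

```cpp
// Hedged sketch of the final mechanism: a stepper motor drives a gear that
// slides a magnet back and forth when a sensor detects a visitor. Pins,
// step counts and the sensor choice are assumptions, not the project's values.
#include <AccelStepper.h>

const int STEP_PIN   = 2;
const int DIR_PIN    = 3;
const int SENSOR_PIN = A0;          // e.g. an IR proximity sensor
const long TRAVEL_STEPS = 2000;     // how far the slider moves the magnet

AccelStepper slider(AccelStepper::DRIVER, STEP_PIN, DIR_PIN);

void setup() {
  slider.setMaxSpeed(800);          // steps per second
  slider.setAcceleration(400);      // smooth ramping instead of timed moves
}

void loop() {
  bool visitorNearby = analogRead(SENSOR_PIN) > 500;   // assumed threshold

  // Counting steps (rather than timing a servo) makes the target positions
  // repeatable: move the magnet out when someone is close, home otherwise.
  slider.moveTo(visitorNearby ? TRAVEL_STEPS : 0);
  slider.run();                     // non-blocking: call as often as possible
}
```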

 

With the underlying mechanism working, it was time to put it all together and begin production. We prototyped the piece in 3D software and made sure it would all come together in reality.


We started building the box with the help and guidance of a friend who is a professional builder.

 

Finally, this is what our piece looks like.


 

Code and Circuit:

https://github.com/jessiez0810/exp5/blob/master/nptest_2_with_sensor.ino


Reflection:

During the exhibition, the piece successfully created a sense of mystery and made many visitors to the open show wonder what it was and how we did it. During the critique session, Nick suggested putting gloves beside the piece so people would know they could touch it, to maximize the level of interaction. However, having discussed it within our group, we agreed the gloves might tamper with the eeriness the piece gave off and would not fit its underlying tone. The gloves would also create a barrier to feeling the texture of the powder, which is an essential part of the interaction experience: finding out what this alien material is through every means of exploration.

To address this, we kept paper towels with us; after people touched the piece, we handed them paper towels to wipe the powder off. If they didn’t ask, we wouldn’t say anything about touching. As expected, many people were tempted to touch the magnetic powder on display, and we let them if they asked. Interestingly, the seemingly dirty powder did not hinder people’s curiosity, and a lot of them dug their hands into it, which seems to validate the sense of mystery the piece set out to achieve. This project took a lot of time at every stage of the process, but it was truly rewarding to see it come together in the end after many trials and errors.


References:

AccelStepper Class Reference. (n.d.). Retrieved December 8, 2019, from https://www.airspayce.com/mikem/arduino/AccelStepper/classAccelStepper.html.

Brownlee, J. (2018, July 10). MIT Invents A Shapeshifting Display You Can Reach Through And Touch. Retrieved December 9, 2019, from https://www.fastcompany.com/3021522/mit-invents-a-shapeshifting-display-you-can-reach-through-and-touch.

Ferrolic by Zelf Koelman. (n.d.). Retrieved December 9, 2019, from http://www.ferrolic.com/.

Kodama, S. Protrude, Flow (2018). Retrieved December 8, 2019, from http://sachikokodama.com/works/.

PomPom Mirror by Daniel Rozin. (n.d.). Retrieved December 9, 2019, from https://bitforms.art/archives/rozin-2015/pompom-mirror.

SnOil – A Physical 3D Display Based on Ferrofluid. (2009, September 20). Retrieved December 9, 2019, from https://www.youtube.com/watch?v=dXuWGthXKL8.

State Machine. (n.d.). Retrieved December 8, 2019, from http://www.thebox.myzen.co.uk/Tutorial/State_Machine.html.

 

 

 

Leave a window open for me


Project title: Leave a window open for me

By: Neo NuoChen 

Project description:

This piece is meant to create a space that expands infinitely within a box. It is a reflection of myself and my personal experience of fighting insomnia and depression. It welcomes the audience to take a peek inside the story and share the feelings through the “window”.

visuals (photographs & screenshots):


edited video of the interaction: https://vimeo.com/378123696

 

code: https://github.com/NeonChip/NeoC/blob/master/exp5

circuit diagram: https://www.circuito.io/app?components=512,10190,11021,763365

Please note that the servo input is supposed to go to pin 9 and the LED panel input should go to pin 12. No breadboard was used in my project, and I used a 16×16 LED matrix panel.
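Below is a minimal sketch matching these wiring notes, assuming a continuous-rotation servo on pin 9 and a WS2812/NeoPixel-style 16×16 panel on pin 12. The colour and speed values are placeholders rather than the ones used in the piece.

```cpp
// Minimal sketch matching the wiring notes above: a continuous-rotation servo
// on pin 9 spins the bed, and a 16x16 WS2812/NeoPixel-style panel on pin 12
// glows in a calm, steady colour. Colour and speed values are placeholders.
#include <Servo.h>
#include <Adafruit_NeoPixel.h>

const int SERVO_PIN = 9;
const int PANEL_PIN = 12;
const int NUM_LEDS  = 16 * 16;

Servo bedServo;
Adafruit_NeoPixel panel(NUM_LEDS, PANEL_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  bedServo.attach(SERVO_PIN);
  bedServo.write(100);              // slightly off 90 = slow continuous spin
  panel.begin();
  panel.setBrightness(60);
  for (int i = 0; i < NUM_LEDS; i++) {
    panel.setPixelColor(i, panel.Color(80, 60, 120));   // soft violet glow
  }
  panel.show();
}

void loop() {
  // Nothing to do: the servo keeps spinning and the panel holds its colour.
}
```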

Project Context

I wanted to create this personal space within a box for the public because it is my story and I am willing to share it with everyone. It is designed to be viewed through the “window”, one person at a time, so that the audience can have a more intimate and immersive experience.

This is a story about me going through insomnia and depression for several months when I was in New York. When I couldn’t sleep, the window was one of the things that I stared at the most, and I kept thinking about all the things that I shouldn’t have done and could have done. The stress was so real that it constantly haunted me in my head, the slightest light or noise became utterly unbearable. I could feel the bed spinning like a disc, or was it me spinning in my head? It didn’t matter day or night, they were the same, the thoughts kept me awake, the music that played was drowning me.

But I don’t want to drag the audience into a bad mood; that is not my intention. Ever since I stepped out of that situation, I have been proud of myself for doing so, and I want people to know that it is not undefeatable and that they are not alone.

The initial idea of building an expanding space within a box was to see how strong the contrast between outside and inside could be; the element of surprise is a twist I always want to incorporate in my designs. I have been to multiple Yayoi Kusama exhibitions, and each time I am amazed by how the mirrors form these rooms. I was blessed to see Samara Golden’s work, The Meat Grinder’s Iron Clothes, at the Whitney Museum of American Art in New York. It was a multilevel installation built with mirrors to expand the space above and below where the audience was standing. Looking up and down while standing in between, the feeling was oddly satisfying; it made me lose myself in thoughts of existential crisis.


I was also inspired by Louise Bourgeois’ Untitled (Hologram Suite), 1998–2014. I like the way she picked different items in different images to represent different stories when creating this series of artworks. That helped me make my decision when I was debating whether to build a whole room full of furniture or go with one specific item, and I agree that going with fewer items expresses the feeling of loneliness more.


“The holographic image is created by laser beams that record the light field reflected from an object, burning it onto a plate of glass. The image is scaled at a one-to-one correspondence with the original sculptural material that was created by Bourgeois. These elements reoccur in many of Bourgeois installations and are related to her interest in physical and emotional isolation and sexuality.”

For the servo, at first I used the one in our tool kit and followed a tutorial to make it rotate 360 degrees, but the result was not what I expected: it simply turned into a motor that no longer took input, and the speed was too fast. So I re-soldered everything back, and while the servo could take code again, the spin was not ideal and it made a lot of noise and shaking. I ended up borrowing a continuous-rotation servo (one that can go 360 degrees without being manually altered) from Li’s team, which saved me.

The choice of LED colours aimed to create harmony with the spinning bed: together they seem to form a peaceful and harmless space, but it also emphasizes the fact that it is a space of absolutely no sleep.

Exhibition Reflection

The open show was a blast, and it was great to see everyone’s work shine in its spot. It was interesting to receive feedback from the audience, especially when I asked them to take a look inside first without knowing the story behind it. Most people described it as “beautiful and amazing”, which was great because, as I said, dragging people into a bad mood was never my intention, and people are free to feel whatever they feel. But after reading or hearing the background, they would say, “yes, I can totally see that”. It made me wonder: did they actually see that because they had felt the same thing in the past, or did they match the visual with the story? Either way, empathy was created and connected the audience with the piece, and I was more than happy about that.

Towards the end, a girl came to look at my work and told me she knew exactly what I had been through because she had gone through the same thing before. We sat on the staircase and talked about the experience and how good it felt to talk about it. This made me realize that there are more people than I expected facing similar issues. They should know that they are not alone; we are in the same boat, and we will always be each other’s support.

I picked a location that was fairly isolated and unnoticeable, which somehow went well with my concept, but not a lot of people went into the darker room where I was set up along with three other works. It would have been nice if more of the audience had known about the room and paid us a visit :)

Reference:

Samara Golden. The Meat Grinder’s Iron Clothes, 2017. Whitney Biennial. The Whitney Museum of American Art, New York https://samaragolden.com/section/450598-THE-MEAT-GRINDER-S-IRON-CLOTHES-2017-WHITNEY-BIENNIAL-WHITNEY-MUSUEM-OF-AMERICAN-ART.html

Heather James Fine Art. Artsy.net. https://www.artsy.net/artwork/louise-bourgeois-untitled-hologram-suite

Yayoi Kusama, Infinity Mirrored Room – The Souls of Millions of Light Years Away, 2013. https://ago.ca/exhibitions/kusama

Unseen

(Un)seen

Nadine Valcin


(Un)seen is a video installation about presence and absence that relies on proxemics to trigger three different processed video loops. It creates a ghostly presence projected on a screen, whose image recedes as the viewer gets closer even as it constantly tries to engage the viewer through its voice.

As visitors enter the room, they see a barely distinguishable extreme closeup of the ghost’s eyes. As they get closer to the screen, the ghost remains unattainable, visible through progressively wider shots. The last loop plays when visitors are in close proximity to the screen. At that distance, the array of squares and circles layered over the video to give it texture becomes very distinct, making the image more abstract. The rhythm of the images also changes, as short glimpses of the ghost are seen through progressively longer black sequences.

The video image is treated live by a custom filter created in Processing to give it a dreamy and painterly look.

Context

In terms of content, my recent work and upcoming thesis project deal with memory, erasure and haunting. I am interested in how unacknowledged ghosts from the past haunt the present. As Avery Gordon remarks:

“Haunting is a constituent element of modern social life. It is neither premodern superstition nor individual psychosis; it is a generalizable social phenomenon of great import. To study social life one must confront the ghostly aspects of it. This confrontation requires (or produces) a fundamental change in the way we know and make knowledge, in our mode of production.” (2008, p. 7)

This project was a way for me to investigate, through image and sound, how a ghostly presence could be evoked. I also wanted to explore how technology could assist me in doing so in an interactive form that differed from the linear media production I normally engage with. The video material for (Un)seen comes from an installation piece entitled Emergence that I produced in 2017. I thought the images were strong and minimalist and provided a good canvas for experimentation.

(Un)seen is heavily inspired by the work of Processing co-creator Casey Reas and his exploration of generative art.  I have been interested in his work as it explores the way in which computing can create new images and manipulate existing ones in ways that are not possible in the analog realm. Over the years, Reas has used various custom-built software to manipulate video and photographic images.


Transference, Source: Casey Reas (http://reas.com/transference/)

Transference (2018) is a video that uses frames from Ingmar Bergman’s black-and-white film Persona (1966). It deliberately distorts the faces represented, rendering them unidentifiable and reflecting on contemporary questions around identity and digital media.


Samarra, Source: Casey Reas (http://reas.com/samarra/)

He applies a similar image treatment in the music video Samarra (2016) and in Even the Greatest Stars Discover Themselves in the Looking Glass, An Allegory of the Cave for Three People (2014), an experience in which three audience members interact, mediated through cameras and projected images. In that piece, Reas once again looks at identity through a technological lens against the backdrop of surveillance.


Even the Greatest Stars Discover Themselves in the Looking Glass, An Allegory of the Cave for Three People, Source: Casey Reas (http://reas.com/cave/)

KNBC, Source: Casey Reas (http://reas.com/knbc/)

In KNBC (2015), Reas pushes his experimentation further, manipulating images to a level of abstraction where they become unrecognizable in the finished product, breaking their visual link to the original source material. The recorded television footage and accompanying sound are processed into a colourful, pixelated generative collage.


Surface X, Source: Arduino Project Hub (https://create.arduino.cc/projecthub/Picaroon/surface-x-811e8c)

From the group project Forget Me Not (assignment 2), I retained the idea of working with an Arduino Uno and a distance sensor, this time to control the video on the screen. I wanted to create a meaningful interaction between the image and the distance separating it from visitors.

The interactive art installation Surface X by Picaroon was cited in assignment 2 and is still relevant to this project because of its use of proxemics: the umbrellas close, revealing the underlying metal structure and mechanism, when visitors approach. The creators saw the activation of the umbrellas as a metaphor for the way we constantly perfect and control our digital personas, and how these collide with reality upon closer inspection, in the moments when all our cracks and flaws are revealed.


Surface X, Source: Arduino Project Hub (https://create.arduino.cc/projecthub/Picaroon/surface-x-811e8c)

In (Un)seen, proxemics are used differently, to signify the ghost’s refusal to visually engage with the visitor, or perhaps to signal that its presence is not quite what it seems.

Process


Still from unprocessed original footage

I started by going through my original footage, selecting all the takes of one of the four participants in the shoot for my installation Emergence. I chose this woman because she had the most evocative facial expressions and dramatic poses. I then created three video loops between 30 and 60 seconds in duration. The first loop comprises extreme closeups focused around the eyes, where the character’s entire face is never seen. The second consists of closeups where her entire face is visible. The third features slightly wider shots, but their duration is shorter and there is a significant amount of black between them.

 


(Un)seen – loop 1


(Un)seen – loop 2


(Un)seen – loop 3

I originally thought of manipulating the video image through Adobe After Effects, but I encountered a Coding Train video by Shiffman that showed the potential of extracting the colour of pixels from a photograph (much like it is possible to do in Photoshop) to program filters that would change the appearance of images. It seemed interesting, but I didn’t know if it would be possible to apply those same features to a moving image, given the processing capacity necessary to play and render live video.

Some of the original footage was intentionally shot with a very shallow depth of field, leaving parts of the shots out of focus depending on the movement of the subject. As I started to experiment with textures, I found that slightly out-of-focus images helped blur the outlines of the circles and squares that make up the video filter. I used the Gaussian blur function in Premiere Pro to get the desired texture. It was a trial-and-error process, manipulating the footage in Premiere Pro and then in Processing through several iterations.

 


Left: Original source footage, right: blurred footage


Same footage rendered through Processing


Left: Original source footage, right: blurred footage


Same footage rendered through Processing

 

I recorded the soundtrack, then edited and mixed it. It consists of a loop of a woman’s heavy breathing, over which a selection of 13 different short clips plays randomly.


The clips are mostly questions that demonstrate the ghost’s desire to engage with the visitor, but also at times challenges them. Examples include: Who are you? Where do you come from? Can you set me free? Do you want to hear my story?

 


The technical set-up was rather simple. An Arduino Nano was used to read the distance data from the LV-MaxSonar-EZ ultrasonic distance sensor. The video for the idle state (loop 1) loaded automatically, and two different thresholds were set to trigger the playback of loops 2 and 3.
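A sketch of this set-up might look like the following, assuming the sensor’s analog output is read on A0 (the LV-MaxSonar-EZ also offers PW and serial outputs) and that the Arduino simply reports which of the three zones the visitor occupies; the threshold values are placeholders.

```cpp
// Sketch of the set-up described above, assuming the LV-MaxSonar-EZ's analog
// output is read on A0. Threshold values are placeholders; the Arduino just
// reports which of the three zones the visitor is in, and Processing plays
// the matching loop.
const int SONAR_PIN = A0;
const int NEAR_IN   = 30;    // closer than ~30 inches -> loop 3
const int MID_IN    = 80;    // between ~30 and ~80 inches -> loop 2

void setup() {
  Serial.begin(9600);
}

void loop() {
  // On a 5 V, 10-bit ADC the LV-MaxSonar-EZ reads roughly 2 counts per inch.
  int inches = analogRead(SONAR_PIN) / 2;

  int zone;
  if (inches < NEAR_IN)      zone = 3;   // close-up, abstract loop
  else if (inches < MID_IN)  zone = 2;   // mid-distance closeups
  else                       zone = 1;   // idle extreme closeup

  Serial.println(zone);
  delay(1500);               // matches the 1500 ms reporting interval mentioned below
}
```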

Challenges

The distance sensor gave wildly different readings depending on the space and had to be patiently recalibrated several times. Even with the Arduino set to send readings to Processing at 1500 ms intervals, readings hovering around the thresholds for the different video loops posed problems, creating rapid flickering between loops. One might say that the system itself was haunted.
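One common way to reduce this kind of boundary flicker, which is not necessarily what was used here, is to add hysteresis so the zone only changes once a reading clearly crosses a threshold. A minimal Arduino-side sketch (the same logic could live in Processing instead):

```cpp
// Hypothetical mitigation for the flickering described above (not part of the
// installed code): hysteresis around each threshold, so the zone only changes
// when the reading crosses a boundary by a clear margin.
const int SONAR_PIN = A0;
const int NEAR_IN = 30, MID_IN = 80;   // same placeholder thresholds as above
const int MARGIN  = 5;                 // inches of "dead band" around each boundary

int currentZone = 1;

int zoneFor(int inches) {
  // Only leave the current zone if the reading is MARGIN inches past the line.
  if (currentZone != 3 && inches < NEAR_IN - MARGIN) return 3;
  if (currentZone != 1 && inches > MID_IN + MARGIN)  return 1;
  if (currentZone == 3 && inches > NEAR_IN + MARGIN) return 2;
  if (currentZone == 1 && inches < MID_IN - MARGIN)  return 2;
  return currentZone;                  // otherwise stay put
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  int inches = analogRead(SONAR_PIN) / 2;
  currentZone = zoneFor(inches);
  Serial.println(currentZone);
  delay(1500);
}
```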

The ventilation in the classrooms at OCAD also proved challenging: despite playing at full volume on speakers, the soundtrack wasn’t fully audible except at very close range. The original intent was to have a 360-degree soundscape, using a p5.js library, to heighten the immersion and feeling of presence; unfortunately, I could not find an equivalent for Processing.


Closeup of image as seen up close, projected on a screen

The Exhibition   

The exhibition was a wonderful opportunity to get members of the OCAD community and the general public to engage with the work. The fact that (Un)seen was in a separate room was at once an advantage and an inconvenience: some people missed the piece because they concentrated on the main spaces, but those who ventured into the room focused their entire attention on the installation. Being the sole creator of the piece left me with all the duties of engaging with visitors and didn’t allow me to visit my colleagues’ pieces, especially those by undergraduates or second-year graduate students that I had not seen. I met and spoke with Digital Futures faculty I hadn’t yet encountered, as well as staff and students from other departments. It was a useful and engaging exchange that should happen more often, as it created a real sense of community.

People were eager to engage with the piece and the feedback was overwhelmingly positive. Visitors understood the concept and enjoyed the experience. Because of the issues with the distance sensor, they had to be instructed not to move too quickly and to take time to pause, to minimize false triggers. The only drawback of the room was the extremely noisy ventilation: despite the sound playing at maximum volume on the room speakers, the soundtrack and clips were barely audible, and the fact that the door was open to entice people into the space only added to the din. It would also have been nice to have a totally dark space to present in, but I ended up switching spaces with some of my colleagues to accommodate their project.

 

CODE: https://github.com/nvalcin/Unseen

 

BIBLIOGRAPHY

Correia, Nico. “Bridging the gap between art and code” in UCLA Newsroom, April 25, 2016 http://newsroom.ucla.edu/stories/bridging-the-gap-between-art-and-code. Accessed on December 6, 2019.

Gordon, Avery F. (2008). Ghostly Matters, Haunting and the Sociological Imagination. Minneapolis: University of Minnesota Press.

Picaroon (2018), Surface X in Arduino Project Hub. https://create.arduino.cc/projecthub/Picaroon/surface-x-811e8c. Accessed on December 6, 2019.

Reas, Casey (2019). Artist’s website. http://reas.com/ Accessed on December 6, 2019.

Rosenthal, Emerson, “Casey Reas’ Newest Art Is A Coded, Projected ‘Allegory Of The Cave’” in Vice Magazine, March 14, 2014. https://www.vice.com/en_us/article/mgpawn/casey-reas-newest-art-is-a-coded-projected-allegory-of-the-cave-for-thedigital-age  Accessed on December 6, 2019.