
Brains Talking

cover2

Group members: Siyu Sun, Mehdi Farahani, Anantha Chickanayakanahalli

“Brains left by themselves tend to get noisy, with electronic devices largely to blame. When they are brought into proximity with other brains, a sense of synchronization becomes possible and there is scope for calm. In this experiment, we represent this proximity with a physical installation of two objects that stand in for two brains. They are activated by changes in their positions: when moved, they reverberate with the noise of everyday sounds. These sounds come to a standstill and fall silent when the brains are brought together (touching each other). This is meant to translate the effect people have on each other.”

 

Project Description

Brains Talking is an interactive artwork built around a study of Bluetooth Low Energy (BLE). In this installation we dig into the notion of consciousness in the human mind. Consciousness encompasses experience, cognition, feeling, imagination, and volition: it is how we perceive and receive information from the world, and the brain is the crucial organ that makes this possible. Through the many nerve systems in the brain we sense and act — we speak, touch, taste, and smell — and so interact with the world. Unconsciousness, by contrast, is easier to describe: no self-thought, no awareness, no capacity for interaction, as in stones, sculptures, installations, even our technological world. So we laser-cut still acrylic into the outline of a human brain to mimic a future brain, and ask: when consciousness becomes unconscious, what can we humans do?

Output

1. We incorporated sounds from everyday life. Neurofeedback is a form of alternative treatment in which brain waves are retrained through auditory or visual stimuli. Everything in the physical world can be turned into sound, and those sounds activate our nervous system to produce feedback. We therefore set out to mimic a virtual environment that could interact with future brains. The audio track is built from original physical sounds: a moving elevator, boiling water, shaking keys, a hand slapping a handrail, a door closing, and so on. All of these inadvertently form our communication system. The installation defines a distance threshold that activates the sound, and the distance between the two brains controls the volume, meaning the two are never truly far from each other. As technology develops, relationships between people are sliding into a silent world: we focus on our smart devices and ignore the people around us. The work uses this idea to critique that phenomenon.
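The distance-to-volume behaviour described above can be sketched in JavaScript. This is a hedged reconstruction, not the group's actual Processing/Arduino code: the path-loss model, `txPower`, and the thresholds are all assumptions, chosen only to illustrate how BLE signal strength might drive the sound level.

```javascript
// Rough BLE path-loss model: RSSI (dBm) -> approximate distance (m).
// txPower (RSSI at 1 m) and the exponent 2 are illustrative assumptions.
function estimateDistance(rssi, txPower = -59) {
  return Math.pow(10, (txPower - rssi) / (10 * 2));
}

// Map distance to playback volume: silence when the "brains" touch,
// full everyday noise when they are far apart.
function distanceToVolume(distance, touchThreshold = 0.05, maxRange = 2.0) {
  if (distance <= touchThreshold) return 0;            // touching -> silence
  const v = (distance - touchThreshold) / (maxRange - touchThreshold);
  return Math.min(1, Math.max(0, v));                  // clamp to [0, 1]
}
```

With these helpers, the installation loop would simply read the RSSI each frame and set the audio gain to `distanceToVolume(estimateDistance(rssi))`.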

 

2.  Mimic the hand’s power


 

Experience Video

 

 

 

How it Works Video

 

Final Images


light

 

 

Development Images

Laser Cutting Sketch

laser

 


 

 

Circuit Diagram

 


 

Github

Processing File:

https://github.com/rewritablehere/digitalfutures/blob/main/brains_talking/brains_talking.pde

Arduino Code:

https://github.com/rewritablehere/digitalfutures/blob/main/brains_talking/central.ino

 

 

Bibliography

1. Jade, Laura. “INTERACTIVE BRAINLIGHT – Brain Sculpture Lights Up Your Thoughts.” YouTube. Accessed December 03, 2021. https://www.youtube.com/watch?v=ZQmA5X47ewU

2. “BRAINLIGHT.” Laura Jade. Published August 29, 2018. Accessed December 03, 2021. https://laurajade.com.au/interactive-brain-light-research-project/

3. “Laura Jade – Brain Light Project – Residency 2015.” Culture at Work. Published July 08, 2019. Accessed December 03, 2021. https://www.cultureatwork.com.au/laura-jade-culture-at-work-artscience-residency-2015/

4. Kanai, R. “Are Two Heads Better Than One? It Depends.” Scientific American, August 31, 2010. https://www.scientificamerican.com/article/are-two-heads-better-than/#

Talking With Plants by Siyu Sun

 final-result1

 

Project Description

Talking with Plants (2021) is an interactive bionic artwork based on my Arduino study. The project simulates the possibility of dialogue between humans and plants, inviting people to empathize with them. Plants play an essential role in the ecosystem, accounting for about 80% of all biomass on the planet. [1] Yet most of their feedback is slow, so people tend to overlook them. The term plant blindness, coined by the American botanist and biologist Elisabeth Schussler and James Wandersee, describes “the inability to see or notice the plants in one’s own environment, leading to the inability to recognize the importance of plants in the biosphere and in human affairs.”

I hope to make plants “conscious” through this bionic plant installation. I used fiber optics and a culture vessel to create the bionic environment, then attached real plants, picked by the roots from the forest, together with fiber optics onto three single LEDs to simulate growing plants. The interaction works as follows: a microphone/sound sensor reads the sound signal, recognizes the volume level, and maps different volume ranges to corresponding LED colors. Research suggests plants have strong sensory capacities: they can distinguish different colors of light, including ultraviolet and far-red, and change their physiological state according to the light they receive. [2] Based on this, I chose LEDs as the feedback output, so people can watch the plants change as they make sounds.
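The volume-to-color mapping described above can be sketched as a small function. This is a minimal illustration, not the project's Arduino code; the threshold values and color choices are my own assumptions.

```javascript
// Map a normalized microphone level (0..1) to an LED colour band.
// Thresholds are illustrative assumptions, not the installation's tuning.
function volumeToColor(level) {
  if (level < 0.1) return 'off';    // ambient quiet: plant stays dark
  if (level < 0.4) return 'green';  // soft speech
  if (level < 0.7) return 'blue';   // normal speech
  return 'red';                     // loud sound
}
```

On the Arduino side, the same logic would run in `loop()`, reading the sound sensor and driving the three LEDs.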

 

 

Discussion of Calm Technology Principles

In this work I engaged with the first principle of Calm Technology: “I. Technology should require the smallest possible amount of attention.” I am a firm believer that less is more, so in this installation I used a minimalist technique to make the interaction more direct. Through the construction of the environment and concept, I shaped the installation’s appearance as I intended, letting the audience perceive as much as possible of the thoughts I wanted to express.

 

 

Experience Video ( 1 minute )

https://vimeo.com/manage/videos/639140685

 

 

How It Works Video

 

 

Final Project Images


overall

 

 

Development Images

Brainstorming

brainstorming

process

sketch

 arduino-circuit-material-list

installation-part-material-list2

material-list

circuit-connection-storage

 

 

Circuit diagram

untitled-sketch-2_bb

GitHub

https://github.com/Liz715/Experiment2/blob/14aa66f765958bfb55d78555fb34fafd67dcede7/Talking%20with%20Plants%20Arduino

 

Bibliography

[1]Jose, Sarah B., et al. “Overcoming Plant Blindness in Science, Education, and Society.” Plants, People, Planet, vol. 1, no. 3, 2019, pp. 169–172., https://doi.org/10.1002/ppp3.51.

[2] BioScience, Volume 53, Issue 10, October 2003, Page 926, https://doi.org/10.1641/0006-3568(2003)053[0926:PB]2.0.CO;2

Experiment 1 – Visualization of Gossip | Siyu Sun

—— Gossip can destroy a person or make them strong.

 


 

Description

Visualization of Gossip (2021) is an immersive interactive narrative work between the human and the environment.

The technique is based on case studies of ml5’s PoseNet and sound visualization in p5.js. Audiences therefore need no external input device, such as a mouse, to trigger click events. I set up createCapture() to connect the camera, then combined it with PoseNet’s tracking system and with “scroll” and “click” experiments to get feedback from the sound, graphics, and state of the screen.

Back to the creative part: the work is divided into four narrative threads, each with its own control-feedback system and purpose. Combining the research done in Experiment 1, I wanted to establish an immersive audiovisual area expressing my conceptual understanding of “Rumors/Gossip”. I chose this mode of expression because media can engage multiple senses through rich information and has great potential to influence audiences in perception, cognition, and emotion. The sensory or perceptual mode, surround effect, and resolution of an immersive experience help the audience feel present in virtual events and associate that sense of consciousness with, or substitute it for, reality. [1]


 

 

Context

Study One: Movement and Feedback | ml5, createCapture(), preload()

The prototype of Gossip

This is my first study. I use harsh noise images to show what external evaluation brings; the cacophony of the music rises and falls. When people hold their head and rock from side to side, the noise image follows their movement, like noise in your head that you can’t get rid of, swirling back and forth.

The main techniques I studied in this part were ml5’s PoseNet, setting up createCapture(), and learning how to preload media files. The key point I would like to mention is that I used a person’s silhouette instead of the live camera image, because I wanted the whole work to have a sense of unity; to achieve this, the media file must be loaded inside preload().

Then I used ml5 PoseNet’s tracking to capture the position of the audience’s nose and control the movement of the background picture, and placed the noise image at the position of the eyes and nose, adjusting its orientation so that it follows the audience’s movement and is difficult to get rid of.
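The nose-tracking step can be sketched as a small helper. It assumes ml5 PoseNet's output shape (a `keypoints` array whose entries carry `part`, `score`, and `position` fields); the offset scale and confidence threshold are my own assumptions, not the project's code.

```javascript
// Given a PoseNet pose, turn the nose keypoint into a background offset.
// Low-confidence detections leave the background where it is.
function noseOffset(pose, canvasW = 640, canvasH = 480, scale = 0.2) {
  const nose = pose.keypoints.find(k => k.part === 'nose');
  if (!nose || nose.score < 0.5) return { dx: 0, dy: 0 };
  return {
    dx: (nose.position.x - canvasW / 2) * scale, // shift relative to canvas centre
    dy: (nose.position.y - canvasH / 2) * scale,
  };
}
```

In the p5.js draw loop, the returned `dx`/`dy` would be added to the background image's position each frame.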

How to interact

Hold your head and shake left and right.

Problem

The problem I met in this process: the visual identification is not especially accurate, and it would be better to drive the soundwave image with the true vibration frequency, but I don’t yet know how to combine the two.

Reference

createCapture(): creates a new HTML5 <video> element that contains the audio/video feed from a webcam. The element is separate from the canvas and is displayed by default; it can be hidden by calling hide() on it.

preload(): this function is used to handle the asynchronous loading of external files in a blocking way.


p5.js Present:   https://preview.p5js.org/lizsun703/present/Qhbzo9zdl

p5.js Edit:         https://editor.p5js.org/lizsun703/sketches/Qhbzo9zdl

https://youtu.be/nRb6ZiC-kqY

 

Study Two: Movement As a Click | mousePressed()

— Don’t be afraid

When gossip invades your brain, the only thing to do is beat it back and rebel against the odds. In this part, I used the click logic of mousePressed to change the image. I set up two different figures: the variation of the picture is controlled by identifying the viewer’s waist height, and the noisy image disappears when the viewer raises their hand. This stands for defeating the gossip.
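The raise-your-hand trigger can be sketched as a pose test. This is a hedged reconstruction assuming ml5 PoseNet's part labels (`rightWrist`, `rightShoulder`) and output shape; the confidence threshold is an assumption.

```javascript
// Return true when the right wrist is detected above the right shoulder.
// Canvas y grows downward, so "above" means a smaller y value.
function handRaised(pose, minScore = 0.5) {
  const get = part => pose.keypoints.find(k => k.part === part);
  const wrist = get('rightWrist');
  const shoulder = get('rightShoulder');
  if (!wrist || !shoulder || wrist.score < minScore || shoulder.score < minScore)
    return false; // missing or uncertain keypoints: no trigger
  return wrist.position.y < shoulder.position.y;
}
```

Each frame, the sketch would call `handRaised(pose)` and hide the noise image when it returns true.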

How to interact

Move from side to side and raise your hands to see the image disappear.

Problem

However, the biggest regret in this part is that I wanted the noise sound to disappear when the hand is raised, but after many attempts it did not work. I hope to keep studying this question in the coming days.

Reference

mousePressed(): this function is called once after every time a mouse button is pressed over the element. Some mobile browsers may also trigger this event on a touch screen if the user performs a quick tap. It can be used to attach element-specific event listeners.

 


p5.js Present: https://preview.p5js.org/lizsun703/present/M38Co7k5r

p5.js Edit:       https://editor.p5js.org/lizsun703/sketches/M38Co7k5r

 

 

Study Three: Sound Visualization | waveform()

— Self-digestion

This is an area full of noise and language. The inspiration comes from the interactive text experience of “IN ORDER TO CONTROL”. In this study I worked on sound visualization: using waveform() to read the amplitude of the sound and draw its waveform. A microphone is also set up, so the audience can speak and their amplitude is rendered as well. I placed many words on the screen — praise, criticism, insult — and people can speak any of them and watch the amplitude change. Funnily enough, this was my first attempt at this part: following the tutorial, I wrote the code somewhat at random just to try it, and got exactly the effect I wanted. The soundwave fuses with the silhouette, meaning people can find a balance with gossip.

How to interact

In this process you watch the text on the screen and listen to the soundwave noise. You can say anything you want; the microphone records the voice and feeds it back to the waveform.
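The waveform-drawing step can be sketched as a mapping from samples to canvas points. In p5.sound, `FFT.waveform()` returns an array of amplitude values in [-1, 1]; this helper (the canvas dimensions are illustrative) converts such an array into the vertices the sketch would draw.

```javascript
// Map an array of amplitude samples (-1..1) to canvas points:
// x spreads the samples across the width, y swings around the mid-line.
function waveformToPoints(samples, width = 640, height = 480) {
  return samples.map((s, i) => ({
    x: (i / (samples.length - 1)) * width,
    y: height / 2 + s * (height / 2),
  }));
}
```

In draw(), the sketch would call this each frame and connect the points with `vertex()` inside a `beginShape()`/`endShape()` pair.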

Problem

In the beginning I intended these sentences to scroll up and down, but I struggled to make them all scroll together.

Reference

waveform(): a p5.FFT method that returns an array of amplitude values (between -1.0 and +1.0) representing a snapshot of the sound over time.


p5.js Present: https://preview.p5js.org/lizsun703/present/BsiRBoJzX

p5.js Edit:       https://editor.p5js.org/lizsun703/sketches/BsiRBoJzX

 

 

Study Four: Dynamic Text

— Balance yourself

After experiencing the whole work, I hope the audience learns how to establish their own balance point within gossip. We live in a space full of external voices; all kinds of gossip, like knives and stones, press on us and label us. It made me imagine the weight of that pressure: rumors can destroy a person or make them powerful.

How to interact

In this process, you raise your hand and move your body wherever you want. I set up a hurtful sentence, “You’re a jerk!”, on the left and a praising one, “You’re so creative!”, on the right. You can find a position that holds the two in balance.
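The left/right balance described here can be sketched as a simple blend. This is an illustrative assumption about how the body's horizontal position might weight the two sentences, not the sketch's actual code.

```javascript
// Blend the weights of the insult (left) and the praise (right)
// according to the body's horizontal position on the canvas.
function balanceWeights(bodyX, canvasW = 640) {
  const t = Math.min(1, Math.max(0, bodyX / canvasW)); // 0 = far left, 1 = far right
  return { insult: 1 - t, praise: t };                 // equal weight at the centre
}
```

The sketch could then scale each sentence's size or opacity by its weight, so standing in the middle holds them in balance.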


p5.js Present: https://preview.p5js.org/lizsun703/present/8XF0zEzkO

p5.js Edit:       https://editor.p5js.org/lizsun703/sketches/8XF0zEzkO

 

 

Others

As mentioned before, the sound-visualization study is my favorite part, and I have been making related works since 2019, using Kinect combined with TouchDesigner.

Siyu Sun: ” RUMOR” Kinect with TouchDesigner interactive art device (vimeo.com)

 

Bibliography

[1]Teresa Chambel. 2016.  Interactive and Immersive Media Experiences.  In Proceedings of the 22nd Brazilian Symposium on Multimedia and the Web (Webmedia ’16).  Association for Computing Machinery, New York, NY, USA, 1.  DOI:https://doi-org.ocadu.idm.oclc.org/10.1145/2976796.2984746

 

 

 
