What Is My Purpose?

Project Title: What Is My Purpose?

By: Nilam Sari

Project Description:

This project is a 5 x 5 x 5 inch wooden cube with a 3D printed robotic arm. A hammer-head-shaped piece is attached to the end of the arm, and the arm repeatedly hits the top of the cube's own body, a sheet of clear acrylic. The robotic piece appears to be harming itself.

Process: 

I started this project by creating a timeline, because I thought I should be more organized with this project to meet the tight deadlines.

[Image: experiment 5 timeline]

I modeled my design in Rhino3D to help me visualize the arm that needed to be fabricated with the 3D printer.

[Image: Rhino3D model of the arm]

At first I created the arm to hold a chisel, but after printing it and testing it with a servo, the servos couldn't handle the weight of the chisel, so I compromised with an IKEA knife. That didn't work either, so I compromised again with this 3D printed hammer head that holds a 1 in bolt.

[Image: 3D printed arm attachments]

At first I had trouble attaching the servo motors to the 3D printed parts, but Arsh suggested that I cut off the little servo extensions and fit them into the 3D printed parts, instead of trying to make a hole that fits directly onto the servo, and it worked perfectly (Thank you Arsh!).

Next, it was time to code the movements of the arm.

At first, I achieved the movements that I wanted, but they were too predictable; they felt too "robotic". So I randomized the movements within a range and got the results that I wanted.
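
Below is a minimal sketch of that randomized-range idea as an Arduino sketch. The pins, angle ranges, and timings are illustrative assumptions; the actual code is in the repository linked below.

```cpp
#include <Servo.h>

Servo shoulder;  // lower joint of the arm
Servo elbow;     // joint that swings the hammer head

void setup() {
  randomSeed(analogRead(A0));       // seed from an unconnected analog pin
  shoulder.attach(9);
  elbow.attach(10);
}

void loop() {
  // Pick targets within a range instead of fixed poses,
  // so each strike lands slightly differently.
  shoulder.write(random(70, 110));  // small random sway of the lower joint
  elbow.write(random(60, 90));      // wind up
  delay(random(400, 900));          // vary the pause as well
  elbow.write(random(10, 25));      // swing down to hit the acrylic
  delay(random(300, 600));
}
```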

Then I worked on the wooden body. This went smoothly and was pretty uneventful. The lack of tools in the maker lab made the process take longer than it needed to, but I managed to do it according to my timeline.

[Image: wooden body]

When I was done with the wooden body, I attached the arm onto it.

When I did a test run, it was doing okay; the only problem was the chisel/knife issue I mentioned above. Then I installed the threaded inserts at the corners of the top of the box to secure the 1/16 inch acrylic sheet with bolts.

At first I wanted this piece to be battery powered, complete with an on/off button at the bottom. But when I tried using a 9V battery, it wasn't strong enough to run the two servos. So I asked Kate for help and learned that it's more about the current than the voltage. I put four AA batteries in series and tried to run it again; it was still not enough. Kate suggested adding another four AA batteries in parallel with the first four: the voltage stays at about 6 V, but the two packs can supply roughly twice the current. And it worked!

However, the batteries couldn't last long enough, and the servos started getting weaker after a couple of minutes. It was a hard decision, but I had to compromise and use a cable to power the robot from the wall outlet.

For the show, I originally wanted to add delay() so the servo motors would get breaks in between and not overheat. But when I added delays, the motors didn't run the same way they did without the delay.
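
In hindsight, one way to give the motors breaks without changing the feel of each stroke would be to gate whole run/rest phases with millis() instead of putting delay() inside the stroke itself. A sketch of that idea (not the exhibited code; all pins and timings are illustrative):

```cpp
#include <Servo.h>

Servo elbow;                          // joint that swings the hammer head

const unsigned long RUN_MS  = 60000;  // strike for one minute...
const unsigned long REST_MS = 30000;  // ...then rest for 30 seconds
unsigned long phaseStart = 0;
bool resting = false;

void setup() {
  randomSeed(analogRead(A0));
  elbow.attach(10);
}

void loop() {
  unsigned long now = millis();
  if (resting) {
    if (now - phaseStart > REST_MS) { // rest finished, resume striking
      resting = false;
      phaseStart = now;
    }
    return;
  }
  if (now - phaseStart > RUN_MS) {    // run finished, park the arm and rest
    resting = true;
    phaseStart = now;
    elbow.write(90);
    return;
  }
  elbow.write(random(60, 90));        // wind up
  delay(random(400, 900));
  elbow.write(random(10, 25));        // strike the acrylic
  delay(random(300, 600));
}
```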

Code: https://github.com/nilampwns/experiment5

Video: 

Visuals:

[Images: photos of the finished piece]

Circuit Diagrams: 

[Image: experiment 5 wiring diagram]

Project Context:

We are constantly surrounded by technology that does tasks for us. When machines do not carry out their tasks properly, they are deemed broken. Can we co-exist with machines that are not designed to serve us humans? What kind of emotion is evoked by seeing a machine hit itself over and over? Are we projecting our feelings onto non-living objects? These were the questions that lingered in my mind when I decided to create this robot.

I believe that robots have the potential to teach us humans how to empathize again. That belief is what drove me to graduate school, and it drives my life in general. The piece I created for experiment 5 is, in some way, a part of my long-term research in life: can robots teach us humans to empathize with a non-living being, and ultimately, with each other?

There have been multiple studies that ask participants about the possibility of robots getting hurt, or in which participants are asked to hurt a robot. As Kate Darling (2015) wrote in her research paper, "Subjects were asked how sorry they felt for the individual robots and were asked to chose which robot they would save in an earthquake. Based on the subjects preferring the humanoid robots over the mechanical robots, Riek et al. postulate that anthropomorphism causes empathy towards robotic objects."

People tend to empathize more with robots that look like them. I wanted to see how far I could remove my piece from anthropomorphization, and push it even further by making a robot whose whole purpose is to hit itself. That's why I made the body a plain wooden cube with visible microcontrollers; the only thing that makes it look a bit anthropomorphized is the robotic arm.

The act of self-harm is jarring. It's emotional. It's a sensitive topic. But what does self-harm mean to a robot that cannot feel pain? It has no sensors that detect pain.

Not so much about self-harm, but Max Dean's piece "Robotic Chair" (2008) is a chair that breaks itself apart into multiple parts, then slowly searches for its missing pieces and puts itself back together again, autonomously. Viewers' reactions were positive: "As stated further on the Robotic Chair website, the chair has been shown to elicit 'empathy, compassion and hope' in its viewers" (Langill, 2013).

I acknowledge that my approach is very philosophical. The title of the work itself, "What Is My Purpose?", is a question that even we humans have not found the answer to yet. I hope to make people think that it's okay for machines to not have a purpose, to not serve us, to still exist around us, and for us to still empathize with them. That way, maybe humanity could learn to empathize more with each other.

Exhibition Reflection:

The exhibition was lovely. I got to talk to so many people about my research, receive feedback and reactions, or simply chat with them. Unfortunately, the servos of my piece overheated and gave out 30 minutes into the show. I thought it was the Arduino, so I unscrewed the piece and hit the reset button, but that didn't change anything. I let it cool down for a couple of minutes, but that didn't work either. I had a video of the piece running, so I showed it to people who were interested in seeing it. Thankfully, it was running when the show was busiest.

People told me they liked my work. I asked them what they liked about it, and a couple of them said they found it funny and silly. Some said they could feel the frustration of the robot. Some felt so empathetic that they felt bad watching the robot hit itself over and over. One person even called it an "idiot robot". It was a mixed bag of reactions, but most people enjoyed the piece.

References:

Darling, Kate. "Empathic Concern and the Effect of Stories in Human-Robot Interaction". With P. Nandy and C. Breazeal. Proceedings of the IEEE International Workshop on Robot and Human Communication (RO-MAN), 2015.

Dean, Max. “Robotic Chair”. 2008.

Langill, Caroline Seck. “The Living Effect: Autonomous Behavior in Early Electronic Media Art”. Relive: Media Art Histories. MIT Press. 2013.

Project Proposal: What is my purpose?

Project Title: What is my purpose?

Work by: Nilam Sari

Project Description: 

This project will be a small part of my thesis project. It is going to be a 5 x 5 x 5 in wooden cube with microcontrollers inside that control its arm to repeatedly stab itself with a chisel, slowly chipping away its own body. The purpose of this project is to evoke an emotional reaction from its viewers.

Parts / Materials / Technology list: 

  • Wooden case (hard maple)
  • 3D printed arm parts
  • Chisel
  • Arduino Uno
  • 2 Servo motors

Work Plan:

[Image: experiment 5 timeline]

Physical Installation Details:

The work will be battery powered, autonomously moving, non-interactive, and sitting on top of a pedestal. There will be an on/off switch.

Resource List:

One narrow pedestal to hold a 5 x 5 x 5 in cube at display height (around hip height).

Information for Open Show:

Would like to borrow the pedestal from the graduate studies. Work can be shown anywhere in the gallery or the 7th floor space.

Fire Pit

Project Title: Fire Pit
Names of group members: Nilam Sari, Neo Chen, and Lilian Leung

Project Description

This experiment is a continuation of "Campfire", Nilam and Lilian's first project from Experiment 1. The experiment expands on the topic of conversation: from the importance of face-to-face communication over the use of mobile devices, to exploring the power of anonymity and distance with a platform that lets participants type their negative feelings into a text box and literally throw their negative emotions into a virtual fire.

Project Progress

November 8

The team worked together to update the imagery of the flame, exploring different shapes and beginning to style the text box and images. One of the references we pulled from was aferriss's (2018) Particle Flame Fountain. Taking their source code, we revised the structure of the fire to connect it with user presence from PubNub.

[Image: particle flame fountain test]

(Source: https://editor.p5js.org/aferriss/sketches/SyTRx_bof)

Then we implemented it in our interface.

[Image: the flame in our interface]

November 9

The team added more effects, such as a new fire-crackle sound that emphasizes throwing a message into the fire when users send one. We attempted to create an array that would cycle through a set of prompts or questions for participants to answer. Rather than having the message stay on screen and abruptly disappear, we also added a timed fade to the text so users can see their message slowly burn.

November 10

We worked on the sound effect for when messages are "thrown" into the fire, and managed to combine two files into one for the sound we wanted.

We changed the way text is submitted: you swipe up with your finger rather than pressing a button. Along with the new fire-crackle sound, the fire temporarily grows bigger every time it receives an input. The input text box resets itself after every message is sent.

We tried to add an array of text for questions/prompts, but at first we weren't able to display a random selection of the questions. When random() was used, the following error message showed up:

“p5.js says: text() was expecting String|Object|Array|Number|Boolean for parameter #0 (zero-based index), received an empty variable instead. If not intentional, this is often a problem with scope: [https://p5js.org/examples/data-variable-scope.html] at about:srcdoc:188:3. [http://p5js.org/reference/#p5/text]”

The random() now works by calling random(question) on the array itself, rather than generating an index with let qIndex = random(0,3). In p5.js, random(0,3) returns a float, so using it directly as an array index produced the empty value in the error above. Now a random question appears every time the program is opened, and it is randomized again every time the user submits an input.
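
The fix above is p5.js-specific, but the underlying idea translates to any language: pick a random element directly, or floor the random index to an integer, instead of indexing with a float. A small C++ analog, with made-up prompt strings:

```cpp
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <string>
#include <vector>

int main() {
    std::srand(static_cast<unsigned>(std::time(nullptr)));
    const std::vector<std::string> questions = {
        "What is troubling you?",
        "What do you want to let go of?",
        "What is on your mind right now?"};
    // Equivalent of p5's random(question): pick a uniformly random
    // element, using an integer index rather than a raw float.
    const std::string& prompt = questions[std::rand() % questions.size()];
    std::cout << prompt << '\n';
}
```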

November 11

The sound of a log of wood being thrown into the fire was added to the program; every time a new input is sent, the sound plays. We also changed the CSS of the input box to match our general aesthetic. We then added a prompt to tell people how to submit their message instead of using 'enter'.

[Image: prompt above the text input]

November 12

We tried changing the fire's colour when the text input is sent and adding a dissolve effect for the text.

[Image: fire colour change and text dissolve]

What we have so far:

  1. The more people who join the conversation, the bigger the fire becomes
  2. The text fades out after a couple of seconds, leaving no trace of history
  3. The fire changes colour and plays a sound when new input is thrown into it
  4. Swiping up sends the message to the server and into the fire
  5. A prompt appears just above the text input

To communicate our concept, we felt that producing a short film for our documentation would be the most effective. To do this, and to be able to record content individually, we created a moodboard of the visual style we wanted to use for our film.

[Image: film moodboard]

Our final step was to put the film together as both presentation and documentation.

Project Context

This experiment is a continuation of "Campfire", Nilam and Lilian's first project from Experiment 1. The original concept for the piece was a multi-screen experience focused on coming together and the value of being present and having a face-to-face conversation. Participants were asked to place their phones on a foam pedestal that lit all the screens to recreate a digital campfire.

[Images: "Campfire" from Experiment 1]

Switching the concept, we explored the use of distance and presence online, and the practice of speaking into the void when sharing content online; for example, on Twitter or Tumblr, users can post their thoughts without any expectation of a response. People's personal posts range from vague to incredibly revealing, and are a method of venting as well as of personally working through one's own thoughts.

In her book Alone Together: Why We Expect More from Technology and Less from Each Other (2011), Sherry Turkle writes about how technology can be seductive because of what it offers human vulnerabilities: digital connections and networks provide the illusion of companionship without the demands of friendship. Our platform is best used on a mobile device because of the intimacy of a personal handheld device.


Our idea was to have participants land on our main screen with a small fire. The size of the fire depends on the number of participants logged on, though the number of individuals is hidden to maintain anonymity. Participants can't tell exactly how many users are online, but they are aware there is a body of people based on the size of the flame. Turkle (2011) writes about anonymity, compared to face-to-face confession, as having an absence of criticism and evaluation. Participants are able to take their thoughts and throw them into the fire (by swiping or dragging with the mouse) as both a cathartic and purposeful gesture.


Participants can see their message on screen after submission and watch it burn in the flame. When participants swipe or drag the screen, there is the sound of a wood block and a fire crackle as the message gets sent, metaphorically feeding the flame. The colour of the fire changes on send as well.

Other participants on the platform can temporarily see submitted messages on their screens; the change in the fire both signals interaction with the fire and encourages others to submit thoughts that are troubling them as well. Rowe, in "Write It Out: How Journaling Can Drastically Improve Your Physical, Emotional, and Mental Health" (2012), describes how journaling and writing out one's emotions has been shown to reduce stress by allowing people to open up about their feelings.


Once the message is sent and burnt, there is no trace of the message anywhere: no history is stored in the program or in PubNub. It is as if the thoughts the user wrote and threw into the digital fire become digital ashes. This is both symbolic and literal in terms of leaving no digital footprint. While PubNub allows developers to record users' IP addresses and information, we chose not to record any of the users' information.

This work wasn't directly inspired by, but harks back to, PostSecret, a community mail project created by Frank Warren in 2005 that allowed people to mail in confessions and secrets anonymously, which were then hosted online.

[Image: fire response]

Project Video https://vimeo.com/373640297

Project Code on Github https://github.com/nilampwns/experiment4

References:

aferriss. (2018, April). Particle Flame Fountain. Retrieved from https://editor.p5js.org/aferriss/sketches/SyTRx_bof. 

Audio Joiner (online audio tool). https://audio-joiner.com/

Rowe, S. (2012, March-April). Write it out: how journaling can drastically improve your physical, emotional, and mental health. Vibrant Life, 28(2), 16+. Retrieved from https://link.gale.com/apps/doc/A282427977/AONE?u=toro37158&sid=AONE&xid=9d14a49b

Turkle, Sherry. Alone Together : Why We Expect More from Technology and Less from Each Other, Basic Books, 2011. ProQuest Ebook Central, https://ebookcentral.proquest.com/lib/oculocad-ebooks/detail.action?docID=684281.

Yezi, Denise. "Maybe You Need a Tree Hole Too". May 3, 2010. https://anadviceaday.wordpress.com/2010/05/03/maybe-you-need-a-tree-hole-too/

I Might be Just a Text on Your Screen

Project Title: I Might be Just a Text on Your Screen
Names of Group Members: Nilam Sari

Project Description:

This experiment is a tool to help someone who is experiencing a panic attack. "I Might be Just a Text on Your Screen" walks the interactor through their panic attack with a series of texts and an interactive human-hand-shaped piece of hardware to hold on to.

Work in progress:

The first step I took was to compose the text that would appear on the screen. The text is based on a personal guide I wrote to myself for getting through my own panic attacks.

[Image: the composed text]

The reason I included the part that says "I might be just a text on your screen, with a hand made out of wires" is that I don't want this to be a tool that pretends to be something it is not. It is, in fact, just a piece of technology that I, a human, happened to create to help people who are going through this terrible experience I've had.

The next step was to figure out how to display the text. That was a learning curve for me, because I had never worked with Strings in Processing before. I learned how to display text from Processing's online reference. It was fine until I ran into the problem of making the text appear letter by letter, as if it were being typed out on the computer.

At first I thought I had to separate the text character by character, so I watched a tutorial on Dan Shiffman's channel and ended up with this:

[Image: code splitting the text into characters]

But making the characters appear one by one meant I had to use millis(), so I did that and ended up with this:

[Image: code using millis()]

But it didn't work the way I wanted it to. Instead of displaying the characters one by one, the program just froze for a couple of seconds and then displayed all the characters at once. So I went through more forums, found a simpler way to do it without using millis(), and incorporated it into my file.
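
The project itself is written in Processing; as a rough illustration of the frame-based approach, here is a self-contained C++ console sketch (the message and speed are made up). Each pass of the loop acts like one frame of draw() and reveals at most one new character, so nothing ever blocks:

```cpp
#include <chrono>
#include <iostream>
#include <string>
#include <thread>

int main() {
    const std::string message = "I might be just a text on your screen.";
    const int framesPerChar = 3;  // reveal one character every 3 frames
    int frame = 0;
    std::size_t shown = 0;
    while (shown < message.size()) {
        ++frame;
        if (frame % framesPerChar == 0) {
            std::cout << message[shown++] << std::flush;  // one new character
        }
        // roughly 60 "frames" per second, like Processing's draw() loop
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    std::cout << '\n';
}
```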

After I got that done, the next step was to build the physical component. I wanted it to act as an extension of the computer that lets the interactor navigate through the text. I bought a 75-cent pair of gloves and stuffed one. Then it was time to work on the velostat: I tested it with a fading LED and mapped the light to the amount of pressure on the velostat.

[Image: velostat test with LED]

I followed the step-by-step guide on Canvas, and the velostat test worked fine: the light gets brighter when more pressure is put on the velostat and dimmer when there is less. I used the same setup, but instead of mapping, I used multiple thresholds between 0 and 1023 so the program knows when the sensor is pressed at different pressures.
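
A minimal Arduino-style sketch of that threshold approach (the pin and threshold values are illustrative, not the project's actual numbers):

```cpp
const int SENSOR_PIN = A0;  // velostat in a voltage divider (illustrative pin)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(SENSOR_PIN);  // raw value, 0..1023
  // Coarse thresholds instead of map(): enough to tell
  // a hard squeeze from a light one from no squeeze at all.
  if (reading > 800) {
    Serial.println("hard squeeze");
  } else if (reading > 400) {
    Serial.println("light squeeze");
  } else {
    Serial.println("no squeeze");
  }
  delay(100);
}
```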

I slipped the velostat into the glove, so when you squeeze the 'computer's hand', it activates the velostat. I went to Creatron to get a heat pad to put inside the glove to mimic body heat; it's powered by the Arduino's 5V pin.

[Image: stuffed glove with velostat and heat pad]

The next step was to figure out how to go through the text page by page. I had trouble figuring this out, so I asked Nick about it. He suggested creating an int pageNumber and incrementing it at the end of every message. I added a timer with millis() to create a couple of seconds of buffer before the page changes. It worked wonderfully.
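
The project does this in Processing; here is the same pageNumber idea sketched in Arduino-style C++ for consistency with the other examples (the threshold and buffer values are illustrative). A squeeze advances the page at most once per buffer period:

```cpp
const int SENSOR_PIN = A0;             // velostat input (illustrative pin)
const unsigned long BUFFER_MS = 2000;  // ~2 s buffer between page turns
int pageNumber = 0;
unsigned long lastTurn = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  bool squeezed = analogRead(SENSOR_PIN) > 400;  // illustrative threshold
  if (squeezed && millis() - lastTurn > BUFFER_MS) {
    pageNumber++;                      // advance to the next message
    lastTurn = millis();
    Serial.print("page ");
    Serial.println(pageNumber);
  }
}
```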

There were a couple of hiccups here and there while I was programming, but taking a break and coming back to it helped me solve most of the problems I haven't mentioned above.

After everything was set, I put the wires together and soldered them onto a small protoboard.

[Images: soldered protoboard]

Link to code on Github: https://github.com/nilampwns/experiment3

Documentation:

[Images: documentation photos and circuit diagram]

Project Context:

The idea for this project came from my own experience dealing with panic attacks. Panic attack symptoms include physical sensations such as breathlessness, dizziness, nausea, sweating, trembling, and palpitations, as well as cognitive symptoms like fear of dying or going crazy. Some people who suffer from panic disorder experience this more than four times a month. Medication and therapy can help treat panic disorder, but not everybody has access to those things. This project is not a tool to replace medical treatment of panic disorder; however, it can be a helpful tool that walks one through a panic attack when no one else is around to assist, because it can get really scary to deal with this on your own.

When I used to suffer from constant panic attacks, I kept in my wallet a piece of paper with instructions on how to get through a panic attack on my own. Those instructions are messages from me to the version of myself having a panic attack, and they inspired the text that appears on the screen. I thought, if a piece of paper could help me get through my own panic attacks, then an interactive piece would be a step up from that.

Technology has been used to help people with mental health issues, especially on smartphones. Smartphone apps provide useful functions that can be integrated into conventional treatments (Luxton et al., 2011). There are already apps that help people with their anxieties, such as Headspace (2012), which guides meditation, and MoodPath (2016), an app that helps people keep track of their depressive episodes.

[Image: app screenshots]

(Left: Headspace (2012); right: MoodPath (2016))

However, I don't want this tool to appear as something it is not. I don't want this project to pretend it understands what the interactor is going through. In the end, this is just a string of code that appears on your screen, along with a physical interactive piece made of wires.

This reminds me of a point Caroline Langill made about Norman White's work: "… an artwork that heightens our awareness of ourselves and our behaviours by emulating a living thing rather than being one." My piece performs empathy and offers companionship without knowing that that is what it is doing. So if interactors feel they are being empathized with, is the empathy this project offers real? Is it real empathy the interactor is feeling, or merely the illusion of empathy from a machine? Sherry Turkle asked this question in her book "Reclaiming Conversation", raising the concern of technology replacing actual human contact. I don't want this project to replace treatment or help from other people and society; rather, it is a tool to close a gap of our own making: mental health resources not being widely available to those who need them.

References

Langill, Caroline S. "The Living Effect: Autonomous Behavior in Early Electronic Media Art". Media Art Histories. MIT Press, 2013.

Luxton, David D.; McCann, Russell A.; Bush, Nigel E.; Mishkind, Matthew C.; and Reger, Greg M. "mHealth for Mental Health: Integrating Smartphone Technology in Behavioral Healthcare". Professional Psychology: Research and Practice, 2011, Vol. 42, No. 6, 505–512.

Turkle, Sherry. "Reclaiming Conversation: The Power of Talk in a Digital Age". New York: Penguin Press, 2015.

The Red Field

Project Title: The Red Field
Project By Arshia Sobhan, Jessie Zheng, Nilam Sari, Priya Bandodkar

Project Description

Our project is an experiment with light sequences. The piece is meant to be hung on a wall and is activated when a viewer walks past it. The light sequences change based on the interaction the viewer has with the piece. We used mirror board and a reflective clear acrylic sheet to create infinite reflections for a more immersive illumination.

Ideation Process

The idea for the project went through a series of evolutions. At the start, we jotted down ideas that could potentially be built upon and/or combined. We tried to make the users' interaction experience as rich as possible, even with the limited number and categories of tools available to us.

[Image: ideation notes]

Eventually, we agreed to build something simpler given the limited timeframe, yet still experimental and aesthetically pleasing, so we could practice our design skills as well as familiarize ourselves with physical electronics basics and the Arduino programming language. Inspired by an LED light cube video on YouTube, we brainstormed ideas for a variation of it that incorporates users' physical interactions with the lights and sensors as an essential part of the project. To make sure the project was mostly finished before the day of the presentation, we made a schedule to keep us on track, since we only had about 5 days to make the project.

Project Schedule

[Image: project schedule]

Based on the distance sensor input and the LED light output, we explored the possible combinations of how they could relate to each other. Initially, we hoped to use 3 distance sensors, so that each sensor's distance would control a separate attribute of the LED lights, for example brightness, blink rate, or blink pattern.

The idea behind our project was to collaboratively control the display of the light in the box in the same way DJs mix their music. Based on this idea, we created a light panel and a controller as the main part of the piece.

Work in progress

Day One (Monday, 7 Oct)

We established 4 modes: idle mode, crazy mode, meditating mode, and individual mode. To generate more interesting behavior patterns for the LEDs, we soldered 4 groups of LEDs (red, blue, purple and green) together, leaving one group (8 LEDs, marked in yellow) in the center unsoldered, in an attempt to have individuality in the midst of unity in the LED patterns. To further increase the aesthetic appeal, we decided to put an infinity mirror behind the LED lights so the lighting effects would be enhanced and amplified, as if there were infinitely many blinking and fading LEDs.

panel-light-pattern

Day Two (Tuesday, 8 Oct)

We divided the coding into 4 parts: Arsh, Priya and Nilam each designed one mode of lighting patterns for the four soldered groups of LEDs, while Jessie designed a separate set of behaviors for the unsoldered group.

We regrouped a few times to adapt our code for maximum clarity in users' interactions with the sensor. Using the same thresholds became important, since we were each working individually on our own code and combining it all at the end. We tested different sensor values to arrive at the final threshold numbers.

Day Three (Wednesday, 9 Oct)

To hide the distracting wires on the back of the LED lights, we designed and laser cut a box to encase the LED panel as well as the wires at the back. We also designed a pyramid with the 3 sensors at the center of each side, for users to interact with to control the lighting behaviors and patterns. However, we realized that having 3 sensors would significantly slow down the execution of the code. Eventually, we decided to use only 1 sensor and map different physical distance ranges to different behaviors for the LED lights (a sketch follows the photos below).

[Images: laser-cut box and sensor pyramid]
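
A minimal sketch of that single-sensor range mapping (the pin, the ranges, and the printed mode names are illustrative stand-ins for the real mode functions):

```cpp
const int SENSOR_PIN = A0;  // MaxBotix analog output (illustrative pin)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int distance = analogRead(SENSOR_PIN);  // raw 0..1023 reading
  // Each physical distance range triggers one of the four modes.
  if (distance < 150) {
    Serial.println("individual mode");    // hand very close
  } else if (distance < 350) {
    Serial.println("crazy mode");
  } else if (distance < 600) {
    Serial.println("meditating mode");
  } else {
    Serial.println("idle mode");          // nobody nearby
  }
  delay(50);
}
```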

With our code close to finished, we started soldering the 4 groups of lights together so we could test the code on the LED panel and see whether the light patterns worked well together, since they had been written by different people. We soldered the lights in parallel rather than in series, so that if one light burns out, it won't affect the other LEDs soldered into the same group.

[Image: soldering the LED groups]

To achieve the infinity mirror effect, we got reflective acrylic from the plastic shop at the main building. We used mirror-like acrylic for the base layer behind the LEDs, and clear transparent acrylic coated with a reflective layer as the cover for the box. We struggled a bit while coating the cover acrylic, as air bubbles got between the acrylic and the coating. Still, it looks good with all the physical elements combined.

Day Four (Thursday, 10 Oct)

On the final day before the presentation, we combined and finalized our code. Problems occurred as we did: Arsh's and Priya's code wouldn't work together, and at first we couldn't figure out why. Having consulted Nick, we learned that a pin can be driven by either digitalWrite() or analogWrite(), but not both at the same time. We adjusted our code accordingly and solved the issue.

With Arsh, Priya and Nilam finished with their code, Jessie had trouble making the 8 unsoldered individual LEDs blink one after another by setting different blink rates in an LED array. However, the 4 groups of LEDs already blinked and faded in a coherent, unified manner with Arsh's, Priya's and Nilam's code. We decided to let Jessie continue working on her part: if she worked it out before the presentation, we would have more interesting light patterns; if she couldn't, the LED panel worked well as it was, and she could keep working on it afterwards.

Day Five (Friday, 11 Oct)

Jessie eventually made the 8 individual LEDs behave the way she wanted. Unfortunately, there wasn't enough time to assemble the lights before the presentation, so we presented the piece as it was. During the presentation, Nick offered some insight into the psychology of human behavior and possible interactions with our LED panel. He encouraged us to think about how we could use this to our advantage, for example by discarding the sensor pyramid completely and hiding the distance sensor somewhere in the main body of the LED panel. Users would then get closer to the panel to find out what triggers the lighting behaviors, and have a more intimate, physical experience with it.
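
We don't reproduce Jessie's final code here, but a common Arduino pattern for this kind of staggered blinking is to give each LED its own millis() timer (pins and rates below are illustrative):

```cpp
const int NUM_LEDS = 8;
const int pins[NUM_LEDS]            = {2, 3, 4, 5, 6, 7, 8, 9};
const unsigned long rates[NUM_LEDS] = {200, 300, 400, 500, 600, 700, 800, 900};
unsigned long lastToggle[NUM_LEDS]  = {0};
bool ledState[NUM_LEDS]             = {false};

void setup() {
  for (int i = 0; i < NUM_LEDS; i++) pinMode(pins[i], OUTPUT);
}

void loop() {
  unsigned long now = millis();
  for (int i = 0; i < NUM_LEDS; i++) {
    if (now - lastToggle[i] >= rates[i]) {  // this LED's own interval elapsed
      ledState[i] = !ledState[i];
      digitalWrite(pins[i], ledState[i] ? HIGH : LOW);
      lastToggle[i] = now;
    }
  }
}
```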

Project Adaptation After Presentation 

After the presentation, we received an important piece of feedback: with a separate controller, the physical distance between the piece and the controller might impede natural interaction, because the controller limits the physical space participants can play and experiment with, essentially only allowing users to wave their hands around it like zombies. Over the break, we changed the display and concept of our project.

The new piece is meant to be hung on a wall and is only activated when a viewer walks past it. In the new version, the idle state of the wall piece is completely dark: it won't show any reaction until someone walks past and activates it. Once the piece is activated and has received the viewer's attention, the light sequences on the wall piece change depending on how the viewer interacts.

This new version plays on the concept of proxemics, and tries to minimize or even eliminate the space between viewers and the collaborative aspect. We thought that with this new concept, more focus would be placed on people's relationships with the space around them.

Video of interaction

Link to Code

https://github.com/arshsob/Experiment2

Documentation

[Image: portfolio photo of the piece]

[Image: LED board Fritzing diagram]

Technical Challenges

Due to our very limited experience with Arduino and coding, we faced several technical challenges along the way.

The first issue occurred when we were trying to control the brightness and the blink rate of the LEDs at the same time. We learned that we can't use analogWrite() and digitalWrite() on the same LED pin simultaneously. The issue was resolved by adding a little more code and changing all digitalWrite() calls to analogWrite().
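
A minimal sketch of that fix (pin number and brightness value are illustrative): both blink and brightness are expressed through analogWrite(), so the pin is never driven by two mechanisms at once.

```cpp
const int LED_PIN = 9;  // must be a PWM-capable pin
int brightness = 180;   // 0..255, e.g. derived from the distance sensor

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  analogWrite(LED_PIN, brightness);  // "on" phase at the current brightness
  delay(250);
  analogWrite(LED_PIN, 0);           // "off" phase, instead of digitalWrite(LOW)
  delay(250);
}
```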

The second issue happened when we connected the LED board to the Arduino. At the test stage, the data coming from the distance sensor was reasonably smooth when there were only 4 LEDs, each connected to an output pin. After connecting the LED board, the distance data fluctuated wildly, making it impossible to interact with the piece. This fluctuation was a result of electrical noise from the many wires connected to the board.

[Image: fluctuating distance sensor data]

As suggested by MaxBotix, we added two components to our board to filter the noise: a 100 Ω resistor and a 100 µF capacitor.

[Image: noise filter circuit]

Adding these components stabilized the distance data significantly and resolved the issue.

Finally, to amplify the brightness of the LEDs, we used a transistor for each LED group. Without them, the LEDs were too dim to demonstrate the fade effect tied to distance changes.

After modifying the idea in response to the presentation feedback, the effect displayed when someone passes the box was another challenge, since it was supposed to happen only once after a distance change was detected by the sensor. Using a variable to store the time of the sudden change, plus several conditions over the duration of the fade-in/fade-out effect, we resolved the issue. However, there seemed to be some conflict among those conditions, causing minor flickers during the effect.
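
A simplified sketch of that one-shot logic (thresholds, pin numbers, and durations are illustrative): the time of the sudden change is stored, and the fade is derived from elapsed time so the effect plays exactly once.

```cpp
const int LED_PIN = 9;               // PWM pin driving one LED group
const int SENSOR_PIN = A0;
const unsigned long FADE_MS = 1500;  // duration of each fade phase
unsigned long triggerTime = 0;
int lastDistance = 0;
bool active = false;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  lastDistance = analogRead(SENSOR_PIN);
}

void loop() {
  int d = analogRead(SENSOR_PIN);
  if (!active && abs(d - lastDistance) > 100) {  // sudden change = pass-by
    active = true;
    triggerTime = millis();                      // store the trigger time
  }
  lastDistance = d;

  if (active) {
    unsigned long t = millis() - triggerTime;
    if (t >= 2 * FADE_MS) {
      active = false;                            // effect plays only once
      analogWrite(LED_PIN, 0);
    } else if (t < FADE_MS) {
      analogWrite(LED_PIN, map(t, 0, FADE_MS, 0, 255));            // fade in
    } else {
      analogWrite(LED_PIN, map(t, FADE_MS, 2 * FADE_MS, 255, 0));  // fade out
    }
  }
  delay(20);
}
```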

Several attempts to use a sine function for this (deriving an angle from the time elapsed since the sudden change and limiting it between 0 and PI) failed, due to unnatural and uncontrolled behaviour of the output.

Project Context

Harry Le's 8x8x8 LED Arduino cube project and The DIY Crab's DIY infinity mirror coffee table, both on YouTube, gave us the inspiration for this project.

Philippe Hamon said, within the context of architecture, "Every building, once completed, concretizes a sort of social and natural proxemics". This applies to the existence of most objects, including artworks. Interactive artwork, in particular, adds a new element to the relationship between the artwork and its viewers. Our work, "The Red Field", is meant to nudge passers-by to pay more attention to the objects around them.

People are more likely to interact with objects that react to them. In idle mode, "The Red Field" mimics a still mirror until the sensor picks up motion. Once the sensor detects a person passing by (via the change in distance), the wall piece plays a short light sequence: a random blinking effect with a pleasant fall-off, subtly creating a notion of "I just saw you pass by". At the same time, the quick blinking sequence draws the passer-by's attention, creating a sense of curiosity.

Once the piece grabs one viewer's attention, it draws other people's attention as well. One of our goals is to get people to interact with the piece collaboratively, creating a sensual co-existence. People adjust the distances between each other based on their social activities, but distances are also used to raise defense mechanisms when others intrude into their spaces (Hall, 1969). The size of the piece requires participants to share a relatively small space, encouraging them to move into each other's personal spaces. We encourage people to get close to each other while interacting with our work, but we are also interested in seeing how participants who don't know each other well behave in close proximity when they are all drawn to the same object.

Through physical interaction with the piece, participants gain aesthetic pleasure and gratification from the lighting patterns their actions trigger. After adapting the piece, we encased the sensor together with the LED panel so it wouldn't be easily seen. The idea is for participants, driven by curiosity, to freely experiment with the piece and try to figure out the mechanism behind it. As Costello and Edmonds (2007) put it in their study of play and interactive art, stimulating "playful audience behavior might be a way of achieving a deep level of audience engagement." We build on this concept in our interactive piece to obtain engagement and entertainment. Participants eventually adapt to the ways the LEDs behave, and gain a sense of gratification from learning how the piece works. This reward keeps them invested in the experience throughout the interaction. Furthermore, with this acquired knowledge, participants could go on to use the piece for more advanced performances, such as making the LEDs react cohesively to music.

References

Costello, B. and Edmonds, E. "A Study in Play, Pleasure and Interaction Design". ACM, New York, 2007.

Le, Harry. "8x8x8 LED Cube with Arduino Uno". YouTube. https://youtu.be/T5Aq7cRc-mU. Accessed October 18th, 2019.

The DIY Crab. "DIY Infinity Mirror Coffee Table". YouTube. https://youtu.be/OasbgnLOuPI. Accessed October 18th, 2019.

Hamon, Philippe. "Expositions: Literature and Architecture in Nineteenth-Century France". Trans. Katia Sainson-Frank and Lisa Maguire. Berkeley: U of California P, 19.

Hall, E. T. "The Hidden Dimension". Anchor Books, New York, 1969.

https://dl-acm-org.ocadu.idm.oclc.org/citation.cfm?id=1314168