
The Observer Part II

PROJECT 4


The Observer Part II by Sara Gazzaz

DESCRIPTION

This project collected data over eight hours on how many people approached an art piece and how close each observer came to it. I chose this concept because of my interest in how people interact with art in different ways and in different environments. The piece was installed at the entrance of 205 Richmond Street West, and I set out to explore how people at OCAD interacted with it.

 

How It Started

Informative posters are put up around OCAD, on hallway walls and in different rooms, to inform or remind us about workshops, deadlines and so on. I've noticed that people often pay these posters little attention; there are so many of them, and they are such a familiar part of the environment, that they frequently go unnoticed. Rarely have I seen someone stop and stand to read one.

From there, I wanted to move in another direction and look at how people deal with art pieces.
As an artist, I find the sense of touch an important element of viewing art: I prefer to touch a piece and feel the texture and the layers of paint.

Not all galleries permit visitors to touch artwork, and when I asked several people, not all of them cared about touching the pieces; some felt they would need permission first.

Realizing that touch is not everyone's preferred interaction and that people observe art in different ways, I decided to collect data over an eight-hour period on a single day to see how close people got to my painting, how many approached it, and at what times of day they did. Looking into these different preferred ways of viewing art made the project even more interesting.

I tried several approaches in deciding what kind of art piece to use for this project.
Here are some images of the first attempts:

This first idea came out of observing whether people read the informative posters around campus. I listed ordinary phrases such as
"Come CLOSER-this is just another stoopid poster!" and thought about how people would approach words on a wall-mounted art piece similar to a poster. The double "o" in 'stoopid' was also a play on words: the ultrasonic sensor could be built into the piece in their place.
I then chose the wording in the image above because I wanted a subtler, more interesting phrase that I thought people could relate to, while still using the ".." as the spot to embed the ultrasonic sensor.

 

This was my piece for playing with the idea of touching a painting. The dotted hand print was meant to invite individuals to place their hand in that area; for this version, the sensor would have been a pressure pad.

 

This was the final chosen piece, which I believe was best suited to my concept of simply seeing how differently people interact with an art piece.

 

Watch “The Observer Part II” on Vimeo:

https://vimeo.com/193088364?ref=em-share

Music: Starover Blue – “A Flower In Space”

Technology

Hardware

List of components and materials used:

Mixed Media Art Piece on Canvas
1 ESP8266 Huzzah Feather
Breadboard
Small Cardboard box
Portable USB Power Pack
Resistor (10 kΩ)
Conductive Wires
Ultrasonic sensor
Velcro
Shoe Print Signs (Way-Finding Signs)

 

How It Works

LOCATION: I hung the art piece at the entrance of 205 Richmond Street West because I wanted a high-traffic spot passed by everyone who accesses the building. It was placed on a wall perpendicular to a big mirror at street level. The floor area where a person would stand to observe the piece was small, which suited the setup well: nobody would pass in front of the sensor unless they intended to look at the piece, which helped avoid misreadings. The position and the mirror let people view the piece from a distance, whether they were coming through the entrance doors, out of the elevator, or up and down the stairs, and invited them over to view it directly if they wanted to.


 

I also placed cut-outs of shoe prints on the floor as way-finding signs, another way of inviting people over to look at the art piece.


 

One ultrasonic sensor was attached to the top of the labelled cardboard box underneath the painting. Every time a person approached to observe the art, a new value was registered and sent to Adafruit IO.
Using IFTTT, the readings travelled from the Adafruit feed to a Google Drive spreadsheet on my account. 92 readings were registered, corresponding to the people who approached the piece on the 23rd of November between 14:27 and 22:30. The data recorded each person's proximity to the painting; a value was registered only when the distance was between 0 and 30 cm.
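For reference, here is a minimal sketch of how a reading like this can be measured on the Feather and published to an Adafruit IO feed over MQTT using the Adafruit_MQTT library. This is an illustration rather than the exact project code (which is on GitHub below); the Wi-Fi credentials, pin numbers, and the feed name "distance" are placeholders, and the choice of the MQTT library is an assumption.

```
// Minimal sketch: read an HC-SR04 ultrasonic sensor on an ESP8266 Feather
// and publish the distance (in cm) to an Adafruit IO feed over MQTT.
// Credentials, pins and the feed name are placeholders.
#include <ESP8266WiFi.h>
#include "Adafruit_MQTT.h"
#include "Adafruit_MQTT_Client.h"

#define WLAN_SSID  "your-ssid"
#define WLAN_PASS  "your-password"
#define AIO_SERVER "io.adafruit.com"
#define AIO_PORT   1883
#define AIO_USER   "your-adafruit-username"
#define AIO_KEY    "your-adafruit-io-key"

const int TRIG_PIN = 12;   // placeholder pins
const int ECHO_PIN = 13;

WiFiClient client;
Adafruit_MQTT_Client mqtt(&client, AIO_SERVER, AIO_PORT, AIO_USER, AIO_KEY);
Adafruit_MQTT_Publish distanceFeed(&mqtt, AIO_USER "/feeds/distance");

float readDistanceCm() {
  // Trigger a 10 microsecond pulse and time the echo.
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // time out after 30 ms
  return duration * 0.034 / 2.0;                   // convert to centimetres
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  WiFi.begin(WLAN_SSID, WLAN_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(500);
}

void loop() {
  if (!mqtt.connected()) mqtt.connect();
  float d = readDistanceCm();
  // Only register observers standing within 30 cm of the piece.
  if (d > 0 && d <= 30) {
    distanceFeed.publish(d);
    delay(2000);  // crude debounce so one visit isn't counted many times
  }
  delay(100);
}
```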


The cardboard box shown above was covered with a label showing details of the painting and housed all the wiring and power. The power supply, the microcontroller (Huzzah ESP8266), and the data wires for the ultrasonic sensor were secured to the breadboard on the back of this box.

 

View of Project

 

 

Circuit Diagram

 

Circuit diagram: Feather with ultrasonic sensor input

Software

Code available on GitHub 

https://github.com/saragazz/theobserverII.git

 

 Coding + Challenges

I began by getting my MAC address so that Nick could give me access to OCAD's Wi-Fi. After that I set up an Adafruit IO account and used a reference code example to connect to my account and publish my readings to my feed.

I started working on the code for the ultrasonic sensor from a reference example, adjusting it to decide when the distance between the observer and the art piece should be registered. I first experimented at home with different thresholds. I started with a threshold of 2 between movements in front of the sensor, then realized I was getting far too many readings per person, so I increased the threshold to 10.
I also used an "if" statement to classify movements between 0 and 30 cm as "close" and between 30 and 100 cm as "far". A simplified version of that logic is sketched below.
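The sketch below is a simplified illustration of the thresholding and classification described above, not the exact project code (which is on GitHub); the pins and the threshold value are placeholders.

```
// Simplified illustration of the threshold and close/far classification.
// Pin numbers are placeholders; the real project code is on GitHub.
const int TRIG_PIN = 12;
const int ECHO_PIN = 13;
const float THRESHOLD = 10.0;    // minimum change (cm) to count as a new movement
float lastDistance = -1000.0;    // start far outside the valid range

float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  return pulseIn(ECHO_PIN, HIGH, 30000) * 0.034 / 2.0;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  float d = readDistanceCm();
  // Ignore readings that barely differ from the previous one, so a single
  // person standing still is not registered over and over.
  if (fabs(d - lastDistance) >= THRESHOLD) {
    lastDistance = d;
    if (d > 0 && d <= 30) {
      Serial.print("close: ");      // this is the case sent to Adafruit IO
      Serial.println(d);
    } else if (d > 30 && d <= 100) {
      Serial.print("far: ");
      Serial.println(d);
    }
  }
  delay(100);
}
```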


 

The Conclusion:

The data collected in the spreadsheet was visualized in Adobe Illustrator CS6 as a line graph. The X-axis of the graph represents time and the Y-axis represents distance. In total, 92 people approached the piece; they appear as the vertical lines.

I then visualized the same data in the abstract painting below. The width of the canvas was divided into one-hour intervals, with clusters of circles floating vertically in each. Each circle represents a person who approached, and its size represents their distance from the piece: the bigger the circle, the closer that person was to the artwork.


 

 

Future Iterations:

I would take this project further by testing and researching eye-tracking systems. It would definitely be interesting to develop the piece further and see how people actually view it when they are looking at it.

References + Case Studies:

How people observe art.

The Art of Looking: How Eleven Different Perspectives Illuminate the Multiple Realities of Our Everyday Wonderland

http://www.huffingtonpost.com/james-elkins/how-long-does-it-take-to-_b_779946.html

 

Marcelo

Creation & Computation Experiment 3

Sara Gazzaz and Mudit Ganguly


‘Project: Marcelo’ – converts paint strokes into a language

 


Introduction

Marcelo is a messaging tool that uses brushes as input devices. It was developed to help users engage in nonverbal communication over long distances.

We aimed to convert the physical action of painting into messages that could be sent across the internet to another user, who would then read the message and reply using a brush as well. Since a brush is limited in the kinds of strokes it can make, we needed a language built from limited inputs; Morse code proved to be the simplest and most efficient answer.

The system is controlled by an Arduino, which reads data from a flex sensor built into the brush. The data is sent to P5.js, which relays it across the internet using PubNub. On the receiver's end, P5.js picks up the information and displays it in the console, where the receiver translates it in real time.


SKETCHES


Target Audience + USES

The possible uses of such an interface can be in performance art, couples therapy, long distance communication and interactive art practices.

How It Works

Hardware:

2 X Arduino Micros
2 X Flex Sensors
2 X LED strips (blue)
2 X Breadboards
2 X Laptops
2 X 9 Volt Power Adaptors
2 X Transistors
2 X Diodes
4 X 10k Resistors
2 X Briefcases
2 X Brushes
2 X Plastic wrap
2 X Micro USB cables
Felt
Paint & Paint Containers
Glue Gun
Canvas Boards
Notepads
Pens
Painting Paper
Conductive Wire
Paints
Wood

We broke apart a brush and inserted a flex sensor into it along with the bristles. Each flex sensor was covered in protective plastic wrap to keep it safe from paint and water. The sensors were attached to the breadboard with long wires that ran inside a wooden handle we built: a wooden cylinder that we sawed in half and hollowed out to make space for the wires to pass through the brush.
To hold the bristles and flex sensor in place and attach them to the handle, we wrapped them in felt, which covered the whole mechanism and gave users a comfortable grip.



The LED strips were placed along the rims of the briefcases and provided feedback to the users: whenever the flex sensor was bent beyond a certain point, the strip lit up to indicate that a sensor value had been registered. The strips were attached to the breadboard as well.

The LED strips required more voltage than the Arduino could provide, so a 9-volt external power supply attached to the breadboard powered the strips while the Arduino was powered over a micro USB cable. The Arduino was connected to a laptop running both the Arduino code and the P5.js sketch.
Two laptops were used as part of the system to demonstrate how two people would communicate with each other; each displayed data through the console log in P5.js.

The briefcase is where all these components came together: the LED strips that lit up, the laptop, and, hidden underneath the laptop, all the other components. A white foam board placed over the laptop held the canvas in place and covered the keyboard so that only the screen was visible. The paint containers, holding two colours of paint (blue and yellow), were glued onto the board.


 

Circuit Diagram


System Architecture

Software CODE:

 

Arduino Code for Flex sensor 1, available on GitHub:

https://github.com/Flutter2016/Project-Marcello/blob/master/Arduino%20Code%20for%20Mudit’s%20Brush

Arduino Code for Flex sensor 2
P5.JS Code

 

The Arduino Micro on the breadboard runs simple firmware that reads the sensor values, converts them, and prints them to the serial output. Since each flex sensor has its own baseline values, we had to develop separate code for each sensor.

The sensor values are converted into either a 'dash', a 'dot', or a 'blank' depending on the direction in which the bristles of the brush are pressed. The result is packed as a message and written to the Arduino's serial output.



This packaged data is then picked up by the P5.js sketches we ran simultaneously. P5.js needed an additional program (the P5 Serial Monitor) running so that it could pick up the data packets the Arduino was sending.

P5.js then displays the incoming data in its console log. Once the data is in the log, it is sent over the internet via PubNub to the second laptop, which is also running P5.js, and the 'dots', 'dashes', and 'blanks' are displayed in that laptop's console log. This is the basic interaction.

Note: Users refer to a key diagram that maps the English alphabet to Morse code, both to decode the messages sent to them and to look up the strokes required to send the messages they intend to send.

Morse code reference chart

Once the second user received the Morse code, they translated it to decipher the message and then replied using the same mechanism.
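For reference, a few entries of that key (the standard International Morse alphabet) can be written out as a small lookup table. In the project the translation was done by the users themselves, so the sketch below is purely illustrative.

```
// A few entries from the Morse key the users referenced. In the project the
// translation was done by hand; this table and sketch are illustrative only.
struct MorseEntry { char letter; const char *code; };

const MorseEntry MORSE_KEY[] = {
  {'A', ".-"},   {'C', "-.-."}, {'H', "...."}, {'I', ".."},
  {'O', "---"},  {'P', ".--."}, {'S', "..."},  {'T', "-"},
  {'U', "..-"},  {'Y', "-.--"},
};

// Print the Morse code for one letter, e.g. printMorse('H') prints "...."
void printMorse(char letter) {
  for (const MorseEntry &e : MORSE_KEY) {
    if (e.letter == letter) {
      Serial.print(e.code);
      Serial.print(' ');
      return;
    }
  }
}

void setup() {
  Serial.begin(9600);
  // Example: the greeting "HI" becomes ".... .."
  printMorse('H');
  printMorse('I');
}

void loop() {}
```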

Coding

Arduino Code:
We began by calibrating the flex sensor. We found the complexity and responsiveness of its movement compelling and thought it would translate well to the movements of a painter. We divided the range of values into three sections that register either a 'dot', a 'dash', or a 'blank', the fundamental symbols used in Morse code. We then incorporated the LED strips and made them light up whenever a 'dot' or a 'dash' was registered. Finally, we coded the values to appear in the Arduino's serial monitor.
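As a rough illustration of that logic (not the exact firmware linked above), the structure looks something like this; the pin numbers and threshold values are placeholders that would come from each sensor's calibration.

```
// Simplified sketch of the flex-sensor-to-Morse logic. Pin numbers and
// thresholds are placeholders; each brush's sensor needs its own calibration.
const int FLEX_PIN = A0;       // flex sensor in a voltage divider with a 10k resistor
const int LED_PIN  = 9;        // pin driving the transistor for the LED strip

const int DOT_THRESHOLD  = 600;  // bend in one direction -> 'dot'
const int DASH_THRESHOLD = 400;  // bend in the other direction -> 'dash'

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int value = analogRead(FLEX_PIN);   // 0-1023

  if (value > DOT_THRESHOLD) {
    Serial.println("dot");
    digitalWrite(LED_PIN, HIGH);      // feedback: a symbol was registered
  } else if (value < DASH_THRESHOLD) {
    Serial.println("dash");
    digitalWrite(LED_PIN, HIGH);
  } else {
    Serial.println("blank");          // resting range between the thresholds
    digitalWrite(LED_PIN, LOW);
  }
  delay(200);
}
```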

P5.js Code:
In P5.js we had to set up the sketch to read the Arduino's serial data from the particular port on that laptop. We also added the PubNub integration using our own publish and subscribe keys. Finally, we designed the code so that only incoming information from the other laptop was displayed in the P5.js console. This was important because we didn't want users to confuse the messages they sent with the messages they received: User 1 only saw incoming messages from User 2 and vice versa.

Network Communication

There were two communication links involved in the project: Arduino to P5.js, and P5.js to PubNub.

Arduino to P5.js
P5.js needs an additional program (the P5 Serial Monitor) running so that it can pick up the data packets the Arduino sends. This setup has a lot of requirements and even then is not the most stable link. The only way to troubleshoot it was to restart the Arduino, the P5 Serial Monitor, and P5.js, and then hope for the best; it was hit-and-miss, with the software being temperamental.

P5.js to PubNub
PubNub gives users a set of credentials to insert into the P5.js code to enable internet connectivity. Even with these credentials in place we faced multiple obstacles; thanks to Marcelo Luft, we managed to overcome them and send the data packets over the internet.


Performance

On the day of the presentation the class was divided into two groups separated by a whiteboard, which acted as a barrier so that the two sides could be treated as separate rooms. We then began communication between two users (Sara and Mudit).

We shared messages like 'Sup', 'Yo' and 'Hi'. We then allowed our classmates to try it out.
Ania sent the message 'Cat' to Orlando.

Case Studies

http://www.mudam.lu/en/expositions/details/exposition/yuri-suzuki/

Yuri Suzuki is a sound artist, designer and electronic musician whose recent work explores the physical and technological characteristics of sound production, an interest that has arisen since the loss of the music library stocked in his laptop when the hard drive crashed.
For his Royal College of Art graduation show in 2008, he presented work which involved an innovative way of playing conventional vinyl records, including Sound Chaser (a miniature electric circuit constructed from pieces of old records on which small cars circulate and transmit sound) and the Finger Player, a transmitter handled like a thimble, enabling the physical experience of the retransmission of sound by running a finger along a record.
Suzuki's intention is "to raise public awareness of the way in which sound and music is produced", and in most cases this occurs through performances and workshops requiring public participation. For the Mudam Summer Project he is therefore presenting workshops led by invited artists and creators that tackle a variety of themes, such as learning the basic principles of electronic music and the creation of sound pieces using transformed objects.

https://www.tiltbrush.com/

 

Don’t Spill


By: Shreeya Tyagi, Sara Gazzaz

DON’T SPILL
A mobile water-balancing game designed to raise awareness of water scarcity in rural parts of the world.

Project Context:

Balancing water in a container is a common practice in many parts of the world. Yet nearly 1 in 10 people worldwide (twice the population of the United States) live without clean water. The majority live in isolated rural areas and spend hours every day walking to collect water for their families. Not only does walking for water keep kids out of school and take up time that parents could use to earn money, but the water often carries diseases that can make everyone sick. Access to clean water means education, income and health, especially for women and kids.

In Africa alone, women spend 40 billion hours a year walking for water.

Access to clean water gives communities more time to grow food, earn an income, and go to school — all of which fight poverty.

 

Project Description:

Based on the above, we wanted to create a playful game that lets 20 users take on the task of balancing water on their heads. We wanted the project to create an experience in which users can step into the shoes of people who carry water on their heads for miles in drought-stricken areas of the world.

We did this by creating a body-balancing game. Each participant receives a water bottle with instructions to open a link on their phone. They are told to take a headpiece in their preferred size (small, medium or large) and insert their phone horizontally into the sleeve on the headpiece.
The link opens a screen of moving water over a background of drought-affected land. Participants are asked to walk while balancing the water on their heads, taking care not to spill it; each time they spill, the water level decreases accordingly. We incorporated sound to enhance the visual effects.
Players are scored on how much water they have lost, shown by the number of droplets remaining at the top left of the screen. They can then refer to the water bottle they were given to see the percentage of water lost.

 

Presentation of Prototype at Project Critique:


 

 


 

CODE:

https://github.com/saragazz/dontspill.git

LINK:

https://webspace.ocad.ca/~3158859/Don’t%20Spill/

 

SCREENSHOTS of TESTING CODE + CODING CHALLENGES:

Screenshot of our initial gyroscope tilt test.

Phase I – Gyroscope Data

It was difficult to find code to access the gyroscope through P5.js and incorporate it to drive the movement of a virtual container.

The image above shows our initial test. We were testing the accelerometer to see how it responds to tilting the phone sideways as well as back and forth. After this test we increased the tilt sensitivity so that it would cause more spilling, since we are dealing with water.

 

Phase II – Introducing Sine waves to a container 

Sine waves are regular 2D waves; we used them to simulate a moving fluid in our virtual container. We had never used sine waves before, and it was a challenge to create a water-like effect with them.

Phase III – Using Sine waves inside a container
It was a challenge to make the wave and the container move together.
In our second test we constrained the water to a container instead of letting it fill the whole screen. It worked well, but we had to switch to a horizontal layout and fill the screen with water so that it would be more visible; when tested, the container was too small to understand.
Another problem with this trial was that water itself is complex, with many attributes. Realistic simulations bond water particles to one another in a particle-system class, coding far beyond what we had learnt. Since we had no background in coding visual water spillage, we considered using GIFs that would appear whenever someone spilled water due to imbalance, but we dropped the idea because it didn't let users feel the flow and movement of the water.

 


 

Phase IV – Using the phone as a container

The fluid effect is a 2D sine-wave simulation; it would have been interesting to do a 3D simulation, and that will be our next challenge as we develop the game idea further.

Our third test tried to fill the screen with water, drawing the waves with small circles that made the lines flow across the screen.

 

 


Phase V – Using the phone as a container space for the fluid simulation
Working across platforms was a huge challenge: our sketch worked fine on iOS but gave trouble on various Android devices.

Below is an image of the first container, which we planned to photograph to visualize spillage for the GIF idea. The video was our reference for how we wanted the spill to look.


We tried to take videos of water actually spilling from this container, which we coloured with blue paint.
The picture also shows a ground surface we made out of clay, painted and carved to look like land that has been through droughts.

 

WEARABLE PIECE:

From the start we knew we wanted to use the act of balancing in some way, so we decided on a headpiece that users could insert their phone into.
At first we brainstormed a vertical sleeve that would attach to the headpiece with a strong clip-like structure, but we changed this to a horizontal orientation because it used more of the mobile screen and made it easier for participants to see what was happening on their fellow participants' screens.
Before settling on the idea of participants interacting by watching the water-balancing act on each other rather than on themselves, we wrestled with the phone's placement so that users could see how well they were balancing the water. One idea was to make the phone face the user by extending an arm from the headpiece toward the front of the face, holding a mirror clipped in place that reflected the phone on the forehead. Another option was to hang the phone itself in a sleeve in front of the face at eye level. See Photo 1.0.

We eliminated the mirror idea, and the whole idea of the user seeing how they are balancing the water, because it was not our goal. The point of the project is that when people balance water on their heads they cannot see it; the main intent was for the 20 screens and users to interact with each other and guide each other on how well they were balancing the water.

Photo 1.0

Photo reference for how we wanted the headpiece to look.

Testing the fit of the headpiece, which proved irritating and uncomfortable because of the phone's hard surface against the forehead.

 

Using a sponge to make the headpiece more comfortable for the user by creating a cushion for the phone to rest on against the forehead.
Attaching the elastic band to the plastic sleeve with staples for strength.

 

Testing the water level from high to low.

Water bottle labels designed with instructions to open the given link on participants' phones.
Cutting and pasting score droplets representing the water percentage of the actual bottle.

 

 

REFERENCES:

https://www.charitywater.org/whywater/

 https://en.wikipedia.org/wiki/Carrying_on_the_head

http://mdgs.un.org/unsd/mdg/Resources/Static/Products/Progress2012/English2012.pdf

http://www.unicef.org/eapro/JMP-2010Final.pdf

http://google.github.io/liquidfun/

https://www.youtube.com/watch?v=scBOyNCQ8gQ

LUX – Interactive Light Installation

BY: Afaq Ahmed Karadia, Mahsa Karimi, Sara Gazzaz


Project Description:

We used the following:

Input: Potentiometer
A potentiometer, informally a pot, is a three-terminal resistor with a sliding or rotating contact that forms an adjustable voltage divider. If only two terminals are used, one end and the wiper, it acts as a variable resistor.

Output: LED – Light Emitting Diode
An LED is a semiconductor device that emits visible light when an electric current passes through it. The light is not particularly bright, but in most LEDs it is monochromatic, occurring at a single wavelength.

Material: String
Stretchable fishing wire.

LUX is a light art installation in which participants interact with the piece by manipulating the brightness of the light via potentiometers.
It has two inputs: each potentiometer affects the brightness of the LED lights on its own, and turning both simultaneously produces a different effect again, because the code also uses the difference between the two potentiometer values to determine the brightness.


Circuit Diagram:


Code:

https://drive.google.com/open?id=0B80PUMaQ0z4fOUxEbmF4NGMzbG8


Mood board:


 

Sketches:



Design Files:



Video:


Project Context:

Interactive light installations have become an extremely pervasive phenomenon in contemporary design. Light is a profound element in our lives. Through a design perspective, it is incredible to think about the effect it can have on people’s perceived experience in a given space.

Interacting with light and seeing immediate, real-time effects on a captivating visual stimulus is intriguing. It is a sought-after experience that designers in industries from entertainment to retail take advantage of.

With an abundance of options to choose from, people are becoming more immersed in products and experiences they can respond to and experiment with. As advancing technology rapidly supports this need, designers everywhere are integrating interactive lighting into their products and experiences.

Image vs Light

Figurative vs Abstract
Surface vs Space
Light phenomena: Reflection, Diffraction, Transparency, Occlusion and Shadows

 

Process Journal:

Input: Potentiometer

Output: LED

Material: String

We started by writing code to adjust the brightness of an LED by turning a potentiometer. Initially the LED went through three brightness cycles during one complete turn of the potentiometer; to solve this, we had to map the analog input range (0-1023) to the analog output range (0-255).

To make the installation more interactive, we decided to use two potentiometers. Each potentiometer affects the brightness of the LED lights on its own, and turning both simultaneously produces a different effect again, because the code also uses the difference between the two potentiometer values to determine the brightness.

We had chosen string as the material for this project. To figure out what type of string would be best, the group brainstormed possible outcomes.

At first we decided to treat the LED strips themselves as "strings", since that seemed like a good way to combine our material and our output. Later on, we realized this would limit what we could do with the project and that we should experiment with different strings that would give us more options for playing with the LED light.

To make the LED brightness correspond to the difference between the two potentiometer values, we included the math library so we could compute that difference in our code.

LED strips need a higher voltage to turn on than the Arduino kit can supply, so we used an external power source (a 9V DC adaptor) and a transistor to switch the higher voltage from our kit. A simplified version of the brightness logic is sketched below.
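As a rough sketch of that brightness logic (not the exact code linked above), with placeholder pin numbers:

```
// Simplified sketch of the LUX brightness logic: two potentiometers control
// an LED strip driven through a transistor on a PWM pin. Pins are placeholders.
const int POT1_PIN = A0;
const int POT2_PIN = A1;
const int LED_PIN  = 9;   // PWM pin driving the transistor for the LED strip

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int pot1 = analogRead(POT1_PIN);   // 0-1023
  int pot2 = analogRead(POT2_PIN);   // 0-1023

  // Map each reading from the 10-bit input range to the 8-bit PWM range.
  int level1 = map(pot1, 0, 1023, 0, 255);
  int level2 = map(pot2, 0, 1023, 0, 255);

  // Brightness is driven by the difference between the two knobs, so turning
  // them together behaves differently from turning one knob alone.
  int brightness = abs(level1 - level2);

  analogWrite(LED_PIN, brightness);
  delay(20);
}
```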

Initially we designed cubes and prisms and wanted to attach the LED strips to the structure. We ended up using a hollow structure that outlined the shapes, which added to the "stringy" feel of our project, and chose stretchable fishing wire as our material because it lets more light through and creates different effects when light shines through it.

We would like this piece to become part of a bigger installation: hung on a wall next to others like it, it would create a visually playful art installation.

Images Journal:

