A Circle of Lights!

 



Image: Stills from our experiment at different phases of the project

 

GROUP MEMBERS

Catherine Reyto | Masha Shirokova | Sananda Dutta 

 

PROJECT DESCRIPTION 

A Circle of Lights is a kinetic sculpture based on the concept of a children’s mobile that draws users into a colourful play of reflections and shadows. It consists of transparent acrylic geometric shapes intended to reflect the LEDs and project them onto the surrounding area (walls, floor, ceiling).

Participants interact with the mobile by moving in and out of the proximity range of its three sensors. All three sensors are connected to the servo, which in turn spins the whole sculpture. By varying their distance, participants can affect the speed of rotation and activate the lights (the closer they get, the faster it spins!)

 

PROJECT CONTEXT

”It’s just beautiful, that’s all. It can make you very emotional if you understand it. Of course, if it had some meaning it would be easier to understand, but it’s too late for that.”  – Marcel Duchamp

We were inspired by the mobile work of the artists Alexander Calder, Alexander Rodchenko and Edgar Orlaineta. From our Digital Fabrication class, we also took inspiration from a recently completed project by our classmate Arsh. He had laser-cut the elegant shapes of Arabic script into three sheets of acrylic, then layered the sheets together to demonstrate the beautiful reflections produced under direct or diffused light sources.


Image: Black Gamma by Alexander Calder

 


Images: Hanging Spatial Construction No.11 and Oval Hanging Construction No.12 by Alexander Rodchenko


Image: Solar Do (It Yourself) Nothing Toy, After Charles Eames by Edgar Orlaineta

 


Images: Artwork by our classmate Arsh (Arshia Sobhan Sarbandi), made for his Digital Fabrication course assignment.

What characterizes all types of mobiles is that they rely on balance and movement to achieve their artistic effect. They are composed of a number of elements, often abstract shapes, interconnected with wires, strings, metal rods or similar objects, whatever serves best in maintaining constant movement while in a state of suspension. They represent a form of kinetic sculpture because, unlike traditional sculptures, they do not remain static but are literally mobile, set in motion by air currents, a slight touch of an infant, or even, as in our case, a small motor.

Through the sequential attachment of additional objects, the structure as a whole consists of many balanced parts joined by lengths of fishing line, whose individual elements are capable of spinning when prompted by the servo’s propulsion or by direct contact. Because of the shapes’ weight (we deliberately opted for a material with some density, rather than, say, card stock), gravity helps naturalize their animated movement through space with a bit of bounce and retraction.

While classic mobiles are manipulated by air and space, our idea was to upgrade the concept with Arduino and servo functions so users could actually interact with the objects. Depending on their location, participants could activate one, two or all three sections of LEDs and alter the speed of the servo.

 

PROCESS

Wary of time constraints but eager to get our hands dirty with electricity, we set out to combine basic but, for novices, daunting Arduino functions (a sensor-reactive servo and LEDs). Drawing on our combined experience as graphic designers, we planned to incorporate this circuitry in a way that could trigger interactive movement of an object containing simple but colourful shapes.

Initial ideas – Before starting on the ideation process, we worked on building primitive circuits to better understand the principles of working with Arduino. Having no prior experience, we felt we needed to get a better sense of our bearings in order to set a benchmark for what might actually be feasible to build in a short time-frame. Once we’d gained some familiarity, we attempted to combine various modes to see how many sensors we could use at once. Our process with Arduino depended heavily on the learnings from our latest classes, which covered getting LEDs to blink in relation to a sensor’s threshold limits, running servos on a timer, and blinking multiple LEDs at alternate times.

From a product standpoint, all of our initial ideas involved the creation of an interactive art piece that would combine the sensors and LEDs. We discussed assembling a circuit of lights that could be elegantly diffused behind a thin-papered painting. We explored the concept of a movement-responsive garden, where the LEDs could be arranged in the pattern of a flower’s petals; these would light up once proximity to an object was detected by the sensor. We then attempted to transfer this concept onto a cubic mesh as well, wherein sensors placed at the vertices of the cube would detect an object in their radar and light up particular sections of the cube. These ideas were explored while we had misunderstood the description of the assignment: we thought there was a constraint of using either LEDs or servos, not both working together. After this detail was clarified, we felt an increased freedom to incorporate the servo motor as well. We opted to increase the level of challenge by combining a servo motor with synchronized LEDs. As a group, we also wanted to mix both platforms in order to gain experience in how these applications work together.

 

MATERIALS AND ASSEMBLY

Tools

  • Laser-cutting
  • Soldering

Material

  • Acrylic sheets
  • Fishing tackle (line, swivels)

We began by laser-cutting the circular base that would support the suspended objects beneath it (also laser-cut acrylic) and hold the breadboard circuitry on its surface. We also laser-cut patterns from fluorescent acrylic sheets for an added dimension of light reflection. These shapes were attached with fishing line (chosen for both its transparency and its strength), which was in turn attached by means of swivels around the rim of the acrylic disk we’d designed to hold the breadboard and circuitry (soon nicknamed ‘the bomb’ because of its resemblance to one).

 


Images (L-R from top to bottom): Live scenes from laser cutting lab, final cut-outs for hanging elements in the mobile, servo base with points for attaching the breadboard base, breadboard base with LEDs arranged on them.

We then designed a smaller disk (the yellow circle in the above left image) to sit at the very top of the mobile, firmly attached to the servo’s propeller piece. The triads of holes are for threading the topmost strings of the mobile. The hole at the centre is for the screw that attaches to the motor head.

To achieve the goal of a mobile with interactive light, we opted to solder our LEDs into three separate parallel circuits, assigning each section to one of our three sensors respectively. Each circuit consists of 4 LEDs, and each is assigned its own pin on the Arduino. The desired outcome is that the LEDs would light up once the corresponding sensor detected a disturbance within the threshold limits of its radar.
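A minimal sketch of this sensor-to-section mapping is shown below. The pin numbers and the threshold value are illustrative assumptions (the final code is linked in the Coding section); the structure simply mirrors the description above, with each of the three rangefinders gating one parallel LED section.

```cpp
// Hypothetical pin assignments; the real values live in the project repo.
const int sensorPins[3] = {A0, A1, A2};  // PW outputs of the three rangefinders
const int ledPins[3]    = {9, 10, 11};   // one Arduino pin per parallel LED section (A, B, C)
const long threshold    = 1500;          // pulse width treated as "within radar"; tune per sensor

void setup() {
  for (int i = 0; i < 3; i++) {
    pinMode(sensorPins[i], INPUT);
    pinMode(ledPins[i], OUTPUT);
  }
}

void loop() {
  for (int i = 0; i < 3; i++) {
    // pulseIn() returns the sensor's pulse width; it grows with distance
    // (the exact scale depends on the rangefinder model).
    long reading = pulseIn(sensorPins[i], HIGH);
    bool disturbed = (reading > 0 && reading < threshold);
    digitalWrite(ledPins[i], disturbed ? HIGH : LOW);  // light section A/B/C
  }
}
```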


High hopes: Planning the animation of the LED circuits

 

CODING

Main 

  1. Parallel Circuit
  2. Servo+Sensor
  3. Servo Sensor
  4. Servos Multiple LEDs

The project being a fairly open experiment, we were able to explore freely with physical materials, LEDs, sensors and motors. But freedom came with a price; it quickly became challenging to carve a clear path through what seemed like a constant stream of possibilities of what we could do (or where things could go wrong). Wary of our limited time-frame, we conceded to create something that we hoped would be reasonably possible to code and assemble given our limitations of time and knowledge. But we kept coming back to the idea of a proximity mobile (noun: a decorative structure that is suspended so as to turn freely in the air), as seen hanging over the cribs of babies to lull them to sleep. As mentioned above, ours would have the added feature of physical interaction. Though there were complexities in the concept, our strategy was to challenge ourselves to see if we could pull it off in time; even if we failed, we would still have a beautiful piece. It would be made up of various cut-out shapes carrying some small degree of weight, suspended by unobtrusive strings, with the additional feature of a parallel circuit of harmonized LEDs. The idea was that the light animation would come into effect in response to movement within the threshold limits of the sensors’ radar. Once participants moved within range, each of the three sensors would initiate the servo’s rotation (180 degrees in either direction).

 

Coding Syntax used

Libraries used – <Servo.h> and <animationTools.h>

float

Serial.begin

pulseIn

if() and else()

Oscillate (reference taken from https://github.com/npuckett/arduinoAnimation)
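Pieced together, these elements look roughly like the sketch below. It stands in for the oscillate() helper from the arduinoAnimation library with a plain sine sweep, and the pin numbers, ranges and speeds are placeholders rather than the values in our final code (linked below).

```cpp
#include <Servo.h>

Servo spinner;
const int servoPin  = 3;         // assumed servo pin
const int sensorPin = A0;        // PW output of one rangefinder
float phase = 0;                 // accumulated oscillation phase
unsigned long lastMs = 0;

void setup() {
  Serial.begin(9600);
  spinner.attach(servoPin);
}

void loop() {
  long reading = pulseIn(sensorPin, HIGH);   // raw pulse width, grows with distance
  Serial.println(reading);

  // Map proximity to oscillation speed: the closer the participant, the faster the spin.
  float speedScale = 0.1;                    // slow idle sway
  if (reading > 0 && reading < 1500) {
    speedScale = 1.0 - reading / 1500.0;
  }

  // Sweep back and forth across the servo's 180-degree range.
  unsigned long now = millis();
  phase += (now - lastMs) / 1000.0 * TWO_PI * speedScale;
  lastMs = now;
  spinner.write(90 + (int)(90 * sin(phase)));
}
```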

 

Final Code

https://github.com/sanandadutta/Circle-of-life.git

 



Images: Soldering the parallel LED circuit – the circuit is represented above and the end result shown below it.

As seen in the images above, three sections – A, B and C – were formed, each made up of 4 LEDs wired in parallel with one another, giving three separate parallel connections. Each section was assigned its own sensor pin, and each sensor covers a range of 120 degrees for interaction with any sort of physical disturbance.


Diagram A: Sensor mapping to the LEDs (Sensor 1 assigned to Section A, Sensor 2 assigned to Section B, Sensor 3 assigned to Section C)

 


Diagram B: This diagram is a front view of the entire setup. The servo motor is fixed to a small acrylic base, which in turn supports the larger acrylic base that hosts the breadboard circuitry (including the Arduino). The servo’s motion triggers a three-tier rotation: starting from the top, it pulls the breadboard base into an offset rotation, which in turn tows the strings of the mobile pieces into a 180-degree rotation.

 

TESTING

  • sensor with LED
  • sensor with servo
  • sensor with LED and servo
  • parallel circuit of LEDs

Link: https://vimeo.com/367269220

Sensor threshold 1, 2 then 3, with LED circuit  https://vimeo.com/367269685

Once we could read data from the sensor input, use it to turn on the LEDs, and had established an output response from the servo, it was time to start doubling (actually, tripling) up. We set up our prototype (so far consisting of an 8” diameter disk cut from foam-core) in a quiet, disruption-free room to test the range of our sensors together. Nick’s tip about taping distances to the floor came in very handy for this part, especially when testing where the threshold limit of each sensor overlapped with the next.


Image: Making contact – Configuring the sensors and their respective LED circuits

 

 

Testing the structural design

It was at this point in the testing phase that we started iterating on our prototype in terms of material design and overall physical mechanics. An 8” diameter disk meant a very small area and thus too much overlap between the sensor thresholds. This discovery led to a makeshift upgrade in the form of a piece of cardboard, cut to an 11” diameter, complete with a very precisely measured place-mark at the centre for the servo (all three of us tumbled down a rabbit hole in which the accuracy of this place-mark somehow became of utmost importance).

The mechanics of our material design were becoming complicated. We had managed to get our prototype in motion by fastening the servo to the underside, harnessing it with layers of electrical tape. Our device was now taking inputs from all three sensors (arranged on the large disk in an equilateral triangle), which activated the three LED bulbs serving as stand-ins for our soon-to-come parallel circuits, as well as the servo (thanks to the oscillation function from the Animation Tools library). But we anticipated a few issues with suspending decorative pieces from the base. For one thing, we would need a material sturdier than cardboard to support the weight, but the bigger concern was the jerky movement of the servo.
Because of the servo’s 180° limit, we were concerned that the back-and-forth rather than circular motion of the suspended objects might look awkward.

We wondered if we could increase the range of motion of the dangling pieces by means of an offset caused by gravity. To test this, we added a second tier to our prototype: a small circular base, where the servo would sit, that would in turn suspend the second, larger circle. The decorative pieces would hang from that larger disk. Putting our pooled knowledge of physics (limited to street smarts and common sense) to use, we guessed that the speed of motion would decrease with each level of suspension (from the first tier down to the lowest-ranking decorative pieces), but that the range of movement might appear to increase thanks to the pendulum effect. After testing this addition to our prototype showed positive results, we set out to design both the large and small circular bases in Adobe Illustrator, complete with hole placements for the threading, sensors and servo motor. We wouldn’t know whether the two-tiered system would actually work as we hoped until we attached the decorative pieces.

 


First sketches of the two-tier design

We also had to figure out a way of suspending the larger disk from the smaller, top-tier disk. We did so with fishing tackle, threading it in and out of laser-cut groupings of holes instead of cutting individual strands of fishing line, to make the length adjustable. The drawback to using this load-bearing translucent string is that trying to keep it in order is like herding cats, and that unyielding lack of control negated any flexibility in the system. In hindsight, we should have limited its use to serving as a measurement tool: once it had established the best distance between the two disks, we could have cut the strands accordingly and replaced them with individual strands of wire. As a last-ditch effort to streamline the design, we opted to shorten the distance between the disks, and though this did help avoid tangling, there was an oversight: the lack of slack on the suspension lines made our would-be meditative mobile look rather spastic in presentation.

Images: “Get a grip!” – Maneuvering the unwieldy fishing line through the suspension holes 


Links for testing:
https://vimeo.com/367268855

Final Link: https://vimeo.com/367905506


 

REFLECTION

As perhaps others in the class can attest, Experiment 2 was in many ways an exploration of restraints. With the mid-term pile-up of assignments and presentations in our respective classes, our initial challenge was figuring out when we could even find time to meet as a group. We were also learning as we went, and our ideation process depended on the material presented in upcoming class lectures. We would watch the videos from class, attempt to replicate what had been demonstrated, repeat the motions ourselves, then try to put these findings to use in creative ways. It was a little exasperating, but as a result we learned a valuable lesson about being independently resourceful.

While from the outset we all agreed that our project would need to be feasible to produce by the rapidly approaching deadline, it wasn’t easy to quell our shared enthusiasm for working with LEDs, motors and sensors for the first time. All three of us coming from graphic design backgrounds, we were simultaneously excited and distracted by colour and light sequencing. The mobile had been but one of many fanciful ideas. We wound up choosing it over the others because, although the complexity of the product design risked falling outside the scope of feasibility, it was hard to resist the challenge of making a moving artwork installation that responded to people as they approached it. Had we had more time together to sketch out a road map in the form of a detailed storyboard of how the design would be assembled, we might have gained much from researching solutions to our pain points instead of stumbling blindly into them like booby traps throughout the process.


Burning the LED circuits at both ends: final stages of assembly.  

Ultimately the main conflict was that there was a lot of new information being absorbed with too little time to move through cycles of practical ideation.  Instead, we brainstormed what might work then just rolled up our sleeves and hoped for the best.  We crossed bridges when we got to them, like how to suspend the Servo – which wound up being held up by a clamp, like pincers on its poor plastic temples, and fastened to a bar of LED ambient lights on the ceiling.  Another hurdle was how to extend the number of LEDS in the parallel circuit while still getting them to work in conjunction with the Servo rotation. We never did manage to resolve this but we were at least finally able to pinpoint the issue: The pressure on the small Servo to not just carry the weight of the entire three-layer assembly of acrylic objects, but also spin (read: thrash) them, was consuming a lot of voltage.  There was simply not enough power to go around (pun intended) to light the 12 LEDs of the circuit while the motor spun. Had we a little more time, we’d have opted to switch up the Servo for a larger motor, but we had already run out the clock on that part of production.  

A truly satisfying and memorable moment for us was when we succeeded in getting all three sensors responding at once, in conjunction with our three LEDs.  This was after testing the threshold limits of each and tweaking adjustments in the code for several hours. It was late in the evening when we saw all of this coming together on the spinning disk for the first time, and we had a group hug while looking on proudly at our achievement.  That did a lot to double up our drive in the remaining days. We’d proven to ourselves that we were actually capable of pulling off something that only a few days prior had seemed absurdly over our heads. None of us having any prior experience working with Arduino, electricity or much understanding of code fundamentals, it felt good to come that far in a short amount of time while working on something artistic and original.

Ultimately, our experiment was fatally flawed on two counts: 1) we should have resolved how to suspend the servo before getting started. We likely would have foreseen the issue had we mapped out the design in a storyboard, which could have proved invaluable either in finding a viable solution or, failing that, in leading us to eschew the concept altogether; and 2) there were simply too many moving parts to work through in too little time. The added layer of mechanical engineering that came with the material assembly meant a lot of questions about physics that we were unable to answer, partly because we barely knew how to ask the right questions, but mostly because we literally had our hands too full with learning to code for circuits of sensors, lights and motors.

That being said, all of our hard work did result in a beautifully decorative piece that, in spite of the jerky motion, did seem to captivate our classmates on presentation day in the way we had hoped. The potential peeked out of the iteration stage our device found itself in when we presented. Had there been more natural light in the room, the group might have been treated to a myriad of overlapping, colourful reflections on the floor and surrounding walls. But with the lights off, we were able to envision what a few LEDs among the spinning translucent decals could do to achieve a similar effect, emitting fractals of reflections across the ceiling that moved around as participants walked in and out of the threshold area below. In this respect, we felt we had achieved something more significant than a working product. We had attained a strong benchmark of iteration, one that opens doors for future designs for all three of us, working together or separately in our artistic design practice.

 

REFERENCES

  1. Culkin, J., & Hagan, E. (2017). Learn electronics with arduino : An illustrated beginner’s guide to physical computing. Retrieved from – https://ebookcentral.proquest.com
  2. Digital Futures’ GitHub by Nick Puckett – https://github.com/DigitalFuturesOCADU/CC19/tree/master/Experiment2
  3. Significance of Mobile Art – What is Art Mobile – https://www.widewalls.ch/mobile-art-mobiles-kinetic-art/

The Red Field

Project Title: The Red Field
Project By Arshia Sobhan, Jessie Zheng, Nilam Sari, Priya Bandodkar

Project Description

Our project is an experimentation with light sequences. The piece is meant to be hung on a wall and is activated when a viewer walks past it. The light sequences change based on the interaction the viewer has with the piece. We used mirror board and reflective clear acrylic sheet to create infinite reflections for a more immersive illumination.

Ideation Process

The idea for the project went through a series of evolutions. At the start, we jotted down ideas that could potentially be built upon and/or combined. We tried to expand the interaction experience of the users as much as possible, even with the limited number and categories of tools available to us.


Eventually, we agreed to build something simpler given the limited timeframe, yet experimental and aesthetically pleasing, so we could still practice our design skills as well as familiarize ourselves with physical electronics basics and the Arduino programming language. Inspired by an LED light cube video on YouTube, we brainstormed ideas for a variation of it that would incorporate users’ physical interactions with the lights and sensors as an essential part of the project. To make sure the project was mostly finished before the day of presentation, we made a schedule to keep us on track, since we only had about 5 days to make the project.

Project Schedule


Based on the distance sensor input and the LED output, we explored the possible combinations of how they could relate to each other. Initially, we hoped to use 3 distance sensors so that each sensor would control a separate attribute of the LED lights, for example brightness, blink rate or blink pattern.

The idea behind our project was to collaboratively control the display of the light in the box in the same way DJs mix their music. Based on this idea, we created a light panel and a controller as the main part of the piece.

Work in progress

Day One (Monday, 7 Oct)

We established 4 modes: idle mode, crazy mode, meditating mode and individual mode. To generate more interesting behaviour patterns for the LEDs, we soldered 4 groups (red, blue, purple and green) of LEDs together, leaving one group (8 LEDs, marked in yellow) in the center unsoldered in an attempt to have individuality in the midst of unity in the LED patterns. To further increase the aesthetic appeal, we decided to put an infinity mirror behind the LED lights so that the lighting effects would be enhanced and amplified, as if there were infinitely many blinking and fading LEDs.

Image: Panel light pattern

Day Two (Tuesday, 8 Oct)

We divided the coding into 4 parts, with Arsh, Priya and Nilam each designing one of the modes of lighting patterns for the four groups of LEDs that are soldered together, while Jessie designed a separate set of behaviours for the unsoldered group of LEDs.

We regrouped a few times to adapt our code for maximum clarity in users’ interactions with the sensor. Using the same thresholds became important when working individually on our own code and combining it all in the end. We tested different sensor values to arrive at the final threshold numbers.

Day Three (Wednesday, 9 Oct)

In order to hide the distracting wires on the back of the LED lights, we designed and laser-cut a box to encase the LED light panel as well as the wires at the back. We also designed a pyramid with 3 sensors placed at the center of each side, for users to interact with to control the lighting behaviours and patterns. However, we realized that having 3 sensors would significantly affect the speed of execution of the code. Eventually, we decided to use only 1 sensor for this project and to use different physical ranges to trigger different behaviours for the LED lights.


With our code close to finished, we started soldering the 4 groups of lights together so we could test the code on the LED light panel and see whether the light patterns, written by different people, worked well together. We soldered the lights in parallel rather than in series so that if one of the lights burned out, it wouldn’t affect the other LEDs soldered onto the same group.


To achieve the effect of the infinity mirror, we got reflective acrylic from the plastic shop at the main building. We used this mirror-like acrylic for the base layer of the LEDs, and used clear transparent acrylic coated with a reflective layer as the cover for the box. We experienced some struggles while trying to coat the cover acrylic, as air bubbles got between the acrylic and the coating. However, it still looked good with all the physical elements combined.

 

Day Four (Thursday, 10 Oct)

On the final day before the presentation, we combined and finalized our code to make sure the separate parts worked together. Problems occurred as we tried to do so: Arsh’s and Priya’s code couldn’t work together for a reason we couldn’t figure out. Having consulted Nick, we learned that the Boolean-state approach relies on a digital function and can’t be used at the same time as analogWrite(), since one pin can be driven with either digitalWrite() or analogWrite(), but not both at the same time. We adjusted our code accordingly to solve this issue in the end.

With Arsh, Priya and Nilam finished with their code, Jessie had trouble making the 8 unsoldered individual LEDs blink one after another, with different blink rates set in an LED array. However, the 4 groups of LEDs already blinked and faded in a coherent and unified manner with Arsh, Priya and Nilam’s code. We decided to let Jessie continue working on her code: if she worked it out before the presentation, we would have more interesting light patterns; if she couldn’t, the LED panel worked well as it was and she could keep working on it after the presentation.
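One common way to get that effect, sketched below with assumed pin numbers and intervals, is to give each LED its own entry in a set of arrays and toggle it from millis() rather than delay(). This is offered only as an illustration of the array-based approach described above, not the code used in the final piece.

```cpp
const int numLeds = 8;
const int ledPins[numLeds]             = {2, 3, 4, 5, 6, 7, 8, 9};                  // placeholder pins
const unsigned long intervals[numLeds] = {150, 250, 350, 450, 550, 650, 750, 850};  // ms per LED
unsigned long lastToggle[numLeds];
bool ledState[numLeds];

void setup() {
  for (int i = 0; i < numLeds; i++) {
    pinMode(ledPins[i], OUTPUT);
    lastToggle[i] = 0;
    ledState[i] = false;
  }
}

void loop() {
  unsigned long now = millis();
  for (int i = 0; i < numLeds; i++) {
    if (now - lastToggle[i] >= intervals[i]) {   // each LED keeps its own non-blocking timer
      lastToggle[i] = now;
      ledState[i] = !ledState[i];
      digitalWrite(ledPins[i], ledState[i] ? HIGH : LOW);
    }
  }
}
```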

Day Five (Friday, 11 Oct)

Jessie eventually got the 8 individual LEDs to behave the way she wanted. Unfortunately, there wasn’t enough time to assemble the lights before the presentation, so we presented the piece as it was. During the presentation, Nick offered some insight into the psychology of human behaviour and possible interactions with our LED panel. He encouraged us to think about how we could use this to our advantage, for example by discarding the sensor pyramid completely and hiding the distance sensor somewhere as part of the main body of the LED panel. Users would then get closer to it in order to find out what triggers the lighting behaviours and have a more intimate and physical experience with the LED panel.

Project Adaptation After Presentation 

After the presentation, we received an important piece of feedback: with a separate controller, the physical distance between the piece and the controller might impede natural interaction, because the controller limits the physical space participants can use to play and experiment, essentially only allowing users to wave their hands around it like a zombie. During the break, we made changes to the display and concept of our project.

The new piece is meant to be hung on a wall and only gets activated when a viewer walks past it. In the new version of our project, the idle state of the wall piece is completely dark. It won’t show any reaction until someone walks past it and activates the piece. Once the piece is activated and has received attention from the viewer, the light sequences on the wall piece change depending on the different kinds of interaction.

This new version plays on the concept of proxemics, and tries to minimize or even eliminate the space between viewers and the collaborative aspect. We thought that with this new concept, more focus would be placed on human relationships with the space around them.

Video of interaction

Link to Code

https://github.com/arshsob/Experiment2

Documentation


Image: LED board Fritzing diagram

 

Technical Challenges

Due to our very limited experience with Arduino and coding, we faced several technical challenges on our way.

The first issue occurred when we were trying to control the brightness and the blink rate of LEDs at the same time. We understood that we can’t use analogWrite and digitalWrite on the same LED set simultaneously. This issue was resolved by adding a little more code and changing all digitalWrite functions to analogWrite.
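A hedged sketch of that fix is below: once a pin is driven with analogWrite(), “off” and “on” become analogWrite(pin, 0) and analogWrite(pin, brightness), so brightness and blink rate can share the same (PWM-capable) pin. The pin and timing values are illustrative, not the project’s actual ones.

```cpp
const int ledPin = 9;                     // must be a PWM-capable pin
int brightness = 180;                     // e.g. derived elsewhere from the distance sensor
const unsigned long blinkInterval = 250;  // ms
unsigned long lastToggle = 0;
bool on = false;

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (millis() - lastToggle >= blinkInterval) {
    lastToggle = millis();
    on = !on;
    analogWrite(ledPin, on ? brightness : 0);  // blink and dim with the same function
  }
}
```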

The second issue happened when we connected the LED board to the Arduino. At the test stage, the data coming from the distance sensor was reasonably smooth when there were only 4 LEDs, each connected to an output pin. After connecting the LED board, the distance data was wildly fluctuating and making it impossible to interact with it. This fluctuation was a result of the electrical noise caused by many wires connected to the board.


As suggested by MaxBotix, we added two components to our board to filter the noise: a 100 Ω resistor and a 100 µF capacitor.


Adding these components stabilized the distance data significantly and resolved the issue.

Finally, to amplify the brightness of LEDs, we used a transistor for each LED group. Otherwise, all the LEDs were too dim to demonstrate the fade effect relevant to distance changes.

After modifying the idea in response to the presentation feedback, the effect displayed when someone passes the box was another challenge, since it was supposed to happen only once after a distance change was detected by the sensor. The issue was resolved by using a variable to store the time of the sudden change and several conditions over the duration of the fade-in/fade-out effect. However, there seems to be some kind of conflict among them, causing minor flickers during the effect.

Several attempts to drive the effect with a sine function, creating an angle related to the time passed since the sudden change and limiting it between 0 and PI, failed due to the unnatural (and uncontrolled) behaviour of the output.
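A minimal sketch of the millis()-based one-shot approach described above is shown here; the threshold, duration and sensor read are placeholders, so treat it as an illustration of the technique rather than the project’s code.

```cpp
const int ledPin = 9;                        // PWM pin driving one LED group
const long changeThreshold = 300;            // how big a sudden change counts as "passing by"
const unsigned long fadeDuration = 1500;     // ms for the full fade in and out
long lastDistance = 0;
unsigned long triggerTime = 0;

long readDistance() {
  // Placeholder read; the installation used a MaxBotix rangefinder with noise filtering.
  return analogRead(A0);
}

void setup() {
  pinMode(ledPin, OUTPUT);
  lastDistance = readDistance();
}

void loop() {
  long distance = readDistance();
  if (abs(distance - lastDistance) > changeThreshold) {
    triggerTime = millis();                  // remember when the passer-by was detected
  }
  lastDistance = distance;

  unsigned long elapsed = millis() - triggerTime;
  if (triggerTime > 0 && elapsed < fadeDuration) {
    unsigned long half = fadeDuration / 2;   // rise to full brightness, then fall back to zero
    int level = (elapsed < half) ? map(elapsed, 0, half, 0, 255)
                                 : map(elapsed, half, fadeDuration, 255, 0);
    analogWrite(ledPin, level);
  } else {
    analogWrite(ledPin, 0);                  // idle: completely dark
  }
}
```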

Project Context

Harry Le’s 8x8x8 LED Arduino cube project and The DIY Crab’s DIY infinity mirror coffee table, both on YouTube, gave us the inspiration for this project.

Philippe Hamon said, within the context of architecture, “Every building, once completed, concretizes a sort of social and natural proxemics”. This applies to the existence of most objects, including artworks. Interactive artwork, in particular, adds a new element to the relationship between the artwork and the viewers. Our work, “The Red Field”, is meant to grab the attention of passers-by and get them to pay more attention to the objects around them.

People are more likely to interact with objects that react to them. In its idle mode, “The Red Field” mimics a still mirror until the sensor picks up a motion. Once the sensor detects a person passing by (using the change in distance), the wall piece plays a short light sequence: a random blinking effect with a pleasant fall-off, subtly creating a notion of “I just saw you pass by”. At the same time, the quick blinking light sequence draws the attention of the passer-by, creating a sense of curiosity.

Once the piece grabs one viewer’s attention, it will draw other people’s attention as well. One of our goals is to get people to interact with the piece collaboratively, creating a sensual co-existence. People adjust their distances between each other based on their social activities, but sometimes the distances are also used to raise defense mechanisms when others intrude within their spaces (Hall, 1969). The size of the piece requires participants to share a relatively small space, encouraging them to get close into each other’s personal spaces. We encourage people to get close to each other while interacting with our work. However, we are also interested to see how participants who don’t know each other well would behave in close proximity with each other when they are all drawn to the same object.

Through physical interactions with the piece, participants gain aesthetic pleasure and gratification through the lighting patterns their actions trigger. After adapting the piece, we encased the sensor together with the LED panel so it wouldn’t be easily seen. The idea is for participants, driven by their curiosity, to freely experiment with the piece and try to figure out the mechanism behind it. As Costello and Edmonds (2007) put it in their study of play and interactive art, stimulating “playful audience behavior might be a way of achieving a deep level of audience engagement.” We build on this concept in our interactive piece to obtain engagement and entertainment. Participants will eventually adapt to the ways the LEDs behave, and gain a sense of gratification from understanding how it works. This kind of rewarding system keeps them invested in the experience throughout the interaction. Furthermore, with this acquired knowledge, participants could go on to use the piece for more advanced performances, such as making the LEDs react cohesively to the sounds of music.

References

Costello, B. and Edmonds, E. “A study in play, pleasure and interaction design”. ACM New York, 2007.

Le, Harry. “8x8x8 LED CUBE WITH ARDUINO UNO”. YouTube. https://youtu.be/T5Aq7cRc-mU. Accessed October 18th, 2019.

The DIY Crab. “DIY Infinity Mirror Coffee Table”. Youtube. https://youtu.be/OasbgnLOuPI. Accessed on October 18th, 2019.

Hamon, Philippe, “Expositions : Literature and Architecture in Nineteenth-Century France”, trans. Katia Sainson-Frank and Lisa Maguire (Berkeley: U of California P, 19).

Hall, E.T. “The hidden dimension”. Anchor Books New York, 1969

https://dl-acm-org.ocadu.idm.oclc.org/citation.cfm?id=1314168

Experiment 2: Forget Me Not

Exploration in Arduino & Proxemics.
An interactive plant that senses the presence of people nearby and alters its behaviour according to their proximity.

Team
Manisha Laroia, Nadine Valcin & Rajat Kumar

Mentors
Kate Hartman & Nick Puckett


Description
We started ideating about the project on proxemics with the intent of creating an experience of delight or surprise for the people interacting with our artefact from varying proximities. We explored everyday objects, notably those you would find on a desk – books, lamps, plants – and how they could be activated with servos and LED lights, with those activities transformed by proximity data from a distance sensor. We wanted the effect to defy the normal behaviour expected of the object and to denote some form of refusal to engage with the user when the user came too close. In that way it anthropomorphized the objects and gave them a form of agency.

We explored the idea of a book, a plant or a lamp that would move in unexpected ways. The size of the objects and the limitations of the servo in terms of strength and range of motion posed some challenges. We also wanted the object to look realistic enough not to immediately draw attention to itself or look suspicious, which would help build up to the moment of surprise. We finally settled on an artificial plant that, in its idle state, sways at a slow pace creating a sense of its presence, but alters its behaviour whenever people come within its threshold and near proximity.


Inspiration
Don Norman, in his book The Design of Everyday Things, talks about design being concerned with how things work, how they are controlled, and the nature of the interaction between people and technology. When done well, the results are brilliant, pleasurable products. When done badly, the products are unusable, leading to great frustration and irritation. Or they might be usable, but force us to behave the way the product wishes rather than as we wish. He adds that experience is critical, for it determines how fondly people remember their interactions. When we interact with a product, we need to figure out how to work it. This means discovering what it does, how it works, and what operations are possible (Norman).

An essential part of this interaction is the affordance an object portrays and the feedback it returns for an action taken by the user. Altering the expected discoverability, affordances and signifiers makes the experience stranger and more surprising. With the rise of ubiquitous computing, and with more and more products around us turning into smart objects, it is interesting to see how people’s behaviour will change with changed affordances and feedback from everyday objects in their environment, speculating on behaviours and creating discursive experiences. Making an object not behave like it should alters the basic conceptual model of usage and creates an element of surprise in the experience. We felt that if we could alter these affordances and feedback in an everyday object based on proximity, it could add an element of surprise and open a conversation about the anthropomorphizing of objects.

The following art installation projects that all use Arduino boards to control a number of servos provided inspiration for our project:


Surface X by Picaroon, an installation with 35 open umbrellas that close when approached by humans. See details here.


In X Cube, by Amman-based design firm Uraiqat Architects, consists of 4 faces of 3 m x 3 m, each formed by 34 triangular mirrors (individually controlled by their own servos). All mirrors are in constant motion, perpetually changing the reflection users see of themselves. See details here.


Elisa Fabris Valenti’s Don’t Look at Me, I’m Shy is an interactive installation where the felt flowers in a mural turn away from the visitors in the gallery when they come into close proximity. See details here.


Dunne & Raby’s Technological Dreams Series: No.1, Robots, 2007 is a series of objects that are meant to spark a discussion about how we’d like our robots to relate to us: subservient, intimate, dependent, equal? See details here.

The Process
After exploring the various options, we settled on creating a plant that would become agitated as people approached. We also wanted to add another element of surprise by having a butterfly emerge from behind the plant when people came very close. We also had LEDs that would serve as signifiers along with the movement of the plant.

Prototype 1
We started the process by attaching a cardboard piece to the servo motor and taping two wire stems with a plastic flower vertically onto it, to test the motor activity. We wrote the code for the Arduino and used the sensor, motor, and the plant prototype to test the different motions we desired for different threshold distances.


The proxemics theory developed by anthropologist Edward Hall examines how individuals interpret spatial relationships. He defined 4 distances: intimate (0 to 0.5 m), personal (0.5 to 1 m), social (1 to 4 m) and public (4 m or more). The sensors posed some difficulty in terms of getting clean data, especially in the intimate and personal distance ranges. We decided on 3 ranges: combining the intimate and personal ranges into one of less than a meter (< 1000 mm), keeping the social range between 1000 and 3000 mm, and treating anything beyond 3000 mm as the public range.

The plant has an idle state, at more than 4 meters, where it gently sways under yellow LEDs; an activated state where the yellow lights blink and the movement is more noticeable; and an agitated state, at less than a meter, where its motion is rapid and jerky with red lights blinking quickly. Once we had configured the threshold distances for which the motors could give the desired motion, we moved to a refined version of the prototype.
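A hedged sketch of those three states follows. Pin numbers, sway angles and delays are assumptions made for illustration (the project’s actual code is in the GitHub link below), but the distance thresholds follow the ranges described above.

```cpp
#include <Servo.h>

Servo plant;
const int servoPin  = 9;     // assumed pins
const int yellowPin = 5;     // yellow LED group
const int redPin    = 6;     // red LED group

long readDistanceMm() {
  // Placeholder: return the sensor reading in millimetres.
  return pulseIn(A0, HIGH);
}

void sway(int lowAngle, int highAngle, int stepDelay) {
  for (int a = lowAngle; a <= highAngle; a++) { plant.write(a); delay(stepDelay); }
  for (int a = highAngle; a >= lowAngle; a--) { plant.write(a); delay(stepDelay); }
}

void setup() {
  plant.attach(servoPin);
  pinMode(yellowPin, OUTPUT);
  pinMode(redPin, OUTPUT);
}

void loop() {
  long d = readDistanceMm();
  if (d > 3000) {                      // public range: gentle idle sway, yellow on
    digitalWrite(redPin, LOW);
    digitalWrite(yellowPin, HIGH);
    sway(80, 100, 30);
  } else if (d >= 1000) {              // social range: blink yellow, wider sway
    digitalWrite(redPin, LOW);
    digitalWrite(yellowPin, (millis() / 500) % 2);
    sway(60, 120, 15);
  } else {                             // intimate/personal range: jerky motion, red blinking
    digitalWrite(yellowPin, LOW);
    digitalWrite(redPin, (millis() / 150) % 2);
    sway(45, 135, 3);
  }
}
```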

Prototype 2
We made a wooden box using the digital fabrication lab and purchased the elements to make the plant foliage and flowers. The plant elements were created using wire stems attached to a wooden base secured to the servos. The plant was built using a felt placemat (bought from Dollarama) cut into the desired leaf-like shapes and attached to the wire stems. Once we confined the setup to a wooden box, like a pot holding a plant, a new challenge arose in terms of space constraints. Each time the plant moved, the artificial foliage would hit the side walls of the box, interrupting the free motion of the motor. We had to continuously trim the plant and ensure the weight was concentrated in the centre to maintain a constant torque.


The butterfly that we had wanted to integrate was attached to a different servo with a wire, but we never managed to get the desired effect as we wanted the rigging of the insect to be invisible for its appearance to elicit surprise. We therefore abandoned that idea but would like to revisit it given more time.

Image: Butterfly setup

At this stage our circuit prototyping board, the sensors and the LEDs were not fully integrated into a single setup. The next step was to combine all of this into one assembly, discreetly hiding the electronics and having a single cord that powered the setup.


Image: LED setup

Final Setup
The final setup was designed such that the small plant box was placed within a larger plant box that housed all the wires, the circuits and the sensors. As we were required to use individual LEDs, the connected LEDs could not fit in the plant box (they would hamper the motion of the plant), so they were integrated into the larger outer box, with artificial foliage hiding the circuits.


Context-aware computing relates to this: some kind of context-aware sensing method [1] provides devices with knowledge about the situation around them, lets them infer where they are in terms of social action, and lets them act accordingly. Proxemic theories describe many different factors and variables that people use to perceive and adjust their spatial relationships with others, and the same could be used to iterate on people’s relations to devices.


 

Revealing interaction possibilities: We achieved this by giving the plant a slow sway in its idle state. If a person entered the sensing proximity of the plant, the yellow LEDs would light up as if inviting the person.

Reacting to the presence and approach of people: As the person entered the Threshold 1 circle of proximity the yellow LEDs would blink and the plant would start rotating as if scanning its context to detect the individual who entered in its proximity radius.

From awareness to interaction: As the person continues to walk closer, curious to touch or see the plant up close, the movement of the plant gets faster. Eventually, if the person enters the Threshold 2 distance, the red LEDs light up and the plant moves violently, indicating a reluctance to close interactions.

Spatial visualizations of ubicomp environments: Three threshold distances were defined in the code to offer discrete distance zones for different interactions, similar to how people create boundaries around themselves through their body language and behaviour.



Challenges & Learnings

  • Tuning the sensor data was a key aspect of the project, since we used it to define the proximity circles. In order to get more stable values, we would let the sensor run for some time, ensuring no obstacle was in its field, until we received stable values, and only then connect the motor to it; otherwise the motor would take the erratic values and produce random motions different from the ones programmed.
  • Another challenge was discovering the most suitable sensor positions and placement of the setup in the room with respect to the audience that would see and interact with it. It required us to keep testing in different contexts and with varying number of people in proximity.
  • Apart from the challenges with the sensors, we encountered other software and hardware interfacing issues. The programming of the red and yellow LEDs (4 of each colour) presented a challenge in terms of changing from one set to the other. They were initially programmed using arrays, but getting the yellow lights to shut off once the red lights were triggered proved to be difficult, and the lights had to be programmed individually in order to get the effect we desired. In a second phase, we simplified things by soldering all the lights of the same colour in parallel and running them from one pin on the Arduino.
  • The different levels of motion of the plant were achieved by a long process of trial and error. The agitated state provided an extra challenge in terms of vibrations. The rapid movements of the plant produced vibrations that would impact the box that contains it while also dislodging the lights attached to the container holding the plant.

Github Link for the Project code

Arduino Circuits
We used two Arduinos, one to control the servo motor with the plant and the other to control the LEDs.

Image: Motor circuit

Image: LED circuit

References
[1] Marquardt, N. and S. Greenberg, Informing the Design of Proxemic Interactions. Research Report 2011100618, University of Calgary, Department of Computer Science, 2011
[2] Norman, Don. The Design of Everyday Things. New York: Basic Books, 2013. Print.

Experiment 2: Proxemics Study/Interactive Infinity Mirror

Interactive Infinity Mirror

An interactive & LED-Light project that explored ‘Proxemics’
By Arsalan Akhtar, Jevonne Peters, Jun Li

Proxemics – the study of human use of space and the effects that population density has on behaviour, communication, and social interaction.

Abstract

This experiment is an interactive LED-light project exploring and critiquing the concept of proxemics – the study of human use of space and the effects that population density has on behaviour, communication, and social interaction. In our interpretation, we attempt to show a visual representation of various reactions to the “personal space” that humans create around them, in the form of various interactions of light, and to represent the idea of an ideal level of social interaction amongst multiple parties.

The purpose of this study is to visually demonstrate the effect people can have on each other through the use of different colour effects. The goal is to deconstruct the relationship between behaviours and colour, and reshape this relationship to be presented in a new form. Throughout this process, the parties control the effects based on the states they are in.

Keywords: Colour, Behaviour, Communications, Visualization, Proxemics, Interaction

Repo: https://github.com/jevi-me/CC19-EXP-2


Table of Contents

1.0 Requirements
2.0 Planning & Context
3.0 Implementation
3.1 Hardware
3.2 Software 
3.2.1 Arduino-Only Implementation
3.2.2 TouchDesigner + Arduino Implementation
4.0 Reflections
5.0 Photos
6.0 References


1.0 Requirements

The goal of this experiment is to creatively use a microcontroller, up to 3 rangefinders, and actuators: any number of individual LEDs, or up to 4 servos, to create a minimum of 3 distinct behaviours in response to the environmental conditions.

2.0 Planning & Context

Sketches and Brainstorming from our planning phase

 

In the planning phase, concepts of anxiety, one’s “personal bubble/space”, and ideal desired interactions were examined. We determined that the most effective way to illustrate these changes would be with light, and incorporated the interaction of the primary colours of light (R, B, G) to illustrate this. We used the three distance sensors to capture the location of three participants within the space of the installation.

We completed planning the final design and concept, and began purchasing materials locally and abroad on October 4th.

2.1 Distances

Four ranges were plotted for each of the three sensors that read in the distances. This is reflective of the personal preferences of the three parties:

  • Too close: this is a distance considered to be too close for comfort. This can vary from person to person, but for our study, we fixed this distance.
  • Comfort zone: the general region of comfort, where one isn’t out of touch, and not too close.
  • Out of touch: so far away that interaction is not possible.
  • Ideal: the preferred level or region of interaction.

 

Diagram of Our Distance Ranges for ‘Green’, ‘Red’ and ‘Blue’.

In our code, these were called states:

Out of touch, also called idle -> 0
Comfort zone -> 1
Ideal Zone -> 2
Too close -> 3
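As a small illustration (the boundary distances here are placeholders, not the installation’s values), the mapping from a rangefinder reading to these state numbers can be expressed as:

```cpp
// Hypothetical distance boundaries in centimetres.
int stateFor(long distanceCm) {
  if (distanceCm > 150) return 0;   // Out of touch / idle
  if (distanceCm > 100) return 1;   // Comfort zone
  if (distanceCm > 50)  return 2;   // Ideal zone
  return 3;                         // Too close
}
```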

2.2 Interactions

Based on the state of the system, different behaviours were manifested. For example, if the red and blue parties are in their respective comfort zones, the two zones mesh to form their additive colour, which is magenta. The breakdown below shows the various interactions that were planned for both implementations.

Interactions by state:

  • Out of Touch/Idle (0) – All parties: colour wipe. Some parties: off (black static). One party: off (black static).
  • Comfort (1) – All parties: sparkle glow. Some parties: blink in the secondary colour blend. One party: blackout chase of the respective colour.
  • Ideal Zone (2) – All parties: rainbow. Some parties: rainbow. One party: rainbow.
  • Too Close (3) – All parties: blink white (if the parties remain in the zone, blink orange; if they stay longer, rapidly blink red*). Some parties: blink in the respective colour. One party: blink in the respective colour.

* The time-based interaction was added from a suggestion from Nick.

The colour wipe when all parties are idle represents a state of receptiveness from everyone. No boundaries are being pushed, but no positive interactions are being made either. The colours cycle through without interacting with each other, and forgetting the previous state. If one or more sides are idle, and the other side(s) are in different state(s), the side goes ‘off’. The ‘off’ state indicates that it is not currently participating in the interaction that is happening. It is ‘Out of Touch’.

Once a side enters the comfort zone, that side performs a ‘blackout chase’ of the colour it represents, i.e. the red side will have a red ‘blackout chase’ effect. When two sides enter the comfort zone, their secondary colour is shown. This represents the potential for ideal communication between the parties. Three in the comfort zone results in a Sparkle Glow, a combination of red, green and blue.

The rainbow, a common symbol of happiness and bridging, is used when one or more sides are in the ideal state. When within the installation, the desire is to remain in that state, and hope others can also find their ideal. Once everyone has it, the rainbow effect runs in sync, simulating an ideal flow of information and ideas.

There is, of course, the potential for one to feel overwhelmed by a presence and be ‘too close’ for comfort. When this happens, the corresponding side(s) flashes its colour repeatedly as a warning. If all sides are experiencing this, the additive colour (white) flashes for all sides. If this warning is ignored, the colour changes from white to orange, then to ‘danger’ red, speeding up as the warning continues to be ignored.

3.0 Implementation

3.1 Hardware

3.1.1 LEDs

In this project, we were required to use individual LEDs. After considering the desired final effect of the project, we quickly abandoned the use of domed single colour LEDs, and decided to use WS2812 Neopixel LEDs. These have a full range of colours and can be individually addressed. This decision came with the challenge of soldering all the individual LEDs. Nonetheless, we were at an advantage as Jun Li had previous experience soldering, and was determined to take on the ambitious task.

As a first step, we measured and performed calculations on the box, and deduced that 23 LEDs were to be placed on each side, for a total of 92 LEDs. Next was to create the LED strips to be placed inside the box. This involved first adding solder to the 6 connections on each LED, cutting the wires to the measured lengths, and soldering each of the connections. A total of 1104 soldered connections were made by our team, and this figure does not include the several failures that occurred during the process. It was very important to ensure that each connection was sound and functioned impeccably. This was definitely a tough challenge for the team, and we relied heavily on the expertise of Jun Li, who guided us and was aptly nicknamed ‘Solder King’. The entire soldering process took approximately 30 hours over the course of a few days.

3.1.2 Body

The body of the installation was a shadow box spray painted black. A hole was cut on the side to feed the wires from the LEDs to the Arduino. A mirror was placed at the base, and the LEDs around the inside of the frame. To cover the frame, we cut an acrylic sheet to size, and added a layer of reflective film to it. Arsalan and Jun Li used their knowledge of fabrication and workshopping to make precise cuts and measurements for the holes and the acrylic. Adding the film was a group effort as a smooth and reflective surface was key to creating the desired effect.

Finally, the three rangefinders were hidden under the lip of the shadow box, and the hidden wires fed to the breadboard at the back of the installation. The front and back of the installation were controlled by the single rangefinder located at the front, and the sides were controlled by their attached rangefinders.


3.2 Software

3.2.1 Arduino-Only Implementation

The Arduino-only implementation was built on the Adafruit_NeoPixel and WS2812FX libraries. Several other libraries were tested, including FastLED, NeoPatterns, and NeoPixel Painter, but ultimately WS2812FX was selected for its simplicity and wide range of built-in effects. The loop of the code ran the service function of the WS2812FX library and then took the distance readings from the rangefinders. If the difference between the current and the last measured distance was above the noise threshold, the new state of the section(s) that rangefinder controlled was determined, and the function to display that light effect was activated.
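A simplified, hedged sketch of that loop for a single rangefinder and segment is shown below. The WS2812FX calls (init, setMode, service, start) are the library’s real API, but the pin numbers, noise threshold, distance boundaries and the state-to-effect mapping are illustrative assumptions, not the installation’s code.

```cpp
#include <WS2812FX.h>

#define LED_COUNT 92
#define LED_PIN   6                           // assumed data pin

WS2812FX ws2812fx = WS2812FX(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

long lastDistance = 0;
const long noiseThreshold = 5;                // ignore jitter smaller than this

void setup() {
  ws2812fx.init();
  ws2812fx.setBrightness(100);
  ws2812fx.setSpeed(1000);
  ws2812fx.setMode(FX_MODE_COLOR_WIPE);       // state 0: idle
  ws2812fx.start();
}

void loop() {
  ws2812fx.service();                          // keep the current effect animating
  long distance = pulseIn(A0, HIGH);           // placeholder rangefinder read
  if (abs(distance - lastDistance) > noiseThreshold) {
    lastDistance = distance;
    // Illustrative state-to-effect mapping (one segment only).
    if (distance > 1500)      ws2812fx.setMode(FX_MODE_COLOR_WIPE);     // idle
    else if (distance > 1000) ws2812fx.setMode(FX_MODE_STATIC);         // comfort
    else if (distance > 500)  ws2812fx.setMode(FX_MODE_RAINBOW_CYCLE);  // ideal
    else                      ws2812fx.setMode(FX_MODE_BLINK);          // too close
  }
}
```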

3.2.2 TouchDesigner Implementation

The effects were replicated using a TouchDesigner + Arduino implementation. In this setup, Arduino was used as a communication tool and bridge between TouchDesigner and the WS2812 LEDs. All the interactive effects and settings were performed in TouchDesigner and sent to the LED strip through the Arduino in real time.

4.0 Reflections

Implementation Explorations and Outcomes

The light effects in the TouchDesigner and Arduino implementation proved easier to create, as TouchDesigner is node-based, artist-friendly software. The transitions were smoother and more visually appealing. However, this came with several drawbacks. The Arduino Nano and Uno were both not powerful enough to provide stable performance, as the two-way real-time communication required the bridge (the Arduino) to send and receive a large amount of data per second.

There were two possible solutions to this problem: (1) increasing the processing power by using an Arduino Mega, and (2) lowering the frequency of data transfer which would affect the real-time interaction and cause a delay. These two solutions were combined, and the parameters adjusted optimally to give the best performance, and mimic the effect in the Arduino-Only implementation.

The Arduino-only implementation had similar issues with both processing and electrical power. To solve this, the brightness was lowered and the rangefinder values were read only every 5 seconds, as the sensor reads were found to be the bottleneck in the code. This unfortunately made the installation less real-time, but in return it became more reliable and performant.
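
A self-contained sketch of that 5-second throttle is shown below, using the standard non-blocking millis() pattern; the sensor pin and interval are the assumed values, and the light-effect update that would follow each read is elided in favour of a debug print.

// Read the rangefinder only once every READ_INTERVAL milliseconds,
// while the rest of the loop keeps running without blocking.
const unsigned long READ_INTERVAL = 5000;   // ms between rangefinder reads
unsigned long lastReadTime = 0;
int lastDistance = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  // ...light-effect servicing would run here on every pass...

  if (millis() - lastReadTime >= READ_INTERVAL) {
    lastReadTime = millis();
    lastDistance = analogRead(A0);          // slow read, now only every 5 s
    Serial.println(lastDistance);           // stand-in for the effect update
  }
}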

The only major obstacles during the hardware build were the time required to complete it and the skill required to ensure that the connections were sound. Although buying an LED strip would have achieved the same effect and removed both obstacles, it would have violated the constraints of this experiment.

Concluding Thoughts

This artistic experiment was meant to allow participants to critically consider the idea of personal space in the context of proxemic behaviour. In a related study, “Proxemic Behavior: A Study of Extrusion”, the cultural group and sex of the subjects were held constant, and the interviewer moved in from the comfortable distance originally established by the subject. In all cases within that study, the subject re-established a new comfortable distance, which in our study we called the “ideal” zone. The study surmised that this new state of comfort was a compromise between the distance originally chosen and the distance assumed by the interviewer. In our case, a retreat from the ideal resulted in a warning signal, and no new state of comfort was sought.

Technology-based new media art is one of the forerunners of future art, allowing the creation of collaborative and interactive artworks. This experiment serves as one such example.

5.0 Photos

6.0 References

  1. https://www.tandfonline.com/doi/abs/10.1080/00224545.1991.9924653
  2. https://www.youtube.com/watch?v=sAPGw0SD1DE
  3. https://www.youtube.com/watch?v=b2bvWArORSc

Experiment 2 – Pet Me (If You Can)

Project Title
Pet Me (If You Can)

Team Members
Jignesh Gharat, Neo Chen, and Lilian Leung

Project Description
Our project explores creating a creature character that can surprise viewers through interaction, using two distance sensors. The experiment is an example of the living effect: giving a machine a life of its own and using different modes of operation to express distinct emotions in the creature.

The creature was built with two Arduinos, three servos, a row of LEDs, and two distance sensors. It sits on a pedestal and moves of its own accord, surprising viewers who come near by closing its mouth while its eye movement becomes erratic.

Project Video

You can access the code for the experiment here
https://github.com/lilian-leung/experiment2

Project Context

Our goal was to create a creature using servos and sensors. We explored the ongoing question, “Why do we want our machines to appear alive?”, raised by Simon Penny, a new media artist and theorist. Caroline Seck Langill, in The Living Effect: Autonomous Behaviour in Early Electronic Media Art (2013), argues that we create lifelike characteristics to elicit a response from the audience suggestive of a fellow life-form and so achieve a living effect; we do not attempt to re-create life, but rather to “make things have a life of their own.”

Our original intention was to create a Halloween-themed creature, or a security-like box that would guard a valuable item such as jewellery, or something for everyday use such as guarding cookies in a cookie-jar-like shape.

Langill (2013) proposed three characteristics of the living effect: first, an adherence to behaviour rather than resemblance; second, the effect of a whole body in space with abilities and attributes; and third, the potential for flaws, accidents and technical instabilities, as imperfections allow one to acknowledge the living effect within a synthetic organism.

We began prototyping the eyes using the oscillation of two servos, placing Post-it notes over them as pupils so we could tune the movement of the pupils to a natural speed, with easing from Nick's animationTools Arduino library.
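
Since the animationTools API isn't reproduced here, the sketch below shows the same idea, easing a servo “pupil” between two angles, using only the standard Servo library and a cosine ease-in-out curve; the pin, angles and sweep duration are illustrative assumptions.

// Sweep a servo back and forth with an eased (slow-in, slow-out) motion.
#include <Servo.h>

Servo eye;
const int START_ANGLE = 20;
const int END_ANGLE   = 160;
const unsigned long SWEEP_MS = 2000;   // one full sweep takes 2 seconds

void setup() {
  eye.attach(9);                       // assumed servo pin
}

void loop() {
  // 0..2 phase through the back-and-forth cycle, folded into 0..1..0.
  float phase = (millis() % (2 * SWEEP_MS)) / (float)SWEEP_MS;
  float t = phase < 1.0 ? phase : 2.0 - phase;
  float eased = 0.5 - 0.5 * cos(t * PI);          // cosine ease in and out
  eye.write(START_ANGLE + eased * (END_ANGLE - START_ANGLE));
}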

In Robotics facial expression of anger in collaborative human–robot interaction (2019), Reyes, Meza and Pineda describe how expressive robotic systems favour feedback and engagement with viewers; emotions such as anger created the most effective response with participants. Using the minimal set of facial expressions possible with the components available, we tried to replicate a human-like expression as an indicator of the creature's possible modes of operation.

From there we incorporated the main body (the box) of the creature and began exploring ways we could have the box open. Our initial thought was to have some sort of lever outside and above the box that would pull the lid open with thread or fishing wire.


We also explored placing the servo on the side of the box, but were concerned that the motor wouldn't be able to handle pushing the full width of the lid open from one side. In the end, we landed on having the servo inside the box, in the back centre area, where it could push the lid open with the assistance of a curved arm reaching the lid. We then tested to find the right angle and range for the servo, so as not to push the servo out of its spot inside the box or open the box too wide.

Servo Testing Gif

Before laser cutting all our final shapes, we tested each component separately on the breadboard to make sure the circuit was functioning before soldering each piece. From there we built out the new facial features, using the opening box and laser cutting a tongue-like shape, which we lit up with red LEDs. We laser cut the pupil and iris to attach to the servos, and made a small enclosure to hide the actuators. All the cables were then looped inside the box and tucked into the back to keep them tidy when the creature opens its mouth.


The creature has three modes of operation:

  1. Within the two-meter “safety zone” away from viewers, the eyes oscillate slowly from 0 to 180 degrees at a speed of 0.1. In this range the servo controlling the mouth props it open, as the creature deems the area “safe”, and the LEDs within the tongue piece are lit.
  2. In the middle zone, the creature becomes “conscious” of viewers and the speed increases to 0.2; the faster eye movement signifies hesitation or caution.
  3. When viewers come into the “danger zone”, within approximately one meter of the object, the speed of the eyes increases to 0.8 and the mouth shuts (a minimal sketch of this zone logic follows below).
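
A minimal sketch of this zone logic for the Arduino driving the mouth servo and tongue LEDs is shown below. The sensor type (an analog MaxBotix-style rangefinder on A0), pins, servo angles and zone thresholds are illustrative assumptions, and the eye speeds appear only as comments since the eyes run on the second Arduino.

// Map the measured distance to one of the three modes of operation.
#include <Servo.h>

const int SENSOR_PIN   = A0;
const int LED_PIN      = 7;     // red tongue LEDs
const int MOUTH_OPEN   = 80;    // servo angle holding the lid open
const int MOUTH_CLOSED = 10;    // servo angle letting the lid drop

Servo mouth;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  mouth.attach(9);
}

void loop() {
  // MaxSonar-style analog output: roughly one inch per two ADC counts on a 5 V board.
  float distanceCm = (analogRead(SENSOR_PIN) / 2.0) * 2.54;

  if (distanceCm > 200) {            // safety zone: relaxed, mouth open, tongue lit
    mouth.write(MOUTH_OPEN);         // (eye speed 0.1 on the other Arduino)
    digitalWrite(LED_PIN, HIGH);
  } else if (distanceCm > 100) {     // middle zone: "conscious" of the viewer
    mouth.write(MOUTH_OPEN);         // (eye speed 0.2)
    digitalWrite(LED_PIN, HIGH);
  } else {                           // danger zone: mouth shuts, eyes erratic (0.8)
    mouth.write(MOUTH_CLOSED);
    digitalWrite(LED_PIN, LOW);      // tongue hidden, LEDs off (assumption)
  }

  delay(100);                        // simple pacing between readings
}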

To avoid overloading one of the Arduinos and to keep the electrical circuit consistent, we split the circuit so that the sensor controlling the two eye servos ran separately from the sensor controlling the LEDs and the servo that opens the mouth.

One of our challenges was the noise generated by the sensors, which caused the modes of operation to fluctuate, opening the mouth and then immediately dropping it even when viewers were a safe distance beyond the threshold. We adjusted the settings to shorten the middle range, so that the sensor noise read more like a laughing motion by the creature.
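
One way to tame that kind of jitter, sketched below as an assumption rather than the project's actual fix, is to smooth the readings with a small moving average and add a hysteresis band around the open/close threshold, so a single noisy reading can't slam the mouth shut. Window size, pins and thresholds are illustrative.

// Moving average plus hysteresis on the distance reading.
const int SENSOR_PIN = A0;
const int WINDOW = 5;
int readings[WINDOW];
int idx = 0;

const int CLOSE_BELOW_CM = 90;    // close the mouth only when clearly inside
const int OPEN_ABOVE_CM  = 110;   // reopen only when clearly outside again
bool mouthOpen = true;

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < WINDOW; i++) readings[i] = 300;  // start "far away"
}

void loop() {
  // Rolling average of the last WINDOW readings, converted to cm as in the sketch above.
  readings[idx] = (analogRead(SENSOR_PIN) / 2.0) * 2.54;
  idx = (idx + 1) % WINDOW;
  long sum = 0;
  for (int i = 0; i < WINDOW; i++) sum += readings[i];
  int avgCm = sum / WINDOW;

  // Hysteresis: the open/close decision uses two thresholds, not one.
  if (mouthOpen && avgCm < CLOSE_BELOW_CM)      mouthOpen = false;
  else if (!mouthOpen && avgCm > OPEN_ABOVE_CM) mouthOpen = true;

  Serial.println(mouthOpen ? "open" : "closed");   // stand-in for the servo command
  delay(50);
}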

After our presentation, we received feedback on how we could better incorporate the sensors into the experiment, so that the piece could become more mobile and be easily placed in different situations.


Our solution was to mount the creature on a pedestal with the sensors hidden below the surface. The creature stands upright as if it were an exhibition piece. It takes on a personality of its own as its eyes oscillate, as though patrolling the surrounding area, and it closes its mouth and looks downward in a more humble composure when viewers approach.


Sources

Arduino. (2017, January 13). How to Use an Ultrasonic Sensor with Arduino [With Code Examples]. Retrieved from https://www.maxbotix.com/Arduino-Ultrasonic-Sensors-085/.

Circuit Digest. (2018, April 25). Controlling Multiple Servo Motors with Arduino. Retrieved from https://www.youtube.com/watch?time_continue=9&v=AEWS33uEwzA

Langill, Caroline. (2013). “The Living Effect: Autonomous Behavior in Early Electronic Media Art.” Relive: Media Art Histories. Cambridge, MA: MIT Press. pp. 257–274.

Programming Electronic Academy. (2019, July 2). Arduino Sketch with millis() instead of delay(). Retrieved from https://programmingelectronics.com/arduino-sketch-with-millis-instead-of-delay/.

Puckett, N. (n.d.). npuckett/arduinoAnimation. Retrieved from https://github.com/npuckett/arduinoAnimation.

Reyes, M. E., Meza, I. V., & Pineda, L. A. (2019). Robotics facial expression of anger in collaborative human–robot interaction. International Journal of Advanced Robotic Systems, 16(1), 172988141881797. doi: 10.1177/1729881418817972

Experiment 2: Pro Yoga!

Names: Liam Clarke, Rittika Basu & Katlin Walsh

Project Description: 

Project “Pro Yoga” blends fun and fitness for everyone. It is an installation for exercising in which both the body and the mind are engaged for physical, mental and spiritual upliftment through the art of asanas. This interactive training set-up can be used in any fitness center, yoga studio or home. The goal of the activity is to activate the tri-colored sets of LEDs, placed at various heights, by stretching one’s limbs towards the ultrasonic rangefinders through different yoga positions. In the prototype the goals are set by verbal instructions from a yoga instructor, while the next generation would use a virtual instructor. An activated LED acts as an indicator that the participant is in the correct position. For example, the instructor might ask the participant to light up the blue LED (placed in front at a height of 5’2”), the red LED (placed on the right side at ground level) and the green LED (placed behind at a height of 5’2”) together. The user’s objective is to stretch out their arms and legs to activate the specified LEDs and hold that pose steady for a set amount of time before the next pose.

Visuals: 

Work-in-Progress Images:

Setting the ultrasonic rangefinders at different heights and orientations to test the LEDs’ blinking functions at various distances.


Final Images:


Interaction Video:

Project Context:

Ideation:


Our brainstorming started with sketching possible concepts while exploring various functions of servos and LEDs. We also looked at Arduino-based projects shared on online platforms such as Pinterest, Design Boom Magazine, Arduino Project Hub and YouTube.

Several ideas involved creating a space for performing arts and divergent forms of physical activity. The first was a dance contest in which a ‘Thumbs Signal’ sticker could be attached to the servo. Essentially a multiplanar Dance Dance Revolution, its LEDs are triggered by the distance sensor. The servo attachment rotates to ‘Thumbs-Up’ (180°, indicating a good grade), ‘Thumbs-Down’ (0°, indicating a bad grade), or, if the performance goes undetected, ‘Thumbs in the Middle’ (90°, indicating an average grade).

The multiplanar DDR led to a ‘Body Twister Game’ in which the LEDs would blink according to the movements of contestants. We planned to create groups of 2–3 and follow the rules of the original ‘Twister’ game, with contestants getting tangled as they tried to activate the LEDs. In this version of the game, sensors would be placed at three points around the user to create a common area where the sensor ranges converged. Players would need to be mindful of where all their body parts were in order to trigger and maintain the colors for a specified period of time.

After several trials, and deciding that the earlier ideas were lacking without a more developed interface, we reworked our project into a ‘Health and Fitness System’. We decided on an interactive installation that would be enjoyable and healthy: a gamified home yoga center. In this version, a user becomes more mindful of the space they are occupying, allowing them to return to a meditative state while engaging in their practice. Through the use of changing colors, spiritual meditation and mindfulness of one’s body can be encouraged through a guided practice without interrupting the user’s thoughts.

Images: Sketches of the ‘Body Twister’ and ‘Thumbs Signal’ concepts

Creating and Prototyping:

We further refined our brainchild into ‘yoga-training equipment’ in which three LV-MaxSonar-EZ0 ultrasonic rangefinders would be placed in a triadic arrangement. Each section is accompanied by an LED of a different color (red, blue or green). Each section is placed at a different height and orientation, and the code is calibrated differently depending on the target limb used with that particular sensor.

The idea is to activate the blinking functions through yoga postures. While calibrating the code, we had to keep physiology and the body motions of yoga in mind as they relate to the distance sensors.

Initially, we tested the prototype by using one LED and observing how it blinks with input data from the distance sensor. Adding on more sensor/LED setups, we experimented with orientations, lengths and heights of the three sensors.

The code is relatively simple: it activates specific LEDs based on input data from the distance sensor. The only difficulty was finding the best-suited input range so that holding the pose was challenging but not impossible.
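
A minimal sketch of that logic for one sensor/LED pair is shown below, using the red LED’s example window from the walkthrough further down (700–1000 mm); the LV-MaxSonar-EZ0 analog scaling, pin numbers and blink interval are illustrative assumptions.

// Blink the red LED only while the limb is held inside the target distance window.
const int SENSOR_PIN   = A0;
const int RED_LED      = 8;
const int RANGE_MIN_MM = 700;
const int RANGE_MAX_MM = 1000;

void setup() {
  pinMode(RED_LED, OUTPUT);
}

void loop() {
  // LV-MaxSonar analog output: about one inch per two ADC counts on a 5 V board.
  float inches = analogRead(SENSOR_PIN) / 2.0;
  float mm = inches * 25.4;

  if (mm >= RANGE_MIN_MM && mm <= RANGE_MAX_MM) {
    // Limb detected in the target window: blink as positive reinforcement.
    digitalWrite(RED_LED, millis() % 500 < 250 ? HIGH : LOW);
  } else {
    digitalWrite(RED_LED, LOW);
  }
}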


Moving beyond the context of this project, the use of RGB LEDs, battery packs, and additional sensors would be investigated to create a product that is more compact, and able to be placed in a room without wires being shown. This would only require small changes to the code framework, in addition to some hardware upgrades to swap LED types.

CODE

https://github.com/lclrke/Pro-Yoga

Data Collection: Researching on Yoga and Asanas:

Images: Sketches and reference charts of yoga poses

While selecting the yoga postures for our installation, we had to consider various factors. We shortlisted simple postures that could be carried out by beginners, elderly individuals and even kids. They are enjoyable, healthy and integrate easily with our Arduino installation. They are listed below:

  1. Mountain Pose/Tadasana
  2. Warrior 1 & 2 Pose/Virabhadrasana (referred to in our experiment)
  3. Triangle Forward Pose/Trikonasana
  4. Raising Hands in Lotus Pose/Padmasana

Execution: An example of an experiment scenario

Step-by-step instructions for guiding the participant from Tadasana (Mountain Pose) into the Warrior Pose

  1. Welcome to the Pro Yoga experience.
  2. Kindly, step-up on the yoga mat.
  3. We will start with the Mountain Pose/Tadasana, a standing asana that forms the foundation of other standing yoga poses.
  4. Keep breathing in and out slowly throughout the whole process.
  5. Bend your left leg forward and try to turn the red LED on (its blinking range is roughly 700 mm – 1000 mm, with the sensor placed at ground level). Here the participant is expected to bend her leg forward, putting weight on the knee, until the specified LED blinks.
  6. Stretch out your left hand, keeping the arm straight at shoulder level (joints at 180 degrees), and try to turn the green LED on (its blinking range is roughly 300 mm – 500 mm, with the sensor placed at a height of 5’2”). Here the participant is expected to stretch her left hand towards the sensor until the specified LED blinks.
  7. Now extend your right hand straight in the opposite direction, keeping the arm straight at shoulder level (joints at 180 degrees), and try to turn the blue LED on (its blinking range is roughly 600 mm – 800 mm, with the sensor placed at a height of 5’2”). Here the participant is expected to stretch her right hand towards the sensor until the specified LED blinks.
  8. Finally, the participant is expected to hold the performed yoga posture (here, the Warrior Pose) for 10–30 seconds and then relax before executing the next asana.

References:

This project features YogAI, a yoga instructor that focuses on guiding the user through the workout. It extracts anatomical key points to detect the posture configuration and gives continuous feedback on posture correction. Our project derives from this idea in the sense of having a user guide throughout the training session, but Pro Yoga focuses more on easy and universal poses, with a special focus on stretching the limbs and positive reinforcement (the blinking LEDs).

1.“YogAI: Smart Personal Trainer”. Arduino Project Hub, 2019 https://create.arduino.cc/projecthub/yogai/yogai-smart-personal-trainer-f53744.

2. “Change Your Meditation With Colors Spirituality”. Yogi Times, 2015 https://www.yogitimes.com/article/meditation-colors-spirituality

3. “New Energy Geographies: A Case Study of Yoga, Meditation, and Healthfulness”. Journal of Medical Humanities, 2015

https://link.springer.com/article/10.1007/s10912-014-9315-3

4. “E-traces creates visual sensations from ballerinas” Arduino Blog, 2014

https://blog.arduino.cc/2014/11/05/e-traces-creates-visual-sensations-from-ballerinas/

5. This project involves an LED that changes color according to the movement of one’s hand in four directions. We referred to its set-up and code for our project development.

“Motion Controlled Color Changer!”. Arduino Project Hub, 2016 https://create.arduino.cc/projecthub/gatoninja236/motion-controlled-color-changer-299217?ref=tag&ref_id=motion&offset=0

6. We referred to various images of yoga and asana positions for studying, comprehending and shortlisting postures that could be used in our project. We went for simple poses that could be carried out by beginners, elderly individuals and even kids.

6.1 RelaxingRecord.com. Top Ten Yoga Positions For Beginners. 2019 http://www.relaxingrecords.com/2015/11/17/top-ten-yoga-positions-for-beginners/.

6.2  Fitwirr. Fitwirr 24 Yoga Poses For Beginners – Yoga Kids (Laminated Poster) – Kids Yoga Poses – Yoga Children – Yoga For Kids -Yoga Wall Arts – Yoga Poster. 2019 https://www.amazon.com/Fitwirr-Yoga-Poses-Beginners-Laminated/dp/B07C1SQK6L.

7. Puckett, Nicholas. October 4 – Videos I Creation & Computation 001. 2019 https://canvas.ocadu.ca/courses/30331/pages/october-4-videos.

8. Puckett, Nicholas. October 8 – Videos I Creation & Computation 001. 2019 https://canvas.ocadu.ca/courses/30331/pages/october-8-videos