
Hexa

EXPERIMENT 3: HEXA

Find the pattern to reveal the image

By Adit Verma and Arjun Dutt

 


Project Description 

Our original concept was to create a piano/keyboard where pressing a key would trigger some interactivity in the corresponding digital media. However, after a round of feedback, we realized that the concept was too simple and too linear. We needed to push the boundaries a bit and evolve beyond the piano idea, so we went back to the drawing board and brainstormed a couple of new ideas.

We eventually settled on making an interactive pattern-finding game. For this, we created a board game with multiple hexagons drawn on its face. Under the surface of the board, we connected 7 touch sensors to 7 of the 31 drawn hexagons. As a player, you are given 10 hexagonal metal nuts with which to create a pattern on the board. Once you press the start button at the front, a timer starts and you have one minute to find the pre-set 7-nut pattern. When you place a nut on a hexagon that’s hooked up to a touch sensor, a pixelated image pops up on the screen. Each following nut, if placed correctly in the pattern, triggers a blur function that makes the image less pixelated than it was previously. If you have placed all 7 nuts properly, the image that started out completely pixelated is revealed in full resolution.
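
To sketch the de-pixelation logic (the actual Arduino and Processing code is on GitHub, linked below), here is a rough, hypothetical p5.js-style example: the image is drawn on a coarse grid whose resolution grows with each correctly placed nut. The file name, the grid-step values, and the key press standing in for a serial message from the touch sensors are all placeholders, not the project’s real code.

let img;
let correctNuts = 0; // in the real build this count comes from the Arduino over serial

function preload() {
  img = loadImage('reveal.jpg'); // placeholder file name
}

function setup() {
  createCanvas(400, 400);
  noStroke();
  noLoop(); // only re-render when the nut count changes
}

function draw() {
  if (correctNuts >= 7) {
    image(img, 0, 0, width, height); // all 7 nuts placed: full resolution
    return;
  }
  // Map 0..6 correct nuts to an increasingly fine grid.
  const steps = [8, 12, 18, 26, 40, 60, 100];
  const res = steps[correctNuts];
  const cellW = width / res;
  const cellH = height / res;
  for (let y = 0; y < res; y++) {
    for (let x = 0; x < res; x++) {
      // Sample one pixel per cell and paint the whole cell with it.
      const c = img.get(floor(x * img.width / res), floor(y * img.height / res));
      fill(c);
      rect(x * cellW, y * cellH, cellW, cellH);
    }
  }
}

function keyPressed() {
  // Stand-in for a "nut correctly placed" message from the board.
  correctNuts = min(correctNuts + 1, 7);
  redraw();
}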

 

HINT FOR THE PATTERN: All metal nuts placed on the board must be separated by one empty hexagon.

 

Link to Experience Video:

https://vimeo.com/646470038

 

 Link to How It Works Video:

https://vimeo.com/646470177

 

 

Final Project Images: 

[Final project photos]

 

 Development Images:

[Development photos]

 

Link to the Arduino and Processing code hosted on GitHub:

https://github.com/HippieInDisguise/Hexa

 

 Circuit Diagram:

[Circuit diagram]

IT COMES IN WAVES

By Arjun Dutt

Since I was a child, I have been a lover and appreciator of music. For Experiment 2, I wanted to channel that love into something tangible that might let me engage with music in a way I haven’t been able to before. The idea that took shape was a lamp with multi-coloured LEDs that would react (strobe/flicker) to sound. A sound-reactive LED lamp probably isn’t a novel idea, but being able to make interactive objects like this has been a long-standing dream of mine, and one I’m happy to fulfil.

I have connected three colours of LEDs: red, yellow, and blue. If there is no music in the room, only the yellow LEDs remain on. If there is music at a moderate volume, the red LEDs start flashing. And if the music is loud, all the LEDs flash.
• Discussion of Calm Technology Principles:

My LED lamp engages with three Principles of Calm Technology, namely:
1. Technology should require the smallest possible amount of attention: The lamp sits in a dormant state when there is little or no ambient sound. But if someone decides to have a party and play loud music, the lamp will reflect the same ‘mood’ and react accordingly.
2. Technology should make use of the periphery: What looks like an otherwise unassuming, everyday origami lamp at first glance quickly attracts our attention as the LEDs light up. Something that is just part of the furniture is made to stand out by its state changes.
3. Technology should work even when it fails: If all else fails and my music-reactive code doesn’t run, it will still work perfectly as a regular lamp.

Experience Video:
https://vimeo.com/manage/videos/639168677

How It Works Video:
https://vimeo.com/manage/videos/639169990

 

[Process photos and screenshots]

 

• Arduino Code (I couldn’t figure out how to upload it to GitHub):

int soundsensor = A0;
int led1 = 12; // yellow
int led2 = 11; // yellow
int led3 = 10; // red
int led4 = 9;  // red
int led5 = 8;  // blue
int led6 = 7;  // blue

void setup() {
  Serial.begin(9600);
  pinMode(soundsensor, INPUT);
  pinMode(led1, OUTPUT);
  pinMode(led2, OUTPUT);
  pinMode(led3, OUTPUT);
  pinMode(led4, OUTPUT);
  pinMode(led5, OUTPUT);
  pinMode(led6, OUTPUT);
}

void loop() {
  int sensorvalue = analogRead(soundsensor);
  Serial.println(sensorvalue);

  // Yellow LEDs: the lamp's resting state. Once ambient sound crosses
  // the threshold they latch on and stay lit.
  if (sensorvalue >= 200) {
    digitalWrite(led1, HIGH);
    digitalWrite(led2, HIGH);
  }

  // Red LEDs flash at moderate volume.
  if (sensorvalue >= 430) {
    digitalWrite(led3, HIGH);
    digitalWrite(led4, HIGH);
  } else {
    digitalWrite(led3, LOW);
    digitalWrite(led4, LOW);
  }

  // Blue LEDs join in when the music is loud.
  if (sensorvalue >= 500) {
    digitalWrite(led5, HIGH);
    digitalWrite(led6, HIGH);
  } else {
    digitalWrite(led5, LOW);
    digitalWrite(led6, LOW);
  }
}

• Circuit diagram:

[Circuit diagram]

 

Breaking The Ice

This being my first foray into the world of coding, “Breaking The Ice” is a group of four interactive art experiments that have, for all intents and purposes, broken the frozen layer of ice between myself and coding in JavaScript. After a cautious start, I am inspired to delve deeper into the virtual world of multimedia art and interactivity.

Keeping in mind that I am a novice, I wanted the art/media I was creating to be fairly straightforward, and to focus instead on the interactivity between body movements and the art, to see how movements and gestures in the physical world could alter the outcome in the virtual world. The interactive artworks all start from a single shape and respond to changes in the viewers’ movements and gestures, such that the viewers can direct the outcome of the piece.

The task at hand was to use our bodies as a controller and to find or create two ways to perform a “click” and two ways to perform a “scroll”, both otherwise ordinary motions we take for granted and usually delegate to the mouse. In these experiments, a predefined body movement or gesture substitutes for the mouse “click” or “scroll”. Everything was made using the p5.js web editor, the ml5.js library, and PoseNet.

 

Click #1: Red-tangles

My first piece is a play on the word ‘rectangle’. In this case, I moved the background() call into setup(), which allows each frame to draw over the last in a loop. The screen stays red until both wrists are brought together in front of the face, like a clicking action. At that point, small white rectangles start to emerge on screen. When the wrists are separated again, the ‘red-tangles’ start to fill the screen back up, eventually returning to an all-red canvas.
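
As a rough sketch of how a wrist-to-wrist ‘click’ can be detected (the full source is linked below), assuming the ml5.js library is loaded alongside p5.js; the distance threshold and rectangle sizes here are guesses, not the values from the actual sketch:

let pose = null;

function setup() {
  createCanvas(640, 480);
  background(255, 0, 0); // background lives in setup, so frames accumulate
  const video = createCapture(VIDEO);
  video.hide();
  ml5.poseNet(video).on('pose', results => {
    if (results.length > 0) pose = results[0].pose;
  });
}

function draw() {
  if (!pose) return;
  noStroke();
  // The "click": both wrists brought close together.
  const d = dist(pose.leftWrist.x, pose.leftWrist.y,
                 pose.rightWrist.x, pose.rightWrist.y);
  if (d < 50) {
    fill(255); // wrists together: white rectangles emerge
  } else {
    fill(255, 0, 0); // wrists apart: red-tangles reclaim the canvas
  }
  rect(random(width), random(height), 12, 8);
}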

[Screenshot of Red-tangles]

Video Demonstration: https://vimeo.com/616961584

Source Code: https://editor.p5js.org/arjundutt20/sketches/CLlRpep7g

Presentation: https://preview.p5js.org/arjundutt20/present/CLlRpep7g

 

Click #2: Vertigo

‘Vertigo’ is intended to recreate the feeling of free-falling. Prompted by my fear of sudden vertical drops, I wanted to see if I could visualize a long, constant free-fall. In this piece, the constantly changing grayscale of the rectangles is meant to depict motion, as if you were falling past each subsequent rectangle. The grayscale freezes if the viewer’s right wrist touches their left shoulder, an action that can be considered ‘half’ of the full brace position one assumes before an impact.
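
A minimal sketch of the freeze gesture, again assuming ml5.js PoseNet; the touch threshold and the nested-rectangle pattern are placeholders rather than the actual code (linked below):

let pose = null;

function setup() {
  createCanvas(640, 480);
  background(0);
  const video = createCapture(VIDEO);
  video.hide();
  ml5.poseNet(video).on('pose', results => {
    if (results.length > 0) pose = results[0].pose;
  });
}

function draw() {
  if (pose) {
    // Half a brace position: right wrist touching the left shoulder.
    const d = dist(pose.rightWrist.x, pose.rightWrist.y,
                   pose.leftShoulder.x, pose.leftShoulder.y);
    if (d < 40) return; // freeze the grayscale while "bracing"
  }
  // Nested rectangles in a constantly shifting grayscale suggest free-fall.
  noFill();
  for (let i = 0; i < 10; i++) {
    stroke(random(255));
    const inset = i * 24;
    rect(inset, inset, width - 2 * inset, height - 2 * inset);
  }
}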

[Screenshot of Vertigo]

Video Demonstration:  https://vimeo.com/616962516

Source Code: https://editor.p5js.org/arjundutt20/sketches/skWqat5x3

Presentation: https://preview.p5js.org/arjundutt20/present/skWqat5x3

 

Scroll #1: Hollipse

‘Hollipse’, or the hollow ellipse, was inspired by artists who can draw with both hands simultaneously. Initially, I tied the drawing to the X/Y position of the mouse, so whenever the mouse moved on the canvas it would form a design, and its mirrored counterpart would be drawn too. I later substituted the nose-tracking feature of PoseNet for the mouse. A scroll is usually a movement along a single axis that shifts your position on the screen; in this example, I have allowed the ‘scroll’ to move freely in any direction.
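
The mirroring itself boils down to reflecting the x coordinate across the vertical centre of the canvas. A hypothetical p5.js sketch of the idea (the real source is linked below):

let pose = null;

function setup() {
  createCanvas(640, 480);
  background(0); // no background in draw, so the trails accumulate
  const video = createCapture(VIDEO);
  video.hide();
  ml5.poseNet(video).on('pose', results => {
    if (results.length > 0) pose = results[0].pose;
  });
}

function draw() {
  if (!pose) return;
  noFill();
  stroke(255);
  // One hollow ellipse at the nose position...
  ellipse(pose.nose.x, pose.nose.y, 30, 30);
  // ...and its mirror image across the centre line.
  ellipse(width - pose.nose.x, pose.nose.y, 30, 30);
}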

[Screenshot of Hollipse]

Video Demonstration: https://vimeo.com/616960983

Source Code: https://editor.p5js.org/arjundutt20/sketches/0bPTkezl1

Presentation: https://preview.p5js.org/arjundutt20/present/0bPTkezl1

 

Scroll #2: The Blue Ball

The Blue Ball was the first piece I worked on and the one that proved most informative. The Blue Ball is a real-time representation of the nose on the screen. Wherever you move your face, the ball follows (albeit mirrored). The closer you bring your face to the camera, the bigger the ball becomes; if you take a long, sweeping step back, the ball becomes smaller. This back-and-forth motion relative to the screen is how I relate it to the ‘scroll’ function.
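
One way to approximate closeness with PoseNet is the on-screen distance between the two eyes, which grows as the face approaches the camera. A hypothetical sketch along those lines (the actual source is linked below; the mapping ranges are guesses):

let pose = null;

function setup() {
  createCanvas(640, 480);
  const video = createCapture(VIDEO);
  video.hide();
  ml5.poseNet(video).on('pose', results => {
    if (results.length > 0) pose = results[0].pose;
  });
}

function draw() {
  background(255);
  if (!pose) return;
  // Eye-to-eye distance as a rough proxy for how close the viewer stands.
  const eyeDist = dist(pose.leftEye.x, pose.leftEye.y,
                       pose.rightEye.x, pose.rightEye.y);
  const size = map(eyeDist, 10, 120, 10, 200, true);
  noStroke();
  fill(0, 0, 255);
  // Mirror the x position so the ball moves like a reflection.
  ellipse(width - pose.nose.x, pose.nose.y, size, size);
}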

[Screenshot of The Blue Ball]

Video Demonstration: https://vimeo.com/616962613

Source Code: https://editor.p5js.org/arjundutt20/sketches/63bL8aicu

Presentation: https://preview.p5js.org/arjundutt20/present/63bL8aicu

 

Observations and Reflections:

I was very apprehensive at the start because this was the first time I had dabbled in coding. I found the syntax particularly difficult to wrap my head around, but the more I watched videos, practiced, and asked questions of my classmates, the more familiar it became. I am definitely going to keep tinkering with p5.js and PoseNet; they open the door to interactivity between the physical and virtual worlds.
