Experiment 5 – Unexpressed?


Project Title
Unexpressed? (Working Title)
An interactive blank canvas.

Project by Jignesh Gharat


Project Description

When we visit art galleries and museums, we come across abstract as well as minimal art and try to appreciate and interpret the meaning behind it through the gestures, expressions, and emotions it evokes in the unconscious mind with movement and color.

Abstract expression finds its roots in ‘intuition’ (of the artist) and ‘freedom’ (for the artist as well as for the viewer). The artist uses their imagination to look beyond what we can physically see and translates intangible emotions onto the canvas. The audience then tries to connect to the artist’s intention and free their mind of visual restrictions.

Minimalism, by contrast, moves away from individual expression; it doesn’t always have a connection with the artist.

What is a blank canvas?
An empty canvas with some texture, lines, patterns, and tints of white. Or an expression of emptiness, of curiosity, of stories; it could be anything.

What if a blank canvas could try to express itself, instead of viewers projecting their interpretations, emotions, beliefs, and stories onto it?


Technology

  • MacBook Pro
  • 2 × Arduino Uno / Nano
  • 2–3 servo motors
  • VL53L0X laser distance sensors
  • LED bulb

Materials

  • Stretch fabric (Lycra)
  • Face mask, hand model
  • Old/vintage wooden canvas frame, 2′ × 3′
  • 4 plywood strips, 6″ × 32″

Work plan

22nd–23rd: Material procurement, storytelling, and ideation
24th–27th: Code debugging, calibrating the distance sensor, mockup, testing
27th–30th: Iterating, fabrication, and assembling
1st–2nd: Installation/setup
3rd–4th: Exhibitions


Physical installation details

As the viewer comes within range of the distance sensor, the blank white canvas starts to show an embossed human figure pushing out of the canvas surface.
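A rough Arduino sketch of this behaviour, written as an assumption since the piece is still a proposal: it uses the Adafruit_VL53L0X library and a single servo to push the figure into the stretch fabric. The pin number, distance threshold, and servo angles are placeholders, and the final build may use the two or three servos listed above.

// Rough sketch (assumed wiring and thresholds): emboss the figure when a viewer is in range
#include <Wire.h>
#include <Servo.h>
#include "Adafruit_VL53L0X.h"

Adafruit_VL53L0X lox;
Servo pushServo;

const int SERVO_PIN  = 9;    // placeholder servo pin
const int NEAR_MM    = 800;  // placeholder: viewer closer than ~80 cm triggers the emboss
const int REST_ANGLE = 0;    // figure flat behind the canvas
const int PUSH_ANGLE = 90;   // figure pressed into the stretch fabric

void setup() {
  pushServo.attach(SERVO_PIN);
  pushServo.write(REST_ANGLE);
  lox.begin();               // initialise the laser distance sensor over I2C
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);
  // RangeStatus 4 means the reading is out of range / invalid
  if (measure.RangeStatus != 4 && measure.RangeMilliMeter < NEAR_MM) {
    pushServo.write(PUSH_ANGLE);
  } else {
    pushServo.write(REST_ANGLE);
  }
  delay(100);
}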


Exhibition Space

Preferably a quiet art gallery space with a wall that can support the heavy canvas, and a spotlight on the canvas.

HOVER BEAT

A musical instrument played without touch

Jignesh Gharat


Project Description:

Hover Beat is an interactive musical instrument that is played without touch or physical contact, installed in a controlled environment with a constant light source. The project aims to explore interactions with a strong conceptual and aesthetic relationship between the physical interface and the events that happen in the form of audio output.

Interaction:

A project’s potential radius of interaction is usually determined by technical factors, be it simply the length of a mouse cord or the need for proximity to a monitor used as a touch screen, the angle of a camera observing the recipient, or the range of a sensor. However, the radius of interaction is often not visible from the outset—especially in works that operate with wireless sensor technology.

In this project, the attempt is not to mark out the radius of interaction or spatial boundaries at all, so that they can be experienced only through interaction.

Explorations & Process:

I started exploring different sensors that could be used to control sound: a sound-detection sensor, a flex sensor, DIY capacitive sensors made of aluminum foil, and pressure sensors. I finally settled on a light sensor to make the interaction invisible. Because the user doesn’t see or understand how the instrument actually works, the object opens up many possibilities for interaction, and they explore and learn while interacting.


Flex Sensor | Arduino Uno or Arduino Nano


Light sensor LDR | Arduino Uno | Resistor


Using a glass bowl to calibrate the base sensor reading in the DF 601 studio environment at night.

How does the sound actually change?

The data coming from the Arduino and the LDR is used in Processing to control the playback speed and amplitude of the sound. A steady reading, taken from the constant amount of light detected by the LDR, is used as a benchmark, as sketched below.

Libraries in Processing:
import processing.sound.*; (for simple sound playback)
import processing.serial.*; (the Processing serial library)
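A minimal Processing sketch of this mapping, assuming the Arduino simply prints one LDR reading per line at 9600 baud; the sound file name and the baseline value are placeholders, and the actual code is in the GitHub repository linked below.

// Minimal Processing sketch: map LDR readings from the Arduino to playback rate and amplitude
import processing.sound.*;
import processing.serial.*;

SoundFile beat;
Serial port;
float baseline = 600;  // placeholder: calibrated benchmark reading under the constant gallery light
float reading  = 600;  // latest LDR value received from the Arduino

void setup() {
  size(200, 200);
  beat = new SoundFile(this, "tabla.mp3");  // placeholder file name
  beat.loop();
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line != null) {
    float value = float(trim(line));
    if (!Float.isNaN(value)) {
      reading = value;
    }
  }
}

void draw() {
  // A hand hovering over the bowl blocks light, so the reading drops below the baseline
  float deviation = constrain(baseline - reading, 0, baseline);
  beat.rate(map(deviation, 0, baseline, 1.0, 2.0));  // faster playback as the hand gets closer
  beat.amp(map(deviation, 0, baseline, 0.3, 1.0));   // louder as more light is blocked
  background(map(deviation, 0, baseline, 255, 0));   // simple visual feedback
}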

Sound:

I started experimenting with 8-bit sounds, vocals, and instruments. The sounds changed their playback rate and amplitude based on the amount of light received by the LDR sensor. To minimize noise and improve clarity in the ever-changing sounds, the best option was to work with clean beats, so I decided to work with instrumental music. The interactions mostly use the hands, so I did some research on musical instruments that are played with the hands and produce distinct, clear beats. The tabla, an Indian classical membranophone percussion instrument, was my inspiration for developing the form and the interaction gestures.

Aesthetics: 

The form is inspired by the shape of the tabla. A semi-circular glass bowl defines the boundary, or starting point, for measuring readings and limits the interaction to a radius around the LDR sensor. The transparent glass confuses users and makes them curious about how the instrument works. The goal was to come up with a minimal, simple, and elegant product that is intuitive and responsive in real time.


Exhibition Setup:

The installation works only in a controlled environment where the light level is constant, because a base reading is calibrated and used as a benchmark for changing the sound.


Experiment Observations:

The affordances were clear enough. Because sound was playing, users got the clue that the object had something to do with touching or tapping, but on interacting they quickly found out that hovering at different heights over the object manipulates the tabla sounds. People tried playing with the instrument. People with some technical knowledge of sensors were more creative, as they figured out that it is light that controls the sound.

Coding:

GitHub – https://github.com/jigneshgharat/HoverBeat

Scope:

The experiment has laid a foundation for developing the project further into a home music system that reacts to light quality and sets the musical mood: for example, if you dim the lights, the music player switches to an ambient, peaceful soundtrack. A new musical instrument can be made just by using DIY sensors at very low cost, with new and interesting interactions.



Eggxercise

(Creation & Computation DIGF-6037-001)

Team: Jignesh Gharat & Nadine Valcin
Mentors: Kate Hartman & Nick Puckett


Project description

Eggxercise is a hybrid digital and physical version of the traditional egg-and-spoon race where participants balance an egg on a spoon while racing to a finish line. It replaces the typical raw egg used in the classic version with a digital egg on a mobile screen. If the egg touches the side of the screen, it breaks, displaying the contents of a cracked egg with the message “Game over”. 

Because of space and time constraints, a relay race format was used for the in-class demonstration. The participants were divided into 3 teams whose members had to go through a simple obstacle course made out of a row of chairs. When they finished their leg, they had to pass the phone to the next member of their team without breaking the egg. If at any moment the egg broke, the participant holding the phone had to reload the game and wait for the expiration of a 5-second countdown before resuming the race.

Presentation Day

Project context

We both shared a strong desire to explore human-computer interaction (HCI) by creating an experience with an interface that forced participants to use their bodies in order to complete a task. That physical interaction had to be sustained (unlike a momentary touch on the screen or click of a mouse) and had to be different from the many daily interactions people have with their smart devices such as reading, writing, tapping, and scrolling. In other words, we were searching for an experience that would momentarily disrupt the way people use their phones in a surprising way that simultaneously made them more aware of their body movements. 

We also wanted to produce something that was engaging in the simple manner of childhood games that elicit a sense of abandon and joy. It had to have an intuitive interface and an immediacy that didn’t require complex explanations or a high level of skill, but it simultaneously had to provide enough of a challenge as to require a high level of engagement. As Eva Hornecker remarks:

 “ We are most happy when we feel we perform an activity skillfully […] Tangible and embodied interaction can thus be a mindful activity that builds upon the innate intelligence of human bodies.”  (23)


poseNet() Library (ML)

We explored the different ways in which the body could be used as a controller, mainly through sound and movement. PoseNet, a machine learning model that allows for real-time human pose estimation (https://storage.googleapis.com/tfjs-models/demos/posenet/camera.html), offered the possibility of interacting with a live video image of a person’s face. We imagined an experience that would attach a virtual object to a person’s nose and allow the person to move that object along a virtual path on a computer screen. This led to the idea of a mouse following its nose to find a piece of cheese. It would force users to move their entire upper body in unusual ways in order to complete the task.

We then moved on to the idea of controlling an object on a mobile device through voice and gestures. Building on our desire to make something inspired by childhood games, we decided to transpose the egg-and-spoon game. We didn’t want a traditional touch interaction, so we used the accelerometer and gyroscope data from the phone to sense tilting, rotation, and acceleration and control the movements of a virtual egg on the screen. This allowed for immediate and unmediated feedback to the user, who could quickly gauge the acceptable range of motion required not to break the egg. This can be seen as an application of a direct manipulation interface (Hutchins, 315), where the represented object, in this case the virtual egg, behaves in a similar fashion to a real egg placed on a moving flat surface. The interface also feels more direct because the user’s intention to balance the egg, as demonstrated by their hand movements, produces the expected results following the normal rules of physics.
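A minimal p5.js sketch of this tilt mapping, given as an illustration rather than the project code (which is on GitHub, linked below): the built-in rotationX and rotationY values report the phone’s tilt, and the egg breaks as soon as it reaches a screen edge. The egg size and speed factors are placeholder values.

// Minimal p5.js sketch: tilt the phone to move the egg; touching an edge breaks it
let x, y;
let broken = false;

function setup() {
  createCanvas(windowWidth, windowHeight);
  x = width / 2;
  y = height / 2;
}

function draw() {
  background(220);
  if (!broken) {
    // rotationX / rotationY are p5.js device-rotation values (in degrees) on mobile
    x += rotationY * 0.5;
    y += rotationX * 0.5;
    // The egg breaks as soon as it touches any side of the screen
    if (x < 20 || x > width - 20 || y < 30 || y > height - 30) {
      broken = true;
    }
    fill(255);
    ellipse(x, y, 40, 60);  // stand-in for the egg graphic
  } else {
    textAlign(CENTER, CENTER);
    textSize(32);
    text('Game over', width / 2, height / 2);
  }
}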

Eggxercise aims to investigate a natural user interface (NUI) where the interaction with the smart device is direct, tangible, and consistent with our natural behaviors. Within that paradigm, it aligns with the multi-disciplinary field of tangible embodied interaction (TEI), which explores the implications and rich potential of interacting with computational objects within the physical world, and with the projects and experiments of MIT’s Tangible Media Group, led by Hiroshi Ishii, which continuously search for new ways “to seamlessly couple the dual worlds of bits and atoms.” (tangible.media.mit.edu)

Mobile phone game play demo

Our project integrates a virtual object on a physical device that responds to movement in the physical world in a realistic way. In that sense, it is related to the controllers used in gaming devices such as the Nintendo Wii and the Sony PlayStation. It also has the embodied interaction of the Xbox Kinect while maintaining a connection to a real-world object.


Game court

Played between 3 teams of 6 participants on a 10 m relay track. The layout is illustrated in the image below.

Game Court


The code for our P5 Experiment can be viewed on GitHub:

https://github.com/nvalcin/Eggxercise/blob/master/Final%20code


Technical issues

The sound for Eggxercise launched automatically on Android devices. We spent a lot of time trying to get the sound to work on the iPhone, only to discover accidentally that users had to touch the screen to activate the sound on those devices (a minimal activation sketch follows the list below).

Sound output:

  • Android devices – browser used: Firefox; sound starts automatically.
  • iPhone – browser used: Safari; sound only starts after the first touch.
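A minimal sketch of the workaround, assuming p5.sound is loaded ('hen.mp3' is a placeholder file name): iOS Safari blocks audio until a user gesture, so the first touch is used to start the audio context and begin playback.

// Assumes p5.sound is loaded; 'hen.mp3' is a placeholder file name
let henSound;

function preload() {
  henSound = loadSound('hen.mp3');
}

function setup() {
  createCanvas(windowWidth, windowHeight);
}

function touchStarted() {
  userStartAudio();  // resumes the audio context on iOS Safari after the first user gesture
  if (!henSound.isPlaying()) {
    henSound.loop();
  }
}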

Next steps
  • A background using lerpColor to turn the screen red as the egg approaches the edges (sketched below).
  • More levels: increasing the speed of the egg, adding a second egg or obstacles on the screen.
  • A timer indicating how long the user has managed to balance the egg.
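A possible sketch of the lerpColor background mentioned in the first item (not implemented; the helper name, warning distance, and colours are all placeholders): the background interpolates from white to red as the egg nears the closest edge. It would be called at the top of draw() in place of a plain background() call.

// Hypothetical helper (not implemented): redder background as the egg nears the closest edge
function drawWarningBackground(eggX, eggY) {
  // Distance from the egg to the nearest screen edge
  const edgeDist = min(eggX, width - eggX, eggY, height - eggY);
  // 0 when far from the edges, 1 when touching one (150 px warning zone is a placeholder)
  const danger = constrain(1 - edgeDist / 150, 0, 1);
  background(lerpColor(color(255), color(255, 0, 0), danger));
}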

Observations / User test
  • The participants used different browsers on mobile phones with different operating systems and specifications, and all the phones were connected over Wi-Fi to load the code from p5.js, so it was difficult to start the game at the same time because loading times varied.
  • Participants with larger screens had an advantage over others.
  • Adding obstacles on the path made the game more challenging and fun.
  • The background sound (Hen.mp3) enhanced the experience, as the motion of the egg changed the speed and amplitude of the sound.
  • Participants had a good time using the app themselves as well as watching others play.

Learnings
  • Getting started with creative coding and understanding the basic workflow of p5.js and JavaScript.
  • Integrating graphics with code.
  • Making the screen adapt to any screen size (laptops and mobile phones).
  • Exploring various interaction and interface patterns using code, e.g. touch, voice, motion tracking, swipe, and shake.

References

Hornecker, Eva. “The Role of Physicality in Tangible and Embodied Interactions.” Interactions, March–April 2011, pp. 19–23.

Hutchins, Edwin L., James D. Hollan, and Donald A. Norman. “Direct Manipulation Interfaces.” Human-Computer Interaction, vol. 1, 1985, pp. 311–338.

Tangible Media Group, tangible.media.mit.edu/vision/. Accessed September 28, 2019.

poseNet() library – https://ml5js.org/reference/api-PoseNet/

Amplitude Modulation – https://p5js.org/examples/sound-amplitude-modulation.html

Frequency Modulation – https://p5js.org/examples/sound-frequency-modulation.html

The Coding Train – https://www.youtube.com/channel/UCvjgXvBlbQiydffZU7m1_aw

Hen Sound Effect – https://www.youtube.com/watch?v=7ogWsIYJyGE