
Ocular
Animated Robotics – Interactive Installation

By Jignesh Gharat

Project Description

Ocular is an animated robotic object that brings life-like, emotive behaviour into the physical dimension. Its motion suggests living qualities such as sentiment, emotion, and awareness, revealing a complex inner state through expressions and behavioural patterns.

He is excited to peek out of the box and explore his surroundings, but as soon as he sees a person nearby he panics and hides back inside, as if he is shy or feels unsafe. He doesn’t like attention but enjoys staring at others.

Design Process & Aesthetics

It is an attempt to create an artificial personality. I wanted to work on an installation that encourages participation rather than spectatorship: a physical installation that lets people experience familiar objects and interactions in refreshingly humorous ways.

The process started with exploring objects and living organisms that could become part of the installation and support curious interactions, moments of surprise, and unpredictability. Gophers, crabs, snails, and robots were a few of the candidates. I finally settled on the periscope: an object that could be given behaviour and emotions, and a perfect object for playing hide and seek with visitors.

What is a periscope?
An apparatus consisting of a tube attached to a set of mirrors or prisms, by which an observer (typically in a submerged submarine or behind a high obstacle) can see things that are otherwise out of sight. From there, I started thinking through ideas, object behaviours, and the setup of the installation to come up with engaging and playful interactions.

Step 1 – Designing the bot

The model was developed in Rhino, keeping in mind the moving parts: the rotating head and the spine, which is lifted from the bottom.


Step 2 – The mechanism

I tested a stepper motor (28BYJ-48 5V 4-phase geared DC stepper with a ULN2003 driver board, for the head) and servos (an SG90 micro servo for the head and a 3001 HB analog servo for the arm). The stepper’s RPM was too low to give the bot the quickness and reflexes I wanted, so I decided to go with the servo motors, which also had better torque.


The 3D-printed body was too heavy for the analog servo motor to lift, so I finally decided to model the body in paper to reduce the weight and the load on the motor. Surface development was done in AutoCAD to produce three to four different options based on the mechanism and aesthetics, from which I chose the design shown in the image below. The arm was laser-cut in acrylic, and two versions were made to reduce friction at the contact surface between the paper body and the lifting arm.


How does it Work?

A micro servo controls the oscillation of the head (motor 1). A distance sensor at the front of the wooden box controls motor 2. When a visitor comes within the sensor’s set threshold, motor 2 pulls the lever arm down and motor 1 (the head) stops rotating.
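The full Arduino sketch is in the GitHub repository linked below. As a rough illustration of this logic only, a minimal sketch could look like the following; the pin numbers, the 300 mm trigger distance, the servo angles, and the use of the Adafruit_VL53L0X library for the laser distance sensor are assumptions for this example, not the exhibited values.

#include <Wire.h>
#include <Servo.h>
#include "Adafruit_VL53L0X.h"    // assumed library for the VL53L0X distance sensor

Adafruit_VL53L0X lox;            // distance sensor at the front of the box
Servo headServo;                 // motor 1: oscillates the head
Servo armServo;                  // motor 2: lifts and drops the periscope

const int HIDE_MM = 300;         // assumed trigger distance in millimetres
int headAngle = 0;
int sweepDir = 1;

void setup() {
  headServo.attach(9);           // assumed pins
  armServo.attach(10);
  lox.begin();
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);
  bool visitorNear = (measure.RangeStatus != 4) &&
                     (measure.RangeMilliMeter < HIDE_MM);

  if (visitorNear) {
    armServo.write(0);           // pull the lever arm down: the periscope hides
                                 // and the head servo simply stops being updated
  } else {
    armServo.write(90);          // lift the periscope back out of the box
    headAngle += sweepDir * 2;   // slow left-right oscillation of the head
    if (headAngle <= 0 || headAngle >= 180) sweepDir = -sweepDir;
    headServo.write(headAngle);
  }
  delay(30);
}

The behavioural detail that matters is that the head only moves while no one is in range, which is what makes the sudden freeze and retreat read as shyness.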


The installation is minimal and simple, using just two materials, wood and paper, which give the piece a clean finish. The head acts as a focal point, with a black sheet and a white border, as if the sensors that control the oscillation were installed inside the head. In further explorations I will make sure the sensor is not directly visible to visitors, as this sometimes leads them to interact with the sensor rather than actually experiencing the installation.


Code: https://github.com/jigneshgharat/Ocular

 

Project Context

Opto-Isolator (2007), an interactive eye robot by Golan Levin with Greg Baltus, inverts the condition of spectatorship by exploring the questions: “What if artworks could know how we were looking at them? And, given this knowledge, how might they respond to us?” The sculpture presents a solitary mechatronic blinking eye, at human scale, which responds to the gaze of visitors with a variety of psychosocial eye-contact behaviors that are at once familiar and unnerving.

The Illusion of Life in Robotic Characters – Principles of Animation Applied to Robot Animation: robot animation is a new form of animation that has a significant impact on human-robot interaction. The video relates to and extends a paper its authors published at the ACM/IEEE Human-Robot Interaction Conference.

Exhibition Reflection

The observations were made and recorded at the Open Show, Graduate Gallery and Centre for Emerging Artists & Designers, and they provided some good insights and learnings about the changes needed to develop the concept further. It was fun to see people interacting and playing with a nonliving object as if it had emotions and feelings, and to watch the way Ocular reacted to viewers’ actions. The interaction was meant to happen only from the front, but because the wall card was placed at the side of the installation, which sat on a high plinth, people tended to read it and start interacting from the side, so they did not really experience the installation as intended, although most of them figured out how it worked from the wall card.
Some interesting comments from the visitors:

  • The installation is making everyone smile.
  • I have 3 cats and all 3 cats will go crazy if they see this.
  • One visitor said all the guys react the same way Ocular did, so she didn’t want to go near him.
  • What are the next steps?
  • What was the inspiration?
  • How does this work?
  • Why does he ignore me?

A visitor (Mike Steventon) referred to Norman White’s interactive robotic project, The Helpless Robot.

Observations and visitors reactions


Ocular


Project Title
Ocular
Animated Robotics – Interactive Art Installation

Project by Jignesh Gharat


Project Description

Ocular is an animated robotic object that brings life-like, emotive behaviour into the physical dimension. Its motion suggests living qualities such as sentiment, emotion, and awareness, revealing a complex inner state through expressions and behavioural patterns.
He is excited to peek out of the box and explore his surroundings, but as soon as he sees a person nearby he panics and hides back inside, as if he is shy or feels unsafe. He doesn’t like attention but enjoys staring at others. What if a blank canvas could try to express itself, instead of viewers projecting their interpretations, emotions, beliefs, and stories onto it?


Technology

  • MacBook Pro
  • 1 Arduino Uno / Nano
  • 2–3 servo motors
  • VL53L0X laser distance sensor
  • LED bulb
  • Power bank, 20,000 mAh

Materials

  • Acrylic / paper / 3D printing (periscope body)
  • 4 plywood sheets, 30” x 30”

Work plan

22nd – 23rd: Material procurement, storytelling, and ideation
24th – 27th: Code debugging, calibrating the distance sensor, mockup, testing
27th – 30th: Iterating, fabrication, and assembling
1st – 2nd: Installation / setup
3rd – 4th: Exhibitions


Physical installation details

The head of the robot (periscope) observes the surroundings, curious about the things happening around it. When a viewer comes within range of the distance sensor, the robot quickly hides inside the box and peeks out. When no one is in range of the sensor, the robot pops out again.


Exhibition Space

Preferably a silent art gallery space with a plinth. Spotlight on the plinth.


HOVER BEAT

Musical Instrument without Touch

Jignesh Gharat


Project Description:

Hover Beat is an interactive musical instrument that is played without touch or physical contact, installed in a controlled environment with a constant light source. The project aims to explore interactions with a strong conceptual and aesthetic relationship between the physical interface and the events that happen in the form of audio output.

Interaction:

A project’s potential radius of interaction is usually determined by technical factors, be it simply the length of a mouse cord or the need for proximity to a monitor used as a touch screen, the angle of a camera observing the recipient, or the range of a sensor. However, the radius of interaction is often not visible from the outset—especially in works that operate with wireless sensor technology.

In this project, the attempt is not to mark out the radius of interaction or spatial boundaries at all, so that it can be experienced only through interaction.

Explorations & Process:

I started exploring different sensors that could be used to control sound: a sound-detection sensor, a flex sensor, DIY capacitive sensors made with aluminum foil, and pressure sensors, and I finally ended up using a light sensor to make the interaction invisible. The user doesn’t see or understand how the instrument actually works, which opens up many possibilities for interacting with the object: they explore and learn while interacting.


Flex Sensor | Arduino Uno or Arduino Nano


Light sensor LDR | Arduino Uno | Resistor


Using a glass bowl to calibrate the base sensor reading in the DF 601 studio environment at night.

How does the sound actually change?

The data coming from the Arduino and the LDR is used in Processing to control the playback speed and amplitude of the sound. A steady reading, taken from the constant amount of light detected by the LDR, is used as a benchmark.

Libraries in Processing:
import processing.sound.*; (for simple sound playback)
import processing.serial.*; (the Processing serial library, for reading the Arduino data)
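The full code is on GitHub (linked under Coding below). As a sketch of the sensor end of that pipeline only, the Arduino side can be as small as the following; the A0 pin, the 9600 baud rate, and the three-second start-up calibration window are assumptions for illustration, and the benchmark could just as well be computed in the Processing sketch.

const int LDR_PIN = A0;          // assumed analog pin for the LDR voltage divider
long baseline = 0;               // steady "benchmark" reading under constant light

void setup() {
  Serial.begin(9600);
  // Average the ambient light for about three seconds at start-up
  // to establish the benchmark the live readings are compared against.
  for (int i = 0; i < 300; i++) {
    baseline += analogRead(LDR_PIN);
    delay(10);
  }
  baseline /= 300;
}

void loop() {
  int light = analogRead(LDR_PIN);   // 0-1023; drops as a hand hovers over the bowl
  Serial.print(light);
  Serial.print(',');
  Serial.println(baseline);          // Processing reads this pair over serial
  delay(20);
}

On the Processing side, the serial library reads these values, and the difference from the benchmark can be scaled with map() into arguments for the loaded SoundFile’s rate() and amp() calls, which is what produces the hover-controlled changes in playback speed and loudness described above.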

Sound:

I started experimenting with 8-bit sounds, vocals, and instruments. The sounds changed their playback rate and amplitude based on the amount of light received by the LDR sensor. To minimize noise and improve clarity in the ever-changing sounds, the best option was to work with clean beats, so I decided to work with instrumental music. The interactions mostly involved the hands, so I researched musical instruments that are played with the hands and produce distinct, clear beats. The tabla, an Indian classical membranophone percussion instrument, was my inspiration for developing the form and the interaction gestures.

Aesthetics: 

The form is inspired by the shape of the tabla. A semi-circular glass bowl defines the boundary, or rather the starting point for measuring readings, and limits the interaction to a radius around the LDR sensor. The transparency of the glass confuses users and makes them curious about how it works. The goal was a minimal, simple, and elegant object that is intuitive and responsive in real time.


Exhibition Setup:

The installation works only in a controlled environment where the light level is constant, because a calibrated base reading is used as the benchmark against which the sound is changed.


Experiment Observations:

The affordances were clear enough. Because sound was already playing, users got a clue that the object had something to do with touching or tapping, but on interacting they quickly found out that it is hovering at different heights over the object that manipulates the tabla sounds. People tried playing the instrument, and those with some technical knowledge of sensors were more creative, as they worked out that it is the light that controls the sound.

Coding:

GitHub – https://github.com/jigneshgharat/HoverBeat

Scope:

The experiment has laid a foundation for developing the project further into a home music system that reacts to light quality and sets the musical mood; for example, if you dim the lights, the player switches to an ambient, peaceful soundtrack. A new musical instrument with new, interesting interactions can also be made at very low cost using DIY sensors.



Eggxercise                                                     

 (Creation & Computation DIGF-6037-001)

Team: Jignesh Gharat & Nadine Valcin
Mentors: Kate Hartman & Nick Puckett


Project description

Eggxercise is a hybrid digital and physical version of the traditional egg-and-spoon race where participants balance an egg on a spoon while racing to a finish line. It replaces the typical raw egg used in the classic version with a digital egg on a mobile screen. If the egg touches the side of the screen, it breaks, displaying the contents of a cracked egg with the message “Game over”. 

Because of space and time constraints, a relay race format was used for the in-class demonstration. The participants were divided into 3 teams whose members had to go through an obstacle course made out of a simple row of chairs. When they finished their leg, they had to pass the phone on to the next member of their team without breaking the egg. If, at any moment, the egg broke, the participant holding the mobile phone had to reload the game and wait for the expiration of the 5-second countdown before resuming the race.

Presentation Day

Project context

We both shared a strong desire to explore human-computer interaction (HCI) by creating an experience with an interface that forced participants to use their bodies in order to complete a task. That physical interaction had to be sustained (unlike a momentary touch on the screen or click of a mouse) and had to be different from the many daily interactions people have with their smart devices such as reading, writing, tapping, and scrolling. In other words, we were searching for an experience that would momentarily disrupt the way people use their phones in a surprising way that simultaneously made them more aware of their body movements. 

We also wanted to produce something that was engaging in the simple manner of childhood games that elicit a sense of abandon and joy. It had to have an intuitive interface and an immediacy that didn’t require complex explanations or a high level of skill, but it simultaneously had to provide enough of a challenge as to require a high level of engagement. As Eva Hornecker remarks:

 “ We are most happy when we feel we perform an activity skillfully […] Tangible and embodied interaction can thus be a mindful activity that builds upon the innate intelligence of human bodies.”  (23)


poseNet() Library (ML)

We explored the different ways in which the body could be used as a controller, mainly through sound and movement. PoseNet, a machine learning model that allows for real-time human pose estimation (https://storage.googleapis.com/tfjs-models/demos/posenet/camera.html), offered the possibility of interacting with a live video image of a person’s face. We imagined an experience that would attach a virtual object to a person’s nose and allow the person to move that object along a virtual path on a computer screen. This led to the idea of a mouse following its nose to find a piece of cheese. It would force users to move their entire upper body in unusual ways in order to complete the task.

We then moved on to the idea of controlling an object on a mobile device through voice and gestures. Building on our desire to make something inspired by childhood games, we decided to transpose the egg-and-spoon game. We didn’t want a traditional touch interaction, so we used the accelerometer and gyroscope data on the phone to sense tilting, rotation, and acceleration and control the movements of a virtual egg on a mobile phone. This allowed for immediate and unmediated feedback to the user, who could quickly gauge the acceptable range of motion required not to break the egg. This can be seen as an application of a direct manipulation interface (Hutchins, 315), where the object represented, in this case the virtual egg, behaves in a similar fashion to the way a real egg would if put on a moving flat surface. The interface also feels more direct because the user’s intention to balance the egg, as expressed by their hand movements, produces the expected results following the normal rules of physics.

Eggxercise aims to investigate a natural user interface (NUI), where interaction with the smart device is direct, tangible, and consistent with our natural behaviors. Within that paradigm, it aligns with the multi-disciplinary field of tangible embodied interaction (TEI), which explores the implications and rich potential of interacting with computational objects within the physical world, and with the projects and experiments of MIT’s Tangible Media Group, led by Hiroshi Ishii, which continuously search for new ways “to seamlessly couple the dual worlds of bits and atoms.” (tangible.media.mit.edu)

Mobile phone game play demo

Our project integrates a virtual object on a physical device that responds to movement in the physical world in a realistic way. In that respect, it is related to the controllers used in gaming devices such as the Nintendo Wii and the Sony PlayStation. It also has the embodied interaction of the Xbox Kinect while maintaining a connection to a real-world object.


Game court

Played between 3 teams of 6 participants on a 10 m relay track. The layout is illustrated in the image below.

Game Court


The code for our P5 Experiment can be viewed on GitHub:

https://github.com/nvalcin/Eggxercise/blob/master/Final%20code


Technical issues

The sound for Eggxercise launched automatically on Android devices. We spent a lot of time trying to get the sound to work on the iPhone only to accidentally discover that users had to touch the screen to activate the sound on those devices.

Sound output:

  • Android device: browser used was Firefox; the sound launched automatically.
  • iPhone: browser used was Safari; the sound only starts after the first touch on the screen.

Next steps
  • A background with lerped colour so that the screen turns red as the egg approaches the edges.
  • More levels: increasing the speed of the egg, adding a second egg, or adding obstacles on the screen.
  • A timer indicating how long the user has managed to balance the egg.

Observations / User test
  • Participants used different browsers on mobile phones with different operating systems and specifications, and all the phones loaded the code from p5.js over Wi-Fi, so it was difficult to start the game at the same time because loading times varied.
  • Participants with larger screens had an advantage over the others.
  • Adding obstacles on the path made the game more challenging and fun.
  • The background sound (Hen.mp3) did enhance the experience as the motion of the egg changed the speed and the amplitude of the sound.
  • Participants were having a good time using the app themselves as well as watching others play.

Learnings
  • Getting started with creative coding and understanding the basic workflow of p5.js and JavaScript.
  • Integrating graphics with code.
  • Making the screen adaptive to any screen size (Laptop & mobile phones).
  • Exploring various interaction and interface patterns using code, i.e. touch, voice, motion tracking, swipe, and shake.

References

Hornecker, Eva, “The Role of Physicality in Tangible and Embodied Interactions”,  Interactions, March-April 2011, pp.19-23.

Hutchins, Edwin L., James D. Hollan, and Donald A. Norman, “Direct Manipulation Interfaces”, Human-Computer Interaction, vol. 1, 1985, pp. 311-338.

Tangible Media Group, tangible.media.mit.edu/vision/. Accessed September 28, 2019.

poseNet() library: https://ml5js.org/reference/api-PoseNet/

Amplitude Modulation: https://p5js.org/examples/sound-amplitude-modulation.html

Frequency Modulation: https://p5js.org/examples/sound-frequency-modulation.html

The Coding Train: https://www.youtube.com/channel/UCvjgXvBlbQiydffZU7m1_aw

Hen Sound Effect: https://www.youtube.com/watch?v=7ogWsIYJyGE