
Devs 3 – Follow Me (Group 2)

Group 2: Shipra, Mona, Ricky, and Zaheen


ezgif-com-video-to-gif-2

 

img_7381-4

 

Description

For Devs 3, we wanted to experiment with using the data received from the robot to trigger a response. The primary goal of this experiment was to familiarize ourselves with the connections between the robot and the programs: Processing, P5, and TouchDesigner. During these experiments, we worked with simple programs that generated simple responses, for example moving a point on the screen according to the robot’s coordinates. By adjusting these parameters, we wanted to investigate tracking the robot’s movements further and creating a visual response on the screen.
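
As a rough illustration of that point-mapping idea (a minimal sketch rather than our exact code), the Processing example below assumes the robot’s x/y coordinates arrive as OSC floats on a made-up address /robot/pos and port 12000 via the oscP5 library, and that the workspace spans roughly 0–1 m; map() then translates those values into screen pixels.

// Minimal sketch: map incoming robot coordinates to a dot on the screen.
// The OSC address, port, and 0..1 m workspace range are placeholders and
// would need to match however the robot (or a bridge script) sends its data.
import oscP5.*;
import netP5.*;

OscP5 oscP5;
float robotX = 0.5;   // last received robot coordinates (assumed 0..1 m)
float robotY = 0.5;

void setup() {
  size(800, 600);
  oscP5 = new OscP5(this, 12000);   // listen for incoming OSC on port 12000
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/robot/pos")) {
    robotX = msg.get(0).floatValue();
    robotY = msg.get(1).floatValue();
  }
}

void draw() {
  background(0);
  // map() translates the robot's workspace range onto screen pixels
  float screenX = map(robotX, 0, 1, 0, width);
  float screenY = map(robotY, 0, 1, height, 0);   // flip y so "up" stays up
  fill(255);
  ellipse(screenX, screenY, 20, 20);
}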

Final Project Video

Robot + Screen


Mind Map of the Project & Learnings

After settling on the first round of explorations, we decided to work with Processing. We programmed the sketch to track the coordinates of the robot and used the map() function to translate those coordinates onto the screen. The first iteration drew a path tracing the robot’s movements and waypoints. Since we did not want to make another drawing tool, we built on this logic for the final idea: an eyeball that traces and follows a small dot on the screen, with both elements mapped from the robot’s coordinates. We were also inspired by thinking about how to create a responsive, emotive on-screen reaction to the robot’s movement, that is, how to make the screen “communicate” with the robot.
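
The eyeball behaviour can be sketched roughly as follows (a simplified stand-in for our sketch, not the exact code): the pupil is drawn offset from the eye’s centre toward the target point, with the offset limited so the pupil stays inside the eye. Here the target dot is driven by the mouse as a placeholder for the mapped robot coordinates.

// Sketch of the eyeball-follow idea: the pupil offsets toward a target dot,
// clamped so it never leaves the eye. Swap mouseX/mouseY for the mapped
// robot position to get the follow-me behaviour described above.
float eyeR = 80;     // eye (sclera) radius in pixels
float pupilR = 25;   // pupil radius in pixels

void setup() {
  size(800, 600);
}

void draw() {
  background(0);
  float eyeX = width / 2.0;
  float eyeY = height / 2.0;
  float targetX = mouseX;   // placeholder for the mapped robot x
  float targetY = mouseY;   // placeholder for the mapped robot y

  // vector from the eye centre to the target, limited so the pupil stays inside the eye
  PVector toTarget = new PVector(targetX - eyeX, targetY - eyeY);
  toTarget.limit(eyeR - pupilR);

  fill(255);
  ellipse(eyeX, eyeY, eyeR * 2, eyeR * 2);                                // sclera
  fill(0);
  ellipse(eyeX + toTarget.x, eyeY + toTarget.y, pupilR * 2, pupilR * 2);  // pupil
  fill(255, 0, 0);
  ellipse(targetX, targetY, 10, 10);                                      // the dot being followed
}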

This process helped us understand the limitations of working with this technology. We realized the importance of working towards smaller goals: it helped us understand the movements, familiarize ourselves with the controls, and learn how the data can be translated into something meaningful using these programs. This assignment, being the first step towards Project 2, resulted in a short animation controlled by the robot, in which we tracked and mapped its movements. The intent was to create smooth transitions between the robot and the animation. This exercise was a great learning experience and will add value to our Project 2 ideation process.

 

ezgif-com-video-to-gif-3

 

Tests and Trials: Challenges

Our goal in this step was to determine the best approach and explore our preferred way of creating an insightful interaction with the robot by conducting a series of tests. Since we came up with many different ideas and concepts, we decided to follow a learning-through-making approach to find out where this interaction fits best, both creatively and practically. Through these tests and trials, we gained a deeper understanding of interacting with a non-human agent, in this case a robot. We were forced to think about the effort and challenges we might face in this field, and about keeping enough flexibility to change course, which will definitely lead to more insightful results! The following are some experiments with our concepts using different platforms:

  • TouchDesigner

In this sketch, our initial test of receiving OSC data from a cellphone was successful, but we could not figure out how to adapt it to process data from the robot. The project is as follows:

TD-osc test.gif

  • Pure Data

Like the first sketch, this one did not succeed in working with the data received from the robot. The project details are as follows:
https://www.youtube.com/watch?v=vf84-FVJZx4

screen-shot-2023-03-17-at-10-15-06-am

Group 1 – Diaries of a Robot

Diaries of a Robot
Mona Safari, Dorothy Choi, Jiamin Liu, Zaheen Sandhu

gif-1-with-text

While researching creative inspirations, we were drawn to cinematic themes in nature, miniatures, and light imagery. Our brainstorming led us to investigate how to create a simple, cinematic nature scene (e.g. animals and a cottage at the top of a hill in the countryside, near the ocean) that involves spatial and temporal changes, such as a day-to-night transition.

picture1
picture2

Description

Our main concept for Project 1 revolves around creating a cinematic nature scene. We creatively manipulated miniature elements and the robot arm to demonstrate different camera angles that capture and illustrate the scene.

The story for Project 1 is told from the robot’s point of view: the robot is getting in touch with nature, its elements, and the native people, which is reflected in the words of the poem that accompanies our 20-second cinematography. Throughout this journey, our process involved storyboarding at the start, experimenting with the robot arm, and settling on our final resources to create the scene.

ezgif-com-video-to-gif-1
Our Explorations
screenshot-2023-02-28-at-2-20-32-pm

Devs 1 – Exploration
Devs 2 – Exploration

 

Storyboard and Scene Creation

scene
Light and shadow
(Day-to-night transitions)

We explored these spatial and temporal changes through different light effects, using a mobile colour app and various tangible materials (e.g. a pink bubble-wrap envelope and a water bottle) to create coloured light that would reflect a morning-to-evening transition, and considered its impact on the scene in terms of ambiance and storytelling.

light

Cinematography

The voiceover narration comes from the perspective of the robot arm. We personified the robot as it visits nature and gets in touch with its world.

“The camera scenes follow the poem, reflecting respect for nature.”
picture4

 

May the warp be the white light of morning,

May the weft be the red light of evening,

That we may walk fittingly where grass is green,

O our Mother the Earth, O our Father the Sky.

American Indian | Tewa Song

 

The Process – Behind the Scenes

picture5picture6

picture7
picture7-1

 

 


Touch Point of the Process

One of the most crucial and trickiest parts of our process was controlling and playing with the lighting of the scene. To get the effect of morning light and transition it into evening light, we decided to use coloured lights on our mobile phones and play with spatial and temporal changes. For the first part, our main goal was a seamless sunrise transition with a sunlight effect, for which we used a mix of orange and blue light. We first hit the record button, waited a few seconds, and then pointed the light sources toward the scene, moving along with the arm to keep the lighting seamless. The next part was the transition to evening light, for which we slowly moved away from the orange light source while keeping the blue light in place and gradually bringing it closer to the camera. Through this process, we were able to achieve a slow yet seamless colour transition.

 

light-steps

picture8
ezgif-4-cb6d017c89                                                                        ezgif-4-091ab62523

 

Our Challenges

  • ENVIRONMENT

    One of the main challenges we faced while setting up our scene was the surrounding environment appearing in the video. We noticed that it was tough to record the scene without exposing the immediate surroundings to the camera. To tackle this, we built walls out of thick sheets. While this solved our main challenge, it also became a bonus element for our project: we wanted to play with light and shadow, and the sheet made a great backdrop for that purpose. We also tested different camera angles and close-up focus to avoid capturing the surroundings as much as possible.

    picture9
  • TIMING

    Sometimes it was difficult to pinpoint how fast or slow the robot arm should move with respect to the scene, and to the narration it had to match. There was a lot of learning through trial and error here. We finalized the timing by first setting a pace for the narration and then matching it to each keyframe when shooting the scene.
  • LIGHTING

    Sometimes the lighting was tricky: it could create too much shadow, capture too much shadow from the robot arm, or lack brightness/opacity in colour. We found that the flashlight function on different cellphones varies in brightness, so we used this to our advantage; for example, a brighter light against an opaque colour creates a bright coloured light instead of a dull one. Additionally, casting light adjacent to or opposite the robot arm (rather than in the same direction) minimized the shadows cast by the arm itself.


Development for the Future!

Initially, we intended to use a blurry background, but during our experimentation stage we were unable to achieve it. It could have been accomplished in a few other ways, but we decided to focus on the main part, the narrative of the project, and make it as strong as possible. This is definitely something we need to experiment with in the future!

 

Final Video: Diaries of a Robot