Project 1
Group 3: Wentian, Nicky, Shipra, Gavin
Final video: https://drive.google.com/file/d/1yAy_Jzsbj3LDxE-Kuq0D4yje0iWWmC43/view?usp=sharing
In Project 1, our group set out to investigate the rapport between humans and robots. Rather than treating the robot arm as a clumsy machine, we personified it and created a live scenario in which the robot interacts with a human.
In this scenario, the robot is endowed with a “personality”, creating the illusion that it is interacting with a human in a distinctive and enchanting way.
Description
While exploring the robot’s operation, one issue we recognized is that robots still fall into uncanny-valley territory when trying to replicate human behavior. Although the futuristic robots of sci-fi movies have demonstrated the possibility of consciousness, a world in which robots fully possess human characteristics is still not entirely plausible today. Based on this problem space, we wanted to experiment with potential interactions between humans and robots personified with human-like characteristics, mindsets, and behaviors.
We brainstormed a scenario in which the actor interacts with the robot arm without real conversation or physical contact. Specifically, the actor gossips with someone on a phone call, and the noise she makes wakes the robot arm. The actor then gestures at the robot to go away, and the robot pretends not to eavesdrop on her conversation until the moment it hears something explosive.
Our intended outcome is that, by showcasing how robots could interact with us in everyday circumstances, people (especially technology designers and researchers) might be inspired to conduct further research into robot personas.
We created a setup that matched the script and storyboard. We used props such as books, plants, a water bottle, paper, a laptop, and a desk lamp to create the appearance of a design studio/office. Our actor was placed directly opposite the robot arm, a hidden microphone was set up next to her to capture sound, and her phone was placed in the robot arm’s gripper. We relied mostly on natural light by opening the blinds, supplemented by the desk lamp featured in the video and an additional hidden off-screen lamp. A meta element was added to the project: the robot is looking at its own operating instructions throughout the final video.
Mimic for Maya helped us visualize the final video, but we decided to go with manual input for the robot arm’s waypoints because we felt more comfortable with those controls at this point in our learning. The trickiest aspect was getting the timing right: we had to coordinate the actor and the robot to create the illusion that they were responding to each other. An additional challenge was ensuring the robot didn’t function solely as a mechanical object. We gave it somewhat natural actions (looking away, glancing at its documents, being nosy) to personify it and give it some character.
We set waypoints to create the robot’s different actions. The difficult part was applying different speeds to the robot arm: some actions, such as the robot looking at its documents, required a bit more breathing room, whereas the robot being “surprised” by what it hears needed a much faster movement. After testing this out, we used the “wait” keyframe for the parts where the robot needed to pause.
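Since we entered the waypoints manually rather than scripting them, the sketch below is only an illustration of the sequencing logic, not our actual program. The `Waypoint` structure, the joint values, and the `robot.move_to` helper are hypothetical stand-ins; the point is simply that each pose carries its own speed and an optional pause, mirroring our slow “breathing room” moves, the “wait” keyframe, and the fast “surprised” motion.

```python
import time
from dataclasses import dataclass
from typing import List


@dataclass
class Waypoint:
    """One pose in the sequence: illustrative joint angles (degrees),
    a speed scaling factor, and an optional pause after arriving."""
    name: str
    joints: List[float]        # placeholder values, not our real poses
    speed: float = 0.2         # slow, casual moves by default
    wait_after_s: float = 0.0  # stands in for the 'wait' keyframe


# Rough shape of our sequence: unhurried moves while the robot reads
# its documents, a pause while it "pretends not to listen", then a
# fast jolt when it is "surprised" by what it hears.
SEQUENCE = [
    Waypoint("glance_at_documents", [0, -60, 90, -120, -90, 0], speed=0.2),
    Waypoint("look_away", [45, -60, 90, -120, -90, 0], speed=0.2,
             wait_after_s=3.0),
    Waypoint("startled_turn", [-30, -45, 70, -110, -90, 0], speed=0.9),
]


def play(sequence, robot):
    """Step through the waypoints, scaling speed per move and inserting
    the pauses that correspond to our 'wait' keyframes."""
    for wp in sequence:
        robot.move_to(wp.joints, speed=wp.speed)  # hypothetical robot API
        if wp.wait_after_s > 0:
            time.sleep(wp.wait_after_s)
```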
Development Process
Storyboard and link to script
Mimic images and link to Mimic output (1, 2)
After settling on an idea, we started gathering references for shots and the script. The intent was to create an environment in which the actor and the robot could coexist, a scene where they could have a conversation. After writing the first draft of the script, we started framing the scenes and sequences. Since it is a short video, we broke it down into 8 major frames and drew a storyboard based on the script. This was further developed in Mimic, which helped us understand timing, movement, and camera angles.
After this ideation and prototyping stage, we gathered props (set elements) from the Digital Futures Lounge (6th floor) and created the environment for the shoot. Since the scene is set in a study environment, we decided to work with the robot arm and a table setup. After a few test runs with the actor, we were able to synchronize the script with the robot’s movements; it was easier to modify the dialogue delivery than to change the robot’s movements. The final shot was edited with added sound effects: a phone ringtone and robot arm sounds (to exaggerate the movements). A blinking action was also added to make it seem as though the robot had just been woken from sleep by the ringtone.
Code or Files
Ringtone: iPHONE RINGTONE SOUND EFFECT
Robot movement sound design: Sound Design: Robotic Arm Sequence