Project 2 – Robot Tools

Robot Conductor

Shipra Balasubramani, Mona Safari, Zaheen Sandhu, Ricardo Quiza



For Project 2, we wanted to explore the relationship between a human and an orchestra conductor in the form of a robot. During a performance, a conductor serves as a messenger for the composer. Their responsibility is to understand the music and convey it through hand gestures so transparently that the musicians in the orchestra understand it perfectly.

For Devs4 we worked with this idea in mind and treated the experiment as a prototype for the final project. We tried to understand the hand movements of a conductor, how those movements are reflected in the music, and whether the robot would be able to mimic them.

This exploration helped us understand the first steps of sending data and working with state machines. This was then further developed and refined for the final output of Project 2. The intent was to mimic the movements of a conductor while synchronizing them to a classical sound piece.

Case Study

YuMi takes center stage in Pisa

A case study we looked at while exploring was ABB's robot YuMi, whose performance was developed by capturing the movements of maestro Andrea Colombini through a process known as lead-through programming. The robot's arms were guided to follow the conductor's motions meticulously; the movements were then recorded and further fine-tuned in the software. Taking inspiration from this example, we wanted to see how a robot could be in charge of conducting a performance.


Blob Opera

Blob Opera is a machine learning experiment by artist David Li that lets you create your own festive song, inspired by opera, on Google Arts & Culture. Users guide the pitch and vowel sound of four festive blobs, which transform their musical ideas into harmonies. The interactive website also includes prerecorded popular tracks organized by location: currently there are six preloaded locations and a total of 22 prerecorded soundtracks. Users can record their own compositions or the preloaded tracks, but the site does not allow downloading the recorded pieces; a recording can only be shared as a video linked in the browser.


Storyboards, Process and Fabrication


For this project, we wanted to create a collaborative experience between one human user and one UR10e robot arm. Keeping this in mind, we decided to work with the idea of the robot as an orchestra conductor, although it is the human user who controls the robot from a laptop.

The idea was to create a program in Processing where the user scrolls the mouse over the four notes/buttons shown on the screen. The interaction is designed so that the user can sweep over these notes to create their own symphony. Hovering over a note activates the corresponding state machine state, and the robot arm responds by moving to predefined positions, making the experience feel as if the robot is conducting the orchestra.
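The hover-to-state logic can be sketched in plain Java (Processing sketches compile down to Java). The class and note names, the 100 px button width, and the four-column layout are hypothetical placeholders, not the exact values from our sketch:

```java
// Minimal sketch of the four-note state machine. Each note occupies one
// column of the canvas; hovering the mouse over a column selects that
// note's state, which would in turn cue a predefined robot pose.
public class ConductorStateMachine {
    enum Note { NONE, NOTE1, NOTE2, NOTE3, NOTE4 }

    static final int BUTTON_WIDTH = 100; // hypothetical column width in pixels

    // Map the mouse x-coordinate to one of the four note states.
    static Note stateForMouse(int mouseX) {
        if (mouseX < 0 || mouseX >= 4 * BUTTON_WIDTH) return Note.NONE;
        return Note.values()[1 + mouseX / BUTTON_WIDTH];
    }

    public static void main(String[] args) {
        System.out.println(stateForMouse(50));   // hovering the first column
        System.out.println(stateForMouse(250));  // hovering the third column
    }
}
```

In the actual sketch, Processing's `draw()` loop would call this mapping every frame with `mouseX`, and a change of state would trigger both the Blob Opera note and the corresponding arm motion.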

During the development process, we decided to create miniature tools and props for the scene, which gave it a more dramatic effect. We were also keen on understanding how the robot's movements could be manipulated if it held a miniature baton. Keeping these aspects in mind, we developed a storyboard to frame the scene and understand the lighting, which also helped us build the props required for the scene.

While going through this process, we also wanted to give the robot some character. To add to the mood and experience, we stepped away from traditional classical sound pieces and instead worked with Blob Opera (Google Arts & Culture). Blob Opera is an interesting tool that allows the user to compose and generate sound pieces while adjusting the pitch and vowel sound as required. We found these characteristics very interesting, and the tool gave us an opportunity to build our own notes for this interactive experience. After understanding the basics of this method, we created a program in Processing to send mouse data, trigger the notes, and drive the motion of the robot arm holding the baton.
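One plausible way to drive the arm's motion from the program is to stream a URScript `movej` command to the UR controller's secondary client interface (commonly TCP port 30002). This is a generic sketch, not our exact pipeline; the IP address and joint angles below are placeholders:

```java
import java.io.OutputStream;
import java.net.Socket;

// Sketch: sending a predefined pose to the UR10e as a URScript movej
// command over the controller's secondary interface (TCP port 30002).
public class BatonMover {
    // Build a movej command string from joint angles in radians.
    static String movejCommand(double[] jointsRad) {
        StringBuilder sb = new StringBuilder("movej([");
        for (int i = 0; i < jointsRad.length; i++) {
            if (i > 0) sb.append(", ");
            sb.append(jointsRad[i]);
        }
        sb.append("], a=1.0, v=0.5)\n"); // placeholder acceleration/velocity
        return sb.toString();
    }

    // Open a socket to the controller and send one pose.
    static void sendPose(String robotIp, double[] jointsRad) throws Exception {
        try (Socket s = new Socket(robotIp, 30002)) {
            OutputStream out = s.getOutputStream();
            out.write(movejCommand(jointsRad).getBytes("US-ASCII"));
            out.flush();
        }
    }

    public static void main(String[] args) {
        // Print the command that would be streamed for one placeholder pose.
        System.out.print(movejCommand(new double[]{0.0, -1.57, 1.57, 0.0, 0.0, 0.0}));
    }
}
```

Each note's state would map to one such predefined pose, so triggering a note both plays its sound and sends the matching conducting gesture to the arm.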

[Process screenshots]

Final Code

Processing Code –

Final Video

final shot.mp4