
Project2: Naughty Robotcat – Group1

Group members: Siyu Sun, Yueming Gao, Maryam Dehghani, Victoria Gottardi


  • Brainstorming 

To generate ideas for a project involving interaction, we considered the type of interaction, physical tools related to motion and interaction, and how people would interact. This led us to the initial idea of a pet robot. Given that our equipment was limited to a robotic arm, we decided to model the actions of a cat’s paw and incorporate playful movements.

As we progressed, this concept evolved and became more refined. Ultimately, we settled on the idea of a cat’s paw that displays movement and behaviour that the viewer can communicate with and develop a connection to.



  • Pet Therapy

Pet therapy, also known as animal-assisted therapy (AAT), is a type of therapy that involves trained animals interacting with people. The animals are trained to interact calmly and gently with people and are often used in hospitals, nursing homes, and schools to provide emotional support and companionship to patients and students.

  • Research question

Is it possible to create an interactive and collaborative Naughty Robocat, and can all of our separate components work together to create a fun, inviting, and somewhat stress-free work environment?


  • The Loona Robot: This is an adorable robotic pet. Some people find comfort in interacting with robotic animals, as they can provide a sense of companionship without the potential allergies, mess, or responsibility of caring for a live pet.


  • PARO Therapeutic Robot: This is an advanced interactive robot developed by AIST, a leading Japanese industrial automation pioneer. It allows the documented benefits of animal therapy to be administered to patients in environments such as hospitals and extended care facilities where live animals present treatment or logistical difficulties.



  • Laser Cut for Cat Paw

To create a robotic arm that resembles a cat’s paw, we began by altering its appearance in the first phase. This involved using a laser cutting machine and acrylic craft felt sheets to make a glove-shaped covering in the form of a cat’s paw. We then placed this covering over the robotic arm.


  • Interactive Interface – Arduino & Robot

To increase engagement with the audience, we decided to include a touch sensor to initiate the arm’s movements. The Arduino was set up under the table, and the sensor was hidden under the sheet.


We designed different movements for two robot states: in State 1, the cat tentatively observes its environment; in State 2, the cat acts on its environment.
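The two-state behaviour driven by the touch sensor can be sketched as a simple toggle. This is an illustrative sketch in Python, not our actual Arduino code; the state names and movement names are our assumptions for illustration:

```python
# Minimal sketch of the Robocat's two-state logic, driven by a touch sensor.
# State 1: the cat tentatively observes; State 2: the cat acts on the environment.
OBSERVE, ACT = 1, 2

def next_state(state, touched):
    """Toggle between the two states whenever the sensor registers a touch."""
    if touched:
        return ACT if state == OBSERVE else OBSERVE
    return state  # no touch: stay in the current state

def movement_for(state):
    """Map each state to a named movement routine (names are illustrative)."""
    return {OBSERVE: "look_around", ACT: "slap_cup"}[state]
```

In the real setup the equivalent toggle runs on the Arduino, which signals the robot program when the hidden sensor is pressed.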


  • Working environment 

Ultimately, we aimed to simulate a real-life setting by creating a suitable environment and positioning a person in a workspace. Instead of a live cat, we used a robotic arm in this space. By incorporating familiar cat behaviour and making slight adjustments to the arm’s appearance, we endeavored to evoke for humans a sense of owning a pet, similar to the real-life experience.

As we expand the range and diversity of the movements, we may observe more interactions between the robot and humans than exist in traditional robot–human relationships.

  • To Mimic a cat 

In the next phase, we focused on modifying the movements of the robotic arm to emulate the desired actions. Our goal was to replicate the natural behavior of a cat, while also incorporating playful and mischievous movements, such as touching an object and suddenly knocking it over with a swift motion.

  • Three movements of robot cat 

Slapping the cup


Interrupting work


Patting a new toy




  • We had to consider how the robot arm’s movements would affect our created environment, as some movements did not go as planned (e.g., knocking over a plant and hitting the laptop).


  • The pig was too soft to be recognized by the touch sensor, so we had to adapt by taping a hard object (in this case a USB stick) on the bottom of it. Even then, it took many tries to get the USB stick to touch the sensor successfully.



For this assignment, we considered the use of animal therapy robots, such as PARO, the therapeutic robot. However, we aimed to develop an animal robot that could potentially help people in the workplace. Our cat robot, “The Naughty Robocat,” would serve as a companion in the workplace, helping to destress the environment and provide workers with a fun, healthy distraction and break from their work as they interact with the Robocat.

As we considered the interactions that The Naughty Robocat would perform, we began to address some important questions. The established interactions would be collaborative between us, the Robocat, and a touch sensor attached to an Arduino. Currently, the project requires one human in a simulated work environment. Other humans behind the scenes would oversee the preplanning of the Robocat’s movements through two different robot states, which would then be carried out when the human in the work environment interacts with the touch sensor while working.

In future work, our project can be extended to explore other aspects, such as designing more interactions between humans and robotic cats, investigating whether robotic cats can have different reactions to users’ actions, and exploring the possibility of playing games together between robotic cats and humans. Researching these issues can help us delve deeper into the relationship between humans and robots.

Video URL:

Robot script URL:

Arduino/processing code URL:

Group 3 – Project 1: Robot Camera

Project 1
Group 3: Wentian, Nicky, Shipra, Gavin

Final video:


In project 1, our group intended to investigate the rapport between humans and robots. Rather than treating the robot arm as a clumsy machine, we personified the robot and created a live scenario where robots can interact with humans.

In the scenario we created, the robot is endowed with a “personality”, creating the illusion of it interacting with a human in a distinctive and enchanting way.

While exploring the robot’s operation, one issue we recognized is that robots still fall into uncanny-valley territory when trying to replicate human behavior. Although the futuristic robots in sci-fi movies have demonstrated possibilities of consciousness, a scenario in which robots fully possess human characteristics is still not entirely plausible in today’s world. Based on this problem space, we wanted to experiment with potential interactions between humans and robots personified with conscious characteristics, mindsets, and behaviors.

We brainstormed an interesting scenario in which the actor interacts with the robot arm without real conversation or physical contact. Specifically, the actor will be gossiping with someone else on a phone call, and the noise she makes will wake up the robot arm. The actor will then gesture to the robot to go away, and the robot will pretend not to eavesdrop on her conversation until the moment it hears something explosive.

Our intended outcome is that, by showcasing how robots could interact with us in daily circumstances, people (especially technological designers/researchers) could be inspired and conduct further research in the field of robot personas.


We created a setup that matched the script and storyboard. We used props such as books, plants, a water bottle, paper, a laptop, and a desk lamp to create the appearance of a design studio/office. Our actor was placed directly opposite the robot arm. A hidden microphone was set up next to her to capture sound, and the phone was put in the robot arm’s gripper. We used mostly natural lighting by opening the blinds on the window, aided by the desk lamp featured in the video and an additional hidden off-screen lamp. A meta element was added to the project: the robot is looking at its own operating instructions throughout the final video.

MIMIC for Maya helped us to visualize the final video, but we decided to go with manual input for the robot arm’s waypoints, because we felt more comfortable with those controls at this point in our learning. The trickiest aspect was getting the timing correct – we had to coordinate the actor and robot to create the illusion of them responding to each other. An additional challenge was ensuring the robot didn’t solely function as a mechanical object. We gave it somewhat natural actions – looking away, glancing at its documents, being nosy – to personify it and give it some character.

We set waypoints to create the different actions of the robot. The difficult part was applying different speeds to the robot arm – some actions, such as the robot looking at its documents, required a bit more breathing room, whereas the robot being “surprised” by what it hears needed a much faster movement. After testing this out, we used the “wait” keyframe for the part where the robot needs to pause.
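The timing problem above can be sketched as a waypoint list with a per-waypoint speed and an optional wait. This is a hypothetical Python sketch, not the robot’s actual program; the distances, speeds, and pause lengths are made-up values for illustration:

```python
# Illustrative waypoint plan: slow "reading" moves get more breathing room,
# the "surprised" move runs fast, and 'wait' models the wait keyframe.
waypoints = [
    {"name": "read_documents", "distance_mm": 120, "speed_mm_s": 40,  "wait_s": 2.0},
    {"name": "glance_away",    "distance_mm": 200, "speed_mm_s": 80,  "wait_s": 0.5},
    {"name": "surprised_turn", "distance_mm": 150, "speed_mm_s": 250, "wait_s": 0.0},
]

def total_duration(points):
    """Sum travel time (distance / speed) plus any 'wait' pauses, in seconds."""
    return sum(p["distance_mm"] / p["speed_mm_s"] + p["wait_s"] for p in points)
```

Estimating the total duration this way made it easier for us to rehearse the actor’s lines against the robot’s motion.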


Development Process

Storyboard and link to script
Mimic images and link to Mimic output (1, 2)
After settling on an idea, we started gathering references for shots and the script. The intent was to create an environment where the actor and the robot could coexist, a scene where they could have a conversation. After writing the first draft of the script, we started framing the scenes and sequences. Since it is a short video, we broke it down into 8 major frames and drew a storyboard based on the script. This was further developed in Mimic, which helped us understand timing, movement, and camera angles.

After this ideation and prototyping stage, we gathered props (set elements) from the Digital Futures Lounge (6th floor) and created the environment for the shoot. Since the scene is set in a study environment, we decided to work with the robot arm and the table setup. After a few test runs with the actor, we were able to synchronize the script to the robot movements. It was easier to modify and change the dialogue delivery than to change the robot’s movement. The final shot was edited with added effects: a phone ringtone and robot arm sounds (to exaggerate the movements). A blinking action was also added to make it seem like the robot had just woken from sleep on hearing the ringtone.

Code or Files
Sound design and robot movement: Sound Design: Robotic Arm Sequence

Game Of Thrones Camera Movement



After getting hands-on experience with a robotic arm in Dev 1, we decided to use the Game of Thrones intro song as the primary music element for replicating a camera shot using robotic camera-arm movements. The iconic theme music of the show has a powerful and impactful sound that adds a sense of excitement and drama to the camera movements.



We aim to coordinate traditional cinematography with robot-assisted miniature filming, thus creating more comprehensive and innovative ways of storytelling. We also emphasize the incorporation of non-traditional materials as movie components and as the protagonists of the storytelling.

As the robot arm is a replacement for filming tools and technologies, we are exploring to what extent we can substitute other components of a short film clip, e.g., actors, sounds, and environmental elements.


What is it you are trying to figure out?

To get started, we needed to determine how we wanted the camera to move in synchronization with the music. This could involve programming the robotic arm to move in specific ways, such as up and down, left and right, or in a circular motion, while creating emphasis and blurring effects at certain phrases in the song. We think synchronizing camera footage with a sound beat can create a visually appealing and rhythmically engaging experience for the viewer.
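One way to plan this synchronization is to convert the track’s tempo into timestamps at which the arm triggers a move. A minimal sketch, assuming a known BPM (the tempo value and beat indices below are illustrative, not measured from the actual track):

```python
# Sketch: derive trigger times (seconds) for camera moves from a track's tempo.
def beat_times(bpm, beats):
    """Convert beat indices into timestamps for triggering a camera move."""
    seconds_per_beat = 60.0 / bpm
    return [round(b * seconds_per_beat, 3) for b in beats]

# e.g., trigger an emphasis move on beats 0, 4, and 8 of a 104 BPM track
cues = beat_times(104, [0, 4, 8])
```

With cue times in hand, the arm’s waypoint program can be stepped through so that each emphasis or blur lands on a beat.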

We focus on using non-traditional actors in combination with traditional filming techniques to generate familiarity and arouse emotions in the audience. In addition, we are trying to project irony through hilarious visuals and intensity through music.

Documentation of process and results

Video Link 1
Video Link 2

What you learned

The importance of exact control of speed, time, and acceleration. The rhythm of the robot arm’s movement should accord with the background music and the actor’s entrance and exit to achieve coherence. This exact control would benefit from software such as Maya to achieve the most appropriate coordination.

It is not necessary to have specific actors in the scene to maximize thematic responses/effects. There are many alternative ways of generating similar visual effects which have a lot of room for creative input. Taking overall cohesiveness and consistency into consideration is also an essential aspect which should not be ignored.

How did it go? It does not have to work to be a useful experiment.

It went well, beyond our early expectations. We brought a reflective aluminum box and some vegetables, and pulled coloured sheets as the background. Thinking through making/experimenting is a valid design research method for idea generation and exploration.

We were able to replace the background in Premiere with a still image as a form of post-production, which brings another layer of creative elements into the scene. Despite not being the focus, incorporating software into the overall production generates unique effects and expands possibilities.


Dev 1- Group 4

Team Members: Anas Raza; Chitra Chugh; Ellie Huang; Ricardo Quiza Suarez

Description of what you are investigating (100 – 200 words)

For the first experiment, our team is investigating different perspectives of cinematography and their effects on the rendering of scene dynamics. Each team member presented an idea and brought materials for testing. Anas decided to use the robotic arm for light painting, specifically drawing text with a long camera exposure capturing the robotic arm holding a light source. Ellie is exploring close-up to top-view shots, recreating a miniature landscape with elements of abstract sculptural objects. Ricardo delves into how we can translate pop-culture elements, such as memes or iconic video game scenes, into a precise, perfect camera shot with the aid of a robot. And Chitra is trying to capture speed and motion using the anime character toys widely known as Beyblades. She is creating an interactive pattern between the objects and then using a robot arm to capture the zestful dynamics of the toys, aiming for an essence of comical violence.

What is it you are trying to figure out?

For dev 1, we are trying to experiment with a diverse range of possibilities with robot cameras, in terms of rotation, movement, control, and timing. We are striving to familiarize ourselves with the robot arms and their technicalities in order to achieve optimal collaboration, specifically how the robot can aid in creating a precise scene. In addition to the technical part, we are exploring multiple themes incorporated into the cinematography of robot cameras and how unique effects can be generated via such methods.

Documentation of process and results





The scene: Boo haunting a character (Mario usually) until it touches him and Mario dies.

The props: 1 Boo figure + another figure. Human hands to assist movement.

Death by Boo example links (what we are trying to replicate):

Boo movement pattern:

Sketch diagram for setting up the scene:


Description of what you learned (100 – 200 words)

  • Importance of speed and acceleration: altering speed and acceleration completely changes the storytelling from the robot camera. There is a lot more that could be explored in terms of the accurate control of robot arms.
  • The project has to be thought about within the limitations of the setting. That includes the reach in a grounded setup, which cannot be larger than the table, and the robot itself, which has many limitations on how far it can displace its parts.

How did it go? It doesn’t have to work to be a useful experiment. Give a brief description of what you learned (100 – 200 words)

  • Human components: a human’s appearance adds another layer of playfulness and dynamics to the whole scene. On one hand, hiding human interaction makes the cinematography intriguing when the objects “come alive”; on the other hand, having human components in the scene registers human presence and strengthens the collaboration between robot and human.
  • Learning about team collaboration: when one person is maneuvering, others can help by holding the base and checking from different angles. Good, smooth collaboration maximized efficiency.
  • Regarding robot arm placement and sturdiness, we discovered that the arm could not move faster than 250 mm/s. The safety feature halted the arm whenever we tried to exceed 250 mm/s, because of the mechanical vibration produced by the high speed and an unstable base.
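The safety behaviour we ran into can be sketched as a simple speed check before a move is sent. A minimal Python sketch; the limit value is the one we observed, but the function and its halt signal are our illustration, not the robot’s actual API:

```python
# Sketch of the observed safety behaviour: requests above the limit halt the arm.
SPEED_LIMIT_MM_S = 250  # limit we hit with the arm on an unstable base

def check_speed(requested_mm_s):
    """Return the speed if within the safety limit, else None to signal a halt."""
    if requested_mm_s > SPEED_LIMIT_MM_S:
        return None  # protective stop: the arm halts instead of moving
    return requested_mm_s
```

In practice this means planning fast "surprise" moves at or below the limit, or stabilizing the base before attempting them.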
