
Project 2 – Group 5

My Robot Coach

Group 5: Wentian Zhu, Ellie Huang, Jiamin Liu, Yifan Xu



Bounty Hunter on Cloud:

In this project, we investigated the potential of robot arms to enhance sports and gaming training in a simulated environment. In the physical world, expertise in sports and games usually entails repetitive exercise and access to appropriate equipment. Unfortunately, such resources may not be accessible to everyone, which is where robot arms come in as a valuable aid. By generating virtual scenes and offering support for sports or game-based training, robot arms can be of tremendous assistance to a wider population.
Through the use of technologies such as Arduino, WebSocket, and Unity, we were able to demonstrate how robot arms can be used to create an immersive and engaging training experience that promotes hand-eye coordination and quick thinking. However, challenges such as finding the optimal delay time in the communication between software components still need to be addressed to improve the overall gaming experience.

In addition, the feedback on the game’s representation of guns and shooting from a political perspective highlights the potential for exploring the concept further. By tying it into current societal, cultural, and political issues, we can create a thought-provoking experience that engages players in a dialogue about violence, human society, technology, and ethics.




The Arduino converts the sensor data into JSON format and sends it to the WebSocket server, which in turn forwards the data to Unity. Unity then deserializes the received JSON into C# objects.
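As a rough sketch of the receiving side of this pipeline, the snippet below parses a JSON message of the kind the Arduino might emit. The field names (`sensor`, `value`, `hit`) are illustrative assumptions, not the group's actual message schema, and Python stands in here for the C# deserialization step inside Unity.

```python
import json

# Hypothetical payload; the actual field names used in the project
# are not documented, so "sensor", "value", and "hit" are assumptions.
raw = '{"sensor": "light", "value": 812, "hit": true}'

def parse_message(text):
    """Decode a JSON message from the WebSocket into a plain dict,
    mirroring what Unity does when mapping the data onto a C# object."""
    data = json.loads(text)
    return {
        "sensor": data.get("sensor", "unknown"),
        "value": int(data.get("value", 0)),
        "hit": bool(data.get("hit", False)),
    }

msg = parse_message(raw)
print(msg)
```

Using defaults for missing fields keeps the game loop from crashing if a malformed or partial message arrives over the socket.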


Arduino Diagram and Container Design


Environment Setup


User Flow



Design Process

Step 1: Connect both Arduino and Unity to Websocket

In Step 1, we connected Arduino and Unity to the WebSocket for real-time data transfer, used a light sensor and laser pointer to trigger animations, and added a score system, reset button, and LED light strip for feedback. This created an engaging game experience that responds promptly to user actions.
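The core hit-detection and scoring logic from Step 1 can be sketched as follows. This is a minimal illustration, not the group's code: the brightness threshold of 700 is a guessed value, and in the real setup the hit would also fire the animation and light up the LED strip.

```python
# Assumed threshold: a laser strike pushes the light sensor's
# reading above this value. The real calibration may differ.
HIT_THRESHOLD = 700

class ScoreBoard:
    """Tracks hits registered by the light sensor."""

    def __init__(self):
        self.score = 0

    def process_reading(self, brightness):
        """Register a hit when the laser raises the sensor reading."""
        if brightness >= HIT_THRESHOLD:
            self.score += 1  # in the game this also triggers the animation
            return True
        return False

    def reset(self):
        """Mirrors the physical reset button on the Arduino container."""
        self.score = 0

board = ScoreBoard()
for reading in [120, 845, 300, 910]:
    board.process_reading(reading)
print(board.score)  # 2
```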



Step 2: Refinement

In Step 2, we refined the scene design and created storylines to add depth to the game. We also started considering the design of the Arduino container and cable organization to enhance the user experience. Lastly, we changed the scoreboard to a reward system to make the game more engaging and encourage players to continue playing.

Step 3: Final Adjustment

In Step 3, we laser-cut the monster as the target and added ambient light to create a more immersive gaming experience. We then programmed the robot arm to respond to the user's movements and enhanced the design of the Arduino container. These improvements made for a more seamless and enjoyable game and helped us reach a wider audience, including gamers and professional athletes looking to improve their reaction time and critical thinking skills.



Link to presentation slides:

Github Link:



Unity Assets:

Magic Effects Free:

Free Pixel Font – Thaleah:

Pixel Art Icon Pack – RPG:

Simple Heart Health System:

LowPoly Environment Pack:

Adventurer Blake:

Character Cactus:

Sound Effects:

Minecraft Villager Hurt Sound Effect:

“8 Bit World!” Fun Upbeat Chiptune Game Music by HeatleyBros:

3 2 1 0 Countdown With Sound Effect | No Copyright | Ready To Use:

WIN sound effect no copyright:


Comic speech bubble with vs text:

Big win surprise banner in comic style:

Dev 4 – Group 5





In Development 4, we are continuing our exploration of mixed reality by combining real objects with the virtual environment. Since we have successfully imported robot data into Unity and achieved live communication between the robot and the software, we are now aiming to incorporate a VR headset (Meta Quest 2) and use its built-in Passthrough feature to create the mixed reality experience.





First, we imported the latest Oculus Integration SDK (v50) into Unity, dragged the OVR camera from the official package into our scene, and added the OVR Passthrough layer to the OVRCameraRig to activate the passthrough feature.

Our original plan was to develop an aiming game in which the robot arm would act as a movable target. Players could view the moving target through the OVR camera while shooting at it with a controller within the game's mechanics. However, we encountered several issues that prevented us from pushing the idea forward. The biggest problem was that the passthrough feature in v50 does not support direct playback in Unity. We therefore had to build the scene and sideload the resulting .apk file onto the headset, which unfortunately meant that live communication with the robot arm was impossible.

After discussing with Nick and the group members, we decided to simplify the communication medium. Instead of going through the robot, screen monitor, and VR headset, we have chosen to let the robot arm directly communicate with the controllers, without the need for an additional screen as an intermediary.

After these two attempts, our concept for the game shifted. Instead of making a straightforward aiming game, we would like to target kids as our primary players and design a game that helps them practice their reaction capacity, which can be beneficial for developing prompt and responsive thinking.

Therefore, our next step is to explore different controllers, such as Xbox or PlayStation controllers and Arduino joysticks, that will connect to Unity while enabling live communication with the robot arm. Meanwhile, we are considering incorporating user interface design components to build a game scene that differs from a typical shooting game and can be tailored to children. The overall structure of the game is still up for debate.



Dev 2 – Group 3

Group 3: Wentian, Nicky, Shipra, Gavin





For Dev 2, we wanted to test two specific things: the camera's range of movement and the ability to program the robot using Mimic within Maya. We took into consideration the visuals and images we hoped to eventually achieve for Project 1 and made this a worthwhile experiment for Dev 2. We knew we wanted the robot arm to zoom and pan across an actor at a desk, so we used that setup to create keyframes in Mimic for Maya and then recorded the resulting animation. By doing so, we investigated Mimic's capacity to help with the preproduction stage of the project by letting us visualize the final output of shot sizes and angles. Furthermore, we were able to see how intuitive Mimic can be for programming the robot, as opposed to the manual input we had used in prior testing.


Process and Results

After settling on the idea, we started exploring and programming in Mimic. In the beginning, it took us a while to learn the fundamental terminology and techniques needed for a basic understanding of how to create our intended animation. We then built the essential setup in Mimic, including the robot arm, table, and background. After that, we embedded the camera from Maya into the robot arm and adjusted it to the proper position. Since our initial idea involved an actor in the scenario, we also downloaded a human model from Sketchfab and placed it in front of the camera; it was not functional but simply helped fill out the scene we created. Finally, we programmed the robot arm with movements inspired by our early ideation, and it turned out to work quite well.


Learnings & Outcomes

This process helped us understand the relationship between Maya, the robot arm, and the scene. It was a step toward familiarizing ourselves with the tools and controls, with the intent of testing shots and scenes in Maya as a preliminary trial for Project 1. The outcomes are two videos: one displaying the scene, the actor, and the robot arm holding the phone camera; the other, from the camera's point of view, displaying the shot. This helped us understand timing, movement, and angles, and the exercise added real value to our Project 1 prototyping process.


3D Model:

“SF_Girl” by Stan, February 10, 2023


Video Link: