Author Archives: Wentian Zhu

Project 2 – Group 5

My Robot Coach

Group 5: Wentian Zhu, Ellie Huang, Jiamin Liu, Yifan Xu


Bounty Hunter on Cloud: https://www.youtube.com/watch?v=jXso3idTFWA

In this project, we investigated the potential of robot arms to enhance sports and gaming training in a simulated environment. In the physical world, expertise in sports and games usually entails repetitive exercise and access to appropriate equipment. Unfortunately, such resources may not be accessible to everyone, which is where robot arms come in as a valuable aid. By generating virtual scenes and offering support for sports or game-based training, robot arms can be of tremendous assistance to a wider population.
Through the use of technologies such as Arduino, WebSocket, and Unity, we were able to demonstrate how robot arms can create an immersive and engaging training experience that promotes hand-eye coordination and quick thinking. However, challenges such as finding the optimal delay time in the communication between the software components need to be addressed to improve the overall gaming experience.

In addition, the feedback on the game’s representation of guns and shooting from a political perspective highlights the potential for exploring the concept further. By tying it into current societal, cultural, and political issues, we can create a thought-provoking experience that engages players in a dialogue about violence, human society, technology, and ethics.

 

Workflow


The Arduino converts the sensor readings into JSON format and sends them to the WebSocket server, which in turn relays the data to Unity. Unity then deserializes the received JSON into C# objects.
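As a rough sketch of the Unity side of this pipeline, the snippet below receives a message and deserializes it, assuming the websocket-sharp client library; the server address and the JSON field names are illustrative assumptions, not taken from our repository:

```csharp
using UnityEngine;
using WebSocketSharp; // assumed client library; any WebSocket client would work

// Illustrative shape of the JSON the Arduino sends; field names are assumptions.
[System.Serializable]
public class SensorData
{
    public int lightLevel;
    public bool resetPressed;
}

public class SocketReceiver : MonoBehaviour
{
    WebSocket ws;

    void Start()
    {
        ws = new WebSocket("ws://localhost:8080"); // server address is an assumption
        ws.OnMessage += (sender, e) =>
        {
            // Deserialize the incoming JSON string into a C# object.
            SensorData data = JsonUtility.FromJson<SensorData>(e.Data);
            Debug.Log($"light={data.lightLevel} reset={data.resetPressed}");
        };
        ws.Connect();
    }

    void OnDestroy() => ws?.Close();
}
```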

 

Arduino Diagram and Container Design


Environment Setup


User Flow


 

Design Process

Step 1: Connect both Arduino and Unity to WebSocket

In Step 1, we connected Arduino and Unity to the WebSocket server for real-time data transfer, used a light sensor and laser pointer to trigger animations, and added a score system, a reset button, and an LED light strip for feedback. This created an engaging game experience that responds promptly to user actions.
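The glue between the sensor data and the game logic can be as simple as the sketch below; the method names and the "Hit" animator trigger are hypothetical, not taken from our actual scripts:

```csharp
using UnityEngine;

// Sketch of the Step 1 game loop: a laser hit plays an animation and scores
// a point, and the physical reset button clears the score.
public class TargetController : MonoBehaviour
{
    public Animator animator; // drives the target's hit animation
    public int score;

    // Call this when the parsed sensor data reports the laser hit the light sensor.
    public void OnLaserHit()
    {
        animator.SetTrigger("Hit"); // assumed trigger name
        score++;
    }

    // Call this when the Arduino reports that the reset button was pressed.
    public void OnResetPressed() => score = 0;
}
```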


 

Step 2: Refinement

In Step 2, we refined the scene design and created storylines to add depth to the game. We also started considering the design of the Arduino container and the cable organization to enhance the user experience. Lastly, we changed the scoreboard to a reward system to make the game more engaging and encourage players to continue playing.

Step 3: Final Adjustment

In Step 3, we laser-cut the monster that serves as the target and added ambient light to create a more immersive gaming experience; we then programmed the robot arm to respond to the user's movements and enhanced the design of the Arduino container. These improvements made the game more seamless and enjoyable and helped us reach a wider audience, including gamers and professional athletes looking to improve their reaction time and critical thinking skills.
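For the arm response, one plausible wiring (a sketch under the architecture described above, not the exact code we shipped) is to push a JSON command back through the same WebSocket channel that delivers the sensor data:

```csharp
using UnityEngine;
using WebSocketSharp; // assumed client library

// Hypothetical command sender: Unity reacts to the player's movements by
// asking the relay server to reposition the arm-mounted target. The "MoveArm"
// command name and message shape are assumptions.
public class ArmCommander : MonoBehaviour
{
    public WebSocket ws; // shared connection, e.g. from the receiver script

    [System.Serializable]
    class ArmCommand { public string cmd; public float angle; }

    public void NudgeTarget(float angleDegrees)
    {
        ws.Send(JsonUtility.ToJson(new ArmCommand { cmd = "MoveArm", angle = angleDegrees }));
    }
}
```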


 

Link to presentation slides: https://docs.google.com/presentation/d/1UARgt_aAtIAvsono_Bd3K7zpYPW3exxm33bZj-n9tMY/edit?usp=sharing

Github Link: https://github.com/NarrowSpace/ArduinoUnityWebsocket

 

Credits:

Unity Assets:

Magic Effects Free: https://assetstore.unity.com/packages/vfx/particles/spells/magic-effects-free-247933

Free Pixel Font – Thaleah: https://assetstore.unity.com/packages/2d/fonts/free-pixel-font-thaleah-140059

Pixel Art Icon Pack – RPG: https://assetstore.unity.com/packages/2d/gui/icons/pixel-art-icon-pack-rpg-158343

Simple Heart Health System: https://assetstore.unity.com/packages/tools/gui/simple-heart-health-system-120676

LowPoly Environment Pack: https://assetstore.unity.com/packages/3d/environments/landscapes/lowpoly-environment-pack-99479

Adventurer Blake: https://assetstore.unity.com/packages/3d/characters/humanoids/adventurer-blake-158728

Character Cactus: https://assetstore.unity.com/packages/3d/characters/creatures/character-cactus-32933

Sound Effects:

Minecraft Villager Hurt Sound Effect: https://www.youtube.com/watch?v=1wJsOoUYKyY

“8 Bit World!” Fun Upbeat Chiptune Game Music by HeatleyBros: https://www.youtube.com/watch?v=VijZQa6hT9U&list=LL&index=6

3 2 1 0 Countdown With Sound Effect | No Copyright | Ready To Use: https://www.youtube.com/watch?v=DiUGv1vsuSU

WIN sound effect no copyright: https://www.youtube.com/watch?v=rr5CMS2GtCY

Image:

Comic speech bubble with vs text: https://www.freepik.com/free-vector/comic-speech-bubble-with-vs-text_14201655.htm#query=vs&position=21&from_view=search&track=sph

Big win surprise banner in comic style:

https://www.freepik.com/free-vector/big-win-surprise-banner-comic-style_4192021.htm#page=3&query=win%20title&position=4&from_view=search&track=ais

Dev 4 – Group 5



 

Introduction:

In Development 4, we are continuing our exploration of mixed reality by combining real objects with the virtual environment. Since we have successfully imported robot data into Unity and achieved live communication between the robot and the software, we are now aiming to incorporate a VR headset (Meta Quest 2) and use its built-in Passthrough feature to create a mixed reality experience.

 

Process:


 

First, we imported the latest Oculus Integration SDK (v50) into Unity, dragged the OVRCameraRig prefab from the official package into our scene, and added an OVRPassthroughLayer component to the OVRCameraRig to activate the passthrough feature.
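For reference, configuring the passthrough layer from a script looks roughly like this; a minimal sketch assuming the v50 component API, with the OVRCameraRig set up as described above:

```csharp
using UnityEngine;

// Minimal sketch: configure Quest passthrough at runtime (Oculus Integration v50).
// Assumes this script sits on the OVRCameraRig, an OVRPassthroughLayer is
// attached, and "Insight Passthrough" is enabled on the OVRManager component.
public class PassthroughSetup : MonoBehaviour
{
    void Start()
    {
        var layer = GetComponent<OVRPassthroughLayer>();
        layer.overlayType = OVROverlay.OverlayType.Underlay; // draw the camera feed behind virtual content
        layer.textureOpacity = 1f;                           // fully opaque feed
    }
}
```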

Our original plan was to develop an aiming game in which the robot arm would act as a movable target. Players could view the moving target through the OVR camera while shooting it with a controller within the game's mechanics. However, we encountered several issues that kept us from pushing it forward. The biggest problem was that the passthrough feature in v50 does not support direct playback in the Unity editor. Therefore, we needed to build the scene and load the resulting .apk file onto the headset, which unfortunately meant that live communication with the robot arm was impossible.

After discussing with Nick and the group members, we decided to simplify the communication medium. Instead of going through the robot, screen monitor, and VR headset, we have chosen to let the robot arm directly communicate with the controllers, without the need for an additional screen as an intermediary.

After these two attempts, our view of the game changed. Instead of making a straightforward aiming game, we would like to target kids as our primary players and design a game that primarily helps them practice their reaction capacity, which can be beneficial for developing prompt and responsive thinking.

Therefore, our next step is to explore different controllers, such as Xbox or PlayStation controllers and Arduino joysticks, that can connect to Unity while enabling live communication with the robot arm. Meanwhile, we are considering incorporating user interface design components to build a game scene that differs from a typical shooting game and can be tailored to children. The overall structure of the game is still up for debate.
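As a starting point for that exploration, a gamepad could be bridged to the arm with Unity's legacy input axes, which Xbox and PlayStation controllers map to by default; the JSON shape below is an assumption:

```csharp
using UnityEngine;
using WebSocketSharp; // assumed client library

// Sketch: read the left stick and forward it to the robot arm over the same
// WebSocket relay used elsewhere in the project.
public class ControllerBridge : MonoBehaviour
{
    public WebSocket ws;

    [System.Serializable]
    class Stick { public float x; public float y; }

    void Update()
    {
        float x = Input.GetAxis("Horizontal");
        float y = Input.GetAxis("Vertical");
        if (Mathf.Abs(x) > 0.1f || Mathf.Abs(y) > 0.1f) // simple dead zone
            ws.Send(JsonUtility.ToJson(new Stick { x = x, y = y }));
    }
}
```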

 

 

Dev 2 – Group 3

Group 3: Wentian, Nicky, Shipra, Gavin

 


 

Idea

For Dev 2, we wanted to test two specific things: the camera's range of movement and the ability to program the robot using Mimic within Maya. We took into consideration the visuals and images that we hoped to eventually achieve for Project 1 and made this a worthwhile experiment for Dev 2. We knew that we wanted the robot arm to zoom and pan across an actor at a desk, so we used that setup to create keyframes in Mimic for Maya and then recorded the resulting animation. By doing so, we investigated Mimic's capacity to help with the preproduction stage of the project by letting us visualize the final shot sizes and angles. Furthermore, we were able to see how intuitive Mimic can be for programming the robot, as opposed to the manual input we used for prior testing.

 

Process and Results

After settling on the idea, we started exploring and programming in Mimic. In the beginning, it took us a while to learn the fundamental terminology and techniques so that we could gain a basic understanding of how to create our intended animation. Then we built the essential elements of the setup in Mimic, such as the robot arm, table, and background. After that, we embedded the camera from Maya into the robot arm and adjusted it to the proper position. Since our initial idea involved an actor in the scenario, we also downloaded a human model from Sketchfab and placed it in front of the camera; it is not functional but simply fits into the scene we created. Finally, we programmed the robot arm with movements inspired by our early ideation, and it turned out to work quite well.

 

Learnings & Outcomes

This process helped us understand the relationship between Maya, the robot arm, and the scene. It was a step toward familiarizing ourselves with the tools and controls. The intent was to test the shots and scenes using Maya as a preliminary trial for Project 1. The outcomes are two videos: one displaying the scene, the actor, and the robot arm holding the phone camera; the other, from the camera's point of view, displaying the shot. This helped us understand timing, movement, and angles, and the exercise added value to our Project 1 prototyping process.

 

3D Model:

“SF_Girl” by Stan, February 10, 2023

https://sketchfab.com/3d-models/sf-girl-a219070284e64ddd8a51f69c207cc81

 

Video Link:

https://drive.google.com/file/d/1mQIW6Zh_D0KTh51CO6dtcA8hf1Mf6JiR/view?usp=sharing

https://drive.google.com/file/d/1saL-Pwhs0dbql0qG0pFilpfRaSnBBCj3/view?usp=sharing