Dev3-Group5

Team members:

Yifan Xu, Wentian Zhu, Ellie Huang, Jiamin Liu

Description of what you are investigating

In Project 2, we are investigating the potential of robot arms to enhance sports and gaming training. In the physical world, becoming skilled at a sport or game usually requires repetitive practice and access to appropriate equipment, and those resources are not available to everyone. By pairing robot arms with virtual reality scenes that support sport- or game-based training, this kind of assistance can reach a much wider population.

[Screenshot: Aim Lab Mobile]

Robots are commonly employed in technology-driven businesses and academic research, but they have yet to be fully integrated into people’s daily lives. In the context of shooting training, traditional aim trainers are limited by fixed tracks laid out on the ground or ceiling, which make it difficult to change or replace targets easily. In contrast, the use of a robot arm in shooting games offers unparalleled flexibility. By acting as a moving target, the robot arm’s speed and position can be quickly and easily adjusted without the need for rebuilding tracks. This makes it an ideal tool for aim training during sports and gaming activities, offering a more immersive and interactive experience for users.

To achieve our goal, we plan to use Unity, a popular game development platform, to create a target and establish a motion pattern that reacts to the robot arm’s movements. This effectively turns the robot arm into a moving target, providing a challenging and engaging game for aim training. The setup could be extended to support a variety of sports and games, enabling us to explore how robot arms can enhance training for different activities.

Our ultimate aim is to demonstrate the effectiveness of using robot arms in sports and gaming training. We believe that the flexibility and adaptability of robot arms make them a valuable tool for people who do not have access to traditional training resources. With the help of robot arms, individuals can train in a more immersive and interactive environment, leading to better results and a more enjoyable training experience.


Documentation of process and results

  1. Link Unity to the robot arm

[Screenshot: linking Unity to the robot arm]
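
The exact scripts we used are in the repository linked below, but as a rough sketch of the idea, a receiver along the following lines could listen for the robot’s position inside Unity. This assumes the robot streams its position as comma-separated "x,y,z" text over UDP; the class name RobotDataReceiver and the port number are placeholders rather than part of our actual setup.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

// Hypothetical receiver: listens for "x,y,z" strings sent by the robot over UDP
// and stores the most recent position for other scripts to read.
public class RobotDataReceiver : MonoBehaviour
{
    public int port = 5005;                           // assumed port
    public Vector3 LatestPosition { get; private set; }

    private UdpClient client;
    private Thread receiveThread;

    void Start()
    {
        client = new UdpClient(port);
        receiveThread = new Thread(ReceiveLoop) { IsBackground = true };
        receiveThread.Start();
    }

    void ReceiveLoop()
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            byte[] data = client.Receive(ref remote);              // blocking receive
            string[] parts = Encoding.UTF8.GetString(data).Split(',');
            if (parts.Length >= 3 &&
                float.TryParse(parts[0], out float x) &&
                float.TryParse(parts[1], out float y) &&
                float.TryParse(parts[2], out float z))
            {
                LatestPosition = new Vector3(x, y, z);             // raw robot coordinates
            }
        }
    }

    void OnDestroy()
    {
        receiveThread?.Abort();
        client?.Close();
    }
}
```

In practice the transport and message format depend on the robot controller, so this sketch is only meant to show where the robot data enters the scene.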

  2. Create the Unity code

[Screenshots: the Unity scripts]

In Dev 3, we successfully mapped the parameters received from the robot data input scripts onto the plane’s XYZ coordinates, which established the connection between the robot arm and the virtual environment. In addition, we divided the XYZ values from the robot data by 100 so that the plane stays trackable within the Unity window.
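
A simplified sketch of that mapping is shown below. It assumes a receiver component like the hypothetical one sketched in step 1 exposes the latest robot position; the 0.01 scale factor corresponds to the divide-by-100 step described above.

```csharp
using UnityEngine;

// A simplified version of the plane-tracking idea: copy the robot's XYZ
// values onto the plane's position, divided by 100 so the motion stays
// inside the Unity camera view.
public class PlaneTracker : MonoBehaviour
{
    public RobotDataReceiver receiver;  // the (hypothetical) receiver sketched in step 1
    public float scale = 0.01f;         // i.e. divide the robot coordinates by 100

    void Update()
    {
        if (receiver == null) return;
        transform.position = receiver.LatestPosition * scale;
    }
}
```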

  3. Test the code

[Screen recording of the test]

Video:

https://youtu.be/fednh6YGnmg

Code we used

https://github.com/NarrowSpace/HumanRobotCollab_Dev3_WIP


How did it go?

In our recent tests, we were able to achieve horizontal movement. However, space constraints, together with the fact that we had not accounted for the length of the robot arm, prevented us from writing code that would allow flipping motions, which would have increased the difficulty level of the shooting game.

Moreover, we found that there is still much to learn about vertical motion and speed control. We are confident, however, that we have enough time to enhance the code and explore these aspects further. Despite the challenges, we are encouraged by the progress we have made so far and remain committed to advancing our understanding of robot arms and human-robot interaction.


Description of what we learned

During Dev 3, we gained valuable insights that will help us in our future work with robot arms. One of the key takeaways was the importance of understanding axes and coordinate systems. This knowledge is critical for the precision of the robot arm’s movements and must be applied when writing code. By gaining a deeper understanding of the axes, we were able to improve the accuracy of our robot arm movements and ensure that they matched the intended motion patterns.
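
For example, robot controllers typically describe positions in a right-handed, Z-up frame, while Unity uses a left-handed, Y-up frame, so the values cannot simply be copied across axis for axis. The snippet below shows one possible remapping purely as an illustration; the exact swap depends on the specific robot and on how the scene is oriented.

```csharp
using UnityEngine;

// Illustrative only: one common remapping from a right-handed, Z-up robot
// frame into Unity's left-handed, Y-up frame.
public static class RobotFrame
{
    public static Vector3 ToUnity(float robotX, float robotY, float robotZ)
    {
        // Robot X -> Unity X, robot Z (up) -> Unity Y (up), robot Y -> Unity Z.
        return new Vector3(robotX, robotZ, robotY);
    }
}
```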

In addition to learning about axes, we also improved our programming and debugging skills. Through trial and error, we were able to identify and fix bugs in the code, which helped us to develop more robust and efficient software. This experience will be valuable for future projects, as we will be better equipped to identify and fix errors in our code.

Moreover, this exploration allowed us to gain a fundamental understanding of the game development process. This involved designing the game mechanics and exploring human-robot interaction. Through this process, we gained insight into the complexities of game development and the importance of iterative testing and debugging.

