Dev 4 – Group 5

In Development 4, we are continuing our exploration of creating mixed reality by combining real objects with the virtual environment. Since we have successfully imported robot data into Unity and achieved live communication between the robot and the software, we are now aiming to incorporate the VR headset (Meta Quest 2) and use its built-in Passthrough feature to create the mixed reality experience.


First, we imported the latest Oculus Integration SDK (v50) into Unity, dragged the OVRCameraRig prefab from the SDK package into our scene, and added an OVR Passthrough Layer component to the OVRCameraRig to enable the passthrough feature.

Our original plan was to develop an aiming game in which the robot arm would act as a movable target. Players could see the moving target through the passthrough view while shooting at it with a controller as part of the game's mechanics. However, we encountered several issues that prevented us from pushing the idea forward. The biggest problem was that the passthrough feature in v50 does not support live preview in the Unity editor. We therefore had to build the scene and deploy the resulting .apk file to the headset, which unfortunately meant that live communication with the robot arm was impossible.

After discussing with Nick and the group members, we decided to simplify the communication chain. Instead of routing data through the robot, a screen monitor, and the VR headset, we chose to let the robot arm communicate directly with the controllers, removing the screen as an intermediary.

After these two attempts, revisiting the concept took the game in a different direction. Instead of a straightforward aiming game, we would like to target children as our primary players and design a game that helps them practice their reaction speed, which can be beneficial for developing prompt and responsive thinking.

Therefore, our next step is to explore different controllers, such as Xbox or PlayStation controllers and Arduino joysticks, that can connect to Unity while enabling live communication with the robot arm. Meanwhile, we are considering incorporating user interface design components to build a game scene that differs from a typical shooting game and can be tailored to children. The overall structure of the game is still up for debate.
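As a rough illustration of the direct controller-to-robot idea, the sketch below maps raw Arduino joystick readings to robot-arm joint angles. This is a minimal Python sketch under assumptions: the serial framing (one `"x,y"` line per sample from `analogRead`, values 0–1023) and the ±90° joint range are hypothetical, not our actual protocol.

```python
# Hypothetical sketch: translate raw Arduino joystick readings into
# robot-arm joint angles. The "x,y" line format and the 0-1023 ADC
# range are assumptions for illustration only.

def axis_to_angle(raw, lo=-90.0, hi=90.0):
    """Linearly map a 10-bit ADC reading (0-1023) to an angle in [lo, hi]."""
    raw = max(0, min(1023, raw))  # clamp out-of-range readings
    return lo + (hi - lo) * raw / 1023.0

def parse_joystick_line(line):
    """Parse one serial line like '512,300' into two joint angles."""
    x_raw, y_raw = (int(v) for v in line.strip().split(","))
    return axis_to_angle(x_raw), axis_to_angle(y_raw)

# Reading from the board would look roughly like this (needs pyserial):
#   import serial
#   with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
#       base, shoulder = parse_joystick_line(port.readline().decode())
```

Keeping the mapping in a pure function like this would also let us test the control logic without the joystick or robot connected.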


