Blog Post 6: Getting to Completion

screenshot-349

At this point, almost all assets have been successfully implemented. In the screenshot above, there are seven artifacts fully set up, with three left to do. The artifacts are hovering above the pail for testing purposes because, without access to the headset, I can't grab artifacts off the ground and manually place them into the pail. All the artifacts above can successfully combine to spawn a collection object.

screenshot-348

screenshot-347

The above screenshots show how the new inventory system works. As objects are added to the bucket, only the selected inventory object is shown. I wrote a script to keep track of the last selected inventory object so the player can cycle forward and backward through the inventory. The new system has solved the issue of artifacts falling out of the bucket, but we still have a problem with artifacts staying parented and kinematic after they leave the pail.
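
The cycling logic itself is small. The actual script is a Unity C# component; the Python sketch below (class and method names are my own, hypothetical) just illustrates the wrap-around index bookkeeping, assuming the newest artifact added becomes the selected one:

```python
class PailInventory:
    # Sketch of the pail: only one artifact is visible at a time, and the
    # player cycles forward/backward through everything in the bucket.
    def __init__(self):
        self.items = []     # artifacts currently in the pail
        self.selected = -1  # index of the visible artifact (-1 = pail empty)

    def add(self, artifact):
        self.items.append(artifact)
        self.selected = len(self.items) - 1  # newest artifact becomes visible

    def cycle(self, step):
        # step = +1 cycles forward, -1 backward; modulo wraps at both ends
        if self.items:
            self.selected = (self.selected + step) % len(self.items)

    def visible(self):
        return self.items[self.selected] if self.items else None
```

Because the modulo wraps in both directions, cycling backward from the first item lands on the last one, which matches how the in-game cycling feels.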

During our most recent VR tests we also noticed some physics issues with the bucket, which caused it to distort when grabbed by the Ray Interactor. The body of the bucket would also somehow be left behind in the 3D Beach scene whenever the Beach Recreation scene was entered.

screenshot-350

Speaking of the Beach Recreation scene, the old version has been replaced with a duplicate of the 3D Beach scene. The screenshot is of a version from earlier in the day; since then the set has changed a lot, and I have added the spawners for all of the Beach Rec collection objects.

screenshot-345

This screenshot shows the Sunken Ship and Strangled Fish collection objects. Like the previous screenshot, it was taken before the scene was complete, and I have since added the remaining spawners and collection artifacts. Still, it shows one of the custom animations I made in Blender for the sea/ocean life. I animated the Strangled Fish, Strangled Turtle, and Turtle Hatching collection objects.

Finally, we still need to add audio, start and quit buttons, the controller scheme, a popup for completed collections, and Nik’s 360 skyboxes, and we need to fix the pail physics. The newest VR tests also broke the portals’ render textures, so that issue needs to be fixed as well.

Blog Post 5: Personal VR Game Progress

The majority of my progress so far has been on prototyping the setup of each scene in Echoes of the Tides and implementing the mechanics and player controls. I was unable to come to class yesterday because I was unwell, but I kept up to date with what my group was doing and worked a bit on the Unity project at home. They added their newer assets to the scenes, which was great to see, but I was unable to attach interactivity to the assets that needed it. We envision certain objects, like the radio/speakers and the bonfire, being interactable; for example, the radio would play music and the bonfire could be lit.

 

Fortunately, I was finally able to sort out the buggy artifact proximity mechanic at home. Previously, artifact objects could be right next to each other and the proximity would not be detected, or it would be overridden by a false proximity flag from another, unrelated artifact. Sometimes it seemed to work, but more often than not it did not. I was able to get the Bonfire to trigger regularly, but I could not untrigger it even after moving the artifacts out of proximity of each other. Printing to the console whenever objects were within a certain distance showed that the if statements were firing, but for some reason the code inside them that should trigger the creation of a collection did nothing. After several weeks of trying to fix my code, I finally decided to rewrite it and approach proximity through collisions rather than distances. I wanted to have it set up so my group could present the proximity triggers working, but I was unable to finish it on time. Below are screenshots of the proximity-based collections working and triggering different combo objects.
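
The collision-based rewrite roughly works like this. The real version is a Unity C# script driven by OnTriggerEnter/OnTriggerExit callbacks on each artifact's trigger collider; this Python sketch (all names hypothetical) shows the bookkeeping, assuming each artifact tracks the neighbours currently overlapping its trigger volume:

```python
BONFIRE = {"Lighter", "Beer Can", "Wood"}  # example recipe from the screenshots

class Artifact:
    def __init__(self, name):
        self.name = name
        self.neighbours = set()  # artifacts currently overlapping this trigger

    def on_trigger_enter(self, other):
        # Unity would call this via OnTriggerEnter on the trigger collider
        self.neighbours.add(other.name)

    def on_trigger_exit(self, other):
        # ...and this via OnTriggerExit, so leaving range untriggers cleanly
        self.neighbours.discard(other.name)

def collection_ready(artifact, recipe):
    # The collection fires only while the artifact and its overlapping
    # neighbours together cover every item in the recipe.
    return recipe <= (artifact.neighbours | {artifact.name})
```

The advantage over the distance-check version is that the exit callback gives an explicit "untrigger" event, which is exactly what the old code was missing.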

 

screenshot-241

The Beach Recreation scene with the assets that Annie and Nik made and set up. Based on the critique we got about this scene, we have decided to focus on the Underwater scene and to implement the bonfire collection in the 3D Beach game world, replacing the purpose of the Beach Recreation scene.

screenshot-245_li

The three objects that can make the bonfire collection are circled: Lighter, Beer Can, and Wood. In the screenshot they are too far apart, so nothing is being triggered. In the future it would make more sense for the bonfire to require just the Wood and the Lighter, but I added the Beer Can to test three-way combos.

screenshot-246_li

The bonfire artifacts are in close enough proximity, so the Bonfire object has been spawned into the world. Moving any of the artifacts back out of range removes it.

screenshot-253_li

The brackets show there is distance between the artifacts that can make up the collections. The Fish and Can will trigger the “Strangled Fish” combo object and the Fish and Wood will trigger the “Sunken Ship” combo object. The pink cubes are placeholders to show where the combo objects will spawn.

screenshot-254_li

The “Strangled Fish” combo object is spawned. As you can see in the inspector, I keep track of the completed collections in a list. Each spawner’s spawn script has access to the list and will spawn its combo object when it detects its collection. In the future, I may create a spawn manager to keep track of all the spawns and triggers in each world.
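
In outline, each spawner just polls that shared list. Again, the project code is a Unity C# script; this Python sketch (hypothetical names, with a string standing in for the instantiated prefab) shows the per-spawner check:

```python
completed_collections = []  # shared list of finished collections, as in the inspector

class Spawner:
    def __init__(self, collection_name, combo_prefab):
        self.collection_name = collection_name  # collection this spawner watches for
        self.combo_prefab = combo_prefab
        self.spawned = None

    def update(self):
        # Called every frame (like Unity's Update); spawns the combo object
        # once, as soon as its collection shows up in the shared list.
        if self.spawned is None and self.collection_name in completed_collections:
            self.spawned = self.combo_prefab  # stand-in for Instantiate(prefab)
        return self.spawned
```

A spawn manager would replace this per-spawner polling with one object that owns the list and notifies spawners directly.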

screenshot-251_li

The “Sunken Ship” combo object is spawned. For testing, the console window will update whenever two artifacts are in proximity and will stop when they aren’t.

screenshot-250

Finally, the portal shader that shows the other worlds is an ongoing issue that still is not fully resolved. In the screenshot it looks fine, because with only one camera and one eye to render to it is very smooth, but in VR it has issues rendering a version of the shader that is properly oriented for each eye. I will continue looking into how to solve the issue, but I may have to scrap the shader and create a portal VFX instead. The main reason I included it was to make the worlds feel connected, almost as if you were not being teleported to the destination but were connected/parallel to it through a wormhole.

Blog Post 3 – Tutorial Report (Jan 28, 2020)

My biggest takeaway from Hector’s class on Jan 28th was the tutorial on setting up Oculus controllers in Unity. Previously, all my group’s VR tests used the XR Interaction Toolkit’s default teleportation mechanics to move around the scene. Teleporting was fun, but we wanted to recreate the feeling of walking on the beach and picking up objects that were stumbled upon, rather than automatically teleporting the player to wherever they wanted to go. Walking also made the beach world seem more vast, empty, and open.

The Oculus controllers were a lot more difficult to set up than I thought they would be because my group decided to transition from Oculus Integration, which had far more documentation, to Unity’s new XR Integration. Fortunately, Hector had done a lot of tests himself and shared with the class a fairly hard-to-find documentation page on the correct API. I started off adapting Hector’s code for accessing each individual controller to our group’s needs and implemented the new version with my controls in my PlayerScript. Unfortunately, at first it did not seem to work, so I briefly tried setting up the controllers through Unity’s Input Manager. Interestingly enough, it only recognized the left joystick. Accessing one joystick was okay for very linear movement using the global axes/coordinates of the scene, but it was awkward when the player turned around or was reoriented, which swapped the directional movement.

In the end, I realized that the reason the controllers were not recognized through XR Integration was that the headset needed to sense someone wearing it before the scene could be played and the controllers could be found. At the moment, I am testing two methods of movement controls. In the first, the player has access to both joysticks, controlling the rotation of their virtual body with one joystick and the direction they want to move (relative to the angle they have rotated) with the other. The other method uses just one joystick to move in a direction based on where the HMD is facing through head tracking. I found that rotating the player’s body was the smoothest and easiest movement, but it was sometimes a bit much for my eyes when I looked around while moving. Moving toward the direction the HMD was facing felt fine, but it involved physically facing the direction you wanted to go. Overall, my group still needs to figure out which method feels the most comfortable.
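
The HMD-relative method boils down to rotating the joystick vector by the headset's yaw. The real code lives in my Unity PlayerScript; this Python sketch (the function name and degree-based yaw parameter are my own) shows the math, assuming Unity's left-handed, y-up convention where yaw 0 faces +z:

```python
import math

def hmd_relative_move(joy_x, joy_y, hmd_yaw_deg):
    # joy_x = strafe, joy_y = forward on the stick; hmd_yaw_deg comes from
    # head tracking. Returns a world-space (x, z) direction so that pushing
    # the stick forward always moves toward where the headset is facing.
    yaw = math.radians(hmd_yaw_deg)
    world_x = joy_x * math.cos(yaw) + joy_y * math.sin(yaw)
    world_z = -joy_x * math.sin(yaw) + joy_y * math.cos(yaw)
    return world_x, world_z
```

The body-rotation method uses the same rotation, just with the virtual body's yaw (driven by the other joystick) in place of the HMD's yaw.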