Blog Post 6: Class Reflection

Mika Hirata



In the process of creating Project B with my group, I learned how to use Unreal Engine 4. Since I did not have any experience with Unreal Engine 4, I had to follow many tutorials and pick up the software in a short period of time. Our group faced many bottlenecks, from coding bugs to lighting issues. However, the final outcome was very successful, and all of the group members were very proud of our VR project. Our VR game is called “Genera”, a game about completing planting tasks in a zero-gravity environment. The protagonist is the viewer, a lonely sentient robot scientist on an alien planet. To create this game, we divided up the tasks and assigned roles to each member.

My main role in this project was to create the materials and shaders for the environment, so that the objects matched the science-fiction theme. I created most of the textures, materials, and shaders, which made the game more immersive and aesthetically pleasing. The water material used for the pond and the fountain in the building was the most complicated material in the game. In the process of creating it, I learned how to use Blueprints to combine animations and textures. Personally, I found Blueprint scripting in UE4 more intuitive than coding in Unity, since connecting the nodes on each element is very simple and does not require any typing. Moreover, I learned that building and baking the lighting are very important in UE4. Until we built the lighting, we faced many issues just playing the game.


Furthermore, I learned how to work with a team. Since my group was well balanced in terms of skills, we were able to peacefully divide up our roles and focus on one asset at a time. However, gathering and putting every asset together took much more time than we expected, and we were not able to create everything we had planned initially. Even though we are very satisfied with the final outcome, there is a lot of potential to make this game better. For next time, I learned to leave more time to put all the assets together and test everything.

Blog Post 4

Mika Hirata


The VR experience called “Testimony” gave me a lot of inspiration for creating a 360 video for Project A. Testimony is a storytelling VR experience, and its UX design is very unique compared to other VR experiences. Multiple speakers surround the viewer. When the viewer looks at one speaker, that speaker starts talking about their experiences. Since the background is totally black, I was able to focus on each person’s story. Moreover, videos that relate to the speakers’ stories show up in the background, which I thought was a very smart way of using the void space. Testimony is an experience completely isolated from the world outside the VR headset. Before trying the experience, I had no clue what was happening inside the headset. However, after experiencing Testimony, I realized that it conveys its idea very well in a unique storytelling style.

For our project, “Still Point”, we wanted to create a totally isolated environment so that we could convey the idea of the relationship between scents and memories. We decided to use a room to isolate the viewers from their surroundings. The idea of isolation came from Testimony, since we all agreed that the idea of a VR experience should not be affected by anything outside the VR headset. Moreover, our project “Still Point” is meant to be a subtle experience. We did not want the experience to be rushed or noisy; instead, we wanted our project to make each viewer take time and think about what they feel and remember from the experience. Furthermore, the idea of a blackout at the beginning, at the end, and between scenes came from Testimony. We thought the blackout gives a nice little break between the scenes and leads the viewer to focus more. The blackout also let us change the scent between scenes.

Blog Post 3 – Unity 360 tutorials

Mika Hirata 3154546

I was very impressed by how easy it is to create a 360 VR experience with Unity. Hector demonstrated how to insert 360 videos into Unity and connect the project to a VR headset. By assigning the Skybox/Panoramic shader to a material, it was extremely easy to bring a 360 video into Unity. I was surprised that turning a Unity project into a VR project did not take long. The process took us only about 10 minutes in total, and I found the steps very simple and clear. Moreover, Hector showed us how to create 3D objects in the scene, and how we can play videos on the surface of a sphere, cube, or any kind of 2D/3D shape. I found this super useful for when I want to create a fancy button or an interactive interface in a VR experience.
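The steps from the demo can be sketched in a few lines of Unity C#. This is a minimal sketch, not the exact setup from class: it assumes the scene already has a VideoPlayer component, a RenderTexture asset, and a material using the built-in Skybox/Panoramic shader, all assigned in the Inspector (the field names here are placeholders I chose for illustration).

```csharp
using UnityEngine;
using UnityEngine.Video;

// Plays a 360 video on the scene's skybox so it surrounds the viewer.
public class Play360Video : MonoBehaviour
{
    public VideoPlayer videoPlayer;      // drag the scene's VideoPlayer here
    public RenderTexture videoTexture;   // render target for the video frames
    public Material skybox360Material;   // material using the Skybox/Panoramic shader

    void Start()
    {
        // Route the video frames into the render texture...
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = videoTexture;

        // ...and display that texture on the skybox.
        skybox360Material.mainTexture = videoTexture;
        RenderSettings.skybox = skybox360Material;

        videoPlayer.Play();
    }
}
```

In the editor this amounts to the same drag-and-drop steps from the tutorial; the script just makes the wiring explicit.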

Audio is also essential in VR, since a VR experience controls both the viewer’s sight and hearing. Using Unity’s audio tools, we are able to control spatial sound. From the Hierarchy panel, we can select a game object that has an attached audio source, and then control the volume and pitch in the Inspector panel under the Audio Source component. It is important to set the spatial blend to 3D, which is a numeric value of one. We can adjust how the sound’s volume falls off depending on the room size or the environment. I found this feature fascinating because realistic audio can make the whole VR experience much more realistic.
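The same settings can be applied from a script instead of the Inspector. This is a sketch under the assumption that an AudioSource is already attached to the same game object; the distance values are illustrative, not values from class.

```csharp
using UnityEngine;

// Configures an AudioSource for spatial (3D) sound.
public class SpatialAudioSetup : MonoBehaviour
{
    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();

        // spatialBlend = 1 means fully 3D: volume and panning depend on
        // where the sound sits relative to the listener (the VR camera).
        source.spatialBlend = 1.0f;

        // Tune how volume falls off with distance to fit the room size.
        source.rolloffMode = AudioRolloffMode.Logarithmic;
        source.minDistance = 1.0f;   // full volume within 1 unit
        source.maxDistance = 15.0f;  // fades out toward 15 units away

        source.volume = 0.8f;
        source.pitch = 1.0f;
        source.Play();
    }
}
```

Each line mirrors one of the Inspector fields mentioned above, which makes the settings easy to reuse across scenes.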


Blog Post 2 “Intro to VR Concepts & Production”

Mika Hirata 3154546


Both Unity and Unreal Engine are popular game engines available to the public for free. In this blog, I would like to compare those two engines and comment on what I liked and disliked.


I have developed 2D games and VR projects only in Unity. Unity is a great tool for building VR games and projects, since it allows users to edit and move around 3D objects easily. One of my favorite features of Unity is its cross-platform support, which allows users to develop a game and switch the build target between platforms such as iOS, Android, and Windows. I personally think this feature makes game developers’ lives a lot easier. Moreover, Unity is capable of handling many file formats, including those from 3ds Max, Blender, Maya, and Rhino. Since I often develop 3D models in Rhino and Maya, Unity makes it very easy to bring my files together.

The Asset Store is also a great feature of Unity. It provides a variety of assets, such as props, materials, textures, and characters, many of them free to the public. The Asset Store even has sound effects and particles to download. I think Unity has one of the biggest asset stores in the game-engine field, offering every kind of asset, from props to intuitive animation and motion-capture tools.

Unlike Unreal Engine, Unity uses C# and JavaScript for coding. Personally, I only have experience with JavaScript. Once I got the hang of Unity, I found it has a quick and simple interface for building projects.


Unreal Game Engine:

I have never used Unreal Engine, but it was very interesting to learn the basic interface, since the platform is very different from Unity. When I first looked at the Blueprint interface, I was a little overwhelmed by all the nodes surrounding the main window in the middle. Unreal Engine was developed by the American video game and software company Epic Games. So far the latest update of the engine is 4.20.3. We can download archived versions from the Library tab of the Epic Games Launcher.

Unreal Engine requires coding in the C++ programming language. Personally, I find C++ much harder to learn than JavaScript, so it will be a challenge for me to get used to Unreal Engine. However, Unreal Engine also lets users create a game without coding, though the complexity of such a game will be limited.

According to the lecture we were given on Tuesday, Unreal Engine has Blueprint visual scripting as an alternative to writing code. Blueprint is a node-based interface, and its UI is very clean and simple. I think Blueprint visual scripting would be very efficient for creating 3D prototypes. It is even possible to create an entire game using Blueprint scripting alone.

I was very impressed by how Hector connected his VR headset and controllers to Unreal Engine and tested moving the 3D objects. I am very excited to test things out and play around with the nodes in Unreal Engine.
