HOKUSAI

Mika Hirata 3154546, Vivian Wong 3158686, Anran Zhou 3157820

Concept

HOKUSAI is an interactive art installation of culturally influenced artwork illustrating the evolution of the famous Japanese artist Katsushika Hokusai.

Projected on a paper-like hanging screen, Hokusai’s artwork is brought to life in an inviting, open, interactive space. The artwork is layered in a parallax formation to convey depth, and its animations are activated by motion detected by a Kinect. The installation features reworked and refined digital paintings, special sound effects, the artist’s quotes, and self-composed, culturally influenced music.

Beginning with a looping animated scene of the artist’s name, the audience is invited to approach the piece. Once a user is detected by the Kinect, the screen reveals a dynamic representation of Hokusai’s painting, The Great Wave off Kanagawa. The viewer can interact with the painting: the waves and particles follow the viewer’s horizontal location, and the closer the viewer gets to the painting, the louder the accompanying music and waves become. The audience is directly invited to experience the culture of the artist. As an interactive installation, the viewer is no longer only the audience; rather, the viewer becomes a part of the piece.

Inspiration

For our final project, we decided to continue our previous VR project in a completely different medium. Due to our lack of experience with Unity and VR, we struggled greatly with developing the previous project. As a result, we decided to move this project into Processing, a program our team has much more experience with. We also wanted to represent our concept in a more digital way.

Process

We originally had six different scenes in the VR version, but decided to work with only one scene for the final project. We first determined our interactions and visuals in a storyboard, then used Photoshop to dissect the scene into separate layers that could be manipulated by user interaction.

Separated layers of the bottom and high waves in the scene.

Initial Storyboard

In After Effects, we first attempted to create videos of the quotes written in a calligraphy-like style. The writing animations ended up taking too long to produce for each scene, so we used special ink-like effects to recreate the look.

We started putting the scene together by importing libraries and adding the layers of images one at a time. Once we had a basic setup of the waves figured out, we included the SimpleOpenNI User3D example template. To get the interaction to work, we replaced the x-axis locations of specific images with the user’s center-of-mass x variable. When we did not have the Kinect at hand, we prototyped the motion animations with mouseX. We learned to calibrate the Kinect’s coordinates to the sketch with the map() function.
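The calibration idea can be sketched in plain Java like this (the sensor range, sketch width, and variable names here are illustrative, not the exact values from our code):

```java
// Hedged sketch: mapping a Kinect center-of-mass x value into sketch
// coordinates. The ranges below are example values, not our real ones.
public class KinectCalibration {
    // Re-implementation of Processing's map(): linearly rescales a value
    // from one range onto another.
    static float map(float value, float inMin, float inMax, float outMin, float outMax) {
        return outMin + (outMax - outMin) * (value - inMin) / (inMax - inMin);
    }

    public static void main(String[] args) {
        float comX = -300;  // center-of-mass x from the Kinect, e.g. -1000..1000 mm
        // Map it onto a hypothetical 1280-px-wide sketch; the wave layers
        // then draw at this x position every frame.
        float waveX = map(comX, -1000, 1000, 0, 1280);
        System.out.println(waveX);  // 448.0
    }
}
```

Processing’s built-in map() does exactly this rescaling; once the center-of-mass x is mapped into screen space, the wave layers simply follow that x each frame.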

We wrote functions to oscillate the boats back and forth, play videos, and fade in scenes and assets. We also created separate sketches for the particle animations in Processing. To keep the sketch clean, everything was organized into its own function.
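As a rough illustration of those helpers, here is a minimal sketch in plain Java; the names and constants are our own choices for this example, not the original code:

```java
// Hedged sketch of the two kinds of scene helpers: a sine-based
// back-and-forth oscillation for the boats, and a linear fade-in ramp.
public class SceneHelpers {
    // The boat rocks between -maxAngle and +maxAngle degrees,
    // as a sine of elapsed time.
    static float boatAngle(float seconds, float speed, float maxAngle) {
        return (float) Math.sin(seconds * speed) * maxAngle;
    }

    // Alpha ramps from 0 to 255 over fadeDuration seconds,
    // then holds fully opaque.
    static float fadeAlpha(float elapsed, float fadeDuration) {
        return Math.min(elapsed / fadeDuration, 1f) * 255f;
    }

    public static void main(String[] args) {
        System.out.println(boatAngle(0f, 1f, 15f)); // 0.0 at t = 0
        System.out.println(fadeAlpha(1f, 2f));      // 127.5, halfway faded in
    }
}
```

In the sketch itself, each such helper lived in its own function and was called from draw(), which is what kept the main loop readable.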

Code on Github

Sparkles on the drawing: https://github.com/0102mika/sparkles/blob/master/mouseInteraction.pde

Main Sketch: https://github.com/vee97/HOKUSAI/blob/master/Hokusai_New_O.pde

Testing

Testing and determining the best way to set up the piece. We found it effective to project onto a large hanging sheet of mylar with a mirrored display, so that viewers could interact with the piece up close without blocking the projection.

https://drive.google.com/file/d/1wsXkSNpNQ6wLVsPeB1TGPiNgNeOZ2Ozq/view?usp=sharing

Challenges

Hand Detection

Initially, we planned to add a sparkle animation following the position of the waving hand. We tried to use the User3D example file for hand detection, but it did not work properly when combined with our sketch, so in the end we decided to make the sparkle animation follow body movement instead.

We also often had trouble working with the Kinect Processing library, SimpleOpenNI. While it was simple to use with the default example sketches, combining it with our own scene was more challenging because of the several translate() and perspective() calls it came with. The scene’s resolution and display were often altered as a result: sometimes the scene was unintentionally mirrored, or displayed a zoomed-in version of the original image. We had to rebuild the sketch from scratch, testing several times every time we changed or added code, to identify the issue.

We had also attempted a third interaction for our installation: the number of users detected by the Kinect would affect the scene, with the boats oscillating faster as more users were detected. Unfortunately, it did not work; the boats spun around a full 360 degrees, which looked unrealistic. In the final sketch, the boats would also stop oscillating after the sketch reset, which we could not debug.
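In hindsight, one plausible cause of the runaway spinning is accumulating a per-frame rotation increment instead of computing the angle directly from elapsed time; this is our speculation, not a confirmed diagnosis. A sketch in plain Java (with illustrative constants) of the time-based approach, which stays bounded regardless of user count or sketch resets:

```java
// Hedged sketch: the boat angle is derived from absolute elapsed time,
// so a frame hiccup or sketch reset can never wind the boat past its
// rocking range. Speed scales with the number of detected users.
public class UserCountBoats {
    static float boatAngle(float seconds, int userCount, float maxAngleDeg) {
        // More users -> faster rocking; 0.5 Hz per user is an example value.
        float speed = 0.5f * Math.max(userCount, 1);
        // sin() is bounded by [-1, 1], so the angle never exceeds maxAngleDeg.
        return (float) Math.sin(seconds * speed * 2f * (float) Math.PI) * maxAngleDeg;
    }

    public static void main(String[] args) {
        for (int users = 1; users <= 4; users++) {
            System.out.println(boatAngle(0.1f, users, 15f));
        }
    }
}
```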

Sketch of the final outcome:

How we positioned the Kinect, the projector, and the user interaction area.

On the day of the presentation, we noticed that our sketch ran with a lag, an issue we had never encountered during testing and development. It could have been caused by the resolution change, or by the sketch doing too much for Processing to handle. Lowering the particle count in the sparkle animation helped somewhat.
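Lowering the particle count amounts to capping how many particles are alive at once. One way to enforce such a cap, sketched in plain Java (the recycling scheme, class, and names are our illustration, not the original sketch’s code):

```java
import java.util.ArrayDeque;

// Hedged sketch: cap the number of live sparkle particles and recycle
// the oldest one when the cap is hit, so the draw loop's per-frame work
// stays bounded no matter how fast new sparkles are spawned.
public class ParticlePool {
    static final int MAX_PARTICLES = 150; // example cap, lowered to reduce lag
    final ArrayDeque<float[]> particles = new ArrayDeque<>(); // each is {x, y}

    void spawn(float x, float y) {
        if (particles.size() >= MAX_PARTICLES) {
            particles.removeFirst(); // recycle the oldest particle
        }
        particles.addLast(new float[]{x, y});
    }

    public static void main(String[] args) {
        ParticlePool pool = new ParticlePool();
        for (int i = 0; i < 500; i++) {
            pool.spawn(i, i);
        }
        System.out.println(pool.particles.size()); // 150: never exceeds the cap
    }
}
```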

Final Product

Final Product Videos:

https://drive.google.com/file/d/1lRQGLSMWRStfo08hYR2y8f85hx84Uwrw/view?usp=sharing

https://drive.google.com/file/d/1IaM0RFK8OZ57deaPZS6VQRpmNBUxQ_RG/view?usp=sharing

https://drive.google.com/file/d/1mEz7g0c5xiDzYZxxuEtnHwqETpMbQjAx/view?usp=sharing

Master Folder: https://drive.google.com/open?id=1CKJDuclJwnhc52yPIAcPBisZTPIJI8v-

Conclusion

The interactive art installation approach proved to be more effective than the VR piece. By simplifying the project down to one scene, we could focus on making that scene’s interactions and visuals more appealing. Taking the project further, we would refine the animations, continue debugging the code, remove the lag, and focus on improving the viewer’s relationship to the installation. We gained valuable knowledge and experience from working with the Kinect and Processing throughout the development of this project.
