Synopsis: Creation of a puppet storytelling environment in screen space, using hand tracking and hand-motion controls to stage short stories for an audience.
Cast: As Venus, Mufaro Mukoki; as Jupiter, Ricardo ‘Ricky’ Quiza Suárez.
ACT 1 – OVERTURE; The referents
Puppet Narrator. Article: Hand gesture-based interactive puppetry system to assist storytelling for children.
Figure 1: Both images illustrate different Puppet Narrator features. (Left) The system architecture is mainly composed of three parts: input, motion control, and output. (Right) Basic gesture control: an example of using a combination of gestures to steer and manipulate the puppet. (a) Stretch. (b) Grip.
Authors: Hui Liang, a Marie Curie senior research fellow at the National Centre for Computer Animation (NCCA), Bournemouth University, UK, and an associate professor at the Communication University of China; Dr. Ismail K. Kazmi, a Senior Lecturer in Games Programming/Development at Teesside University, where he teaches a wide range of courses in Computer Science and Game Development; Peifeng Jiao, a lecturer at the basic medical school of Southern Medical University, China; Jian J. Zhang, Professor of Computer Graphics at the NCCA, where he leads the National Research Centre; and Jian Chang, Professor and active researcher in computer animation with over 15 years of research experience at the NCCA.
This article was a pivotal point of information gathering and referential research for our project. With this system, the authors intended to develop narrative ability in a virtual story world. Depth motion sensing and hand gesture control technology were utilized to implement user-friendly interaction. A key aspect in the drive for developing the Puppet Narrator was ‘how digital techniques have been used to assist narrative and storytelling, especially in many pedagogical practices; with the rapid development of HCI techniques, young children, saturated with digital media in their daily lives, demand more interactive learning methods and meaningful immersive learning experiences’ (Liang et al. 517).
Its abstract proposes a hand gesture-based puppetry storytelling system for young children: players intuitively use hand gestures to manipulate virtual puppets to perform a story, and interact with different items in the virtual environment to assist narration. Analyzing the data collected in this article helped us scope and give form to our screen space exercise. This data includes how interaction happens through the system architecture (input, motion control, output), what kinds of hand gestures can be used, what skills are trained in users (narrative ability, cognitive skills, and motor coordination ability), what technologies were used in its realization, and what was accomplished through the practical application of a prototype of this Puppet Narrator system with a selected target audience.
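To ground that architecture for ourselves, the following toy JavaScript sketch mirrors how we understood the paper's three-part pipeline. It is only our illustrative reading: the function names, the hard-coded fingertip data, and the distance-based gesture rule are ours, not the authors'.

```js
// Hypothetical sketch of the three-part pipeline as we read it:
// input (hand joints) -> motion control (gesture -> puppet action) -> output (render).

// Input: a pretend sensor frame with thumb/index tip positions (normalized 0..1).
function readInput() {
  return { thumbTip: { x: 0.48, y: 0.5 }, indexTip: { x: 0.52, y: 0.5 } }
}

// Motion control: a toy rule — close fingertips read as "grip", far apart as "stretch".
function motionControl(joints) {
  const dx = joints.thumbTip.x - joints.indexTip.x
  const dy = joints.thumbTip.y - joints.indexTip.y
  return Math.hypot(dx, dy) < 0.06 ? 'grip' : 'stretch'
}

// Output: here just a log; in the real system this would update the rendered puppet.
function output(action) {
  console.log(`puppet action: ${action}`)
}

output(motionControl(readInput()))  // -> "puppet action: grip"
```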
Handsfree.js. Library: An online library with multiple projects demonstrating its potential.
Figure 2: Both images illustrate different Handsfree.js projects. (Left) Laser pointers, but with your finger. (Right) Hand pointers and scrolling text. More examples at https://handsfree.dev/.
Author: Oz Ramos, a.k.a. Metamoar / Midifungi, a generative artist and fxhash artist exploring compositions with p5.js, who has been exploring gesture-driven creative coding for about four years.
Handsfree.js started back in 2018, while the author was homeless, as a way to help a friend recovering from a stroke at the shelter where they stayed by letting them navigate the web with face gestures. Over time, and through the support and encouragement of many people, the project grew into a full-fledged library capable of both face and hand tracking. Regarding hands, it differs from ML5's handpose library in that it can detect multiple hands, up to four, a key aspect for integration into our project.
As we began code-sketching prototypes of our project, we encountered one issue: the libraries we were working with only detected one hand. The Handsfree.js library became key in overcoming that limitation, but it offered more than that. The online repositories of Oz Ramos, the author, documenting his and others' exploratory uses of Handsfree.js were extremely useful, both as inspiration and code-wise; these examples gave us a better understanding of how hands can interact with and influence the screen space. While we manually explored avenues for adapting the library to our screen space, we also incorporated its plugins, such as pre-established hand gestures (pinching fingers), to assign functions within the screen space. A minimal setup is sketched below.
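The sketch below shows roughly how we initialized Handsfree.js with multi-hand tracking inside p5.js. The `hands` configuration block and the `data.hands.landmarks` layout follow our reading of the Handsfree.js documentation; treat the exact property names as assumptions to verify against https://handsfree.dev/.

```js
// Minimal p5.js + Handsfree.js setup (a sketch, assuming the documented
// hands config and data layout; verify property names against handsfree.dev).
let handsfree

function setup() {
  createCanvas(640, 480)
  handsfree = new Handsfree({
    hands: {
      enabled: true,
      maxNumHands: 4   // Handsfree.js tracks up to 4 hands, unlike ML5 handpose
    }
  })
  handsfree.start()
}

function draw() {
  background(240)
  const hands = handsfree.data && handsfree.data.hands
  if (!hands || !hands.landmarks) return
  // landmarks: one array of 21 normalized {x, y} points per detected hand
  hands.landmarks.forEach(hand => {
    if (!hand) return
    hand.forEach(pt => {
      circle((1 - pt.x) * width, pt.y * height, 10)  // mirror x like a webcam
    })
  })
}
```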
ACT 2 – ARIA; The idea
Marionettes have been part of many world cultures for thousands of years and have played a unique role in storytelling. Although puppetry was never reserved solely for the entertainment of children, our project focused on using it to assist storytelling for children. Puppetry continues to be a respected art form; however, the new generation, born into technology, is not much fascinated by the physical instruments of play around them, having spent the better part of their lives glued to screens. It is important for educators to devise more interactive learning methods and meaningful immersive learning experiences (Liang et al. 517).
In our project, we attempted to answer the following questions:
- How can human-computer interaction assist in learning and communication?
- What are new ways of engaging in education to facilitate learning?
- How does HCI assist/improve engagement?
Our project uses Handsfree.js, which is easy to access through a browser. Handsfree.js uses hand tracking to position cursors on the screen and incorporates hand gestures, making the program intuitive to use. As a result, the user can control the marionette puppet on the screen with simple hand gestures. To control the puppet, fingers are mapped to specific strings of the puppet. This results in a puppet that can move anywhere on the screen and whose arms make puppet-like movements. Other functions of the marionette puppet(s), such as mouth movements, are triggered by further hand gestures, like pinching the fingers; a simplified sketch of this mapping follows Figure 3.
Figure 3: A scene is composed of 1 to 2 puppets and interactive stage props. Hands are used to control puppet movement via fingertips, and gestures trigger different interactions.
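To make the fingertip-to-string mapping concrete, here is a simplified sketch of the idea. The fingertip landmark indices (4, 8, 12, 16, 20) are MediaPipe's standard ones; the fixed string length, the limb drawing, and the `mouthOpen` helper are hypothetical simplifications, and the `pinchState` layout is our reading of the Handsfree.js docs rather than a guaranteed API.

```js
// Fingertip-to-string mapping (a simplified sketch of the idea, not our full puppet).
// MediaPipe hand landmark indices for the five fingertips:
const FINGERTIPS = [4, 8, 12, 16, 20]  // thumb, index, middle, ring, pinky
const STRING_LENGTH = 120              // hypothetical string length in pixels

// `hand` is one entry of handsfree.data.hands.landmarks: 21 normalized points.
function drawMarionette(hand) {
  FINGERTIPS.forEach(i => {
    const x = (1 - hand[i].x) * width  // mirror x so movement feels natural
    const y = hand[i].y * height
    line(x, y, x, y + STRING_LENGTH)   // the string hangs from the fingertip
    circle(x, y + STRING_LENGTH, 18)   // a limb joint at the string's end
  })
}

// Pinch-to-talk: Handsfree.js exposes a per-hand, per-finger pinch state
// (per our reading of the docs; verify against handsfree.dev).
function mouthOpen(handIndex) {
  const pinch = handsfree.data.hands.pinchState
  return !!(pinch && pinch[handIndex] && pinch[handIndex][0] === 'held')
}
```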
The project provides a range of ready-made scenes, characters, and props for educators to choose from to assist them in telling their chosen stories. If more than one scene is chosen, pressing the arrow keys on the keyboard navigates from one scene to the next (a minimal pattern is sketched after Figure 4).
Figure 4: The storytelling unfolds through multiple scenes, controlled by the arrow keys. For our implementation, we chose Aesop's fable “The Cat-Maiden”.
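Scene switching of this kind maps naturally onto p5.js's `keyPressed()` hook. The sketch below is a minimal illustration; the scene names are placeholders rather than our actual assets.

```js
// Minimal p5.js scene switcher driven by the arrow keys (scene names are placeholders).
const scenes = ['scene 1: the proposal', 'scene 2: the wedding', 'scene 3: the mouse']
let current = 0

function setup() {
  createCanvas(640, 480)
  textAlign(CENTER, CENTER)
}

function keyPressed() {
  if (keyCode === RIGHT_ARROW) current = (current + 1) % scenes.length
  if (keyCode === LEFT_ARROW)  current = (current + scenes.length - 1) % scenes.length
}

function draw() {
  background(220)
  text(scenes[current], width / 2, height / 2)
}
```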
Figure 5: Interaction in the system occurs in a dialectic way: the puppeteer(s) control the action through one screen space, which is reflected on another screen space. Props can be used to expand the space where the action happens, mimicking puppeteer stands.
The final product is presented on a television-sized screen, and physical props, such as curtains and boxes, can be added around the screen space to emulate a traditional puppet show. An essential requirement is that the program be user-friendly and demand minimal effort from the facilitator, which means the facilitators can be the children themselves, with supervised assistance. This makes it easy for them to adapt it to a variety of different stories. The facilitator still needs to stand behind the main screen, much in the fashion of a typical puppet show; however, they have the advantage of a smaller screen on which to coordinate their story. This means they also get to see their narrative as it is experienced by others.
Figure 6: Final version of the coded system while being implemented.
ACT 3 – FINALE; The documentation
Figure 7: Body of work showing the evolution of the project.
For our presentation, we made use of a 70-inch TV connected to a laptop via HDMI. We stood and presented our puppet show in front of the laptop, and the output was shown on both the laptop and the television. As accessibility was an important consideration, we situated the puppet show in a small room to amplify the audio as much as possible without requiring additional hardware. The presentation was accessed through the Chrome browser, and the puppet show was presented to an audience who sat and/or stood in front of the large-screen television. In preparation, we stood in front of the laptop to test the best distance from which to move the puppets easily (2.5 inches) and to prevent the software from picking up any extra hands that would disrupt the performance.
Upon reflection, we ought to have created a physical structure around the large-screen TV to emulate the stage of a puppet show and create a more immersive experience. Our main challenge and limitation was our own limited experience with coding. We experimented with various code sketches to mimic and simulate hand movements and to map our puppet strings to human fingers via the web camera.
Figure 8: One of the usability tests of different ways of applying hand detection on the screen and using it as a means of moving objects in the screen space.
Figure 9: Mock-up practices and usability test of the system. The interaction happens on a screen that the puppeteers use to conduct the show, while the puppet show is output to an external, bigger screen.
You can watch our presentation here:
https://www.youtube.com/watch?v=6LSZq3VBrLo
The final code utilized in the presentation (Edit):
https://editor.p5js.org/ricardokiza654/sketches/94qop2xpi
The final code utilized in the presentation (FullScreen):
https://editor.p5js.org/ricardokiza654/full/94qop2xpi
Some notable sketches we created can be found on the following links:
https://editor.p5js.org/ricardokiza654/sketches/GDkoYSEVd
https://editor.p5js.org/ricardokiza654/sketches/Xa5LVG0yX
https://editor.p5js.org/mufaromukoki/sketches/mi2fY65HfG
https://editor.p5js.org/mufaromukoki/sketches/XFVMztb-l
To view more of our documentation visit this link.
Bibliography
Aesop. “The Cat-Maiden.” Aesop’s Fables, Lit2Go Edition, 1867, https://etc.usf.edu/lit2go/35/aesops-fables/377/the-cat-maiden/. Accessed 24 Oct. 2022.
Canada Museum of History. Qualities of a Puppet, Canada Museum of History, 2022, https://theatre.historymuseum.ca/narratives/details.php?lvl2=4812&lvl3=4826&language=english.
Fling, Helen. Marionettes: How to Make and Work Them. Dover Publications, 1973.
Flower, Cedric, and Alan Jon Fortney. Puppets: Methods and Materials. Davis Publications, 1993.
Liang, Hui, et al. “Hand Gesture-Based Interactive Puppetry System to Assist Storytelling for Children.” The Visual Computer, vol. 33, no. 4, 2016, pp. 517–531, https://doi.org/10.1007/s00371-016-1272-6.
Mediapipe Solutions. “MediaPipe Hands.” Mediapipe, 2022, https://google.github.io/mediapipe/solutions/hands.html.
Ramos, Oz. “Handsfree.js Intro (Spring 2019).” Vimeo, 24 Oct. 2022, https://vimeo.com/476537051.
Ramos, Oz. “Hands-Free Libraries, Tools, and Starter Kits.” Handsfree.dev, 17 May 2022, https://handsfree.dev/.
Ramos, Oz. “MIDIBlocks/Handsfree: Quickly Integrate Face, Hand, and/or Pose Tracking to Your Frontend Projects in a Snap ✨👌.” GitHub, MidiBlocks, 2022, https://github.com/MIDIBlocks/handsfree.
Victory Infotech. “Indian Wedding Bride In Choli And Groom Kurta Pajama With Koti Couple Traditional Outfits Doodle Art PNG Picture.” Pngtree, Pngtree, 11 July 2022, https://pngtree.com/element/down?id=ODA3MTc2Ng&type=1&time=1666238499&token=NDllYmQ3MzdmY2JiYTkwMmRmYjg1MjEwMjBkYWE1M2M.