Author Archives: Ricardo Quiza Suarez

Sketch 5 – Cat Mermaid “Ricky” QS

In this sketch, I experimented with replicating a single image across different screen outputs.
I studied how to convert an image into a sequence of data and embed that sequence in code that an Arduino uses to send it to a screen. Starting from class code examples of transmitting data between devices, I adapted them with elements of example code I found that used another library for drawing bitmaps. I also played with interactivity by mapping the movement of the mouse to replicate the same image, like a visual texture, on the screen connected to the Arduino.
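As a rough illustration of the conversion step, here is a plain JavaScript sketch of packing a monochrome image into the kind of byte array that OLED bitmap-drawing functions expect (one byte per 8 horizontal pixels, MSB first, as in the Instructables reference linked below; the function name and layout assumptions are mine):

```javascript
// Sketch: pack a monochrome image (array of 0/1 values) into the byte
// sequence OLED bitmap-drawing routines typically consume. Each byte holds
// 8 horizontal pixels, most significant bit first; rows are padded to a
// whole number of bytes. Function name is hypothetical.
function packBitmap(pixels, width, height) {
  const bytesPerRow = Math.ceil(width / 8);
  const out = new Uint8Array(bytesPerRow * height);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      if (pixels[y * width + x]) {
        // set this pixel's bit: leftmost pixel -> highest bit of the byte
        out[y * bytesPerRow + (x >> 3)] |= 0x80 >> (x & 7);
      }
    }
  }
  return out;
}

// An 8x2 image: top row lights only its two corner pixels, bottom row is solid
const bytes = packBitmap(
  [1, 0, 0, 0, 0, 0, 0, 1,
   1, 1, 1, 1, 1, 1, 1, 1], 8, 2);
// bytes[0] === 0b10000001 (129), bytes[1] === 0b11111111 (255)
```

The resulting numbers would then be pasted into the Arduino sketch as a byte array for the display library to draw.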

CODES: https://github.com/zeldaso2ff/Cat-Mermaid-Sketch.git

REFERENCE FOR CREATING BITMAPS: https://www.instructables.com/How-to-Display-Images-on-OLED-Using-Arduino/


Catopia

Created by: Mufaro Mukoki, Sunidhi “Pom” Naik, Ricardo “Ricky” Quiza Suarez

Emotive Object working: https://www.youtube.com/watch?v=ED_QEEqdLFM

Related Works: 

For our project, we drew inspiration from electronic cat toys. SMARTYKAT is a US and Asia-based company that creates innovative products for use by cats and their owners. The company was founded in 1990 and has since strived to provide “innovative products, stylish designs, environmental responsibility and affordability all in one place” (Worldwise Inc 2021). The majority of their products are toys meant to encourage interaction, exercise, play, independence and safety. Loco Motion is one of the several toys offered by SmartyKat.

Figure 1: LocoMotion by SmartyKat (Amazon.ca 2022)

This toy, meant to mimic the motion of prey, served as our starting point. One such type of prey is birds, and this simple electronic toy is meant to satisfy the hunting spirit of a cat. A wand, with a feather suspended from it on one end, moves in 360-degree regular motions. Another important aspect of this toy is that the cat owner can control the speed to ensure the cat’s safety.

Figure 2: Pulatree Interactive Cat Toy (Amazon.ca 2022)

Another variation is the Pulatree 3-in-1 Automatic Cat Toy. Although not much information is available on its manufacturer, Kemi Intelligent Manufacturing (Shenzhen), the toy seems to build on the concepts of Loco Motion. It combines three different elements: a feather wand in circular motion, a laser toy, and a ball, to engage the cat mentally and physically. Three speed settings are used: slow, fast and random.

From the mechanisms of these toys, we could derive the elements that make for a successful cat toy. Circular rotations, and the ways they can be manipulated, are a good way to mimic playfulness and initiate the participation of the cat. We could achieve these varied effects by using both continuous and standard servos. However, we were using strings as our chosen material, so we had to factor this into the conceptualization. The motions of birds and feathers inspire cats to hunt and needed to be part of the design. Cats also love to play with lasers, so we could play around with light sensors to achieve this effect. Lastly, it was important to ensure a safe environment for the cat by giving cat owners ways to help control the toys, such as being able to alter the speed of rotation.

Concept: 

What will cat toys be like in the future? How can the cat owner engage and play with them in diverse ways within a particular area? How do we translate codes from theme parks or circuses, inscribed within very contextual spaces, to other dimensionally scaled and localized ones (recontextualizing them)? These were some of the questions we asked ourselves as we developed our emotive object, a project born out of exploring concepts like playing, capturing attention, twisting and gravitating.

We selected as constraints for our project “strings” as the material and “to twist” as the action verb. We began by individually exploring sketches that harness the potential of strings in creating engaging experiences: how they react to gravity and weight; what unique interactions and “twistiness” one can achieve using different types of strings (soft, hard, metallic, elastic, etc.); how to trigger or modulate string mechanisms; what support-like pieces are needed alongside the strings; and how to trigger and emote a twist.

Circuses and theme parks were pivotal in landing the imagery of our emotive object; they are never “dead”, offering a myriad of visual codes and tricks to capture attention. These two iconic man-made constructs incorporate intricate mechanisms and vibrant imagery and colours to engage the user. “Fun” is experienced in diverse ways, with the “mounting a show” concept fleshed out through different mechanisms: seemingly separate parts cohesively stitched together by the string of “having fun”.

We arrived at the “cat toy mechanism” part of the concept after considering how we, as cat owners, envision a future type of cat toy: how Arduino, as an extension of technology and physical computing, can be implemented to excite cats and play with them; how to translate traditional cat toys through the physical computing lens; and how to use strings to create twisting mechanisms that impress and capture a cat’s attention. The increasing number of cat-related spaces, such as cat cafes, made us imagine how to develop new, future ones. We decided to use everyday objects as materials for our emotive object because cats often feel more attracted to playing with them than with professional toys. The colour palette emphasizes blue, the colour that cats see most clearly (they also see yellowy-green colours well), and was inspired by the vibrant, colourful palettes used in circuses and theme parks.

CATOPIA, our project, combines 3 mechanisms that allude to 3 specific cat toys. They use strings as a core material, different motions of twisting, and unique ways of engaging with cats.

Mechanism 1 – Idle state Carousel. Inspiration: The feather teaser cat toy, carousels.

Figure 3: (Up) The idle state of our emotive object, the carousel. (Down) The mechanism, board, and trigger (a potentiometer controlling speed and direction) are located in the upper zone of our emotive object. A continuous servo was used in this mechanism.

Inspired by the feather teaser cat toy, we created a carousel-like mechanism using a continuous servo. It acts as the idle state of our emotive object. A potentiometer located on the nose of the upper “cat box” adjusts velocity and direction, regulating the twist. We explored ways of interacting with the emotive object that are non-evident yet intuitive: cat owners commonly touch or pet a cat’s nose, so we wanted to use that gesture in our emotive object.
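The potentiometer-to-servo logic can be sketched in plain JavaScript (re-creating Arduino’s map(); the 0–1023 and 0–180 ranges follow Arduino conventions, and the function names are my own):

```javascript
// Re-creation of Arduino's map(): linearly rescale a value from one range
// to another, truncated to an integer as in the Arduino implementation.
function mapRange(value, inMin, inMax, outMin, outMax) {
  return Math.floor((value - inMin) * (outMax - outMin) / (inMax - inMin) + outMin);
}

// A continuous servo conventionally interprets 90 as "stop"; values below
// and above 90 spin it in opposite directions, faster the further from 90.
// The potentiometer's 0-1023 reading therefore encodes both speed and
// direction in a single knob.
function potToServo(potValue) {
  return mapRange(potValue, 0, 1023, 0, 180);
}

potToServo(0);    // -> 0   (full speed, one direction)
potToServo(512);  // -> 90  (roughly centred: servo stops)
potToServo(1023); // -> 180 (full speed, opposite direction)
```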


Figure 4: The feather teaser cat toy, the inspiration we used in creating this mechanism.
Source: (Left) https://www.youtube.com/watch?v=PLO4lUCYnro&t=133s, (Right) Creator:
Tambako the Jaguar. Copyright: CC BY-ND Tambako 2016

Mechanism 2 – Alive state hands. Inspiration: Laser cat toy, Maneki Neko.

Figure 5: (Up) One of the two alive states, the hand’s movement is triggered by light, and the light sensor is hidden behind a yarn ball.
(Down) Rear view of the inner working of the mechanisms, as well as a view of the bobbin and servo inside the upper area, where it is covered.

The back ‘arms’ of our cat introduce the yarn ball and laser features of the toy. The ball is attached to one of the arms, and the arms’ movement is triggered by a light shining on the ball. The mechanism assumes the cat will be attracted to the light that exposes the ball; as the cat moves toward it, the ball moves up to tease the cat. This effect is achieved with a pulley system that drops and lifts the arms. The string of the pulley is wound and unwound by a bobbin attached to a standard servo sweeping back and forth across its 180-degree range. A light sensor ensures the mechanism is only triggered in the presence of light, and a laser pointer is recommended for playing with the cat.
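The triggering logic reduces to a threshold decision, sketched here in JavaScript (the threshold value and angles are hypothetical tuning values, not measurements from our build):

```javascript
// Hypothetical analog threshold above which "light is shining on the ball"
const LIGHT_THRESHOLD = 600;

// Decide the bobbin servo's target angle from the raw light reading:
// in bright light the bobbin winds the pulley string (arms lift the ball),
// otherwise it unwinds and the arms drop back down.
function armTargetAngle(lightReading) {
  return lightReading > LIGHT_THRESHOLD ? 180 : 0;
}

armTargetAngle(900); // bright: wind the bobbin -> 180
armTargetAngle(200); // dark: unwind -> 0
```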

Figure 6: The laser cat toy and the Maneki Neko were some of the inspirations we used in creating this mechanism. Source: (Left) https://www.youtube.com/watch?v=ebLaWzjQ2Xg, (Right) https://www.youtube.com/watch?v=zXX79eaZsOg&t=101s

Mechanism 3 – Alive state snake. Inspiration: Snake cat toy.

Figure 7: (Up) One of the two alive states, the stripes’ movements are triggered by the absence of light, and the light sensor is on top of the nose of the cat’s face.
(Down) Top view of the inner working of the mechanisms, as well as a view of the sensor in the face area, and a closer view of the stripes.

The last mechanism is the base, which serves as another way to tease the cat. Individual strings are stitched into nine plastic strips folded in an accordion manner and secured on one end with a bead acting as a weight. The other ends are secured to a bobbin connected to a continuous servo. A light sensor, located at the nose of the base, is attached to the circuit. When the cat owner covers the sensor, the bobbin winds, tightening the threads and shrinking the accordion up the ramp, possibly away from the cat. After a short delay, the bobbin unwinds, releasing the tension from the strings and relaxing the accordion.
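The cover / wind / delay / unwind cycle can be modelled as a small state machine, sketched here in JavaScript (the state names and timings are my own, purely illustrative):

```javascript
// Illustrative timings for the accordion cycle, in milliseconds
const WIND_MS = 1500;  // time spent winding the bobbin
const PAUSE_MS = 1000; // delay before releasing the tension

// One step of the accordion's state machine. `elapsedMs` is the time spent
// in the current state; `sensorCovered` is true while the owner covers the
// light sensor on the nose of the base.
function nextState(state, sensorCovered, elapsedMs) {
  switch (state) {
    case 'idle':      return sensorCovered ? 'winding' : 'idle';
    case 'winding':   return elapsedMs >= WIND_MS ? 'paused' : 'winding';
    case 'paused':    return elapsedMs >= PAUSE_MS ? 'unwinding' : 'paused';
    case 'unwinding': return elapsedMs >= WIND_MS ? 'idle' : 'unwinding';
    default:          return 'idle';
  }
}

nextState('idle', true, 0);        // -> 'winding' (accordion shrinks up the ramp)
nextState('winding', false, 2000); // -> 'paused'
nextState('paused', false, 1200);  // -> 'unwinding' (accordion relaxes)
```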

Figure 8: The snake cat toy’s sinuous movement was inspirational in creating this mechanism.
Source: https://www.youtube.com/shorts/5EhlTOd3-SY

Documentation: 

Codes:

1st mechanism (carousel): 

https://github.com/zeldaso2ff/Mechanism-1-carousel/blob/main/Mechanism1/Mechanism1.ino

2nd mechanism (hands): 

https://github.com/mufaromukoki/Arduino-First-Group-Project-Arm-Movement/blob/7852578cd76df5badddba27f90cd12b69bf3dc12/Arduino%20Group%20Project%20Arm%20Code%20with%20light%20sensor.ino

3rd mechanism (accordion stripes): 

https://github.com/zeldaso2ff/Mechanism3-Stripes-Accordion/blob/main/Mechanism3/Mechanism3.ino

Sketches:

img_3198 untitled_artwork-18 untitled_artwork-19 untitled_artwork-20

BIBLIOGRAPHY

Amazon.ca. (2022). Amazon.com: SmartyKat Loco Motion Electronic Motion teaser wand cat toy … Amazon.ca. Retrieved November 14, 2022, from https://www.amazon.com/SmartyKat-Loco-Motion-Automated-Activity/dp/B000N9W7HW 

Amazon.ca. (2022). Pulatree Interactive Cat Toys, 4 in 1 USB rechargeable cats laser toy feather kitten toys, cat ball toys for indoor play Chase Exercise, auto off, 3 speed modes + 2-angle light, KM05. Amazon.ca: Pet Supplies. Retrieved November 14, 2022, from https://www.amazon.ca/dp/B09X9V31GL/ref=sspa_dk_detail_0?pd_rd_i=B09X9V31GL&pd_rd_w=HuKKx&content-id=amzn1.sym.c7dca932-da6a-44fc-af09-cc68d2449b34&pf_rd_p=c7dca932-da6a-44fc-af09-cc68d2449b34&pf_rd_r=MD8V4NS7YHWTHC6SFYFP&pd_rd_wg=YEUF6&pd_rd_r=d80a433f-7c72-45ed-81a2-b321517f3a31&s=pet-supplies&sp_csd=d2lkZ2V0TmFtZT1zcF9kZXRhaWw&th=1 

Worldwise Inc. (2021, July 7). About SmartyKat. SmartyKat.com. Retrieved November 14, 2022, from https://smartykat.com/about-us/ 

 

 

Sketch 4 – Crib rattle with strings “Ricky” QS

Link to code:

https://github.com/zeldaso2ff/Ricardo-Quiza.git

Link to YouTube video of the mechanism:

https://www.youtube.com/shorts/AMMvLdTDZf8

As part of our group exercise exploration, I wanted to focus on a sketch that navigates the possibilities of using strings in a physical computing environment, exploring weight, rotative movement, and playfulness as concepts. My initial idea was to use strings along with tin-tacks, making the strings follow specific directions by rotating them from a continuous servo attached over the top. I encountered several issues while deploying that idea, such as needing a bigger radius during rotation and more weight.


First exploration of using strings
with a continuous servo. 

The final exploration uses a crib’s rattle as inspiration while fixing the mistakes of the previous attempt. The strings are attached to paper clips that have been extended to create a wider radius for smooth rotation. The continuous servo is attached to a horizontal structure over the top, with the strings hanging from it. The speed and direction of the continuous servo are set with a map function linked to the value registered by a potentiometer.

Strings with beads attached at the bottom to add weight (left); continuous servo and rotation base, which uses clips with the strings attached (right).

Rotation movement using 85 degrees, before adding the potentiometer that allows speed and direction control.

Board used.

 

Virtuālis Puppet

Synopsis: Creation of a puppet storytelling environment in screen spaces through the use of hand tracking and hand motion controls for staging short stories for an audience.

Cast: As Venus, Mufaro Mukoki; as Jupiter, Ricardo ‘Ricky’ Quiza Suárez


ACT 1 – OVERTURE; The referents

Puppet Narrator. Article: Hand gesture-based interactive puppetry system to assist storytelling for children.

Figure 1: Both images illustrate different Puppet Narrator features. (Left) The system architecture is mainly composed of three parts: input, motion control, and output. (Right) Basic gesture control, an example of combining gestures to steer and manipulate the puppet: (a) stretch, (b) grip.

Authors: Hui Liang, a Marie Curie senior research fellow at the National Centre for Computer Animation (NCCA), Bournemouth University, UK, and an associate professor at the Communication University of China; Ismail K. Kazmi, a senior lecturer in games programming/development at Teesside University, where he teaches a wide range of courses in computer science and game development; Peifeng Jiao, a lecturer at the basic school of the Southern Medical University of China; Jian J. Zhang, professor of computer graphics at the NCCA, where he leads the National Research Centre; and Jian Chang, professor and active scientist in computer animation with over 15 years of research experience at the NCCA.

This article was a pivotal source of information and referential research for our project. With this system, the authors intended to develop narrative ability in a virtual story world. Depth motion sensing and hand gesture control technology were utilized to implement user-friendly interaction. A key aspect in the drive for developing the Puppet Narrator was ‘how digital techniques have been used to assist narrative and storytelling, especially in many pedagogical practices; with the rapid development of HCI techniques, saturated with digital media in their daily lives, young children, demands more interactive learning methods and meaningful immersive learning experiences’ (Liang, Hui, et al 517).

Its abstract proposes the creation of a hand gesture-based puppetry storytelling system for young children: players intuitively use hand gestures to manipulate virtual puppets to perform a story, and interact with different items in the virtual environment to assist narration. Analyzing the data collected in this article helped us scope and give form to our screen space exercise. This data includes how interaction through the system architecture happens (input, motion control, output), what kinds of hand gestures can be used, what skills are trained in the audience through its use (narrative ability, cognitive skills, and motor coordination), what technologies were used in its realization, and what was accomplished through the practical application of a prototype of the Puppet Narrator system with a selected target audience.

Handsfree.js. Library: Online library with multiple projects about its potentiality.

Figure 2: Both images illustrate different Handsfree.js projects. (Left) Laser pointers but with your finger (Right) Hand pointers and scrolling text. More examples at https://handsfree.dev/.

Author: Oz Ramos, a.k.a. Metamoar / Midifungi, a generative artist exploring compositions with p5.js and an fxhash artist, who has been exploring gesture-driven creative coding for about 4 years.

Handsfree.js started back in 2018, while the author was homeless, to help a friend recovering from a stroke at the shelter they inhabited navigate the web with face gestures. Over time, and through the support and encouragement of many people, the project grew into a full-grown library with the capacity for both face and hand tracking. Regarding hands, it differs from the ML5 handpose library in that it can detect multiple hands, up to four, a key aspect for our project.

As we began code-sketching prototypes of our project, we encountered one issue: the libraries we were working with only detected one hand. Handsfree.js became key to overcoming that limitation, but it gave us more than that. The online repositories of Oz Ramos and of others exploring Handsfree.js were invaluable; both inspiration- and code-wise, these examples deepened our understanding of how hands can interact with and influence the screen space. While we manually explored ways of adapting the library to our screen space, we also incorporated plugins, like pre-established hand gestures (pinching fingers), to allocate functions within the screen space.


ACT 2 – ARIA; The idea

Marionettes have been part of many world cultures for thousands of years and have played a unique role in storytelling. Although puppetry was never reserved solely for the entertainment of children, our project focused on using it to assist storytelling for children. Puppetry continues to be a respected art form; however, the new generation, born into technology, is not much fascinated by the physical instruments of play around them, having spent the better part of their lives glued to screens. It is important for educators to devise more interactive learning methods and meaningful immersive learning experiences (Liang, Hui, et al 517).

 In our project, we attempted to answer the following questions:

  • How can human-computer interaction assist in learning and communication?
  • What are new ways of engaging in education to facilitate learning?
  • How does HCI assist/improve engagement?

Our project uses Handsfree.js, which is easy to access through a browser. Handsfree.js uses hand tracking to position the cursor on the screen and incorporates hand gestures, creating an intuitive program. As a result, the user can control the marionette puppet on screen with simple hand gestures. To control the puppet, fingers are mapped to specific strings of the puppet, resulting in a puppet that can move anywhere on the screen and whose arms make puppet-like movements. Other functions of the marionette puppet(s), such as mouth movements, are triggered by other hand gestures, like pinching the fingers.
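The finger-to-string mapping can be sketched like this (MediaPipe-style hand landmarks, which Handsfree.js exposes, use indices 4, 8, 12, 16 and 20 for the five fingertips and normalized 0–1 coordinates; the mirroring choice and function names here are my own):

```javascript
// MediaPipe hand-landmark indices for the five fingertips
const FINGERTIPS = [4, 8, 12, 16, 20]; // thumb, index, middle, ring, pinky

// Landmarks arrive normalized to [0, 1]; scale them to canvas pixels.
// x is mirrored so on-screen movement matches the webcam like a mirror.
function landmarkToCanvas(lm, canvasW, canvasH) {
  return { x: (1 - lm.x) * canvasW, y: lm.y * canvasH };
}

// One anchor point per fingertip: each anchor is where a puppet string
// attaches, so wiggling a finger tugs the corresponding puppet limb.
function stringAnchors(landmarks, canvasW, canvasH) {
  return FINGERTIPS.map(i => landmarkToCanvas(landmarks[i], canvasW, canvasH));
}

landmarkToCanvas({ x: 0.25, y: 0.5 }, 640, 480); // -> { x: 480, y: 240 }
```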

Figure 3: A scene is composed of 1 to 2 puppets and interactive stage props. Hands are used to control puppet movement via fingertips, and gestures trigger different interactions.

The project provides a range of ready-made scenes, characters and props for educators to choose from to assist them in telling their chosen stories. If more than one scene is chosen, pressing the arrow keys on the keyboard will help them navigate from one scene to the next.

Figure 4: The storytelling unfolds through multiple scenes, controlled by the arrow keys. For the implementation, we chose Aesop’s fable “The Cat-Maiden”.

Figure 5: The interaction in the system occurs in a dialectic way; the puppeteer(s) controls action through screen space, reflected upon another screen space. Props can be used to expand upon the space where the action happens, mimicking puppeteer stands.

The final product is presented on a screen the size of a television and physical props, such as curtains and boxes, can be added around the screen space in an effort to emulate a traditional puppet show. An essential part of the program is that it must be user-friendly and require minimum effort from the facilitator which means the facilitators can be the children themselves, with supervised assistance. This makes it easy for them to adapt to a variety of different stories. The facilitator will still need to stand behind the main screen, much like the fashion of a typical puppet show, however, they have the advantage of having a smaller screen to coordinate their story. This means that they will also get to see their narrative as it is experienced by others.

Figure 6: Final version of the coded system while being implemented.



ACT 3 – FINALE; The documentation

Figure 7: Body of work showing the evolution of the project.

For our presentation, we used a 70-inch TV connected to a laptop via HDMI. We stood and presented our puppet show in front of the laptop, and the output was projected onto both the laptop and the television. As accessibility was an important consideration, we situated the puppet show in a small room to amplify the audio as much as possible without requiring additional hardware. The presentation was accessed through the Chrome browser. The puppet show was presented to an audience who sat and/or stood in front of the large-screen television. In preparation, we stood in front of the laptop to test the best distance from which to move the puppets most easily (2.5 inches) and to prevent the software from picking up additional hands that would disrupt the performance.

Upon reflection, we ought to have created a physical structure around the large-screen TV to emulate the stage of a puppet show and create a more immersive experience. Our main challenge and limitation was our own limited experience with coding. We experimented with various codes and sketches to mimic and simulate hand movements and map our puppet strings to human fingers via the web camera.

Figure 8: One of the usability tests of different ways of applying hand detection on the screen, and using it as means of moving objects on the screen space.

Figure 9: Mock-up practices and usability tests of the system. The interaction happens on a screen that the puppeteers use to conduct the show, while the puppet show is output to an external, bigger screen.

You can watch our presentation here: 

https://www.youtube.com/watch?v=6LSZq3VBrLo

The final code utilized in the presentation (Edit):

https://editor.p5js.org/ricardokiza654/sketches/94qop2xpi

The final code utilized in the presentation (FullScreen):

https://editor.p5js.org/ricardokiza654/full/94qop2xpi

 

Some notable sketches we created can be found on the following links:

https://editor.p5js.org/ricardokiza654/sketches/GDkoYSEVd

https://editor.p5js.org/ricardokiza654/sketches/Xa5LVG0yX

https://editor.p5js.org/mufaromukoki/sketches/mi2fY65HfG

https://editor.p5js.org/mufaromukoki/sketches/XFVMztb-l

 

To view more of our documentation visit this link.

 

Bibliography

Aesop. “The Cat-Maiden.” Aesop’s Fables. Lit2Go Edition. 1867. Web. <https://etc.usf.edu/lit2go/35/aesops-fables/377/the-cat-maiden/>. October 24, 2022.

Canada Museum of History. Qualities of a Puppet, Canada Museum of History, 2022, https://theatre.historymuseum.ca/narratives/details.php?lvl2=4812&lvl3=4826&language=english.

Fling, Helen. Marionettes: How to Make and Work Them. Dover Publications, 1973.

Flower, Cedric, and Alan Jon Fortney. Puppets: Methods and Materials. Davis Publications, 1993.

Liang, Hui, et al. “Hand Gesture-Based Interactive Puppetry System to Assist Storytelling for Children.” The Visual Computer, vol. 33, no. 4, 2016, pp. 517–531., https://doi.org/10.1007/s00371-016-1272-6

Mediapipe Solutions. “MediaPipe Hands.” Mediapipe, 2022, https://google.github.io/mediapipe/solutions/hands.html.

Ramos, Oz. “Handsfree.js Intro (Spring 2019).” Vimeo, Oz Ramos, 24 Oct. 2022, https://vimeo.com/476537051.

Ramos, Oz. “Hands-Free Libraries, Tools, and Starter Kits.” Handsfree.dev, 17 May 2022, https://handsfree.dev/.

Ramos, Oz. “MIDIBlocks/Handsfree: Quickly Integrate Face, Hand, and/or Pose Tracking to Your Frontend Projects in a Snap ✨👌.” GitHub, MidiBlocks, 2022, https://github.com/MIDIBlocks/handsfree.

Victory Infotech. “Indian Wedding Bride In Choli And Groom Kurta Pajama With Koti Couple Traditional Outfits Doodle Art PNG Picture.” Pngtree, Pngtree, 11 July 2022, https://pngtree.com/element/down?id=ODA3MTc2Ng&type=1&time=1666238499&token=NDllYmQ3MzdmY2JiYTkwMmRmYjg1MjEwMjBkYWE1M2M.

Sketch 3 – Cat-rnaval – Ricardo “Ricky” QS

Edit Link: https://editor.p5js.org/ricardokiza654/sketches/3ehu_nZC0

Screen: https://editor.p5js.org/ricardokiza654/full/3ehu_nZC0

For this sketch, I wanted to create “popping” textures with .pngs appearing all around the screen, with different values and ways of appearing. This sketch also explores extracting drawing info from a .png so it can be redrawn pixel by pixel and then modified, for example by altering its color.

Reference: https://www.youtube.com/watch?v=3gXpk2mvTWk
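The pixel-by-pixel color modification can be illustrated on a raw RGBA array (the same flat layout p5.js exposes through loadPixels()/pixels[]); the red–blue channel swap chosen here is just one example transformation, not the exact one from my sketch:

```javascript
// Swap the red and blue channels of a flat RGBA pixel array: every group
// of four values is [r, g, b, a] for one pixel, as in p5.js's pixels[].
function swapRedBlue(pixels) {
  const out = Uint8ClampedArray.from(pixels);
  for (let i = 0; i < out.length; i += 4) {
    const r = out[i];
    out[i] = out[i + 2]; // red takes blue's value
    out[i + 2] = r;      // blue takes red's value
  }
  return out;
}

swapRedBlue([10, 20, 30, 255]); // -> [30, 20, 10, 255]
```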

Sketch 2 – Grab, Spring and Loom – Ricardo “Ricky” QS

Edit Link: https://editor.p5js.org/ricardokiza654/sketches/O36vcXFck
Full Screen: https://editor.p5js.org/ricardokiza654/full/O36vcXFck

For this sketch, I wanted first to create conditions in which interaction would happen, in this case “grabbing” an object to move it (and then more functions). Secondly, I wanted to play with for loops and the sin() function to create different types of smooth back-and-forth animations (I strived to simulate spring and loom movements) as well as loop designs.
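The sin-based back-and-forth motion boils down to something like this (amplitude, speed, and the per-index phase lag are arbitrary example values):

```javascript
// A smooth back-and-forth offset: Math.sin() sweeps -1..1 as its argument
// grows, so the result oscillates between -amplitude and +amplitude.
function springOffset(frame, amplitude = 50, speed = 0.05) {
  return amplitude * Math.sin(frame * speed);
}

// Staggering the phase per loop index gives the wave-like "loom" effect:
// each element lags slightly behind the previous one.
function loomOffsets(count, frame) {
  const offsets = [];
  for (let i = 0; i < count; i++) {
    offsets.push(springOffset(frame + i * 10));
  }
  return offsets;
}

springOffset(0); // -> 0 (the oscillation starts at rest)
```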

 

 


Sketch 1 – GhostHand – Ricardo “Ricky” QS

 

Edit Link: https://editor.p5js.org/ricardokiza654/sketches/YPGb5siwa

For this sketch, I wanted to play and experiment with concepts like relativity and dimension control in the screen space, and to generate variables and adaptive resources that permutate/fit according to the screen size. A core concept I wanted to try was translating the mapping of the hand outside of the “hand” itself, a sort of ghost hand.

After playing with that, I mapped the tips of the middle and index fingers, creating interactivity by switching the opacity of resources in the space according to the distance between those elements. The stroke color also changes each time the hand appears/disappears.
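The distance-to-opacity interaction reduces to a small amount of math, sketched here (the 200-pixel range is a made-up tuning value, not the one from my sketch):

```javascript
// Euclidean distance between two fingertip positions
function fingerDistance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Map the distance onto a 0-255 opacity: fingers touching makes the
// resource fully transparent, and it fades in as they spread apart.
// maxDist is an arbitrary tuning value.
function distanceToOpacity(d, maxDist = 200) {
  const t = Math.min(Math.max(d / maxDist, 0), 1); // clamp to [0, 1]
  return Math.round(t * 255);
}

distanceToOpacity(fingerDistance({ x: 0, y: 0 }, { x: 0, y: 0 }));   // -> 0
distanceToOpacity(fingerDistance({ x: 0, y: 0 }, { x: 200, y: 0 })); // -> 255
```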