Category: Experiment 1

Experiment 1 – Body As Controller

Experiment 1: Body as Controller in 3 parts

Nooshin Mohtashami

The goal of this experiment is twofold:

  1. To explore the possibility of creating interactive virtual body art inspired by my related works research, and
  2. To use the computer’s built-in webcam and body movements to initiate interactions with the computer program, instead of the attached mouse or keyboard. Specifically, creating 2 ways to perform the action CLICK and 2 ways to perform the action SCROLL.

The tools and libraries used in this experiment are p5.js and ml5.js, which enable immediate access to pre-trained machine learning models through a web browser (and much more).

The goals were reached in 3 separate parts listed below.

Part 1: Moving Sparkles

p5.js Present: https://preview.p5js.org/nooshin/present/QoWPCen-3

p5.js Editor:  https://editor.p5js.org/nooshin/sketches/QoWPCen-3

This is a simple sketch to show how controlled movement (scroll) can be implemented using a body pose.


Fig 1 – Moving Sparkles


Fig 2 – lifting shoulder to move the sparkles

In this sketch, lifting your shoulder to your ear will move the sparkles on the screen towards the lifted shoulder side (please note that the camera view on the screen is reversed). Meaning, lifting the left shoulder to your ear will move the sparkles to the left of the screen and lifting the right shoulder will move them to the right.

This is not a very sophisticated program: the sparkles do run “out” of the screen, and if the user starts by lifting the right shoulder first, the sparkles disappear and the user can get confused. However, this was one of my first experiments, and I learned a lot about the basics of working with the p5.js and ml5.js libraries!
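For anyone curious how such a shoulder-based “scroll” can be wired up, here is a minimal sketch of the idea rather than the original code. It assumes ml5’s PoseNet keypoint names (leftShoulder, leftEar, and so on), and the 60-pixel lift threshold, the step size and the stand-in “sparkles” are placeholders to be tuned:

let video, poseNet, pose;
let offsetX = 0; // horizontal "scroll" offset of the sparkles

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
  poseNet.on('pose', results => { if (results.length > 0) pose = results[0].pose; });
}

function draw() {
  background(0);
  if (pose) {
    // a shoulder counts as "lifted" when it rises to within 60 px of the ear on the same side
    const liftLeft = pose.leftShoulder.y - pose.leftEar.y < 60;
    const liftRight = pose.rightShoulder.y - pose.rightEar.y < 60;
    if (liftLeft) offsetX -= 3;   // the camera view is mirrored, so swap the signs if needed
    if (liftRight) offsetX += 3;
  }
  // stand-in for the sparkles: a cluster of points around the current offset
  stroke(255);
  strokeWeight(3);
  for (let i = 0; i < 50; i++) {
    point(width / 2 + offsetX + random(-40, 40), height / 2 + random(-40, 40));
  }
}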


Part 2: Bowie’s Bolt

p5.js Present: https://preview.p5js.org/nooshin/present/izwLGOi_N

p5.js Editor: https://editor.p5js.org/nooshin/sketches/izwLGOi_N

This is a sketch to show how state change (click) can be implemented using a body pose.


Fig 3 Bowie’s Bolt


Fig 4 – wrist to eye to change the virtual make up colour

In this sketch, there are 2 pre-defined “virtual make ups” drawn on the user’s face using the p5.js vertex() function. The colour of the virtual make up can be changed by the user when the eye is touched, or “clicked”.

One important lesson from this experiment was about the size of the drawings relative to the user’s face. When I was programming this, I created the shape of the virtual make up based on my distance from the webcam at the time, and although the virtual make up moves on the screen with the model, it currently does not resize itself to stay proportional to the face. This is because I used absolute numbers with the x,y coordinates of the body parts to create the shape. For example, to start the drawing I used:
vertex(eyeL.x, eyeL.y-100);
vertex(eyeL.x-80, eyeL.y-100);
This draws a line from the left eye’s x-position to the left eye’s (x-80) position while keeping the y-position constant. Now if the user moves further from the webcam, the face looks smaller but the 80 and 100 remain constant, and therefore the virtual make up takes over the entire face (Fig 5). In a future version I would use dynamic values to draw the shapes, for example by calculating the distance between the ears to find the width of the face on the screen and using that as a reference for where to place the virtual make up.
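A minimal sketch of that dynamic-sizing idea (my own illustration, not the current code): the face width is measured as the ear-to-ear distance, and the hard-coded 80 and 100 become fractions of it. The 0.4 and 0.5 factors are placeholder values, and the pose object is assumed to come from ml5’s PoseNet:

// Scale the make up offsets by the face width (ear-to-ear distance)
function drawMakeup(pose) {
  const faceWidth = dist(pose.leftEar.x, pose.leftEar.y, pose.rightEar.x, pose.rightEar.y);
  const dx = faceWidth * 0.4;   // replaces the hard-coded 80
  const dy = faceWidth * 0.5;   // replaces the hard-coded 100
  const eyeL = pose.leftEye;
  beginShape();
  vertex(eyeL.x, eyeL.y - dy);
  vertex(eyeL.x - dx, eyeL.y - dy);
  // ...the remaining vertices, each expressed as a fraction of faceWidth...
  endShape(CLOSE);
}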

Fig 5 – the size of the virtual make up is not dynamic in the current sketch


Part 3: Finger Painting

p5.js Present: https://preview.p5js.org/nooshin/present/tLrxxrEjb

p5.js Editor: https://editor.p5js.org/nooshin/sketches/tLrxxrEjb

 


Fig 6 Finger paint on the screen

 

 


This is a 2-in-1 sketch where both movement (scroll) and state change (click) are implemented. I started with Steve’s Makerspace’s useful video on how to create the functionality for finger painting on the screen, and extended it to include random colours and clearing of the screen by bringing the thumb and index finger together. It was also important to create a state of “painting” vs. “not painting” so that the screen doesn’t get cleared unintentionally if the thumb and index finger accidentally come close together while the user is painting on the screen.
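The sketch below is a simplified illustration of that state logic rather than the actual code. It assumes ml5’s handpose annotations (fingertips stored at index 3 of each finger array), and the “pointing” test and the 30-pixel pinch threshold are crude placeholder heuristics:

let video, handpose;
let hand = null;
let painting = false;   // state: "painting" vs. "not painting"
let strokes = [];       // points painted so far

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  handpose = ml5.handpose(video, () => console.log('handpose ready'));
  handpose.on('predict', results => { hand = results.length > 0 ? results[0] : null; });
}

function draw() {
  background(20);
  if (hand) {
    const index = hand.annotations.indexFinger[3];   // index fingertip [x, y, z]
    const middle = hand.annotations.middleFinger[3]; // middle fingertip
    const thumb = hand.annotations.thumb[3];         // thumb tip
    const palm = hand.annotations.palmBase[0];

    // crude "pointing" check: index extended while the middle finger stays curled
    painting = dist(index[0], index[1], palm[0], palm[1]) >
               1.4 * dist(middle[0], middle[1], palm[0], palm[1]);

    if (painting) {
      strokes.push([index[0], index[1]]);
    } else if (dist(index[0], index[1], thumb[0], thumb[1]) < 30) {
      strokes = [];   // the pinch clears the canvas, but only when NOT painting
    }
  }
  stroke(255, 80, 80);
  strokeWeight(6);
  for (const [x, y] of strokes) point(x, y);
}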

Instructions are as follows:

  • First, raise your hand ✋ for the program to recognize it. A white pointer will show on your index finger, meaning it’s ready.
  • To start painting: point your index finger 👆. The white pointer will turn red. You are now ready to paint on the screen.
  • To stop painting, open your hand and show all fingers ✋. You can then move your hand to a different location, point your index finger, and paint more.
  • To clear the screen: stop painting first ✋, then touch your index finger and thumb together 👌.

 


Summary & Learnings 

I learned a lot from this experiment, including that the ml5 models seem to prefer:

  • Bright and clear rooms: body parts and movements were recognized much faster and more accurately in bright rooms and solid coloured user backgrounds than in dark rooms or with lots of colour or objects in the background.
  • Light clothing: wearing light coloured clothing worked better than dark coloured ones when working with body movement recognition.
  • No jewelry: wearing a bracelet or watch on the wrists seems to cause delay and confusion in recognizing hands/wrists and their location.
  • Slow movements: moving slowly made it easier for the computer to recognize the body and its movements (or maybe this is a programmer’s issue!).

And as I started writing this summary, I realized I could extend the project and allow the user to draw their own “virtual makeup” on the screen using their finger (Part 3), and when they are done, have the program tie the drawn shape to the user’s facial coordinates. Then, using Part 2, the user could change the colour of their virtual make up. Definitely a fun future project 🙂

Breaking The Ice

This being my first foray into the world of coding, “Breaking The Ice” is a group of 4 interactive art experiments that have, for all intents and purposes, broken the frozen layer of ice between myself and coding in the world of JavaScript. After a cautious start, I am inspired to delve deeper into the virtual world of multimedia art and interactivity.

Keeping in mind that I am a novice, I wanted the art/media I was creating to be fairly straightforward and to focus instead on the interactivity between body movements and the art, to see how movements and gestures in the physical world could alter the outcome in the virtual world. The interactive artworks all start from a single shape and respond to changes in the movements and gestures of the viewers, such that the viewers can direct the outcome of the piece.

The task at hand was to use our bodies as a controller and to find/create 2 ways to perform a “Click” function and 2 ways to perform a “Scroll” function, both otherwise arbitrary motions we take for granted and normally carry out with the mouse. In this instance, we substitute the action of a mouse “click” or “scroll” with a predefined body movement or gesture. The pieces were made using the p5.js web editor, the ml5 library and PoseNet.

 

Click #1 : Red-tangles

My first piece is a play on the word ‘Rectangle’. In this case, I have moved the background() call to setup(), which allows the shapes to draw over themselves again and again in the loop. The screen stays red until both wrists are brought together in front of the face, like a clicking action. Only then do small, white rectangles start to emerge on screen. When the wrists are separated again, the ‘red-tangles’ start to fill the screen up once more, eventually going back to an all-red canvas.
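A rough sketch of how such a wrist-to-wrist “click” can be detected with ml5’s PoseNet; the 80-pixel threshold and the rectangle sizes are assumptions, not values from the original piece:

let video, poseNet, pose;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  poseNet = ml5.poseNet(video);
  poseNet.on('pose', r => { if (r.length > 0) pose = r[0].pose; });
  background(255, 0, 0);   // background() lives in setup(), so frames accumulate
}

function draw() {
  if (!pose) return;
  noStroke();
  const wristsTogether = dist(pose.leftWrist.x, pose.leftWrist.y,
                              pose.rightWrist.x, pose.rightWrist.y) < 80;
  if (wristsTogether) {
    fill(255);           // the "click": white rectangles emerge
  } else {
    fill(255, 0, 0);     // otherwise the red-tangles slowly reclaim the canvas
  }
  rect(random(width), random(height), 12, 8);
}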


Video Demonstration: https://vimeo.com/616961584

Source Code: https://editor.p5js.org/arjundutt20/sketches/CLlRpep7g

Presentation: https://preview.p5js.org/arjundutt20/present/CLlRpep7g

 

Click #2 : Vertigo

‘Vertigo’ is intended to recreate the feeling of freefalling. Prompted by my fear of sudden vertical drops, I wanted to see if I could visualize a long, constant free-fall. In this piece, the constantly changing grayscale of the rectangles is meant to depict motion as you fall past each subsequent rectangle. The grayscale freezes if the viewer’s right wrist touches their left shoulder, an action that can be considered ‘half’ of the full brace position one assumes when about to face impact.


Video Demonstration:  https://vimeo.com/616962516

Source Code: https://editor.p5js.org/arjundutt20/sketches/skWqat5x3

Presentation: https://preview.p5js.org/arjundutt20/present/skWqat5x3

 

Scroll #1 : Hollipse

‘Hollipse’, or the hollow ellipse, was inspired by artists who can draw with both hands simultaneously. Initially, I had tied the drawing to the X/Y position of the mouse, so whenever the mouse moved on the canvas it would form a design and its mirror design would be drawn too. I later substituted the mouse input with the nose-tracking feature in PoseNet. A scroll is a lateral movement that shifts your position on the screen; in this example, I have allowed the scroll to move freely in any direction.
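A minimal sketch of the mirrored-drawing idea, assuming ml5’s PoseNet nose keypoint; the shapes here are plain ellipses rather than the actual Hollipse design:

let video, poseNet, pose;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  poseNet = ml5.poseNet(video);
  poseNet.on('pose', r => { if (r.length > 0) pose = r[0].pose; });
  background(0);
}

function draw() {
  if (!pose) return;
  noFill();
  stroke(255, 150);
  ellipse(pose.nose.x, pose.nose.y, 40, 40);           // the design follows the nose...
  ellipse(width - pose.nose.x, pose.nose.y, 40, 40);   // ...and its mirror across the vertical axis
}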


Video Demonstration: https://vimeo.com/616960983

Source Code: https://editor.p5js.org/arjundutt20/sketches/0bPTkezl1

Presentation: https://preview.p5js.org/arjundutt20/present/0bPTkezl1

 

Scroll #2 : The Blue Ball

The Blue Ball was the first piece I worked on and the one that proved the most informative. The Blue Ball is a representation of the nose on the screen in real time. Wherever you move your face, the ball will follow (albeit mirrored). The closer you bring your face to the camera, the bigger the ball becomes. If you take a long, sweeping step back, the ball becomes smaller. This back-and-forth motion relative to the placement of the screen is how I wanted to relate it to the ‘scroll’ function.
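One way to approximate “how close the face is” in code is to use the eye-to-eye distance, which grows as the face approaches the camera. The sketch below illustrates that approach, although the original piece may derive the ball size differently; the map() ranges are assumptions:

let video, poseNet, pose;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  poseNet = ml5.poseNet(video);
  poseNet.on('pose', r => { if (r.length > 0) pose = r[0].pose; });
}

function draw() {
  background(230);
  if (!pose) return;
  // the eye-to-eye distance grows as the face gets closer to the camera
  const eyeDist = dist(pose.leftEye.x, pose.leftEye.y, pose.rightEye.x, pose.rightEye.y);
  const size = map(eyeDist, 20, 200, 10, 300, true);   // clamp to a sensible range
  fill(0, 0, 255);
  noStroke();
  ellipse(pose.nose.x, pose.nose.y, size, size);
}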


Video Demonstration: https://vimeo.com/616962613

Source Code: https://editor.p5js.org/arjundutt20/sketches/63bL8aicu

Presentation: https://preview.p5js.org/arjundutt20/present/63bL8aicu

 

Observations and Reflections:

I was very apprehensive at the start because this was the first time I’d be dabbling in coding. I found the syntax particularly difficult to wrap my head around. The more I watched videos, practiced, and asked questions of my classmates, the more familiar it became. I am definitely going to keep tinkering with p5.js and PoseNet; it opens the door to interactivity between the physical and virtual worlds.

Can’t believe I am using ML!

I am still a newbie with p5.js. Last year, in my leisure time, I had tried to make a simple platform game with keypress actions to move a character around the screen. So when it came to applying an ML library for ‘body as controller’, there was more curiosity than fear of using a new library or attempting something new. It was delightfully easy to start with ml5 and PoseNet. My only foray into using AI/ML until now was building a perceptron program, coding along with Daniel Shiffman’s video.

I used code shared in the Coding Train videos and on learn.ml5js.org and modified it to try out new ways to click and scroll. I didn’t want to set the benchmark too high for myself and fall into the trap of chasing it. So I thought up a simple experiment of making (reusing from the tutorial video) bubbles float around and using body gestures to click for the first exercise. I am not yet touching the confidence scores or fine-tuning the positions of the pose keypoints.

I did learn about and use the lerp() function, and liked how it smooths the movement of the pose keypoints by taking the midway point between the current position and the new prediction. It makes the jumps a little less jittery. I also liked capturing click events using the distance function from Dan’s videos. It was strikingly effective, and I couldn’t believe how quickly it solved the problem of checking whether both wrist keypoints are inside the bubble.
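A small illustration of that smoothing-plus-click pattern (my own sketch, assuming an ml5 PoseNet pose object and a hypothetical bubble object):

let smoothWrist = { x: 0, y: 0 };          // smoothed left-wrist position
let bubble = { x: 320, y: 200, r: 50 };    // a hypothetical bubble

function updateWrist(pose) {
  // move halfway towards the new prediction to damp the jitter
  smoothWrist.x = lerp(smoothWrist.x, pose.leftWrist.x, 0.5);
  smoothWrist.y = lerp(smoothWrist.y, pose.leftWrist.y, 0.5);

  // the "click": the (smoothed) wrist is inside the bubble
  return dist(smoothWrist.x, smoothWrist.y, bubble.x, bubble.y) < bubble.r;
}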

 

‘X’


The action is to cross your wrists to make an ‘X’ pose. When the two keypoints of the left and right wrists intersect any of the floating bubbles, the translucent bubble changes colour to white, indicating the CLICK. The code checks the distance between each wrist circle’s centre and the bubble’s centre and qualifies it as a CLICK event if that distance is within the bubble’s radius.

So with only one wrist, the program doesn’t consider it a CLICK event until the second wrist is also brought into the area engulfed by the bubble.

Present View –

https://preview.p5js.org/rewritablehere/present/gc2XD3M0s

Code View –

https://editor.p5js.org/rewritablehere/sketches/gc2XD3M0s

Video –

https://ocadu.techsmithrelay.com/au31

 

Swoop


The action is to rotate your forearm around the elbow, as if you are turning a wheel by holding its edge. The right wrist keypoint defines an angle, which is read using the arctangent and translated into a SCROLL value; the SCROLL value can be mapped to the rotation angle described. I attempted to use both the wrist and elbow points for accuracy but couldn’t employ that in the code.

The gesture is similar to browsing a carousel, where you repeatedly rotate the carousel to bring the next horse towards you. I got the chance to use the push() and pop() functions here to draw the rectangle, and used translate() and rotate() for positioning and orienting it. The translate() call puts the origin at the centre of the screen and makes the angle detection possible. The challenge was working with trigonometry and figuring out whether the arguments should be given in degrees or radians.
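A stripped-down sketch of that angle-to-scroll idea, assuming ml5’s PoseNet rightWrist keypoint. Working in DEGREES and mapping one full turn onto a 0–100 scroll range are choices made here for illustration, not necessarily those of the original:

let video, poseNet, pose;
let scrollValue = 0;

function setup() {
  createCanvas(640, 480);
  angleMode(DEGREES);   // work in degrees to avoid the radians/degrees confusion
  video = createCapture(VIDEO);
  video.hide();
  poseNet = ml5.poseNet(video);
  poseNet.on('pose', r => { if (r.length > 0) pose = r[0].pose; });
}

function draw() {
  background(240);
  if (!pose) return;
  // angle of the right wrist, measured from the centre of the canvas
  const angle = atan2(pose.rightWrist.y - height / 2, pose.rightWrist.x - width / 2);
  scrollValue = map(angle, -180, 180, 0, 100);   // one full turn sweeps the scroll range

  fill(0);
  noStroke();
  text('scroll: ' + round(scrollValue), 10, 20);

  push();
  translate(width / 2, height / 2);   // origin at the centre makes the angle meaningful
  rotate(angle);
  rectMode(CENTER);
  noFill();
  stroke(0);
  rect(0, 0, 200, 20);                // the rectangle follows the wrist's rotation
  pop();
}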

Present View –

https://preview.p5js.org/rewritablehere/present/kDjF5qOz5

Code View –

https://editor.p5js.org/rewritablehere/sketches/kDjF5qOz5

Video –

https://ocadu.techsmithrelay.com/ozfj

 

Just because you have ears 😛


A simple gesture of tapping your ear to your shoulder to register a click has been tried with the Blazeface model/detector from Google’s TensorFlow library. A boolean flag is used to capture the click event, showing the eyes on the “MOUSEDOWN” event and making them disappear on the “MOUSEUP” event.

Present View –

https://preview.p5js.org/rewritablehere/present/vx1XAY-ws

Code View –

https://editor.p5js.org/rewritablehere/sketches/vx1XAY-ws

Video –

https://ocadu.techsmithrelay.com/SLr2

 

Reflection:

This experiment has given me some confidence in using computer vision. I can now think about all the possibilities of creating simple experiments around body gestures and interactions that we embody on a regular basis, and what is possible in using AI to classify them and make inferences. It made me think about using this to give commands to a digital assistant; I could just signal to Alexa to change a song by winding my hand, or something along those lines.

 

References and code reused from:

ml5.js: https://ml5js.org/

The Coding Train: https://www.youtube.com/user/shiffman

Multiple hands detection  for p5js coders. https://www.youtube.com/watch?v=3yqANLRWGLo. Accessed 8 Sept 2021.

Blazeface: https://arxiv.org/abs/1907.05047 

Play Along

The series focuses on the user’s facial and body movements through machine learning. The key idea is to interact with the user with the help of sound and hand gestures, to make regular movements more fun. The experiments have been designed with the ‘Click’ and ‘Scroll’ movements in mind. While everybody has their own interpretation of these actions, the ‘Play Along’ experiment has been designed to help the user interact with the computer through body postures and gestures without the use of a mouse/keypad. I have tried to combine actions with playful movements and a touch of music.

Experiment 1: Coloured Claps

With every clap the colour of the ellipse changes


 

 

The idea behind the experiment is to change the colour of the ellipse with every clap. When the user brings his/her hands together in a ‘clap’ action, the ellipse switches colours. Once the user stands or sits in front of the webcam, the system recognises the wrists, and when both wrists come together it generates an output of changing colours.

 

Links:

Present link: https://preview.p5js.org/preetimahajan11/present/qlorcGTkF

Edit link: https://editor.p5js.org/preetimahajan11/sketches/qlorcGTkF

Video link: https://ocadu.techsmithrelay.com/2deV

 

Experiment 2:Move it

Pushing the text by lifting your right hand


 

 

This experiment focuses on the scroll gesture. The user can interact by raising his/her right hand and pushing the text. The concept is that the text moves to the end and then bounces back to its initial position. This idea can be adapted for several ‘swipe’ features.

Learnings: PoseNet was identifying my left wrist as well, and the text was bouncing back before reaching the end of the x-axis.
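One common workaround for such stray keypoints is to filter by the confidence score PoseNet reports for each part; a small hedged example (the 0.3 threshold is arbitrary):

// Only accept the right wrist when PoseNet is reasonably sure about it.
function getRightWrist(pose) {
  const wrist = pose.rightWrist;        // ml5 PoseNet exposes x, y and confidence
  if (wrist.confidence > 0.3) {
    return wrist;
  }
  return null;                          // too uncertain: ignore this frame
}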

 

 

Present link: https://preview.p5js.org/preetimahajan11/present/9GYuaqm6E

Edit link: https://editor.p5js.org/preetimahajan11/sketches/9GYuaqm6E

Video link: https://ocadu.techsmithrelay.com/AJyT

 

Experiment 3: Pause and Play

Using Nose press to play and pause music

Free music: https://freemusicarchive.org/genre/Instrumental


 

With a defined ‘y’ position on the screen, the user can play and pause the music with their nose by moving their head up and down past it.

Learnings: I faced an issue with the music loading slowly even though the code asked it to ‘preload’. I made a slight change to my command and used poseNet = ml5.poseNet(video, modelLoaded); instead, so the sketch is told when the model has finished loading.
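For reference, a minimal sketch of loading a track in preload() (using the p5.sound library) alongside a PoseNet modelLoaded callback. The file name, the fixed line at half the canvas height and the play/pause rule are placeholders rather than the project’s exact logic (and browsers may require a user interaction before audio can start):

let song, video, poseNet, pose;

function preload() {
  // loadSound() inside preload() makes setup() wait until the file is ready
  song = loadSound('instrumental.mp3');   // placeholder file name
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  poseNet = ml5.poseNet(video, modelLoaded);
  poseNet.on('pose', r => { if (r.length > 0) pose = r[0].pose; });
}

function modelLoaded() {
  console.log('PoseNet ready');
}

function draw() {
  background(220);
  if (!pose) return;
  // toggle playback depending on which side of a fixed y line the nose is
  if (pose.nose.y < height / 2 && !song.isPlaying()) song.play();
  if (pose.nose.y >= height / 2 && song.isPlaying()) song.pause();
}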

 

Present link:https://preview.p5js.org/preetimahajan11/present/20MolhoPN

Edit link: https://editor.p5js.org/preetimahajan11/sketches/20MolhoPN

Video link: https://ocadu.techsmithrelay.com/ObFh

 

Experiment 4: Laser eyes

With the user’s eyes as the source, the laser beams move with the user’s eye movement.


 

 

This fun-filled experiment works in coordination with the user’s eye movement. The laser tracks the movement of the eye and moves along.

Glitches: I intended to make the lines squiggly, but they were not capturing the x and y coordinates properly.

 

 

Present link: https://preview.p5js.org/preetimahajan11/present/tfO6EaVtp

Edit link: https://editor.p5js.org/preetimahajan11/sketches/tfO6EaVtp

Video link: https://ocadu.techsmithrelay.com/W6ru

 

Bibliography

https://www.youtube.com/watch?v=bkGf4fEHKak

https://p5js.org/examples/

https://www.youtube.com/watch?v=FYgYyq-xqAw

https://www.youtube.com/watch?v=ISkrBJ9YqCs

https://www.youtube.com/watch?v=LO3Awjn_gyU

https://github.com/tensorflow/tfjs-models/tree/master/posenet

 

To Whom I Love

Project Name: To Whom I Love (Click)


Project description:

The project is another “click” experiment. As I learn more about movement recognition, I am starting to realize there is a lot more I can teach the computer. Therefore, I started to teach it more of my gestures. The project, To Whom I Love, is built on a movement-recognition model from Teachable Machine. When I put my hands on my chest, referring to myself, the word “I” pops up, and when I make the “hand heart” gesture, a heart emoji pops up. Finally, when I point at the camera, “You” appears on the screen.
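For readers who want to try the same setup, this is roughly the usual pattern for running a Teachable Machine image model through ml5’s imageClassifier; the model URL is a placeholder, and the labels depend entirely on what you trained:

let classifier, video;
let label = 'waiting...';
// Placeholder URL: replace with the link exported from Teachable Machine
const modelURL = 'https://teachablemachine.withgoogle.com/models/XXXXXXX/';

function preload() {
  classifier = ml5.imageClassifier(modelURL + 'model.json');
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  classifyVideo();
}

function classifyVideo() {
  classifier.classify(video, gotResult);   // classify the current video frame
}

function gotResult(error, results) {
  if (error) { console.error(error); return; }
  label = results[0].label;                // e.g. "I", "heart" or "You"
  classifyVideo();                         // keep classifying, frame after frame
}

function draw() {
  image(video, 0, 0, width, height);
  fill(255);
  textSize(48);
  textAlign(CENTER);
  text(label, width / 2, height - 20);
}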

Project Link:

Edit: https://editor.p5js.org/YoungYoungYoung/sketches/IwRW3tPyL

Full screen: https://editor.p5js.org/YoungYoungYoung/full/IwRW3tPyL

Hello! I’m Young.

Project Name: Hello! I’m Young. (Click)


Project description:

The project is a simple start to the “click” experiment. I learned that computers can do movement recognition through machine learning. Specifically, Teachable Machine is a great platform on which you can teach the computer to recognize your voice and movements. The project, Hello! I’m Young, is built on a movement-recognition model from Teachable Machine. When I raise my arm, indicating that I’m saying hello, the banner pops up. And when I put my hands down, the banner disappears.

Project Link:

Edit: https://editor.p5js.org/YoungYoungYoung/sketches/X_oEEzjn_

Full screen: https://editor.p5js.org/YoungYoungYoung/full/X_oEEzjn_

Delightful Bits


It is surprisingly beautiful how binary digits can be transformed into pleasurable experiences. Watching colors blend, listening to music, and seeing animated visualizations make us feel happy. But I believe there is something missing: the emotional connection in the way we interact to produce them. Simply clicking a mouse or scrolling on a trackpad lacks the connection that our intent and the feedback deserve. If we add natural ways to control these reactions using our body movements, they become a delightful experience.

In this experiment, I’ve worked on building such “Delightful Bits”, using computer vision to control different aspects of media. There are four interactive studies, developed with p5.js and ml5.js for body and hand tracking. While working on the ideas, I realized that the simplicity of the tools plays a huge role in providing a much larger scope for creativity. Simpler tools like p5.js and ml5.js offer less precision, but they are a great way of prototyping interaction techniques. It was exciting to be able to quickly combine my interests in different fields, such as maths, graphic design, musical instruments, and visualizations, using these libraries in this series of experiments.

Studies


Present(p5.js) | Edit(p5.js) | Video(demo)

Illumination Pinch is inspired by the generative art techniques used by Etienne Jacob to create GIFs. Most of his artworks are illusions that represent dynamic systems. In this experiment, the visualization is nothing but a normal distribution. Users can move their thumb and index fingers across the screen, and when a pinch action is performed, a circular grid of circles lights up at that point and ripples away radially. As the lights on the grid ripple away, they change their size and opacity to give the desired illusion of the circles actually moving outwards.

_____________


Present(p5.js) | Edit(p5.js) | Video(demo)

Current design trends are geared towards layered combinations of radial and/or linear gradients. It’s difficult to make gradients quickly; I always looked for tools but ended up using background images from a Google search. This experiment tries to address that challenge. The interaction is also inspired by how artists create abstract art by splashing paint onto a canvas.

In this study, I have worked on a way to generate new gradients by splashing colors onto the screen. For this study with p5.js, I’ve used handPose (ml5.js) to track one hand’s movements. When a user spreads their fingers (a splash onto the screen), a new color is added to the canvas in the form of a layered radial gradient that blends outwards.
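A hedged sketch of how one such splash could be drawn as a layered radial gradient, using the raw canvas context that p5.js exposes as drawingContext; the colours and radii are arbitrary, and the original study may build its gradients differently:

// Draw one splash: a radial gradient centred on (x, y) that fades to transparent.
function splash(x, y) {
  const inner = color(random(255), random(255), random(255), 180);
  const outer = color(random(255), random(255), random(255), 0);   // fade out at the edge
  const g = drawingContext.createRadialGradient(x, y, 10, x, y, 250);
  g.addColorStop(0, inner.toString());
  g.addColorStop(1, outer.toString());
  drawingContext.fillStyle = g;   // the next shape is filled with the gradient
  noStroke();
  circle(x, y, 500);              // the gradient blends outwards inside this circle
}

Calling splash() each time the open-hand gesture is detected would stack the gradients into a layered background.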

_____________


Present(p5.js) | Edit(p5.js) | Video(demo)

This experiment aims to control the song visualizations we usually find in media players like VLC and Windows Media Player. The mechanics are based on sine wave structures and are inspired by a “Colorful Coding” video on YouTube I watched a while ago.

In this setup, the visualization reacts to the user’s wrists. Using ml5.js (PoseNet), a scroll is detected when the user quickly moves both arms together to the left or right. The rotation speed of the visualization depends on the speed of the hand movement. Users can also move their hands up and down, during which the illustration loops through the sine wave propagation, providing spring-like feedback.

_____________


Present(p5.js) | Edit(p5.js) | Video(demo)

Everybody loves to strum when they are handed a guitar, but they often have no idea about the other part of playing. Learning to switch chords is definitely a pretty steep curve; it involves months of practice. This study is to help myself and all enthusiastic guitar learners experience playing a guitar and switching chords easily.

Using PoseNet, the tool tracks the user’s wrists. Currently, it allows users to strum the guitar with one hand and choose among the 4 most commonly used chords (A, G, C, Em) with the other hand.

_____________

References

Etienne Jacob | Personal Website | 25 Sept. 2021 | https://bleuje.github.io/

Sound files for guitar chords | Autochords | 25 Sept. 2021 | https://autochords.com/

copines.mp3 | Aya Nakamura | 23 Sept. 2021 | https://www.youtube.com/watch?v=EkGiGf8utCM

Sine wave structures in p5.js | Coding Project #1 | https://www.youtube.com/watch?v=vmhRlDyPHMQ

 

Experiment 1 – Social Animal

“Man is by nature a social animal” – Aristotle

Project description – My project centers around the theme of social networks and how individuals are constantly subject to the influence of close affinities. The relationship between the movement of body parts and abstract forms is a miniature of these interactions on a greater scale. I started my first studies from an initial idea responding to a casual (or maybe not so casual) comment from my close family. Despite knowing the inherent rightness or wrongness of the comment and the underlying societal values, I still sometimes find it hard to reconcile, especially when it comes from people we are very close to. This intertwining relationship resembled the series of connected actions I went through while learning and compiling p5.js, which led to the following four experiments.

Overall successes/failures & what I discovered in the process – The process of learning itself is a big success for me, via actual experimentation with code and hands-on research. It made me realize how much I don’t know and have yet to learn. I encountered many mini-frustrations in the process when I was unable to achieve certain tasks, but it’s rewarding when I can finally map a way to a solution. For the final delivery, the smoothness of the real-time interaction and feedback loop is yet to be optimized, partly because of the slight imprecision of the tracking tools. More importantly, I shall try to make my code more efficient and take fewer computational steps in the future.


Project details – 4 studies:

  • Judgement Machine: The idea for this study came from a recent comment from my grandma, who kindly suggested that I have plastic surgery on my lips because she thinks they are too thick and ultimately the cause of my being single. Reflecting upon the underlying beauty standards pursued by society across time and region, I wanted to create the experience of being looked at and judged while staring into a mirror/reflection. One can move around and try to fit different parts of the face into the boxes, which triggers a series of super-subjective judgements based purely on facial appearance.

Judgement machine

Link to present: https://preview.p5js.org/huang42y/present/4J6waHn_g

Link to edit: https://editor.p5js.org/huang42y/sketches/4J6waHn_g

Video: https://youtu.be/u3thWXjoBR8


  • Eye Puppet: Extending the last study’s face-tracking technique and “clicking”/state-change feature, in this study I explored more of face tracking and how to apply relational factors (speed) to abstract shapes. I kept the concept from the previous one: the eyelashes symbolize the invisible strings attached by other social networks, which to me represent my relationships with the people who are close to me. No matter how I move, and despite small changes in my surroundings, I still can’t get rid of the attached lines, just as I can’t be fully unaffected by the influence of my intimate relationships, unless I quit the game.

Eyelash Puppet

Link to present: https://preview.p5js.org/huang42y/present/B5usEQXAl

Link to edit: https://editor.p5js.org/huang42y/sketches/B5usEQXAl

Video: https://youtu.be/_xOQTGwRZYY


  • How many pigeons are watching you?: Again extending the notion of watching/being watched, I experimented with PoseNet to create this study, “How many pigeons are watching you?” With trained data from sets of poses, generally speaking, the more complicated/pretentious/phony the pose is (according to my personal judgement), the greater the number of pigeons that show up. This set of pigeon drawings comes from my early doodles of a few years ago.


Link to present: https://preview.p5js.org/huang42y/present/ICCPKgD-g

Link to edit: https://editor.p5js.org/huang42y/sketches/ICCPKgD-g

Video: https://youtu.be/wdQ1mMsaQwc


  • Followers: A bit more abstract, this last study creates a movement that echoes the dynamic of big influencers on social media over specific generations, like teens. The white balls represent the mass population and the colorful finger stands for KOLs/internet-famous people. Using a hand-tracking tool, the white balls follow the index finger with some acceleration. Recalling my first study, this last study can be seen as a potential cause of how appearance anxiety is generated and permeates different strata of society.

Following around

Link to present: https://preview.p5js.org/huang42y/present/fui2Wk2t0

Link to edit: https://editor.p5js.org/huang42y/sketches/fui2Wk2t0

Video: https://youtu.be/KaQe0fqmWkw


The “Hopeful Day” Project

This past year, a lot has changed due to Covid-19. The “Hopeful Day” project was created with lockdowns and quarantine in mind. The series of small programs I created can help people de-stress and relieve boredom. Because we cannot attend a yoga class or visit a museum during a lockdown, I created interactive sketches which allow the user to do yoga with another (cartoon) human, as well as browse the paintings of an art exhibition. With the future being so uncertain, I created the “Hopeful Thinking” sketch which allows the user to hide their stress stemming from different times of the day. Finally, I created the “Hopeful Sky” sketch because of the saying “after every storm is a rainbow,” to refer to a better tomorrow and encourage everyone to stay hopeful.

Overall, I’m glad that the project was able to convey my central message. However, being very new to coding, I encountered a lot of different issues, some as small as a forgotten “)” and some that couldn’t be solved even after 100+ trial-and-error tests. For these, I had to be mindful of the deadline and substitute certain elements with an easier solution. Because of this, I couldn’t explore more ways to use poseNet and interact using more parts of the body. I was surprised to find that coding for different body parts/actions was very different. I discovered that there’s no “right” way to do things, and the best way to learn is by trying things out.

Sketch 1: “Hopeful Thinking” 


Present link: https://preview.p5js.org/kxu/present/sLAb_x32a

Edit link: https://editor.p5js.org/kxu/sketches/sLAb_x32a

Video: 


VIDEO LINK

Description:

This interaction, titled “Hopeful Thinking”, is a sketch that allows users to hide their stress away no matter what time of day the anxiety comes to attack. Users can move the word “stress” around using their nose, and the text will slowly fade to the colour corresponding to the background until it’s completely hidden. It utilizes the coordinates of the nose as a fun little way to move things around on the screen. This study demonstrates the use of interaction and machine learning in creating satisfaction and solving small problems.

Sketch 2: “Hopeful Dreams” 


Present link: https://preview.p5js.org/kxu/present/4Ug1iGdvR

Edit Link: https://editor.p5js.org/kxu/sketches/4Ug1iGdvR

Video: 


VIDEO LINK

Description:

This sketch, “Hopeful Dreams”, allows users to visit an art exhibition and interact with the display through their computer. The interaction takes the audience through a series of peaceful paintings titled “Blue Dream” while playing music in the background. It is a new way of showcasing one’s works or portfolio. Besides helping convey the overall message of my project, I conducted this study to explore how to display artwork or a portfolio in a more interesting manner. Perhaps we can utilize this interactive technology to make a better first impression when submitting portfolios or resumes.

Sketch 3: “Hopeful Spirit” 


Present link: https://preview.p5js.org/kxu/present/_R1-IXLHP

Edit link: https://editor.p5js.org/kxu/sketches/_R1-IXLHP

Video:


VIDEO LINK

Description:

Part 3 of this series of sketches is called “Hopeful Spirit” and allows the audience to de-stress by doing yoga alongside another being. When users bring their hands above their head and hold them there, the figure in the sketch does the same. This gives the user a sense of exercising with other people at a time when going to the gym or a class may be unsafe. Although this example is very simple right now, my intention is to expand it to other activities, like walking, so that the user can exercise alongside other “people” and everyone can interactively exercise in their homes.

Sketch 4: “Hopeful Sky” 


Present link: https://preview.p5js.org/kxu/present/xkBGNHyDv

Edit link: https://editor.p5js.org/kxu/sketches/xkBGNHyDv

Video:


VIDEO LINK

Description:

The last part of my study is a small interactive piece where the weather changes when the user brings their hands together. I wanted to conclude the study in a hopeful way to encourage everyone to look forward to the future. Because this piece was the first one I did (and my first ever coding experience), it has gone through 4 days of coding frustration, getting stuck on hundreds of error messages, and many transformations before it became a finished product. Although it is far from my original intentions, I’ve learned so much from testing out various theories and mapping out possible solutions.

Purple-iiiiiish~

Project Name: Purple-iiiiiish


Project description:

My personal colour preference is purple, and I think that with the black background I can make a visually pleasant art project. Therefore, I set the red and blue colour values to random, so the colour varies. I also implemented the ml5 PoseNet library for movement tracking. What the program does is this: when you move around, the square automatically follows your movement, leaving a trace. Furthermore, when you get closer to the screen, the figure becomes more circular, eventually a circle, and when you move away from the screen, the figure squares off, eventually becoming a rectangle. In addition to movement tracking, I also learned the map() function, which is used when you want to rescale a value from one range to another.
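To illustrate the map() idea, here is a small hedged sketch (not the original code) that turns the eye-to-eye distance, a rough stand-in for how close you are to the camera, into the corner radius of the square; the range values are assumptions:

function drawFollower(pose) {
  const eyeDist = dist(pose.leftEye.x, pose.leftEye.y, pose.rightEye.x, pose.rightEye.y);
  // far away -> small eyeDist -> corner radius 0 (a sharp square)
  // close up -> large eyeDist -> corner radius 50 (a 100x100 square becomes a circle)
  const corner = map(eyeDist, 20, 200, 0, 50, true);
  noStroke();
  fill(random(255), 0, random(255), 150);   // purple-ish: random red and blue channels
  rect(pose.nose.x, pose.nose.y, 100, 100, corner);
}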

Project Link:

Edit: https://editor.p5js.org/YoungYoungYoung/sketches/-lzxUgQtB

Full screen: https://editor.p5js.org/YoungYoungYoung/full/-lzxUgQtB
