Group: Erman & Jing
Project 1: Birthday Filter
Project 2: Mixing Face
Software and libraries:
- Text Editor
Project 1 is based on Kyle McDonald's CV examples.
Experience 1: Kyle McDonald’s CV examples
We played with a collection of interactive examples using p5.js through the link (CV examples) Kate gave us. The examples serve as an introduction to CV and the libraries we can use, and they use p5.js to access live video. All examples are self-contained and can be run independently, so we tried all of them and studied the p5.js code.
The examples I liked most are the nose theremins and light painters, which use the body as a pointer in p5.js. Their key feature is that they let people use their body parts as pointers instead of the mouse.
(experiences of trying example code online)
Beyond the example code, I made a few changes:
To change the amplification:
input.amplification = 2;
To track other body parts, change the line "input.part = 'nose';" to the body part you want to track:
(syntax for input.part)
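PoseNet, which these examples run on, reports a fixed set of 17 keypoints, so the value assigned to input.part has to be one of those names. A small sketch of guarding the assignment against typos (the input object itself belongs to the example's code; only the keypoint names below are PoseNet's own):

```javascript
// PoseNet's 17 keypoint names; the examples accept these as
// values for input.part (check each example's source, since the
// exact property name may differ between experiments).
const POSENET_PARTS = [
  'nose', 'leftEye', 'rightEye', 'leftEar', 'rightEar',
  'leftShoulder', 'rightShoulder', 'leftElbow', 'rightElbow',
  'leftWrist', 'rightWrist', 'leftHip', 'rightHip',
  'leftKnee', 'rightKnee', 'leftAnkle', 'rightAnkle'
];

// Guard against typos before assigning a part name.
function isValidPart(name) {
  return POSENET_PARTS.includes(name);
}

// e.g. track the right wrist instead of the nose:
// if (isValidPart('rightWrist')) input.part = 'rightWrist';
```

A misspelled part name would otherwise fail silently, so checking against the list first makes experimenting with different body parts less frustrating.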
The Creatability experiments include several musical instruments. Having multiple interaction modes can make creative coding projects more expressive and engaging.
(experience with Creatability musical instruments)
Instead of using body poses as input only, I wanted the overall experiment to produce some output.
(Things need to use when building this online project)
Then we found that TensorFlow.js and Tone.js were beyond our capability; we could not find example code for triggering music online. We decided to go back to our original idea of the birthday filter.
We used Photoshop to create the images we needed for the filter. We downloaded photos of three celebrities my friend loves and added some text.
(Filter image1: birthday hat)
(Filter image2: background)
(Filter image3: boy1)
(Filter image4: boy2)
(Filter image5: boy3)
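The core of a filter like this is positioning each image relative to a tracked face. A minimal sketch of placing the birthday hat above the nose, assuming the ml5 PoseNet keypoint format { part, position: { x, y } }; the offsets and image names here are illustrative, not from our actual code:

```javascript
// Find a named keypoint in a PoseNet-style keypoints array.
function findPart(keypoints, partName) {
  return keypoints.find(k => k.part === partName) || null;
}

// Place the hat so its bottom-centre sits just above the nose.
// hatW/hatH are the hat image's dimensions in pixels.
function hatPosition(keypoints, hatW, hatH) {
  const nose = findPart(keypoints, 'nose');
  if (!nose) return null;  // no face detected this frame
  return {
    x: nose.position.x - hatW / 2,
    y: nose.position.y - hatH - 40  // 40px gap: a rough forehead height
  };
}

// In a p5.js draw() loop this would be used something like:
//   const p = hatPosition(pose.keypoints, hat.width, hat.height);
//   if (p) image(hat, p.x, p.y);
```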
Experience 2: Processing
We found two interesting sketches: Daniel Shiffman's motion detection and Abhinav Kumar's colorDrawing. Both work with Processing.
Motion Detection: This application detects motion in the camera. Motion appears in white and turns to black when the motion stops. A created object follows the motion. After seeing this application, we decided that if we made some changes so the object left a trail behind, we could draw on the screen with our motions.
We could make a few changes in the code, like changing the colour, shape, and speed of the object.
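The idea behind this kind of motion detection is frame differencing: compare each pixel of the current frame with the previous one and mark the pixels whose brightness changed. A simplified grayscale sketch of that core step, written as plain functions rather than Shiffman's actual Processing code:

```javascript
// Compare two grayscale frames (flat arrays of 0-255 values) and
// return a mask: white (255) where motion happened, black (0) elsewhere.
function motionMask(prev, curr, threshold) {
  const mask = new Array(curr.length);
  for (let i = 0; i < curr.length; i++) {
    mask[i] = Math.abs(curr[i] - prev[i]) > threshold ? 255 : 0;
  }
  return mask;
}

// The object that follows the motion can track the centroid
// (average position) of the white pixels, given the frame width w.
function motionCentroid(mask, w) {
  let sx = 0, sy = 0, n = 0;
  for (let i = 0; i < mask.length; i++) {
    if (mask[i] === 255) {
      sx += i % w;              // column of pixel i
      sy += Math.floor(i / w);  // row of pixel i
      n++;
    }
  }
  return n === 0 ? null : { x: sx / n, y: sy / n };
}
```

Drawing the object at the centroid each frame gives the "follows the motion" behaviour; keeping the old positions instead of clearing them would give the trail we wanted.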
ColorDrawing: This application basically had the feature we could not achieve with the Motion Detection app. After selecting a colour by clicking on it, it starts drawing lines with that colour and follows same-coloured objects in the view. If you click on another colour, it starts drawing with the new colour and keeps the previous line unchanged. It was hard to draw or write in sync with the camera because left and right are switched, but with some practice it could be done.
We made a few changes in the code; it was easy to change the size and shape of the tracing object.
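The colour-following behaviour can be sketched as: scan the frame for the pixel whose colour is closest to the selected one, and move the drawing point there each frame. A simplified RGB version of that search (the real sketch works on Processing's pixel array; this is an assumption about its approach, not its actual code):

```javascript
// pixels: flat [r,g,b, r,g,b, ...] array for a frame of width w;
// target: { r, g, b } selected by clicking. Returns the { x, y }
// of the pixel closest in colour to the target.
function closestColorPixel(pixels, w, target) {
  let best = -1, bestDist = Infinity;
  for (let i = 0; i < pixels.length; i += 3) {
    const dr = pixels[i] - target.r;
    const dg = pixels[i + 1] - target.g;
    const db = pixels[i + 2] - target.b;
    const d = dr * dr + dg * dg + db * db;  // squared distance suffices
    if (d < bestDist) {
      bestDist = d;
      best = i / 3;  // pixel index
    }
  }
  return best < 0 ? null : { x: best % w, y: Math.floor(best / w) };
}
```

Drawing a line from the previous result to the current one, frame after frame, produces the coloured trail that follows a same-coloured object through the view.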
We also tried to combine the two sketches, starting by customizing the motion tracking app. What we wanted was colouring with motion. We focused on motion detection and tried to modify its code; however, the two codebases did not fit together, and every attempt produced an error.
Image: One of our trials and errors. A red dot appears but does not move with motion. You can see its code here.
I imagined this tool could also be used for video calling. Just as we use emojis in our chats, we could create instant, live emojis while using our camera. We could combine features of the sketches we found: when we use the camera, our creation could follow our body parts and appear when other people or objects enter the view. You could create a mask or make-up on your face and keep it while you are on camera. Digital game design is also a possibility. There are many possibilities in CV for colour, motion, and face tracking; however, our lack of coding experience and knowledge was a drawback.
Experience 1: Kyle McDonald's CV examples - Nose theremins
- ml5.js does not depend on p5.js and you may also use it with other libraries.
- If you need to run the examples offline, you must download the p5.js and ml5.js libraries (or any other library you need).
- PoseNet on TensorFlow.js runs in the browser, no pose data ever leaves a user’s computer.
- PoseNet can estimate either a single pose or multiple poses: one version of the algorithm detects a single person in an image/video, and another version detects multiple people.
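Either way, the ml5 wrapper around PoseNet delivers an array of results, one entry per detected person, each shaped roughly like { pose: { keypoints: [...] } }. A small helper that makes the single/multi distinction explicit (the result shape is an assumption based on ml5 0.x; check the version you load):

```javascript
// Summarize a PoseNet results array: how many people were found,
// which mode that implies, and the first detected pose (if any).
function summarizePoses(results) {
  return {
    people: results.length,
    mode: results.length > 1 ? 'multi-pose' : 'single-pose',
    firstPose: results.length > 0 ? results[0].pose : null
  };
}

// In the browser this would be wired up roughly as:
//   const poseNet = ml5.poseNet(video);
//   poseNet.on('pose', results => console.log(summarizePoses(results)));
```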
- Information sources:
It would be nice to make an app or web page where people can draw pictures. Their live camera images could be used, and they could draw just with their hand or body motions. The objects around them could serve as their colour palette. Saving images and stopping and activating the brush would be necessary. Different filters could give different artistic outcomes and create different experiences. With some practice, painting with motion, image filters, and additional images could be fun for video use.