Author Archive

Experiment 3: To Whom I Love (Version 2)

img_1109

Project Ideation:

Everybody wants to love, and everybody wants to be loved. The topic of human affect and emotion has always interested me. As a psychology-major student, I have been keen to find the most suitable color for love, and even the most suitable shape for it. Everyone has their own opinion on such a topic. In my opinion, love should not be defined by a single color; instead, it should be a mix of varied colors. Sometimes it is darker, indicating that you may be struggling in a relationship; sometimes it is bright, indicating that you are sharing sweet moments with someone; and sometimes the color is neutral, meaning the feeling is bittersweet. In general, you can think of my project as a color palette of love.

Project Description:

In Experiment 1, I made an interactive project, To Whom I Love Version 1, with basically the same layout as in Experiment 3. The problem with Version 1 is that the color was fixed as purple with a fixed transparency. In Version 2, I therefore use analog rotary potentiometers to adjust those values, so users can turn the knobs and search for the color that best fits love. I use three potentiometers in total: one switches the color of the heart in the middle, one adjusts the size of the background circles, and the third controls the transparency of those circles. With dots of varied color and transparency layered over one another, I perceive different combinations of color.
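In the actual build, an Arduino reads the three potentiometers and sends the values to Processing over serial (see the GitHub link below). The p5.js sketch that follows is only a minimal, hypothetical illustration of the mapping logic: three on-screen sliders stand in for the 0–1023 potentiometer readings, and the ranges they are mapped to are assumptions, not the values from the real project.

```javascript
// Minimal p5.js sketch of the three-control mapping.
// Slider values stand in for the three potentiometer readings (0-1023).
let hueSlider, sizeSlider, alphaSlider;

function setup() {
  createCanvas(400, 400);
  colorMode(HSB, 360, 100, 100, 255);
  hueSlider = createSlider(0, 1023, 800);   // "pot 1": heart color
  sizeSlider = createSlider(0, 1023, 512);  // "pot 2": background circle size
  alphaSlider = createSlider(0, 1023, 512); // "pot 3": circle transparency
}

function draw() {
  background(0);

  // Map each 0-1023 reading onto the range it controls (ranges are assumed).
  const heartHue = map(hueSlider.value(), 0, 1023, 0, 360);
  const dotSize = map(sizeSlider.value(), 0, 1023, 5, 60);
  const dotAlpha = map(alphaSlider.value(), 0, 1023, 10, 255);

  // Background "palette": translucent circles layered over one another.
  noStroke();
  randomSeed(7); // keep the dot layout stable while the knobs change
  for (let i = 0; i < 150; i++) {
    fill(random(360), 60, 90, dotAlpha);
    circle(random(width), random(height), dotSize);
  }

  // Heart in the middle, drawn from the MathWorld heart curve.
  fill(heartHue, 80, 100);
  beginShape();
  for (let t = 0; t < TWO_PI; t += 0.05) {
    const x = 16 * pow(sin(t), 3);
    const y = 13 * cos(t) - 5 * cos(2 * t) - 2 * cos(3 * t) - cos(4 * t);
    vertex(width / 2 + x * 8, height / 2 - y * 8);
  }
  endShape(CLOSE);
}
```

Turning the three on-screen sliders here plays the same role as turning the three physical knobs in the installation.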

Development Images:

20431637050219_-pic    20441637050270_-pic

Final Work Images:

img_1107

20451637050580_-pic

Videos & Github:

Github: To Whom I Love Version 2

How does it work?

Experience Video

Circuit Diagram:

20461637051624_-pic_hd

References:

  1. Heart curve in Java (Wolfram MathWorld): https://mathworld.wolfram.com/HeartCurve.html
  2. Good practice for sending data from Arduino to Processing: https://discourse.processing.org/t/good-practice-for-sending-data-from-arduino-to-processing/6339
  3. Nick & Kate's Digital Futures GitHub: https://github.com/DigitalFuturesOCADU/CC2021/tree/main/Experiment3/codeExamples/ArduinoToProcessing/CSV/IMUinput/PitchandRoll/Arduino/pitchandRoll_Send_CSV

Experiment 2: George, the Zombie Lamb (Halloween)

With Halloween getting closer, I decided to start a project named George, the Zombie Lamb. The project has one input and two outputs: the input is a motion sensor, and the outputs are an LED and a servo. When the motion sensor detects movement, the servo gets to work and the LED automatically turns on, so it looks like the lamb is turning its head around.
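The real build runs as an Arduino sketch (linked under "Code on Github" below). Purely as an illustration of the same one-input, two-output logic, here is a minimal sketch using the Johnny-Five JavaScript library instead; the pin numbers and servo angles are assumptions, not the ones used in the actual circuit.

```javascript
// Minimal Johnny-Five sketch of the same logic: motion in -> LED + servo out.
// Pin numbers and angles are assumptions, not the real build's values.
const five = require("johnny-five");
const board = new five.Board();

board.on("ready", () => {
  const motion = new five.Motion(7); // PIR-style motion sensor on pin 7
  const led = new five.Led(13);      // LED on pin 13
  const servo = new five.Servo(9);   // servo signal on pin 9

  motion.on("motionstart", () => {
    // Movement detected: light up and turn the lamb's head.
    led.on();
    servo.to(150);
  });

  motion.on("motionend", () => {
    // No more movement: turn off and return to the resting position.
    led.off();
    servo.to(30);
  });
});
```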

This project uses one principle of Calm Technology:

Technology should make use of the periphery: the toy is easy to play with, and users do not need to waste energy figuring out how it works. All they need to do is move their fingers in front of the motion sensor, and, correspondingly, the toy moves.

Experience Video | How does it work?

Development Images:

img_0940 img_0941

Final Work Images:

img_0944 img_0945

Circuit Diagram:

editing-components

(As I could not find an icon representing a motion sensor, I used a temperature sensor in the diagram instead.)

Code on Github:

https://github.com/YoungYoungYoung10/YoungYoungYoung10/commit/01297b8a7e0ea018e45daa452db9e8b5a15d6968

 

To Whom I Love

Project Name: To Whom I Love (Click)

screenshot_2021-09-28_6-21-07-pm screenshot_2021-09-28_6-20-16-pm screenshot_2021-09-28_6-21-52-pm

Project description:

The project is another “click” experiment. As I learned more about movement recognition, I started to realize there is a lot more I can teach the computer, so I taught it more of my gestures. The project, To Whom I Love, is built on a movement-recognition model trained with Teachable Machine. When I put my hands on my chest, referring to myself, the word “I” pops up; when I make the “hand heart” gesture, a heart emoji pops up; and finally, when I point at the camera, “You” appears on the screen.
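Below is a minimal p5.js sketch of how a Teachable Machine classifier can drive the text on screen. It assumes the model was exported as a Teachable Machine image project and loaded through ml5.js (0.x); the model URL and class labels are placeholders, not the project's actual values.

```javascript
// Minimal p5.js + ml5.js (0.x) sketch: show "I", a heart, or "You" depending
// on which Teachable Machine class the webcam image is recognized as.
// The model URL and class labels below are placeholders.
let classifier;
let video;
let label = "waiting...";

function preload() {
  // Exported Teachable Machine image model (placeholder URL).
  classifier = ml5.imageClassifier("https://teachablemachine.withgoogle.com/models/MODEL_ID/model.json");
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  classifyVideo();
}

function classifyVideo() {
  classifier.classify(video, gotResult);
}

function gotResult(error, results) {
  if (error) {
    console.error(error);
    return;
  }
  label = results[0].label; // most confident class
  classifyVideo();          // keep classifying, frame after frame
}

function draw() {
  image(video, 0, 0);
  textAlign(CENTER, CENTER);
  textSize(96);
  fill(255);
  // Map the recognized gesture to what appears on screen.
  if (label === "hands on chest") text("I", width / 2, height / 2);
  else if (label === "hand heart") text("❤", width / 2, height / 2);
  else if (label === "point at camera") text("You", width / 2, height / 2);
}
```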

Project Link:

Edit: https://editor.p5js.org/YoungYoungYoung/sketches/IwRW3tPyL

Full screen: https://editor.p5js.org/YoungYoungYoung/full/IwRW3tPyL

Hello! I’m Young.

Project Name: Hello! I’m Young. (Click)

screenshot_2021-09-28_6-16-17-pm screenshot_2021-09-28_6-15-54-pm

Project description:

The project is a simple start to the “click” experiment. I learned that computers can do movement recognition through machine learning. Specifically, Teachable Machine is a great platform on which you can teach the computer to recognize your voice and movements. The project, Hello! I’m Young, is built on a movement-recognition model trained with Teachable Machine. When I raise my arm, indicating that I am saying hello, the banner pops up; when I put my hands down, the banner disappears.

Project Link:

Edit: https://editor.p5js.org/YoungYoungYoung/sketches/X_oEEzjn_

Full screen: https://editor.p5js.org/YoungYoungYoung/full/X_oEEzjn_

Purple-iiiiiish~

Project Name: Purple-iiiiiish

screenshot_2021-09-28_6-24-55-pm

Project description:

My personal color preference is purple, and I think that against a black background it can make a visually pleasing piece. Therefore, I set the red and blue channels to random values, so the color keeps varying. I also use the ml5.js PoseNet library for movement tracking: as you move around, the square automatically follows your movement and leaves a trace. Furthermore, when you get closer to the camera, the shape becomes more circular, eventually turning into a circle; when you move away, it squares off, eventually becoming a rectangle. In addition to movement tracking, I also learned the map() function, which rescales a value, such as a distance, from one range to another.
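As an illustration of that map() idea alone (not the full PoseNet version), here is a minimal p5.js sketch in which the mouse stands in for the tracked body point and the vertical mouse position stands in for how close you are to the camera; the sizes and ranges are assumptions for the demo, not the project's values.

```javascript
// Minimal p5.js sketch of the Purple-iiiiiish mapping idea.
// The mouse stands in for the PoseNet-tracked point:
// mouseX/mouseY -> position, mouseY -> "how close you are to the camera".
function setup() {
  createCanvas(640, 480);
  background(0);
  noStroke();
}

function draw() {
  // A translucent black layer instead of a full clear leaves a fading trace.
  rectMode(CORNER);
  fill(0, 20);
  rect(0, 0, width, height);

  // Random red and blue, no green: the color stays purple-ish.
  fill(random(100, 255), 0, random(100, 255));

  // Remap "closeness" to a corner radius: 0 keeps a square,
  // size / 2 turns the square into a circle.
  const size = 80;
  const closeness = map(mouseY, 0, height, 0, 1);
  const cornerRadius = map(closeness, 0, 1, 0, size / 2);

  rectMode(CENTER);
  rect(mouseX, mouseY, size, size, cornerRadius);
}
```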

Project Link:

Edit: https://editor.p5js.org/YoungYoungYoung/sketches/-lzxUgQtB

Full screen: https://editor.p5js.org/YoungYoungYoung/full/-lzxUgQtB

Closer/Further

Project Name: Closer/Further (Scroll)

screenshot_2021-09-28_6-23-53-pm screenshot_2021-09-28_6-23-27-pm

Project description:

When you get closer to the camera, the square in the middle gets bigger; conversely, when you move further from the camera, the square gets smaller. The project is a demonstration of how we can control graphics with our body movements. I use the ml5.js PoseNet library to measure the distance between the two eyes: when that distance gets larger, the square gets bigger. At the beginning, I had no idea what to make, but as I watched more and more Coding Train videos, I gradually grasped the essentials of movement tracking. This is a good start for me. In addition to movement tracking, I also apply the random() function in my code, aiming to create a more vivid and colorful visual effect.
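A minimal p5.js + ml5.js (0.x) sketch of this eye-distance idea is below; the pixel ranges passed to map() are placeholders rather than the values used in the project.

```javascript
// Minimal p5.js + ml5.js (0.x) PoseNet sketch:
// the distance between the two eyes drives the size of the square.
let video;
let poseNet;
let pose = null;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  poseNet = ml5.poseNet(video, () => console.log("PoseNet ready"));
  poseNet.on("pose", (results) => {
    if (results.length > 0) pose = results[0].pose;
  });
}

function draw() {
  image(video, 0, 0);
  if (pose) {
    // The eyes appear further apart in the image as you approach the camera.
    const eyeDist = dist(
      pose.leftEye.x, pose.leftEye.y,
      pose.rightEye.x, pose.rightEye.y
    );
    // Placeholder ranges: ~20 px apart when far, ~200 px when very close.
    const side = map(eyeDist, 20, 200, 40, 400, true);
    rectMode(CENTER);
    fill(random(255), random(255), random(255)); // random colorful square
    rect(width / 2, height / 2, side, side);
  }
}
```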

Project Link:

Edit: https://editor.p5js.org/YoungYoungYoung/sketches/HzhFV202N

Full screen: https://editor.p5js.org/YoungYoungYoung/full/HzhFV202N
