Christmas in a Box
by Merel and Preeti

Project Description:

As the holidays are around the corner, we both wanted to bring some festivity to the experiment. Christmas is all about lights, music, and presents, so we have tried to capture the essence of Christmas through LED lights and deliver it in a box! And since most of us are away from our families, we wanted to celebrate the occasion with a touch of technology.

Function 1:
We used a dark box to mimic night, since that is when stars and Christmas lights look their best. The blue LEDs on top of the box are connected to an LDR (light-dependent resistor): as the surroundings become darker the LEDs grow brighter, and as the ambient light increases they dim.
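The inverse mapping can be sketched in a few lines. This is JavaScript rather than the Arduino C of the Tinkercad sketch, and the 0–1023 sensor range and 0–255 LED range are assumed Arduino conventions, not values taken from our circuit:

```javascript
// Darker room -> brighter LEDs: invert the LDR reading before
// scaling it down to the LED's brightness range.
// Assumed ranges: analogRead-style 0-1023 in, analogWrite-style 0-255 out.
function ledBrightness(ldrReading) {
  const clamped = Math.min(1023, Math.max(0, ldrReading));
  return Math.round(((1023 - clamped) * 255) / 1023);
}
```

In the Arduino sketch itself, the same idea fits in a single map() call on the analogRead value.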



Function 2:
The LEDs placed under the snow are connected to a sound sensor and respond to the intensity of the music that is playing, so the brightness of the LEDs fluctuates with the volume of the song.

[Tinkercad circuit screenshots]



Note: we could not find a sound sensor in Tinkercad, so we had to use a distance sensor in the simulation instead.
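The brightness-follows-loudness behaviour can be sketched the same way. The sensor range, the smoothing factor, and the smoothing itself are assumptions here (the real circuit may drive the LEDs directly from the reading):

```javascript
// Louder music -> brighter LEDs, with a simple running average so the
// brightness fluctuates smoothly instead of flickering every frame.
// Assumed: 0-1023 sensor reading, 0-255 LED brightness, alpha in (0, 1].
function soundToBrightness(level, previousBrightness, alpha = 0.3) {
  const clamped = Math.min(1023, Math.max(0, level));
  const target = (clamped * 255) / 1023;
  return Math.round(previousBrightness + alpha * (target - previousBrightness));
}
```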

Function 3:
The star on top of the Christmas tree is one of the most iconic ornaments in the decoration. We used an LED to represent it, and it can easily be turned on and off with a push button.
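The toggle logic can be sketched as an edge detector; the wiring, pull-up choice, and debouncing are left out, and none of the names below come from our sketch:

```javascript
// Toggle the star LED only on the rising edge of the button, so holding
// the button down does not keep flipping the light on and off.
function makeStarToggle() {
  let starOn = false;
  let wasPressed = false;
  return function update(isPressed) {
    if (isPressed && !wasPressed) starOn = !starOn; // rising edge: toggle
    wasPressed = isPressed;
    return starOn;
  };
}
```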

[Tinkercad circuit screenshots]


Experience video:

How it works video:


Our project uses two principles of Calm Technology:

Design for people first:
Our project conveys joy in the most straightforward manner, without the technology being dominant or overwhelming. It invites the user to interact and immerse themselves in the experience of Christmas.

Technology should work even when it fails (Think about what happens if your technology fails):
If the technology fails in our case, it can still be used as a decorative piece. It will continue to spread the joy and soul of Christmas.

Link to the code:

Play Along

This series focuses on the user’s body and facial movements through machine learning. The key idea is to interact with the user through sound and hand gestures, making regular movements more fun. The experiments were designed with the ‘Click’ and ‘Scroll’ actions in mind. While everybody has their own interpretation of these actions, ‘Play Along’ is designed to let the user interact with the computer through body postures and gestures, without a mouse or keyboard. I have tried to combine these actions with playful movements and a touch of music.

Experiment 1: Coloured Claps

With every clap the colour of the ellipse changes




The idea behind the experiment is to change the colour of the ellipse with every clap. Once the user stands or sits in front of the webcam, the system recognises their wrists; whenever the two wrists come together in a clap, the ellipse switches colour.
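A sketch of the clap check, assuming ml5 PoseNet keypoints where pose.leftWrist and pose.rightWrist each carry x and y pixel coordinates; the 50-pixel threshold is an assumption:

```javascript
// A "clap" is detected when both wrists are present and the distance
// between them drops below a small pixel threshold.
function isClap(leftWrist, rightWrist, threshold = 50) {
  if (!leftWrist || !rightWrist) return false;
  const dx = leftWrist.x - rightWrist.x;
  const dy = leftWrist.y - rightWrist.y;
  return Math.hypot(dx, dy) < threshold;
}
```

In the p5.js draw loop, a new colour would be picked each time isClap flips from false to true.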



Present link:

Edit link:

Video link:


Experiment 2: Move It

Pushing the text by lifting your right hand




This experiment focuses on the scroll gesture. The user interacts by raising their right hand and pushing the text; the text moves to the end of the canvas and then bounces back to its initial position. The idea can be adapted for various ‘swipe’ features.

Learnings: PoseNet was also detecting my left wrist, and the text was bouncing back before reaching the end of the x-axis.
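The push-and-bounce motion can be sketched as a per-frame position update. The speed, start position, and right-hand edge are assumed values, and in the real sketch this would run inside p5.js’s draw():

```javascript
// Push the text right while the hand is raised; ease it back toward its
// starting x once the hand drops. Assumed: 8 px/frame, canvas 600 wide.
function updateTextX(x, handRaised, { speed = 8, startX = 20, maxX = 600 } = {}) {
  if (handRaised) return Math.min(maxX, x + speed); // push toward the edge
  return Math.max(startX, x - speed);               // bounce back home
}
```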



Present link:

Edit link:

Video link:


Experiment 3: Pause and Play

Using the nose to play and pause music

Free music:



With a defined ‘y’ axis, the user can play and pause music with their nose: moving the head up and down across the defined line controls the music.

Learnings: I faced an issue with the music loading slowly even though it was supposed to preload. A slight change to my command fixed it: poseNet = ml5.poseNet(video, modelLoaded);
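The play/pause toggle can be sketched as a crossing check on the nose’s y coordinate. The line’s position is an assumption, and in the sketch the returned boolean would drive a p5.sound file’s play() and pause():

```javascript
// Toggle playback each time the nose crosses upward over a horizontal
// line (smaller y = higher on screen), so hovering near the line does
// not rapid-fire play/pause. The default lineY of 200 is an assumption.
function makeNoseControl(lineY = 200) {
  let playing = false;
  let wasAbove = false;
  return function update(noseY) {
    const above = noseY < lineY;
    if (above && !wasAbove) playing = !playing; // upward crossing: toggle
    wasAbove = above;
    return playing;
  };
}
```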


Present link:

Edit link:

Video link:


Experiment 4: Laser eyes

With the user’s eyes as the source, the laser beams move with the user’s eye movement.




This fun-filled experiment works in coordination with the user’s eye movement. The laser tracks the movement of the eye and moves along.

Glitches: I intended to make the lines squiggly, but they were not capturing the x and y coordinates properly.
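One way to aim each beam, sketched here as an assumption rather than what the sketch actually does, is to extend the nose-to-eye direction well past the canvas so the lasers swing as the eyes and head move:

```javascript
// Extend the beam from the eye along the nose -> eye direction. Both
// keypoints are assumed to be {x, y} objects from PoseNet.
function laserBeam(nose, eye, length = 800) {
  const dx = eye.x - nose.x;
  const dy = eye.y - nose.y;
  const d = Math.hypot(dx, dy) || 1; // avoid dividing by zero
  return { x2: eye.x + (dx / d) * length, y2: eye.y + (dy / d) * length };
}
```

In draw(), a red line() from (eye.x, eye.y) to (x2, y2) for each eye gives the laser effect.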



Present link:

Edit link:

Video link:



