Experiment 5 Secret Santa By Group 6

Secret Santa

By Group 6: Kristjan Buckingham, Jessy (Xin Zhang)





Project description 

As socially distanced Christmas is becoming a reality for many families this year, it is important to find ways to stay connected and be present despite the distance. Secret Santa allows users to be present in each other’s homes through passive interactions. Every time a gift is added to one of three stockings, a message is sent to the other user’s home, so the recipient has the chance to catch a glimpse of their “Secret Santa” leaving them a present. Since many gifts will have to be mailed, this interaction mimics the anticipation of watching presents pile up under the tree.

The stockings are lined with conductive tape connected to a capacitive touch sensor; touching a stocking lights up LEDs to let the sender know the message has been sent. Once a message is received, different LEDs light up to notify the recipient that another gift has been added to their stocking from afar.
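
The touch-to-message flow above can be sketched in plain JavaScript (the real project runs on Arduino with a messaging service; the threshold and fade values here are illustrative assumptions, not taken from the actual code):

```javascript
// Plain-JavaScript sketch of the send/receive logic described above.
// TOUCH_THRESHOLD and FADE_MS are hypothetical values.
const TOUCH_THRESHOLD = 500; // hypothetical capacitive reading that counts as a touch
const FADE_MS = 3000;        // hypothetical time before the recipient's LEDs fade

// Sender side: a reading above the threshold means a gift was placed in a
// stocking, so light the sender's LEDs and emit a message for the other home.
function senderStep(sensorReading) {
  const touched = sensorReading > TOUCH_THRESHOLD;
  return { ledsOn: touched, sendMessage: touched };
}

// Receiver side: the LEDs stay lit only briefly after a message arrives,
// so catching the glow takes a bit of luck.
function receiverStep(messageReceivedAt, now) {
  return { ledsOn: messageReceivedAt !== null && now - messageReceivedAt < FADE_MS };
}
```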

Since the LEDs lighting up is a relatively subtle effect that fades after a short time, the user who receives the message has to be lucky enough to catch it, which adds to the excitement of knowing that another present is on its way. The touch sensors could also be used in different ways to develop other meanings or secret messages for one another. Although Secret Santa focuses on gift-giving as the primary interaction, it is more about letting loved ones know they are being thought of even if it may not be possible to be together.

Experience Video 


How It Works Video 

Final project images 




Development images 






Link to the Arduino code hosted on Github

Player 1: https://github.com/xinzhang-jessy/experiment5-secret-santa/blob/main/SecretSantaJessy_02.ino
Player 2: https://github.com/xinzhang-jessy/experiment5-secret-santa/blob/main/SecretSantaKristjan02.ino


Circuit Diagram 

Kristjan’s Circuit


Jessy’s Circuit


Project Context

Secret Santa is an interactive project that combines light and remote presence to build interpersonal communication. From an interaction perspective, it is also a multiplayer game based on remote presence. With more than three players, this installation could be even more playful: since any player’s action has the same effect on the remote device, players can guess among themselves who triggered it.

Generally, light is a sign of presence: when people come home after a day’s work, the most common thing we do is turn on the lights. ‘The use of light is also essential to show that you are at home and to manifest the presence of life.’ In other words, light can indicate presence. In our project, we use light as a signal that the container has been filled, indicating the presence of the object.

Additionally, regarding remote presence: phones already provide synchronous voice or face-to-face communication and, to some extent, asynchronous messaging. Remote delivery and control give people a way to overcome geographic distance and learn what is happening elsewhere; however, the way people previously conveyed emotions or relationships through objects is weakening. ‘We all have our own experiences of postcards and pictures hanging on refrigerators and mirrors in our homes. These common artifacts exhibit often a link between individuals.’ Through this work, we want to build a connection through a series of everyday activities, since daily activities can increase people’s emotional resonance and strengthen remote communication. Although the legend of Santa Claus exists mainly in the children’s world, such recognized symbols can represent the festival, and this cognitive consensus is the basis for motivating people to communicate remotely. When one person touches or places objects in a Christmas stocking, the sensor lights turn on in both places; this can remind the other person to do the same thing, or be viewed as a form of communication.


  1. “Maintaining human connection in time of social distancing.” Mayo Clinic Health System, 23 Mar. 2020, https://www.mayoclinichealthsystem.org/hometown-health/speaking-of-health/maintaining-human-connection-in-time-of-social-distancing. Accessed 2 Dec. 2020. 
  2. Szklarski, Cassandra. “Experts advise preparing for a scaled-back COVID holiday season.” CTV News, 29 Oct. 2020, https://www.ctvnews.ca/health/coronavirus/experts-advise-preparing-for-a-scaled-back-covid-holiday-season-1.5165713. Accessed 2 Dec. 2020. 
  3. Tollmar, Konrad, and Joakim Persson. “Understanding Remote Presence.” NordiCHI, 19-23 Oct. 2002. 





Splash of Greens By Group 2



Group 2 Members: Xin Zhang, Jie Guan, Unnikrishnan Kalidas, Grace Yuan

Project Description

Splash of Greens is a remote, multi-user digital experience that lets users remotely water a plant specific to them while it grows in a virtual garden in real time. Anyone with the desktop listener link can open the virtual garden and watch it grow as each remote player waters their respective plant. A control bar, visible on each user’s control screen, regulates the amount of water poured onto the plant: with each touch of the ‘Water’ button, the green portion of the bar shrinks, signifying the maximum limit to which the plant can be watered, until a fully grown plant appears on both the user’s control screen and the desktop listener screen. For our project we chose four plants, Blackberry, Snake Plant, Coral Cactus, and Monstera, all growing in the same virtual garden but remotely watered by different users. In these tough times of COVID-19, people interact mostly on screens; our project is a reminder of the earthly and humane things in life, as basic as watering a plant.
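
The watering-bar behavior can be sketched as a small piece of state logic (the real sketches run in p5.js; the number of presses needed for full growth is an illustrative assumption):

```javascript
// Sketch of the 'Water' button and green bar described above.
// MAX_WATER is a hypothetical press count, not taken from the actual sketch.
const MAX_WATER = 10; // presses until the plant is fully grown

function makePlant() {
  return { water: 0, fullyGrown: false };
}

// Each press shrinks the green portion of the bar until the limit is reached.
function pressWaterButton(plant) {
  if (plant.water < MAX_WATER) plant.water += 1;
  plant.fullyGrown = plant.water >= MAX_WATER;
  return plant;
}

// Fraction of the bar still green, i.e. how much watering remains.
function greenBarFraction(plant) {
  return 1 - plant.water / MAX_WATER;
}
```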

Final Images


Experience Video 

How it works video 


Development Images







Code Links

Desktop Listener:

Present: https://editor.p5js.org/guanjie429102905/present/fqEpy0yGO 

Edit: https://editor.p5js.org/guanjie429102905/sketches/fqEpy0yGO

Controller 1 – Snake Plant:

Present: https://editor.p5js.org/xinzhang.jessy/present/aDOZc82RC

Edit: https://editor.p5js.org/xinzhang.jessy/sketches/aDOZc82RC

Controller 2 – Monstera:


Edit: https://editor.p5js.org/unnikrishnankalidas/sketches/ECe_449mM

Controller 3 – Euphorbia lactea:

Present: https://editor.p5js.org/guanjie429102905/present/_mFu7mcMS

Edit: https://editor.p5js.org/guanjie429102905/sketches/_mFu7mcMS

Controller 4 – Blackberry:

Present: https://editor.p5js.org/grace.yuan/present/zXHs51ira

Edit: https://editor.p5js.org/grace.yuan/sketches/zXHs51ira 

Project context

Morpheus, a character in the science-fiction film The Matrix, declares: “The Matrix is everywhere. It is all around us. …” (1999). As humans, we can be considered part of the Matrix, interacting with information, each other, and our context. We think the presentation of interaction between humans and information on the Internet resembles the Matrix’s concept: physical, material touch is encoded on the screen, shared over the Internet, and decoded for presentation in a computer-generated environment. The Internet affects many facets of human life; as Steyerl writes, “Never before have more people been dependent on, embedded into, surveilled by, and exploited by the web. It seems overwhelming, bedazzling, and without an immediate alternative. The Internet is probably not dead. It has rather gone all-out. Or more precisely: it is all over!” (1). We noticed that during the pandemic in 2020, an increasing number of people depended on the Internet for working, studying, and shopping. The Internet has already “entered” every aspect of human life; it is everywhere in our space and all around us. ‘Splash of Greens’ provides a virtual collaboration space where users grow plants through the Internet, with multiple controllers in different locations sending messages to the virtual garden on a webpage. The project shows how we transformed materialized gardening behaviors into screens and information in virtuality, presenting communication over the Internet in a way similar to the Matrix.

Teleporting An Unknown State is a project created by Eduardo Kac in 1994 that presents the Internet as a life-supporting system. In a very dark room, a pedestal with earth serves as a nursery for a living plant. Through a video projector suspended above and facing the pedestal, remote participants send light from the skies of remote cities via the Internet, in real time, enabling the plant to photosynthesize and grow in total darkness. Our project can also be considered a telepresent experiment with plants: rather than projecting the physical sky onto the plant, we teleoperate virtual water for a plant growing on the screen.

Moon is an online collaborative art experiment by Chinese artist Ai Weiwei and Danish-Icelandic artist Olafur Eliasson. It is a web-based artwork that invites users to leave their mark, drawn or written, on a virtual moon’s surface as a shared platform. Moon’s open call for creative input is a powerful statement about the potential for ideas to connect people across the world, breaking geographic, political, and social boundaries in the Internet age. Although our work does not ask users to leave a mark on the virtual space, it provides controllers for users to grow plants in the virtual space from various locations.

 Works Cited

“Moon • Artwork • Studio Olafur Eliasson.” Studio Olafur Eliasson, olafureliasson.net/archive/artwork/WEK108821/moon. 

Steyerl, Hito. “Too much world: Is the Internet dead?.” E-flux journal 49 (2013): 1-10. 

Teleporting An Unknown State, www.ekac.org/teleporting_an_unknown_state.html. 

Wachowski, Lana, and Lilly Wachowski. The Matrix. Warner Bros., 1999. 

Cyber Box By Jessy


Project description 

Cyber Box is an interactive box with three types of actions that trigger interaction between the screen and a tangible, tactile interface; the whole process is sequential and narrative. In the first part, users see a dialog box pop up whenever they put a sticker into a designated area. This action represents people making comments about or labeling others on social media. In the second part, users can control the volume of a sound composed of people talking; the change in volume is also shown by dynamic sound waves on the screen. These sounds come from people’s daily conversations, and when they pile up or become louder they create a sense of oppression that is uncomfortable for the listener. In the third part, users can push the ‘button’ to magnify a virtual balloon on the screen; after repeated pushes, the balloon eventually pops.
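
The third interaction can be sketched as a simple counter (the real piece pairs an Arduino button with a screen-based balloon; the growth factor and popping size below are illustrative values, not taken from the actual code):

```javascript
// Sketch of the balloon interaction: each push inflates the virtual balloon
// until it pops. GROWTH_PER_PUSH and POP_SIZE are hypothetical values.
const GROWTH_PER_PUSH = 1.2; // scale factor applied per push
const POP_SIZE = 100;        // radius at which the balloon pops

function push(balloon) {
  if (balloon.popped) return balloon;  // a popped balloon no longer inflates
  balloon.size *= GROWTH_PER_PUSH;     // magnify the balloon on screen
  if (balloon.size >= POP_SIZE) balloon.popped = true;
  return balloon;
}
```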

Experience Video

How It Works 

Final project images






Development images





Link to the Arduino code hosted on Github


Circuit Diagram


Project Context

With the prevalence of social media and digital applications, our relationship with the virtual world is much closer. People are used to posting their feelings and opinions on social media. In this virtual digital world, people share comments, photos, and posts, and this content is viewed by strangers and acquaintances alike. Increasingly, this digital world gives rise to cyberbullying: bullying carried out with digital technologies, which can take place on social media, messaging platforms, gaming platforms, and mobile phones. According to the Cyberbullying Research Center, about half of young people have experienced some form of cyberbullying, and 10 to 20 percent experience it regularly. Posting images or text on social media is easy, just like pasting a sticker on a whiteboard, so common that we do it without thinking. Similarly, people can easily post hurtful things or spread rumors online while pretending to be someone else, so cyberbullies may not realize the consequences and may even find it funny. Through Cyber Box, I try to connect physical actions with behaviors that exist in the digital world. The first interaction, pasting stickers repeatedly to spawn dialog boxes until the screen fills with them, makes people feel the oppressiveness of text-based cyberbullying on social media, which can echo badly later in people’s lives. Even text alone can cause great psychological stress, which is the situation staged by the second interaction of Cyber Box. The controller acts like a filter, but in the virtual world it is difficult to filter out cyberbullying. The pressure produced by cyberbullies is similar to the outcome of the physical act of pushing: the human mind is also under mounting pressure when facing many ‘pushes’.

Even though we constantly switch between the online world and the physical world, through the virtual world we are mainly connected by gestures and postures. The word “touch” is in the word “touchscreen,” but tapping and swiping a cold, flat piece of matter largely neglects the sense of touch: during long hours of touchscreen use, you experience only a fraction of what your sense of touch allows. The goal of Cyber Box is to seek more connection between the screen-based world and the physical world.


1. Bullying Statistics, http://www.bullyingstatistics.org/content/cyber-bullying-statistics.html

2. Kolesarova, Lucia. ‘Designing For The Tactile Experience.’ Smashing Magazine, https://www.smashingmagazine.com/2018/04/designing-tactile-experience/

Light Measurement Cup by Jessy




Light Measurement Cup is an installation that measures light intensity in a given setting. It consists of two main parts: an LED matrix that displays the outcome externally, and light sensors that collect data inside the container. In detail, three light sensors installed in a transparent cup collect analog signals and send the data to the Arduino; the output is then visualized on the external LED matrix. By default, the LED matrix is off when the cup is empty. As the glass is filled with objects that block light, the matrix gradually lights up. Because materials differ in light transmittance (Coke and milk transmit light differently, and beans of different sizes produce different intensities even in the same transparent container), the display varies with the contents. Ambient light varies too: intensity differs between noon and morning, so a lit matrix will change if the cup is moved to a darker space. Finally, the LED matrix reaches peak intensity when no light reaches the inside of the cup, and users can read this signal from the external matrix.

Experience Video

Working Process

Final project images



Development images




Link to the Arduino code hosted on Github


Circuit diagram


Project Context
In Experiment 2, I tried to think more about data conversion when designing the connection between the user and the physical space. All real-world quantities are analog in nature, and we can represent them electrically as analog signals. An analog signal is a time-varying signal that can take any number of values over a given interval. In contrast, a digital signal jumps suddenly from one level to another and has only a finite number of values over a given interval.

So the goal of this installation is primarily to deliver the analog data to users in a visual way. It is developed on the principle of an Analog-to-Digital Converter (ADC), in this case reading the voltage across the three light sensors. Specifically, the ADC output ranges from 0-1023 while analogWrite for the LED matrix ranges from 0-255, so every reading between 0 and 1023 is mapped to a value between 0 and 255. This conversion also shapes the design of user behavior. People’s interactions are treated as imported ‘data’ that triggers the computation: as the container is filled with solid or liquid material, the computation converts this ‘behavior data’ into the light intensity of the LED matrix, which is easy to read. From the intensity of the matrix and the number of lights on, users can tell the light intensity inside the container. We create data trails both consciously and unconsciously; for another example, when we use a measuring cup to measure a liquid, the scale on the side of the cup gives us an accurate and intuitive reading.
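
The conversion can be sketched in plain JavaScript, mirroring Arduino’s map() function. (Averaging the three sensor readings and inverting toward darkness are my assumptions about how the sketch combines them; the range mapping itself is as described above.)

```javascript
// Linear range mapping, equivalent to Arduino's map() with integer output:
// analogRead() yields 0-1023, analogWrite() accepts 0-255.
function mapRange(value, inMin, inMax, outMin, outMax) {
  return Math.floor((value - inMin) * (outMax - outMin) / (inMax - inMin) + outMin);
}

// Combine the three light-sensor readings into one LED-matrix brightness.
// Inverted so that less light inside the cup gives a brighter matrix,
// peaking when the inside of the cup is fully dark.
function cupBrightness(sensorReadings) {
  const avg = sensorReadings.reduce((a, b) => a + b, 0) / sensorReadings.length;
  return 255 - mapRange(avg, 0, 1023, 0, 255);
}
```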

Some data is usually abstract, high-dimensional, and structured in a complex way because the user does not have any preconception of how such data could look, so no “natural” display is possible (in contrast to volume visualization, where, for example, isosurfaces at a constant data value can pull out representative “objects” such as the core of a thunderstorm or the region of bone in the human body).

So through this project, I intended to use physical methods to make the data more visible, like using the number of lit LEDs to represent abstract light levels. This leads to natural analogies in visualizing the data and can deliver the information more clearly, helping users better understand the meaning of the data.



2. ’Chapter 41 Human-Data Interaction’, Interaction Design Foundation ’https://www.interaction-design.org/literature/book/the-encyclopedia-of-human-computer-interaction-2nd-ed/human-data-interaction’

3. ‘An Interaction View on Information Visualization’, Robert Kosara, Helwig Hauser, Donna L. Gresh



Body Band


Jessy(Xin Zhang)


Body Band is an interactive, playful experiment that uses the body as a controller to build interaction between the player and screen-based sound. Standing naturally in front of the screen is the default state for each study. The player can then walk, raise their hands, twist their body, or squat to trigger sound. In the first study, the performer turns on the sound by walking, accompanied by a metronome. Each of the five studies has its own designed movement drawn from daily life; however, most of these body movements or postures cannot be heard in real space. I recombined them with the sounds of instruments or music to represent the relationship between the physical body and sound.


Normally, physical movements and postures can be seen clearly; we can read performers’ moods from their body language. But most physical movements cannot be heard: footsteps, for example, are difficult to hear on a noisy street. Conversely, we can recognize specific sounds without seeing them. When the person behind you claps, you can tell by the sound without looking. From my perspective, action generates sound, and movement is the visual representation of a sound, which is significant for how people build cognition. Body Band is an experimental interactive project combining musical sound with body movement. The whole interaction is a process of transforming visualized posture into acoustic signals, producing an ‘audiolization of body movement’.

When I saw the video of Lines, an interactive sound installation designed by Swedish composer Anders Lind in 2016, it inspired the idea of the body as a controller, a medium for sound interaction. In that project, the traditional keyboard is redefined as lines on a wall; players’ gestures and locations are captured by sensors to produce digital music.

Some acoustic instruments, like the guitar or piano, are easy to get started on: it is not difficult to produce a sound the first time you pick one up. Even so, mastering a musical instrument is a complex process that may take years of practice to reach virtuosity. Music consists of many elements. The performer must understand how the music flows and then translate that understanding into bodily movements that render the sound in the physical world. Performers’ interpretations of music vary considerably with their personal style; their movements while playing carry their own character and understanding. But an understanding of music originates in our perception and physical body. People naturally attune to the basic rhythm while listening to music, swaying their bodies and tapping their feet.

Additionally, people without professional training in music theory are capable of identifying an instrument when they hear its sound. We can distinguish the sound characteristics of percussion instruments from stringed instruments, or come up with a description of the action and guess how a sound could be produced from hearing alone. The perception of sound or music involves action-sound relationships. Moreover, ‘gesture’ as a term or concept can be fruitfully used when working with sound, in the performance of interactive, real-time generated music. Physical body movements and actions make sound production more expressive and visual.

Study 1 Footstep metronome

Present link  https://editor.p5js.org/xinzhang.jessy/present/cVLOE0vsQ

Edit link  https://editor.p5js.org/xinzhang.jessy/sketches/cVLOE0vsQ

Video:  https://ocadu.techsmithrelay.com/WLSf       

(The sound material is from ‘Free Sound’, www.freesound.org)




Walking is one of the most basic human movements, and it has a certain rhythm. In my first study, I associated footsteps with the sound of a metronome to show the rhythm they share. Whenever the player walks, the virtual metronome plays. Everyone has their own rhythm, and it can change while walking, so the sound also varies from person to person, a representation of personality.
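
The walking trigger can be sketched as a simple motion check (the actual study runs in p5.js with camera-based body tracking; the motion threshold and frame-to-frame comparison here are illustrative assumptions, not taken from the original sketch):

```javascript
// Sketch of the Study 1 trigger: enough horizontal body movement between
// two frames counts as walking, and the virtual metronome plays only then.
// MOTION_THRESHOLD is a hypothetical value in pixels per frame.
const MOTION_THRESHOLD = 15;

function metronomeOn(prevX, currX) {
  return Math.abs(currX - prevX) > MOTION_THRESHOLD;
}
```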

Study 2 Virtual MIDI

Present link  https://editor.p5js.org/xinzhang.jessy/present/aYX3VC8J-

Edit link  https://editor.p5js.org/xinzhang.jessy/sketches/aYX3VC8J-

Video  https://ocadu.techsmithrelay.com/SJrb




The idea of Study 2 originates from the digital instrument MIDI; here the instrument is simplified into 2D blocks that are easy to ‘play’. People without any professional musical experience can play a song by moving their bodies: whenever a hand or arm touches a block on the screen, it triggers a MIDI sound.

Study 3 Volume Controller

Present link  https://editor.p5js.org/xinzhang.jessy/present/fOcFQIj4v

Edit link  https://editor.p5js.org/xinzhang.jessy/sketches/fOcFQIj4v

Video https://ocadu.techsmithrelay.com/zzhS 

(The sound material is from ‘Free Sound’, www.freesound.org)



During musical performance, the player controls their body movement so as to generate sound. In my third study, I use arm position to represent the sound volume, which can be adjusted by moving the arms: as the arms move higher, the organ grows louder. In this piece, I want to use body movement to represent sound visually.
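
The arm-height-to-volume mapping can be sketched as follows (the actual study runs in p5.js with camera-based tracking; the canvas height and the linear mapping are assumptions for illustration):

```javascript
// Sketch of Study 3's mapping from arm height to volume.
// CANVAS_HEIGHT is a hypothetical canvas size in pixels.
const CANVAS_HEIGHT = 480;

function armVolume(wristY) {
  // Screen y grows downward, so a higher arm (smaller y) means louder sound.
  const v = 1 - wristY / CANVAS_HEIGHT;
  return Math.min(1, Math.max(0, v)); // clamp to the 0..1 volume range
}
```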

Study 4 Touchable Triangle

Present link  https://editor.p5js.org/xinzhang.jessy/present/hMkplGr1P

Edit link  https://editor.p5js.org/xinzhang.jessy/sketches/hMkplGr1P

Video  https://ocadu.techsmithrelay.com/CQL4

(The sound material is from ‘Free Sound’, www.freesound.org)



In this work, the whole body can be considered part of the instrument: whenever the performer stands up, it triggers the sound. Additionally, without any musical experience, the player can produce their own music by moving their physical body.

Study 5 Swing Switch

Present link  https://editor.p5js.org/xinzhang.jessy/present/hirJCrC2i

Edit link  https://editor.p5js.org/xinzhang.jessy/sketches/hirJCrC2i

Video  https://ocadu.techsmithrelay.com/G4Mi   

(The sound material is from ‘Audiomicro’, audiomicro.com)



The digital world has helped us define and understand more actions. Digital music players have skip-back and skip-forward functions, normally represented by arrows pointing left and right. This inspired me to apply the principle in 3D physical space, using the body as a controller. When the performer moves to the left, it triggers the first piece of music; moving to the right triggers the other. These movements can also be read as dancing, which has a close relationship with the music.


Naoyuki Houri, Hiroyuki Arita, Yutaka Sakaguchi ‘Audiolizing Body Movement: Its Concept and Application to Motor Skill Learning’, 2011

The Nordic House, ’Lines-Interactive sound art installation’


Anders Lind, http://www.soundslikelind.se/

Espie Estrella, ‘An introduction to the elements of music’,’https://www.liveabout.com/the-elements-of-music-2455913’

Xiao Xiao, Basheer Tome, and Hiroshi Ishii, MIT Media Lab, ‘Andante: Walking Figures on the Piano Keyboard to Visualize Musical Motion’, 2014

M. Leman and R. I. Godøy, ‘Why Study Musical Gestures? In R. I. Godøy and M. Leman, editors, Musical Gestures: Sound, Movement, and Meaning’, Routledge, 2010.

Alexander Refsum Jensenius,’ACTION – SOUND, Developing Methods and Tools to Study Music-Related Body Movement’,2007

Jan C. Schacher,’ Moving Music – Exploring Movement–to–Sound Relationships’, 2016