Experiment 5: Secret Santa

By Group 6: Kristjan Buckingham, Jessy (Xin Zhang)





Project description 

As socially distanced Christmas is becoming a reality for many families this year, it is important to find ways to stay connected and be present despite the distance. Secret Santa allows users to be present in each other’s homes through passive interactions. Every time a gift is added to one of three stockings, a message is sent to the other user’s home, so the recipient has the chance to catch a glimpse of their “Secret Santa” leaving them a present. Since many gifts will have to be mailed, this interaction mimics the anticipation of watching presents pile up under the tree.

The stockings are lined with conductive tape connected to a capacitive touch sensor which lights up LEDs to let the sender know the message has been sent. Once a message is received, this lights up different LEDs to notify the recipient that another gift has been added to their stocking from afar.
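The touch-then-notify logic described above amounts to edge detection on the sensor's touch state: a message should fire once when a stocking is newly touched, not continuously while a gift sits in it. The sketch below is a minimal, illustrative C++ version of that idea; the 12-electrode bitmask follows MPR121-style capacitive sensors and the function name is our own, not taken from the project code.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Edge detection on a capacitive sensor's touch bitmask: given the
// previous and current 12-bit touch states (one bit per electrode),
// return the electrodes that have just been touched. In the
// installation, each new touch on a stocking's liner would trigger one
// outgoing message and light the sender's LEDs.
std::vector<int> newlyTouched(uint16_t prev, uint16_t curr) {
    std::vector<int> events;
    uint16_t rising = curr & ~prev;   // bits set now but not before
    for (int i = 0; i < 12; ++i) {
        if (rising & (1u << i)) events.push_back(i);
    }
    return events;
}
```

Because only the rising edge is reported, a gift resting in the stocking keeps the bit high without sending duplicate messages until the electrode is released and touched again.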

Since the LEDs lighting up is a relatively subtle effect that fades after a short time, the user who receives the message has to be lucky enough to catch it, which adds to the excitement of knowing that another present is on its way. The touch sensors could also be used in different ways to develop other meanings or secret messages for one another. Although Secret Santa focuses on gift-giving as the primary interaction, it is more about letting loved ones know they are being thought of even if it may not be possible to be together.

Experience Video 


How It Works Video 

Final project images 




Development images 






Link to the Arduino code hosted on Github

Player 1: https://github.com/xinzhang-jessy/experiment5-secret-santa/blob/main/SecretSantaJessy_02.ino
Player 2: https://github.com/xinzhang-jessy/experiment5-secret-santa/blob/main/SecretSantaKristjan02.ino


Circuit Diagram 

Kristjan’s Circuit


Jessy’s Circuit


Project Context

Secret Santa is an interactive project that combines light and remote presence to build interpersonal communication. From the perspective of interaction, it is also a multiplayer game based on remote presence. With more than three players, the installation could become even more playful: since any player’s action has the same effect on the remote device, players can guess among themselves who triggered it.

Generally, light is a sign of presence: when people come home after a day’s work, the most common thing we do is turn on the lights. ‘The use of light is also essential to show that you are at home and to manifest the presence of life.’ In other words, light can indicate presence. In our project, we use light as a signal that the container has been occupied, indicating the presence of an object.

Additionally, regarding remote presence: phones now provide synchronous voice and face-to-face communication, and to some extent asynchronous messaging. This remote delivery and control gives people a way to overcome geographic distance and learn what is happening elsewhere; however, the way people previously conveyed emotions and relationships through objects is weakening. ‘We all have our own experiences of postcards and pictures hanging on refrigerators and mirrors in our homes. These common artifacts exhibit often a link between individuals.’ Through this work, we want to build a connection through a series of activities drawn mostly from daily life, since daily activities can increase people’s emotional resonance and thereby strengthen remote communication. Although the legend of Santa Claus only exists in the children’s world, its widely recognized symbols represent the festival, and this cognitive consensus is the basis for motivating people to communicate remotely. When one person touches or places an object in a Christmas stocking, the lights in the straps turn on in both places; this can be a reminder for the other person to do the same, or can be viewed as a form of communication in itself.


  1. “Maintaining human connection in time of social distancing.” Mayo Clinic Health System, 23 Mar. 2020, https://www.mayoclinichealthsystem.org/hometown-health/speaking-of-health/maintaining-human-connection-in-time-of-social-distancing. Accessed 2 Dec. 2020. 
  2. Szklarski, Cassandra. “Experts advise preparing for a scaled-back COVID holiday season.” CTV News, 29 Oct. 2020, https://www.ctvnews.ca/health/coronavirus/experts-advise-preparing-for-a-scaled-back-covid-holiday-season-1.5165713. Accessed 2 Dec. 2020. 
  3. Tollmar, Konrad, and Joakim Persson. “Understanding Remote Presence.” NordiCHI, 19-23 Oct. 2002.





Liquid Keypad – DripDrops

Kristjan Buckingham


The Liquid Keypad is a play on the conventions of Tangible User Interfaces, utilizing a series of “Liquid Keys” that the user interacts with by dipping their fingers into them. Water is typically not a friend to exposed electronics, so there is a precarious aspect that adds a bit of tension to the experience. Although the water could certainly damage the electronics if not handled carefully, there is no actual danger to the user. The Liquid Keypad controls effects in the program DripDrops, which mimics a soothing dripping effect that the user can manipulate, creating a direct connection between what the user feels by touching the Liquid Keys and the visual and auditory representation on their device. Each Liquid Key plays a slightly different sound, inviting the user to explore different combinations of key presses. The drips also grow for as long as the user holds their finger in the water, and the background “dampens” by getting darker as more keys are pressed. This calming interaction is meant to subvert the user’s initial hesitation and call attention to the unexpected affordances of water as a medium.
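The drip-growth and background-"dampening" behaviours described above can be sketched as two small helper functions. This is an illustrative C++ sketch of the logic, not the actual DripDrops code; the shade endpoints (230 down to 30), growth step, and size cap are invented values.

```cpp
#include <algorithm>
#include <cassert>

// Background "dampening": as more Liquid Keys are held, the background
// brightness falls linearly from light grey toward near-black.
int backgroundShade(int keysHeld, int totalKeys) {
    keysHeld = std::max(0, std::min(keysHeld, totalKeys));
    return 230 - (230 - 30) * keysHeld / totalKeys;
}

// Drip growth: the drop's radius increases each frame while the key is
// held, capped at a maximum size; it stops growing once released.
float growDrip(float radius, bool keyHeld, float step = 0.5f, float maxR = 60.0f) {
    if (keyHeld) radius = std::min(radius + step, maxR);
    return radius;
}
```

In a Processing-style draw loop, `backgroundShade` would feed the background() call each frame, and `growDrip` would be applied per key to its on-screen drop.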

Experience Video: https://vimeo.com/478125809

How-It-Works Video: https://vimeo.com/478124814

Project Images

A different drip sound effect plays when each Liquid Key is touched and the corresponding drop begins to grow while the user’s finger is submerged.


As the user dips more fingers into more Liquid Keys, the drips make a musical effect and the background “dampens” by getting darker.

Development Images





Link to Arduino Code for Liquid Keyboard:

Link to Processing Code for DripDrops: https://github.com/buckinghamk/DIGF6037_ex3_BuckinghamKristjan_Processing

 Fritzing Circuit Diagram


Project Context

When deciding what material to use for this project, I was considering the affordances of various substances and how those might be utilized in unexpected ways. Using conductive touch sensors presented its own set of limitations as well. The choice of using water as a Tangible User Interface was meant to be somewhat counter-intuitive to the traditional incompatibility of water and exposed electronics, challenging the expectations of the user.

There are some interesting examples of different approaches others have taken to integrate water into a Tangible User Interface. Garden Aqua utilizes high-pressure jet streams of water to imitate the levitation of objects (Wenjun). In this case, sensors track the user’s hand gestures to create a response and do not interact with the water directly.

Another study exploring liquid as a Tangible User Interface focused on the chemical composition of water and how different substances could be added to trigger different responses (Hotta). This exploration integrated a variety of materials with some interesting results, but again there is no physical contact between the user and the liquid.

In this project, the user only ever comes into contact with the water itself. Conductive tape is submerged in the water and triggers the capacitive sensor when the user dips their finger in. Precautions were taken to ensure the water has very little chance of coming into contact with the circuit board or laptop, but care on the part of the user is still required. I wanted to make sure there was no chance of harm to the user and very little chance of harm to the electronics.
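The submerged-electrode sensing described here amounts to comparing a raw capacitive reading against a baseline that rises sharply when a finger enters the water. Below is a hedged C++ sketch of one common way to do this; the margin (50) and the baseline drift rate (1/8 per sample) are illustrative values, not measurements from this project.

```cpp
#include <cassert>

// Baseline-plus-margin touch detection for a submerged electrode: a
// touch is reported only when the raw capacitive reading exceeds a
// slowly adapting baseline by a fixed margin.
class TouchDetector {
public:
    explicit TouchDetector(long initialBaseline, long margin = 50)
        : baseline_(initialBaseline), margin_(margin) {}

    bool update(long reading) {
        bool touched = reading > baseline_ + margin_;
        if (!touched) {
            // While untouched, drift the baseline toward the reading so
            // slow changes (evaporation, temperature) don't false-trigger.
            baseline_ += (reading - baseline_) / 8;
        }
        return touched;
    }

private:
    long baseline_;
    long margin_;
};
```

Letting the baseline adapt only while untouched keeps a long key press registered, while slow environmental drift in the water's conductivity is absorbed without spurious triggers.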

I think there is an interesting cognitive play between the initial discomfort of seeing exposed wires attached to a tray of open water, and the familiar feeling of dipping one’s fingers into water. There is certainly a limited context where this kind of interaction may be viable, but I think it is an effective subversion of traditional interaction.

Works Cited

 Holmquist, L. (2019). The future of tangible user interfaces. Interactions. Retrieved November 9, 2020, from https://interactions.acm.org/archive/view/september-october-2019/the-future-of-tangible-user-interfaces

Hotta, M., Oka, M., & Mori, H. (2014). Liquid tangible user interface: Using liquids in TUI. HIMI 2014, Part I.

Wenjun, G., Seungwook, K., Sangung, Y., Minkyu, C., Seungha, Y., & Kyungown, L. (2017). Garden aqua: Tangible user interface study using levitation. International Journal of Asia Digital Arts & Design.




Enter the Void


“With no object no image and no focus, what are you looking at? You are looking at you looking.” ~ James Turrell

Have you ever watched a movie that doesn’t relate to you personally, yet found yourself wondering what it would be like to play a part in that situation? I had that experience when I saw the movie “Enter the Void” for the first time. The movie is about a guy who is killed in a police drug bust and then experiences his own death, relating it to the same feeling he had after taking DMT while alive. This sounds confusing, but that is what makes it interesting. The whole movie revolves around the visual aesthetics of a tripper. I don’t know what certain substances are or what they do to our minds, but the visualizations people have under the influence are confusingly interesting. One thing I find common in all of them is how the perception of reality is altered.

This project is named after the movie ‘Enter the Void’ because, yes, I was inspired by the film’s confusing, hallucinatory visuals. Looking at some of them really does feel like staring into a big empty space. One thing I liked from this short experience with coding is the ability to create generative coded visuals, in large quantities and in little time, which represent the visuals people have under the influence in much the same way, whereas I experience it directly through art.

Final Output

This is no different than attaching foil paper to a flat surface, but I think using the shades reasonably justifies the concept (it looks like a cheap version of a VR headset, a really cheap one): you cannot enter the void simply by looking at the visuals, you need a medium to feel the nature of it, hence the glasses.

I faced difficulties in that I only had one pair of glasses that was conductive, so for each visual I had to change the clip and record that part. I look at the project as a prototype for a different possible outcome.

Development Stage


1. By this stage I had already experimented with using foil paper as a controller, but a new idea struck me: to use a glasses frame as an interface.


2. The frame did work as an interface, but at some point it stopped conducting electricity. It could be because of my poor soldering of the touch sensor, or maybe because of the frame material itself. I solved the issue temporarily by wrapping foil around the frame and clipping the alligator clip to it. It worked better than before.


3. I used 4 clips as a touch interface. Working with coded visuals allows me to create any number of outputs, but I felt that keeping the experience simple and minimal might work better. This photo shows (left to right) a breadboard with an Arduino Nano 33 IoT, a touch sensor, alligator clips with foil attached, a laptop running the Arduino software and Processing, frames, and a foil sheet.

4. I faced issues while reattaching the clip (more shades could have helped). The connection got lost on its own, and I am not sure of the reason.

Github Code: https://github.com/Achal-OCAD/Experiement-3-


Project Context

Generating visuals creates a space of pareidolia. I like the abstractness of its nature, which allows me to explore whatever I want rather than being specific. This gives me the freedom not only to explore the visual outputs but also forces me to think about how the idea will be projected in a space. I have had the chance to learn about various artists and go through their works. I am learning how visual outputs need not be limited to a screen but can be achieved with different materials.

Iván Navarro, an artist from Chile, works with light, mirrors, and glowing glass tubes, creating an illusion of depth in a flat space while portraying messages related to the locations where he exhibits. His technique is interesting, and I wished to explore it in the previous assignment, which I could not, but his concept remains with me. His works have that feeling of the void, and even the non-animated materials seem like they are in a loop.

Ginger Leigh, also known as Synthestruct, is a new media artist. Her works have shown me the possibility of bringing similar kinds of work into large spaces and making them interactive. I learned how generative visuals can be replaced with lights and interacted with through depth-sensing cameras. Through her works I found out that Arduino can be connected to TouchDesigner the same way we did it with Processing.

Caleb Ogg, also known as Iso.Hedron, is an artist who works with code. I have been following his works for inspiration, and the code I worked with in this project is largely inspired by his. Before this, I was clueless about how fill() could be multiplied with integers and Pi values to create an infinite loop.
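The fill()-with-Pi trick mentioned above boils down to feeding a frame counter through sin() with a Pi-based multiplier, so the value cycles endlessly instead of running off the end of a range. The C++ sketch below is a minimal translation of that idea, not Iso.Hedron's actual code; the 120-frame period and grayscale mapping are illustrative choices.

```cpp
#include <cassert>
#include <cmath>

// A frame counter fed through sin() with a Pi-based multiplier yields a
// value that cycles forever; mapped to 0-255 it can drive a
// fill()-style grayscale that loops without any explicit reset.
int cyclingFill(int frame) {
    const double PI = 3.141592653589793;
    double s = std::sin(frame * 2.0 * PI / 120.0); // oscillates -1..1
    return static_cast<int>((s + 1.0) * 127.5);    // mapped to 0..255
}
```

In a Processing sketch the same expression would sit inside draw(), e.g. fill(cyclingFill(frameCount)), so the shade breathes in a 120-frame loop.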

I am now interested in learning about AR & VR methods because of the way I approached this concept. This experiment built a guiding bridge for me to understand how two different pieces of software interact with each other and how screen-based items respond to commands from the outside world.


“TOP 25 QUOTES BY JAMES TURRELL | A-Z Quotes.” A-Z Quotes, https://www.azquotes.com/author/21907-James_Turrell. Accessed 6 Nov. 2020.
Guida, Nello. Enter The Void – Official Trailer 2010 [HD]. YouTube, 10 Aug. 2010, https://www.youtube.com/watch?v=_tG_b5zaT9Y.
Contributors to Wikimedia projects. “Iván Navarro (Artist) – Wikipedia.” Wikipedia, the Free Encyclopedia, Wikimedia Foundation, Inc., 21 Oct. 2013, https://en.wikipedia.org/wiki/Iv%C3%A1n_Navarro_(artist).
“Synthestruct.” Synthestruct, https://www.synthestruct.com/. Accessed 6 Nov. 2020.
“Iso.Hedron.” Instagram, https://www.instagram.com/iso.hedron/?hl=en. Accessed 6 Nov. 2020.


Museums of Memory

Grace Yuan

Project Description

Museums of Memory is an interactive installation combining a tangible interface with a screen-based digital experience. Users are invited to explore three virtual museums by touching the objects in front of the screen. Each museum represents a piece of memory from my childhood, adolescence, and adulthood. The tangible interface consists of 3 black toy houses made of metal and a controller made with 4 crystal stone keys painted with metallic paint, all connected to an Adafruit MPR121 touch sensor and an Arduino Nano 33 IoT. The visual response on the screen is programmed in Processing 3.0, with assets created in various software including Blender, Rhino, After Effects, Photoshop, and Illustrator.

Users can choose to visit one of the three virtual museums on the screen by touching a toy house. After entering a museum, users can move a virtual hand up, down, left, and right to interact with the artifacts by touching the crystal stones. The artifacts in the museums are objects that are important and iconic in my memory of different times. They illustrate the shifts of focus in my life – from nature and books, to art and painting, and finally to architecture and work. The artifacts are displayed as holograms, as a form of virtual archive. The choice of the tangible interface is based on the concept of a virtual museum: by connecting the toy houses with virtual museums, physical space is extended into the digital screen, creating a canvas for possibilities and creativity. The crystal stone controller is placed on top of a white box, mimicking an artwork on a pedestal in a museum.
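The four-key navigation described above can be sketched as a mapping from a touched key to a step in x or y for the virtual hand. This is an illustrative C++ sketch of the screen-side logic, not the project's actual Processing code; the key ordering and the 10-pixel step are assumptions.

```cpp
#include <cassert>
#include <utility>

// One step of the virtual hand per crystal-key touch: keys 0-3 nudge
// the hand up, down, left, or right on the screen.
std::pair<int, int> moveHand(int x, int y, int key, int step = 10) {
    switch (key) {
        case 0: return {x, y - step}; // up
        case 1: return {x, y + step}; // down
        case 2: return {x - step, y}; // left
        case 3: return {x + step, y}; // right
        default: return {x, y};       // unknown key: no movement
    }
}
```

The Processing sketch would call this once per touch event reported over serial from the Arduino, then clamp the result to the museum's walkable area before drawing the hand.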


Final Images




How it Works

Development Images




Project Context

Coming from an interior and architectural design background, I’m always interested in the idea of creating virtual space and extending physical space. Between two-dimensional digital screens and three-dimensional real-life environments, I want to create work that brings the two together and opens up room for imagination. Recently, I was inspired by a mixed reality phone game that came out in October 2020 – HoloVista (https://www.enterholovista.com/). In the game, the main character, a junior architect, is tricked by an architecture firm and trapped inside a virtual house. The house is created by a computer algorithm based on data about her memory, feelings, and social relationships. The game is beautifully designed and absolutely immersive: players are not only seeing the virtual house but also taking photos, posting on the character’s social media, and texting with other characters. I was fascinated by the game, and it really inspired me with its game design and aesthetics. As I am passionate about 3D modeling and rendering, I wanted to create a virtual space as beautiful and creative as the game. The game also evoked strong emotion and sympathy for the character, which made me think of the subject of memory.

Another work that influenced my project is the 2020 online graduation show of the Academy of Arts & Design at Tsinghua University, China (https://exhibition.ad.tsinghua.edu.cn/). Due to Covid-19, many art schools in China created online exhibitions of student work, and this one from Tsinghua University impressed me the most. It’s a web-based virtual exhibition displaying the student work with animation and interaction. Based on the medium of each work, some students chose to showcase their work in a virtual museum setup and some chose to use the web page as a digital canvas. They used layering, a variety of scales, light, and shadow to create depth and build an immersive exhibition space. I really appreciate the way they curated the entire exhibition, and it inspired me to think of digital technology as a medium for creating space, especially under the current circumstances, as physical interaction and accessible space have become limited.

As I explored creating space for my memory in this experiment, for the next step I would like to keep building on this study and create space for others, for people who need space to express themselves. I also want to create a platform for people to build their own virtual space, and hopefully this will be incorporated into my thesis project.

Github link


Circuit Diagram



Academy of Arts & Design department at Tsinghua University, exhibition.ad.tsinghua.edu.cn/.

“HoloVista.” AconiteCo, www.enterholovista.com/.

“Tangible Interaction.” The Interaction Design Foundation, www.interaction-design.org/literature/book/the-glossary-of-human-computer-interaction/tangible-interaction.

Pownall, Augusta. “Otherworld Is an Immersive VR Arcade with Interiors Influenced by James Turrell.” Dezeen, 21 Aug. 2019, www.dezeen.com/2019/08/21/otherworld-vr-arcade-london-red-deer/.