By Jessie, Liam and Masha
Project description:
Mood Drop is an interactive communication application that connects individuals in different physical environments through the interaction of visual elements and sound on digital devices. It allows people to express and transmit their moods and emotions to others by generating melody and images from the interactions between users.
Mood Drop enables multi-dimensional communication, since melody naturally carries mood and emotion. It distinguishes itself from everyday long-distance communication methods such as texting by allowing people to express their emotions in abstract ways.
Furthermore, Mood Drop embodies elements of nature and time, which often play a role in people's emotions. By feeding in real-time environmental data such as the temperature, wind speed and cloudiness of a place, which affect variables within the code, it sets the underlying emotional tone of a physical environment. As people interact in a virtual environment that closely reflects aspects of the physical one, a sense of the telepresence of other people in one's own physical environment is created.
Code: https://github.com/lclrke/mooddrop
Process:
Modes and Roles of Communication
Having learned networking, we tried to come up with ideas for different modes of communication. Rather than every user having the same role, we hoped to explore varying the roles that could be played in a unified interaction: perhaps some people only send data and some only receive it, and we could use tags to filter which data each client keeps from the channel.
One idea we considered was a unidirectional communication method where each person receives data from one person and sends data to another person.
However, we didn't pursue this idea further because we couldn't justify the choice with a reason beyond it being interesting. We eventually settled on the idea of creating a virtual community where everyone is a member and can make the same contribution.
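For reference, the tag-filtering approach we considered could look roughly like the following sketch over a single shared PubNub channel; the channel name, the "role" tag and the drawDrop() helper are hypothetical stand-ins, not part of the final project:

// A sketch of role-based filtering over one shared PubNub channel.
// The channel name, the "role" tag and drawDrop() are illustrative.
const pubnub = new PubNub({
  publishKey: 'YOUR_PUBLISH_KEY',
  subscribeKey: 'YOUR_SUBSCRIBE_KEY',
  uuid: 'user-123'
});

pubnub.subscribe({ channels: ['mooddrop'] });

pubnub.addListener({
  message: function (event) {
    // Every client hears the channel but keeps only messages
    // tagged with the role it is meant to receive from.
    if (event.message.role === 'sender-A') {
      drawDrop(event.message.x, event.message.y, event.message.size);
    }
  }
});

// A sender tags each message with its role before publishing.
function sendDrop(x, y, size) {
  pubnub.publish({
    channel: 'mooddrop',
    message: { role: 'sender-A', x: x, y: y, size: size }
  });
}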
Ideation:
Once we had settled on the idea that everyone has the same role and had figured out PubNub, we started brainstorming. We were all interested in creating an interactive piece involving both visuals and sound, so we explored the p5.js libraries for inspiration. The Vida library by Pawel Janicki gave us the idea of affecting the sound with motion detected by the web camera, but this would not work because video chat through PubNub was impossible (hence, no interaction).
Another thought was to recreate the Rorschach test: users would see a changing abstract image on the screen and share what they saw in it by typing to each other.
Finally we came up with the idea of an application that would allow users to express their mood across distance. Using visuals and sound, participants would be able to co-create musical compositions while far away from each other. We found existing code that became the foundation of the project, in which users could affect the sound by interacting with shapes using the mouse.
Next we built a scale using notes from a chord, with frequencies spaced so that the size of the shape generated by clicking would affect the mood of the transmitted sound: the lower part of the scale contains the chord's darker minor tones, while the top part focuses on its higher tones. The larger the circle, the more likely it is to play the lower minor roots of the chord. The final sound was simplified to a single p5.js oscillator with a short attack and sustain to give it a percussive character.
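A minimal sketch of this size-to-pitch mapping, using the p5.sound oscillator and envelope, might look like the following; the note frequencies (an A minor chord spread over two octaves) and the envelope settings are illustrative stand-ins, not the project's exact tuning:

// Size-to-pitch sketch: larger circles favor the low, minor end of the scale.
let osc, env;

// A minor chord spread over two octaves, ordered low to high (Hz).
const notes = [110.0, 130.81, 164.81, 196.0, 220.0, 261.63, 329.63, 392.0];

function setup() {
  createCanvas(400, 400);
  osc = new p5.Oscillator('sine');
  osc.amp(0);                      // the envelope controls loudness
  osc.start();
  // Short attack and sustain give the tone a percussive character.
  env = new p5.Envelope();
  env.setADSR(0.01, 0.2, 0.1, 0.3);
  env.setRange(0.8, 0);
}

function playDrop(diameter) {
  // Map larger diameters to lower indices, i.e. lower minor roots.
  const i = floor(map(diameter, 10, 200, notes.length - 1, 0, true));
  osc.freq(notes[i]);
  env.play(osc);
}

function mousePressed() {
  userStartAudio();                // browsers need a gesture to start audio
  const d = random(10, 200);       // stand-in for the generated shape's size
  ellipse(mouseX, mouseY, d);
  playDrop(d);
}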
Working on visuals:
As we started working on the visual components of the piece, we decided to try the 3D renderer in p5.js. We were looking for a design with a strong, clean sense of interaction when shapes connected in digital space. We also imagined the sound as a 3D object that can exist in multiple dimensions and move in many directions, and we experimented with many shapes, colors and textures.
Simplifying shapes and palette:
An important moment occurred when we were all interacting with the same page independently of each other at home. While working on small code details, we soon found ourselves playing with each other in an unplanned session, which created an exciting moment of connection. We pivoted away from maximal visuals and sound after this to focus on that feeling, which we thought was important to emphasize. While working on the project side by side, we had wondered why being in separate rooms mattered for demonstrating the piece; this moment of spontaneous connection through our p5.js editor windows made us understand the idea of telepresence and focus on what we now saw as essential to the project.
We decided to return to a simple black-and-white draft featuring the changing size of a basic ellipse. The newer visuals did not clearly show the parameters of the interaction, as the relationships between shapes on screen were not as legible as with a basic circle.
By adding too many aesthetic details, we felt we were predefining aspects that could define the mood for a user. We found black and white to be the better palette, as we wanted to keep the mood ambiguous and open to user interpretation.
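A stripped-down sketch in that spirit might look like the following, where each click drops a white ellipse that grows and fades on a black field; the growth and fade rates here are illustrative values, not the project's:

// Black-and-white draft: clicks spawn ellipses that ripple out and fade.
let drops = [];

function setup() {
  createCanvas(600, 600);
}

function draw() {
  background(0);
  noFill();
  for (const d of drops) {
    stroke(255, d.alpha);
    ellipse(d.x, d.y, d.size);
    d.size += 1.5;                 // expand like a ripple
    d.alpha -= 2;                  // fade out over time
  }
  drops = drops.filter(d => d.alpha > 0);
}

function mousePressed() {
  drops.push({ x: mouseX, y: mouseY, size: 10, alpha: 255 });
}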
Project Context:
The aim was to create a connection between two different environments, and we looked to transfer something more than video and text.
Place by Reddit (2017)
This experiment involved an online canvas of 1000×1000 pixel squares, located at a subreddit called /r/place, which registered users could edit by changing the color of a single pixel chosen from a 16-color palette. After placing a pixel, a user had to wait for a period varying from 5 to 20 minutes before placing another.
The process of multiple people in different places co-creating one piece appealed to us, so we likewise designed something that enables people to feel a connection to each other. To push this idea further, we decided to create something in which visuals and sound work in harmony as a coherent piece as people interact. The interactions between people are represented in the virtual space by the animated interplay of the visual elements they create and by sound on a digital device.
Unnumbered Sparks: Interactive Public Space by Janet Echelman and Aaron Koblin (2014).
The sculpture, a net-like canvas 745 feet long and suspended between downtown buildings, was created by artist Janet Echelman. Aaron Koblin, Creative Director of Google's Data Arts Team, created the digital component, which allowed visitors to collaboratively project abstract shapes and colors onto the sculpture using a mobile app. We applied this app's simplicity and abstract shapes to our interface in order to make the process of interaction and co-creation more visible and understandable.
Telematic Dreaming by Paul Sermon (1993)
This project connects two spaces by projecting one directly on top of the other. The fact that Sermon chose two separate beds as the physical spaces raises interesting questions: it provokes a sense of discomfort when two strangers are juxtaposed in an intimate space, even though they are not really in the same physical place. The boundary between virtual space and physical space becomes blurred by this play with space and intimacy.
Inspired by this idea of blurring the boundary between two spaces, we thought we could use external environmental data from the physical space, visualized and represented in a virtual space on screen. That virtual space is in turn displayed on a screen that itself exists in a physical space. In this way, not only is the user connected to their own environment; the people interacting with them are also connected to that environment, because the virtual environment they act in is closely tied to data from the physical space. This blurs the line between the virtual and the physical as the two spaces become intertwined, generating an interesting sense of presence in both as users interact with each other.
We eventually decided to add a live Toronto weather API feed to mingle with our existing interaction elements. We used temperature, wind speed, humidity and cloudiness to affect the speed of the animation and the pitch and tone of the musical notes. For example, the animation and sound run at a faster speed at midday than in the morning as the temperature rises, which aligns with people's energy level and mental state, and potentially their emotions and mood.
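As a rough illustration of how such a feed can drive sketch parameters, the snippet below assumes the OpenWeatherMap current-weather endpoint; the mapping ranges and the animationSpeed and basePitch variables are our own stand-ins, not the project's exact values:

// Fetch Toronto weather once and map it onto sketch parameters.
const url =
  'https://api.openweathermap.org/data/2.5/weather?q=Toronto&units=metric&appid=YOUR_API_KEY';

let animationSpeed = 1;
let basePitch = 220;   // would feed the oscillator's frequency

function setup() {
  createCanvas(600, 600);
  loadJSON(url, applyWeather);
}

function applyWeather(weather) {
  const temp = weather.main.temp;      // °C
  const wind = weather.wind.speed;     // m/s
  const clouds = weather.clouds.all;   // % cloud cover
  // Warmer, windier conditions drive faster animation ...
  animationSpeed = map(temp, -10, 30, 0.5, 2, true) +
                   map(wind, 0, 15, 0, 0.5, true);
  // ... while heavier cloud cover pulls the base pitch down.
  basePitch = map(clouds, 0, 100, 330, 165, true);
}

function draw() {
  background(0);
  // Use the weather-driven speed, e.g. to pace a pulsing ellipse.
  const pulse = 50 + 25 * sin(frameCount * 0.02 * animationSpeed);
  stroke(255);
  noFill();
  ellipse(width / 2, height / 2, pulse);
}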
References:
Mohr, M. (2012). Manfred Mohr, Cubic Limit, 1973-1974. Retrieved November 17, 2019, from https://www.youtube.com/watch?v=j4M28FEJFF8
OpenProcessing. (n.d.). Retrieved November 18, 2019, from https://www.openprocessing.org/sketch/742076
Puckett, N., & Hartman, K. (2018, November 17). DigitalFuturesOCADU/CC19. Retrieved from https://github.com/DigitalFuturesOCADU/CC19/tree/master/Experiment4
Place (Reddit). (2019, November 4). Retrieved November 18, 2019, from https://en.wikipedia.org/wiki/Place_(Reddit)
Postscapes. (2019, November 1). IoT Art: Networked Art. Retrieved November 18, 2019, from https://www.postscapes.com/networked-art/
Sermon, P. (2019, February 23). Telematic Dreaming (1993). Retrieved November 18, 2019, from https://vimeo.com/44862244
Shiffman, D. (n.d.). 10.5: Working with APIs in Javascript – p5.js Tutorial. Retrieved November 17, 2019, from https://www.youtube.com/watch?v=ecT42O6I_WI&t=208s