Hello from the other side!

img_1231

Jie Guan, Unnikrishnan Kalidas

Hello from the other side! is a physical computing interface exhibiting telepresence. The idea is to physically transmit a simple hand wave from User A to User B. An Arduino Nano 33 IoT connected to an HC-SR04 ultrasonic sensor takes hand waves as an input signal from User A and transmits them over WiFi via PubNub to a remote User B, who has a servo-regulated hand that sweeps from 0 to 180 degrees upon receiving the signal. The servo is attached to the palm of the arm, which in turn moves as a waving hand. We further enhanced the project by adding a webcam to the palm of the physically computed hand at User B's end, to make the telepresence more visual.

The idea for this project was inspired by the shape-transmission work on tangible interfaces at the MIT Media Lab. We took the idea of capturing and transmitting a hand gesture and reduced it to a simple hand wave, received as a signal from one user and physically recreated as a computed hand at User B's end. In a time when people rarely meet each other in person, let alone wave at someone, we felt that a simple, physically computed hand wave would be especially relevant in a context where people meet more often virtually.
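
The wave itself travels as a small PubNub message. The actual sender and receiver are Arduino sketches using the PubNub Arduino client (see the GitHub links below); the sketch here only illustrates the same publish/subscribe flow using the PubNub JavaScript SDK, and the keys, channel name, and trigger distance are placeholders, not values from our code.

// Illustrative only: the real sender/receiver run on two Nano 33 IoT boards.
import PubNub from 'pubnub';

const pubnub = new PubNub({
  publishKey: 'pub-c-xxxx',   // placeholder keys
  subscribeKey: 'sub-c-xxxx',
  uuid: 'user-a',
});

// User B's side: react to wave messages by sweeping the servo 0 -> 180 -> 0.
pubnub.addListener({
  message: (event) => {
    if (event.message.wave) {
      console.log('Wave received: sweep servo from 0 to 180 degrees and back');
    }
  },
});
pubnub.subscribe({ channels: ['handwave'] });

// User A's side: publish a wave whenever the HC-SR04 reports a hand in range.
function onDistanceReading(distanceCm) {
  if (distanceCm < 20) { // assumed trigger distance
    pubnub.publish({ channel: 'handwave', message: { wave: true } });
  }
}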

Experience Video

How it works

Github

Jie

https://github.com/jieguann/cacexperiment5/blob/main/DistanSensorControl/Jie/Jie.ino

Unnikrishnan

https://github.com/jieguann/cacexperiment5/blob/main/DistanSensorControl/Unnikrishnan/Unnikrishnan.ino

Diagram

sensor-diagram servo-diagram

Images

whatsapp-image-2020-12-04-at-4-58-09-pm

whatsapp-image-2020-12-04-at-4-54-51-pm whatsapp-image-2020-12-04-at-4-55-46-pm

 

Development images

 

pinout-nano33iot_latest wechat-image_20201204163441 wechat-image_20201204163446 wechat-image_20201204163449 wechat-image_20201204163452

Context

 

Sheridan defined telepresence as the ideal of a user feeling physically present at a remote site through a human operator [3]. Objectively, telepresence extends a human's sense of presence to a mediated environment through a medium, rather than to the immediate physical environment [5]. At the beginning of his essay on telepresence, Draper introduces the Synthetic Environment (SE), which aims at projecting the perceptual, cognitive, and psychomotor capabilities of the user into a distant, dangerous, or simulated environment. Although both computer-mediated interaction and synthetic environments involve human-machine interface design, SEs subdivide into virtual reality systems, teleoperators, and telecommunications. Draper's paper focuses on the phenomena associated with SEs and the term telepresence, involving the conversion of the user's self-perception into a computer-mediated environment [2]. Based on this research on SEs, Draper expanded the definition of telepresence into three aspects: the simple, the cybernetic, and the experiential. The simple definition refers to the ability to operate in a computer-mediated environment, controlling machines over a distance. The cybernetic definition treats telepresence as an index of the quality of the human-machine interface, for example the operational characteristics of a human-computer-telerobot interface. In the experiential definition, telepresence means that a user feels physically present within a computer-mediated environment. Cybernetic telepresence focuses on projecting human capability into the computer-mediated environment, whereas experiential telepresence is the immersion of human consciousness into that environment [2].

 

The definitions of teleoperator and telerobot are key to understanding teleoperation. A teleoperator is a machine that enables a human operator to manipulate distant objects through its capabilities [4]. A telerobot, in turn, is a robot that receives instructions from a teleoperator over a distance; it usually has sensors and effectors for manipulation and/or mobility [1]. Teleoperation means using human intelligence to operate a robot at a distance through an adequate human-machine interface. A teleoperation system usually consists of two robot manipulators: the master arm, controlled by a human operator to generate commands, and the slave arm, the remote manipulator onto which those commands are mapped [1]. With the development of the Internet, applications of teleoperation have become increasingly extensive and indispensable. It has been used widely in hazardous and less structured environments such as space exploration, undersea inspection and maintenance, and toxic waste cleaning [1].

 

Our project is a simple example of telepresence: through a two-dimensional screen, users can project a limited view onto the remote site. At the same time, the hand-wave movement detected by the distance sensor drives the movement of the hand (with its attached webcam) on the remote site, which is an example of teleoperation. By combining these two ideas in our project, users can watch the real-time video move as they wave, yielding a modest experiment in immersive telepresence and teleoperation. On the remote site, the user sees the hand move whenever the other user waves, which provides a sense that the other site's user is present there.

 

 

REFERENCES

[1] J. Cui, S. Tosunoglu, R. Roberts, C. Moore, and D. W. Repperger. A Review of Teleoperation System Control. 2003.

[2] J. V. Draper, D. B. Kaber, and J. M. Usher. Telepresence. Human Factors, 40(3):354–375, 1998.

[3] T. B. Sheridan. Teleoperation, telepresence, and telerobotics: Research needs for space. In Human Factors in Automated and Robotic Space Systems, pp. 279–291. National Research Council, Washington, DC, 1987.

[4] T. B. Sheridan. Teleoperation, telerobotics and telepresence: A progress report. Control Engineering Practice, 3(2):205–214, 1995. doi: 10.1016/0967-0661(94)00078-U

[5] J. Steuer. Defining Virtual Reality: Dimensions Determining Telepresence. Journal of Communication, 42(4):73–93, 1992. doi: 10.1111/j.1460-2466.1992.tb00812.x

Virus v/s Vaccines

images-09

Project by Achal, Jamie, Krishnokoli, Simran

PROJECT DESCRIPTION 

As the COVID pandemic rages across the globe, the one thing humanity currently yearns for is a simple vaccine. Like a miracle, it is supposed to radically cure the ailment and bring us back to our normal lives. While Pfizer and BioNTech swiftly develop the golden elixir, the world can hardly wait to get back on its feet. Sitting in isolation, often away from friends, family, and loved ones, we develop feelings of loneliness and lose touch with our social identity and being. In this context, we have built a four-player game about creating the first COVID vaccine. Each player loads liquid from the bottles into their syringe by repeatedly clicking on it; the first player to complete this task is declared the winner of the vaccine race.

images-07

images-08

 

EXPERIENCE VIDEO

BEHIND THE SCENES VIDEO

NETWORK DIAGRAM

images-10

 

FINAL PROJECT IMAGES

screenshot-2020-11-20-at-10-02-13-pm

image_3    screenshot-2020-11-20-at-9-52-39-pm

 

PROJECT DEVELOPMENT IMAGES

image-4        image-5

image-3        image-2

microsoftteams-image-1

screenshot-2020-11-20-at-10-27-37-pm

pubnub

 

CODE LINK

Player 1 • edit • present

Player 2 • edit • present

Player 3 • edit • present

Player 4 • edit • present

PROJECT CONTEXT 

Social media and multiplayer games have proved extremely effective in connecting people, building a shared digital space for everyone who wishes to connect from the safe space of their homes. In this context, we set out to build a shared interactive digital environment for people to try out as they spend their time in isolation.

While working on this experiment we came up with multiple ideas. Some of us thought about expressing emotion through graphics to interact with other people; others thought about making an online jam room or a dance party controlled by individual body tracking; we also considered an online multi-user painting platform and a landscape design simulator. Unfortunately, most of these ideas dwindled for lack of clarity or technical feasibility.

While endlessly ideating, drawing, and configuring a workable idea, we were finally inspired by Nicholas Puckett's PubNub class example: a voting demo with two areas that users can click, each click incrementing a counter.

The example reminded us of some web-based games: when a user clicks a certain area, a number accumulates, and that function could be transferred to other features, for example colour. The accumulation of colour, similar to a progress bar, reminded us of racing games such as Road Rash and Need for Speed. Although those racing games are 3D, we could make our project a 2D game with p5.js.

For the background of the game, we chose COVID-19. At the beginning we thought of making an actual racing game; then we got the idea of injections and vaccines after looking through a variety of COVID-19 related news. The accumulating colour stands for the vaccine liquid, and the game is about filling the syringe with vaccine as users press a certain key on the keyboard.

To make the game more fun, more actions were added. Good game design includes a system of rewards and penalties to keep players motivated, so there is a congratulation effect to reward the winner. To add more interaction between players and to enrich the game, several disruptive actions were also added.

As for the technical methods of the project, PubNub is used to collect and broadcast the input of the game's different users. The whole game is organised around keyPressed and if/else logic: by pressing different keys, users can either increase the liquid in their own syringe or disturb other users with several effects. Within a certain period of time, the player whose syringe holds the most liquid wins, and the "congratulation" effects appear.
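
As a rough illustration of that structure, a minimal p5.js fragment in the spirit of our sketches might look like this; the channel name, keys, and increments below are placeholders rather than our actual values.

// Minimal sketch of the keyPressed + PubNub pattern; all names are placeholders.
const pubnub = new PubNub({
  publishKey: 'pub-c-xxxx',
  subscribeKey: 'sub-c-xxxx',
  uuid: 'player-1',
});
let fillLevels = [0, 0, 0, 0]; // syringe fill per player, 0-100
const ME = 0;                  // this sketch controls player 1

function setup() {
  createCanvas(400, 320);
  // Mirror every player's progress locally as PubNub messages arrive.
  pubnub.addListener({
    message: (event) => {
      fillLevels[event.message.player] = event.message.level;
    },
  });
  pubnub.subscribe({ channels: ['vaccine-race'] });
}

function keyPressed() {
  if (key === ' ') { // pump this player's syringe
    fillLevels[ME] = min(fillLevels[ME] + 2, 100);
    pubnub.publish({
      channel: 'vaccine-race',
      message: { player: ME, level: fillLevels[ME] },
    });
  }
}

function draw() {
  background(240);
  for (let i = 0; i < fillLevels.length; i++) {
    // Draw each syringe's liquid as a rising bar.
    rect(40 + i * 90, 300 - fillLevels[i] * 2, 40, fillLevels[i] * 2);
  }
}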

REFERENCES

  1. Katie Salen et al., “Rules of Play: Game Design Fundamentals”
  2. Lauren McCarthy et al., “Getting Started with p5.js”

Air Band

key-project-image

Band Members:  

Mairead Stewart - Synth

Bernice Lai - Accordion

Kristjan Buckingham - Drums

Clinton Akomea-Agyin - Guitar

Abhishek Nishu - Saxophone

Project Description:

AirBand is a networked series of digital ‘instruments’ that can be played through physical movement. To participate in AirBand, a player opens the p5 web editor and chooses to play one of the five instrument options: guitar, drums, saxophone, accordion, or synth. These instrument choices are flexible, allowing for any number of players on each instrument at a time. Next, the player is encouraged to dance around their space while the PoseNet face tracking library registers their position. A player moves from one side of the screen to the other to choose from a set of pre-loaded audio tracks of their chosen instrument, then claps their hands to make that track play. As more players join the band, their music can be heard by everyone else, meaning that though players may be on the other side of the world, they can still dance and play music with one another. While many aspects of the AirBand experience are the same for each instrument, the visual elements have been personalised so that players will see a different art experience on their screen for each instrument they play.

AirBand is not only a tool for creative expression, it is a real-time collaborative experience. As our social interaction increasingly moves to virtual spaces, experiences like these give participants an enjoyable alternative to the virtual meetings they are accustomed to. Rather than passively joining a video call, AirBand encourages participants to become active members of the group, physically contributing to a joint creative project.
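
For readers curious how the tracking maps onto the interaction, the core loop can be sketched as follows. This is a simplified stand-in assuming the ml5.js PoseNet wrapper in p5.js; the track names, confidence cut-off, and clap threshold are illustrative, not our actual values.

// Simplified stand-in for the AirBand interaction; thresholds are assumptions.
let pose;
const tracks = ['guitar-riff-1', 'guitar-riff-2', 'guitar-riff-3'];

function setup() {
  createCanvas(640, 480);
  const video = createCapture(VIDEO);
  video.hide();
  const poseNet = ml5.poseNet(video);
  poseNet.on('pose', (results) => {
    if (results.length > 0) pose = results[0].pose;
  });
}

function draw() {
  background(0);
  if (!pose) return;
  // Horizontal position of the face selects one of the pre-loaded tracks.
  const trackIndex = constrain(
    floor(map(pose.nose.x, 0, width, 0, tracks.length)), 0, tracks.length - 1);
  // A clap: both wrists detected with confidence and close together.
  const wristGap = dist(pose.leftWrist.x, pose.leftWrist.y,
                        pose.rightWrist.x, pose.rightWrist.y);
  if (pose.leftWrist.confidence > 0.5 &&
      pose.rightWrist.confidence > 0.5 && wristGap < 50) {
    console.log(`Clap: play ${tracks[trackIndex]} and publish it to the band`);
  }
}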

Experience Video:

https://ocadu.techsmithrelay.com/K5QW

Behind the scenes:

https://ocadu.techsmithrelay.com/Ld9Z

Network Diagram:network-2

Final Project Images: 

Nishu:

final-image-copy

Mairead:

mairead_final_image

Bernice:

bernice_final_image-1

Kristjan:

kristjan_final_image

Clinton:

clinton_final_image-1-copy

Development images: 

img_0256-copy

img_0257-copy

screenshot-2020-11-19-at-2-59-09-pm

screen-shot-2020-11-20-at-10-22-49-am

screenshot-2020-11-19-at-11-30-19-pm

Link to the code:

Kristjan’s Edit Link:

https://editor.p5js.org/kristjanb/sketches/ZCi4G6dDY

Kristjan’s Present Link: https://editor.p5js.org/kristjanb/present/ZCi4G6dDY

 

Mairead’s Edit Link:

https://editor.p5js.org/mairead/sketches/lc_bqnZK3

Mairead’s Present Link:

https://editor.p5js.org/mairead/present/lc_bqnZK3

 

Bernice’s Edit Link

https://editor.p5js.org/3149332/sketches/48Ny1AaI9

Bernice’s Present Link:

https://editor.p5js.org/3149332/present/48Ny1AaI9

 

Nishu’s Edit Link

https://editor.p5js.org/Abhinishu/sketches/H53lh0FlG

Nishu’s Present Link:

https://editor.p5js.org/Abhinishu/present/H53lh0FlG

 

Clinton’s Edit Link

https://editor.p5js.org/clinagyin/sketches/15PAtJyRd

Clinton’s Present Link:

https://editor.p5js.org/clinagyin/present/15PAtJyRd

 

Project Context:

AirBand is based on a number of projects that aim to create a shared space for remote musical collaboration. Although the challenge of connecting musicians virtually is hardly new, the pandemic has drastically increased demand for such a service.

Latency seems to be the primary issue most commonly addressed. With a reliable high-speed internet connection and a direct ethernet connection rather than WiFi, some platforms such as JamKazam are able to reduce latency to an acceptable degree, but it is not currently possible to achieve synchronous real-time playback (BandMix). The most popular workaround is to use a base track that each musician plays along to separately; all of the recordings are then layered together manually, rather than actually playing live (Pogue).

For those who do choose to navigate the latency issues and play together live remotely, another challenge that is less commonly addressed is access to instruments. It is often assumed that most musicians will have their instruments at home, but that is not always the case. The Shared Piano from Google Creative Lab, which was the main inspiration for this project, addresses this issue by letting users play music by plugging in an electronic piano or by using their computer keyboard (Google Creative Lab).

21 Swings, created by Mouna Andraos & Melissa Mongiat and exhibited yearly in Montréal, does not connect users remotely, but it does allow various users to contribute to a shared musical experience in a physical way without the use of traditional instruments. Instead, a series of swings trigger distinct notes, prompting users to work together to create a composition (“21 Balançoires (21 Swings)”).

AirBand attempts to expand on the Shared Piano and 21 Swings by allowing each user to “play different instruments” by triggering distinct sounds remotely. PoseNet was incorporated in order to introduce more physicality to the experience, encouraging participants to dance and move around. Unique corresponding visuals can also be customized by each collaborator, creating a more tailored experience.

Whether playing over base tracks to simulate live jams, using technology to simulate instruments, or finding other ways to work around the lag, people are determined to find ways to come together and collaborate on music remotely (Randles). This endeavor is leading to some amazing innovations and many impressive creative solutions.

Works Cited

 “21 Balançoires (21 Swings).” Vimeo, uploaded by Daily tous les jours, 24 Apr. 2012, https://vimeo.com/40980676.

Google Creative Lab. “Shared Piano.” Experiments with Google, June 2020, https://experiments.withgoogle.com/shared-piano.

Pogue, David. “How to Make Your Virtual Jam Session Sound—and Look—Good.” Wired, 4 Jun. 2020, https://www.wired.com/story/zoom-music-video-coronavirus-tips/. Accessed 18 Nov. 2020.

Randles, Clint. “Making music at a distance – how to come together online to spark your creativity.” The Conversation, 13 Apr. 2020, https://theconversation.com/making-music-at-a-distance-how-to-come-together-online-to-spark-your-creativity-135141. Accessed 18 Nov. 2020.

“Virtual Jamming Tips.” BandMix, https://www.bandmix.com/virtual-jamming/. Accessed 18 Nov. 2020.

 

Simon Says ‘Message Received’: Light The Beacons

Group 04 Members: Candide Uyanze, Greg Martin, Jay Cooper, Patricia Mwenda

PROJECT DESCRIPTION 

Simon Says: Light the Beacons is an interactive, sequential multiplayer game built around patterns of shapes, sounds, and colours. The first player starts a level by creating a short three-element pattern, which each player in turn must replicate and extend with a single new input. Players score points for successfully replicating the pattern, but getting the pattern wrong disqualifies the player and restarts the game at a new level, with the difficulty increasing at each new level.

The game has four rounds and four different prompt messages that express escalating urgency. The first round starts with a warning message in calm, blue monotones, giving a sense of calm and low urgency. The next level's message is urgent, using orange monotones with shapes whose side count is increased to convey more urgency. The third level pairs a danger message with rotating, spiky shapes in yellow monotones. The final level is the most difficult: a critical message with red, sharply spiked shapes that rotate rapidly to give a sense of alertness and extreme urgency.
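
The pattern-passing rule at the heart of the game can be expressed in a few lines. This is a schematic JavaScript sketch with hypothetical names and scoring, not our exact implementation.

// Schematic Simon-says rule: replicate the shared pattern, then extend it.
let sequence = []; // the shared message: ordered shape/sound IDs
let attempt = [];  // the current player's reproduction
let score = 0;

function restartLevel() {
  sequence = [];
  attempt = [];
}

function registerInput(id) {
  attempt.push(id);
  const i = attempt.length - 1;
  if (i < sequence.length) {
    // Still replicating: one wrong entry disqualifies and restarts the level.
    if (attempt[i] !== sequence[i]) restartLevel();
  } else {
    // Replication complete: this input is the player's new addition.
    sequence = attempt.slice();
    attempt = [];
    score += 1;
  }
}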

 EXPERIENCE VIDEO 

HOW IT WORKS VIDEO 

NETWORK DIAGRAM 

picture-1

FINAL PROJECT IMAGES 


DEVELOPMENT IMAGES


 

 

 

Initial Game Ideas

traditional-communication

Initial Communication Idea

Initial Sketch Ideas


First Iteration and Testing


Second Code Iteration and Testing


Third Iteration and Testing

LINK TO CODE

Present Link: https://editor.p5js.org/Grgmrtn/present/nHvUAClPa

PROJECT CONTEXT

Early into ideation, we realized that we wanted to find a fun way of incorporating real-world communication methods, as it could work well with the shared-spaces-of-play theme for this experiment. We determined that traditional methods of communication, especially those used across vast distances, would likely provide innovative ideas for the experiment. In seeking inspiration, African communication particularly dominated the discussion with its wealth of methods employed by different groups across the continent. Examining African traditional communication as a form of mass communication, Delta State scholar Elo Ibagere notes that “it has served the African’s purpose effectively in various capacities… there remains an African communication system with a distinct character, fashion to suit Africans which may not be identified in any other place in the world” (4, 2020). In the same paper, Ibagere notes that:

The effect of Westernization which has now translated to globalization is quite devastating to Africa in the sense of a lamentable and, sometimes, deliberate alteration or outright destruction of values and norms of African people and societies. Such impact has affected the communication system to the extent of almost obliterating it in the urban areas, with only vestiges of the system left in the cities. However, the system continues to remain paramount in rural areas where the population relies on the system to satisfy their communication needs. (4, 2020)

The revelation that these traditional models are both dying out and still heavily relied upon cemented their use within our experiment. From the plethora of communication methods before us, the talking drums stood out as the most captivating. Talking drums have their own method of communicating through rhythm and tone, typically using phrases to express an idea (Gleick, 18, 2011). Talking drums, such as the doodo, are employed primarily in forested areas, especially in the West African region, as their rhythms can travel far through hollow environments (Adeola, 2019). These drums could be used for a wide variety of purposes, from announcing a birth to warning of invaders. We decided to take the rhythmic percussion of the drums to create our own language of shapes, strung together to create messages.

Further research revealed that these drums were often used to send messages from village to village. Author James Gleick explains in the Talking Drums chapter of his book, The Information, “through the still night air over a river, the thump of the drum could carry six or seven miles. Relayed from village to village, messages could rumble a hundred miles or more in a matter of an hour” (19, 2011). This presented us with the gameplay element of communication: the importance of being able to replicate the pattern heard, to then be passed off to the next drummer, in what would play like a game of Simon Says (or variants such as Follow the Leader). Furthermore, Gleick explains the uniqueness of each pattern, as: “only some people learned to communicate by drum, [but] almost anyone could understand the messages in the drumbeats. Some people drummed rapidly and some slowly. Set phrases would recur again and again, virtually unchanged, yet different drummers would send the same message with different wording” (20, 2011). This allowed us to add an extra dimension of difficulty and customization by having each player add a beat to the message after replicating it, mimicking the variance in message from drummer to drummer.

Lastly, we did not want to ignore the digital aspect of this experiment, so we decided to augment the traditional communication method with digital elements, specifically communication through the colours and shapes of the screen-based objects. While colour psychology is perhaps more art than science, “researchers and experts have made a few important discoveries and observations about the psychology of color and the effect it has on moods, feelings, and behaviors” (Cherry, 2020). Cooler colours like blue and purple are associated with feelings of calm, whereas warmer colours like red and orange evoke feelings of aggression, danger, and excitement. These principles were applied to our level progression, building up in urgency towards the end. Additionally, we adjust the shapes to match this progression: as the levels become increasingly urgent, the shapes' sides transform into spike-like protrusions intended to signify danger, and the shapes speed up, creating more visual information to process. These elements were added to increase the player's stress, much as they might feel if they were really propagating an urgent message.

BIBLIOGRAPHY 

Adeola, N. B. (2019, July 24). How ancient Africans communicated over long distances way before the telephone. Face2Face Africa. https://face2faceafrica.com/article/how-ancient-africans-communicated-over-long-distances-way-before-the-telephone1.

Cherry, K. (2020, May 28). Can Color Affect Your Mood and Behavior? Verywell Mind. https://www.verywellmind.com/color-psychology-2795824.

Gleick, J. (2012). Drums That Talk. In The Information: A History, a Theory, a Flood. Fourth Estate.

Ibagere, E. (2010). Introduction. In Introduction to African traditional communication system. University Printing Press, Delta State University.

 

 

 

 

Splash of Greens By Group 2

coverpage

screen-shot-2020-11-20-at-4-52-04-pm

Group 2 Members: Xin Zhang, Jie Guan, Unnikrishnan Kalidas, Grace Yuan

Project Description

Splash of Greens is a remote multi-user digital experience that lets each user remotely water a plant specific to them while it grows in a virtual garden in real time. All users with access to the desktop listener link can view the virtual garden and watch it grow as each remote player waters their respective plant. A control bar, visible on each user's individual control screen, regulates the amount of water poured onto the plant. As the user repeatedly touches the 'Water' button, the green portion of the bar shrinks, signifying the maximum amount the plant can be watered; the watering is further visualized by the plant growing to full size on both the user's control screen and the desktop listener screen. We chose four different plants, namely Blackberry, Snake Plant, Coral Cactus, and Monstera, all growing within the same virtual garden but remotely watered by different users. In these tough times of COVID-19, people interact mostly on screens; our project is a reminder of the earthly and humane things in life, as basic as watering a plant.
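
A rough sketch of the controller side follows; the button bounds, step size, channel name, and keys are assumed placeholders rather than our actual values. The desktop listener subscribes to the same channel and redraws the garden as growth messages arrive.

// Controller-side sketch with placeholder values.
const pubnub = new PubNub({
  publishKey: 'pub-c-xxxx',
  subscribeKey: 'sub-c-xxxx',
  uuid: 'monstera-controller',
});
let waterLeft = 100; // green portion of the control bar

function setup() {
  createCanvas(360, 480);
}

function mousePressed() {
  if (overWaterButton(mouseX, mouseY) && waterLeft > 0) {
    waterLeft -= 5; // each tap pours a fixed amount
    pubnub.publish({
      channel: 'virtual-garden',
      message: { plant: 'monstera', growth: 100 - waterLeft },
    });
  }
}

function overWaterButton(x, y) {
  return x > 130 && x < 230 && y > 400 && y < 440; // hypothetical bounds
}

function draw() {
  background(255);
  rect(40, 40, waterLeft * 2.8, 20); // remaining-water bar
  rect(130, 400, 100, 40);           // the 'Water' button
}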

Final Images

listener phone-screen

Experience  Video 

How it works video 

 

Development Images

develop1 develop2 develop3 develop4

 

Diagrams

 

screen-shot-2020-11-20-at-4-52-13-pm

screen-shot-2020-11-20-at-4-52-22-pm

Code Links

Desktop Listener:

Present: https://editor.p5js.org/guanjie429102905/present/fqEpy0yGO 

Edit: https://editor.p5js.org/guanjie429102905/sketches/fqEpy0yGO

Controller 1 – Snake Plant:

Present: https://editor.p5js.org/xinzhang.jessy/present/aDOZc82RC

Edit: https://editor.p5js.org/xinzhang.jessy/sketches/aDOZc82RC

Controller 2 – Monstera:

Present: https://editor.p5js.org/unnikrishnankalidas/present/ECe_449mM

Edit: https://editor.p5js.org/unnikrishnankalidas/sketches/ECe_449mM

Controller 3 – Euphorbia lactea:

Present: https://editor.p5js.org/guanjie429102905/present/_mFu7mcMS

Edit: https://editor.p5js.org/guanjie429102905/sketches/_mFu7mcMS

Controller 4 – Blackberry:

Present: https://editor.p5js.org/grace.yuan/present/zXHs51ira

Edit: https://editor.p5js.org/grace.yuan/sketches/zXHs51ira 

Project context

Morpheus, a character in the science fiction film The Matrix, declares that “The Matrix is everywhere. It is all around us. …” (1999). As humans, we can be considered part of a matrix of our own, interacting with information, each other, and our context. We think the presentation of human-information interaction on the Internet is similar to the Matrix's concept: the physical, material touch is encoded on the screen, shared over the Internet, and decoded for presentation in a computer-generated environment. The Internet affects many facets of human life; as Steyerl said, “Never before have more people been dependent on, embedded into, surveilled by, and exploited by the web. It seems overwhelming, bedazzling, and without an immediate alternative. The Internet is probably not dead. It has rather gone all-out. Or more precisely: it is all over!” (1). We noticed that during the pandemic in 2020, an increasing number of people have depended on the Internet for working, studying, and shopping. The Internet has already entered every aspect of human life; it is everywhere in our space and all around us. 'Splash of Greens' provides a virtual collaboration space for users to grow plants through the Internet, with multiple controllers in different locations sending messages to the virtual garden on the webpage. The project shows how we transformed materialized gardening behaviors into screens and information in virtuality, presenting communication over the Internet in a way similar to the Matrix.

Teleporting An Unknown State is a project created by Eduardo Kac in 1994, presenting the Internet as a life-supporting system. In a very dark room, a pedestal with earth serves as a nursery for a living plant. Through a video projector suspended above and facing the pedestal, remote participants send light from the sky of remote cities via the Internet, in real time, enabling the plant to photosynthesize and grow in total darkness. Our project can also be considered a telepresent experiment with plants: rather than projecting the physical sky onto the plant, we teleoperate virtual water for the plant growing on the screen.

Moon is an online collaborative art experiment by Chinese artist Ai Weiwei and Danish-Icelandic artist Olafur Eliasson. It is a web-based artwork inviting users to leave their mark, drawn or written, on a virtual moon's surface as a shared platform. Moon's open call for creative input is a powerful statement about the potential for ideas to connect people across the world, breaking distance-based, political, social, and geographical boundaries in the Internet age. Although our work does not ask users to leave a mark on the virtual space, it provides a controller for users to grow plants in the virtual space from various locations.

 Works Cited

“Moon • Artwork • Studio Olafur Eliasson.” Studio Olafur Eliasson, olafureliasson.net/archive/artwork/WEK108821/moon. 

Steyerl, Hito. “Too much world: Is the Internet dead?.” E-flux journal 49 (2013): 1-10. 

Teleporting An Unknown State, www.ekac.org/teleporting_an_unknown_state.html.

Wachowski, Lana, and Lilly Wachowski. The Matrix. Warner Bros., 1999. 

Liquid Keyboard – DripDrops

Liquid Keypad – DripDrops
Kristjan Buckingham

img_1033

The Liquid Keypad is a play on the conventions of Tangible User Interfaces, utilizing a series of “Liquid Keys” that the user can interact with by dipping their fingers into. Water is typically not a friend to exposed electronics, so there is a precarious aspect that adds a bit of tension to the experience. Although the water could certainly damage the electronics if not handled carefully, there is no actual danger to the user. The Liquid Keyboard is used to control effects in the program DripDrops, which mimics a soothing dripping effect that the user can manipulate, creating a direct connection between what the user is feeling by touching the Liquid Keys, and the visual and auditory representation on their device. Each Liquid Key plays a slightly different sound, inviting the user to explore different combinations of key presses. The drips also grow as long as the user holds their finger in the water, and the background “dampens” by getting darker as more keys are pressed. This calming interaction is meant to subvert the user’s initial hesitation and call attention to extreme affordances of water as a medium.
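
DripDrops itself is written in Processing; the fragment below re-sketches its two core behaviours in p5.js purely for illustration, with keyboard keys standing in for the four capacitive Liquid Keys and all sizes made up.

// p5.js re-sketch of the DripDrops behaviour (the project uses Processing).
const PAD_KEYS = ['A', 'S', 'D', 'F']; // stand-ins for the capacitive pads
let dripSize = [0, 0, 0, 0];

function setup() {
  createCanvas(520, 400);
  noStroke();
}

function draw() {
  let held = 0;
  for (let i = 0; i < PAD_KEYS.length; i++) {
    if (keyIsDown(PAD_KEYS[i].charCodeAt(0))) {
      dripSize[i] = min(dripSize[i] + 0.5, 80); // drip grows while submerged
      held++;
    } else {
      dripSize[i] = 0;
    }
  }
  background(220 - held * 40); // the scene "dampens" as more keys are held
  fill(120, 160, 255);
  for (let i = 0; i < dripSize.length; i++) {
    ellipse(90 + i * 120, 200, dripSize[i]); // one growing drop per Liquid Key
  }
}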

Experience Video: https://vimeo.com/478125809

How-It-Works Video: https://vimeo.com/478124814

 Project Images
 dd1

A different drip sound effect plays when each Liquid Key is touched and the corresponding drop begins to grow while the user’s finger is submerged.

dd2

As the user dips more fingers into more Liquid Keys, the drips make a musical effect and the background “dampens” by getting darker.

Development Images

dd3

dd4

dd5-1

dd6

Link to Arduino Code for Liquid Keyboard:
https://github.com/buckinghamk/DIGF6037_ex3_BuckinghamKristjan_Arduino

Link to Processing Code for DripDrops: https://github.com/buckinghamk/DIGF6037_ex3_BuckinghamKristjan_Processing

 Fritzing Circuit Diagram

 screen-shot-2020-11-11-at-1-14-26-am

Project Context

When deciding what material to use for this project, I was considering the affordances of various substances and how those might be utilized in unexpected ways. Using conductive touch sensors presented its own set of limitations as well. The choice of using water as a Tangible User Interface was meant to be somewhat counter-intuitive to the traditional incompatibility of water and exposed electronics, challenging the expectations of the user.

There are some interesting examples of different approaches that others have taken to try and integrate water as part of a Tangible User Interface. Garden Aqua utilizes high pressure jet-streams of water to imitate levitation of objects (Wenjun). In this case, sensors track the user’s hand gestures to create a response and do not interact with the water directly.

Another study exploring liquid as a Tangible User Interface focused on the chemical composition of water and how different substances could be added to trigger different responses (Hotta). This exploration integrated a variety of materials with some interesting results, but again there is no physical contact between the user and the liquid.

In this project, the user only ever comes into contact with the water itself. Conductive tape is submerged in the water and triggers the capacitive sensor when the user dips their finger in. Precautions were taken to ensure the water has very little chance of coming into contact with the circuit board or laptop, but care on the part of the user is still required. I wanted to make sure there was no chance of harm to the user and very little chance of harm to the electronics.

I think there is an interesting cognitive play between the initial discomfort of seeing exposed wires attached to a tray of open water, and the familiar feeling of dipping one’s fingers into water. There is certainly a limited context where this kind of interaction may be viable, but I think it is an effective subversion of traditional interaction.

Works Cited

 Holmquist, L. (2019). The future of tangible user interfaces. Interactions. Retrieved November 9, 2020, from https://interactions.acm.org/archive/view/september-october-2019/the-future-of-tangible-user-interfaces

Hotta, M., Oka, M., & Mori, H. (2014). Liquid tangible user interface: Using liquids in TUI. HIMI 2014, Part I.

Wenjun, G., Seungwook, K., Sangung, Y., Minkyu, C., Seungha, Y., & Kyungown, L. (2017). Garden aqua: Tangible user interface study using levitation. International Journal of Asia Digital Arts & Design.

 

 

 

Frog You

Frog You by Jay

frogyou
Such a cute… frog?

Project description

Frog You is a game created in Processing that is designed for play with a tactile interface. This clone of the 1981 Konami arcade game, Frogger, places you in control of a randomly chosen animal (none of which are frogs), with a mission to cross a busy highway and use boats to cross a raging river. Why does the frogger cross the road? To get to the other side, of course! Upon reaching the end of a level, your character moves on to the next, where they face a distinctively new challenge, as the obstacles are randomly generated. At the top of the screen, you can see your Level and Score, which is your total consecutive forward jumps without being smooshed, encouraging careful, but persistent, gameplay. The game comes with a controller, powered by an Arduino using touch sensor input, designed to add an additional layer of difficulty as you learn to orient yourself to counterintuitive controls. Those of us who play games a lot can easily take for granted our familiarity with controllers, but with this game, I sought to use the tactile interface as a means of forcing the player to “re-learn” how to use a controller. The game is quite difficult early on, but its rapid iteration time (and the consolation prize of cute animal avatars) keeps the player invested long enough to improve.

Experience Video

https://drive.google.com/file/d/1QhWHIgnuXwmdcRN6qnVPaF5Yu6wgHku3/view?usp=sharing

How It Works

My microphone was determined not to work, so I've captioned the images with what I'm doing.

123914532_3524904954256838_2445554533621851887_n1
The game treats cars and boats as the same object, but detects collision differently depending on what lane they are in.
124032079_552802885585897_5594261537067028696_n1
The Arduino is taped to the back of the controller.
124064614_373377970531906_416616692752811729_n
The “buttons” point in the direction your character moves, but their positions are inverted. This adds an extra layer of difficulty, as the user has to learn how to use the controller.
124262192_1018799285259723_2348292965848266396_n
“Dammit! Oh, look, a giraffe.” The character image changes with every reset. There are 10 total character images, randomly selected, so the player isn’t as demoralized by losing.
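
The lane rule in the first caption above is compact enough to sketch. The game itself is written in Processing; this is a JavaScript paraphrase with an assumed lane numbering (road lanes first, river lanes after), not the actual code.

// Cars and boats are the same object type; only the lane changes the meaning.
function checkLane(player, obstacles) {
  const inLane = obstacles.filter((o) => o.lane === player.lane);
  const hit = inLane.find((o) => overlaps(player, o));
  if (player.lane >= 4) {          // assumed: lanes 4 and up are the river
    if (!hit) return 'drowned';    // in the water and not on a boat
    player.x += hit.speed;         // ride the boat downstream
  } else if (hit) {
    return 'smooshed';             // the same overlap on the road is a car hit
  }
  return 'alive';
}

function overlaps(a, b) {
  // Same lane implies same y, so a 1-D x-axis check is enough.
  return a.x < b.x + b.w && a.x + a.w > b.x;
}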

Progress Pictures

123715588_274530833977456_5008266504148897249_n
Started by grabbing assorted cardboard from my basement
123735003_946237742448691_5576081387923882288_n
Cut away until I had something roughly the size of a controller. A bit awkward to hold, but that’s also kind of the point
123916975_383551679460074_3310705655851602773_n
What’s wrong with the control scheme? It tells you exactly what it does
124570331_1245097565868163_624833726674875065_n
Taped the Arduino to the back. This encourages users to hold the box in a way resembling a controller, rather than the way they would hold a phone.

Link

GitHub:

https://github.com/JayTheDaniels/Frog-You

For a standalone version of the game, check out my itch.io page!
https://jaythedaniels.itch.io/frog-you

Circuit Diagram

experiment3_circuit

Project Context

The context of this project exists less in relation to other works, and is more in relation to my own aspirations, particularly the upcoming thesis project. I had spent the last few months slowly trying to self-teach game design and development, but didn’t feel as though I was making much progress. Just before the start of Experiment 3, I realized this was due to not having actually made anything, instead only watching lectures and tutorials. For this project, I had my own side-mission of making a proper game as the screen-based interaction component, and using a tactile interface to augment the player’s experience in some way. Admittedly, I spent far more time conceptualizing the former than the latter, but there was one key idea that I wanted to explore using the two: how to effectively teach playing games.

Have you ever seen someone who doesn’t play games hold a controller? It’s almost always positioned awkwardly: index fingers placed on joysticks, thumbs off controller entirely, trigger buttons largely disregarded. Even those who play games more regularly may hold controllers in different ways. PC Gamer magazine found that amongst its own writing staff there were three different ‘styles’ for holding controllers, and that’s before introducing the awkward holding of a Nintendo 64 controller, with a joystick in the dead center (PC Gamer, 2020). Games teach players how to use the controls for the specific game, but few seem to actively teach how the controller should be held. For the inexperienced player, this can add an unexpected level of difficulty and may increase resistance to investing time to learn. For the experienced player, using more efficient styles such as ‘the claw’ grip, this can have negative impacts on health, as: “[w]hen you use the claw grip on a console controller (Xbox, Playstation, or any other), there are five muscles that may be involved. The fashion in which these muscles are engaged creates the potential for pain and injury risk—specifically, in your index finger” (Esports Healthcare). The variance in console manufacturer controllers further adds to the confusion, as the way one holds a Nintendo Switch is different from a PlayStation controller.

The goal of the tactile interface component was to create something that would force the player to learn how to use the game controller. In order to achieve this, the controller needed to be changed in a way that would prove troublesome for experienced gamers and novices alike. I quickly decided to change the typical layout of the controls, directional keys pointing outwards, to its reverse: directional keys pointing inwards. As I played through it, I had to be consciously aware of what each button did, as muscle memory was rendered unreliable. This demonstrates the importance of design decisions made for the screen-based interaction. I intended to encourage an iterative approach to learning the controls: provide players with a simple challenge (avoid cars, ride boats), give a reward for failing (random animal graphic), and provide metrics to show improvement (Level and Score). Because levels are generated semi-randomly, the player can continue for as long as they like, developing their mastery over the controller as they progress. In an effort to guide the player into holding the controller in a specific way, I decided to place the Arduino directly on the back. This was perhaps not optimal, but done as a response to a tester holding the controller with one hand, and using the other to touch the directional pads (which worked brilliantly and is still under consideration).

Lastly, this experiment provided me with the opportunity to apply some of the game design theory I had acquired, but had yet to put into practice. The experience of having put together a short game with a specific purpose invigorated interest in doing something similar for the thesis project. While it may not be clear yet what the purpose will be, I feel especially confident that I can produce a game and apply theory to it in a meaningful way for my thesis.

Works Cited:

Esports Healthcare (2020). Claw grip for controller users: 7 steps to stay healthy. Esports Healthcare. Retrieved from: https://esportshealthcare.com/claw-grip-for-controller/#What_is_the_claw_grip_for_controller_users

PC Gamer (May 16, 2020). How do you hold a controller? PC Gamer. Retrieved from: https://www.pcgamer.com/how-do-you-hold-a-controller/

 

Visual Drumstrokes by Unnikrishnan Kalidas

key-image

About the Project:

This experiment was created as an add-on to my drumming hobby: it uses tactile hand touches on a conductive surface as a drum pad. The idea is not only to emulate specific drum sounds for different electrodes but also to trigger a visual for each signal. I have a particular sound I go for when playing the drums, and the standard sound that a kit emulates is not enough. Due to the present COVID situation I was not able to access a drum studio, but I later plan to connect this system to a live setup, with a screen to exhibit the visualization and an amplifier for the additional sound I plan to layer onto the existing drum sound, using Piezo sensors instead of the MPR121 because of its latency. This could later be used for actual musical gigs. The idea is to create a more engaging experience of beats, in terms of both sound and visuals. For this project, I planned a two-electrode system: one to emulate the bass drum sound and one to emulate a snare drum sound.
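
The screen side of that mapping is small enough to sketch. The project itself pairs an Arduino Leonardo (sending keystrokes from the MPR121 pads) with a Processing sketch; the fragment below shows the same idea in p5.js, with the keys and sample file names being assumptions, and loadSound requiring the p5.sound library.

// p5.js illustration of the two-electrode mapping (not the project's code).
let kick, snare;
let flash = 0;

function preload() {
  kick = loadSound('kick.wav');   // placeholder bass-drum sample
  snare = loadSound('snare.wav'); // placeholder snare sample
}

function setup() {
  createCanvas(400, 400);
}

function keyPressed() {
  if (key === 'z') { kick.play(); flash = 255; }  // bass-drum electrode
  if (key === 'x') { snare.play(); flash = 255; } // snare electrode
}

function draw() {
  background(0);
  fill(255, flash);
  ellipse(width / 2, height / 2, 200); // momentary beat animation
  flash = max(flash - 15, 0);          // fade out after each stroke
}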

A simple project that inspired me to do this was www.patatap.com, an online website that produces random high-fidelity sounds on keyboard keystrokes, along with a momentary animation characteristic of each beat.

Experience Video:

https://www.youtube.com/watch?v=H0P8gQQJouM

 

How it Works Video: 

https://www.youtube.com/watch?v=ee-6GHXOPGY

 

Final Project images :

touchpads

final-image

key-image

 

Development Process:

Arduino Leonardo-

img_0030

 

MPR121 Sensor-

img_0033

 

Sample touchpads-

touchpads

 

Trial Video for Keystrokes:

https://www.youtube.com/watch?v=gpfnAl0rmS0&feature=youtu.be

 

GitHub link:

https://github.com/Unnikrishnankalidas/Tangible-interface

 

Circuit Diagram 

exp_3_circ

Project Context

The concept for Visual Drumstrokes came about with the sole idea of using a tangible action like touch to produce a visual and aural signal; it could be music, an instrument, or even just noise. My interest in drumming as a hobby is what led me to think in this direction and create a drum sampling machine that can use any capacitive surface as a trigger point. As I had joined the course comparatively late, I had some restrictions in terms of my coding knowledge and the equipment at my disposal. Thankfully, I already had a few of the components, and with the help and support of Nick and Kate I was able to come up with something on short notice. The idea was simple, but I wanted to execute it neatly, so I decided to map my electrode signals from the MPR121 not only to WAV or MP3 sounds in Processing's sound library but also to short, beat-like animations built with the limited knowledge of Processing that I had.

A few years back I had come across the website http://www.patatap.com, which is essentially a visual animator that plays a different sound for each keystroke on the computer. This was clearly my inspiration for this project. Not only did the idea get me thinking, the very simplicity of the animation made the page seem highly engaging. So my idea was to recreate the same concept with my own sounds and animations, with the help of Processing, as I was already running short of time.

The other projects that inspired me and helped me to think creatively were:

https://www.instructables.com/Capacitive-Touch-Arduino-Keyboard-Piano/

https://create.arduino.cc/projecthub/user4573/copy-of-paper-piano-62302f?ref=tag&ref_id=capacitive&offset=15

https://learn.adafruit.com/capacitive-touch-drum-machine

 

 

 

Dhyāi Drishti by Simran and Krishnokoli

dhyai_drishti_cover-01

cover-image    img_5787

About the project

Dhyāi Drishti is an interactive installation, which could be developed into a product to help induce a state of lucidity, relaxation, concentration, and mindfulness. It is a simple device that responds to touch and creates a meditative atmosphere around the viewer. The installation consists of a tactile yogic figurine along with a screen and speaker. The figurine is embedded with small aluminum buttons, positioned according to the chakras or 'energy points' in the body and embellished with their representative colours and icons. On touching these buttons, the viewer experiences a meditative visual of the selected chakra on the screen, along with ruminative music of 'tanpura' and 'sitar' in the background, to activate the chakras in their body.
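
A minimal sketch of that interaction model follows, with hypothetical data: in the installation the chakra buttons are read by the Arduino, so the number keys below are only a stand-in, and the colours and pulse are illustrative.

// Hypothetical p5.js stand-in: number keys 1-7 simulate the aluminum buttons.
const chakras = [
  { name: 'Muladhara', colour: [200, 40, 40] }, // root: red
  { name: 'Anahata', colour: [40, 160, 80] },   // heart: green
  // ...remaining chakras would be listed here
];
let active = null;

function setup() {
  createCanvas(480, 480);
  noStroke();
}

function keyPressed() {
  const i = int(key) - 1; // stand-in for a capacitive button press
  if (i >= 0 && i < chakras.length) {
    active = chakras[i];
    // Here the matching tanpura/sitar loop would start playing.
  }
}

function draw() {
  background(10);
  if (active) {
    fill(active.colour[0], active.colour[1], active.colour[2]);
    // A slowly pulsing circle as a simple meditative visual.
    ellipse(width / 2, height / 2, 180 + 30 * sin(frameCount * 0.05));
  }
}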

The term ‘Dhyāi’ is a Sanskrit word for concentration or mindfulness. It is the origin of the word ‘Dhyana’ in Hindi, or ‘Dhyān’ in Bengali, which means meditation. The term ‘Drishti’ is a Sanskrit word for vision. Our project, ‘Dhyāi Drishti’, is rooted in the concept of chakras and their meditative prowess. It is supposed to help the viewer activate a particular chakra as they see the visual, hear the music, and enter a meditative state. Each sound and visual is carefully designed and curated for each chakra. Chakra, meaning “wheel” in Sanskrit, represents a series of energy points (prana) in the body. While they are in the body, they are not physical centers; they could be considered “astral”, part of what is often alluded to as our “subtle body”. The concept behind them is ancient — the first mention appears in the Rig Veda, dating back to approximately 1500 B.C.E. — some of the oldest writing in our civilization. Similar versions of the chakras are incorporated in Hinduism, Buddhism, Jainism, and several New Age belief systems. The chakras run up the spine, from the bottom to just above the crown of the head, and each represents a step forward in evolving consciousness. Below is a reference to all the chakras and their intrinsic meanings.

Muladhara is the root chakra; it is red in colour and symbolizes safety, survival, grounding, and nourishment from the Earth's energy. Note that in chakra healing practices, red may denote inflammation at the physical level.

Svadisthana is the sacral chakra; it is burnt brick in colour, carries meanings associated with emotions, creativity, and sexuality, and is associated with water and flow.

Manipura is the solar plexus chakra; it is yellow in colour and symbolizes mental activity, intellect, personal power, and will.

Anahata is the heart chakra; it is green in colour and is connected with love, relating, integration, and compassion.

Visuddha is the throat chakra; it is blue in colour and symbolizes self-expression, expression of truth, creative expression, communication, and perfect form and patterns.

Ajna is the third eye chakra; it is indigo in colour and evokes intuition, extrasensory perception, and inner wisdom.

Sahasrara is the crown chakra; it is purple in colour and is associated with the universal, connection with spirituality, and consciousness.

 

 

 

Experience Video

Behind The Scenes Video

Project Images

Experience 1

img_4036      cover-image

img_4023-copy      img_3997

Experience 2

img_5787     img_5824

img_5822     img_5785-copy

Development Images

img_4050     img_20201106_131214

imgpsh_mobile_save-1     imgpsh_mobile_save

Github Link

 

https://github.com/Krishnokoli/Experiment-3-Tangible-Interfaces

Fritzing Diagram

screenshot-2020-11-06-at-12-28-19-am

Project Context

The ideation and concept of Dhyāi Drishti were initially conceived from our particle code. While brainstorming a probable concept for our project, we perceived the particle scatter as a meditative animation. Discussing the concept further, we realised how often we tend to miss out on self-actualisation and meditation. Since meditation is extremely important not only for well-being but also for mental clarity and concentration, among myriad other reasons, we figured that making an interactive, tactile meditation aid would be a good idea after all. And since one of us fell sick during the project, it really helped us calm down and inculcate meditation into our daily activities (especially while coding).

One of our main inspirations for the project was Unyte, a relaxation and stress-management program with a biofeedback device known as the iom2, which tracks breathing and heart rate and guides the user through the practice. The iom2 measures Heart Rate Variability (HRV), the variation in time between heartbeats, which is considered a strong indicator of meditative state.

While most of us are stuck at home, socially distancing to prevent further spread of the COVID-19 virus, we often find ourselves restless, sleepless, agitated, annoyed, bored, or unable to concentrate. Because social interactivity is so elemental to our nature, we find it very difficult to isolate. Meditation has been shown to help increase mindfulness and an overall sense of well-being in these trying times. Our project is a humble adventure in promoting the simple activity of meditating. We look forward to increasing the scope of our project and creating diverse interactions to make meditation more fun, easy, and experiential. We would also seek future opportunities to work together on this project and further develop the concept into a concrete installation, product, or tech-wearable.

Citations

“Journeys – Unyte.” Accessed November 7, 2020. https://unyte.com/pages/journeys.

“Pause | Mindful Movement for a Happier You.” Accessed November 7, 2020. https://www.pauseable.com/.

Peck, Bob. “The Chakras Explained.” Medium, April 1, 2020. https://medium.com/@bewherehow/the-chakras-explained-6aa43e1f0f5c.

Salehzadeh Niksirat, Kavous, Chaklam Silpasuwanchai, Mahmoud Mohamed Hussien Ahmed, Peng Cheng, and Xiangshi Ren. “A Framework for Interactive Mindfulness Meditation Using Attention-Regulation Process.” In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 2672–84. Denver Colorado USA: ACM, 2017. https://doi.org/10.1145/3025453.3025914

Sparks, Lloyd. “The Neurophysiology of Chakras.” Medium, June 13, 2019. https://medium.com/@lloydsparks/the-neurophysiology-of-chakras-3f20a0f5b3b5

“The Story of PAUSE.” Accessed November 7, 2020. https://www.ustwo.com/blog/the-story-of-pause.

Vidyarthi, Jay, and Bernhard E. Riecke. “Interactively Mediating Experiences of Mindfulness Meditation.” International Journal of Human-Computer Studies 72, no. 8–9 (August 2014): 674–88. https://doi.org/10.1016/j.ijhcs.2014.01.006.

 

Cyber Box By Jessy

cover1 cover2

Project description 

Cyber Box is an interactive box comprising three types of action that trigger interaction between the screen and the tangible tactile interface; the whole process is sequential and narrative. In the first part, users see a dialog box pop up whenever they put a sticker into a designated area. This action represents people making comments about or labeling others on social media. In the second part, users control the volume of a sound collage of people talking; the change in volume is also demonstrated by dynamic sound waves on the screen. The clips come from people's everyday conversations, and when they pile up or become louder they create a kind of oppression that is uncomfortable for the listener. In the third part, users push the 'button' to magnify a virtual balloon on the screen, and the balloon eventually pops after repeated pushes.
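
As a rough stand-in for the third interaction, a p5.js fragment might look like the following; in the installation the push arrives from the Arduino, so the key press here is only a simulation, and the sizes and pop threshold are illustrative.

// Stand-in for the balloon interaction: a key press simulates the physical push.
let balloonSize = 50;
let popped = false;

function setup() {
  createCanvas(400, 400);
  textAlign(CENTER, CENTER);
  textSize(48);
}

function keyPressed() {
  if (!popped) {
    balloonSize += 15;                   // each push inflates the balloon
    if (balloonSize > 250) popped = true;
  }
}

function draw() {
  background(255);
  if (popped) {
    fill(0);
    text('BOOM', width / 2, height / 2); // the balloon finally bursts
  } else {
    fill(255, 80, 80);
    ellipse(width / 2, height / 2, balloonSize);
  }
}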

Experience Video

How It Works 

Final project images

final1

screen

final4

final3

final2

Development images

dev1

dev2

dev3

dev5

Link to the Arduino code hosted on Github

https://github.com/xinzhang-jessy/cyberbox-experiment3.git

Circuit Diagram

circuit

Project Context

With the prevalence of social media and digital applications, our relationship with the virtual world is much closer. People are used to posting their feelings and opinions on social media. In this virtual digital world, people share comments, photos, and posts, and these contents are viewed by strangers and acquaintances alike. Increasingly, this digital world gives rise to cyberbullying: bullying with the use of digital technologies. It can take place on social media, messaging platforms, gaming platforms, and mobile phones. According to the Cyberbullying Research Center, about half of young people have experienced some form of cyberbullying, and 10 to 20 percent experience it regularly. It is quite easy to post images or text on social media, just as it is easy to paste a sticker onto a whiteboard, an act so common that we do not need to think before doing it. Similarly, people can easily post hurtful things or spread rumors online while pretending to be someone else, so cyberbullies may not realize the consequences and may even find it funny. Through the Cyber Box, I try to connect physical actions with behaviors that exist in the digital world. The first action, pasting a sticker repeatedly, fills the screen with dialog boxes that make people feel oppressed; it represents textual cyberbullying on social media, which may reflect badly on people's lives later. Even text alone can cause great psychological stress, which is the situation created by the second interaction of Cyber Box: the controller acts like a filter, but in the virtual world it is difficult to filter out cyberbullying. The pressure produced by cyberbullies is similar to the outcome of the physical act of pushing: the human mind is also under pressure, and finally bursts, when facing many 'pushes'.

Even though we constantly switch between the online world and the physical world, we connect to the virtual world mainly through gestures and postures. The word "touch" is in the word "touchscreen," but tapping and swiping a cold, flat piece of matter largely neglects the sense of touch: during the long hours of manipulating touchscreens, you experience only a fraction of what your sense of touch allows. The goal of Cyber Box is to seek more connection between the screen-based world and the physical world.

Citation

1. Bullying Statistics, http://www.bullyingstatistics.org/content/cyber-bullying-statistics.html

2. Lucia Kolesarova, "Designing For The Tactile Experience," Smashing Magazine, https://www.smashingmagazine.com/2018/04/designing-tactile-experience/