
In The Eyes Of…

[Image: exp3-key-image]

Group Members: Anantha C, Prayag Ichangimath, and Joanne John.

Concept Statement

In The Eyes Of aims to showcase how cities can be experienced by people with various visual impairments. We specifically look at Low Vision, Macular Degeneration (MD), Glaucoma, and Chromesthesia. Both low vision and MD limit sight in ways that can make it difficult to see peripheries, facial expressions, and low-light environments. Similarly, glaucoma results in patchy blind spots in both peripheral and central vision and can also cause tunnel vision. In contrast, chromesthesia is more a perceptual phenomenon than an impairment: a person experiences colour and light in conjunction with sound. With this project, our aim is to engage with these visual conditions and to bring about an awareness of these unfamiliar experiences by means of a touch interface.

The system consists of a display and a touch pad (the tangible interface), which holds five different pathways, each of which has protrusions reminiscent of the braille blocks present in the city. The display then responds to whichever path is being interacted with on the interface. By doing so, we emulate what someone with a visual impairment would see and feel as they traverse the city.

The pattern is scaled down so that a similar effect is felt by your fingers as they walk down the path and control the video. Thus, as the finger moves forward, the video reflects the selected experience of each path, whilst allowing the user to become familiar with the grooves and indentations of braille blocks.

Through this project, we aim to bring awareness to the structures that exist in our cities that make them more accessible, as well as to bring empathy and understanding to the various perspectives and visions of those around us.

Experience Video // How it Works Video // Arduino Link // Processing Link

final images

[Image: exp3-final-2]

 

Process and Development

circuit diagram

[Image: exp3-fritzing]

process images

[Image: exp3-miro]

[Images: exp3-process-1, exp3-process-2]

[Images: exp3-process-3, exp3-process-4]

References

[1] Kuru, Ilker, director. Video Footage of a Pedestrian Walkway. Pexels, 17 Aug. 2020, https://www.pexels.com/video/video-footage-of-a-pedestrian-walkway-5122599/.

[2] Mikhailov, Evgenij, director. Point of View of a Person Walking down a Street in Moscow. Pexels, 16 Oct. 2021, https://www.pexels.com/video/point-of-view-of-a-person-walking-down-a-street-in-moscow-9921179/.

Experiment Two: Zephyr

[Image: cc-exp2-key-image]

Concept Statement

Zephyr is inspired by my childhood morning routine of sleeping in until the last possible second, when my mom – whose patience was wearing incredibly thin – would warn us that if we didn’t wake up, she’d throw a bucket of water on us. She never did, of course, but the dread of being doused in freezing cold water was alarming enough to get us out of bed promptly. Even so, the temptation to keep sleeping through all your alarms – no matter how jarring – persists. I’ve realised, though, that external stimuli – like my mom’s warnings – can in fact be extremely effective.

Working towards a gentler, more soothing stimulus, I gravitated towards activating our sense of smell to help bring us out of sleep. Our sense of smell is an important, and often disregarded, factor in getting a good night’s sleep [1]. Aromatherapy is often used to ensure both a restful night and a refreshing start in the morning. This project utilises our sense of smell and essential oils to bring you out of sleep in a serene manner.

Triggered by your first alarm, the machine is activated – opening the container of essential oils and starting the fan that disperses the scent around your space. The fanning continues until the user has woken up and drawn their blinds, letting in the sunlight needed to halt the movement and reset the machine. The scent creates a calming environment, which in turn can boost our energy and mood.

final images

[Image: cc-exp2-final-image-2]

[Image: cc-exp2-final-image-3]

[Image: cc-exp2-final-image-1]

Experience Video // How it Works Video

[Note: for the experience video, the day I was filming was extremely gloomy, so a lot of the video is pretty dark, and I had to substitute the sun with the ceiling light. I will re-film the video when the sun comes out again!]

Calm Technology Principles

The two Calm Technology principles that I’ve implemented are:

1. Technology should amplify the best of technology and the best of humanity

Zephyr was designed to perform in the shadows and to awaken the user with a sense of calm and serenity. With the majority of its work done while the user is asleep, the machine draws no attention to itself or its function; instead, it quietly transforms the space with little to no direct interaction. With the addition of a catch-all space, Zephyr is also multi-functional, able to serve the needs of the user even when it is not operating.

2. Technology should make use of the periphery

In keeping with its minimal aesthetic, Zephyr can literally blend into the dark of night, and it requires only the indirect action of the user drawing their blinds to fulfil its purpose. The user is not responding to the machine itself, but simply continues their process of waking up. In doing so, the machine is not necessarily perceived by the user; instead, the effects of its work (i.e., the scent that lingers) shape the atmosphere that impacts the user and their mood.

Process and Development

circuit diagram

[Image: cc-fritzing-circuit-diagram]

process images

[Image: img_6556]

[Images: img_6600, img_6602]

links to process and development videos

[Images: cc-process-img-for-vid1, cc-process-img-for-vid2]

Arduino Code  

https://github.com/joanne-john/CC-EXP2-ZEPHYR

References

[1] Suni, E. (2020, October 23). Smell and Sleep: How Scents Can Affect Sleep. Sleep Foundation. Retrieved from https://www.sleepfoundation.org/bedroom-environment/how-smell-affects-your-sleep

 

 

HAPPY PLACE

HAPPY PLACE creates a virtual space that invites the user to jump out of their seat and move to the music that plays. I interpreted the outline as four studies that take place simultaneously within one sketch: each study responds to a different action, and together they produce a drawing, or expression, that illustrates the user’s movement. Focusing on the emotion of ‘happiness’, the project aims to create a fun and carefree atmosphere through music, movement and colour. As the user engages with the sketch, they are able to switch the music on and off, adjust the volume, change the background colour and draw on the canvas with their body.

One of the hardest parts of this project was simply getting started. As a beginner in p5.js, having to incorporate poseNet into the project was both intimidating and, at times, frustrating – especially when it caused unidentifiable errors. However, the process of writing a line of code, testing it out and debugging was helpful in understanding the language and how it works, and it was beyond satisfying when the action and response actually worked. Though my code may be fairly simple, I wanted to focus primarily on understanding how the code works rather than trying out complicated techniques, and I was happy to see that, even so, I was able to produce studies that contributed towards my concept.

Links:  Collaborative Sketch Present Link   &   Collaborative Sketch Edit Link
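Since all four studies run simultaneously in one sketch, they share the same pose-tracking plumbing. Below is a minimal sketch of what that shared scaffolding could look like, assuming p5.js with the p5.sound and ml5.js libraries; the variable names (pose, song) and the placeholder filename 'good-day.mp3' are illustrative rather than taken from the project's actual code. The snippets in the four study sections below build on these globals.

```javascript
// Shared scaffolding (illustrative): p5.js + ml5.js poseNet + p5.sound.
let video; // webcam capture
let pose;  // most recent pose estimate from poseNet
let song;  // background track used by Click Two and Scroll One

function preload() {
  // Placeholder filename for the track used in the project
  song = loadSound('good-day.mp3');
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();

  // Run poseNet on the webcam feed and keep the latest pose
  const poseNet = ml5.poseNet(video, () => console.log('poseNet ready'));
  poseNet.on('pose', (poses) => {
    if (poses.length > 0) {
      pose = poses[0].pose; // named keypoints: nose, leftWrist, leftElbow, shoulders, ...
    }
  });
}

function draw() {
  // Each of the four studies reads from `pose` here:
  // background change, music toggle, volume control, and drawing.
}
```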

Click One: Change Background Colour

As the beginning step that sets the atmosphere of the collaborative sketch, the first click option asks users to choose their favourite colour as the background. To do so, the user lifts their left wrist to the top-left corner of the sketch, which cycles through a series of randomized colours that the user can choose from as their new background. Following the theme of movement, the action of lifting the wrist to ‘click’ the randomize button emulates the disco dance move ‘The Point’, with the user making a similar motion to start and stop the background colour changes.

Links: Click One Video   &   Click One Present Link   &   Click One Edit Link
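As a rough illustration of how this click could be detected, the function below builds on the scaffolding sketch above; the 100-pixel corner zone and the colour logic are assumptions rather than the project's exact values.

```javascript
// Click One (illustrative): randomize the background colour while the
// left wrist is held inside the top-left corner of the canvas.
let bg = [255, 255, 255]; // current background colour

function updateBackground() {
  background(bg[0], bg[1], bg[2]);
  if (!pose) return;

  const wrist = pose.leftWrist;
  // Treat the top-left 100 x 100 px of the canvas as the 'randomize button'
  if (wrist.x < 100 && wrist.y < 100) {
    bg = [random(255), random(255), random(255)];
  }
}
```

Called at the top of draw(), this keeps cycling colours while the wrist stays in the corner and freezes on the last colour once the wrist drops, matching the start-and-stop 'Point' gesture described above.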

 

Click Two: Turn On & Off Music

Click Two allows the user to turn on the song ‘Good Day’ [1] when their left elbow is in the bottom half of the canvas, and turns it off when the elbow is in the top half. The challenge, given the elbow’s natural resting position, is that the music plays by default; but since the music would stop playing once the elbow moved up, this seemed like the best option. Going forward, I would want to test other methods of keeping the music playing without needing the user to hold a specific position at all times, so that it functions like a proper switch.

Links:  Click Two Video   &   Click Two Present Link   &   Click Two Edit Link
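A sketch of how this elbow switch could be written, reusing the pose and song globals from the scaffolding above; the halfway threshold mirrors the top/bottom split described here, and the rest is illustrative.

```javascript
// Click Two (illustrative): play the song while the left elbow sits in the
// bottom half of the canvas, and pause it when the elbow rises to the top half.
function updateMusic() {
  if (!pose) return;

  if (pose.leftElbow.y > height / 2) {
    if (!song.isPlaying()) song.play();  // elbow low: music on
  } else {
    if (song.isPlaying()) song.pause();  // elbow high: music off
  }
}
```

Turning this into a proper latching switch, as noted above, would mean toggling playback only when the elbow crosses the halfway line, rather than reading its position on every frame.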

 

Scroll One: Adjust Music Volume

Following Click Two, the first scroll option serves as the next step: controlling the volume of the music that is being played. Based on the right and left shoulder positions, the volume decreases to a value of 0.2 or increases to a value of 1. This scroll action acts as the equivalent of the volume slider you would find on a computer, and it promotes movement by letting the user play with the sounds that are created.

Links:  Scroll One Video   &   Scroll One Present Link   &   Scroll One Edit Link
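The write-up does not spell out exactly how the shoulder positions map onto the volume, so the snippet below shows one plausible reading: the average shoulder height mapped onto the 0.2 to 1 range using p5's map() and constrain() functions, with the pose and song globals from the scaffolding above.

```javascript
// Scroll One (illustrative): map average shoulder height to the song volume,
// clamped between 0.2 and 1.
function updateVolume() {
  if (!pose || !song.isPlaying()) return;

  // Average the two shoulder y-positions (0 = top of canvas, height = bottom)
  const shoulderY = (pose.leftShoulder.y + pose.rightShoulder.y) / 2;

  // Shoulders high in the frame -> louder (1); low in the frame -> quieter (0.2)
  const vol = constrain(map(shoulderY, 0, height, 1, 0.2), 0.2, 1);
  song.setVolume(vol);
}
```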

 

[Image: expone-scroll-two]

Scroll Two: Draw on Canvas

Sound can be one of the strongest links to memory and emotion. By choosing a song that produces a fun and lighthearted atmosphere, the user is able to move freely and dance to the music. Tracking the nose position, this scroll function draws a line in randomized colours that follows the movement of your body. Thus, as the user moves around the canvas, their motion creates a unique piece of art. This works in conjunction with the first scroll option, so that playing around with the volume changes the artwork too.

Links:  Scroll Two Video   &   Scroll Two Present Link   &   Scroll Two Edit Link
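A minimal sketch of the nose-tracking drawing, again assuming the pose global from the scaffolding above; the stroke weight and the per-segment random colour are illustrative choices.

```javascript
// Scroll Two (illustrative): draw a line that follows the nose, using a
// randomized colour for each new segment.
let prevNose = null;

function drawWithNose() {
  if (!pose) return;
  const nose = pose.nose;

  if (prevNose) {
    stroke(random(255), random(255), random(255));
    strokeWeight(8);
    line(prevNose.x, prevNose.y, nose.x, nose.y);
  }
  prevNose = { x: nose.x, y: nose.y };
}
```

For the trail to persist, the drawing would either go into a separate createGraphics() buffer or the background would only be repainted when Click One changes it, since clearing the canvas every frame would erase the line.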

 

[1] Good Day by Greg Street ft. Nappy Roots: https://www.youtube.com/watch?v=hjPLkPsLxc4&ab_channel=GregStreetVEVO

 
