BookHub by Greg, Mairead and Simran

fin2-01

Group Members 

Greg Martin, Mairead Stewart and Simran Duggal

Project Description

Book Hub is an Arduino-based system that enables people around the world to stay connected by sending an alert when all participants are available to meet. In particular, Book Hub can help participants plan their next virtual book club meeting by letting them signal to the group that they have finished that week’s reading. When a member of the book club has finished reading and would like to chat about the book, they pick up their Book Hub box and turn it upside-down, triggering a change in the box’s LED display. Once all members have turned over their boxes and are ready to meet, an alert is triggered. In one household, the alert takes the form of a tea kettle switching on and beginning to boil water; in another, it is a pattern of blinking LEDs; and in the third, a wall hook turns upside down and drops a set of earbuds onto a chair. Once all the members have their books, tea, and earbuds, they’re ready to have a virtual book club.
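
A rough sketch of this flip-to-ready logic is shown below. It assumes a tilt switch on pin 2, a status LED on pin 9 and an alert output on pin 10, with the networking layer that shares each box’s state stubbed out as placeholder functions; the pin choices and helper functions are illustrative assumptions, not the exact wiring or code used in the project (each member’s actual code is linked further down).

```cpp
// Minimal sketch of one Book Hub box (illustrative assumptions):
//   - a tilt switch on pin 2 that closes when the box is turned upside-down
//   - a status LED on pin 9 showing this box's own "ready" state
//   - an alert output on pin 10 (kettle relay, LED pattern, servo driver, ...)
// The messaging layer that shares each box's state is stubbed out below.

const int TILT_PIN = 2;
const int STATUS_PIN = 9;
const int ALERT_PIN = 10;

bool imReady = false;

void setup() {
  pinMode(TILT_PIN, INPUT_PULLUP);
  pinMode(STATUS_PIN, OUTPUT);
  pinMode(ALERT_PIN, OUTPUT);
  Serial.begin(9600);
}

// Placeholder: in the real project this would check, via the group's messaging
// service, whether the other two members have flipped their boxes.
bool othersReady() {
  return false;
}

// Placeholder: publish this box's state so the other boxes can read it.
void publishState(bool ready) {
  Serial.print("ready=");
  Serial.println(ready ? 1 : 0);
}

void loop() {
  // With INPUT_PULLUP the pin reads LOW while the tilt switch is closed (box flipped).
  bool flipped = (digitalRead(TILT_PIN) == LOW);

  if (flipped != imReady) {
    imReady = flipped;
    digitalWrite(STATUS_PIN, imReady ? HIGH : LOW);
    publishState(imReady);
  }

  // Fire this household's alert only once every participant is ready.
  digitalWrite(ALERT_PIN, (imReady && othersReady()) ? HIGH : LOW);

  delay(50);  // simple polling interval / debounce
}
```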

In recent months, scientists have begun to study the effects of COVID-19 restrictions on mental health. A study by Lee et al. (2020) found increased levels of loneliness and depression in participants compared to before the pandemic. By encouraging book club members to meet regularly, Book Hub can counter this trend and hopefully improve the mental health of its users.

final_image

Experience Video –

How It Works

Network Diagram

img_0545

Final Project Images –

Greg –

img_0070

Mairead –

mairead_final_project_image

Simran –

final-image

Project Development Images

img_0017

mairead_development_image

img_5454

Sketches

img_0073  img_0074

img_0075  img_0076

Code Links –

Greg’s code –

https://gist.github.com/grgmrtn/f737c007c4067a98c8828a7bcea10bd1

Mairead’s code –

https://github.com/mc-stewart/Experiment-5

Simran’s code –

https://github.com/simranduggal96/Experiment_5_Simran

Fritzing Diagrams

Greg –

greg-fritzing

Mairead –

experiment_5_diagram

Simran –

experiement-5_bb

Project Context

The popularity of reliable remote working and teleconferencing software has far-reaching implications, both in work and in leisure. Accelerated by the COVID-19 pandemic, the widespread adoption of video-calling platforms has made it easier than ever for coworkers, families and friends to meet virtually with one another. According to Peek (2020), this new way of communicating will shape the future of work, as many companies are likely to embrace fully or partially online working environments even after lockdown restrictions are lifted. Though this is a novel approach to the workplace, remote working technologies have many advantages. For example, studies show that remote workers are more productive and report a higher happiness index than their in-office counterparts (Bloom et al., 2013). Working from home can even reduce employee turnover and can be instrumental in reducing distractions and noise (Peek, 2020). The workplace is not the only space where video conferencing technology can be beneficial. Shah et al. (2020) discuss the ways in which communicating with friends and family via video call can reduce loneliness, especially during a pandemic. The authors describe loneliness as a public health issue, arguing that reducing loneliness is a health concern as well as a social concern (Shah et al., 2020). Thus, technologies designed to remotely connect groups of people are essential in fighting the mental health effects of COVID-19.

Of course, computer-mediated communication (CMC) can have drawbacks as well. One significant factor to consider when turning to video calling for social connection is the phenomenon often called ‘Zoom fatigue’: the feeling of tiredness and irritation that comes from using video-calling software for too long. According to Nadler (2020), one explanation for this phenomenon may be the way the human brain processes the visuals we see on screen. Rather than seeing a three-dimensional face, we may be interpreting the video as a single two-dimensional plane with the background and foreground fused together. Facial expressions and emotions, vital to any work or social interaction, would understandably take more effort to interpret and may result in increased levels of irritation and fatigue (Nadler, 2020). To combat this fatigue, many groups are attempting to blend virtual interactions with offline activities such as reading. An example of this is Book Baristas (Book Baristas, 2020), a virtual book club that encourages its members to connect with one another but also to take some time off screen to read. Endeavours like this can help participants feel socially connected while avoiding the fatigue of extended periods on a video call.

References –

Bloom, N., Liang, J., Roberts, J., Zhichun, J. Y. (2013). Does Working From Home Work? Evidence from a Chinese Experiment. NBER Working Paper Series. Cambridge, MA: National Bureau of Economic Research.

Book Baristas (2020). News. Book Baristas. http://www.bookbaristas.org/p/bookish-news.html

Lee, C. M., Cadigan, J. M., Rhew, I. C. (2020). Increases in Loneliness Among Young Adults During the COVID-19 Pandemic and Association with Increases in Mental Health Problems. Journal of Adolescent Health, 67(5), 714–717.

Nadler, R. (2020). Understanding “Zoom fatigue”: Theorizing spatial dynamics as third skins in computer-mediated communication. Computers and Composition, 58(1).

Peek, S. (2020). Communication Technology and Inclusion Will Shape the Future of Remote Work. Business News Daily. https://www.businessnewsdaily.com/8156-future-of-remote-work.html

Shah, S. G. S., Nogueras, D., Van Woerden, H. C., Kiparoglou, V. (2020). The COVID-19 Pandemic: A Pandemic of Lockdown Loneliness and the Role of Digital Technology. Journal of Medical Internet Research, 22(11).

TOUCH ME NOT by Abhishek Nishu and Simran Duggal

img_9629  img_3239-2

GROUP MEMBERS –

Simran Duggal, Abhishek Nishu

PROJECT DESCRIPTION –

Touch Me Not is an installation inspired by the “touch-me-not” plant, also known as the shameplant. Every time we went on a trek surrounded by nature’s beauty and might, we would keep a lookout for this small plant; it was nature’s way of responding to our actions, and we were fascinated by it. If you haven’t come across one, it is a small green plant whose leaves fold inward when touched. The installation replicates this behaviour with lights that withdraw based on proximity, sensed through an LDR (light-dependent resistor). Instead of leaves, we have chosen to represent the plant as lilies in a miniature garden; flowers are nature’s way of coming indoors. The structure is a combination of origami flowers complemented by other props, including real flowers, to give the installation a natural look, along with one element that adds movement using a servo motor. The installation uses light levels to sense movement around it: a person or object in close proximity reduces or blocks the amount of light reaching the sensor. We then map the sensor reading to the output voltage, so that as the reading drops, the brightness of the LEDs drops with it. In the absence of people, just like a plant, the flowers go on living their lives: lit in the presence of light and asleep in its absence. We would like to explore this project further by adding mechanisms that open and close the flower petals in relation to the LEDs.
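
As a rough illustration of the light-to-brightness mapping described above, here is a minimal Arduino sketch. The pin choices (LDR voltage divider on A0, LED on pin 5, servo on pin 9) and the calibration values are placeholders for illustration, not the exact values from our build; the actual Arduino code is linked below.

```cpp
#include <Servo.h>

// Minimal sketch of the proximity-fading idea (illustrative assumptions):
//   - an LDR voltage divider on A0, wired so the reading drops when shadowed
//   - an LED (or LED driver) on PWM pin 5
//   - a servo on pin 9 for the moving element
// LIGHT_MIN / LIGHT_MAX are rough values; calibrate them with the Serial Monitor.

const int LDR_PIN = A0;
const int LED_PIN = 5;
const int SERVO_PIN = 9;

const int LIGHT_MIN = 200;   // reading with a hand hovering over the sensor
const int LIGHT_MAX = 800;   // reading in open ambient light

Servo petalServo;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  petalServo.attach(SERVO_PIN);
  Serial.begin(9600);
}

void loop() {
  int light = analogRead(LDR_PIN);   // 0-1023, falls as someone approaches
  Serial.println(light);

  int clamped = constrain(light, LIGHT_MIN, LIGHT_MAX);

  // Less light reaching the sensor -> lower PWM duty cycle -> dimmer flowers.
  int brightness = map(clamped, LIGHT_MIN, LIGHT_MAX, 0, 255);
  analogWrite(LED_PIN, brightness);

  // Swing the servo element further as the shadow deepens, like a leaf withdrawing.
  int angle = map(clamped, LIGHT_MIN, LIGHT_MAX, 90, 0);
  petalServo.write(angle);

  delay(30);
}
```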

EXPERIENCE VIDEO –

The first prototype uses the light sensor with LEDs to explore proximity fading in the flowers.

https://ocadu.techsmithrelay.com/zmjd

The second prototype builds on the first by adding a servo motor to one element while continuing to explore proximity fading in the flowers.

https://ocadu.techsmithrelay.com/Zysl

FINAL PROJECT IMAGES –

1 – img_9621

img_9643 img_9641 img_9629

2 – img_3248 img_3250 img_3251

DOCUMENTATION IMAGES –

BEHIND THE SCENES VIDEO LINK –

https://ocadu.techsmithrelay.com/oHvE

1 – Rough development of the idea through sketches

img_9662

2 – Testing and building the circuit

img_9578 img_3078

3 – Beginning structure development

img_3141 img_9579

4 – First flower sensor

img_0776 img_3184-2

LINK TO THE ARDUINO CODE HOSTED ON GITHUB –

https://github.com/AbhishekNishu16/Creation-Computation-code.git

CIRCUIT DIAGRAM CREATED ON TINKERCAD USING ARDUINO UNO –

start-simulating

PROJECT CONTEXT –

Have you ever wondered why the outdoors are so intriguing? It is because of their dynamic nature. The outdoors is a combination of many patterns playing out on a day-to-day basis: birds fly away from their nests only to come back at night; plants stand tall in the presence of light and preserve their energy after dark, only to do the same the following day; and the sun rises and sets with each day. What makes these patterns beautiful together is their synergy with one another. A plant would not stand tall during the day if the sun did not come out. It is these interactions that make the patterns exist.

We began this project from a very technical perspective. But as we were slowly introduced to more pieces of hardware, the project began to present itself as a canvas open to building digital patterns based on the interaction between two or more devices. This led us to see that there is one characteristic common to both the indoors and the outdoors: energy. While nature is driven by its natural sources of light and energy, we invented a different form of energy to serve our needs: electricity. We looked at multiple artists and their ways of bringing the outdoors indoors. Bruce Munro’s installation “Field of Light” in the Uluru desert in Australia is not only visually striking but also shows how our different forms of energy can complement each other’s environments (Bruce Munro, 2016). By laying out man-made light in biophilic patterns, he added a new dimension to the Uluru desert. Deserts were generally meant to sleep at night, but through his work he has inspired many to recreate Field of Light and to see something we once regarded as just sand from a new perspective.

Similarly, we looked to create an outdoor experience indoors. The concept of biophilic design is widely used in the building industry: it is the use of patterns of nature in indoor spaces to increase occupants’ connection to the natural environment through direct nature, indirect nature, and space and place conditions (Biophilic Design). Through our installation we look to patterns of nature and, with the help of technology, make them more interactive for the people in that space. Biophilic design can also reduce stress, improve cognitive function and creativity, improve our well-being and expedite healing; as the world population continues to urbanize, these qualities are ever more important (Terrapin). Biophilic design also helps explain why crackling fires and crashing waves captivate us; why a garden view can enhance our creativity; why shadows and heights instill fascination and fear; and why animal companionship and strolling through a park have restorative, healing effects. We draw inspiration from biophilic design in bringing nature into our spaces.

We were also inspired by “Hooray” by Hye Yeon Nam, which shaped our idea of using shadows as a proximity trigger. This brings an interactive element to our biophilic pattern. We hope that as the world continues to urbanize and move forward, biophilic patterns take a strong place in our future.

CITATIONS:

Bruce Munro (2016). Field of Light. https://www.brucemunro.co.uk/work/field-of-light/

Arduino Sunflower: https://create.arduino.cc/projecthub/Rick_Findus/arduino-sunflower-c4fd84

Hooray by Hye Yeon Nam:  http://hynam.org/HY/hoo.html

Biophilic Design: https://en.wikipedia.org/wiki/Biophilic_design#:~:text=Biophilic%20design%20is%20a%20concept,and%20space%20and%20place%20conditions.

Terrapin: https://www.terrapinbrightgreen.com/reports/14-patterns/

On the Nose! by Simran Duggal

img_6f44bce7d92b-1

Project Description –

“On the Nose” is an idiom meaning precisely, or at an exact moment. Here it is a metaphor for body tracking: at each exact moment, our body is tracked by the webcam and translated into a piece of art. In this series of works, the underlying interaction stays the same, but each sketch depicts it differently, making for an engaging and interactive experience.

From the initial exercise of this experiment, I gained an understanding of how different mediums for creating visuals can be used in user-to-computer interaction. Taking a step beyond that, this part of the experiment focuses on how unique interactions between humans and a computer happen. It also reflects the functionality and uniqueness of computer languages, as well as showing a sense of traceability. The main idea of this series of work is to see how different people interact with their computers. In today’s world it is also important to create captivating user interface designs and to study the user experience, focusing on how end-users can engage with ease.

All five of my sketches use the PoseNet body-tracking model to explore interactivity with the computer: generating art, understanding the way we are being tracked, and playing games. Keeping the learnings from class in mind, I was able to create these sketches, which can be developed further into more intricate code.

Sketch 1 – Nose Painting – create an artwork

download-3

Experiment Video – https://ocadu.techsmithrelay.com/orDz

Description – My first sketch is like a colouring-book page which you are supposed to fill in to create your own artwork. The twist is that you have to do this using your nose! I have used PoseNet to execute this sketch. It is basically a conversion of the mouse-dragging function, where I use noseX and noseY in place of mouseX and mouseY. As you move your face in front of the screen, the brush follows your nose and paints circles. In addition, I have applied a colour-changing technique to make the sketch more colourful and attractive.

Present Link – https://editor.p5js.org/simran.duggal/present/-FCmdak7G

Edit Link – https://editor.p5js.org/simran.duggal/sketches/-FCmdak7G

SKETCH 2 – Mr. Lazy

download-8

Experiment Video – https://ocadu.techsmithrelay.com/FVHw

Description – In the second sketch, I try to understand PoseNet better and use it in something more structured. I have created a little comic character whose expressions can be controlled using your nose. As you move your face left and right, the eyes move along with you, and as you move up and down, the face scales from small to big and vice versa. Through this sketch I have understood how expressions can be controlled using body tracking. I would like to work on it further by adding different body parts and making it more engaging.

Present Link – https://editor.p5js.org/simran.duggal/present/gfbDHKP99

Edit Link – https://editor.p5js.org/simran.duggal/sketches/gfbDHKP99

SKETCH 3 – Eye filter

screenshot-2020-10-05-at-3-52-31-am

Experiment Video – https://ocadu.techsmithrelay.com/X0We

Description – With this sketch I wanted to learn and understand how precise body tracking is. I was inspired by the face filters we use while taking selfies: it is fascinating to see how these filters become part of our face, fitting and scaling exactly to it. In this sketch I use the eyes as trackers, and the scale of the circles changes with the distance between the eyes as I come close to the computer or move far away from it. The circles also scale accordingly when I tilt my face.

Present Link – https://editor.p5js.org/simran.duggal/present/YmUlAFeLJ

Edit Link  – https://editor.p5js.org/simran.duggal/sketches/YmUlAFeLJ

SKETCH 4 – Line Art: Create an artwork

download-6

Experiment Video – https://ocadu.techsmithrelay.com/1ndq

Description – In this sketch, I try to give the user a platform to create their own generative art using their nose. As you move around the screen, you create an artwork. Try to cover the whole space, standing up if needed, so that the colours can change easily.

Present Link – https://editor.p5js.org/simran.duggal/present/8eYys5XMs

Edit Link  – https://editor.p5js.org/simran.duggal/sketches/8eYys5XMs

SKETCH 5 – scuba game

Experiment Video – https://ocadu.techsmithrelay.com/3Dx4

Description – 

In this sketch I have tried to create a game around a scuba diver who is trying to collect all the bubbles; the goal of the game is to get rid of every bubble. I was looking at a balloon-bursting game, which inspired me to work on this idea further. The game also reminds me of Pac-Man. The interaction is simple: use your nose to move the scuba diver around.

Present Link – https://editor.p5js.org/simran.duggal/present/VsjRDO-vh

Edit Link  – https://editor.p5js.org/simran.duggal/sketches/VsjRDO-vh

Project Context –

Earlier, the words “input device” evoked only two specific objects: the keyboard and the mouse, the main instruments used to provide data to our computers. The keyboard and mouse are among the first input devices in the history of the computer. Nowadays, with the evolution of computers, we have a large set of input devices that have changed the way we interact with the machine.

The use of gestures in interfaces has ranged widely, from conversational interfaces where speech and gestures are used together to interfaces built on arbitrary gesture languages. However, when thinking about using gestures as part of an interface, it is important to consider what counts as a gesture, and which aspects of that working definition matter to the interface being designed.

With the rapid growth in technology adoption, and people using their phones and laptops for much of the day, it is noticeable that people express more empathy and emotion when they interact through these devices. Designers must understand a variety of factors that influence the way gestures will be used, including issues of performance, the influence of spectators, and the ways in which technology shapes gesturing. Väänänen and Böhm highlight that “gestures used in human computer interfaces must have defined meanings, which is in direct opposition with gestures as they are used in daily life” (Väänänen & Böhm, 1993). As the technology evolves, people become more willing to use it. For example, mobile phones already include gesture-based features that have made life more convenient, such as swiping to unlock the phone, shaking it to turn on the flashlight, face recognition, speech input and eye-movement tracking.

Today, especially when it comes to products, it is very important to create versatile interactions so that users can customize and connect with the product. Wearable technologies such as Fitbit, Samsung and the Apple Watch are becoming a very important part of users’ daily lives for keeping track of their wellbeing. Future wearable technologies have a lot of potential to embed gesture recognition in ways that keep the product smart and user-friendly. Gestures like wrist rotation, nodding, pointing, shaking the head and finger touch are some of the interactions that could be used. Project North Star is an example of an open-source reference platform for Leap Motion’s hardware and UX design, meant for manufacturers and designers building such experiences.

The next step for me in this experiment would be to understand and explore in more detail the gestures that can be used for body-tracking interactions. I would also like to see how this body tracking can be incorporated into products using sensors, as it is in the famous Nintendo Wii.

References –

  1. How Human Interaction is Shaping the Future of Technology, Hector Ouilhet, January 2020
  2. The Future of Human Interactions and the Role of Technology, Steve Kraus, May 2019
  3. Leap Motion’s “Virtual Wearables” May Be the Future of Computing, Jesus Diaz, September 2018
  4. Gesture-Based Interfaces: Practical Applications of Gestures in Real-World Mobile Settings, Julie Rico, Andrew Crossan and Stephen Brewster, 2011
  5. Väänänen, K., Böhm, K.: Gesture driven interaction as a human factor in virtual environments – an approach to neural networks. In: Earnshaw, R.A., Gigante, M.A., Jones, H. (eds.) Virtual Reality Systems. Academic, London (1993)