Understanding the Bland Diet: A Comprehensive Overview

A bland diet is an eating plan designed to ease gastrointestinal problems such as gastritis, peptic ulcers, or gastroesophageal reflux disease (GERD). It consists primarily of easily digestible, non-irritating foods that help calm the digestive system. Whether you’re experiencing a gastrointestinal condition or simply want to give your stomach a break, this post will guide you through the essentials of a bland diet.

What Is a Bland Diet?

A bland diet is a short-term meal plan built around mild foods that are low in fiber, fat, and spice. It aims to minimize irritation and inflammation in the gastrointestinal tract, promoting healing and reducing discomfort. While the specific foods permitted on a bland diet may vary depending on your condition and personal preferences, the general principle stays the same.

A bland diet commonly includes easy-to-digest foods that are gentle on the stomach, such as:

  • Lean meats, such as poultry, turkey, or fish
  • Soft-cooked eggs
  • Low-fat dairy products, such as milk, yogurt, and cheese
  • Grains, such as white rice, oatmeal, and refined bread
  • Non-citrus fruits, such as bananas, melons, and applesauce
  • Cooked vegetables, like carrots, green beans, and potatoes
  • Smooth nut butters, like peanut or almond butter

It’s important to note that individual tolerances may vary, and it’s always best to consult a healthcare professional or registered dietitian before starting a bland diet.

When Is a Bland Diet Prescribed?

A bland diet is commonly recommended for people with gastrointestinal conditions such as gastritis, peptic ulcers, or GERD, or after certain surgeries involving the digestive system. These conditions can cause irritation and inflammation in the gastrointestinal tract, leading to symptoms such as stomach pain, bloating, nausea, and heartburn.

A bland diet can help relieve these symptoms by reducing the intake of foods that may trigger or worsen digestive issues. While it is not a cure for these conditions, it can provide temporary relief and support healing.

In addition to medical conditions, a bland diet may also be recommended for people recovering from stomach illnesses, such as gastroenteritis or food poisoning, when the digestive system needs time to heal and regain its normal function.

What Foods Should You Avoid on a Bland Diet?

While a bland diet encourages mild, non-irritating foods, certain foods and drinks should be avoided. These include:

  • Spicy foods, such as chili peppers, hot sauce, or curry
  • High-fat foods, such as fried foods or fatty cuts of meat
  • Citrus fruits and juices, like oranges, lemons, or grapefruits
  • Raw vegetables, especially those with high fiber content, like broccoli or cabbage
  • Caffeinated beverages, such as coffee, tea, or carbonated drinks
  • Alcohol and tobacco
  • Highly seasoned foods, including garlic, onions, and strong spices

Avoiding these foods reduces the risk of aggravating digestive symptoms and allows the body to recover more effectively.

Tips for Following a Bland Diet

Below are some practical tips to help you follow a bland diet successfully:

  • Choose cooking methods that involve minimal fat, such as boiling, steaming, or baking.
  • Opt for lean cuts of meat and remove any visible fat or skin before cooking.
  • Season foods with herbs and spices that are gentle on the stomach, such as parsley, basil, or dill.
  • Experiment with different preparation methods to add flavor to your meals, such as marinating or using low-sodium broths.
  • Stay hydrated by drinking plenty of water throughout the day.
  • Listen to your body and pay attention to how specific foods make you feel; everyone’s tolerance varies.
  • Gradually reintroduce foods into your diet once your symptoms improve, under the guidance of a healthcare professional.

The Importance of Seeking Professional Advice

While a bland diet can provide temporary relief and support the healing process, it’s important to consult a healthcare professional or registered dietitian before making any significant changes to your diet. They can offer personalized guidance based on your specific medical condition, overall health, and nutritional needs.

The Bottom Line

A bland diet can be a useful tool for managing digestive conditions and promoting digestive health. By focusing on easily digestible, non-irritating foods, you give your stomach the rest it needs to heal and recover. Remember, it’s always best to seek professional advice to make sure the diet is appropriate for your individual needs.


🎥 Video link: https://www.youtube.com/watch?v=417Jkt1xeKw

 

BIBLIOGRAPHY

Related Works Research

MrLuxurytaste. TouchDesigner & Arduino – Interactive Particles Controlled by Ultrasonic Sensors, YouTube, 8 Feb. 2021, https://www.youtube.com/watch?v=Q949m9bvlD8&t=1s.

Artificial Nature. Artificial Nature: Archipelago @ La Gaîté Lyrique, Paris, France; Oct 10 2014 – Apr 2 2015, Vimeo, 1 Mar. 2015, https://vimeo.com/120987833.

Sohyun Jun. Touch the Tree – TouchDesigner with Leap Motion, YouTube, 21 Apr. 2021, https://www.youtube.com/watch?v=HeFONh9K0go.

Concept Building & Tutorials

Acrylicode. Easy Twist-Shape Visuals | TouchDesigner Step by Step Tutorial, YouTube, 18 Feb. 2022, https://www.youtube.com/watch?v=5usOjbSsrSQ&list=PLZeDM2TloijWODqRjQrE_WwoK1inYP1PX&index=14.

Acrylicode. TouchDesigner Tutorial | Twisting Sphere with Changing Colors, YouTube, 11 Oct. 2021, https://www.youtube.com/watch?v=jdGHN01D8Qc.

Alexmiller. How We Made an Interactive, Projection-Mapped Topographic Installation, YouTube, 30 July 2018, https://www.youtube.com/watch?v=07hiEtggHXw.

Brunoimbrizi. TouchDesigner _06 Fluid Simulation, YouTube, 24 Dec. 2020, https://www.youtube.com/watch?v=2k6H5Qa_fCE.

Details Studio. Fabric Simulation in Touchdesigner, YouTube, 29 Apr. 2020, https://www.youtube.com/watch?v=jVZW2Cte-IE.

Graterol, Maria. Leap Motion and Particles in TouchDesigner, YouTube, 27 July 2021, https://www.youtube.com/watch?v=qePnS21jYTw.

Noones img. Morphing between Objects – Instancing (Touchdesigner Tutorial), YouTube, 9 Nov. 2020, https://www.youtube.com/watch?v=2EwQSCZ0Hs8.

Nose2bear. Light Replication – TouchDesigner Tutorial 44, YouTube, 12 Oct. 2021, https://www.youtube.com/watch?v=os5jV5FpwOw.

Nose2bear. Outrun / Landscapes – TouchDesigner Tutorial 19, YouTube, 27 Apr. 2020, https://www.youtube.com/watch?v=K8vIg2t3wDo&t=147s.

 

Your Safe Space
Exploring the relationship between Consumerism and Privacy

By Anusha Menon, Rim Armouch, Taylor Patterson, Shipra Balasubramani


 

Introduction

Welcome to your Safe Space!
Consumer behavior has been heavily shaped by auto-generated ad placement based on algorithms that track purchasing behavior and movement. Most of this advertising is subliminal and happens through daily interactions within our personal spaces. While these ads and their timing can be useful and convenient, they can also become an invasion of privacy.

Our research and design focused on the following questions: How do we get an audience to understand just how far these algorithms have reached inside the home, and how do we get them to explore the borderline dangerous relationship between consumerism and privacy?


Related works Research


Inspiration 1: Listening Post, Mark Hansen and Ben Rubin

Listening Post by Mark Hansen and Ben Rubin is an art installation that pulls text from thousands of live chat rooms and broadcasts it across 231 small electronic screens arranged in a grid. “The piece is also about surveillance, privacy rights, and data malleability.” (Rondet, 2016).
Ben Rubin founded EAR Studio in 1998 and has been a professor of design at The New School in NYC. Mark Hansen, designer and artist, joined Ben Rubin at EAR in 2007, and the two have collaborated on multiple projects together (Mark Hansen: Science Museum Group Collection, 2022). The complex network connectivity, the public display of otherwise private texts, and the focus on invaded privacy make this installation a strong supporting reference for our own.

http://www.digiart21.org/art/the-listening-post

 

Inspiration 2: LAUREN, by Lauren Lee McCarthy

Lauren Lee McCarthy is a highly decorated associate professor at UCLA who has received grants and residencies from several well-known organizations. She is the creator of p5.js and sits on the board of the Processing Foundation, which works to make programming more accessible. In the installation LAUREN, McCarthy herself becomes an AI, watching over individuals in their private living spaces, anticipating their actions and triggering the “AI” accordingly. “LAUREN is a meditation on the smart home, the tensions between intimacy vs privacy, convenience vs agency they present, and the role of human labor in the future of automation.” (Lauren Lee McCarthy (2017) LAUREN.)

In the piece, Lauren remotely triggers household items from a computer, prompting them to act on the individual’s request or preference. In the process, the line between privacy and convenience is called into question. The ability to connect “virtually” to items in a shared space made this project a great source of inspiration for the brief and our concept.

https://lauren-mccarthy.com/LAUREN

 

The Historical Context 

Advertising started out as one of the lower priorities in a business. It wasn’t until the infomercial era, when companies saw revenue skyrocket, that it became a standard, prioritized part of the business. During this time, ads were in-your-face and blatant. Over time the advertising noise grew too loud, and there was a shift toward subliminal advertising. As the world became more technologically advanced, subliminal advertising became harder to detect, carried out through artificial intelligence and tracking. This project highlights the modern-day version of advertising in an exaggerated form to show its impact and its potential for privacy invasion.

 

The Socio-cultural Context

Understanding the way ads speak to their audience encouraged us to build on the standard interactions of day-to-day items. The simple act of entering a room, and the plant, the magazine, and the coffee cup, are all part of today’s socio-cultural norms. Incorporating specific ads in different languages pointed to how AI-driven platforms present information and collect data based on the consumer’s behavior.

 

The Concept

This project is a representation of recurring issues in the sociocultural and technological environment. The concept was developed after an idea dump in which we tried to determine what we could do to engage customers while indirectly sharing a message about recurring issues affecting everyday life. How do we create an experience where mundane items represent a significant issue? What kinds of issues do we want to highlight: economic, environmental, sociocultural, or technological? These were a few of the questions that came to mind.

 

 

 

Design Considerations & Technical Description

For this project we were inspired by LAUREN, by Lauren Lee McCarthy. Our main research question revolved around how technology has become a central part of every household, and how we as consumers fail to notice the invasion of privacy.

We decided to set up the installation in the DF lounge space, as its scenography is ideal for a typical home environment. We also focused on identifying the nodes/objects that could be placed seamlessly in the environment, leaving room for the user to survey and interact. While working on the prototypes we felt that not all nodes should be made obvious to the user; some were intended to be hidden in the set.

Initially the following objects were intended to be included in the space: book, cup, plant, and phone. While working on the prototypes and during our testing sessions, we realized the challenges that come with using phones as a sensor. Given time constraints, we reduced the interactions to three objects: book, cup, and plant.

Video1: Interaction 01.mp4
Video2: Interaction 02 1.mp4

 


Installation Design

We started by creating individual units and conducted various tests on receiving data from the LDR sensors. As demonstrated in the Fritzing diagram, the LDR values are sent to the WebSocket and then shared with p5.js, which triggers the output. Each object is placed over an LDR sensor, so when the object is lifted there is an increase in the LDR reading, which triggers an audio-visual response on the TV (see the sketch below). As the intent was to place some objects within the user's reach and hide others in the scene, two objects were placed on the coffee table and one above the bookshelf. This way the user is not tied to one area of the space, creating more scope for movement. We were also able to create interesting interactions when multiple objects were displaced, creating an overlap and overload of audio-visual content.
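As a rough illustration of the object-detection step, here is a minimal Arduino-style sketch that reads one LDR and reports when an object is lifted. The pin, the threshold value, and the Serial output are assumptions for illustration; in the installation the readings were forwarded to the Glitch-hosted WebSocket and on to p5.js rather than printed over Serial.

```cpp
// Minimal sketch (assumptions: one LDR on pin A0 in a voltage divider, an
// arbitrary threshold of 600, Serial output instead of the WebSocket link).
const int LDR_PIN = A0;
const int LIFT_THRESHOLD = 600;   // reading rises when the covering object is lifted
bool objectLifted = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(LDR_PIN);
  bool liftedNow = reading > LIFT_THRESHOLD;

  if (liftedNow != objectLifted) {   // report only state changes
    objectLifted = liftedNow;
    Serial.println(objectLifted ? "LIFTED" : "REPLACED");
  }
  delay(50);
}
```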

Installation dimensions: 3m x 3m
Number of participants: Single user

Hardware: 2 x Arduino Nano 33 IoT (with uploaded code), 5 x LDR light sensors, 2 x breadboards, 2 x laser pointers, pin plugs, jumper plugs, low-profile jumper wires, power bank

Software: Arduino IDE, C++, p5.js, Glitch (WebSocket)

Set elements: Coffee cup, book (magazine), plant, couch, coffee table, TV, table lamps (ambient lighting)

Code Link:  https://github.com/anushamenon/Strange-Networks/tree/main

 

Diagram 1: Sensors to detect object displacement (object moved)

Diagram 2: Sensors to detect user entry/exit of the installation
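The entry/exit sensing in Diagram 2 pairs the laser pointers with LDRs, which is commonly realized as a beam-break check at the doorway. The sketch below only illustrates that idea; the pins, the threshold, and the two-beam direction check are assumptions, not the project's actual code.

```cpp
// Minimal beam-break sketch (assumed pins A1/A2 and threshold; the blocking
// waits are kept deliberately simple for illustration).
const int OUTER_PIN = A1;          // laser/LDR pair nearer the outside
const int INNER_PIN = A2;          // laser/LDR pair nearer the inside
const int BREAK_THRESHOLD = 300;   // reading drops when the beam is blocked
int occupancy = 0;

bool beamBroken(int pin) {
  return analogRead(pin) < BREAK_THRESHOLD;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (beamBroken(OUTER_PIN)) {          // outer beam first: someone entering
    while (!beamBroken(INNER_PIN)) {}   // wait for the inner beam to break
    occupancy++;
    Serial.print("ENTER ");
    Serial.println(occupancy);
    delay(1000);                        // crude debounce while the person passes
  } else if (beamBroken(INNER_PIN)) {   // inner beam first: someone exiting
    while (!beamBroken(OUTER_PIN)) {}
    occupancy = max(0, occupancy - 1);
    Serial.print("EXIT ");
    Serial.println(occupancy);
    delay(1000);
  }
}
```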

 

User Experience

The set is designed to depict a comforting, homely living space, inviting the viewer to walk in and sit down or interact with the objects. There is a TV, a sofa and a coffee table, along with other smaller items that can be picked up. While there is no movement (through digital visuals or otherwise), gentle music can create a soothing ambience to highlight the concept of it being a “safe space”. The experience begins once the viewer walks through “the door” into the space.

As soon as the viewer walks into the room, they are greeted by a computer-generated voice resembling common AI assistants like Alexa or Siri. If the viewer takes some time to look around the room and doesn’t interact with any objects for approximately 15 seconds, they hear an ad for a “Smart Assistant”, e.g. a Google Home, playing over speakers that are not visible in the space.

Upon interacting with any of the objects in the room, the first ad stops (if it is playing) and a new ad for brands related to the picked-up item plays, both over the speakers and with visuals on the TV. As the user picks up more items, the resulting ads overlap, creating an incomprehensible noise intended to induce discomfort or irritation in the viewer. As items are set down, their ads stop; for example, if the viewer is holding both a plant and a coffee cup, both ads play simultaneously, but if the coffee cup is put back down, the coffee ads stop while the plant ads continue until the plant is set down. Multiple interactions with the objects can generate a different set of ads each time, as if the digital environment were trying to find the user’s preferred brands within their object of interest.

Finally, when the user exits the space, any ads that are playing stop and the TV switches off. A computer-generated voice bids the viewer goodbye and wishes them a happy shopping trip, as if to encourage them to purchase the products they just saw.


 

 

Future Development


Our larger vision for this project is an immersive installation where the user is free to explore and experience the space. There would be more items to interact with, using seamless technology that works as an imitation AI and learns the behavior and movement of individuals in the space over a given time period.

The scenography is detailed to replicate a typical home environment, with various objects/nodes for the user to interact with. Every time a piece of data is collected from the user, there is an audio-visual response to keep the user aware of their actions and movements. Playing with audio (the sound of a bell and tickers) and projections on the walls of the room is aimed at creating some tension between the user and the technology that surrounds them.

Upon exiting the exhibition, the individual would be confronted with how much information was gathered and shown how many different platforms their information has been shared with and saved to, in an effort to clearly display the depth of AI-driven privacy invasion in the home.

Video: Future Version Video (With sound).mp4

 

Bibliography

Lauren Lee McCarthy (2017) LAUREN. Available at: https://lauren-mccarthy.com/LAUREN (Accessed: December 4, 2022).

Rondet, B. (2016) The Listening Post, 21st Century Digital Art. Available at: http://www.digiart21.org/art/the-listening-post (Accessed: December 5, 2022).

Mark Hansen: Science Museum Group Collection (2022) Mark Hansen | Science Museum Group. Available at: https://collection.sciencemuseumgroup.org.uk/people/cp118841/mark-hansen (Accessed: December 5, 2022).

 

 

 

 

Project 3 – Strange Networks – Tyler, Mufaro, Mona, Gavin

 

Video of Lines (2016)
Video of Arduino and Pure Data project (2022)

The Lost City of Eslinas by Mari.K aka MadMaraca (2021)

Researchers fitted some moon jellyfish with a prosthetic “swim controller”.
Credit: Nicole Xu and John Dabiri / Caltech

The electronic swim controller made the modified jellyfish swim nearly three times faster than their normal speed.
Credit: Nicole Xu and John Dabiri / Caltech

Civilization by Abrar Khan (2019)

Our ideal scene illustration, “Cyborg Jellyfish”, made with AI, Blender, Procreate, and Photoshop

Files for programming: https://github.com/tbeattysk/Cyborg-JellyFish
Files for Toronto cityscape: https://github.com/msvive/TorontoGIS-3D-model.git

Link to full video

BIBLIOGRAPHY

Developconference.com. 2022. “Dom Clarke: Develop Conference.” Develop Conference Brighton. https://www.developconference.com/whats-on/2019-speakers/speaker-detail/dom-clarke. [Accessed 12 December 2022]. 

Guido, Giulia. 2020.  “The Surreal Photographs by Elia Pellegrini.” Collateral. https://www.collater.al/en/surreal-photographs-elia-pellegrini/. [Accessed 12 December 2022].

Moment Factory. 2022. “Massive Media Architecture at Resorts World Las Vegas.” Moment Factory. https://momentfactory.com/work/all/all/resorts-world-las-vegas. [Accessed 12 December 2022]. 

Pen, Kim Seung. 2022. Kim Seung Pen. https://kimseungpen.com/. [Accessed 12 December 2022].   

Soundslikelind. 2016. “Project: Lines – Interactive Sound Art Exhibition: Cycling ’74.” Project: LINES – Interactive Sound Art Exhibition | Cycling ’74. https://cycling74.com/projects/lines-interactive-sound-art-exhibition. [Accessed 12 December 2022].  

Soundslikelind. 2016. Sounds like Lind. https://www.soundslikelind.se/. [Accessed 12 December 2022].  

“Gallery.” 2018. LOT REIMAGINED: ACROSS AN EMPTY LOT. https://emptylot.weebly.com/gallery.html. [Accessed December 12, 2022].

Spiral Circus. 2022. “Silt for Nintendo Switch – Nintendo Official Site.” Nintendo. https://www.nintendo.com/en-ca/store/products/silt-switch/. [Accessed 12 December 2022].

Bushwick, S. (2020, January 29). Cyborg jellyfish could one day explore the ocean. Scientific American. Retrieved December 12, 2022, from https://www.scientificamerican.com/article/cyborg-jellyfish-could-one-day-explore-the-ocean/

Project 3 – Strange Networks (Nicky Guo)

Related works 

ODD Ball


ODD Ball is an interactive and playful musical instrument. The ball makes beats when users bounce it and can be used as a MIDI controller. It is made of silicone and designed to be hooked up to an app, creating sounds as the ball is bounced and caught. The harder the user bounces the ball, the higher the note or the louder the sound. Users can also connect multiple balls via the app, which expands the creative possibilities and introduces tricks for making original-sounding music.

This project inspired my thinking about new forms of musical instruments, and I decided to make an interactive sphere that both professional musicians and amateur music lovers could play with. In addition, the way users interact with the ball in this project (touching/bouncing) made me want to do something different: rotation, detected through the pitch and roll values in Arduino.

Glover Software & MiMu Gloves


Glover is the glue that enables artists to interface gesturally with off-the-shelf tools. It integrates with a DAW-based setup, adding a new dimension to an existing kit, and it is also OSC (Open Sound Control) addressable, allowing it to control and manipulate other performance elements such as visuals and lighting. MiMu gloves are a wearable musical instrument providing a complex gestural interface. With the MiMu gloves and the Glover software, any posture, finger, or hand gesture can be used to control music wirelessly.

This cooperative design project showed me that music production and performance can be coordinated by more than one person: one person uses motion control on the instrument itself while others interact with the music software on different devices, such as a laptop or phone.

 

Conceptualization

My research question is how to create a cooperative music-making experience for musicians and amateurs by designing an interactive and playful musical instrument. The problem space I identified is that many live performers play their music on stage with little interaction with the audience, and that people often simply listen to music made by others even though they may want to be involved in production but lack the knowledge. Traditional musical instruments are not especially playful to operate, learning them is time-consuming, and normally only one or two people can perform at the same time.

To address these problems, and especially the challenge of creating a strange network for multiple users, I set out to design a novel musical instrument that one user could play while the others control different aspects of the music (sound effects, volume, switches, etc.) through a simple interface on a laptop.

Based on the early research, I decided to design a spherical instrument that users interact with by rolling it around, which is a novel and playful approach. All of the technology (Arduino, LED, breadboard) had to be fixed inside the ball so that it would not fall apart during operation. Most of the remaining time was spent on producing sample sounds and on the connection between the serial monitor, MAX, and Ableton.

 

Design materials

  • Transparent bowl. This is the main material used to build the spherical instrument itself. It is transparent so that the lights inside are visible to the audience, strengthening the connection between the instrument and the audience.
  • Bubble wrap. It covers the surface of the sphere to diffuse the lights and make them read better in performance, improving the user’s experience in a real live musical performance.
  • Packing tape. Used to fix the position of all the technology inside the instrument.

 

Technologies

  • IMU. Detects the pitch and roll values, read via the Serial Monitor.
  • LED lights. Generate different RGB colors based on the pitch and roll values (see the sketch after this list).
  • Ableton. I created several simple soundtracks that users can choose to turn on/off as they like.
  • MAX. The software that connects Arduino to the nodes/Ableton. The pitch/roll values from the serial monitor are detected by the patch in MAX, and those values then trigger changes in MIDI effects in Ableton.
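As a rough sketch of the IMU and LED items above, the following Arduino code for the Nano 33 IoT derives pitch and roll from the onboard accelerometer and maps them to an RGB LED. The pin numbers, mapping ranges, and serial format are assumptions for illustration, not the values used in the actual instrument, and the MAX/Ableton side is omitted.

```cpp
#include <Arduino_LSM6DS3.h>   // onboard IMU of the Arduino Nano 33 IoT

// Assumed PWM pins for a common-cathode RGB LED.
const int RED_PIN = 3, GREEN_PIN = 5, BLUE_PIN = 6;

void setup() {
  Serial.begin(9600);
  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU");
    while (true) {}
  }
}

void loop() {
  float x, y, z;
  if (IMU.accelerationAvailable()) {
    IMU.readAcceleration(x, y, z);

    // Pitch and roll (degrees) estimated from the gravity vector.
    float pitch = atan2(x, sqrt(y * y + z * z)) * 180.0 / PI;
    float roll  = atan2(y, z) * 180.0 / PI;

    // Map orientation to LED colour (ranges chosen for illustration).
    int r = constrain(map((long)pitch, -90, 90, 0, 255), 0, 255);
    int g = constrain(map((long)roll, -180, 180, 0, 255), 0, 255);
    analogWrite(RED_PIN, r);
    analogWrite(GREEN_PIN, g);
    analogWrite(BLUE_PIN, 128);

    // Values that MAX would read from the serial port.
    Serial.print(pitch);
    Serial.print(" ");
    Serial.println(roll);
  }
  delay(50);
}
```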

 

Assembly


Demonstration

Link to the Arduino code:

https://github.com/Nickyggggg/Arduino-Project/blob/5c0f9772770b3b9c1e9f3e9993073b813e75f8e6/Strange%20Network

Code in MAX:


 

Music production in Ableton and MIDI effects:


 

 

 

Reference

C, Caro. Gesture Control for Electronic Music, 1 Dec. 2022, https://www.soundonsound.com/techniques/gesture-control-electronic-music.

“Odd.: A Playful Musical Instrument in the Form of a Bouncy Ball.” ODD. | A Playful Musical Instrument in the Form of a Bouncy Ball, https://oddballism.com/en-ww.

 

Strange Networks (David Oppenheim)

Prototype for a network of analog and digital flames

This prototype combined several networks to explore the relationship between analog and digital objects and the use of fire as an interface element.

I followed an iterative approach to research, ideation, design and development, outlined in the documentation that follows. 

1. Context Research

This piece was inspired by (and ended up re-mixing) Randall Okita’s short film, “No Contract.”[1]


Photo credit: screenshot, “No Contract” (Randall Okita)

An excerpted portion of the film’s synopsis describes how it was made:

“All flames in the video are authentic and were recorded live, rather than created digitally. A single performer was lit on fire in front of an audience and performed a 30-foot wire-assisted jump while burning.”

I was fascinated by the imagery of the two bodies, engulfed in flames, moving towards each other but also captivated by the scenes where we watch an audience watching the performer get lit on fire — I wanted to change the relationship between the audience and the imagery in my remix of Randall’s video.

Design inspiration for this project was also drawn from “Lightweeds” (2005) by Simon Heijdens.[2]

Photo credit: Simon Heijdens

 As Heijdens says:

“Lightweeds is an ecosystem of living digital plants that overgrow the man-made space, moving and growing directly according to outdoor weather patterns.”

Weather sensors bring data from the outside world into the museum and Heijdens’ digital plants respond in real-time. They also respond to people passing by.

Lightweeds, like other outside-inside data-driven installations, provided conceptual inspiration to include oceanic (sea rise) data as an input into the interactive experience.

 

2. Conceptualization

When I watched “No Contract” for the first time several years ago, I sat mesmerized by the scene where two burning bodies seem to hurtle through space in slow motion, moving towards each other but never colliding. I knew that Okita (a collaborator of mine) had wanted to explore themes of urgency and isolation in his film, but that’s about all I knew of his thematic intention.

Design Considerations

For this remix I wanted to play with the idea of a group of people watching this poetic imagery but situate the audience in a different relationship with the material by implicating them in a network of interactions.

While I was thinking of acts of self-immolation in the face of climate change while making this piece,[3] I wanted the audience to reach those themes on their own, preferring the risk that they wouldn’t make those connections (or make other ones) to being too literal or communicating my point-of-view too loudly through didactic design.

Screenshot of a portion of my design notebook

I eventually settled on a core interaction: lighting a match to light a candle. The intention was that a simple interaction with analog materials would connect the audience more intensively to the screen where bodies on fire were moving through space towards each other, and at the moment of lighting the match, towards the participant.

I also wanted to bring the outside world in and chose to contrast the fire with water, specifically the world’s rising oceans. More on that, below.

My main research questions were:

What does it feel like to light a match, then a candle, and then use flame as an interface element to control this durational video (an excerpt from Okita’s No Contract)? How can a series of small interactions with analog objects (wood, fire) in the physical world, accumulate and ultimately engender a feeling of connection with an abstract two-dimensional digital object (time-based media)?

Technologies and Materials Used

I chose to use an Arduino microcontroller to detect the act of lighting a match and to measure the change in light values as the participant lit candles and blew them out. The sensor values were networked using OSC input into TouchDesigner (TD) where I manipulated the video and sound. The intention was to bring the livestream of the ocean into TD as well, however I came up against some limitations, described below.
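A rough sketch of that sensing-and-networking step is shown below, using the CNMAT OSC library over WiFi UDP on the Nano 33 IoT. The analog pin, network credentials, target IP, port, and the "/flame/light" address are placeholders for illustration, not values from the actual installation.

```cpp
#include <WiFiNINA.h>
#include <WiFiUdp.h>
#include <OSCMessage.h>   // CNMAT OSC library (assumed choice)

const char* SSID = "your-network";        // placeholder credentials
const char* PASS = "your-password";
IPAddress tdHost(192, 168, 1, 100);       // laptop running TouchDesigner (placeholder)
const unsigned int TD_PORT = 7000;        // port TouchDesigner listens on (placeholder)

WiFiUDP Udp;

void setup() {
  while (WiFi.begin(SSID, PASS) != WL_CONNECTED) {
    delay(500);                           // keep retrying until connected
  }
  Udp.begin(8888);                        // local UDP port (arbitrary)
}

void loop() {
  int light = analogRead(A0);             // brighter flame -> higher reading

  OSCMessage msg("/flame/light");         // placeholder OSC address
  msg.add((int32_t)light);
  Udp.beginPacket(tdHost, TD_PORT);
  msg.send(Udp);
  Udp.endPacket();
  msg.empty();

  delay(100);                             // ~10 updates per second
}
```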

3. Prototype presentation and documentation

Location: Room 510 at 205 Richmond St. West (OCAD U)

Installation dimensions: Approximately 5’ wide by 10’ deep

Number of participants: Single or Multiple

Hardware: 1 x laptop, 1 x Arduino Nano 33 IoT, 1 x light sensor, 1 x breadboard, 1 x micro USB cable, 1 x 9V battery, 2 x external monitors

Software/language: Arduino IDE, TouchDesigner, Python

Installation Design:

I repurposed a box from a previous group project (Emotive Objects) and used it to hide the electronics and provide a place for the sensor to be surrounded by candles. I covered the cut outs that had originally been positioned in opposite corners and fashioned a cut out in the centre of the box.

The sensor was extended from the breadboard, threaded through the cutout, and taped down so that it was positioned as unobtrusively as possible.

A 9V battery was used to provide external power to the microcontroller, allowing the entire prototyped circuit to stand alone (separate from the laptop) and talk to TouchDesigner running on the laptop.


Job type: laser cut

Box size: 12”x 12”

Material: Baltic Birch

Material Thickness: 6mm

Network Diagram and User Experience Description

The following section outlines the prototype user experience that was demonstrated during the December 6th critique.

The network was as follows:


Network diagram

The 5’ x 10’ space was set up with a table covered in black fabric as the central point of interest. On the table were a monitor and six unlit candles at one end and, at the other, the black box with the embedded light sensor surrounded by six candles. Some loose matches and a matchbox were placed next to the box, alongside a champagne glass to hold the discarded matchsticks. One of the candles was lit and a few burnt matches were floating in the champagne glass (water was added so that the burnt matches would make a sound when discarded into the glass).

Opposite the table, on the periphery of the installation space, was a smaller table with an iPad and Bluetooth speaker. A livestream of the ocean was displayed on the tablet with the sound coming through the external Bluetooth speaker.

The opening state of the installation was designed so that the video of the bodies on fire was glitching but would shift into a smoother playback state as candles were lit. As more candles were lit the playback speed increased up to 1.5 times normal. If the candles were blown out the video would slow down and eventually stop. If light values in the room fell below the opening state, the video would have (in theory) moved into a reverse playback state.
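The mapping from room brightness to playback behaviour can be summarized as a small function. The sketch below only illustrates the logic described above; the thresholds are invented, and in the installation the mapping was implemented inside TouchDesigner rather than in standalone C++.

```cpp
#include <cstdio>

// Illustrative mapping: below the opening light level the video plays in
// reverse; from the opening level up to "all candles lit" the speed ramps
// from near zero (slow/glitchy) to 1.5x normal.
float playbackSpeed(float light, float openingLight, float allCandlesLight) {
  if (light < openingLight) return -1.0f;                 // reverse playback
  float t = (light - openingLight) / (allCandlesLight - openingLight);
  if (t > 1.0f) t = 1.0f;                                 // clamp once fully lit
  return 1.5f * t;
}

int main() {
  // Example with made-up sensor units: opening level 200, fully lit 800.
  printf("%.2f\n", playbackSpeed(150, 200, 800));  // -1.00 -> reverse
  printf("%.2f\n", playbackSpeed(500, 200, 800));  //  0.75 -> halfway
  printf("%.2f\n", playbackSpeed(900, 200, 800));  //  1.50 -> fastest
  return 0;
}
```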


User flow diagram

 


Video documentation of user experience (YouTube)

Link to code on GitHub (TouchDesigner file, Arduino code)

If I were to take this prototype forward, the live stream of the ocean would be projected onto the outside of the installation and respond to the audience milling about outside the core space, perhaps creeping up the sides of the installation’s exterior walls, the sound of the ocean getting louder as time passed and more people gathered. For this prototype it simply formed an ambient layer that could not be interacted with. Attempts to bring the stream into TouchDesigner were unsuccessful. I was able to bring a YouTube stream in but unable to avoid the stream being interrupted by commercials. Initial research suggests that public RTSP video stream might be the best way to proceed going forward.

4. Next Steps

This version 1 prototype showed me that there is something to the interaction of lighting a match and using the flame as interface element, however the first thing I would do if I were to take this forward would be several rounds of user testing to get a better sense of the sensations, emotions and thoughts that the installation in its prototype form was engendering in participants. I would start with asking people to talk aloud as they were moving through the piece and then follow-up with a series of open-ended questions. I would compare their perceptions with my intentions and tweak the design until I found the sweet spot where interactions evoked affective responses — autonomic, pre-subjective, visceral and preceding emotional states, or as Anable writes, “…forces that inform our emotional states”[4] — and some (but not all) participants thought of what it means to perform an act of protest such as self-immolation, and what it means to live in a world where they let others perform the act on their behalf.

 

Bibliography

[1] https://www.randallokita.com/no-contract

[2] http://www.simonheijdens.com/indexsmall.php?type=project&name=Lightweeds

[3] For example, Wynn Bruce who set himself on fire outside the United States Supreme Court in April 2022 https://www.nytimes.com/2022/04/24/us/politics/climate-activist-self-immolation-supreme-court.html

[4] Aubrey Anable, Playing with Feelings: Video Games and Affect (Minneapolis: University of Minnesota Press, 2018)., xvii

 

 

Click to Win – Racing game – Project 3

Click To Win – Racing Game

Yueming Gao — Prathistha Gera—Maryam Dehghani—Firaas Khan


 

Another future possibility:

In the future, this could also be used to accommodate visiting siblings who must wait in the waiting room of a children’s hospital. Because of the risk of germs on hospital equipment, these visitors would use only their own phones to communicate and engage with the game.


Links to code

Host Code

Player Code

Bibliography

Click Click Click. https://studiomoniker.com/projects/click-click-click.

Credits. https://clickclickclick.click/credits.

The Sweet Screen. https://studiomoniker.com/projects/the-sweet-screen.

https://momentfactory.com/work/all/all/changi-experience-studio-at-changi-airport.

About. https://momentfactory.com/about.

Sketch 5 – Wentian Zhu

Leap Motion and Touch Designer


I used Leap Motion to get data from my hand (x/y/z parameters) to control the rotation and the size of the model.
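For reference, this is roughly what reading the palm position looks like with the classic Leap Motion V2 C++ SDK; it is a standalone sketch with invented scaling factors, whereas the actual sketch read the same values through TouchDesigner's Leap Motion support.

```cpp
#include <cstdio>
#include <thread>
#include <chrono>
#include "Leap.h"   // classic Leap Motion V2 SDK (assumed to be installed)

int main() {
  Leap::Controller controller;

  while (true) {
    Leap::Frame frame = controller.frame();
    if (!frame.hands().isEmpty()) {
      Leap::Vector pos = frame.hands()[0].palmPosition();   // millimetres

      // Invented mappings: hand x -> rotation, hand y -> scale.
      float rotationDeg = pos.x * 0.5f;
      float scale = 1.0f + pos.y / 300.0f;
      printf("rot=%.1f deg  scale=%.2f\n", rotationDeg, scale);
    }
    std::this_thread::sleep_for(std::chrono::milliseconds(16));  // ~60 Hz polling
  }
  return 0;
}
```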

Video link: https://drive.google.com/drive/folders/1wk9JfA8lo1ICsoC15DdlgkWdrMKPcXlY?usp=share_link