MVP Console

Project Description

MVP Console is an interactive installation built around a tangible interface experience. Players are invited to play with three digital activities created in Processing: Core ball, Spinning Music, and Starry Drum. MVP Console is a handmade cardboard box with 5 aluminum foil buttons attached to 5 pins of an MPR121 touch sensor. MVP stands for Musical, Visual, Physical: the console lets players enjoy physical touch, music making, and music visualization. Players switch the card paper on top of the box to play the matching activity.
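
The touch-sensing side of a setup like this can stay very small. The sketch below is only a minimal illustration, assuming the Adafruit MPR121 Arduino library and foil pads wired to electrodes 0-4; it is not the project's actual code, which is in the GitHub repository linked further down.

```cpp
// Minimal sketch (assumed wiring: five foil pads on MPR121 electrodes 0-4).
// Sends the touch state of each pad over serial as comma-separated 0/1 values
// for Processing to read.
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap = Adafruit_MPR121();

void setup() {
  Serial.begin(9600);
  if (!cap.begin(0x5A)) {        // default I2C address of the MPR121
    while (1);                   // halt if the sensor is not found
  }
}

void loop() {
  uint16_t touched = cap.touched();      // bitmask of all 12 electrodes
  for (int pad = 0; pad < 5; pad++) {
    Serial.print((touched >> pad) & 1);  // 1 if this foil pad is touched
    if (pad < 4) Serial.print(",");
  }
  Serial.println();
  delay(50);
}
```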

The first activity is Starry Drum. While the player holds a button, they hear constant drum beats and starry dots appear on the screen. Each button represents a different part of the drum kit and a different size of dot. Older dots fade away as new dots appear. In this activity, players paint an image of a starry sky and play the drum at the same time.

The second activity is Spinning Music. Players touch a button to play a song and watch the spinning pattern, which visualizes the music. There are 5 songs in total, and the spinning pattern changes color and speed to match each one. The songs are all by Woodkid: “L’aérogramme de Los Angeles”, “Iron (Quintet Version)”, “I Love You”, “Iron”, and “Run Boy Run”. The pattern and song stop as soon as the player releases the button, making it a unique, tactile way to engage with music and visuals.

The last activity clones a popular phone game, Core ball. Players shoot the given needles at a rotating circle; every needle that lands sticks to the circle and rotates with it. If a new needle hits a needle already on the circle, the system judges it a loss and you must restart to continue. The screen turns blue when you win a level and red when you lose. Players press three aluminum buttons labeled C, S, and SHOOT! to play; everything is drawn and labeled on the card paper. The current level is shown on the screen, and the number of needles left appears in the center of the circle. The higher the level, the more needles you need to shoot.

Experience Video

How it works

https://ocadu.techsmithrelay.com/rcWR

Final project images


Development images


Link to the Arduino code hosted on Github

https://github.com/bernicelai/experiment3.git

Circuit Diagram


 

Project Context

MVP Console is inspired by my favourite entertainment and learning device from my childhood, an electronic dictionary. It looked like a small laptop: heavy, but convenient. Beyond looking up vocabulary, it was my “all-purpose media player”. It worked as a music player, and while music played, the screen showed a visualization in the form of a spinning firework pattern. Although it had a dull green dot-matrix screen and only one song, its music visualization felt like magic to me as a child. It was also my game console; it had one game, Snake, which always made me lose track of time. Finally, it was my musical instrument. Every button had a different sound effect, which I used to cover the popular songs of the time, and I even performed with it in music class. The electronic dictionary is now obsolete and its functions were simple, but it gave me countless happy memories. With this project, I wanted to recreate those childhood memories.

Once I learned the theme of Experiment 3, tangible interfaces, the first thing that popped into my mind was music. It reminded me of tap-tap games, which have a perfect combination of touch, visuals, and music. It also reminded me of electronic devices that imitate the sound of real musical instruments, which can be a more convenient way of composing music. And it could be about music visualization: combining physical interaction with music you can see could be an inspiring experience. Music is important in my life. It affects my emotions, and it is completely connected to my vision: I compose with my vision, and I listen to create images.

For this project, I wanted to explore the relationship between the musical, the visual, and the physical. I wanted to make an “all-purpose media player”, an “old game console”. Inspired by Tyler Crumpton’s Capacitive-Touch Arduino Keyboard Piano, I made the buttons from aluminum foil, tape, and cardboard. I built a cardboard box with 5 square holes on top and stuck aluminum foil under each hole. There are three card papers with my drawings and holes that fit each game. Players can switch the card paper on the box, like swapping the disc in a game console. The touch of the aluminum foil buttons is the Physical part of my MVP Console.

I used the button sound effects of my electronic dictionary to make music: even without a real instrument at hand, I could still make music. It is similar to calculator cover music, which is now very popular, a creative way of making music with unconventional methods. Each button of a musical calculator corresponds to a musical note, and with two calculators players can try the classic piano technique of using both hands (Lee). That was quite inspiring; to play music with only 5 buttons, I wanted to recreate it with the sound of drumbeats. When I was learning drums and practising fast beats, the countless beats in a second made me imagine a thousand sparkling stars blinking with the rhythm. I wanted to combine that visual imagination with the sound of the drumbeat. I purposely made the drumbeat continue for as long as the touch isn’t released; there is no single beat. This fits the speed of the drumbeat and the endlessness of the stars: repeating the same sound can be beautiful, and the same is true of visuals.

Spinning Music is a music visualization built on a rotating pattern. A looping rotating pattern always makes me lose track of time just staring at it. After watching a YouTube tutorial on rotating patterns, I thought one could fit the music well and serve as the animation for the visualization. By changing its speed and color, it can work with music of different styles. That is one of the reasons I chose Woodkid’s music: he works in many styles, and his songs reach my emotions and speak to me. The pairing of visual and music reflects his statement, “I like to translate sounds into images and images into sounds” (Dellarciprete). The pattern keeps rotating while we hold the button; when we stop it, even though the pattern is static, we feel as if it is rotating backward. It creates an optical illusion, which also suggests the melody is still stuck in our minds. Music affects our mood, and in this activity we explore 5 different emotions and memories.

Core ball is a popular phone game, and I paired it with a shooting sound effect. Since there are 5 buttons overall and this game only needs 3, I cut 3 holes in the card paper. Once we are familiar with the game, we can form beautiful patterns, like a dandelion. The rules are simple but quite addictive, and every needle shot combines human touch with a sound effect.

Overall, under the topic of tangible interfaces, I let players explore 3 types of touch. In Core ball, a single tap triggers one shot and one sound effect. In Starry Drum, the drum beats constantly while the button is touched; we never hear just a single beat. In Spinning Music, the music starts when we touch the button, keeps playing only while we hold it, and stops the moment we release it. I wanted it to feel like a real game console: only a few buttons, but you can play multiple games.
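
These three behaviours map naturally onto touch and release events from the sensor. The sketch below is illustrative only, assuming the same Adafruit MPR121 wiring as above; the event labels are made up for the example, not taken from the project code.

```cpp
// Illustrative only: distinguishing the three touch behaviours described above.
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap = Adafruit_MPR121();
uint16_t lastTouched = 0;

void setup() {
  Serial.begin(9600);
  cap.begin(0x5A);
}

void loop() {
  uint16_t touched = cap.touched();
  for (int pad = 0; pad < 5; pad++) {
    bool now    = (touched     >> pad) & 1;
    bool before = (lastTouched >> pad) & 1;
    if (now && !before) Serial.println(String(pad) + ",press");    // Core ball: one shot per tap
    else if (now)       Serial.println(String(pad) + ",hold");     // Starry Drum / Spinning Music: keep beating or playing
    else if (before)    Serial.println(String(pad) + ",release");  // Spinning Music: stop the song and pattern
  }
  lastTouched = touched;
  delay(50);
}
```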

 

Citation

Crumpton, Tyler. “Capacitive-Touch Arduino Keyboard Piano.” Instructables, Instructables, 23 Oct. 2017, www.instructables.com/Capacitive-Touch-Arduino-Keyboard-Piano/.

Dellarciprete, Alex. “Woodkid’s The Golden Age – An Analysis of an Audiovisual Experience.” Medium, Medium, 12 Dec. 2017, medium.com/@alexdellarciprete/woodkids-the-golden-age-an-analysis-of-an-audiovisual-experience-44ec7d73047. 

Lee, Dami. “Calculators Are the Latest Instrument Used in Creative New Ways to Make Electronic Music.” The Verge, The Verge, 11 Oct. 2017, www.theverge.com/tldr/2017/10/11/16459502/musical-calculator-youtube-super-mario-theme. 

Frog You by Jay

Such a cute… frog?

Project description

Frog You is a game created in Processing that is designed for play with a tactile interface. This clone of the 1981 Konami arcade game, Frogger, places you in control of a randomly chosen animal (none of which are frogs), with a mission to cross a busy highway and use boats to cross a raging river. Why does the frogger cross the road? To get to the other side, of course! Upon reaching the end of a level, your character moves on to the next, where they face a distinctively new challenge, as the obstacles are randomly generated. At the top of the screen, you can see your Level and Score, which is your total consecutive forward jumps without being smooshed, encouraging careful, but persistent, gameplay. The game comes with a controller, powered by an Arduino using touch sensor input, designed to add an additional layer of difficulty as you learn to orient yourself to counterintuitive controls. Those of us who play games a lot can easily take for granted our familiarity with controllers, but with this game, I sought to use the tactile interface as a means of forcing the player to “re-learn” how to use a controller. The game is quite difficult early on, but its rapid iteration time (and the consolation prize of cute animal avatars) keeps the player invested long enough to improve.

Experience Video

https://drive.google.com/file/d/1QhWHIgnuXwmdcRN6qnVPaF5Yu6wgHku3/view?usp=sharing

How It Works

My microphone was determined not to work, so I’ve captioned the images with what I’m doing.

The game treats cars and boats as the same object, but detects collision differently depending on which lane they are in.
The Arduino is taped to the back of the controller.
The “buttons” point in the direction your character moves, but their positions are inverted. This adds an extra layer of difficulty, as the user has to learn how to use the controller.
“Dammit! Oh, look, a giraffe.” The character image changes with every reset. There are 10 character images in total, randomly selected, so the player isn’t as demoralized by losing.

Progress Pictures

Started by grabbing assorted cardboard from my basement.
Cut away until I had something roughly the size of a controller. A bit awkward to hold, but that’s also kind of the point.
What’s wrong with the control scheme? It tells you exactly what it does.
Taped the Arduino to the back. This encourages users to hold the box in a way resembling a controller, rather than how they would hold a phone.

Link

GitHub:

https://github.com/JayTheDaniels/Frog-You

For a standalone version of the game, check out my itch.io page!
https://jaythedaniels.itch.io/frog-you

Circuit Diagram


Project Context

The context of this project exists less in relation to other works, and is more in relation to my own aspirations, particularly the upcoming thesis project. I had spent the last few months slowly trying to self-teach game design and development, but didn’t feel as though I was making much progress. Just before the start of Experiment 3, I realized this was due to not having actually made anything, instead only watching lectures and tutorials. For this project, I had my own side-mission of making a proper game as the screen-based interaction component, and using a tactile interface to augment the player’s experience in some way. Admittedly, I spent far more time conceptualizing the former than the latter, but there was one key idea that I wanted to explore using the two: how to effectively teach playing games.

Have you ever seen someone who doesn’t play games hold a controller? It’s almost always positioned awkwardly: index fingers placed on joysticks, thumbs off controller entirely, trigger buttons largely disregarded. Even those who play games more regularly may hold controllers in different ways. PC Gamer magazine found that amongst its own writing staff there were three different ‘styles’ for holding controllers, and that’s before introducing the awkward holding of a Nintendo 64 controller, with a joystick in the dead center (PC Gamer, 2020). Games teach players how to use the controls for the specific game, but few seem to actively teach how the controller should be held. For the inexperienced player, this can add an unexpected level of difficulty and may increase resistance to investing time to learn. For the experienced player, using more efficient styles such as ‘the claw’ grip, this can have negative impacts on health, as: “[w]hen you use the claw grip on a console controller (Xbox, Playstation, or any other), there are five muscles that may be involved. The fashion in which these muscles are engaged creates the potential for pain and injury risk—specifically, in your index finger” (Esports Healthcare). The variance in console manufacturer controllers further adds to the confusion, as the way one holds a Nintendo Switch is different from a PlayStation controller.

The goal of the tactile interface component was to create something that would force the player to learn how to use the game controller. In order to achieve this, the controller needed to be changed in a way that would prove troublesome for experienced gamers and novices alike. I quickly decided to change the typical layout of the controls, directional keys pointing outwards, to its reverse: directional keys pointing inwards. As I played through it, I had to be consciously aware of what each button did, as muscle memory was rendered unreliable. This demonstrates the importance of design decisions made for the screen-based interaction. I intended to encourage an iterative approach to learning the controls: provide players with a simple challenge (avoid cars, ride boats), give a reward for failing (random animal graphic), and provide metrics to show improvement (Level and Score). Because levels are generated semi-randomly, the player can continue for as long as they like, developing their mastery over the controller as they progress. In an effort to guide the player into holding the controller in a specific way, I decided to place the Arduino directly on the back. This was perhaps not optimal, but done as a response to a tester holding the controller with one hand, and using the other to touch the directional pads (which worked brilliantly and is still under consideration).
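
As a rough illustration of that reversal, the sketch below shows one way the inverted mapping could be expressed on the Arduino side. It assumes the four directional pads are read by an MPR121 on electrodes 0-3; the pad order and direction characters are placeholders, not the project's actual code (see the GitHub link above).

```cpp
// Hypothetical sketch of the inverted control mapping described above.
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap = Adafruit_MPR121();
uint16_t lastTouched = 0;

// Pad 0 is the pad sitting where "up" normally would be, and so on; each pad
// sends the opposite direction, which Processing uses to move the character.
const char invertedDirections[4] = {'D', 'U', 'R', 'L'};

void setup() {
  Serial.begin(9600);
  cap.begin(0x5A);
}

void loop() {
  uint16_t touched = cap.touched();
  for (int pad = 0; pad < 4; pad++) {
    if (((touched >> pad) & 1) && !((lastTouched >> pad) & 1)) {
      Serial.println(invertedDirections[pad]);   // one move per new touch
    }
  }
  lastTouched = touched;
  delay(20);
}
```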

Lastly, this experiment provided me with the opportunity to apply some of the game design theory I had acquired, but had yet to put into practice. The experience of having put together a short game with a specific purpose invigorated interest in doing something similar for the thesis project. While it may not be clear yet what the purpose will be, I feel especially confident that I can produce a game and apply theory to it in a meaningful way for my thesis.

Works Cited:

Esports Healthcare (2020). Claw grip for controller users: 7 steps to stay healthy. Esports Healthcare. Retrieved from: https://esportshealthcare.com/claw-grip-for-controller/#What_is_the_claw_grip_for_controller_users

PC Gamer (May 16, 2020). How do you hold a controller? PC Gamer. Retrieved from: https://www.pcgamer.com/how-do-you-hold-a-controller/

 

Visual Drumstrokes by Unnikrishnan Kalidas


About the Project:

This experiment was created as an add-on to my drumming hobby: it uses tactile hand touches on a conductive surface as a drum pad. The idea is not only to emulate specific drum sounds for different electrodes but also to generate visuals for each signal that is triggered. I have a particular sound I go for when playing the drums, and the standard sound a kit produces is not enough. Due to the present COVID situation I was not able to access a drum studio, but later on I plan to connect this system to a live setup, with a screen to show the visualization and an amplifier to play the additional layer of sound on top of the existing drum sound, using piezo sensors instead of the MPR121 because of its latency. This could later be used at actual musical gigs. The goal is to create a more engaging experience of beats, in both sound and visuals. For this project, I built a two-electrode system: one electrode emulates the bass drum sound and the other a snare drum sound.
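
As a sketch of that planned piezo version (not the current MPR121 build), hit detection could look roughly like the following; the pins, threshold, and wiring are assumptions rather than details from the project.

```cpp
// Sketch of the planned piezo alternative (assumed wiring: piezo discs with
// 1M-ohm bleed resistors on A0 and A1 for bass drum and snare). A hit is
// detected when the reading crosses a threshold, and the drum ID is sent
// over serial for Processing to play the sample and animation.
const int piezoPins[2] = {A0, A1};   // 0 = bass drum, 1 = snare
const int threshold = 100;           // tune to the piezo and playing force

void setup() {
  Serial.begin(9600);
}

void loop() {
  for (int drum = 0; drum < 2; drum++) {
    int reading = analogRead(piezoPins[drum]);
    if (reading > threshold) {
      Serial.println(drum);          // which drum was hit
      delay(30);                     // crude debounce so one hit isn't counted twice
    }
  }
}
```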

A simple project that inspired me was www.patatap.com, a website that plays high-fidelity sounds on keyboard keystrokes, along with a momentary animation characteristic of each beat.

Experience Video:

https://www.youtube.com/watch?v=H0P8gQQJouM

 

How it Works Video: 

https://www.youtube.com/watch?v=ee-6GHXOPGY

 

Final Project images :


 

Development Process:

Arduino Leonardo-


 

MPR121 Sensor-


 

Sample touchpads-


 

Trial Video for Keystrokes:

https://www.youtube.com/watch?v=gpfnAl0rmS0&feature=youtu.be

 

GitHub link:

https://github.com/Unnikrishnankalidas/Tangible-interface

 

Circuit Diagram 


Project Context

The concept for Visual Drumstrokes came about from the simple idea of using a tangible action like touch to produce a visual and aural signal; it could be music, an instrument, or even just noise. My interest in drumming as a hobby led me to think in this direction and create a drum sampling machine that can use any capacitive surface as a trigger. As I joined the course comparatively late, I had some restrictions in terms of my coding knowledge and the equipment at my disposal. Thankfully I already had a few of the components with me, and with the help and support of Nick and Kate I was able to come up with something on short notice. The idea was simple, but I wanted to execute it neatly, so I decided to have each electrode signal from the MPR121 not only play a WAV or MP3 file through Processing’s sound library but also trigger a short, beat-like animation, within the limits of my Processing knowledge.

A few years back I came across the website http://www.patatap.com, which is essentially a visual animator for keystrokes that plays different sounds on the computer. It was clearly the inspiration for this project. Not only did the idea get me thinking; the very simplicity of the animation made the page highly engaging. My aim was to recreate that idea with my own sounds and animation, but in Processing, as I was already running short of time.

The other projects that inspired me and helped me to think creatively were:

https://www.instructables.com/Capacitive-Touch-Arduino-Keyboard-Piano/

https://create.arduino.cc/projecthub/user4573/copy-of-paper-piano-62302f?ref=tag&ref_id=capacitive&offset=15

https://learn.adafruit.com/capacitive-touch-drum-machine

 

 

 

Dhyāi Drishti by Simran and Krishnokoli


About the project

Dhyāi Drishti is an interactive installation that could be developed into a product to help induce a state of lucidity, relaxation, concentration, and mindfulness. It is a simple device that responds to touch and creates a meditative atmosphere around the viewer. The installation consists of a tactile yogic figurine along with a screen and speaker. The figurine is embedded with small aluminum buttons, positioned according to the chakras or ‘energy points’ in the body and embellished with their representative colours and icons. On touching these buttons, the viewer experiences a meditative visual of the selected chakra on the screen, along with ruminative ‘tanpura’ and ‘sitar’ music in the background, intended to activate the corresponding chakra in their body.
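
As a rough sketch of the touch side, assuming the seven chakra buttons sit on MPR121 electrodes 0-6 (the actual wiring is in the GitHub repository linked below), the Arduino only needs to report which chakra was newly touched so that Processing can switch the visual and the background music.

```cpp
// Minimal sketch, assuming the seven aluminum chakra buttons are wired to
// MPR121 electrodes 0-6; only the index of a newly touched chakra is sent.
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap = Adafruit_MPR121();
uint16_t lastTouched = 0;

void setup() {
  Serial.begin(9600);
  cap.begin(0x5A);
}

void loop() {
  uint16_t touched = cap.touched();
  for (int chakra = 0; chakra < 7; chakra++) {
    if (((touched >> chakra) & 1) && !((lastTouched >> chakra) & 1)) {
      Serial.println(chakra);   // e.g. 0 = root chakra ... 6 = crown chakra (assumed order)
    }
  }
  lastTouched = touched;
  delay(50);
}
```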

The term ‘Dhyāi’ is a Sanskrit word for concentration or mindfulness. It is the origin of the word ‘Dhyana’ in Hindi or Dhyān in Bengali which means meditation. The term ‘Drishti’ is a Sanskrit word for vision. Our project, ‘Dhyāi Drishti’ is intrinsic to the concept of chakras and their meditative prowess. It is supposed to help the viewer activate a particular chakra as they see the visual, hear the music and enter a meditative state. Each sound and visual is carefully designed and curated according to each chakra. Chakra, meaning “wheel” in Sanskrit, represents a series of energy points (prana) in the body. While they are in the body, they’re not physical centers. They could be considered “astral”, or in what is often alluded as our “subtle body”. The concept behind them is ancient — the first mention appears in the Rig Veda, dating back to approximately 1500 B.C.E. — some of the oldest writing in our civilization. Similar versions of the chakras are incorporated in Hinduism, Buddhism, Jainism, and several New Age belief systems. The chakras go up and down the spine, from the bottom up to just above the crown of the head. They each represent a step forward in evolving consciousness. Below is a reference to all the chakras and their intrinsic meaning.

Muladhara is the root chakra; it is red in colour and symbolizes safety, survival, grounding, and nourishment from the Earth’s energy. Note that in chakra healing practices, red may denote inflammation at the physical level.

Svadisthana is the sacral chakra; it is burnt brick in colour and carries meanings associated with emotion, creativity, and sexuality, and is associated with water and flow.

Manipura is the solar plexus chakra; it is yellow in colour and symbolizes mental activity, intellect, personal power, and will.

Anahata is the heart chakra; it is green in colour and is connected with love, relating, integration, and compassion.

Visuddha is the throat chakra; it is blue in colour and symbolizes self-expression, expression of truth, creative expression, communication, and perfect form and patterns.

Ajna is the third eye chakra; it is indigo in colour and evokes intuition, extrasensory perception, and inner wisdom.

Sahasrara is the crown chakra; it is purple in colour and is associated with the universal, connection with spirituality, and consciousness.

 

 

 

Experience Video

Behind The Scenes Video

Project Images

Experience 1


Experience 2


Development Images


Github Link

 

https://github.com/Krishnokoli/Experiment-3-Tangible-Interfaces

Fritzing Diagram


Project Context

The idea and concept for Dhyāi Drishti were initially conceived from our particle code. While brainstorming a possible concept for our project, we saw the particle scatter as a meditative animation. As we discussed the concept further, we realised how often we tend to miss out on self-actualisation and meditation. Since meditation is important not only for well-being but also for mental clarity, concentration, and a myriad of other reasons, we figured that making an interactive, tactile meditation aid would be a good idea after all. And since one of us fell sick during the project, it really helped us calm down and bring meditation into our daily activities (especially while coding).

One of our main inspirations for the project was Unyte, a relaxation and stress-management program built around a biofeedback device known as the iom2, which tracks breathing and heart rate and guides the user through the practice. The iom2 measures heart rate variability (HRV), the variation in time between heartbeats, which is considered a strong indicator of a meditative state.

While most of us are stuck at home, socially distancing to prevent further spread of COVID-19, we often find ourselves restless, sleepless, agitated, annoyed, bored, or unable to concentrate. Because social interaction is so elemental to our nature, we find it very difficult to isolate. Meditation has been shown to increase mindfulness and an overall sense of well-being in these trying times. Our project is a humble attempt to promote the simple activity of meditating. We look forward to increasing the scope of the project and creating diverse interactions that make meditation more fun, easy, and experiential. We would also seek opportunities in the future to keep working together on this project and develop the concept into a concrete installation, product, or tech wearable.

Citations

“Journeys – Unyte.” Accessed November 7, 2020. https://unyte.com/pages/journeys.

“Pause | Mindful Movement for a Happier You.” Accessed November 7, 2020. https://www.pauseable.com/.

Peck, Bob. “The Chakras Explained.” Medium, April 1, 2020. https://medium.com/@bewherehow/the-chakras-explained-6aa43e1f0f5c.

Salehzadeh Niksirat, Kavous, Chaklam Silpasuwanchai, Mahmoud Mohamed Hussien Ahmed, Peng Cheng, and Xiangshi Ren. “A Framework for Interactive Mindfulness Meditation Using Attention-Regulation Process.” In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 2672–84. Denver Colorado USA: ACM, 2017. https://doi.org/10.1145/3025453.3025914

Sparks, Lloyd. “The Neurophysiology of Chakras.” Medium, June 13, 2019. https://medium.com/@lloydsparks/the-neurophysiology-of-chakras-3f20a0f5b3b5

“The Story of PAUSE.” Accessed November 7, 2020. https://www.ustwo.com/blog/the-story-of-pause.

Vidyarthi, Jay, and Bernhard E. Riecke. “Interactively Mediating Experiences of Mindfulness Meditation.” International Journal of Human-Computer Studies 72, no. 8–9 (August 2014): 674–88. https://doi.org/10.1016/j.ijhcs.2014.01.006.

 

Cyber Box By Jessy


Project description 

Cyber Box is an interactive box with three types of action that trigger interactions between the screen and the tangible interface, and the whole process is sequential and narrative. In the first part, users see a dialog box pop up each time they place a sticker in a designated area. This action represents people making comments about or labeling others on social media. In the second part, users control the volume of a soundtrack made of people talking; the change in volume is also shown as dynamic sound waves on the screen. The sounds come from people’s everyday conversations, and when they pile up or grow louder, they create a sense of oppression that is uncomfortable for the listener. In the third part, users push the ‘button’ to inflate a virtual balloon on the screen, and after repeating this action the balloon eventually pops.
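
Purely as an illustration of how the three inputs could reach Processing, here is a hypothetical Arduino sketch. The actual sensors used in Cyber Box are not described above, so reading the sticker area and push “button” as MPR121 pads and the volume control as a potentiometer are assumptions, not the project’s real circuit.

```cpp
// Hypothetical wiring only: sticker area and push "button" as MPR121 pads 0
// and 1, volume knob as a potentiometer on A0. Sends "sticker,push,volume"
// as comma-separated values to Processing each loop.
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap = Adafruit_MPR121();

void setup() {
  Serial.begin(9600);
  cap.begin(0x5A);
}

void loop() {
  uint16_t touched = cap.touched();
  int volume = analogRead(A0);            // 0-1023, mapped to loudness in Processing
  Serial.print((touched >> 0) & 1);       // sticker placed in the designated area
  Serial.print(",");
  Serial.print((touched >> 1) & 1);       // balloon "push" button
  Serial.print(",");
  Serial.println(volume);
  delay(50);
}
```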

Experience Video

How It Works 

Final project images


Development images


Link to the Arduino code hosted on Github

https://github.com/xinzhang-jessy/cyberbox-experiment3.git

Circuit Diagram


Project Context

With the prevalence of social media and digital applications, our relationship with the virtual world is much closer. People are used to posting their feelings and opinions on social media. In this virtual world, people share comments, photos, and posts, and this content is viewed by strangers and acquaintances alike. Increasingly, this digital world gives rise to cyberbullying: bullying through digital technologies. It can take place on social media, messaging platforms, gaming platforms, and mobile phones. According to the Cyberbullying Research Center, about half of young people have experienced some form of cyberbullying, and 10 to 20 percent experience it regularly. It is easy to post images or text on social media, just as it is easy to stick a note to a whiteboard, an act so common that we do not think before doing it. Similarly, people can easily post hurtful things or spread rumors online and pretend to be someone else, so cyberbullies may not realize the consequences and may even find it funny. Through Cyber Box, I try to connect physical actions with behaviors that exist in the digital world. The first is pasting a sticker repeatedly to see many dialog boxes, which makes people feel depressed once the screen is filled with them; this represents text-based cyberbullying on social media, which can follow people later in life. Even text alone can cause great psychological stress, which is what the second interaction of Cyber Box conveys. The volume controller is like a filter, but in the virtual world it is difficult to filter out cyberbullying. The pressure produced by cyberbullies is similar to the outcome of the physical act of pushing: the human mind is also under pressure when facing many ‘pushes’.

Even though we are constantly switching between the online world and the physical world, in the virtual world we are mainly connected through gestures and postures. The word “touch” is in the word “touchscreen,” but tapping and swiping a cold, flat piece of matter largely neglects the sense of touch. During long hours of manipulating touchscreens, you experience only a fraction of what your sense of touch allows. The goal of Cyber Box is to seek more connection between the screen-based world and the physical world.

Citation

1. Bullying Statistics, http://www.bullyingstatistics.org/content/cyber-bullying-statistics.html

2. Lucia Kolesarova, “Designing For The Tactile Experience,” Smashing Magazine, https://www.smashingmagazine.com/2018/04/designing-tactile-experience/

Hybrid Plant in Augmented Reality Space


This project attempts to present a tangible interface on a physical plant with virtual on-screen content in augmented reality space. Most AR experiments on mobile devices or mixed reality glasses focus on screen-based interaction with gestures; I think these experiments offer little engagement with the physical context. Moreover, most AR experiments have little connection between the physical and virtual elements, which simply sit apart in the space. This project attempts to provide a more engaging experiment with a physical plant and virtual content for the topic of tangible interfaces, anchoring the virtual content onto the plant to expand the concept of the hybrid object. Three sensors are installed in the plant: two ultrasonic sensors and a photoresistor. The ultrasonic sensors are installed on the two sides of the plant, and the photoresistor is on top. The interaction lies in how people touch different points around the plant with their two hands. The virtual plant is visualized as particles in AR space, and touching different points, and combinations of points, affects the movement of the particles. Behind the scenes, I used MQTT as the framework for agent-to-agent communication. The sensors in the plant capture data from their context, and this information is hosted and sent by the ESP32 to the cloud and then to the visualizing device. Unlike traditional server-to-client communication, this method lets agents communicate directly by publishing and subscribing to a topic.
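
A condensed sketch of what the sensing-and-publishing side could look like is below. It assumes the widely used PubSubClient MQTT library; the Wi-Fi credentials, pin numbers, broker address, and topic name are placeholders rather than the project’s actual values (the real Arduino code is linked further down).

```cpp
// Sketch only: two ultrasonic sensors plus a photoresistor on an ESP32,
// publishing their readings to an MQTT topic for the AR client to subscribe to.
#include <WiFi.h>
#include <PubSubClient.h>

WiFiClient wifi;
PubSubClient mqtt(wifi);

const int trigPins[2] = {12, 14};   // assumed pins for the two ultrasonic sensors
const int echoPins[2] = {13, 27};
const int ldrPin = 34;              // assumed analog pin for the photoresistor

long readDistanceCm(int trig, int echo) {
  digitalWrite(trig, LOW);  delayMicroseconds(2);
  digitalWrite(trig, HIGH); delayMicroseconds(10);
  digitalWrite(trig, LOW);
  long duration = pulseIn(echo, HIGH, 30000);   // time out after 30 ms
  return duration / 58;                         // microseconds to centimetres
}

void setup() {
  WiFi.begin("YOUR_SSID", "YOUR_PASSWORD");     // placeholders
  while (WiFi.status() != WL_CONNECTED) delay(200);
  mqtt.setServer("broker.example.com", 1883);   // placeholder broker
  for (int i = 0; i < 2; i++) {
    pinMode(trigPins[i], OUTPUT);
    pinMode(echoPins[i], INPUT);
  }
}

void loop() {
  if (!mqtt.connected()) mqtt.connect("hybrid-plant");
  char payload[48];
  snprintf(payload, sizeof(payload), "%ld,%ld,%d",
           readDistanceCm(trigPins[0], echoPins[0]),
           readDistanceCm(trigPins[1], echoPins[1]),
           analogRead(ldrPin));
  mqtt.publish("plant/touch", payload);         // the AR client subscribes to this topic
  mqtt.loop();
  delay(100);
}
```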

Experience Video:

How It Works:

Project images:


Development images:


Developing Process


Arduino Sensors

 


Unity Scene

The Unity project file and Xcode file for iOS:

https://www.dropbox.com/sh/fi55jlgzia1oq0c/AADMRek_Kgsj_G5MfaCGC65ia?dl=0

Arduino Code:

https://github.com/jieguann/CreationAndComputation/blob/master/Experiment3/Arduino/Plant/mqtt_esp32/mqtt_esp32.ino

Circuit diagram:


Context:

Billinghurst notes that AR display and tracking techniques for interacting with AR space are limited, often reduced to passively viewing or browsing virtual information. Although some systems allow manipulating, adding, and deleting virtual objects in the AR scene, they still offer little engagement for users (Billinghurst). To expand interaction in AR, Billinghurst and colleagues present Tangible Augmented Reality as a new approach to designing AR interfaces. The concept of a tangible AR interface requires a virtual object to be anchored to a physical one, with both equally important for interaction (Billinghurst). My project is an example of how I think about the tangible AR interface within Billinghurst’s framework: by touching different points on the plant, the virtual particles perform various movements in AR space, which is how I bring physical interaction into the AR interface.

Lok observed that most virtual environments (VEs) include only virtual objects, and presented a hybrid environment (HE) system for bringing physical objects into VR (Lok). Physically manipulating digital objects is still an issue in immersive virtual environments (Krause), and the same is true in augmented reality space. Krause notes that hybrid objects in a virtual environment can help users align the physical and virtual parts seamlessly. This project expands the concept of the hybrid environment (HE) to the hybrid object (HO), meaning an object that has both virtual and physical bodies in a mixed reality space. The virtual particles around the plant, visualized by the device, respond to physical touches on the plant. Having physical actions appear on the virtual part of the object can be read as a unity of the virtual and the physical.

The connection between the virtual and the physical was one of the big challenges of this project, and an Internet of Things framework helped me realize the wireless connection. The IoT-enabled plant performs simple context awareness and sends its data to the visualizing device. Unlike a traditional server-to-client framework, this project tests the MQTT framework to enable agent-to-agent communication (The Standard for IoT Messaging).

Works Cited

Billinghurst, Mark, Hirokazu Kato, and Ivan Poupyrev. “Tangible augmented reality.” ACM SIGGRAPH ASIA 7.2 (2008): 1-10.

Lok, Benjamin, et al. “Experiences in Extemporaneous Incorporation of Real Objects in Immersive Virtual Environments.” Proc. IEEE Virtual Reality 2004. 2004.

Krause, Frank-lothar, et al. “Usability of hybrid, physical and virtual objects for basic manipulation tasks in virtual environments.” 2007 IEEE Symposium on 3D User Interfaces. IEEE, 2007.

“The Standard for IoT Messaging.” MQTT, mqtt.org/.

 

Bumping Beatzzz by Abhishek Nishu and Clinton Akomea-Agyin

GROUP MEMBERS–

Abhishek Nishu and Clinton Akomea-Agyin

PROJECT DESCRIPTION–

Bumping Beatzzz is an installation inspired by Adafruit’s Capacitive Touch Sensor HAT. The installation replicates this idea using the touch sensor and an Arduino. Instead of using fruits, we chose to explore different conductive materials. One of our builds explores different types of coins, such as copper, steel, and regular metal; we also tried soldering pins to the coins and learned that soldering needs a rough surface to grip. The other build explores aluminium foil as a conductive material.

This project makes use of all 12 pins on the touch sensor, with each pin owning a unique sound. Pins 0-4 are drum beats, pins 5-8 are vocals, and pins 9-11 are funky and jazz beats. When someone taps any of the conductive materials, whether the aluminium foil or one of the coins, an accompanying visual is displayed on the screen. Which coin or foil shape is pressed determines the visual experience one gets.
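
As a small illustration of that grouping, a helper like the following could map a touched electrode index to its sample family; the function name is hypothetical and not taken from the project code.

```cpp
// Illustrative helper: map a touched MPR121 electrode (0-11) to its sample group,
// following the grouping described above.
const char* padGroup(int pad) {
  if (pad <= 4) return "drum";    // pins 0-4: drum beats
  if (pad <= 8) return "vocal";   // pins 5-8: vocals
  return "funk";                  // pins 9-11: funky and jazz beats
}
```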

EXPERIENCE VIDEO

https://ocadu.techsmithrelay.com/5VMu

FINAL PROJECT IMAGES–

1- Clint

2- Nishu

DEVELOPMENT IMAGES–

HOW IT WORKS VIDEO LINK –

https://ocadu.techsmithrelay.com/YSzu

1-Rough development of idea through sketches


2- Testing & building circuit


3- Began Structure development


4-First glimpse of structure coming together


LINK TO THE ARDUINO CODE HOSTED ON GITHUB–

https://github.com/AbhishekNishu16/Bumbing-Beatzz.git

CIRCUIT DIAGRAM CREATED ON TINKERCAD USING ARDUINO UNO–


PROJECT CONTEXT–

For this project, we decided to explore the relationship between sound and visuals. We began by taking inspiration from the Clavilux by Jonas Friedemann Heuer. The Clavilux is an interactive instrument for generative music visualization, made of a digital piano with 88 keys and MIDI output, a computer running a vvvv patch, and a vertical projection above the keyboard [2]. Each note played is represented by a new visual element in the form of a colour stripe, which also captures the velocity and length of the note. This got us thinking: what is the relationship between sound and visualization, and how does the combination of the two enhance our experience?

Research shows that soundtracks can influence the emotional impact, interpretation, and remembering of visual information. A study by Marilyn G. Boltz, Brittany Ebendorf, and Benjamin Field explores the reverse: how visual information influences the perception and memory of music [1]. Listeners were presented with affectively ambiguous tunes paired with visual displays of different formats and styles, after which each participant provided a set of perceptual ratings evaluating different melody characteristics and qualities. Results showed that both the affect and the format of the visual information influenced the way a melody was perceived, and in some cases the visual displays distorted melody recognition. The next research we looked at concerned the emotional impact of audio-visual presentation.

Additional research shows that, compared to audio-only presentation, congruent audio-visual presentation can lead to a more intense emotional response [3]. Listeners were presented with an audio-only condition, an audio-visual congruent condition, and an audio-visual incongruent condition, and were then asked to judge the intensity of the emotional experience elicited by the music [3]. Each participant’s emotional responses were measured using self-ratings and physiological measures, including heart rate, skin temperature, EMG root mean square, and prefrontal EEG [3]. Relative to the audio-only presentation, the congruent audio-visual presentation led to a more intense emotional response [3].

By exploring the effects of soundtracks on emotional impact and on remembering visual information, we can see music’s ability to transcend audio alone and extend into the visual.

CITATIONS

[1] Audiovisual Interactions: the Impact of Visual Information on Music Perception and Memory, Marilyn G. Boltz, Brittany Ebendorf and Benjamin Field, Sep 2009 https://www.jstor.org/stable/10.1525/mp.2009.27.1.43?seq=1

[2] Jonas Friedemann Heuer, Calvilux, 2009 http://www.jonasheuer.de/?clavilux

[3] The audio-visual integration effect on music emotion: Behavioral and physiological evidence, Fada Pan, Li Zhang, Yuhong Ou, Xinni Zhang, May 2019 https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0217040

Inconvenient Interface By Mairead Stewart

Project Description

As the name suggests, Inconvenient Interface is a tangible user interface that actively discourages a viewer from performing any sort of interaction. The surface of the triangular interface is covered in hundreds of nails, making it painful and difficult to touch. Once the prism is moved or picked up, red LEDs on the outside of the prism light up, and discordant music begins to play out of a connected computer’s speakers. The monitor of the computer, which is blank while the interface is at rest, begins to show an eerie red shape that morphs and moves in relation to the position of the prism. As soon as the prism is set down, the computer becomes blank and silent, and the LEDs turn off once again.
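
The write-up does not name the sensor that detects movement, so the following is only a hypothetical sketch of the behaviour: a generic analog reading is compared against a resting baseline, the red LEDs switch on when the prism moves, and the value is streamed to Processing, which handles the red visuals and discordant sound.

```cpp
// Hypothetical sketch only: assumes a single analog tilt/position reading on A0
// and the red LEDs driven from pin 7. The actual circuit may differ.
const int sensorPin = A0;
const int ledPin = 7;
int baseline;

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
  baseline = analogRead(sensorPin);            // captured while the prism is at rest
}

void loop() {
  int reading = analogRead(sensorPin);
  bool moved = abs(reading - baseline) > 20;   // threshold is arbitrary here
  digitalWrite(ledPin, moved ? HIGH : LOW);    // LEDs on only while the prism is disturbed
  Serial.println(moved ? reading : 0);         // 0 = at rest, so the screen stays blank
  delay(50);
}
```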

The Inconvenient Interface is designed to be the least user-friendly tangible user interface possible. The sharp exterior, flashing red lights, discordant noises and off-putting visuals encourage a viewer to stay away. In contrast, most devices we use daily are precisely crafted for maximum user-friendliness. These devices, especially wearable devices, are designed to fit seamlessly into a user’s life both physically and metaphorically. This can have many benefits, but it can also open users up to privacy and security concerns. By creating the most hostile device possible, I am critiquing the ways that we accept the convenience but disregard the harmfulness of the devices we use every day.

Experience Video

How It Works Video

Final Project Images

Development Images

Arduino and Processing Code

https://github.com/mc-stewart/Experiment3

Circuit Diagram

Project Context

The Inconvenient Interface is designed to be the least user-friendly tangible user interface possible, unlike most if not all wearable devices and WiFi-enabled devices today. As the internet of things (IOT) grows to include increasing numbers of our daily products and services, comfort and ease of use are top of mind for technology designers. When consumers can often choose between a cheap traditional item and a costly smart device, or when they are happy to go about their routine without wearable technology, manufacturers and advertisers need to get creative convincing consumers that these products are useful or even necessary (Salmon & Ridgers, 2017). According to a report by Ledger and McCaffrey (2014), half of Americans who have owned an activity tracker never use it, while one third stop using their tracker within the first six months.

Consumer retention certainly seems to be a concern for wearable technology companies, however there are many successful applications of smart technology. In the field of healthcare, smart devices can monitor biometric data, regulate body functions such as insulin levels, or even warn a patient of a hazard in their environment in an assisted living situation (Lukowicz, Kirstein & Tröster, 2004). Smart devices can encourage healthy fitness routines, track sleep patterns, and provide safety through personal security and monitoring systems. Even the fashion industry is on board; advancements in the technology of smart clothes means that clothing can now do everything from play music to capture biometric data to regulate body temperature (Hurford, 2009).

While the positive applications of smart devices are certainly significant, so too are some of the concerns. The more devices become internet-enabled, the more privacy concerns begin to arise. In a healthcare setting, for instance, the legality around collecting biometric data becomes a problem. As wearable fitness trackers become more popular, regulators struggle to keep up with companies’ treatment of users’ health data (Newman & Kreick, 2015). Additionally, the existence of that much personal data opens consumers up to possible data breaches. Smart devices generate massive amounts of biometric data, location data and other private information, which can all be susceptible to an attack. Regulations and security protocols have not caught up to this new reality, meaning we may not even know how safe this data really is (Thierer, 2015). Despite the fact that IOT is in its proverbial infancy, many companies are already working on ways to incorporate advertising into the everyday internet-enabled devices around a consumer’s home (Aksu et al., 2018). Thus, not only could smart devices be a possible security and privacy threat, they may signal a future of ubiquitous advertising content within the home.

To call attention to both the advantages and concerns of smart devices, the hostility of the Inconvenient Interface’s prism asks viewers to think more deeply about the problems with other technologies they encounter. Hopefully, the antagonistic design will prompt a user to pause and evaluate their relationship not only with this device but with the smart devices they see in the world around them.

References

Aksu, H., Babun, L., Conti, M., Tolomei, G., Uluagac, A., S. (2018). Advertising in the IoT Era: Vision and Challenges. IEEE Communications Magazine, 56(1), 1–7.

Hurford, R. D. (2009). Types of smart clothes and wearable technology. In J. McCann & D. Bryson (Eds.), Smart clothes and wearable technology (pp. 25-44). Woodhead Publishing Limited.

Ledger, D., & McCaffrey, D. (2014). Inside wearables. How the science of human behavior change offers the secret to long-term engagement [report]. Endeavor Partner’s Archive. https://archives.yegii.com/asset/inside-wearableshow-science-human-behavior-change-offers-secret-long-term-engagement-1513

Lukowicz, P., Kirstein, T., & Tröster, G. (2004). Wearable Systems for Health Care Applications. Methods of Information in Medicine, 43(3), 232–238.

Newman, T., & Krieck, J. (2015). The Impact of HIPAA (and Other Federal Law) on Wearable Technology. Science and Technology Law Review, 18(4), 429–454.

Salmon, J., & Ridgers, N. D. (2017). Is wearable technology an activity motivator, or a fad that wears thin? Medical Journal of Australia, 206(3), 119–120.

Thierer, A., D. (2015). The Internet of Things and Wearable Technology: Addressing Privacy and Security Concerns without Derailing Innovation. Richmond Journal of Law and Technology, 21(2), 1–118.

 

Footwork by Candide Uyanze

Project Description

Footwork is a musical dance game where players are invited to press on footpads based on the pattern on the screen. Each dance pad is equipped with strips of copper tape connected to a touch sensor, coloured arrows, and foam sandwiched between two pieces of corrugated plastic to add some cushioning. When the user touches the correct pad in sync with the moving arrow, they are awarded points. Footwork is set to the song “1er Gaou” by Magic System.

Experience Video

Context

As I was pondering on Experiment 3’s theme, Tangible Interfaces, I was reminded of one of the assigned readings for the Critical Theory seminar: “Engenderings: Gender, Politics, Individuation” by Erin Manning. In it, the author discusses the politics of touch while weaving in anecdotes about tango dancing (Manning 2006).

Dancing, like music, plays an important role in many cultures; it’s a medium that can communicate emotions, histories, and traditions in ways language cannot. I was reminded of this while brainstorming for this experiment, when my little sister, who is in middle school, needed some inspiration for a school project on dances from a chosen time period. I suggested she present a Congolese dance, as a way of sharing our culture with her peers.

I started to think about how the pandemic has forced many places to close and cancelled social gatherings where people could dance together in a large group. We, instead, have to stay home and dance alone, with our household (partners, roommates, family members), or with others over Zoom.

Although I wouldn’t call myself a dancer, I really love dancing games. The great thing about them is that you don’t even need to know any moves ahead of time, overthink what you’re doing, or worry about how you look doing it. All you have to do is follow the directions. For those who are able-bodied, this presents a low barrier to entry.

As a kid, I was very fond of this Flash game hosted on Télétoon’s website called Hip Hop Don’t Stop, despite never making it past Level 3. The player had to press the correct keyboard arrows, which also generated dance moves from the three “dancing divas”. Later, my sister and I discovered Just Dance on the Wii at a family friend’s social event. We fell in love with the game series and purchased our first console and Just Dance games sometime later. In the following years, Just Dance created a version for the  Xbox Kinect, where holding a controller is no longer necessary. Last year, I got to try BeatSaber at my previous university’s Tinkering lab, the latest iteration of these dance games.

That said, one of the most iconic examples of a dance video game, in my opinion, is Dance Dance Revolution, DDR for short. DDR is a particular genre of game called Bemani, a type of rhythm/music video game produced by Konami, a Japanese conglomerate (Behrenshausen 2007, 237). Other Bemani games include GuitarFreaks and DrumMania, a precursor of Guitar Hero (Wilson 2010). Bemani games, which turn the body into a performance, can typically be found in public arcades and make use of metal foot pads as dance floors (Behrenshausen 2007, 237). DDR was a near-instant hit when it debuted in Japan in 1998, and saw similar success in the United States and around the world (Behrenshausen 2007, 237).

Dance Dance Revolution‘s worldwide phenomenon has spurred versions for home video game consoles, replicas on mobile such as Tap Tap Revenge, a modding community, and even an open-source alternative called StepMania.

I’ve always been intrigued by Dance Dance Revolution, but never actually got to play it in person (at an arcade or on a home console). Now, as more and more arcades are forced to close because of COVID-19 (Thamer 2020), my chances of doing so are slim to none. Although, as mentioned earlier, we can now find VR and computer vision versions of these types of games, I wanted to recreate the classic DDR game right here in my living room, and give it my own flair.

In all of my years playing dance games, I don’t remember any of them incorporating any African musical styles like the ones I grew up hearing. I decided to set my version of a dance game to the tune “1er Gaou” by Magic System, released in 2002. Magic System is a Zouglou musical group from Abidjan, Côte d’Ivoire, and this song made the Top 10 on music charts in Belgium and France, while also charting in Switzerland. This song holds a special place in my heart because although I don’t entirely understand the French-Ivorian patois lyrics, the song was played at almost every Congolese wedding reception/party/social gathering I attended as a kid, and it seems like it’s the case for others within the African diaspora.

How it Works

 

Thanks to Dance Dance Revolution’s popularity, I was able to find tons of tutorials on how to build my own dance pads at home. However, a lot of these tutorials required some form of construction, and I didn’t have the capacities, time, tools, or expertise to undertake this safely. After much searching, I was able to find two “cheap and easy” methods to create the pads:

  • A tutorial by PBS Kids that used folded cardboard as a spring and pad to activate a light (PBS Kids Design Squad 2007), and
  • A tutorial on Adafruit’s website that used packing foam as a spring and corrugated plastic as a pad (Beaudet 2018)

I tried the latter method first, since I had purchased corrugated plastic and it seemed sturdier than cardboard. I used dish foam as the spring, and folded thin strips of it to add a bit of space between the top and bottom pieces of plastic. In both tutorials, aluminum foil is added at the top and bottom, and the foot press “completes” the circuit, allowing the press to be registered. From what I understood, this is how traditional dance pads work. However, I quickly realized that this wouldn’t work for a touch sensor (because the skin has to touch the conductive part… duh!).

My next dilemma was deciding which conductive material to use, since this material would have to be visible in order for me to touch it with my foot. I wasn’t a fan of how the aluminum looked and how prone it was to wrinkling and tearing. I opted instead for long strips of copper tape along the pad, which gave a nicer look. I repurposed the black construction paper I used for Experiment 2 to cover the corrugated plastic. I also cut up some paint chips that I took while I was at Home Depot to make the arrows.

Once the pads were done, I moved on to the code. On the Arduino side, I reused the code provided in class to send comma-separated values, but limited the output to 4 values.
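
For context, that four-value serial output could look something like the sketch below. This assumes the four copper-tape pads sit on MPR121 electrodes 0-3 and is a stand-in for the class-provided code rather than a copy of it.

```cpp
// Illustrative sketch: four dance pads on MPR121 electrodes 0-3, sent each loop
// as four comma-separated 0/1 values for the Processing game to parse.
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap = Adafruit_MPR121();

void setup() {
  Serial.begin(9600);
  cap.begin(0x5A);
}

void loop() {
  uint16_t touched = cap.touched();
  for (int pad = 0; pad < 4; pad++) {
    Serial.print((touched >> pad) & 1);
    if (pad < 3) Serial.print(",");
  }
  Serial.println();
  delay(30);
}
```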

The original gameplay for “DDR in Processing” by Jon Castro

On the Processing end of things, I was lucky enough to find an existing Dance Dance Revolution game by Jon Castro created in Processing that used the arrow keys on a keyboard (Castro 2017). Since I wanted the visuals to be displayed on my TV, I began by changing things like the size of the game so that it would be 1920 x 1080 pixels (the original game was 600 x 800). Because the sizing of the other elements wasn’t relative to the screen size, I had to spend a few hours adjusting them. I then redesigned certain aspects of the user interface (colour scheme, arrows, etc.) to match the footpads I built, and swapped the music for the song I selected. I also wanted to spice up the visuals by adding a music visualizer, so I incorporated parts of a music visualizer by Andre Le (Le 2014).

My redesigned interface

The customization was smooth sailing until I ran into a major hiccup: converting the inputs from keypresses to the comma-separated data from the sensor. This was important because the original code for this game used the key presses to print success/failure messages and track the score. I was able to do part of this conversion thanks to instructions and code by tobyonline, whose web page explains how to use the Java Robot class to simulate keypress events from the Arduino serial port (tobyonline 2018). Although I was able to emulate the arrow key presses using the Arduino serial data (which was clear from my computer’s interface and the Processing console), it didn’t translate as well in the Processing sketch. Some of my foot touches seemed to be detected in the game, while most were not.

All in all, though I was disappointed that this last part of the project (the visual feedback) didn’t work as well as I had imagined, I still had a lot of fun creating this. Moving forward, I will likely stick to the version that uses the keyboard arrow keys. 😅 If I were to do this again, I would probably try sewing a fabric mat, which would be easier to store and would allow for the wires to be tucked away nicely. I also think a touch sensor may have been the wrong type of input for this project, as the sensor is very sensitive to touch and would sometimes register nearby movement as a touch. Perhaps a sensor that registers pressure would have been best, or perhaps another kind of (and simpler) visual output could have been considered.

Code

Circuit Diagram


Works Cited

Beaudet, Paul. 2018. “DIY Wireless DDR Dance Pad with Bluefruit EZ-Key.” Adafruit Learning System. August 22, 2018. https://learn.adafruit.com/diy-wireless-ddr-dance-pad-bluefruit-ez-key/overview.
Behrenshausen, Bryan G. 2007. “Toward a (Kin)Aesthetic of Video Gaming: The Case of Dance Dance Revolution.” Games and Culture 2 (4): 335–54. https://doi.org/10.1177/1555412007310810.
Castro, Jon. 2017. DDR In Processing by Jon Castro. Processing. https://jon-castro.itch.io/ddr-in-processing.
Manning, Erin. 2006. “Engenderings: Gender, Politics, Individuation.” In Politics of Touch: Sense, Movement, Sovereignty, 84–109. Minneapolis, UNITED STATES: University of Minnesota Press. http://ebookcentral.proquest.com/lib/oculocad-ebooks/detail.action?docID=322593.
PBS Kids Design Squad. 2007. “Build | Dance Pad Mania.” PBS Kids. 2007. https://pbskids.org/designsquad/build/dance-pad-mania/.
Thamer, Sarah. 2020. “Future of Arcade Games May Be in Jeopardy Because of Pandemic.” WDSU New Orleans, May 21, 2020, sec. News. https://www.wdsu.com/article/future-of-arcade-games-may-be-in-jeopardy-because-of-pandemic/32627525.
tobyonline. 2018. “Arduino Controllers That Send Keypress Events.” Robot Resource (blog). January 22, 2018. http://robot-resource.blogspot.com/2018/01/arduino-controllers-that-send-keypress.html.
Wilson, Jeffrey L. 2010. “The 10 Most Influential Video Games of All Time – 6. GuitarFreaks (1999).” PC Magazine. June 11, 2010. https://web.archive.org/web/20120513032212/http://www.pcmag.com:80/slideshow_viewer/0,3253,l=251652&a=251651&po=5,00.asp?p=n.
Wu, Helen. n.d. “Dance Dance Revolution.” Tiffany Lai (blog). Accessed November 3, 2020. http://tifflai.com/dance-dance-revolution/.

Bibliography

“A Pair of Homemade DDR Dance Pads.” 2005. Computer Tips. September 24, 2005. http://computertips.toups.info/dance_pad/index.html.
Cheok, Adrian David, Xubo Yang, Zhou Zhi Ying, Mark Billinghurst, and Hirokazu Kato. 2002. “Touch-Space: Mixed Reality Game Space Based on Ubiquitous, Tangible, and Social Computing.” Personal and Ubiquitous Computing 6 (5): 430–42. https://doi.org/10.1007/s007790200047.
Clark, Andy. 2020. “DDR DIY: How to Build Your Own Dance Game with a Raspberry Pi | Popular Science.” Popular Science. January 9, 2020. https://www.popsci.com/story/diy/build-arcade-dance-game/.
Daphne00Z. n.d. “D.D.Tap – Interactive Game Platform Using Processing, Arduino and Twitter.” Instructables. Accessed November 3, 2020. https://www.instructables.com/DDTap-Interactive-Game-Platform-using-Processi/.
Huang, Melanie. 2019. “Building a DIY Dance Dance Revolution.” Medium. April 27, 2019. https://medium.com/@melhuang_/building-a-diy-dance-dance-revolution-e136265bbbfc.
———. (2017) 2020. Melaniehuang/Dancedancerevolution. Arduino. https://github.com/melaniehuang/dancedancerevolution.
Le, Andre. 2014. Andrele/Starburst-Music-Viz. Processing. https://github.com/andrele/Starburst-Music-Viz.
Lordi, Katherine. 2019. “‘Dance With Your Hands’ With Tap Tap Revolution!” Medium. March 31, 2019. https://medium.com/@ktl008/dance-with-your-hands-with-tap-tap-revolution-31c0f02f35d7.
McCloskey, Brendan. 2018. “2018 Summer Build/Arduino Project – Pump It Up Dance Pad – Brendan McCloskey.” Brendan McCloskey (blog). July 20, 2018. https://www.brendanmccloskey.com/2018/07/20/2018-build-arduino-project-pump-it-up-dance-pad/.
Promit. 2018. “DanceForce V3 DIY Dance Pad for DDR.” Promit’s Ventspace (blog). April 9, 2018. https://ventspace.wordpress.com/2018/04/09/danceforce-v3-diy-dance-pad-controller/.
riverbanks1. 2018. “R/DanceDanceRevolution – My Self Made DDR Pad, Gallery and How-to in Comments.” Reddit. https://www.reddit.com/r/DanceDanceRevolution/comments/85guhf/my_self_made_ddr_pad_gallery_and_howto_in_comments/.
snydeemm019. n.d. “DIY Dance Dance Revolution Using Makey Makey.” Instructables. Accessed November 3, 2020. https://www.instructables.com/DIY-Dance-Dance-Revolution-Using-Makey-Makey/.
Super Make Something. 2016. USB DDR Dance Pad (Arduino, Pull-Up Resistors) – Super Make Something Episode 9. https://www.youtube.com/watch?v=nXjj9IXUaA4.
TahRobin. n.d. “DIY Arduino Capacitive DDR Pad.” Instructables. Accessed November 3, 2020. https://www.instructables.com/DIY-Arduino-Capacitive-DDR-Pad/.
Tilley, Thomas. 2005. “Wooden DDR Mat – Thomas Tilley.” Thomas Tilley. 2005. https://tomtilley.net/projects/ddr-pad/.
Xu, Mingxi, and Zachariah Kobrinsky. 2019. “Week 7: Midterm! Halloween Dance Revolution!” Zachariah Kobrinsky. October 30, 2019. http://zachariahkobrinsky.com/intro-to-physical-computing/2019/10/29/week-7-midterm-halloween-dance-revolution.

 

TOUCH MUSIC by Patricia Mwenda

PROJECT DESCRIPTION

Touch Music was initially supposed to be a karaoke mic that uses a touch sensor to produce sound, but after I started the project it became what it is now: a wine bottle that lights up the screen with a visualizer that moves to the beat of the music. The songs I used for the project are Jahera by Lisa Oduor Noah and Niambie by Xenia Manasseh, both of whom are personal friends of mine and well-known Kenyan artists.

In these strange times, artists have been finding creative avenues to showcase their work online, for example through live virtual performances, and that is how the idea of Touch Music was born. It brings music to life in ways that haven’t been thought of before: we usually just plug and play, be it through a speaker or headphones, but with Touch Music one can interact with music that is otherwise intangible.

As is the new norm, I used objects I could easily find around the house: wine bottles, some beaded jewellery to reflect my Kenyan culture, ribbon, and spiral conductive wire. I chose a wine bottle because it has the feel of a mic, so as someone interacts with it they can hold it up like a mic and have their own home karaoke session.

EXPERIENCE VIDEO

HOW IT WORKS VIDEO AND IMAGES

Touch Music uses an Arduino placed on a solderless breadboard as the micro-controller, attached to an MPR121 Capacitive Touch Sensor that connects to the spiral conductive wire on the wine bottle. When the wire on the wine bottle is touched, it plays the music and opens the visualizer on screen using the Processing software.

liildypxro6rulozpuhcw_thumb_a900

When released, the music and visuals go off (Model: Lisa Oduor Noah)

unadjustednonraw_thumb_a901

When touched, the music and visuals go on (Model: Lisa Oduor Noah)
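In rough terms, the Processing side behaves like the minimal sketch below. The song filename, the baud rate, and the convention that the Arduino prints 1 while the wire is touched (and 0 otherwise) are assumptions made for illustration; the real sketch, with the full visualizer, is in the GitHub repository linked below.

import processing.serial.*;
import ddf.minim.*;

Minim minim;
AudioPlayer song;
Serial port;
boolean touched = false;

void setup() {
  size(800, 800);
  minim = new Minim(this);
  song = minim.loadFile("jahera.mp3");              // placeholder filename
  port = new Serial(this, Serial.list()[0], 9600);  // port index and baud are assumptions
  port.bufferUntil('\n');
}

void draw() {
  background(0);
  if (touched && !song.isPlaying()) {
    song.play();                                    // touching the wire starts the music and visuals
  } else if (!touched && song.isPlaying()) {
    song.pause();                                   // releasing the wire stops everything
  }
}

// one line per reading from the Arduino: "1" while touched, "0" otherwise (assumed)
void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line != null) touched = trim(line).equals("1");
}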

FINAL PROJECT IMAGES

clkwjsxlqegndzywaaclka_thumb_a8f8tc8rnmxtakjgijiu8e4pw_thumb_a8f4

s9kbiissqoqabhvausilpq_thumb_a8f6gagqkipros8smy8ps0mg_thumb_a8f2

img_8257img_8256-3

l02

l01

x01

x02

DEVELOPMENT IMAGES

img_8203 img_8205 img_8214

img_8206 img_8215 img_8216

img_8219 img_8221 img_8222

img_8224 img_8225 img_8229

img_8231 img_8240 img_8242

LINK TO PROCESSING CODE HOSTED ON GITHUB

https://github.com/kananamwenda/Touch_Music

CIRCUIT DIAGRAM ON FRITZING

screenshot-2020-11-06-at-01-56-22

PROJECT CONTEXT

Music is one thing that affects our moods: slow music can make one feel sad, upbeat music can make one feel happy, and everyone has their own relationship with music. Touch Music is a way to recreate how we think about and view music; one controls it with just a touch, but not in the way we are used to through our screens or TV remotes, rather through a bottle, something we most likely use every other day if not every day. I chose to use Kenyan music because it is a reflection of my culture, bringing to light just a small part of me to the rest of the world; oftentimes I’ll be asked questions about where I’m from, and I believe I always have a story to tell.

The song by Lisa, who is in my experience video, is a love song that she sang in English and Luo, her mother tongue, and Xenia’s song is also a love song, sung in English and Swahili, Kenya’s national language. I was inspired by the visuals of the artists who perform at the Ultra Music Festival; the visualizers seen up close are really something else. It was interesting to get out of my comfort zone and design my visualizer using code as opposed to a design software.

Music visualization is the process of interpreting sound with images. It maps the qualities of a recording or composition to graphics by interpreting digital or electronic signals, and the method used to translate aspects of music into visual qualities determines the way the visualization looks and responds. There are thousands of different music visualizers, each with a different interpretation of what sound looks like, and thinking about how they work can tell us a lot about the way we interpret music. I designed my project in such a way that it makes people not only see the music but also feel it, by having control over when the music is playing and when it isn’t.
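As a simple illustration of that mapping (and not the visualizer I built, which lives in the repository linked above), a Processing sketch using the Minim library’s FFT can turn each frequency band of the playing song into a line whose height follows that band’s energy. The filename and the scaling factor below are placeholders.

import ddf.minim.*;
import ddf.minim.analysis.*;

Minim minim;
AudioPlayer song;
FFT fft;

void setup() {
  size(640, 360);
  minim = new Minim(this);
  song = minim.loadFile("niambie.mp3");                 // placeholder filename
  song.loop();
  fft = new FFT(song.bufferSize(), song.sampleRate());  // analyser sized to the audio buffer
}

void draw() {
  background(0);
  fft.forward(song.mix);                                // analyse the audio currently playing
  stroke(255);
  for (int i = 0; i < fft.specSize(); i++) {
    // each frequency band becomes one vertical line; a louder band draws a taller line
    float x = map(i, 0, fft.specSize(), 0, width);
    line(x, height, x, height - fft.getBand(i) * 4);
  }
}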

REFLECTIONS AND OBSERVATIONS

This experiment was especially interesting, but it had its challenges as well. I had initially hoped to add LEDs that light up to the beat of the music but couldn’t quite get that working; if I spend more time exploring it further, I may eventually get it up and running, something I’m looking to work towards.

I also tried to integrate my Industrial Design background by designing the touch sensors and playing around with objects from around the house. I’d really like to explore this idea further and build more interactivity into it. Overall, the work had its challenges and its good moments, and it was quite the learning experience as well.

I got my friends and family to interact with my project; some I instructed on what to do and others I did not. Some instinctively touched both bottles at once, which was something that got me thinking, because it played both songs at the same time. I’m still trying to explore more ways to add interesting interactivity to this project beyond what it is now.

BIBLIOGRAPHY

Hahn, M. (2019, December 5). Landr. Retrieved from https://blog.landr.com/music-visualization/

Manasseh, X. (2019, December 19). Niambie. Retrieved from http://www.youtube.com/watch?v=xTh0p8gSHgs

Noah, L.O. (2020, November 05). Jahera. Retrieved from http://www.youtube.com/watch?v=xTh0p8gSHgs

Shiffman, D. (2019). Lorenz Attractor in Processing (Java). The Coding Train. Retrieved from https://editor.p5js.org/codingtrain/sketches/ULA97pJXR