Author Archive

Sound and Heart Beats – Interactive Mat

music beats & heart beats

Music Beats & Heart beats by Alicia Blakey

 

Music and Heart Beats is an interactive installation that lets users wirelessly send sounds to, and interact with, a digital record player. Through the installation you can send someone either a music beat or a heartbeat. Listening to certain music, or to the sound of a loved one's heartbeat, has been shown to improve mood and reduce anxiety.

When a user opens the application connected to the interactive record player, they can see when others are playing songs. The digital record player starts spinning when a user interacts with the corresponding app, and LED lights at the pin of the record player indicate that music is playing. The same interaction can also be triggered through the touch sensors.

This art installation is about taking a moment to engage your senses of hearing and touch: a few minutes out of your day to have fun, feel good, and listen to sounds that are good for your body and mind.

 

img_1913

 

 

Ideation

Initially, I had a few variations of this idea that encompassed the visuals of music vibrations and heartbeat blips. After the first iteration, the art and practice of putting on a record engaged me with the act of listening more. The visual aspect of watching a record play is captivating in itself: I always notice that after someone puts on a record they stay and watch it spin. There is something mesmerizing about the intrinsic components of this motion. I wanted to create an interaction that was more responsive with colour, light, and sound, and, expanding on the cyclical nature of the turntable as a visual, the intent was to create an environment.

 

rough-draft-proj-4

heart-beat-to-song-beat

img_2780

 

Development

While choosing materials I decided to use a force-sensitive resistor (FSR) with a round, 0.5″-diameter sensing area. The FSR varies its resistance depending on how much pressure is applied to the sensing area: the harder the force, the lower the resistance. With no pressure applied, its resistance is larger than 1 MΩ, and it can sense applied force anywhere in the range of 100 g to 10 kg. I also used a WS2812B NeoPixel strip enveloped in plastic tubing. The LED strip required 5 V power while the Feather controller required 3 V, so to make running power along the board easier I used an AC-to-DC converter that supplied 3 V and 5 V along the two rails of the breadboard.
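An FSR like this is normally read through a voltage divider (the wiring Adafruit's FSR guide describes): the sensor runs from the supply to the analog pin, with a pull-down resistor from the pin to ground. A small sketch of that conversion in plain JavaScript, assuming a 10 kΩ pull-down and a 10-bit ADC reading (0–1023) — an illustration of the math, not the project's actual code:

```javascript
// FSR voltage-divider math (assumption: FSR from supply to analog pin,
// 10 kΩ pull-down from pin to ground, 10-bit ADC as on Arduino-style boards).
const R_DIV = 10000; // pull-down resistor in ohms

// Convert a raw ADC reading to the FSR's approximate resistance in ohms.
function fsrResistance(adcReading) {
  if (adcReading <= 0) return Infinity; // no pressure: effectively open circuit
  return R_DIV * (1023 - adcReading) / adcReading;
}
```

A mid-scale reading (~512) corresponds to an FSR resistance near the pull-down value, which is why the pull-down choice sets where the sensor is most sensitive.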

 

 

img_2784

Coding 

When initializing the video, testing showed it was more optimal for the video sequence to sit over the controller, achieved by changing the z-index styling. My next step was to apply a mask style over the whole desktop page to prevent clicks from altering the p5 sketch. I styled controller.js to be in the same location on both desktop and mobile so the two could share PubNub x/y click locations, and the media.js file connected with controller.js for play and stop commands. One of the initial issues was a long loading time for the mobile client; the solution was to set a variable with inline JavaScript that stops the mobile client from running the onload audio function. The mobile and desktop sites worked on Android but not on iPhone: PubNub would initiate on Android phones, but in the end I could not debug the iPhone issue. If the desktop HTML page was still loading its media.js while a mobile client was trying to communicate with it, the result was unexpected behaviour. A possible solution would be a callback function on the desktop that tells the mobile client it has loaded.
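That proposed callback could be sketched as a small handshake on the mobile side — names here are hypothetical, and in the real project these messages would travel over PubNub channels — where the mobile client queues commands until the desktop announces that its media has loaded:

```javascript
// Minimal sketch of a "desktop ready" handshake (hypothetical helper).
// The mobile client buffers commands until the desktop's loaded callback fires.
function makeMobileClient() {
  return {
    desktopReady: false,
    queue: [], // commands held back until the desktop is ready
    sent: [],  // commands actually delivered
    onDesktopReady() {               // called when the desktop's "loaded" message arrives
      this.desktopReady = true;
      this.sent.push(...this.queue); // flush everything that was waiting
      this.queue = [];
    },
    send(command) {
      if (this.desktopReady) this.sent.push(command);
      else this.queue.push(command);
    },
  };
}
```

This avoids the race described above: a play command sent while media.js is still loading simply waits instead of producing unexpected behaviour.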

 

 

 

screen-shot-2018-11-28-at-3-38-34-pm

 

Materials

  • Breadboard
  • Jumper cables
  • Flex, Force, & Load Sensor x 3
  • YwRobot Power Supply
  • Adafruit Feather ESP32
  • Wire
  • 4×8 Canvas Material
  • Optoma Projector
  • 6 × 10kΩ resistors
  • 3.2 ft plastic tubing

 

I decided to use a breadboard instead of a proto-board this time because the interactive touch-sensitive mat was large. For the prototype to remain mobile I needed to be able to disconnect the LEDs and power converter; it was easier to roll the mat up this way and quickly reconnect everything. Since I was running over 60 LEDs, I used a 9 V power supply running through the converter. I originally tested with 3.7kΩ resistors but found the sensors were not very responsive; after replacing them with 10kΩ resistors, the mat varied much more in sensitivity and was more accurate.
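The supply choice can be sanity-checked with a back-of-the-envelope power budget. The usual rule of thumb (not a measured figure for this mat) is that a WS2812B can draw up to ~60 mA at full white, with the strip itself running at 5 V:

```javascript
// Rough worst-case power budget for a WS2812B run (rule of thumb: ~60 mA
// per LED at full white; actual animations draw far less).
function ledPowerBudget(numLeds, maxPerLedMilliamps = 60) {
  const amps = (numLeds * maxPerLedMilliamps) / 1000; // total current at 5 V
  return { amps, watts: amps * 5.0 };
}
```

For 60 LEDs this comes to several amps at full white — well beyond what a controller's regulator can source, which is why a separate supply through the converter was needed.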

 

The outcome of my project was interesting: people were really engrossed in just watching the video projected onto the interactive mat. Being able to control the LEDs was a secondary interaction that users seemed to enjoy, but just watching the playback while listening to music seemed to induce a state of calm and happiness. The feedback and response to the installation was very positive; it was noted that the projection was hypnotic in nature, and the installation was designed to bring a state of calm and enjoyment. Although the LEDs were very responsive with the touch sensors, there was some flicker, which I think was due to the converter dwindling; I had purchased it used, and after this experience with the YwRobot converter I would buy new for future projects. Other comments suggested adding another interaction to the p5.js sketch so users could control the motion of the record in the video with the sensors. The overall reaction to this prototype was very promising. I'm extremely happy with the conclusion of this project: there was the definitive emotional reaction the project was designed for.

https://github.com/aliciablakey/SoundBeats2HeartBeats.git

 

screen-shot-2018-11-27-at-12-36-24-am

 

References

https://learn.adafruit.com/force-sensitive-resistor-fsr/using-an-fsr

http://osusume-energy.biz/20180227155758_arduino-force-sensor/

https://gist.github.com/mjvo/f1f0a3fdfc16a3f9bbda4bba35e6be5b

http://woraya.me/blog/fall-2016/pcomp/2016/10/19/my-sketch-serial-input-to-p5js-ide

 


Voice Kaleidoscope

 

screen-shot-2018-11-12-at-10-53-43-pm

 

 

screen-shot-2018-11-12-at-10-45-30-pm

 

Overview

Voice Kaleidoscope takes voice patterns from the computer's microphone and outputs them onto a circular LED matrix as colours and patterns. It was created as a tool for pattern thinkers on the autism spectrum who have trouble interpreting facial expressions.

img_2764

Concept

Voice Kaleidoscope was created as a tool to help communicate emotion through patterns and colours. Autism spectrum disorder (ASD) is a severe neurodevelopmental disorder characterized by significant impairments in social interaction and in verbal and non-verbal communication, and by repetitive/restricted behaviors; individuals with ASD also experience significant cognitive impairments in social and non-social information processing. Facial emotion perception is significantly affected in ASD, yet little is known about how individuals with ASD misinterpret facial expressions in ways that make it difficult to accurately recognize emotion in faces. By taking vocal expression and representing it as a pattern, the device can act as a communication tool.

 

 

kaledoscopebackgroundillustration12

 

There are many variations on the way voice is translated into patterns. I was curious about the fluctuations in voice and emotion, and what was interesting was seeing sound waves translated into frequency. I wanted to see what these patterns would look like and how they could help me conceptualize the design of my own project. Through a HAM radio club I found someone willing to talk to me about sound frequency and the beautiful patterns of sound seen through an oscilloscope.

 

screen-shot-2018-11-12-at-11-35-56-pm

Ideation

Early in the process I was pretty secure in my concept. Having a friend with a family member who relates more to colours and patterns, I always wondered why there wasn't a tool to facilitate the interpretation of human emotions for people who face these barriers. It was also very important for me to get out of my comfort zone with regard to coding. I wanted to embark on a journey of learning, even if I was afraid of not sticking with what I already knew I could execute. I knew that sending output from p5.js to Arduino would be much more challenging than the input infrastructure I had gotten comfortable with. I was adamant that this also be a journey of taking chances and true exploration: this project was about communication and growth.

img_2717

While researching aspects of pattern thinking and ASD tools in classrooms, my project went through an initial metamorphosis. At first I thought of this design as a larger light matrix with literal kaleidoscope features; further into the thought process I decided this communication tool should be more compact, easy to fit into a backpack or most carrying mechanisms. Earlier versions also had construction plans for a complex face with cut-out shapes.

 

screen-shot-2018-11-12-at-11-46-38-pm

 

Process

I started with the code right away; I knew my biggest hurdle would be getting p5.js working with Arduino. I began thinking about the architecture of the project. My first step was to think through the flow of how voice would move through p5.js and into the Arduino, and what that code would look like.

Initially I had to decide how the microphone would be incorporated into the design, so I explored adding a microphone to the breadboard versus using the microphone in the computer. At this stage I got started on the serial control application right away, and there were many issues with the application crashing. The first step was to design the voice interface in p5.js, which was a difficult task: I wanted to incorporate the same number of LEDs into the design without it becoming overcomplicated and messy. While designing the interface I began testing the microphone against the p5.js interaction. I was trying to capture the animation of the voice flickering in the p5.js sketch and started looking up code variations for turning the built-in microphone on.
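One common way to keep that voice flicker from jumping frame to frame — a sketch under the assumption that the p5 sketch polls something like p5.AudioIn's getLevel(), which returns a 0–1 amplitude — is to smooth the raw level with an exponential moving average:

```javascript
// Exponential moving average for a 0-1 mic level (illustrative helper;
// alpha near 0 = heavy smoothing, alpha near 1 = almost raw).
function makeLevelSmoother(alpha = 0.2) {
  let smoothed = 0;
  return function update(rawLevel) {
    smoothed += alpha * (rawLevel - smoothed); // move part-way toward the new sample
    return smoothed;
  };
}
```

Each animation frame then draws from the smoothed value rather than the raw microphone sample.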

After this was set up and working I moved back to the JSON and serial control app. There were still connection issues: in the first stage of coding the console could not stay on an open port, so I kept testing variations of turning serial control on and getting it to stay on a specific port. I discovered the port kept changing frequently; reinstalling fixed the issue temporarily.

fritzing-4-boards

Putting together the board and LED lights:

For the LED matrix I decided to use three WS2812B LED pixel rings. For initial testing of the rings, and while deciding how to power and design my breadboard, I kept the rings separate.

screen-shot-2018-11-12-at-10-28-59-am

I had to figure out how to daisy-chain the components so a single data wire led in and out to the Arduino. While powering up the lights I discovered that an external 5 V power source wasn't enough. Some online sleuthing showed that using a 12 V or 9 V source and running it through a DC-to-DC power converter would be better for my LEDs.
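Once the rings are daisy-chained, the LED library addresses them as one long strip, so it helps to map a global pixel index back to a ring and a position on it. A sketch of that mapping, assuming three 24-pixel rings wired ring 0 → ring 1 → ring 2 (the ring size is an assumption, not stated in the build notes):

```javascript
// Map a global pixel index on a daisy-chained run of rings to (ring, pos).
// Assumption: three 24-pixel rings in wiring order.
const RING_SIZES = [24, 24, 24];

function ringPosition(globalIndex) {
  let offset = 0;
  for (let ring = 0; ring < RING_SIZES.length; ring++) {
    if (globalIndex < offset + RING_SIZES[ring]) {
      return { ring, pos: globalIndex - offset };
    }
    offset += RING_SIZES[ring]; // skip past this ring's pixels
  }
  return null; // index past the end of the chain
}
```

With this, a chase animation can think in terms of "ring 1, pixel 5" while the strip library still receives a single flat index.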

screen-shot-2018-11-12-at-11-55-28-pm

Coding the Arduino:

During this process I had to decide what the light patterns would look like. I went through many colour variations and patterns and decided on a chase pattern with colour variations for loudness: how loud or soft the voice was determined how many times the light travelled around the rings. I also had to test variations of brightness; even with the 9 V power source the LEDs drained the power quickly and flickered. The rings proved to be very different operationally from the strips.
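The loudness-to-chase mapping can be sketched as pure logic. The thresholds and colours here are illustrative stand-ins, not the exact values used in the installation:

```javascript
// Map a 0-1 mic level to a chase: louder input drives more laps around the
// rings and a hotter colour (illustrative thresholds).
function chaseForLevel(level) {
  const laps = 1 + Math.floor(level * 4); // 1 lap when quiet, 5 at maximum
  const colour = level < 0.33 ? "blue" : level < 0.66 ? "purple" : "red";
  return { laps, colour };
}
```

Keeping this mapping separate from the LED-driving code makes it easy to recalibrate thresholds without touching the animation loop.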

Finalizing the board and testing:

Once the lights and board were operational I dove into testing the p5 files with the Arduino. There were many calibrations between the two. At first I could see that the port was open but was not sure whether there was communication in the Google console, and since I couldn't use the serial monitor in Arduino, I initially had a hard time discerning whether the Arduino was connecting. I could see numbers running in the console and an open port but still could not get an open connection. I went back to researching what notifications I should see in the console if the Arduino is connected; I found the connection notification but still could not get it running after going over the code. Finally, with a reboot, my microphone and p5.js files were connecting with the Arduino and I could see my voice patterns in the matrix.

Presentation

This experiment brought my learning experience with JSON and serial communication to a whole new level: I learned the ins and outs not just of input but of output as well. Even though there were many connection issues, working through these problems made me a better coder and builder. Getting feedback about expanding on a much-needed communication tool, and seeing how these ideas could improve the lives of other people, was valuable encouragement to keep along this line of thought and continue exploring methods of assisting people through technology.

Added notes on future expansion for this project:

  • To make different sizes of this device for wearables or larger for environments such as presentations.

  • Incorporating a study conducted on voice patterns, light and how that incorporates with autism and  pattern oriented thinkers.

  • To expand on p5.js interface to reflect any findings in the study and expand on design based on these findings.

References

Article on Autism

https://www.forbes.com/sites/quora/2017/07/05/research-shows-three-distinct-thought-styles-in-people-with-autism/#3102323a221e

P5.js to Arduino

https://gist.github.com/dongphilyoo/1b6255eb2fb49f17c7a2ce3fd7d31377

Serial Call Response

https://www.arduino.cc/en/Tutorial/SerialCallResponse

Article Autism and Emotions

http://theconversation.com/people-with-autism-dont-lack-emotions-but-often-have-difficulty-identifying-them-25225

Paper on Emotions and Pattern Resarch

https://pdfs.semanticscholar.org/7e7e/d9bbf56ac238451a7488389731f58dc7a715.pdf

References p5

https://p5js.org/reference/

 



Tiny Trotters

 

screen-shot-2018-10-29-at-10-41-28-pm

 

 

 

 

A digital spin on an old-fashioned toy. Push toys are meant to help and encourage children to walk more by offering a fun interaction. Tiny Trotters is an interactive push toy with a light-up pixel indicator in the wheel; when the toy is around others it becomes a game instilling togetherness. Like a stop light when walking at night, Tiny Trotters work in unison: when together they are green, and if they haven't connected in a short period of time the toys turn yellow, then red, to signal that it's time to go back. With its red indication and bright LED, this can also be considered a safety feature for children who wander off.
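The stop-light behaviour described above boils down to a timer on how long it has been since the two toys last connected. A sketch of that logic, with illustrative thresholds (the real timings aren't stated):

```javascript
// Tiny Trotters stop-light logic (illustrative thresholds): colour depends on
// seconds since the two toys last detected each other.
function trotterColour(secondsSinceContact) {
  if (secondsSinceContact < 30) return "green";  // walking together
  if (secondsSinceContact < 60) return "yellow"; // drifting apart
  return "red";                                  // time to go back
}
```

The wheel's pixel indicator would simply re-evaluate this on every contact check.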

screen-shot-2018-10-29-at-10-43-49-pm

 

screen-shot-2018-10-29-at-10-45-18-pm

screen-shot-2018-10-29-at-10-46-43-pm

screen-shot-2018-10-29-at-10-48-10-pm

screen-shot-2018-10-29-at-10-49-59-pm

screen-shot-2018-10-29-at-10-51-31-pm

screen-shot-2018-10-29-at-10-52-25-pm

screen-shot-2018-10-29-at-10-53-49-pm

screen-shot-2018-10-29-at-10-55-19-pm

screen-shot-2018-10-29-at-10-57-03-pm

 

screen-shot-2018-10-29-at-11-15-27-pm

screen-shot-2018-10-29-at-10-58-30-pm

screen-shot-2018-10-29-at-11-17-31-pm

screen-shot-2018-10-29-at-11-21-00-pm

screen-shot-2018-10-29-at-11-02-01-pm

 

https://github.com/aliciablakey/Pin-wheel.git

 

REFERENCES

http://www.seeeklab.com/en/portfolio-item/
https://m.youtube.com/watch?v=RKBUGA2s9JU
https://www.teamlab.art/w/resonatingspheres-shimogamo/
http://www.cinimodstudio.com/experiential/projects/dj-light#videoFull
https://www.arduino.cc/en/Tutorial/MasterReader
https://www.youtube.com/watch?v=t3cXZKBO4cw
https://www.instructables.com/id/Arduino-Photoresistor-LED-onoff/
https://arduinomylifeup.com/arduino-light-sensor/
https://www.youtube.com/watch?v=CPUXxuyd9xw
http://www.electronicwings.com/arduino/ir-communication-using-arduino-uno
https://www.instructables.com/id/Infra-Red-Obstacle-Detection/


Collabcreature – collaborative mural

Project Title:  Collabcreature

Team: Alicia & Jingpo

Website URL:

https://bit.ly/2IHgjxI

Smartphone:

https://bit.ly/2zTQXKe

screen-shot-2018-10-08-at-11-04-03-am

 

Project Description:

Collabcreature is an interactive screen-based digital painting game that lets people use their images and creativity to produce a collective work of art. It is a collaborative interaction that can be played with as few as 3 and as many as 30 people at a time.

Concept

Collaboration is all about teamwork; we wanted to experiment with an interaction that not only brings people together but initiates cognitive sharing. We were very interested in the amalgamation of the ways people draw and think differently. Even though we all conceptualize in our own individual way, the whole of our differences can create something truly unique and beautiful.

Our research was conducted around collaborative murals and the role they play in bringing communities closer together; for instance when a neighborhood shares a wall mural and paints together. Other scenarios included designers having a preview of how other people work in their group and sharing an experience.  The potential benefit could be that it instills initial cohesion before starting an important project. We were very interested in creating something that effortlessly brought people working together in a positive scenario.

Instructions:

Using your smartphone or computer, open the URL in Google Chrome.

  1. Read the homepage instructions.
  2. Get your canvas.
  3. Fill the canvas within the constraints you are given.
  4. Wait until the other players finish their drawings.
  5. Put 3–20 phones side by side to see the big picture.
  6. If playing remotely with other collaborators, upload your image and check the mural online.
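The side-by-side step could also be computed rather than matched by hand — a hypothetical helper, not part of the prototype — by laying out each player's canvas as a tile offset in a near-square grid:

```javascript
// Lay out n phone-sized canvases as tile offsets in a near-square grid
// (hypothetical helper; tile dimensions default to a common phone canvas).
function muralLayout(numPlayers, tileW = 375, tileH = 667) {
  const cols = Math.ceil(Math.sqrt(numPlayers)); // near-square grid width
  return Array.from({ length: numPlayers }, (_, i) => ({
    x: (i % cols) * tileW,          // column offset
    y: Math.floor(i / cols) * tileH // row offset
  }));
}
```

A gallery page could then composite uploaded canvases at these offsets instead of requiring physical phones on a desk.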

Input: Touch screen drawing tool with different brushes for texture and creativity.

Output: Digital collective mural; the ability to see a singular entity transformed into a bigger piece of art.

 

 

Process:

Stage 1 Brainstorming:

• Drawing improvisation game

• Broken telephone voice mobile game

• Make an animated mural together

• Game maze

 

Stage 2  Concept Development

As a group, the exploration of animation in a collaborative mural was the idea we decided to carry through. We hypothesized about how that interaction would work and what would be the most streamlined way of creating a visual that people could engage with effortlessly. The idea of making it a light show with animation after the individual canvases were completed also interested us, but due to time constraints we knew we could not create an animation that would operate cohesively across all screens, and concluded that it was out of the scope of the project. For the broken telephone concept we discussed getting an automated voice message through text, then instructions to play the game through text message. The next idea was a motion-sensing maze that could move with the user. To conclude this stage, we made an executive decision to prototype a collaborative mural that could turn into any creature, facilitating sharing and understanding through an application. After initial exploration we also felt this was the project we could build within the given time constraints.
collabcreatur-process-day1

Process journal Day1: Research & Preparation

Goal: to create an interactive experience for multiple screens.
Inspiration: We got the idea of creating murals using digital screens. Above is one of our reference examples: people drawing a digital painting on the LambdaVision tiled display at EVL.

Further inspiration came from lining up multiple smartphones on a desk.
Pinch is a wonderful multi-screen interactive application: place the screens next to each other (in any alignment), 'pinch' the screens together, and it links them to work as a single large display. Please refer to the reference above.
(Japan Expo 2014 • Multi-screen Interactive Application)

collabcreature process day 2

 

Process journal Day 2: 

Sharing findings and brainstorming ideation. Exploration of canvas size and development of constraints; planning a canvas that can easily be matched up for a collaborative mural across 2 or more phones. Challenges thus far: making the drawing board and tools in p5.js, and uploading individual drawings and merging them into one big screen with PHP.

Process journal Day 3:
Sketching multiple configurations to further plan the canvas. Storyboarding and conceptualizing the interaction.

process-journal-day-4

Process journal Day 4:
Wire-framing and coding: designing, prototyping, and coding. During this process we were considering how to bring the canvases together, and the various ways we could push separate pages to each individual user. We researched sorting code for random files to be pushed through to our web page. We were also working through how many users we could support, how many canvases we had to individually make, and how to make it responsive for mobile and web.

 

Process Journal Day 5, 6, 7

Putting images into the canvas and continuing to build draw functions. Troubleshooting the sorting script and further research into PHP; at this point we were trying to decide whether to continue with the sorting script and PHP. We tried to backtrack and resize the canvases for mobile, and while testing we were having a hard time using them on the phone. During the final phase before the presentation we could not get the gallery page to work with PHP and the sorting script, but players could still create collaborative art without saving and uploading the final canvas.

Process journal Day 5, 6, 7

Stage 3  Game Testing and Critique

Observations

  • During the exhibit, classmates were looking around, very curious about others' progress.
  • Interesting use of the canvas: some people drew inside the lines, others wanted to expand beyond them.
  • Interesting to see how the players used the guidelines on the canvas.
  • An element of surprise as people stepped away to see how the whole mural looked from everyone's collaboration.

Takeaways for future iterations

  • Improve the functionality of the canvas.
  • Add animations for further interaction.
  • Investigate ways to expand the library of canvases, from different creatures to no guidelines at all.
  • Build a better function for sorting pages, to enable further collaboration and more interesting possibilities.

 

References

Drawing Tool:

https://www.youtube.com/watch?v=i6eP1Lw4gZk

https://www.youtube.com/watch?v=RUSvMxxm_Jo

Upload Image:

https://www.youtube.com/watch?v=lbKMZa-CZ_Y

(Collective painting session on a giant touchscreen)

https://www.youtube.com/watch?v=oA6eowqUSc0

(Japan Expo 2014 • Multi-screen Interactive Application)

https://www.youtube.com/watch?v=5IPB-Bde6X0

https://p5js.org

 

