Experiment 1: Mushrooms, Ferns and Grass

Francisco Samayoa, Isaak Shingray, Annie Zhang
Nick Puckett
Atelier II
January 28, 2019
Project Description

Our project – Mushrooms, Ferns and Grass – was inspired by the book of the same name. It is essentially an interactive painting with an ever-changing aesthetic. With the user’s input, the painting shifts, adds glitch effects, changes textures, and switches colours, all with an audio track playing in the background: Pass the Hours by Toronto’s own Mor Mor. It is supposed to invoke emotion similar to that of a psychedelic trip. With each development, the experience feels like a further descent down the rabbit hole. With a name like that, how could we not be inspired by Lewis Carroll’s Alice in Wonderland?

The project was created using p5.js and PubNub. One computer runs the sketch, while two other computers control the effects on screen, similar to an installation in a public space. One input translates the background image from left to right using the mouseX and mouseY coordinates; another changes the colour tint on screen using the arrow keys. The secret is in the arrangement of each function: since we are not clearing the screen after each frame, the result is an amalgamation of several functions all coalescing into one surreal painting. This was not intentional at first, but through careful experimentation we arrived at a new creation, one better than our previous ideas.
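As a rough illustration of that arrangement, here is a minimal sketch of the display side, assuming the PubNub JavaScript SDK is loaded alongside p5.js; the keys, channel name, message fields, and image file are placeholders rather than our actual configuration.

```javascript
// Display sketch: draws continuously, with input arriving over PubNub.
let bg;          // the background painting
let offsetX = 0; // horizontal translation driven by a remote mouse
let tintHue = 0; // colour tint driven by remote arrow keys

function preload() {
  bg = loadImage('mushrooms.png'); // placeholder asset name
}

function setup() {
  createCanvas(800, 600);
  colorMode(HSB);
  background(0); // called once only -- later frames stack on top

  const pubnub = new PubNub({
    publishKey: 'pub-key',   // placeholder keys
    subscribeKey: 'sub-key',
    uuid: 'display',
  });
  pubnub.addListener({
    message: (m) => {
      if (m.message.mouseX !== undefined) offsetX = m.message.mouseX;
      if (m.message.arrow === 'LEFT') tintHue -= 10;
      if (m.message.arrow === 'RIGHT') tintHue += 10;
    },
  });
  pubnub.subscribe({ channels: ['painting'] });
}

function draw() {
  // Because the canvas is never cleared, each tinted, offset copy of
  // the image coalesces with the previous ones.
  tint(((tintHue % 360) + 360) % 360, 80, 100);
  image(bg, offsetX - width / 2, 0);
}
```

A controller sketch on another machine would simply publish messages like { mouseX: mouseX } or { arrow: 'LEFT' } to the same 'painting' channel.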


Rationale

At first we had our sights set on using Unity. It seemed like a long shot, but we felt the end justified the means. With Unity we would have had an immersive 3D environment that used the book to its fullest potential: imagine being able to walk around and interact with different mushrooms and trees. However, we hit various walls due to our inexperience. Around the halfway mark, we decided to switch over to p5.js. It was better to use what we already knew, since we realized we had bitten off more than we could chew. While we couldn’t create a 3D environment, we approached it differently: we had a canvas on which to “paint a picture”. From there we delegated the tasks and merged our work into one surreal, interactive painting. In the end we simply experimented with different functions, and the result is something we’re proud of.


Photos & Video

img_1775 img_1779

img_1778 img_1786
img_1791

img_1796

img_1799

img_1805

img_1800


Code

https://github.com/avongarde/Atelier-II/tree/master/Assignment%201


Project Context

Van Hemert, Kyle. “These Psychedelic Paintings Were Made Entirely From Code.” Wired, Condé Nast, 10 July 2018, www.wired.com/2013/10/psychedelic-digital-paintings-made-entirely-with-code/.

This project helped guide our aesthetic for the final product. The pictures of Ferris’s project are in most ways similar to ours. His glitch effect was achieved with a Perlin noise function, whereas we opted for more basic methods like offset functions and stacking objects in different orders. If we were to continue working on our project, we would incorporate a Perlin noise function for a more surreal effect.
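For the record, a Perlin-noise glitch along those lines might look something like this in p5.js; the band height and displacement range are arbitrary choices, and 'mushrooms.png' is a stand-in asset name.

```javascript
let img;

function preload() {
  img = loadImage('mushrooms.png'); // stand-in asset
}

function setup() {
  createCanvas(800, 600);
}

function draw() {
  // Displace each 10px-tall band of the image by a smoothly varying
  // noise() value that drifts over time via frameCount.
  for (let y = 0; y < height; y += 10) {
    const dx = map(noise(y * 0.01, frameCount * 0.02), 0, 1, -40, 40);
    copy(img, 0, y, img.width, 10, dx, y, width, 10);
  }
}
```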

“Human/AI Collaboration.” Deep Dream Generator, deepdreamgenerator.com/.

We stumbled upon this website on our quest for inspiration. The infamous Deep Dream Generator is built on DeepDream, the neural-network technique that originated at Google. DeepDream was notorious for creating artwork that sold for thousands of dollars. The realm of possibility for AI collaboration is immense. The website offers a variety of styles to tap into, like the Deep style or the more psychedelic Dream style. If we hadn't proceeded with Unity or p5 we would've utilized the website's tools to bring our vision to fruition. Based on our initial Alice in Wonderland theme, we would've experimented with the Dream style.

The Soundscape Experience

Experiment 4: Final Report

Francisco Samayoa, Tyra D’Costa, Shiloh Light-Barnes
DIGF 2004-001
December 5, 2018

47394798_301736113794454_1847392723955351552_n


Project Details

  1. The user starts at any location on the map.
  2. They put on the headphones.
  3. Depending on the position of their head, the headphones play a particular sound.
  4. While a particular sound is playing, the screen displays an associated image or video (see the sketch after this list).
  5. The user must experience all of the audio and visual data and connect the media to a place that exists in the real world before their opponent does.
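The core of steps 3 and 4 is a mapping from head orientation to a sound, and from that sound to an image. The sketch below shows that mapping in p5.js (with p5.sound) purely for illustration; in the actual build the yaw reading came from the MPU into Unity, so mouseX stands in for the sensor here and the file names are hypothetical.

```javascript
let sounds = [];
let images = [];
let current = -1; // index of the zone currently playing

function preload() {
  for (let i = 0; i < 3; i++) {
    sounds.push(loadSound(`zone${i}.mp3`)); // placeholder file names
    images.push(loadImage(`zone${i}.jpg`));
  }
}

function setup() {
  createCanvas(800, 600);
}

function draw() {
  const yaw = map(mouseX, 0, width, 0, 360); // stand-in for sensor yaw
  const zone = floor(yaw / (360 / sounds.length)) % sounds.length;
  if (zone !== current) {
    if (current >= 0) sounds[current].stop(); // leave the old zone
    sounds[zone].loop();                      // step 3: play its sound
    current = zone;
  }
  image(images[zone], 0, 0, width, height);   // step 4: show its image
}
```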

Project Inspiration

Our project seeks to encompass elements of sound, sensory perception, and nature in an installation-based artwork. Our process began by discussing the types of sensors we wanted to explore; eventually we decided on the MPU-9250 accelerometer and gyroscope. Our idea was to integrate the sensor into headphones, which can then change the sound and visual experience of the user. As the user navigates the room and moves their head about, the world in which they are digitally immersed enhances the real world. Essentially, it is a 4D sound experience. If we had more time we would add visual elements such as 360 video, or a 3D Unity-based environment.


Background Work

In the end, we were able to get the MPU sensor working with Arduino so that we receive X, Y and Z coordinates, after scouring the web for solutions to library-related issues and calibrating the code to receive accurate data readings. Additionally, we were able to map these coordinates to a camera in Unity, so that the sensor’s orientation changes the perspective of the viewer. This meant we had pitch, yaw and roll functioning for the gyroscope component. For the purposes of this experiment we disabled roll, since the user wouldn’t be rolling their head per se. However, there were a few “blind spots” the camera couldn’t pick up, such as the 180-degree mark. The interface was fully functional, for the most part. The main issue was that data overload kept causing Unity to freeze, so our workaround was to reset the Arduino.
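The 180-degree blind spot comes from the yaw reading jumping from +180 to -180 as the head sweeps past that mark. One common fix, sketched here in JavaScript as a hypothetical rather than what we shipped in our Unity camera mapping, is to unwrap the angle so consecutive readings stay continuous:

```javascript
let unwrapped = 0;  // continuous yaw suitable for driving a camera
let lastRaw = null; // previous raw reading in [-180, 180)

function unwrapYaw(raw) {
  if (lastRaw === null) {
    unwrapped = raw; // first reading: take it as-is
  } else {
    let delta = raw - lastRaw;
    // A jump bigger than half a turn means we crossed the seam.
    if (delta > 180) delta -= 360;
    if (delta < -180) delta += 360;
    unwrapped += delta;
  }
  lastRaw = raw;
  return unwrapped;
}

// e.g. raw readings 170, 179, -178 -> 170, 179, 182 (no jump at the seam)
console.log(unwrapYaw(170), unwrapYaw(179), unwrapYaw(-178));
```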

Links To Contextual Work

How to play multiple audio files

Research on Arduino + Gyroscope

Good Link explaining how IMU sensor works

Tutorial for IMU and Unity set up

How to set up MPU and Arduino Library

Unity + Arduino Library


Goals & Features

  1. Engaging UX Design: We want to solidify the conceptual ideas of the soundscape experience to create an immersive UI/UX design. We want this to include elements of storytelling and gamification.
  2. Sound-Sensory Experience: Moving forward we will have to test that the sensor allows smooth transitions between sound files (a crossfade sketch follows this list). The media assets will consist of ethereal soundtracks, data visualizations (natural algorithms and patterns), and nature-related images.
  3. Integrate Wearables: We also need to design a way to integrate the sensors into the wearable technology (the headphones).
    The MPU unit was housed in a sewn leather pouch with velcro attached underneath so it could stick to the top of the headphones. This way, the wires were completely out of sight since they hung from above. For debugging purposes we wanted the MPU unit to be detachable from the headphones. In the end, we were successful.
  4. Discuss concepts of nature, ecosystems and natural algorithms: Lastly, we want to think about how these concepts can work together to create a narrative and a gameplay element.
    Using a blindfold we acquired, we were able to gamify the experience. With the blindfold on, the user would have to guess which environment they were placed in. We would randomly select 1 of 9 unique ecosystems, including 2 songs created by Francisco and Shiloh. These include a war scene, temple, beach, rain forest, and city intersection.
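For goal 2, a smooth transition between sound files amounts to a crossfade. Here is a minimal p5.sound version, assuming both loops are already loaded with loadSound(); the half-second ramp time is an arbitrary choice, not a measured value.

```javascript
let current; // the loop that is currently audible

function crossfadeTo(sound, seconds = 0.5) {
  if (current === sound) return;              // already playing this one
  if (current) current.setVolume(0, seconds); // ramp the old loop down
  sound.setVolume(0);                         // start the new one silent...
  if (!sound.isPlaying()) sound.loop();
  sound.setVolume(1, seconds);                // ...and ramp it up
  current = sound;
}
```

Calling crossfadeTo(beachSound) whenever the sensor enters a new zone avoids a hard cut between tracks.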

Pictures & Videos

screen-shot-2018-11-26-at-6-19-54-pm

img_4926 img_4925

46854703_644056072656401_7774253524338081792_n 47573880_349446675631454_7184122986348150784_n 47571987_341695319963113_3360251171074736128_n 47380658_359783828089595_3941993869863813120_n

20181204_125016


Arduino Basketball: Arcade Basketball

Francisco Samayoa, Isaak Shingray, Donato Liotino, Shiloh Light-Barnes

Kate Hartman

DIGF-2004-001

November 15, 2018


Project Description

Our proposed sensing method is pressure-sensing fabric and digital input buttons. We will be using pressure-sensing fabric, conductive fabric, conductive thread, wires, and electrical tape. For props we will use a basketball and a basketball net. In terms of sensor construction, the shot-success sensor will be a broken circuit woven into the mesh of the netting that is completed when the ball passes through the net. The backboard sensor will be constructed of pressure-sensitive material in order to provide an analog signal. Finally, the foot-position switches will be incomplete circuits that are completed when stepped on by the player. The backboard and foot switches are both analog, and the mesh is digital.

In the end we had to glue conductive fabric onto the basketball because the fabric already on the ball was insufficient to complete the circuit. The mesh had to be made tighter in order for the ball to be sensed by the conductive thread. The foot switches were initially digital, but we made a conscious decision to change them to analog: rather than having players wear aluminum foil on their feet, the players simply have to step on them.

On the screen there will be a scoreboard that coincides with the points scored. There will also be a one-minute timer, during which the player has to score as many points as possible. The score of each throw is calculated based on whether or not the basketball passes through the hoop, the power with which it hits the backboard, and which foot switch the player is standing on. This simulates the actual basketball experience, with 2-point and 3-point lines. When the ball hits the pressure-sensing fabric on the backboard with enough power, a 1.5x multiplier is applied to the basket scored. If there were more time, we would add a power score in relation to the amount of pressure the backboard senses.
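A sketch of that scoring logic in p5.js follows; the pressure threshold (600) and the function name are illustrative, and in practice the three sensor readings would arrive from the Arduino over serial.

```javascript
let score = 0;
let timeLeft = 60; // the one-minute game clock

// madeBasket: mesh circuit closed; backboardPressure: analog reading;
// onThreePointSwitch: which foot switch the shooter stood on.
function scoreShot(madeBasket, backboardPressure, onThreePointSwitch) {
  if (!madeBasket) return;           // no points if the ball missed
  let points = onThreePointSwitch ? 3 : 2;
  if (backboardPressure > 600) {     // a hard hit off the backboard...
    points *= 1.5;                   // ...earns the 1.5x multiplier
  }
  score += points;
}

function setup() {
  createCanvas(640, 480);
  textSize(48);
  textAlign(CENTER, CENTER);
}

function draw() {
  background(0);
  if (timeLeft > 0 && frameCount % 60 === 0) timeLeft--; // ~1s at 60fps
  fill(255);
  text(`Score: ${score}   Time: ${timeLeft}`, width / 2, height / 2);
}
```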

Our vision for the presentation consists of attaching the net to the whiteboard and setting up the foot switches on the ground. The scoreboard will be displayed using a projector. As in the image, the sensors on the ground will be placed in a semi-circle in front of the net, at the two different distances. Look for this product at your local arcade, coming soon!

Project Code

https://github.com/avongarde/Atelier/tree/master/Assignment%203


Photos + Videos
November 13

ball breadboard net net1

November 15
 img_1579 img_1578 img_1577 img_1576 img_1584 img_1585


Project Context

Demortier, Sylvain. “Sigfox-Connected Basketball Hoop.” Arduino Project Hub, 4 Sept. 2017, create.arduino.cc/projecthub/sylvain-demortier/sigfox-connected-basketball-hoop-47091c?ref=tag&ref_id=basketball&offset=0.

This project helped guide our aesthetic for the final product. In the picture of his project, the hoop hangs from the wall with the Arduino tucked behind the backboard. This would be ideal because the Arduino wouldn't get damaged or get in the way of play. If you look closely you'll also see the positive and negative wires (in his case an IR receiver and emitter) on the side of the net, which indicates that the ball triggers the score when passing through the hoop. This is the approach we opted for as well.

Instructables. “Arduino Basketball Pop-a-Shot: Upgrayedd.” Instructables.com, Instructables, 10 Oct. 2017, www.instructables.com/id/Arduino-Basketball-Pop-a-Shot-Upgrayedd/.

This is another Arduino-based basketball game, and it was visually impressive as well. The creator even placed the scoreboard on the backboard itself! Visually, this was a project we wanted to imitate. However, this one uses a distance sensor to count the buckets, while we decided to use pressure-sensing fabric. We did like the idea of a digital scoreboard, and so we referenced this example's scoreboard approach, but we used p5.js to create it instead of a quad alphanumeric display.

Experiment 1: Digital Rain

https://github.com/avongarde/Atelier/tree/master/Assignment%201

For this project, I wanted to emulate the digital rain from The Matrix. The Matrix is one of my favourite movies, and the iconography associated with it is the falling green code. The code is a way of representing the virtual activity inside the Matrix – a simulated world – on screen. For my interface, I used the p5.js JavaScript library and recreated the same effect with my own spin on it. I wanted to show that the user could directly affect the code and its state. The falling green code is associated with uniformity and equilibrium. However, if the mouse veers off to the right of the screen, the code turns red and begins wandering off in many directions, symbolizing the code’s corruption. In conclusion, I used the idea of digital rain and turned it into a visual association of a programmer (the user) encountering code with and without errors. P5.js was perhaps the best tool for this project because of my previous knowledge of JavaScript and Processing; it is essentially a sketchbook for your browser and is accessible for artists and designers.

My father and I are movie buffs, and The Matrix has stood the test of time as one of our shared picks. I’ve always linked the movie to computer programming, and now that I’m learning it in school I wanted to explore the possibility of emulating something from it. When I began to learn p5, I instantly sought out ways to recreate the digital rain. I came across a tutorial on YouTube that showed exactly how to do it – on The Coding Train, a channel I am quite familiar with. But before that, I wanted to create as much of it as I could on my own. I ended up writing the Symbol class myself, using the String fromCharCode() method, and displayed one symbol on the screen. From there I populated the screen with symbols using an array, though with much dissatisfaction. After that, it became increasingly difficult, even with my knowledge and some help from a friend. I ended up referencing the tutorial, but most of the final code – including the code-corruption aspect – is original. One aspect I wish I had improved on was making the canvas full screen.
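A condensed sketch of that approach is below; the character range, speeds, and the three-quarter-width corruption threshold are illustrative stand-ins for the final values in the repository.

```javascript
class Symbol {
  constructor(x, y) {
    this.x = x;
    this.y = y;
    this.speed = random(2, 8);
    this.setChar();
  }
  setChar() {
    // Katakana block: the character set the Matrix rain evokes.
    this.char = String.fromCharCode(0x30a0 + floor(random(96)));
  }
  render() {
    this.y = (this.y + this.speed) % height;
    const corrupted = mouseX > width * 0.75; // mouse veers right...
    if (corrupted) this.x += random(-3, 3);  // ...the code wanders off
    fill(corrupted ? color(255, 0, 0) : color(0, 255, 70));
    text(this.char, this.x, this.y);
    if (random() < 0.1) this.setChar();      // flicker to a new glyph
  }
}

let symbols = [];

function setup() {
  createCanvas(600, 400);
  textSize(16);
  for (let x = 0; x < width; x += 16) {
    symbols.push(new Symbol(x, random(-height, 0)));
  }
}

function draw() {
  background(0, 150); // translucent fill leaves fading trails
  symbols.forEach((s) => s.render());
}
```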

https://www.youtube.com/watch?v=S1TQCi9axzg
https://www.w3schools.com/jsref/jsref_fromcharcode.asp

screen-shot-2018-09-26-at-2-38-28-pm

Figure 1. One symbol centered on the canvas

screen-shot-2018-09-26-at-2-42-18-pm

Figure 2. A stream of symbols centered on the canvas falling from above

screen-shot-2018-09-26-at-2-41-20-pm

Figure 3. Version 1.0: Green symbols populating the screen and falling from above

screen-shot-2018-09-26-at-2-44-55-pm

screen-shot-2018-09-26-at-2-43-37-pm

screen-shot-2018-09-26-at-2-45-04-pm

Figure 4abc. Version 2.5: mouseX is incorporated; it affects the fill colour and the symbols’ x-values

video-1-1