Hunting Game: Progress Report

 

Jin Zhang (3161758) & Siyue Liang (3165618)

Inspiration

In our previous projects, we focused mainly on visual effects, and most of them were a bit weak in terms of interactivity. For this final project, we decided to explore physical interaction. We want to build a shooting game using physical materials and make it feel realistic. Our idea is a shooting-themed game in which the player uses a laser pointer or a toy gun to aim for the center of the targets.

 

Contextual Work

The duck hunting game:

https://www.youtube.com/watch?v=YyM6MmBM14w

 

A light sensor will be placed in the center of the target, and a servo motor is attached to the target's base. When the player points the laser at the center, the target falls flat. Some of the details still need to be refined, and we are still in the testing phase.

 

Materials & Techniques

  • Light sensors/pressure sensors
  • Servo motors
  • Laser pointers/laser light
  • Cardboards and aluminum foil

 

Since our idea for this project is quite different from what we did in previous projects, we might not reuse any of the same materials or techniques. We are still deciding what kind of sensor to use as the switches in this game.

 

Code & Wiring

 

#include <Servo.h>

Servo servo1;

int forcevalue = 0;  // raw force sensor reading
int pos1 = 90;

void setup() {
  Serial.begin(9600);
  servo1.attach(9);    // servo on digital pin 9
  servo1.write(pos1);  // initial position for the servo
}

void loop() {
  forcevalue = analogRead(A0);  // sensor attached to analog pin 0
  Serial.print("Sensor value = ");
  Serial.println(forcevalue);

  // int value = map(forcevalue, 0, 1023, 0, 255);

  if (forcevalue >= 100) {  // `>=`, not `=`: a single `=` assigns instead of comparing
    for (pos1 = 90; pos1 <= 180; pos1 += 30) {
      servo1.write(pos1);
      delay(15);  // give the servo time to reach each step; 1 ms was too short
    }
  }
}


Work in Progress

wechatimg4015

wechatimg4017

The sensor works as a switch to control the servo motor and sound.

wechatimg4019

wechatimg4021

The sensor value jumps from 0 straight to 1023 for some reason. We were confused and figured there might be something wrong with the wiring or the code. We are still testing it and trying to find a solution for this issue.
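A jump straight from 0 to 1023 is often a sign of a floating analog pin; a pull-down resistor (for example, 10 kΩ from A0 to GND) usually fixes the wiring side, and a short moving-average filter smooths out any remaining spikes. A sketch of that filter logic, written in JavaScript for illustration (the function names are ours; the same arithmetic ports directly into the Arduino loop):

```javascript
// Moving-average filter over the last N raw readings.
// A single 1023 spike barely moves the average, so it won't
// count as a "hit"; a sustained press will.
function makeSmoother(windowSize) {
  const buffer = [];
  return function smooth(raw) {
    buffer.push(raw);
    if (buffer.length > windowSize) buffer.shift();
    const sum = buffer.reduce((a, b) => a + b, 0);
    return Math.round(sum / buffer.length);
  };
}

// Only treat the reading as a hit once the *smoothed* value
// crosses the threshold.
function isHit(smoothedValue, threshold) {
  return smoothedValue > threshold;
}
```

With a window of 16 samples, one spurious 1023 among zeros averages to about 64, safely under a threshold of 100.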

The Soundscape Experience



Tyra D’Costa | Shiloh Barnes | Francisco Samayoa

screen-shot-2018-11-26-at-6-19-54-pm

Project Inspiration

Our project seeks to bring together sound, sensory perception, and nature in an installation-based artwork. Our process began by discussing the types of sensors we wanted to explore; eventually we decided on the MPU-9250 accelerometer and gyroscope. Our proposed idea is to integrate the sensor into headphones, which can then change the sound and visual experiences of the user. As the user navigates the room and moves their head about, the digital world in which they are immersed will enhance the real one.

Goals

  1. Engaging UX Design: We want to solidify the conceptual ideas of the soundscape experience to create an immersive UI/UX design. We want this to include elements of storytelling and gamification.
  2. Sound-Sensory Experience: Moving forward, we will have to test that the sensor allows smooth transitions between sound files. The media assets will consist of ethereal soundtracks, data visualizations (natural algorithms and patterns), and nature-related images.
  3. Integrate wearables: We also need to design a way to integrate the sensors into the wearable technology (the headphones).
  4. Discuss concepts of nature, ecosystems, and natural algorithms: Lastly, we want to think about how these concepts can work together to create a narrative and gameplay element.

Background Work

So far, we have been able to get the MPU sensor working with Arduino so that we receive X, Y, Z coordinates. This required scouring the web for solutions to library-related issues and calibrating the code in order to receive accurate data readings. Additionally, we were able to map these coordinates to a camera in Unity, so that the sensor's orientation changes the perspective of the viewer.
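As a rough illustration of the calibration step, tilt angles can be estimated from the raw accelerometer axes before handing them to a camera. This is only a sketch under standard tilt-sensing assumptions (the axis signs depend on how the MPU is mounted), not the exact code we used:

```javascript
// Rough pitch/roll estimate from raw accelerometer axes (in g).
// Axis conventions vary by mounting, so treat the signs as assumptions.
function tiltFromAccel(ax, ay, az) {
  const rad2deg = 180 / Math.PI;
  const pitch = Math.atan2(ax, Math.sqrt(ay * ay + az * az)) * rad2deg;
  const roll  = Math.atan2(ay, Math.sqrt(ax * ax + az * az)) * rad2deg;
  return { pitch, roll };
}
```

For example, a device lying flat (az = 1 g) reads zero pitch and roll, while tipping it fully onto its x-axis reads 90° of pitch. Gyroscope integration is still needed for fast head motion, since accelerometer-only tilt is noisy during movement.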

Links To Contextual Work

How to play multiple audio files

Research on Arduino + Gyroscope

Good Link explaining how IMU sensor works

Tutorial for IMU and Unity set up

How to set up MPU and Arduino Library

Unity + Arduino Library

Project Details

  1. User starts at any location on the map
  2. They put on the headphones
  3. Depending on the position of the head, the headphones will play a particular sound.
  4. If a particular sound is playing then the screen will display an associated image or video
  5. The user must experience all the audio and visual data in order to try and make a connection between the media and a place that exists in the real world before their opponent. 

Sketches

img_4916img_4924img_4924img_4926

Process

46837114_564283070696082_6498849018557235200_n 46812922_473012876557775_1587641447014727680_n

46801407_337187233779982_3353290089944842240_n 46854703_644056072656401_7774253524338081792_n 46796952_337902460325535_4724015759863316480_n

Air Printing: Drawing Physical Objects with Leap

Experiment 4: Progress Report:

Air Printing: Drawing Physical Objects with Leap

 

Salisa Jatuweerapong, Sam Sylvester, Melissa Roberts, Mahnoor Shahid

Atelier I: Discovery 001

Kate Hartman, Adam Tindale, Haru Ji

2018-11-27

 

Inspiration

We started with an idea of drawing in the air and transmitting art onto the screen with the movements. At first, we thought of using an accelerometer or conductive paint proximity sensors. We didn’t want any sensors to be attached to the hand. Through research and feedback, we discovered the Leap Motion Controller and a project called “Air Matter”.

“Air Matter” is an interactive installation by Sofia Aronov. The installation takes a new approach to traditional pottery using the Leap Motion Controller: the viewer draws a 3D pot in the air, which is then 3D printed. An Arduino with potentiometers is also used to control aspects of the model.

 

Context

This project is an exploration of alternative interfaces and of virtual and physical space.

We took the “Air Matter” installation as our main inspiration. Instead of drawing a vase, we decided to draw a sculpture made of thin rectangles. This idea was based on the disappearing sculptures of Julian Voss-Andreae, which, depending on the point of view, seem to disappear into thin air. Our project “conjures” physical objects from thin air, yet the physical objects it creates disappear back into thin air (conceptually; our final design isn't printed thin enough for that to actually work). There's something to be said about the transfer of objects from physical, to virtual, back to physical space, and their permanence and materiality in each layer.

Related interfaces include: webcam motion tracking, Kinect, and a variety of glove interfaces (Captoglove game controller, Mi.Mu). We chose to explore Leap as it seemed an exciting challenge; as well, we wanted to explore extremely non-invasive, non-physical interfaces (no gloves).

Other work that is being done on Leap includes Project Northstar, a new AR interface that aims to redefine the AR experience. Otherwise, the Leap team is focused on creating accurate hand tracking software to be used as a tool for any other projects.

Links to Contextual Work

Air Matter: https://www.sofiaaronov.com/air-matter

Julian Voss-Andreae Sculpture: https://www.youtube.com/watch?v=ukukcQftowk

Mi.Mu Gloves: https://mimugloves.com/

Northstar:https://developer.leapmotion.com/northstar

Images of the work in progress

Progress Timeline Checklist (link).

Thursday 22nd: 

Designing the visuals

11-23-04811-23-047

Friday 23rd:

Getting Started with Leap

received_305280590079371 received_334901954003267 received_347159419428689

Tried out the Leap and ran into some challenges with the different software available for download. The tutorials we found (see Research) seem to apply to some versions of the software and not others.

Monday 26th:

Processing Sketch with mouseDragged

screen-shot-2018-11-25-at-4-11-45-pm screen-shot-2018-11-25-at-8-34-32-pm screen-shot-2018-11-26-at-1-02-43-pm

In the sketch, the person draws a squiggle with their finger as an outline for the sculpture. Thin rectangles are placed at specific x positions to maintain a consistent gap between them. The height of each rectangle is determined by the y position of the cursor or the person's finger.

Processing Sketch with Leap

finger_painting_1003 finger_painting_1010

finger_painting_1102
Wrote code in Processing using the LeapMotion for Processing library. Used input from the Leap to draw a line. Boxes are drawn centered along the vertical middle of the screen: the height and depth of each box depend on the y position of the user's finger, and placement along the x-axis depends on the x position of the user's finger (a box is drawn if the x value is divisible by 50). The width of the box is constant. There is a bit of a lag between the line being drawn and the box being drawn, so the line has to be drawn slowly.
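The placement rule described above can be sketched as a pure function (the gap, box width, and canvas values are assumptions for illustration, not our exact constants):

```javascript
const GAP = 50;        // boxes only at x positions divisible by 50
const BOX_WIDTH = 10;  // constant width

// Decide whether a sampled finger position produces a box, and with
// what dimensions. Boxes are centered on the vertical middle of the
// canvas; height and depth grow with distance from that middle.
function boxAt(x, y, canvasHeight) {
  if (Math.round(x) % GAP !== 0) return null;    // keep a consistent gap
  const h = Math.abs(canvasHeight / 2 - y) * 2;  // height from y position
  return { x: Math.round(x), y: canvasHeight / 2, w: BOX_WIDTH, h: h, d: h };
}
```

Sampling every frame but only emitting boxes at multiples of the gap is what makes slow drawing necessary: a fast stroke can skip right over an eligible x position between frames.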

JavaScript finger/hand code reference: https://developer-archive.leapmotion.com/documentation/javascript/api/Leap_Classes.html?proglang=javascript

Tuesday 27th:

Converted Processing Sketch to Javascript

finger_painting-1 finger_painting-2

There was no STL export for the Processing version we were using, so we had to switch to JavaScript. This was important since Melissa's STL library code from Experiment 2 had already proven to work.

In the JavaScript code, we used the following libraries:

  • Three.js (STL export)
  • Leap.js (Leap Motion Controller library for JavaScript)
  • P5.js
  • Serial Port

Pictured above is the functional p5.js/leap.js code.

Implementing three.js library into the functional p5.js/leap.js code

screenshot-58

This involved getting rid of p5 code, as the three libraries (three, p5, and leap) didn’t work well together. The biggest changes were changing how we created 3D shapes, creating a renderer to replace our canvas, setting up a scene (full of our shapes) to animate and render, and including an STL exporter, which will allow us to print the 3D object drawn on the screen.

The Leap coordinate system turned out to be very different from the Three.js coordinate system, which meant the shapes we had created displayed far larger than originally intended. However, the code technically works: the scene (airPrint) has our shapes in it, and they are reproduced on the screen. Leap's coordinate system is measured in millimetres, with the origin at the center of the device's top surface.
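Bridging the two coordinate systems comes down to scaling the millimetre values and re-centering the vertical origin (Leap's y starts at the device surface, so hands hover a couple hundred millimetres above zero). A minimal sketch, with a hypothetical scale factor and offset that would need tuning against the real scene:

```javascript
// Convert a Leap point [x, y, z] (millimetres, origin at the center of
// the device's top surface) into scene units. The scale and yOffset
// defaults here are placeholder assumptions, not calibrated values.
function leapToScene([x, y, z], scale = 0.01, yOffset = 200) {
  return [x * scale, (y - yOffset) * scale, z * scale];
}
```

A point hovering 200 mm above the device center maps to the scene origin, which keeps drawn shapes a sensible size instead of hundreds of units across.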

Further steps involve possibly implementing additional controls with Leap.

Connected Arduino to USB C

screen-shot-2018-11-27-at-12-03-28-pm

Using WebUSB, we created a workflow in which a physical button push acts as ‘enter’ on the keyboard.

This push button will download the STL file from the sketch, which can then be used to 3D print.

GitHub: https://github.com/SuckerPunchQueen/Atelier-Ex-4?fbclid=IwAR1wH5XmWsn4G-S9e3zezb8yrDMfmp56uRA7xzTVI80JTh3Wj-hnKFjrZ-w 

Previous Experiments

Melissa’s Nameblem provided a starting point for a generative code → .stl → 3D printing workflow. Melissa’s original project combined p5.js with three.js and exported a .stl file that she then had to fix manually in 3D Builder. While we had hoped to simply reuse this code for Air Printing (it is a rather technical workflow), we are having issues interfacing Leap.js with p5.js. We are also hoping to automate the process.

Mahnoor’s work with capacitive sensing in Experiment 3 inspired our original interface for air sensing. Her umbrella had a proximity sensor made with conductive paint and the CapSense library, and we reasoned we could use two capacitive sensors on two different axes to take an x and y position for a hand. This would not be as accurate as Leap, and since Melissa wanted to buy a Leap anyway, we opted to use that for our project.

We are using p5.js, which Adam introduced us to in Experiment 1, to draw our design.

Haru’s Endless Forms Most Beautiful, specifically the experiments based off William Latham’s work, was our launch point for the visual design. Originally, our code was a bastardized Tinkercad / building blocks game. We felt that we could do more visually, to elevate the project from a tool/workspace to an actual artwork. We looked at the rule-based work we explored in Haru’s unit for inspiration, since we were already restricted by rules as to what would practically be able to print (basic geometry, cubes, connected lines).

Experiment 4 – Progress Report

Brian Nguyen – 3160984

Andrew Ng-Lun – 3164714

Michael Shefer – 3155884

Rosh Leynes – 3164714

Soup of Stars

Inspiration

The inspiration for the project developed as we looked into the potential of our original concept. We started with the idea of movement, implementing it with analog sensors as a ball game where users would try to keep a ball up using body parts with sensors attached. After reviewing several pieces, we decided to base the concept entirely on a webcam, because we wanted the body to be the entire subject of the concept.

Relevant Links for Inspiration

https://medium.com/tensorflow/real-time-human-pose-estimation-in-the-browser-with-tensorflow-js-7dd0bc881cd5

https://ml5js.org/docs/posenet-webcam?fbclid=IwAR2pg6qdmZfbi0Gxi3ohxtP9tcXUpokaYj6triiHtw6giJ9vTbYVyM1LNWI

Context

The project uses a webcam along with PoseNet and p5. With the PoseNet library, a skeleton is constructed from the subject registered by the webcam. In p5, a visual is constructed in the background: a particle system intended to resemble stars. While still focusing on movement, the particle system reacts to the movement of the skeleton (mostly the limbs). As the arms move across the canvas, the particle systems swirl and twist, following the movement of the skeleton. The skeleton of the subject also appears on the canvas. Additionally, more than one individual can be registered as a skeleton, as long as they are in proper view of the webcam. The intent is to provide a sense of interactivity where the individual has an impact on the environment and can alter it the way they see fit.
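The swirl behaviour can be sketched as a small per-frame update rule: particles near a skeleton keypoint (say, a wrist) get a push perpendicular to their offset from it, which reads as a twist around the limb. The radius and strength constants here are placeholder assumptions, not our tuned values:

```javascript
// Nudge a particle's velocity based on a nearby skeleton keypoint.
// Particles within `radius` get a tangential push (the offset vector
// rotated 90 degrees), producing a swirl around the limb.
function swirl(particle, keypoint, radius = 80, strength = 0.5) {
  const dx = particle.x - keypoint.x;
  const dy = particle.y - keypoint.y;
  const dist = Math.hypot(dx, dy);
  if (dist === 0 || dist > radius) return particle; // out of influence
  particle.vx += (-dy / dist) * strength;
  particle.vy += ( dx / dist) * strength;
  return particle;
}
```

Running this once per frame for each PoseNet keypoint against each particle is O(particles × keypoints), which stays cheap at webcam frame rates for a few hundred particles.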

screen-shot-2018-11-27-at-11-01-33-am

Pictured above is the skeleton, using the PoseNet demo, that will control the particle system. The movement of the limbs will be crucial in altering the environment. There are some issues where limbs aren't recognized at times, especially when they are close to the body.

20181127_091845

Pictured above is the implementation of the Posenet library with P5

20181127_092035

Previous Materials/Experiments

For Experiment 3, we used the webcam with p5 to construct an image out of particles. We managed to manipulate the particles with a sensor, combined with the webcam feed. For this experiment, we are still using familiar elements such as the p5 particle system and the webcam feed projection, but altering the concept and their relation to one another.

Untitled (for now)

 

Dimitra Grovestine 3165616 | Kiana Romeo 3159835

Context

The concept of the afterlife has always been a controversial and highly debated topic. Many cultures and individuals disagree on what happens after death. Some believe in reincarnation, while others think the human soul leaves its shell and ascends into a higher place. Some even believe nothing happens at all. In our project, we aim to encompass all these concepts: the visitor will go through many stages of life after death. First, they will walk into darkness; then a walkway will slowly light up as they advance, until they reach the end of the path, where a beautiful display of clouds will appear, signalling they have reached the end.

 

Inspiration

We were inspired by the idea of dying and “going into the light”; something grim and undesirable becoming something beautiful and inviting. But of course, no one wants to die in order to experience this. Our project aims to fabricate the concept of “Heaven” and the afterlife by recreating it in many ways. We want to give people a near death experience without them actually having to die, complete with visuals and sounds we believe would be present when in this scenario.

 

Concepts/ how it will work

In order to create this effect, we will be using proximity sensors to trigger events. The space will be dark or inactive when no one has entered the room. Whispers will be heard around the room as the other “souls” acknowledge your presence. The first proximity sensor will be connected to one laptop and will trigger the LED strips. These strips will gradually light up as you walk, lighting a path that represents being welcomed into the afterlife. Upon reaching a certain point, the next proximity sensor, connected to a second device, will be activated: heavenly clouds will gradually appear and angels will sing, marking the moment you reach your final destination. A concept we are taking from a previous experiment is the idea of using sensors to trigger different events. Although we did not create these sensors ourselves, we plan on formatting and designing the space so that the sensors are completely unnoticeable, and on using materials that make the space as aesthetically pleasing as possible.
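The gradual light-up reduces to a simple mapping from the visitor's progress along the path to a number of lit LEDs. A sketch of that mapping, with hypothetical path and strip lengths (the real values depend on the installation space):

```javascript
// Map how far the visitor has walked to how many LEDs should be lit,
// so the strip "fills in" as they advance. The clamp keeps the count
// valid even if the distance estimate over- or under-shoots.
function litLeds(distanceWalkedCm, pathLengthCm, ledCount) {
  const t = Math.min(Math.max(distanceWalkedCm / pathLengthCm, 0), 1);
  return Math.round(t * ledCount);
}
```

On the Arduino side, the same fraction would drive a loop that writes the first `litLeds(...)` pixels of the strip on each update.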


Contextual articles and videos:

http://time.com/68381/life-beyond-death-the-science-of-the-afterlife-2/

https://www.youtube.com/watch?v=axSxCo_uMoI&frags=pl%2Cwn

 

Progress Report: Shooting Gallery

Phantom Blasters (Working Title)

Maddie Fisher-Bernhut, Donato Liotino, Ola Soszynski

We were inspired by classic carnival and arcade games, namely shooting galleries. We also wanted to work with the gyroscope sensor and vibration motors. We really enjoy the aspect of friendly competition between two players and wanted to create something that celebrates that. Playing off of that, we wanted to incorporate magic and technological themes to show the competitive aspect in a literal sense; whether this will carry through is currently undecided. We truly want to make a game people enjoy playing. Furthermore, we wanted to work off of typical shooting-gallery games, shown below in the first two links.

Checking collision for the mouse and target
Detecting mouse presses

Otherwise, we also wanted to incorporate vibration motors by having the targets be invisible, tracked only by vibration in a sonar-like way, becoming more intense the closer the crosshair is to the target. By doing this, we will be working mostly with new things while incorporating what we already know. We also want to build on some of what Maddie did in Assignment 1: tracking the players' crosshairs with a colour-tracking camera. The coding ideas are a tad more simplified, so we can focus mostly on the new sensors and buzzers, allowing us to learn how they work. Another new aspect we are working with is loading GIFs in p5.js, something Donato and Ola have both been unable to figure out thus far.

assets in process
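The sonar-like feedback boils down to mapping the crosshair's distance from the hidden target onto a vibration strength. A sketch, assuming a 0–255 PWM range and a made-up falloff distance:

```javascript
// Sonar-style feedback: vibration gets stronger as the crosshair nears
// the hidden target. maxDist (pixels) is a placeholder assumption.
function vibrationStrength(crosshair, target, maxDist = 300) {
  const d = Math.hypot(crosshair.x - target.x, crosshair.y - target.y);
  const t = 1 - Math.min(d / maxDist, 1); // 1 on target, 0 far away
  return Math.round(t * 255);
}
```

The returned value could be written straight to a motor driver pin with `analogWrite` on the Arduino side; a non-linear curve (squaring `t`, say) might make the "getting warmer" feel more dramatic near the target.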


Research and context:

https://www.bigfishgames.com/online-games/8942/shooting-range/index.html

https://www.theglobeandmail.com/life/health-and-fitness/why-a-bit-of-healthy-competition-is-good-for-everyone/article8749934/

https://www.psychologicalscience.org/news/minds-business/the-upside-of-rivalry-higher-motivation-better-performance.html

Experiment 3 – Naruto Glove Game Controller

image24

Experiment 3: Material as Sensors

Naruto Glove Game Controller

Madelaine Fischer-Bernhut, 3161996

Salisa Jatuweerapong, 3161327

Brian Nguyen, 3160984

Atelier 1: Discovery, DIGF-2004-001

15/11/2018

Concept

A Naruto Primer:

From Wikipedia: “Naruto (ナルト) is a Japanese manga series written and illustrated by Masashi Kishimoto. It tells the story of Naruto Uzumaki, an adolescent ninja who searches for recognition from his peers and the village and also dreams of becoming the Hokage, the leader of his village.”

The Naruto franchise is one of the most internationally well-received and popular Japanese IPs, ranking as the third best-selling manga series in history. This pervasiveness gives us a solid launching point for our project: the majority of people should at least have heard of it, while for others it invokes deep nostalgia and childhood memories of mimicking hand jutsus in hopes of creating fireballs, and of wearing dorky headbands and cosplays for play-acting.

The main power system of Naruto revolves around chakra, a “life energy” that all ninjas can tap into and channel into a variety of attacks and powers (jutsus) by using hand seals. Hand seals (Hare, pictured on the left) are specific hand gestures, and different combinations of them create different types of attacks. There are 12 basic hand seals (based on the zodiac animals).

Since the series began, over 50 Naruto games have been released, and the majority are fighting games that rely on jutsus alongside physical attacks. However, for the most part they are all played using generic controllers, i.e. button presses. The game we are using, Super Smash Flash 2 (pictured on the right), is a fan-made Flash game made in homage to the Super Smash Bros. series. In this game, the creator decided to add many popular characters from anime and gaming that are not found in the original series: for example, Goku from the Dragon Ball series and Naruto from the Naruto franchise, whom we are using for our project.

Our concept is to bridge the gap between the physical world and the digital world through motion gesture controls that put you in the shoes of the playable character. Instead of pressing a button to create a hand seal inside the game and form the jutsu, the player would be able to form the hand seal in the physical world and have it reflected by the character in the game world; instead of pressing arrow keys to move your character around, the player would be able to step in a direction and have the character on screen copy their movements. In the spirit of childhood nostalgia and avid fans, we are turning people into ninjas.

image31 image11

Group Member Roles

Madelaine: Creating the positive glove, Leaf Symbols, sensor placement plans, and forearm guards creation.

Salisa: Initial conception, creating the grounded glove, sensor placement plans, and movement mat design and creation.

Brian: Programming the Arduino Micro and finding a workable game.

All: Conceptual planning and circuit creation, documentation.

Initial Planning

Placement of sensors

image16 image14 image4

Image 1: Initial sensor/switch placement colour coded

Image 2: Updated sensor/switch placement colour coded

Image 3: Sensor/switch designing (shape, type of stitch, positioning, etc)

When we were planning the placement of the switches, we had to make sure we would not accidentally trigger the wrong switch when making a hand sign. The first step was to choose only hand signs that did not have many of the same points touching. We decided on Boar, Dog, Ram, and Hare because their signs almost all made contact with at least one distinct point. At first, we thought of creating a switch by putting the ground and positive sides of the circuit on the same hand; the other hand would be unattached to any wires and would bridge the circuit (acting only as an activator). We decided against this idea because we realized it made more sense to have one of the gloves be the ground, to limit the clutter of too many wires. Because of this decision, the left hand contained the triggering switches connected to each gesture.

Hare and Ram are the only gestures that required overriding another sensor, the palm switch. We knew from the beginning that Ram had a point overlap, but for Hare it only came to us while testing the performance of the gloves. Ram's unique point of contact is the middle and index fingers, and it shares the palm point with Boar. Hare also shares a palm point with Boar, depending on the angle the player uses when making the gesture, so we had to make sure its unique point, the crease of the pinky and ring fingers, overrode the palm point.
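The override priority described above can be expressed as an ordered check: unique contact points are tested before the shared palm point, so Hare and Ram win whenever their unique point is closed. The point names and the Dog/backplate pairing here are our assumptions based on the description, not the exact code running on the Micro:

```javascript
// Resolve the active hand seal from the set of closed contact points.
// Order matters: unique points (Hare, Ram) override the shared palm
// point, which alone means Boar. `points` is a Set of contact names.
function resolveSeal(points) {
  if (points.has('pinkyRingCrease')) return 'Hare'; // unique, overrides palm
  if (points.has('middleIndex'))     return 'Ram';  // unique, overrides palm
  if (points.has('backplate'))       return 'Dog';  // assumed pairing
  if (points.has('palm'))            return 'Boar'; // shared point alone
  return null;                                      // no seal formed
}
```

On the Arduino Micro, the same priority ladder would map each resolved seal to its `Keyboard.press` key.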

Materials

Hardware: Arduino Micro. It has the ATmega32U4 chip, which allows it to act as a keyboard when connected to a computer over USB. Full support for programming keyboard keys on the Micro was extremely helpful in writing the code. Before this, we considered alternative ways to simulate keyboard input, such as a Python script or a Chrome extension listening on the serial port. Donato also suggested using a Makey Makey.

Gloves: A black knitted glove, black thread, silver (non conductive) thread, conductive paint, conductive fabric, and conductive thread.

Movement Mat: Velostat pressure sensitive material, aluminum foil, neoprene, and fabric glue.

Forearm Guards: Black poster paper, neoprene, electrical tape, black thread, and fabric glue.

Implementation

Gloves

While we originally debated sewing our own gloves (this way we could fully match the design and create the entire ground glove using conductive material), we ended up grabbing 99 cent gloves from Chinatown because of the time crunch. We chose knit gloves over leather ones for the price point and for the one-size-fits-all quality. These gloves stretched rather well and were able to fit grown men with larger hands.

The biggest issues we ran into were the shoddy construction of the gloves (we had to patch up at least 3 holes that just opened up) and the stretchiness of the knit, which made it impossible to iron on the conductive fabric as we originally planned. It was also difficult to sew evenly-placed stitches; the craftsmanship was a steep learning curve.

Outside

image32

image15

Image 1: Back of gloves      Image 2: Front of gloves

Inside

image22 image19

Image 1 (from left to right): Conductive fabric “wire” connections for the pinky and index finger crease, the middle finger, and the Backplate.

Image 2 (from left to right): Conductive fabric “wire” connection for the Leaf Symbol and the ground.

*the paint cracked by the end of the day, unfortunately, leaving just half a leaf

The sensors/switches were made with either conductive thread, to be as unobtrusive as possible, or conductive fabric, to allow more intricate designs. Each sensor/switch's positive side was connected to the alligator clips going to the Arduino Micro using a thin strip of fabric inside the glove. The reasons for this were:

  1. We wanted the gloves to remain as close to the original as possible. This did not include silver thread crisscrossing the black glove.
  2. If we used thread, the stitches on the outside of the glove might accidentally trigger the wrong hand gesture if the wrong sensor were touched by the ground.
  3. We did not want to use wires as they would be too rigid and would make putting on the gloves difficult.

We decided to stick with the black gloves and Leaf Symbol design because they are so iconic in the Naruto Universe. The black gloves with the metal plate on the back are a signature accessory for Naruto’s ninja mentor, Kakashi. The Leaf Symbol is the symbol of Naruto’s home village in the franchise. Both Kakashi’s gloves and the leaf symbol are extremely popular in Naruto themed merchandise and cosplay, so Kakashi’s gloves are usually designed with the Leaf Symbol (the plates in the canon universe are plain) and pretty much accepted as fan-canon.

The Leaf Symbol for the palm switch was a stylistic choice to keep with the theme. Many of our initial ideas were basic palm padding designs. We ended up scrapping the idea because we felt that knitted gloves usually do not have pads and that the silver of the fabric would be too different from the original black glove.

The biggest issue we had creating the sensors was the conductive thread being covered by the thick fabric of the knitted glove. Even after adding more thread to the sensor spots, some of the hand gestures (the Hare and the Ram) had trouble triggering, mostly due to larger hands. Sewing the conductive fabric designs was also difficult: the material is extremely stretchy, so it was easy to make a placement mistake that looked normal when the glove was unworn and extremely off when worn, or vice versa. The conductive thread's tendency to twist and knot was also the bane of our existence.

On a stylistic note, the conductive stitching on the seams was meant to create “invisible” sensors; however, the stitches sank too far into the fabric if they were not bunched up, and were not so “invisible” if they were. We stuck with it on the ground glove, as it was needed to complete the circuit, but avoided it on the right glove because the look was not as polished as we imagined. If we were to use a different material for the glove (and get really good at sewing), stitching matching seams on both gloves would effectively hide the sensors in plain sight. Using black conductive thread would also help.

Hand Signals/Gestures:

image6 image5

Key: I – Move: Defense Barrier/Shield

Switch Sensor Location: Backplate of the right glove (positive), Leaf Symbol on the palm of the left glove (ground)

image28 image21

Key: O – Move: Special

Switch Sensor Location: Conductible thread seam in the crease between the pinky and ring finger on the right glove (positive) and between the thumb and pointer finger on the left (ground)

image12 image25

Key: P – Move: Basic Attack/Smash Attack

Switch Sensor Location: Leaf Symbol on the palm of the left (ground) and right (positive) gloves, plus a conductive thread sensor around the middle finger on the right (positive) and left (ground) gloves

***For the longest time we thought this gesture was the Tiger sign. So please do not mind the slightly off finger positioning in documentation photos.

image13 image8

Key: U – Move: Grab    

Switch Sensor Location:  Leaf Symbol on the palm of left (ground) and right gloves (positive)

One thing we debated was making the hand seals work even in mirrored position (mostly for Dog), as many players mixed up their left and right hands. We ultimately decided not to, as it went against the spirit of the ninja way. As a hypothetical commercial decision, this controller is marketed toward fans, so it should not be an issue; in fact, making the seals work in reverse would likely draw complaints from hardcore fans.

Testing

image38 image1 image3 image9

Every time we finished an element of a switch (positive or ground), we tested that current was able to run through the circuit using an LED and a battery. The placement of the Leaf Symbols, the Backplate, and the sensors/switches was roughly figured out using tape on our hands, then on the glove (and a fabric pen later on).

Movement Foot Control Mat

image33 image39 image27 image29 image34

Image 1: initial concepts for floor mat

Image 2: WIP of floor mat

For the concepts, we conducted research by simply walking around and trying to figure out how to create a natural body sensor for movement. This was a scrolling 2D game, so a 3D floor mat like DDR's did not make intuitive sense to us: should the up arrow be jump, or walking forward on the screen? Our project was aiming to connect the player to the onscreen character as much as possible, so similar movement was key. The other key design focus was ease of reach: we wanted the player to not have to search or reach far for a sensor, and ideally always be able to press a sensor without moving their foot placement.

Based on the way the characters stand on the Smash screen (and in other Tekken-style games), we came up with the layout pictured on the lined paper: a ready-position half-crouch, with one foot in front of the other. The drawback was that it was not intuitive at first glance (though it would still be nice to playtest; it has potential, especially since we can use code to make it work), and, most importantly, the sprites in this game can flip around on screen. So even when the sprite is running left on the screen, the player would be pressing back with their left foot, and the focus became more left-right than back-front.

Therefore, that idea was scrapped for a simpler left-right two-sensor design. On top are some Naruto-themed designs, including ones based on significant clan symbols and weapon designs, and, for one, a symbolically geographical representation of Naruto's clan maps (there was a lot of overthinking).

Our biggest grievance with the two-sensor designs was how untethered they felt, both in relation to each other (they could easily be set up differently or kicked apart) and in relation to the digital game and the other gear. It also raised the question of where the player would stand, at rest or otherwise: a space in between the two could work but would add a significant delay in reaction time, while a space behind the sensors could make the player feel removed. With that in mind, and the map idea, we decided to make a single mat with a sensor on either side. By marking a specific play area, we created a deeper sense of player immersion. Keeping with the colour scheme, red carpet fabric was glued to a sizeable neoprene mat; the tactile raise and feel of the carpet helped players find the sensors with their feet. So that players would not have to lift their feet or leave the play area, the sensors sit under the balls of the feet only; players can rest on their heels and rock forward onto the balls of their feet to activate a sensor. (This movement needs playtesting and research into strain under continued use, and should be adapted accordingly for safe play.)

The three comma-like shapes in the middle form the Sharingan, another nod to the Naruto franchise. Currently it does not do anything, but it could become 1) a jump sensor (up), 2) a duck sensor (down), or 3) a special-move sensor where the player crouches down and presses a hand to it (likely an analog sensor, so any pressure works). The third option also stays true to the franchise: there are seals and moves that require a ninja to press their hand onto the ground.

One critique of the floor mat design was that players intuitively inferred that rocking forward would move the character forward in the game, and rocking backwards would move it backwards. While a cool mechanic, this runs into the same issue as the initial half-crouch concept. The main issue, however, is that the design does not clearly convey that players should rock forward to activate the sensor. This could possibly be fixed by making the back heel an empty silhouette (we want to mark where to stand regardless).

On the day of the presentation we were unable to get the movement mat to work due to a circuit issue; we knew something was wrong when the pressure values came out far too low, with little difference between the rest and active states. Thankfully, right before the class presentations we were able to rearrange the circuit and make it work properly. We set the threshold to 500 so the movement would be easier to control.

Forearm Guards:

image20 image18 image10

The forearm guards were our design fix both for organizing the wires, to allow greater ease of player movement and set-up, and for (somewhat) hiding the circuitry. We always knew it would be best to get the wires out of the way by bundling them along the arm with some sort of band, and upon further research we were able to design a themed cover that hides the wires as well.

These guards are based on Kakashi's Anbu uniform in the storyline (the gloves themselves are also based on Kakashi's design). Many players mentioned that they felt cool and in character while wearing them, and that the guards helped with immersion. The wires were organized, separated, and attached to the inside of the guard with electrical tape; organizing them also made it easier to keep track of how the sensors were connected to the circuit.

To fasten the guard to the arm, two bands were created. Black poster paper was used for the band permanently attached to the bottom of the guard near the wrist. The top band, on the other hand, was made from neoprene and could be placed anywhere to adjust to different-sized arms. Both bands fasten with Velcro dots.

A second prototype would likely have the forearm guards made of a mixture of foamcore and neoprene, and properly painted and textured.

*The neoprene strips were too short, so multiple strips had to be sewn together. We also ran out of neoprene; otherwise we could have used it for all of the straps.

Circuit

image40

image41

*We taped down the breadboard so it wouldn’t get knocked over during demos.

Code

Link to code: https://gist.github.com/notbrian/f1bdf661cceba52939de42252c553623

The code runs through a few steps during every loop:

  1. Checks the value of every pin (whether each circuit is closed or open).
  2. Loops through the states of these pins and presses the keys associated with them.
  3. Delays the program 10 ms (required for the game to register actions properly).
  4. Releases all the keys that were pressed.

When designing the code, one of the most important qualities we wanted to uphold was responsiveness. This matters especially in fighting games like Smash, where milliseconds count when performing moves and any lag produces behavior the player doesn't expect.

Originally the code had a delay associated with every key press and key release. However, this led to issues in play. Firstly, the player couldn't use more than one key at once, which made the gameplay unresponsive. Secondly, the player couldn't use "smash attacks" (extra-powerful attacks triggered only when the player presses punch + a direction at the same time) or "side specials" (attacks triggered by special + a direction). These attacks are a crucial part of Smash Bros gameplay, which led to a rewrite of the fundamental design of the code. The final code ended up much more compact and elegant than the early prototype.

Critique & Presentation

Video of a playtest

image24 image42 image37 image10 image26 image7 image2 image36 image30 image43

Overall, the reception and critiques were positive. One thing to note was that there were two main types of players: those who were familiar with Naruto, and those who had no clue what this was about. Somewhat disappointingly, this crowd largely consisted of the latter: these folks were enthusiastic about the experience and the idea, but lacked the nostalgia and excitement of the former. An explained idea, after all, can never replace the power of nostalgia. (Someone called the Rasengan "the blue cloud!", which we thought was funny.)

One viewer brought up using this in either a multiplayer video game, or in a table-top game with mini-holograms (referencing Gatebox’s virtual assistant or those pyramid displays) of the characters that would replicate jutsus accordingly.

A player also suggested adding to the UI design of the game: showing which symbols the player was making at the top of the screen (to keep track of combos and give immediate feedback). Many games (especially tutorial modes) already have this, and we agree it would be beneficial to either find one or code a separate overlay on top of our emulator (sort of like the basketball scoreboard from Shilo's group). This was originally brought up during development but, with the small time frame, wasn't able to be pushed out in time.

There was one isolated incident where the person had really big arms and we weren’t able to put the gloves on them properly, which was disappointing all around.  

One observation we made while play-testing was players' tendency, once they figured out where the triggers were, to skip the full hand seal and simply connect the triggers with minimal movement. This really isn't a surprise: gamers consistently find loopholes in game systems to improve their reaction time with minimal energy (see: moving only the sensored limb in Just Dance, GPS-spoofing Pokemon Go, etc.). For that reason, it's debatable how much of this is a design flaw and how much is simply unavoidable without more advanced sensing techniques.

Related works

Flightstick controls for D.Va on Overwatch

image35

This alternative control method follows the same vein of thought as ours: bringing an in-game mechanic/motion into the real world. In this case, flightsticks matching the ones D.Va uses to pilot her mech in-game are wired up to Overwatch as the controller.

One comment points out a flaw: "I know I'm being picky, but it always bugs me that the flight stick input doesn't match what D.va is actually doing on screen." We ran into the same issue in our work: the pre-programmed controls for Naruto in Super Smash Flash 2 are limited to punch, kick, grab, and block, not the full set of special jutsus, and there's no way to find a Naruto game without those basic moves, since they are essential to every fighting game. A solution short of coding a game ourselves would be to create an interface that also allows punching, kicking, and so on alongside the special jutsus (adding extra physical controls).

That being said, we would ideally find a more jutsu/combo-based game.

The CaptoGlove Game Controller

image23

This glove-based controller was a good source of inspiration. The glove uses bend sensors and a neat implementation of haptic feedback to create an interesting game experience. On its Kickstarter page, one of the pledge goals was to add a programmable pressure sensor to the thumb of the glove; unfortunately, they did not reach that goal. It is striking that, in this class, we learned to make pressure sensors in parallel with a commercial pressure-sensor implementation pitched at over $40,000 USD.

Although we also thought of creating bend sensors with Velostat, we chose not to because we wanted our players to have free movement of their hands in order to move from one hand sign to the next. The Naruto hand signs are far more complex gestures than the CaptoGlove's. The CaptoGlove also covers only one hand, while we wanted to create a relationship between both of the player's hands, so we chose switches: to complete a switch and trigger an action, both hands must be present and meet. It is extremely difficult to find a game controller, classic or glove, that requires both hands to make contact.

Nintendo Power Glove

image17

The Power Glove is a glove controller released in 1989 for the NES (Nintendo Entertainment System). The player makes hand and finger motions to control the game. Conductive ink along the fingers detects how flexed they are, while two ultrasonic speakers on the hand transmit signals to three receivers placed around the television set; through triangulation, the system determines the yaw and roll of the hand. Unfortunately, the glove ended up a commercial failure due to its poor tracking and difficulty of use.

Unlike ours, the Power Glove uses wireless ultrasonic technology to transmit its data and also tracks hand position and rotation, whereas our gloves use switches to translate actions into the game. Compared to the Power Glove, our implementation is much more responsive and intuitive.

Future Exploration

As discussed above, a different Naruto game could be explored (Shippuden was suggested); different games altogether were also suggested in the critique.

The UI/tutorial overlay would also be incredibly useful, improving the gameplay experience by clearly showing which buttons are being pressed.

Exploring different materials for the build could also improve build quality, as would soldering wires instead of using alligator clips; that would make the visible wiring under the guards more aesthetically acceptable.

The project could also be developed into a more closed product by attaching the Arduino to the arm guard with a battery power source. It would then have to communicate with the computer wirelessly, either through a direct Bluetooth/Wi-Fi connection or via a second connected Arduino.

Blast From The Past Arcade

 

screen-shot-2018-11-19-at-1-36-18-pm


Tyra D’Costa | Ola Soszynski | Kiana Romeo | Dimitra Grovestine

Project Description 

For the third experiment, our team studied a variety of conductive materials that could be used as sensors. We applied these sensors to wearable controllers in order to give them a larger purpose in gameplay and in design for health. Our process began with a discussion on a topic we all found nostalgic: our childhood. We all remembered playing Miniclip and Disney games when they first came out. Being among the first generations to grow up with technology, the topic felt both fascinating and relevant to us. As we reminisced about our favorite childhood games, we also began to talk about the very real effects they had on the social health of the millennial generation. Many of us remember turning down the park or our friends to stay inside and play computer games or watch Netflix. Soon after, social media would further this drastic change in how children play, interact, and socialise.

screen-shot-2018-11-19-at-1-39-12-pm

We also discussed some of the stereotypes and harmful effects associated with gaming for long periods of time: anti-social behaviour, stress in the tendons of the hands and forearms, reduced mobility, and so on. Our focus was to think about how we could reintroduce the games we loved so much as children while addressing some of these issues. We wanted to create buttons that either engaged a group of individuals together or encouraged an individual to engage in physical activity. However, we wanted to ensure that these engagements did not have negative long-term effects on the body, and that the controllers were ergonomically designed to enhance the user's experience.

img-4901

Materials and Design

Using a Makey Makey microcontroller eliminated the need for any code-based work, so we could focus our attention on building and designing the conceptual aesthetic of our project. Instead of making a single game controller, we decided to make several and host a 2000s-themed arcade. This involved analysing various elements of the games we chose: we observed the movements, directions, and positions of each game in order to design controllers that were intuitive to the player and met our design goals. Our team concluded that it would be useful to replace the traditional controller with wearable controllers in order to optimise the fluidity and interactivity of the designs. We thought it would be fun to make the controllers funny outfits from the early 2000s, so we created mock-ups of potential outfits and brainstormed how they could be transformed into game controllers. Conductive thread became very useful when we wanted to hide wires and create a more seamless look. To contribute to the overall theme of childhood fun and nostalgia, we added prizes and a ticket booth, which brought a sense of good-spirited competition and challenge to the user experience.

Design Goals 

  • Economic Design
  • Addresses social issues
  • Design for health
  • Interactive user experience
  • Kinetic user experience
  • Childhood Fun and Nostalgic Themes

Materials

  • Used clothing
  • Makey Makey microcontroller
  • Iron-on conductive fabric
  • Conductive thread
  • Alligator clips

Outfit Mockups

2003 – Rocker Look
2006 – The Dude
2010 – The Hipster
First Sketch of the Arcade User Interaction Design

How it all Works

In the Blast from the Past Arcade, every game used a Makey Makey, a small board that works like a USB keyboard and sends the left/right/up/down arrows, WASD, and spacebar signals to the computer. When the circuit for any of these inputs is closed, the board sends the command to the computer as if that key had been pressed on the keyboard.

Makey Makey plugged into our Happy Wheels game controllers.

Using the Makey Makey was quite easy since, when connected to a computer, it takes over the keyboard controls that would otherwise be used. This allows for more movement while playing video games: the player is not required to sit directly in front of a screen and push buttons, but can get up and move around depending on how the board is hooked up. As the photo of the Makey Makey shows, simple alligator clips and wires of various lengths can be attached to make more interesting controllers.

Makey-Makey Reference page

For our first test of the Makey Makey, we used the games we had researched: games requiring an assortment of keyboard controls. The first game we tested was Super Mario Bros, a classic keyboard/controller game.

Video Documentation – Learning to use the Makey-Makey: Our First Success

Finally, when we put the games together, we had to ensure the controls on the Makey Makey corresponded with those of the specific game. For example, with the Sandwich Stacker game, the "left" and "right" arrow inputs on the Makey Makey were connected to the two players wearing the suspenders and the glasses, while "ground" was connected to the player wearing the bow tie. The grounded player therefore simply had to make skin contact with the other players to activate the left and right controls.

Glasses controller (far left), bow tie controller (middle), and suspenders controller (far right) in use.

For the Club Penguin Dance game, the player was connected to ground at each hand through the edgy gloves. The player wore a rock'n'roll t-shirt with two conductive iron-on shoulder pads connected to the up and down arrows, while side movements were controlled by iron-on conductive fabric at the hips of the belt. The gloves grounded not only themselves but the player's entire body. Before getting the gloves to work, we had tested a single wristband; because the wristband grounded the entire body, players could still use either hand to complete a switch and produce the proper game movement.

Edgy gloves used as a ground to complete the circuit.
The full Edgy Club Penguin Dance game controller outfit

For the Happy Wheels game we made two floor controllers modelled after car pedals, which we thought was an inherently intuitive design for acceleration and braking. If the user is wearing the conductive ground wristbands, they can touch the left and right sides of the hat, as well as the centre of the bandana, to front flip, back flip, and reverse. While designing these controllers we paid close attention to our research to ensure the design was safe for long periods of game time. We made sure to keep the neck aligned; the bean bag chair came in handy here, as we were able to adjust the game screen as needed. Secondly, the design keeps the elbows bent at the player's sides during play. Lastly, the player's hand is always in the same plane as the forearm, which is key to relieving stress points in the hands and tendons.

Game controllers for the Happy Wheels game
Floor pedals for the Happy Wheels game

Research

When conducting research for our Blast from the Past Arcade, we had to consider many factors. First, and possibly most importantly, we had to carefully select the games we would use. These games had to meet certain criteria:

Nostalgic games from different eras of our childhood:

We had to ensure the games would resonate with everyone who tested our arcade, and with us in particular. We tested many games and had to dig deep into the internet to find them (some are over a decade old!). Furthermore, since the arcade was about revisiting the past, we had to exclude some of our beloved games because they coincided with the same era. Research on release dates was necessary so we could successfully spread our arcade across many years (around 2005-2012).

Games that use any combination of keyboard controls (spacebar, left, right, up, down buttons):

Although we had many games in mind, the Makey Makey requires keyboard controls to work: a game played with mouse movements would not work at all, while a platform game using the left and right keys would. Furthermore, while researching, our group decided that instead of looking for games that could potentially use mouse movements, it was more authentic to use games with keyboard controls.

Links to games we had tested with the Makey Makey (games highlighted are the ones we used in the final prototype):

Ergonomics

The next step in our research was to figure out the science of how bodily movements affect our health, and how different game controllers and the way we hold them affect gameplay. The most logical place to find information on this was research conducted by e-sports specialists. Our research found that posture and ergonomics are directly related to successful gameplay: certain postures and movements relieve cramps and muscle pain. Furthermore, watching the screen is a very important part of playing, but staring at screens for long periods is known to strain the eyes; placing the game at eye level, with the screen at least an arm's length away from the player, makes the gaming experience much better. Wrist and finger positioning should likewise be easy and intuitive to use.

When we designed our interfaces, we took all of these findings into account. For example, with the Happy Wheels game, the player sits on a comfortable bean bag that they can position to their liking. The controls include intuitive foot pedals, one to step on the gas and move the character forward and one to brake. This frees up the player's hands to reach up to the hat and press it to control certain in-game actions, and to simply touch the bandana on their chest to activate it. We thought carefully about the controls for all of our games, both relating the functionality to the theme of the game itself and making sure the controls were comfortable and ergonomic enough to make playing more fun.

Research sources:

http://www.1-hp.org/2016/10/28/esports-health-it-starts-with-ergonomics-and-posture/

https://medium.com/what-the-tech/ergonomics-of-gaming-gear-a07058f88bf3

Documentation

Click to see Club Penguin Dance + Rocker Look

Activates the body and encourages users to move and dance in order to gain points, win tickets, and get prizes!

Creating the 'Edgy Gloves'

Click to see Happy Wheels + The Dude

Encourages the user to step outside the traditional hand positions found in most game controllers. This design relieves stress that could build in the tendons and muscles of the hands.

Creating the Happy Wheels floor pedals

Click to see Sandwich Stacker + The Hipster

In this game users have to interact and work as a team to keep the game going, beat high scores and win big!

Wiring up the Sandwich Stacker game with Conductive Thread

*Click to see the Full Arcade Experience*

A group of three working together to set a high score on Sandwich Stacker
Nick all ready and wired up to play Club Penguin Dance
Explaining how to use the game controllers
Ola explains how to use the game controllers for the Sandwich Stacker game

 

EX3 – Rainy Weather Controller

 

Siyue Liang (3165618)

Mahnoor Shahid  (3162358)

Jin Zhang (3161758)

DIGF-2004-001 Atelier I: Discovery


Documentation: Weather Controller

img_7092unadjustednonraw_thumb_dea

Project Description

We created an interactive umbrella that a person can use to control rainy weather. The umbrella has three sensors that control the projection: a stretch sensor, a proximity sensor, and a pressure sensor.

The stretch sensor triggers the starting and stopping of the rain, the pressure sensor triggers the lightning and thunder on the screen, and the proximity sensor controls the speed of the rain.

We incorporated these sensors into the structure of the umbrella and the way it moves when opened and closed. For example, the stretch sensor was discreetly attached to an arm of the wireframe; when the umbrella opened, the sensor stretched with it and triggered the rainfall.

The pressure sensor was glued to the bottom part of the handle, where it is natural to press and hold the sensor while gripping the umbrella.

The proximity sensor was taped to the fabric of the umbrella, close to where a person's head would typically be, making it easy to control the rain speed with head position.

 

Materials and Techniques

unadjustednonraw_thumb_de6


Stretch Sensor

  • Stretch sensing fabric
  • Needle and thread

img_7096

Pressure Sensor

  • Velostat (senses pressure)
  • Neoprene sheets
  • Sewing machine

img_7097-2

Proximity sensor

  • Bare conductive paint
  • Paper, paint brush
  • Capacitive Sensing library in Arduino

img_7099-2

Other materials used

img_7072

  • Umbrella
  • Arduino and breadboard
  • Male to male wires
  • Two 1 MΩ resistors
  • One 10 kΩ resistor
  • One 220 kΩ resistor
  • Alligator clips
  • Needle and thread
  • Sewing machine
  • LED lights
  • Tape and a glue gun

Obstacles with code

We had some issues with the code in Processing. For example, the soundtrack wouldn't play from the setup() function, so we tried putting it in draw(). This worked, but introduced a minor issue: if the person pressed the sensor too frequently, the sound would overlap repeatedly and take a long time to finish playing. Another issue was that the values generated by the sensors were so unstable that it was difficult to settle on a fixed trigger range for each sensor, as the input kept changing.

Since we had never worked with CapSense before, we had to experiment a lot in the beginning to make the code work. The CapSense code caused some issues with serial input: each time a new sensor was added, its input wasn't being read in the CapSense sketch's serial port, even though it was read fine in other Arduino sketches. It took a while to debug; in the end we commented out the millisecond-timing code and moved the reading of the analog inputs into loop().

 

Obstacles with sensors

The stretch sensor didn’t have a large difference in its resistance when stretched and un-stretched so we tried to used a higher resistor.

The conductive paint sensor needed a very high resistance to give a larger proximity range, so we combined two 1 MΩ resistors on the breadboard.

 

Last Minute Issues

  • Due to the fabric material of the umbrella, it was incredibly difficult to glue and tape the conductive paint sensor onto it. We tried multiple tapes and eventually found one that stuck longer, though it still peeled off after a while.
  • The stretch sensor broke the umbrella wire, so we had to re-sew it onto another arm.
  • The stretch sensor lost much of its elasticity, so we needed to edit the Processing code multiple times to match the serial port inputs correctly.
  • The proximity sensor wasn't working smoothly for some reason, but eventually worked properly.
  • We tried using conductive thread as a wire, which caused short circuits.
  • The rain sound wasn't working in Processing, though we got the thunder sound to work with the lightning. Sometimes the audio would unexpectedly stop working for no clear reason.

 

Discoveries and Lessons Learned

  • Using the CapSense library, we discovered that almost anything conductive can become a proximity sensor.
  • We learned how to incorporate the sound library in Processing.
  • We gained a lot of experience working with fabric and sewing techniques.

Overall, it was a lot of fun working with sensors and we learned a lot in the process. Doing this project, we realized there is a huge space of sensors left for us to explore and try out.

 

Project Context (Inspirations & References)

We got the idea for a weather controller from the rainy weather we'd been having recently.

Neoprene Pressure Sensor:

https://www.kobakant.at/DIY/?p=65

  • We used the conductive thread technique in this tutorial and elongated the shape of our sensor to wrap around the handle of our umbrella

Conductive Paint Proximity Sensor:

https://www.bareconductive.com/make/building-a-capacitive-proximity-sensor-using-electric-paint/

  • This tutorial and YouTube video were very helpful in building a basic proximity sensor with the CapSense library

Troubleshooting with the CapacitiveSensor Library:

https://playground.arduino.cc/Main/CapacitiveSensor?from=Main.CapSense

  • Learned what resistor we should use for the desired response, and how the library works.

Sewing Machine Guide:

http://www.singerco.com/uploads/download/HD%20Series_ANT_Generic_QSG_F_0222_lo-res.pdf

  • We needed to use a sewing machine to construct our pressure sensor neatly.

Using Sound Library in Processing:

https://poanchen.github.io/blog/2016/11/15/how-to-add-background-music-in-processing-3.0

  • Used this tutorial to import rain and thunder sound files in our processing code.

 

CODE,  VIDEO & IMAGES

 

  • ARDUINO CODE

https://github.com/mahnoorshahid/EX3-Group-Project-/tree/master/Arduino:P3/capSense%20copy

 

  • PROCESSING CODE

https://github.com/mahnoorshahid/EX3-Group-Project-/tree/master/Processing/sketch_rain

 

References:  https://www.openprocessing.org/sketch/595792

 

  • FINAL PROJECT VIDEO

https://drive.google.com/file/d/1rXD5tPSmXtyRI2Lp8CuYDridtAz8vd2r/view?usp=sharing

 

Rain starting with the stretch sensor and the speed is increasing with the proximity sensor:

screen-shot-2018-11-15-at-12-56-53-pm

Thunder sound and lightning with the pressure sensor:

screen-shot-2018-11-15-at-12-57-35-pm

Fritzing Diagram:

screen-shot-2018-11-14-at-3-39-46-pm

Arduino Basketball: Arcade Basketball

Francisco Samayoa, Isaak Shingray, Donato Liotino, Shiloh Light-Barnes

Kate Hartman

DIGF-2004-001

November 15, 2018


Project Description

Our proposed sensing method uses pressure-sensing fabric and digital input buttons. We will be using pressure-sensing fabric, conductive fabric, conductive thread, wires, and electrical tape; for props, we will use a basketball and a basketball net. In terms of sensor construction, the shot-success sensor will be a broken circuit woven into the mesh of the netting, completed when the ball passes through. The backboard sensor will be constructed of pressure-sensitive material in order to provide an analog signal. Finally, the foot-position switches will be incomplete circuits, completed when stepped on by the player. The backboard and foot switches are both analog; the mesh is digital.

In the end we had to glue conductive fabric onto the basketball, because the fabric already on the ball was insufficient to complete the circuit. The mesh had to be made tighter for the ball to be sensed by the conductive thread. The foot switches were initially digital, but we made a conscious decision to change them to analog: rather than having players wear aluminum foil on their feet, the players simply have to step on the switches.

On the screen there will be a scoreboard that tracks the points scored. There will also be a one-minute timer, during which the player must score as many points as possible. The score of each throw is calculated from whether the basketball passes through the hoop, the force with which it hits the backboard, and which foot switch the player is standing on. This simulates the actual basketball experience, with 2-point and 3-point lines. When the ball hits the pressure-sensing fabric on the backboard with enough force, a 1.5x multiplier is applied to the basket scored. If there were more time, we would add a power score proportional to the pressure the backboard senses.

Our vision for the presentation consists of attaching the net to the whiteboard and setting up the foot switches on the ground. The scoreboard will be displayed using a projector. As in the image, the sensors on the ground will be placed in a semicircle in front of the net, at the different distances. Look for this product at your local arcade, coming soon!

Project Code

https://github.com/avongarde/Atelier/tree/master/Assignment%203


Photos + Videos
November 13

[Photos: ball, breadboard, net]

November 15
[Photos: img_1576–img_1579, img_1584, img_1585]


Project Context

Demortier, Sylvain. “Sigfox-Connected Basketball Hoop.” Arduino Project Hub, 4 Sept. 2017, create.arduino.cc/projecthub/sylvain-demortier/sigfox-connected-basketball-hoop-47091c?ref=tag&ref_id=basketball&offset=0.

This project helped guide the aesthetic of our final product. In the picture of his project, the hoop hangs from the wall with the Arduino tucked behind the backboard; this is ideal because the board wouldn't get damaged or get in the way of play. If you look closely you'll also see the positive and negative wires (in his case an IR receiver and emitter) on the side of the net, indicating that the ball triggers the score when passing through the hoop. This is the approach we opted for as well.

Instructables. “Arduino Basketball Pop-a-Shot: Upgrayedd.” Instructables.com, Instructables, 10 Oct. 2017, www.instructables.com/id/Arduino-Basketball-Pop-a-Shot-Upgrayedd/.

Another Arduino-based basketball game, and visually impressive as well. The creator even placed the scoreboard on the backboard itself! Visually, this was a project we wanted to imitate. However, this one uses a distance sensor to count the buckets, while we decided to use pressure-sensing fabric. We did like the idea of a digital scoreboard, so we referenced this example's scoreboard approach, but used p5.js to create it instead of a quad alphanumeric display.