Air DJ

Experiment #5: Air DJ

Emilia Mason, Roxanne Henry, & Kristy Boyce

Project Description

Emilia, Kristy, and Roxanne were inspired to create a fun, innovative way to make music and play with your friends. The Air DJ is a gesture-based musical device that lets you load up your favourite sound effects and assign them to gestures that feel intuitive. It runs on the Adafruit Feather M0 combined with the Music Maker FeatherWing, with a SparkFun ZX gesture sensor as the input. Then you take your device with you and jam wherever and whenever. Up to six sound effects can be loaded per device.


We want to bring fun music to everyone, everywhere.

PROMOTIONAL VIDEO: https://www.youtube.com/watch?v=WxZOndgZ8vM&t=6s

 

Code: https://github.com/rh11lp/rh11lp.github.io/blob/master/experiment5/experiment5.ino

Fritzing Diagram: experiment5_bb

 

 

Parts used:

3 SparkFun ZX Distance and Gesture Sensors

3 Adafruit Feather M0s

3 Adafruit Music Maker FeatherWings

3 Lithium-Ion Polymer (LiPo) Batteries (3.7V 1000mAh)

3 Micro SD Cards

4 Sets of Speakers

Beforehand: Preparations and Testing

What needs to be done ahead of time?

To make this experiment possible, we needed to get the SparkFun ZX Distance and Gesture Sensors on time, so we booked one full day to drive to Markham and buy all the necessary pieces.

During our conversations, we decided to 3D print all three enclosures, which meant going to 100 McCaul ahead of time to print them, and having the designs ready in advance.

To test the Air DJ, we booked Thursday to work together before presenting the experiment on Friday. We also decided to meet during the weekend for our second testing session and to film a promotional video.

 

Do you need extra batteries?

No, we won’t need extra batteries. We will recharge the three lithium-ion batteries overnight after each time we test the devices.

What goes into your repair kit?

-Electrical tape

-Hot glue

-Roxanne 🙂

The assembly of our experiment was not complicated at all. The recurring issue was having to reset the Music Maker FeatherWing, which was necessary every certain number of swipes (about every 20). Having to reset the FeatherWing meant we needed easy access to it, so we decided to attach it with velcro.

 

The Plan

 

How will you document your own user experiences while using the device? Notes? Journaling? Photos? Video? Audio recording?

 

We will be using video and photos of us testing the Air DJ. We also want other people to interact with this experiment, and we will be interviewing them as well.

 

What will you do to ensure that your data collection methods are consistent for each group member?

Data collection method: working together during every session to make sure we collect consistent data.

 

For each group member, what are the dates & times of testing?

Roxanne: one 4-hour session on Thursday and one 4-hour session on Sunday

Emilia: one 4-hour session on Thursday and one 4-hour session on Sunday

Kristy: one 4-hour session on Thursday and one 4-hour session on Sunday

 

The Air DJ is an interactive device that works best when played with others and with a backing track.

For this reason, we have chosen to do two 4-hour sessions in which all three of us will be present, using and testing all three devices at the same time.

 

Will you be doing your day-to-day activities or doing a special activity?

 

We will be meeting together in the Digital Futures Studio. The idea is to have all three devices working at the same time, with a backbeat playing, so we can practice our music skills and test the Air DJ.

 

Process Journal

Deciding what to do:

For Experiment 5 we had so many ideas that it took us a few days to fully decide what our experiment was going to be.

Some of our ideas:

-King of the Hill game

-Light-up shoes

-Personal/wearable drum set using EL wire

-Wearable bongo butt: you literally tap that ass and make beautiful music

-AR glove: force feedback, detecting collisions with virtual objects (magnetorheological)

-Haptic undapants (what?)

-Conductive-pen greeting card with copper tape

-Laser tag phasers using IR sensors

-Virtual DJ set, an air DJ

Emilia mentioned she wanted to make a wearable, or something like Behnaz Farahi’s work:

http://behnazfarahi.com/breathing-wall-ii/


When we finally decided to develop the Air DJ, we made a list of the materials we would need:

2 more Music Maker FeatherWings (Kristy already had one from the Mansplainer project)

3 micro SD cards

1 battery (Emilia already had two from the Love Corner project)

3 proximity/gesture sensors

Since we were going to need more Music Maker FeatherWings, Kristy immediately ordered two more from the supplier she got hers from. That was very convenient!

Kristy mentioned it was very easy to use and sent us this tutorial:


https://www.adafruit.com/product/3357#tutorials


 


 

 

Our package arrived!

Even though we had the Music Maker FeatherWings, we were still deciding what sensor to use. At first we thought the proximity sensors we had in our C&C boxes would be OK, but then we decided to consult Afaq about which sensor he would recommend for our idea.


The SparkFun ZX Distance and Gesture Sensor was the winner! Afaq recommended other sensors as well, but because of price and availability we chose the SparkFun ZX.

The only “issue” we had is that they were only available at Canada Robotix, and given our timeline we knew ordering online was not going to be our best option. We booked Saturday, December 2nd for a road trip to Markham!

During Friday’s class, Roxanne and Emilia had the chance to speak with Nick about the Air DJ idea. Nick recommended playing with the size of our device: instead of making it small, we could make it a little bit bigger, which would make the user’s experience more interesting because of the larger movements. We totally agreed!

That Friday (Dec 1st) we started to look for different enclosure ideas:

https://www.thingiverse.com/thing:571270/#files

 

 

https://www.amazon.ca/Fisher-Price-Classics-Record-Player/dp/B003CGVCXS/ref=sr_1_sc_1?ie=UTF8&qid=1512159502&sr=8-1-spell&keywords=Fisher%20Price%20Classics%20Record%20Playe

 

 

Early Sketches

 

 

After researching possible enclosure ideas, we decided our enclosures should be 3D-printed turntables. We really liked the idea of 3D printing something that looked like a record for our Air DJ device.

 

This project was full of adventures! On Saturday, Kristy and Emilia went to Markham to get the three ZX gesture sensors and three micro SD cards.


 

We also tried to buy these speakers, but they were not available anymore and the project was becoming waaaaaaay too expensive. We decided to use regular speakers for the Air DJ.


 

 

On Saturday night we all met at school, and Roxanne started working on the code and explaining how it works.

 

From the very beginning, we had problems with the parts. One of the Music Maker FeatherWings was broken, and we had to use tape to hold the micro SD card in place.

https://learn.sparkfun.com/tutorials/zx-distance-and-gesture-sensor-smd-hookup-guide?_ga=2.251769418.219913925.1512157062-1708196433.1505250634

 

https://vimeo.com/247238710

 

Programming and Assembly:

Programming for the SparkFun ZX gesture sensor was surprisingly easy. SparkFun was kind enough to provide a comprehensive, well-rounded library containing useful public methods such as readGesture() and readGestureSpeed(). It was fairly trivial to write a switch statement that branches on the available gestures (sort of like an if statement, but cleaner for comparing different possible values of the same variable) as they were executed. Then it was only a matter of reading the speed of the gesture to branch off into different sound-effect options.
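
For reference, the core of that logic looks something like the sketch below. This is a simplified reconstruction rather than our exact code: the pin numbers follow Adafruit’s Music Maker FeatherWing guide for the Feather M0, while the file names and the speed threshold are placeholders.

    // Simplified Air DJ loop: swipe gestures trigger sound clips,
    // with swipe speed choosing between two clips per gesture.
    #include <Wire.h>
    #include <ZX_Sensor.h>
    #include <SPI.h>
    #include <SD.h>
    #include <Adafruit_VS1053.h>

    #define VS1053_RESET -1   // FeatherWing: reset not wired
    #define VS1053_CS     6
    #define VS1053_DCS   10
    #define CARDCS        5
    #define VS1053_DREQ   9

    ZX_Sensor zx = ZX_Sensor(0x10);   // sensor's default I2C address
    Adafruit_VS1053_FilePlayer musicPlayer(VS1053_RESET, VS1053_CS,
                                           VS1053_DCS, VS1053_DREQ, CARDCS);

    // Stop whatever is playing and start the new clip immediately.
    void trigger(const char *file) {
      if (!musicPlayer.stopped()) {
        musicPlayer.stopPlaying();
      }
      musicPlayer.startPlayingFile(file);
    }

    void setup() {
      zx.init();
      musicPlayer.begin();
      SD.begin(CARDCS);
      musicPlayer.setVolume(20, 20);                        // lower = louder
      musicPlayer.useInterrupt(VS1053_FILEPLAYER_PIN_INT);  // background playback
    }

    void loop() {
      if (zx.gestureAvailable()) {
        GestureType gesture = zx.readGesture();
        uint8_t speed = zx.readGestureSpeed();
        bool fast = (speed < 15);   // placeholder threshold, tuned by ear
        switch (gesture) {          // cleaner than chained if statements
          case RIGHT_SWIPE: trigger(fast ? "/right1.mp3" : "/right2.mp3"); break;
          case LEFT_SWIPE:  trigger(fast ? "/left1.mp3"  : "/left2.mp3");  break;
          case UP_SWIPE:    trigger(fast ? "/up1.mp3"    : "/up2.mp3");    break;
          default: break;
        }
      }
    }

With two clips per swipe direction, this is how a single device covers six sound effects.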

Following this, the Music Maker Wing needed to be configured to play the sound clips. Sana, Kylie, and Ramona had discovered that the wing only accepts headphones with a TRS jack, meaning any speaker cable that also carries a microphone line will not work. This saved us a lot of troubleshooting time, since we all regularly use iPhone headphones, which are TRRS.

One of the first problems we encountered after that was that WAV files sounded really distorted. We didn’t have time to test whether file size impacted this, so we decided to simply use MP3s for everything. At this point, we realized that smaller file sizes were optimal and tried to keep all of our files at 50 KB or less. The next challenge revolved around the feel of the sound bites we wanted to use. Since the Air DJ is gesture based, with no haptic feedback, we needed to make sure the sounds associated with each gesture were intuitive and felt good. Once we accomplished this, it was only a matter of implementing the speed-based branches for each gesture.

We had to decide whether to play each file in full or allow the Feather to continue operating while the sound bite played. In the end, the sound files were so short, and the sensor reactive enough, that we settled on startPlayingFile() instead of playFullFile(). This allowed the user to play more quickly, but also put the wing at risk of overloading and crashing. Allowing the full file to play wasn’t significantly less likely to crash the wing, so the tradeoff was worth it in the end for the improved experience.
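
The difference in a nutshell, with the same musicPlayer object as in the sketch above (file name is again a placeholder):

    // playFullFile() blocks: nothing else runs until the clip finishes.
    musicPlayer.playFullFile("/right1.mp3");

    // startPlayingFile() returns immediately, so loop() keeps polling the
    // sensor; stopping any clip still playing lets a new gesture cut in.
    if (!musicPlayer.stopped()) {
      musicPlayer.stopPlaying();
    }
    musicPlayer.startPlayingFile("/right1.mp3");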

Speaking of crashing, boy do those music wings crash a lot! This made troubleshooting especially challenging, since the life of the wing before crashing was limited to 5-10 gestures, depending on the sound bites it was trying to load. Some sound bites reacted better than others. In one case, a tiny 40 KB MP3 file was crashing the wing almost constantly; we shortened and compressed it a bit, and it was suddenly much more responsive. We have yet to figure out whether there was something corrupted about the file before, or whether the size alone caused the problem (though we had many sound files around 50 KB, so it’s not likely the size alone). Our theory is that the music wing wasn’t meant to load and unload as rapidly as we demanded of it.

The problem was always the same: the gesture would register, the sound bite would begin to load from the SD card, and then the wing would stop working entirely. The loop function would stop executing too, which made it clear that the entire Feather had crashed. A manual reset of the wing would then be necessary, after which the sound bite that had been loading from memory would finally play, and the wing would be good to go for another 5 to 10 gestures. About once every 8 resets, though, the Feather would need to be powered off and back on. I suspect that at these moments there is simply too much corruption for it to recover after a reset.

These were some of our worst hardware problems. A simpler, though no less devastating, problem was that one of our music wings arrived with a broken SD slot. The pin that holds the SD card in place was absent, so the card had to be taped into its chamber in order to be read. This was unreliable and frustrating to work with, since it seemed that even jostling the card could disconnect it, and the Feather required a reset when this happened.

 

The Enclosure and Branding

After figuring out how to make the sensor and the Music Maker FeatherWing work, we decided to make the coolest enclosure ever made!

From the very beginning we knew we wanted to 3D print the enclosures, but the 3D printers in 205 Richmond were broken at the time, so Kristy decided to go to 100 McCaul to get them printed there.

The enclosures ended up taking from Monday to Thursday to print, due to a failed print on Monday in our lab and the size of the print job. In total we made three versions of the enclosures: the 3D-printed ones (in grey below, later painted black), a wooden backup, and then, after testing, the melted-record top Emilia can be seen working on below.

One of the problems with the casing (more on this later) is that we didn’t consider that a Feather mounted on a square protoboard wouldn’t be able to sit close to a circular wall. This meant that plugging in headphones wasn’t as easy as we had envisioned. We also didn’t think of putting in a hole for the charging cable.

New enclosure with stand
Old enclosure with stand

Otherwise, the physical design of the product looked and felt really great. The rippled look of the vinyl records gave the device a kind of unique affordance which we might not have been able to convey otherwise. A small adjustment we should have considered sooner in the production, however, was the placement of the Feather.

Since we had to open the case so often to reset the Feather, and putting the lid back on so it aligned with the sensor wasn’t a trivial task, we should have mounted the entire device on the lid so we could adhere the sensor to it. With the Feather and sensor adhered to the bottom of the body, it was easy to jostle the solid-core wires a bit too much, and several of the connections ended up snapping at certain points during the testing.

 


Wood backup prototype


 

 

Logo Designs:


 

 

Survey and User Testing

Survey Link: https://www.surveymonkey.com/r/QKB5MWB


After Testing Session:

We tested the Air DJ amongst ourselves and students at OCAD. In general, everyone was very excited to interact with the device and seemed to enjoy using it. There is a steep learning curve in terms of actually making anything that resembles music. We found that it felt more or less intuitive and organic to use depending on the beat of the backing track that was playing.

The most frustrating part (touched on earlier) was that the microprocessor on the FeatherWing would stop functioning within approximately 20 swipes. This affected its portability and caused us to change the design of the top of the enclosure (mentioned earlier): the 3D-printed enclosure was incredibly tight, which we thought would be fine if you were just opening it occasionally to charge the battery, but which was not feasible when it needed to be opened after a minute or two of use.

Again, to resolve this, we switched to the melted-record top, which we then secured with velcro.

Kristy preferred the older design, while literally everyone else liked the look of the new melted-record top. Regardless of design, functionality needed to prevail. Most people (ourselves included) really enjoyed interacting with the Air DJ; we just wish it would last longer each session. We were also testing with large external speakers, though we had originally envisioned the Air DJ as an all-in-one enclosed product; this was a strictly financial choice, as the project had become quite expensive. Our ideal iteration (especially after testing) would have an enclosed speaker along with the battery, with a small port added on the side for charging.

 

Final Product and Photos From Class Show:


Project Context and References:

We wanted this product to be fun and portable, so we researched other musical devices, things DJs are currently doing, and toys.

After already deciding on the Air DJ, we came across this great device that creates a MIDI interface out of a pizza box:

https://learn.adafruit.com/circuit-playground-pizza-box-dj-controller/overview

Other Tutorials:

https://learn.sparkfun.com/tutorials/zx-distance-and-gesture-sensor-smd-hookup-guide?_ga=2.251769418.219913925.1512157062-1708196433.1505250634

https://vimeo.com/247238710

https://www.adafruit.com/product/3357#tutorials

Keith McMillen is also an interesting musician who creates different sounds and instruments:

https://www.keithmcmillen.com/

A look at what’s currently on the market:

https://store.djtechtools.com/

Mansplainer

Experiment #4: Mansplainer

Kristy Boyce and Tommy Ting

The Mansplainer in Situ.

Mansplainer is an emergency button for use in cases of “mansplaining.” Derived from the Latin Mannus Interrupticus, it is the ancient art of a man (often white) explaining to a woman and/or person of colour how to better do, be, etc.

The button, which is connected to WiFi, triggers a WiFi speaker system to play the soon-to-be-hit track “Mansplainer.”

A lot of our feelings about why one might need a “Mansplainer” station can be summed up in the video “The Handmaid’s Tale For Men”:

https://youtu.be/ciPszqk703k

“This is the story of Manfred, a man just trying to survive in a world under the harsh rule of the feminazis”

 

Circuit Layout

FeatherWing on Feather M0, then out to speaker (the speaker is connected to the FeatherWing with an audio cable; Fritzing did not have a part for audio cables)
Feather with large LED button

Code

https://github.com/livefastdynasty/Mansplainer

 

Supporting Visuals

Process Journal

Day 1 – Brainstorming

We brainstormed a few initial ideas on our first day. We first looked at ideas that inspired us, such as the methane gas detector from last year. We were both drawn to ideas that were ridiculous and added to the “internet of shit.” We then asked ourselves, “what do we hate?” and came up with a list of pet peeves, most of which were masculine traits we thought were disruptive to our daily lives. Two things really stuck: mansplaining and manspreading. Mansplaining was very well documented and succinctly described in Rebecca Solnit’s book Men Explain Things to Me (2014).

We briefly described what these two prototypes would look like. A “Mansplainer” would be a kind of walkie-talkie system: if the system detected a “female,” higher-pitched voice, it would garble the speech into gibberish on the other end, but if a male or lower-pitched voice spoke into the system, it would freely allow the speech to go through. A “Manspreader” would use flex sensors down the legs and a force-sensitive resistor on the bum; when a person sat down and spread their legs on public transport, a shock would be sent. We let these two ideas sit until we met up again.

Brainstorming board

 

Day 2 – Finalizing idea

We decided on the mansplaining walkie-talkie idea. We spoke with Kate, and she suggested that we simplify it; we came up with a button-trigger system instead, since we had to use WiFi to connect the devices and not radio. We quickly settled on this new idea and looked into the things we had to purchase, which included the Music Maker FeatherWing and a big button.

Sketching out interaction.
Rough sketches of button and speaker design.

 

Day 3 – Testing with sample code, a simple button, an LED, and piezo speakers

We first played with two sample sketches to ensure that we had a communicating system between two Feathers. We adapted the PubNub samples from Nick and the LED samples from Experiment 1, changing parts of Nick’s code so that the button only published either a 0 or a 1. We successfully used the button to turn the LED on and off. As you can see in the video documentation, there was a delay between the button press and the LED triggering, but we figured that it was the WiFi connection. We also noticed that after a few attempts, “client error” would show up on the button side of the serial monitor.
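
Stripped down, the receiving side of that interaction looks something like the sketch below. The PubNub plumbing is reduced to a hypothetical readButtonMessage() stand-in (our real code adapted Nick’s PubNub examples), so the sketch compiles and can be tested with a local button instead of WiFi.

    // Receiver side: toggle the LED whenever a "1" arrives from the button Feather.
    const int LED_PIN  = 13;   // onboard LED
    const int TEST_PIN = 12;   // local stand-in for the remote button

    bool ledOn = false;

    // Hypothetical stand-in for the PubNub subscribe-and-parse code we
    // adapted from Nick's examples: returns 1 on a fresh press, -1 when
    // there is no new message. Here it just watches a local pin.
    int readButtonMessage() {
      static int last = HIGH;
      int now = digitalRead(TEST_PIN);
      int msg = (last == HIGH && now == LOW) ? 1 : -1;
      last = now;
      return msg;
    }

    void setup() {
      pinMode(LED_PIN, OUTPUT);
      pinMode(TEST_PIN, INPUT_PULLUP);
    }

    void loop() {
      if (readButtonMessage() == 1) {
        ledOn = !ledOn;
        digitalWrite(LED_PIN, ledOn ? HIGH : LOW);
      }
      delay(50);   // over WiFi, the round trip added the visible lag we saw
    }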

We then switched the LED out for a piezo speaker instead. This is essentially the most basic version of what our final project would look like: a button triggering some sort of tone or music. The piezo speaker required a pitches library, and we got it working by adapting the previous LED code with some sample piezo speaker code. Neither of us had done a lot of Arduino coding in our previous group experiments, so we were pleased with our ability to get a very simple system going on our first attempt.


Testing with piezo speaker
Button testing, the large push button LED was in the mail

 

Day 4 – Music Maker Feather Wing and LED Button, 3D Printing

We wanted our system to play music rather than an alarm, so we purchased the Adafruit Music Maker FeatherWing from an online shop based in Montreal. The Music Maker FeatherWing, which stacks on top of the Feather M0, has a microSD slot and a 3.5mm audio jack for connecting a speaker.

We also bought a bigger button with an LED that turns on when pressed. First, we ran into some problems with the Music Maker’s microSD formatting requirements, but once we figured that out we were able to successfully play a track using the example code. While the new button was being wired up, we combined the existing piezo speaker code with the new example code from the Music Maker library.

We removed all the code relating to the piezo speaker and pasted in the Music Maker code. In the Music Maker example, the play-music call sits in setup(), which we mistakenly did not catch on the first few attempts. Because the call was in setup() instead of loop(), the track would just play once at startup regardless of the button. There was also a lot of unnecessary code in the example, which we either removed or disabled. Once we put everything in the right place, we were able to use the button to trigger the Music Maker to play our track.
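
Schematically, the fix looked like the sketch below. The pins again follow Adafruit’s FeatherWing guide, and a plain wired button stands in here for the 0/1 message we actually received over PubNub.

    #include <SPI.h>
    #include <SD.h>
    #include <Adafruit_VS1053.h>

    #define VS1053_RESET -1
    #define VS1053_CS     6
    #define VS1053_DCS   10
    #define CARDCS        5
    #define VS1053_DREQ   9
    #define BUTTON_PIN   12   // placeholder for our trigger input

    Adafruit_VS1053_FilePlayer musicPlayer(VS1053_RESET, VS1053_CS,
                                           VS1053_DCS, VS1053_DREQ, CARDCS);

    void setup() {
      pinMode(BUTTON_PIN, INPUT_PULLUP);
      musicPlayer.begin();
      SD.begin(CARDCS);
      musicPlayer.setVolume(20, 20);
      // The example's playFullFile() call used to sit HERE, which is why
      // the track played once at boot no matter what the button did.
    }

    void loop() {
      // Moved into loop() and gated on the trigger instead.
      if (digitalRead(BUTTON_PIN) == LOW) {
        musicPlayer.playFullFile("/track001.mp3");  // blocks until the track ends
      }
    }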

The LED push button worked pretty much like the arcade button we had been using for testing, except that it had two extra terminals for power and ground to the light, with a self-contained resistor within it.

We wired the button’s LED so that it stayed on continually until pressed. We did find this interaction a little fast for the eye to catch, and in the future would most likely program the LED to blink to the beat of the music upon button press.
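
A minimal sketch of that behaviour, with illustrative pin numbers: the LED stays lit until the button is pressed, then blinks roughly on the beat for a couple of bars.

    const int BUTTON_PIN = 12;   // button contact, wired with the internal pullup
    const int LED_PIN    = 11;   // the dome button's built-in LED

    void setup() {
      pinMode(BUTTON_PIN, INPUT_PULLUP);
      pinMode(LED_PIN, OUTPUT);
      digitalWrite(LED_PIN, HIGH);   // lit continually until pressed
    }

    void loop() {
      if (digitalRead(BUTTON_PIN) == LOW) {     // pressed
        for (int i = 0; i < 8; i++) {           // ~120 BPM: 500 ms per beat
          digitalWrite(LED_PIN, LOW);
          delay(250);
          digitalWrite(LED_PIN, HIGH);
          delay(250);
        }
      }
    }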

Testing LED button sending to featherwing on press, over wifi.
Large LED push button testing

 

 

Additionally, we spent several days attempting to 3D print our speaker enclosure, but the print job failed seven times before all of the 3D printers in the lab broke.

Success! The Music Maker FeatherWing worked!

Fusion 360 file that never turned into a speaker

 

 

Day 5 – Crafting Day and Recording

We decided to riff on the song “Maneater” by Nelly Furtado to create our new song “Mansplainer.” We wrote some new lyrics for the chorus, recorded them, and then combined them with an instrumental track of “Maneater.”

“Mansplainer,

cut you right off

with his man thoughts

can’t you please just shut the fuck up?

He’s a mansplainer,

make your life hard,

give you bad scars,

wish you never ever met him at all.”

https://www.youtube.com/watch?v=Pvr_ivXcuok

We chose a faux concrete finish spray paint to cover wooden boxes for the button and speakers enclosures
We hand painted wooden letters for the signage.
Speaker in the enclosure with usb power supply
The speaker mesh was hand painted red and sculpted from a sheet of wire mesh, into a circular shape.
Final prototype at critique

 


The Mansplainer in action at critique

Project Context

As people who have both experienced toxic masculinity, we wanted to make a project that explored and poked fun at the things we find irksome in terms of western presentations of masculinity.

There is actually statistical data showing the number of times a female doctor, for example, is interrupted versus her male counterpart. A look at many American TV stations will show male guests interrupting, talking over, or speaking in a patronizing manner to female anchors. In the 2016 U.S. election, Donald Trump repeatedly interrupted Hillary Clinton during her speaking allotment, then complained he hadn’t been given a fair amount of time to talk. While we see this as a serious problem, we also recognize that mansplaining can often be well-intentioned, so we wanted to approach the project with humour and a light touch, while still underscoring the issue at hand.

We took inspiration from different forms of satire and parody, like Kate McKinnon’s Justin Bieber Calvin Klein ad and Saturday Night Live’s classic “Bag O’ Glass” sketch, as well as their fake products like “Oops, I Crapped My Pants” and “Woomba.”

Bibliography

Solnit, Rebecca. Men Explain Things to Me. Haymarket Books, 2014.

Internet of Shit – subreddit

https://www.reddit.com/r/theinternetofshit/

 

Calvin Klein Ad – SNL

https://youtu.be/OXvo6ksBHnI

 

Oops, I crapped my pants

https://youtu.be/iUP3PMLdoOs

 

Woomba – It’s a robot and it cleans my business

https://youtu.be/gqesEYUXr78

 

A Cultural History of Mansplaining

https://www.theatlantic.com/sexes/archive/2012/11/a-cultural-history-of-mansplaining/264380/

 

A look at the science behind one of the Internet’s favorite new words.

https://www.youtube.com/watch?v=t7GUjKv9qSI&feature=youtu.be

 

Large LED push button tutorial:

http://play.karlssonrobotics.com/tutorials/circuits/wire-big-dome-button/

 

Nelly Furtado’s cover of “Maneater” with the vocal removed:

https://youtu.be/bDoTqATJL6c

We used the PubNub sample setup from Kate Hartman and Nick Puckett’s class as well as their provided Arduino examples as a base for our code.

 

Experiment #3 “Trump Punch”

By: Kristy Boyce
Code available on GitHub:
https://github.com/kboyce123/Experiment3

Final version with Trump reaction upon being hit by the right-hand boxing glove.


Project Description:

Trump Punch is a computer game and peripheral device intended as a stress-release aid for liberal newshounds fed up with the antics of those in political office. The current version (for rather obvious reasons) comes with a large, smug Donald Trump face that’s just ripe for the punching. In the spirit of bipartisanship, the two-button, classic-arcade-style controller allows the user “left” and “right” punching abilities. The game will also punch Trump on mouse click.

 

Process Journal:

My process began with taking a day to doodle and come up with ideas for peripherals, without regard to feasibility or failure. Some ideas were:

  • Left-handed mouse
  • Mouse that works in a 3D space in the air
  • Antique radio but with potentiometer knobs linked to Spotify
  • A classic video game
  • A sensor that detected men and catcalled them

I narrowed it down to the radio concept and the game idea. I figured I could 3D print the vintage radio enclosure or gut an actual old radio, and use some of the Chuck Norris code we played with in class; the dial options would be labelled based on mood and would then load up a playlist via Spotify, or perhaps an on-computer XML playlist.

 

The game I came up with was Trump Punch, based on a simplified version of the classic Nintendo game Punch-Out!!. I liked the topical hook, my research suggested it was the more feasible option based on my skillset, and I was interested in animating on the web.

I met with Kate on the 3rd day of the experiment and presented both ideas to her. She seemed to like the radio idea best, but based on my enthusiasm and the fact that I could storyboard out all the steps for the game, she suggested I follow that route.

 


I spent a good deal of time playing with the p5.play library, animating and creating sprites. I gained a lot of ground initially with the simple example interactions but ended up spending hours and hours attempting to complicate and customize the functions.


I started working with a Creative Commons-licensed PNG drawn from a photo of Trump, and a red glove.

 

Glove & Trump Interaction:

I could get the glove to move up in a punch-like fashion by adjusting its y position by 50 pixels on mouse click.

I easily got the glove to track and follow the mouse coordinates, but I encountered a lot of trouble creating the actual “hit” between the glove and Trump’s face. At this point, .overlap and .collide became confusing, and in the end I had to use overlap to create my collision, which was very counter-intuitive.

I was able to rotate or scale Trump’s face when the two PNGs came into contact, but I had a lot of trouble getting it to swap to a reaction PNG in a convincing way in combination with the mouse click and image overlap.

 

Buttons

I couldn’t figure out how to make things happen on button press rather than mouse click, but it turned out all I had to do was make the p5 code listen for the mouse click or the button press, using “||” in the condition.

I was able to get 1 button working with communication between P5 and Arduino fairly easily.

 

Testing with buttons

There was a certain point, when I couldn’t seem to get more than a weird “tap” instead of a punch, that I started playing around with a Trump Asteroids game featuring an angry uterus. I actually really like this concept, but it didn’t seem to require the two-button input setup that I was already very committed to at that point.
In the end, it was a good exercise and helpful because the examples I looked at worked with calling and animating sprites.

The angry uterus 1
Angry Uterus 2

 

 

 

Roxanne Henry went over some array material that helped me get the second button up and running, along with Nick’s example on the Experiment 3 pages. This is where I also learned that you really do need to use a ground. Really.

Syntax problems

Once all the code issues were worked out, I put the PNGs in the right order so the glove wouldn’t float behind Trump, and I added a left punch glove and a reaction sound effect for when Trump gets hit.

Peripheral Controller:

Initially, I thought I’d use a big red button you could just smash, but I wanted to create an object that I would actually keep on my desk, one that wouldn’t take up too much space and would match my aesthetic (I have a white desk with white peripherals). So, in keeping with the vintage video-game feel, I went with a white-and-red, Nintendo-ish controller design that would enclose the Feather. I followed a six-button arcade enclosure tutorial on Adafruit and modified the design to suit.


Fusion 360 design
Fusion 360 mockup

 

 

 

 

 

3D printing
Disappointing controller case

But then disaster hit! The red buttons I had purchased were not the right size for my printed design. I had double-checked before purchase, asking the salesman if they were 24mm buttons, but they were actually 30mm. A smart person would have measured anyway, but I didn’t.

I ended up purchasing a box and keeping everything on the breadboard, which was functionally fine, but I really wanted that slick, white, handheld controller that I had spent hours and hours on.

The horrible backup box


Drilling the horrible backup box

 

 

Final Prototype In Action:

Context:

This project was influenced by my interest in socio-political issues and the classic video game Mike Tyson’s Punch-Out!!. I see Trump as a perfect King Hippo-style character. I knew I wanted to make something highly graphic, within my skillset, and with some type of reasonable “why” factor or hook.

Moving forward, I would like to give Trump more motion, some taunting animation and audio, and a bell “ding” sound, and have the game cued via an API. Perhaps when #Trump is trending in the news, the game opens. An interesting idea I had during crit was that social media mentions could literally grow his head and strengthen him as an opponent: he literally feeds on the online attention, just like in real life! #SAD

References/Sources:

https://www.arduino.cc/en/Tutorial/AnalogReadSerial

http://p5js.org

http://jessiereyes.com/gamepro/group-project/

In-class code from Nick Puckett & Kate Hartman

P5.play.js

http://p5play.molleindustria.org/

 

Code Snippets

Coding Train:

Donkey Hotey via Flickr

Adafruit Industries

 

Experiment #2 – RGBelieve it!

Group Members: Roxolyana Shepko-Hamilton & Kristy Boyce

Project Title: RGBelieve it!

Description: RGBelieve it! is an interactive, multiscreen colour-tracking experience. It works as a game or a stand-alone art installation, depending on the user’s needs/mood. Using computer vision and webcams, RGBelieve it! tracks colour and creates a variety of “stunning” shape-based on-screen experiences.

For our project, we wanted to get more familiar with capturing and tracking motion via a computer’s webcam.

We focused our research around the following questions:

How do we use multiple computers to track motion and show its effect on each screen? Is it possible to have a motion sensor on each computer connected to a central website? How would the website automate the output? Can we make motion-based art? What about a self-portrait done via video captured from a webcam?

Sketches & Brainstorming:

  • Create art with your body
  • Lay all of the phones down and use them as a giant interface that you can interact with
  • Phones all doing touch-based motion tracking? Gestural? Drawing input, swipe, etc. Could we make the phones vibrate?
  • Computer screens: light sensitivity, motion tracking, speakers in computers

We decided to try and create a multi-screen, motion activated art installation; when you moved, the visualization on the screens would move, grow and ideally draw.

So we researched people who had created similar work and watched tutorials on painting with pixels, edge detection, etc. We were able to get the webcam up and running fairly quickly, but found that filters like “pixellate” were a little too basic. Though it would look cool to have 20 different screens with different looks (heatmap, pixelated, edge detection, etc.), we worried that using the filter function left too little room for adding our own touch and really creating something new.

One of our first ideas was to have a wall of vintage-looking TVs, like one might have seen in a 1960s department store; of course, in this case, the “TVs” would be PNG files layered over our webcam video feed.

Experimentation:

Edge detection with a PNG TV border
Motion tracking with the webcam compares one frame to the next, detecting shifts in colour

 

TV wall sketch

 

 

 

 

Links to some of our early research:
https://www.npmjs.com/package/gifshot

http://diffcam.com/

https://medium.com/little-miss-robot/motion-tracking-in-the-browser-6a4f48b9ba29

To open webcam in browser

https://davidwalsh.name/browser-camera

Brightness Mirror – a p5.js tutorial. This video looks at how to create an abstract mirror in a p5.js canvas based on the brightness values of the pixels from a live video feed:
https://www.youtube.com/watch?v=rNqaw8LT2ZU

 

 

Experimentation Cont’d:

We tried to create our motion/colour tracked output in a variety of ways:

  • Edge detection
  • Pixellate
  • Particle field (examples we found were way over our heads but that didn’t stop us from spending a few days tinkering with it!)
  • A faux heat mapping effect
  • Optical Flow

Video of particle effect responding to magenta on the webcam

Particle 1
Particle 2
Particle 3
Experimenting with the webcam on the iPad

 

We got the particle effects working and fullscreen (window width/height, etc.), broke the particle effect, then fixed it. The original particle code was a different version than the JavaScript file we had downloaded. One technical issue was how to hide the webcam window while still drawing the data from it.

To solve this, we tried the following:

  • Getting rid of the window entirely (didn’t work)
  • Hiding it through display:none (didn’t work)
  • Minimizing the window’s size so it wouldn’t appear on screen (didn’t work)

We eventually realized that the #video and #canvas elements had to be on the screen in order for the colour tracking to work. The solution: change the opacity of #video and #canvas so that they wouldn’t appear visibly but would still exist on the page!

Once we got the code working locally and basically appearing in the layout we wanted, the day was mainly for exploration. We messed around with the colour of the circles, their shape/size, how many were in a cluster, etc. One of the biggest issues we found was not being able to change the specific colour that the code tracks. What if we wanted different colours to be tracked? Ideally, we would come up with at least four different colours for the website(s) to track and load the different versions onto webspace. We’d have four almost identical websites, which would open up the opportunity to create a sort of matching game: find which site responds to your colour!

Our sense is that the relevant code is in the tracking.min.js file. There is colour-tracking wording associated with ‘magenta,’ but there are no RGB or hex values, just the word ‘magenta.’

Another potential extension of this project would be to create/affect sound along with the particles, or at least call up some appropriate sound effect to go along with the interaction!

 

After our first meeting with Kate, she suggested that what we were trying to do was A. Computer Vision and B. HARD.

We then looked at Kyle McDonald’s work and additional tutorials from Coding Rainbow that focused more on painting with pixels and Computer Vision:

https://kylemcdonald.github.io/cv-examples/

Nick provided a simpler sample for creating a colour track and response system

Challenges:

  • A lot of the resources we found were in Processing as opposed to p5.js
  • We found a working demo of a particle effect, but because it was so complex, even editing it was a very steep learning curve and we had trouble really making it our own. It also turned out to be a time sink, since in the end Nick was able to point us towards a much simpler baseline as a (re)starting point.
  • Our lack of JavaScript understanding held us back in terms of knowing how to show our effects without showing the captured video, and how to layer visuals (PNGs vs. video, etc.)

With our new, dumbed-down approach, we started again. While keeping the idea of capturing video via the webcam, our new approach involved drawing simple shapes in response to motion. We also started to consider more analog ways of interacting with RGBelieve it!, like a game involving sticky notes.

One of our game layout sketches

 

Sticky notes to be placed on each user/player
Sana in action

 

Success!

 

 

Colour tracking via the webcam and tracking.js: the camera detects the colour, then draws a rectangle whose size and x, y coordinates change based on the motion captured via the webcam. Basically, you can put a magenta sticky note on your forehead, dance like a maniac in front of the webcam, and a shape will be drawn to “dance” with you. If you thought it was hip to be square before, now it’s righteous to be a rectangle!
 

Shapes move along with tracked blue ellipses.

 

 

Working from some examples, Kristy was able to get more complicated shapes with their own motion to show up on screen when magenta was detected by the webcam, but couldn’t get any further tracking operable

 

Sana was able to integrate the code from the complicated shapes that had their own separate movement to move along with our simple tracked blue object

Videos of Drawing/Tracking & Motion:

Working on making the “start” page responsive
Brainstorming a name

Game Logistics Setup:

To prepare for the in-class gameplay and to save time, we assigned each participant a team and a URL to load onto their laptop in advance. We listed the URLs on our Digital Futures Facebook page and came early to load the browsers manually, to ensure a timely setup for the presentation.

These are our groups (as chosen by this random list organizer)


Final name and logo

 

 

Welcome to RGBelieve It!

These are your teams:

MAGENTA TEAM

Tommy

Prof. Kate

Karo

Feng

Dave

Yiyi

Ramona

Emilia

 

CYAN TEAM

Roxanne H.

Max

Prof. Nick

Sean

Quinn

Chris

Finlay

 

YELLOW TEAM

Kylie

Savaya

Jad

Emma

Margot

Roxanne B.

Dikla

 

Please locate the sticky note pads that correspond to your team’s colour.

 

Each team member will place 2 sticky notes onto themselves; this can be on their shoulder, knee, etc. Choose the sticky note location wisely, as once you have placed the 2 sticky notes on your body, the locations are final!

 

You will also carry extra sticky notes, but keep these in a pocket, or concealed in some way!

 

Once everyone has placed their sticky notes, the game will begin when everyone hits START. Make sure to allow RGBelieve It! to access your webcam!

 

Walk by each screen to see what happens. If the screen reacts in some way, that means you have found a screen that matches your team colour. Mark the screen with a sticky note of your team colour.

 

The first team to locate six screens that react to their colour wins!

URLs:

1=magenta

2=cyan

3=yellow

 

https://webspace.ocad.ca/~3164381/1a/

https://webspace.ocad.ca/~3164381/1b/

https://webspace.ocad.ca/~3164381/1c/

https://webspace.ocad.ca/~3164381/1d/

 

https://webspace.ocad.ca/~3164381/2a/

https://webspace.ocad.ca/~3164381/2b/

https://webspace.ocad.ca/~3164381/2c/

https://webspace.ocad.ca/~3164381/2d/

 

https://webspace.ocad.ca/~3164381/3a/

https://webspace.ocad.ca/~3164381/3b/

https://webspace.ocad.ca/~3164381/3c/

https://webspace.ocad.ca/~3164381/3d/

 

URL list randomized and up to 20 (made sure to have at least 6 of each color):

 

https://webspace.ocad.ca/~3164381/2d/ savaya

https://webspace.ocad.ca/~3164381/1b/ feng

https://webspace.ocad.ca/~3164381/3c/ sana

https://webspace.ocad.ca/~3164381/2b/ kristy

https://webspace.ocad.ca/~3164381/1c/ kylie

https://webspace.ocad.ca/~3164381/2c/ roxanne

https://webspace.ocad.ca/~3164381/2b/ quinn

https://webspace.ocad.ca/~3164381/3d/ ramona

https://webspace.ocad.ca/~3164381/3b/ emilia

https://webspace.ocad.ca/~3164381/2a/ chris

https://webspace.ocad.ca/~3164381/1d/ max

https://webspace.ocad.ca/~3164381/2d/ jad

https://webspace.ocad.ca/~3164381/3d/ emma

https://webspace.ocad.ca/~3164381/1a/ sean

https://webspace.ocad.ca/~3164381/1b/ tommy

https://webspace.ocad.ca/~3164381/3a/ roxanne b.

https://webspace.ocad.ca/~3164381/2d/ dikla

https://webspace.ocad.ca/~3164381/3a/ karo

In Class Demo:


Chaos ensues

Code:

https://github.com/sanaalla/RGBelieveIt_Experiment2

Context:

We see RGBelieve it! as the first phase of research for a much larger project involving motion capture and art creation via gesture. Much of our inspiration in terms of possibilities came from artists working with motion capture and computer vision. Caitlin Sikora’s interactive web application Self Portrait (2015) “…uses motion capture data and Markov models to generate new movement data in real time.”

We really connected with the way her animations responded to the user and the way the user’s data influenced the output. One of her other projects, I need-le you, baby, used p5.js and webcam pixel data, which made us think that many great interactions were possible using these same technologies. And though that’s true, Sikora’s knowledge and ability in this area are leaps and bounds ahead of ours. From an inspiration perspective, though, her work was invaluable.

 

Moving forward, we see a large-screen art installation as the likely final output, wherein the user stands in front of a screen larger than themselves and their gestures are captured by a camera, then rendered in real time on screen as some artistic visual output. Whether the motion is captured by placing magenta mittens and kneepads, for example, on the user, or via other motion-capture techniques, will require more thought and research.

Bibliography:

McCarthy, L., Reas, C., & Fry, B. (2015). Make: Getting Started with p5.js. (1st ed.). San Francisco, CA: Maker Media

P5.Js. (n.d.). References. Retrieved October 17, 2017, from https://p5js.org/reference/

 

http://caitlinsikora.com/interactivity.html

Camera and Video Control with HTML5

Motion Detection with JavaScript

https://www.npmjs.com/package/gifshot

http://diffcam.com/

https://motiondetection.be/

https://github.com/lonekorean/diff-cam-scratchpad

https://trackingjs.com/docs.html#trackers

https://github.com/ACassells/processing.js.SimpleWebCamInteraction

www.youtube.com/watch?v=DW3AR9PFY84


https://kylemcdonald.github.io/cv-examples/

http://www.jamesalliban.com/#projects

https://github.com/eduardolundgren/tracking.js/issues/175