Gay Magic


https://garnet-moose.glitch.me/

By: Tommy Ting

Gay Magic is a web-based game in which the player moves objects around the screen with two potentiometers in order to unlock mystery items such as poppers, Truvada and a chain collar. Once all the items are unlocked, a spell is cast and Dionysus is summoned to start the party.

Circuit Layout or Circuit Schematics

Breadboard layout

Code

https://github.com/livefastdynasty/peripheral – Master file, final code and assets.

https://github.com/livefastdynasty/peripheral/commits/glitch – See how I progressed and worked through my code.

 

Supporting Visuals

Wiring up the force resistor


First Test

Second Test

Third Test

Fourth Test

Final Presentation and Game Play

Final Presentation and game play

Final Presentation


After Nick’s suggestion of moving the potentiometers in front of the fur and using the fur as a mouse rest


First Iteration of Dionysus without the animated rainbow


Final iteration of the Dionysus page


Process Journal

Day 1 Working with API

On my first day, I looked through all the available APIs and found one that I was particularly attracted to called “We Feel Fine”. This API draws from many different web spaces whenever someone posts something that starts with “I feel” or “I am feeling”. From this, I was hoping to use a force resistor to activate the API along with Spotify to generate emotions with a matching song. I started to wire up the sensor and immediately found it quite hard to work with because it was very sensitive; even a very light amount of force would send the reading from 0 to 1023. Moreover, I realized that the “We Feel Fine” API was not working, as I wasn’t able to pull anything from it. I decided to move on and find another API that I wanted to use. After some searching I couldn’t find anything that really struck my interest.

 

Day 2 Moving On

I started playing around with some photomontage assets that I have been working with in my class Possible Futures with Dr. Poremba. In this class, I have been exploring the intersection of queerness and critical future studies with performance and dance. I have been cutting up body parts from magazines to form new shapes. One thing that really struck me during this creative process was using these body parts to form a Stonehenge scene. Queerness and witchcraft have a really interesting relationship of radicality, Otherness and anti-normativity (which I explain in depth in my project context). I quickly used a sprite that I had previously made with some sample code found in the p5.play library from p5js.org. I really liked what I saw and decided that I would use this experiment to further develop some ideas I have been exploring in Possible Futures and my thesis proposal.

 

Day 3 Start Working on Stonehenge Scene

On my third day I quickly made the first Stonehenge sprite with a crystal gem as the cursor. I wired up one potentiometer and got it to move the gem on the X-axis. I adapted the Arduino and p5 code from ITP. I had to play around with the p5 code in order to get the potentiometer to control the gem. In my p5 code, I first mapped the x-position of the gem to inData, which only let the cursor move a very limited distance, maybe around 300 pixels. Later, I created a var that mapped the potentiometer value to an x-position spanning the full width of the screen, and then mapped the position of the gem to that var, which allowed the gem to move full-screen. I did encounter a problem with this solution: initially I put the var up top, above everything, but once I moved it inside the draw function it worked perfectly.
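A minimal sketch of that mapping, assuming the serial-lab convention of a global inData variable holding the raw 0–1023 reading and a p5.play sprite called gem (these names are placeholders, not my exact code):

```javascript
let gem;        // crystal-gem cursor sprite (placeholder rectangle here)
let inData = 0; // latest potentiometer reading (0–1023), updated by the serial callback

function setup() {
  createCanvas(windowWidth, windowHeight);
  gem = createSprite(width / 2, height / 2, 40, 40); // p5.play sprite
}

function draw() {
  background(255, 192, 203);
  // map the raw reading to the full canvas width inside draw(),
  // so the gem can travel the whole screen
  let mappedX = map(inData, 0, 1023, 0, width);
  gem.position.x = mappedX;
  drawSprites();
}
```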

 

Day 4 Two Potentiometers and Stonehenge Scene

On my fourth day I tried wiring up two potentiometers so that one would control the X-axis and the other the Y-axis. I updated the p5 code and used the Arduino code from Nick’s example. I couldn’t get it to work and wasn’t able to find any help from my peers. I decided to put this idea away for now because I was running out of time, and I wanted to finish the Stonehenge scene and make it a working prototype with at least one potentiometer. I then created the rest of the sprites in Photoshop and tested the collider and displacer code found in the p5.play library, which I assigned to some assets. I also played around with the overlap-change-animation code (also found in the p5.play library) on the gem cursor, making it change into a different object when it overlapped one of the Stonehenge montages. Once that worked, I went back to Photoshop to finish creating the sprites.
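A rough sketch of the displace and overlap-change-animation logic from the p5.play library, with hypothetical sprite names and asset paths:

```javascript
let gem, floatingPart, stone;

function setup() {
  createCanvas(windowWidth, windowHeight);
  gem = createSprite(100, 100, 40, 40);
  floatingPart = createSprite(300, 200, 80, 80);
  stone = createSprite(500, 300, 120, 160);

  // hypothetical image sequences for the two states of the floating part
  floatingPart.addAnimation('floating', 'assets/arm01.png', 'assets/arm03.png');
  floatingPart.addAnimation('unlocked', 'assets/poppers01.png', 'assets/poppers03.png');
}

function draw() {
  background(255, 192, 203);

  // the gem physically pushes the floating body part when it touches it
  gem.displace(floatingPart);

  // when the floating part overlaps its matching stone, swap its animation
  if (floatingPart.overlap(stone)) {
    floatingPart.changeAnimation('unlocked');
  }

  drawSprites();
}
```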

 

Day 5 Finished Stonehenge Scene

I finished the Stonehenge scene by creating all the assets in Photoshop. I developed the game concept further, designing a very simple mechanic and goal. The player has to move the floating body parts around (the gem collides with them) so that they overlap the Stonehenge body parts; only one combination of stone and floating body part results in a change of animation of the floating body part. Once all the body parts have been changed, an image of Dionysus appears. It took me some time to figure out the code to make Dionysus appear once all the images have been changed. After a bit of playing around I figured out that a simple if statement, of the form “if this overlaps && this overlaps && this overlaps && this overlaps, then add the Dionysus animation”, worked. Unfortunately the animation played at a very slow speed, but I only had one day left and I wanted to spend it going back to getting two potentiometers to work.
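In rough form, the check is a single compound condition; the sprite names below are hypothetical:

```javascript
// called from draw(); reveals Dionysus once every floating part
// sits on its matching stone
function checkSpell(part1, part2, part3, part4, stone1, stone2, stone3, stone4, dionysus) {
  if (part1.overlap(stone1) && part2.overlap(stone2) &&
      part3.overlap(stone3) && part4.overlap(stone4)) {
    dionysus.visible = true; // or change to the Dionysus animation here
  }
}
```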

 

Day 6 Revisit Two Potentiometers

I found this really challenging. I revisited Nick’s example, and after some suggestions from Kate, I changed my p5 code to reflect some of the wording that Nick used in his code. I was able to get both potentiometers working, but only one at a time. After a lot of frustrating moments, I gave up and sought help from my classmates. I had help from a number of people, but it was Finlay who was finally able to help me with the two potentiometers (he had eight in his project…). He told me to put everything I had in my draw function into a new function called “game”, and the draw function was changed so that the game wouldn’t run unless the serial connection was established. In the ardCon function, which runs the serial connection, he told me to switch the value of serialStatus from 0 to 1. Finally, going back to the draw function, we made it so that a value of 1 would start the game function. This got both potentiometers working!
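A sketch of the structure Finlay suggested, using the p5.serialport library and the names from this journal (serialStatus, ardCon, game); the port name and the comma-separated data format are assumptions, not my exact code:

```javascript
let serial;
let serialStatus = 0;          // 0 = waiting for serial, 1 = connected
let inDataX = 0, inDataY = 0;  // latest readings from the two potentiometers

function setup() {
  createCanvas(windowWidth, windowHeight);
  serial = new p5.SerialPort();
  serial.on('connected', ardCon);       // fires when the p5 serial server connects
  serial.on('data', serialEvent);       // fires whenever new serial data arrives
  serial.open('/dev/cu.usbmodem1411');  // placeholder port name
}

function ardCon() {
  serialStatus = 1;                     // flip the flag so draw() can start the game
}

function serialEvent() {
  // assumes the Arduino prints "x,y" pairs, one per line
  let data = serial.readLine().trim();
  if (data.length > 0) {
    let values = split(data, ',');
    if (values.length === 2) {
      inDataX = Number(values[0]);
      inDataY = Number(values[1]);
    }
  }
}

function draw() {
  if (serialStatus === 1) {
    game();                             // everything that used to live in draw()
  } else {
    background(0);
    fill(255);
    text('waiting for serial connection…', 20, 20);
  }
}

function game() {
  background(255, 192, 203);
  // map both potentiometers to the full canvas
  let x = map(inDataX, 0, 1023, 0, width);
  let y = map(inDataY, 0, 1023, 0, height);
  ellipse(x, y, 20, 20);                // stand-in for the gem sprite
}
```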

 

Project Context

Gay Magic is a game that explores the intersection of queerness and future studies through an investigation of gay identities and subjectivities, ephemerality and history. The mechanic of the game is to move the body parts around to uncover gay objects such as poppers and a Truvada pill. All of the artistic choices in the game are deliberate: the background is pink, reflecting the colour the Nazis used to label gay men in the concentration camps; the body parts draw on the history of our bodies being sites of resistance, political protest and medical experiments; and the witchcraft and magic imagery references both the shared experience we have with witches and, more generally, 70s radical political groups. The picture of Dionysus at the end of the game appears because Dionysus is the ancient Greek god of play and partying, who has been a symbol for Freud’s pleasure principle and for queer men’s groups such as the Radical Faeries.

 

The use of pagan and witchcraft imagery is informed by the work of Silvia Federici. Federici argued that capitalism, in order to continue existing, requires a constant expropriation of free labour, especially women’s (2004). She connects the rise of capitalism to the struggle between communalism and feudalism, and argues that the success of capitalism depended on the subjugation of communalism. She situates the institutionalized punishment of prostitution, witches and heretics at the beginnings of proto-capitalism (2004). By incorporating imagery of paganism and magic, I want to reintroduce the radical potential of the gay male body that has been lost to heteronormativity, privilege and capitalism.

 

I decided to use body parts to create my photomontage because queer and gay bodies are the sites of our political resistance and, therefore, the sites of our political liberation. Photomontage is a great way to illustrate the fantastic and the eerie using recognizable imagery, and it has been a technique of choice from the Surrealists and radical architects such as Archigram through to current speculative futures designers (Jain et al 2011). Gay bodies have been experimented on, chemically castrated (Turing), burnt (the origin of the word “faggot”), and were famously wrapped up in a bag and thrown onto the lawn of the White House on October 11, 1992, to protest the US government’s lack of action during the AIDS crisis (Hubbard 2012). I am also deeply interested in the softness of technology and the “triumph of software”, which is more malleable and has connotations of playfulness (Banham 1981). In his 1981 essay, Banham stated that software is able to quickly adapt to change and undermine the rigidity of hardware. I see this as an analogy for counterculture’s relationship to the strict culture of a normative, hierarchical, nuclear, capitalist society. Softness also informed my choice of using fur to conceal my hardware.

 

Ephemerality is explored in my project through the game mechanic. By moving the body parts around, the player reveals the hidden objects, but if they move the body parts away, the revealed object quickly disappears. Scholars have suggested that queerness is often represented through ephemera such as untimely death, dance and gesture as communication, and short-lived spaces (Desmond 2001; Castiglia and Reed 2011; Farmer 2000; Getsy 2016; Muñoz 2009). I wanted to incorporate this queer ephemerality by letting it inform the design of the game mechanic. Gay Magic is a small experiment toward my major research project, which will be a video game that brings out the radical potential of the gay male body by looking at queer history, speculative futures, cyber spaces and dance.

Bibliography

Banham, Reyner. “Triumph of Software.” Design by Choice, edited by Penny Sparke, Rizzoli International Publications, 1981, pp. 133–136.

 

Castiglia, Christopher, and Christopher Reed. If Memory Serves: Gay Men, AIDS, and the Promise of the Queer Past. University of Minnesota Press, 2012.

 

Desmond, Jane E, editor. Dancing Desires: Choreographing Sexualities On and Off the Stage. The University of Wisconsin Press, 2001.

 

Farmer, Brett. Spectacular Passions: Cinema, Fantasy, Gay Male Spectatorships. Duke University Press, 2000.

 

Federici, Silvia. Caliban and the Witch: Women, the Body and Primitive Accumulation. Brooklyn, NY: Autonomedia, 2004.

 

Getsy, David J., editor. Queer. Whitechapel Gallery and The MIT Press, 2016.

 

Jain, Anab, et al. “Design Futurescaping.” The Era of Objects (Blowup Reader 3), 29 Sept. 2011, pp. 6–14, v2.nl/archive/articles/the-era-of-objects-blowup-reader.

 

Muñoz, José Esteban. Cruising Utopia: The Then and There of Queer Futurity. New York University Press, 2009.

United in Anger: A History of ACT UP. Directed by Jim Hubbard, The Film Collaborative, 2012.

Frame It Up!


By Finlay Braithwaite and Tommy Ting

https://webspace.ocad.ca/~3164558/FrameItUp/

Frame It Up is an interactive screen-based game best played with 10+ people. The game requires players to carry their laptops and physically walk around the room. Frame It Up is a choreography-generating game influenced by Twister.

 

Instructions

Using your laptop, open the URL in Google Chrome.

Read the instructions.

Click ‘PLAY’ to enter the game.

You are presented with a name and gestural prompt.

Use your camera to find the person and ask them to perform the prompt.

Click anywhere to take a picture.

Pictures are saved onto your laptop.

After capturing a prompt, a new prompt appears.

Take pictures of the person with the new prompt.

Repeat until the one-minute timer runs out.

Every minute, on the minute, all players are provided with a new name and prompt.

 

Objectives

  1. To negotiate with other players in the room to capture an image of a person and gestural prompt.
  2. To generate random acts of choreography and dance movements that highlight humans’ relationship with technology.

 

P5.js Code

https://github.com/braithw8/OCAD_webspace/blob/master/FrameItUp/sketch.js

 

Supporting Visuals

Presentation Day

Screenshots

Frame It Up documentation

Kate’s squat

Emilia’s right clenched fist

Finlay’s right knee raise

 

Process Journal

Day 01 [2017.10.16]: Experiment 2 Introductions

We came up with a few different ideas on our first day. We were interested in using the camera function, but inherent in camera technology is a conversation about ethics and, more specifically, privacy. We wanted to use the camera in a critical way that would open up discussions around ethics.

  1. “No Pervert!” Using the camera, the screen directs you to point it at someone in order to “see what lies underneath”, but once you line it up with a body, it generates a message saying “Why would you ever want to do that?”
  2. “Conversation Helper” Your mobile device connects you with another user, then prompts you with some conversation topics.
  3. “Colour Matcher” Using the mobile device’s gyroscope, you have to rotate your phone to the right x, y and z coordinates to match the colour of the text to the colour of the background of the canvas.
  4. “Shake It Up” Using your phone again, shake it to generate a prompt to find another player in the room; once located, shake it again to generate a prompt for a body part, then take a picture.

After coming up with a few different ideas, we decided to go with Shake It Up. We were interested in the human movement this game would generate. It brings up some important things we were both interested in exploring with this experiment: physical interaction with digital technology, and movement and dance.

Day 02 [2017.10.17]: Coding

The first major hurdle was to get the video camera to work in a consistent and predictable way. The number of different possible device types, makes, and models made this a daunting task. We were fairly determined to use smartphones and tap into their cameras as the technical underpinning for our project. We ran into some basic hurdles getting video to work even in a rudimentary fashion. Chrome, for example, demands that an https:// server be used if the camera is to be engaged, for security/privacy reasons. This means that code has to be uploaded frequently to such a server for developing and testing purposes. Dreamweaver became our go-to editor as it facilitates automatic SFTP sync on save. It also has built-in GitHub integration, which is a dream come true.

Early interface mockup

As the working title suggests, getting the shake input code to work would be imperative in our development. However, our early testing led us to conclude that it would not be an effective way to move through a serial sequence of interactions, as unintentional double shakes and phantom shakes were difficult to avoid in code. This investigation was illuminating in that it demonstrated that our user flow had too many stages in the sequence and too many device interactions. We felt that this took away from the experience, as the device became the focus of the experience rather than a catalyst. We played with the idea of having the random person and body-part prompts cycle on a timer, rather than relying on interaction. It would also be a great moment if this timer were set to a common clock on all devices, so that new prompts were generated for all players simultaneously.
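One way to do this, sketched here as an assumption rather than our shipped code, is to derive the prompt from the current wall-clock minute so every device lands on the same prompt at the same time:

```javascript
// hypothetical prompt lists
const names = ['Kate', 'Emilia', 'Finlay', 'Tommy'];
const gestures = ['squat', 'right clenched fist', 'right knee raise', 'head nod'];

function currentPrompt() {
  // whole minutes since the epoch are identical on every device
  // with a reasonably synced clock
  const minuteIndex = Math.floor(Date.now() / 60000);

  // seeding p5's random generator with that minute means every device
  // draws the same "random" prompt for the same minute
  randomSeed(minuteIndex);
  return { name: random(names), gesture: random(gestures) };
}
```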

Day 03 [2017.10.18]: Back to the Drawing Board

One immediate concern we had was capturing pictures of someone’s body part without their consent. Although it would call attention to the problems of privacy, we thought this was too simplistic and literal. We went back to the whiteboard to brainstorm new ideas.

Whiteboard brainstorming

We came up with a few different ideas for new prompts. One was to use colours, moods and feelings; this would be more abstract and would give the player the choice to interpret the prompt however they wanted, though it is still not consensual.

Next was to use an RGB or grayscale value: the player has to find the matching colour on their person’s body using their camera. This would make our project more “game-like”, but we didn’t know how to use the camera to calculate colour values. Moreover, it still doesn’t solve our consent issue.

Lastly, we came up with a list of gestures such as a head nod, a smile, a right-hand shake, a left middle finger and a right peace sign. This immediately solves our consent problem, since you have to ask your person to perform the gesture. It also creates more of a negotiation between you and the other players. Finally, it adds a much richer dimension to our initial interest, which was to use this game to create random acts of dance and choreography.

Day 04 [2017.10.19-23]: Coding (Cameras, Mobiles to Laptops)

Eureka! We were starting to make real progress on the video front. Kate Hartman had suggested that we ‘time box’ this problem, giving up on it if we didn’t get the results we needed in a specified amount of time. The biggest challenge we overcame was specifying which camera a mobile device used. The p5.js video capture allows for constraints compliant with the W3C specification, which includes language to call for different camera types. The type we were interested in was ‘environment’, the non-selfie, outward-facing camera on the back of a phone. Finding the correct syntax to connect this constraint to p5.js was elusive and frustrating, but eventually my Android phone took a brave step and faced the world. With this victory, we began working with the video image and integrating it into our code. To accommodate variable screen and camera resolutions, we created a display system that would respond to four possibilities:

  • Camera resolution width narrower than horizontal display.
  • Camera resolution width wider than horizontal display.
  • Camera resolution width narrower than vertical display.
  • Camera resolution width wider than vertical display.

With these four scenarios, our video placement would respond to the parameters and crop and place itself accordingly.
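A sketch of what we mean, hedged since our shipped syntax may differ: the W3C constraint facingMode: 'environment' asks for the rear camera, and comparing aspect ratios decides which of the four cases applies (here the video is simply scaled to cover the canvas; the source-crop version follows below).

```javascript
let cam;

function setup() {
  createCanvas(windowWidth, windowHeight);

  // W3C getUserMedia constraints, passed straight to createCapture;
  // 'environment' requests the outward-facing (non-selfie) camera
  const constraints = {
    video: { facingMode: 'environment' },
    audio: false
  };

  cam = createCapture(constraints, function () {
    console.log('camera ready:', cam.width, 'x', cam.height);
  });
  cam.hide(); // we draw the frames ourselves rather than showing the DOM element
}

function draw() {
  background(0);
  if (cam.width === 0) return; // camera not ready yet

  // comparing aspect ratios tells us whether the camera frame is
  // effectively narrower or wider than the display
  const camAspect = cam.width / cam.height;
  const screenAspect = width / height;

  if (camAspect < screenAspect) {
    // camera narrower than the display: fill the width, let the height overflow
    image(cam, 0, (height - width / camAspect) / 2, width, width / camAspect);
  } else {
    // camera wider than the display: fill the height, let the width overflow
    image(cam, (width - height * camAspect) / 2, 0, height * camAspect, height);
  }
}
```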

In this meticulous process we encountered a bug with the p5js reference. With the following function: ‘image(img,dx,dy,dWidth,dHeight,sx,sy,[sWidth],[sHeight])’, you can crop an image and place it into your canvas, possibly resizing it in the process. However, in working with this code it appears that the destination coordinates (d) and the source coordinates (s) are reversed from the documentation. We will investigate further and let p5js know if this is indeed the case.

This code was important as we wanted to crop our video instead of resizing it. We wanted a clean ⅓-height band of video centred in the middle of the screen. We wanted this to resize smoothly and adapt to variable screen and camera resolutions. We felt a crop would give us a natural zoom that would enhance the image-finding aspect of the game and would also lower the CPU overhead of live video resizing.
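Continuing the capture sketch above, the centred band crop looks roughly like this, following the documented argument order of image() (destination first, then source); if the coordinates really are swapped, the two groups of arguments simply trade places:

```javascript
function draw() {
  background(0);
  if (cam.width === 0) return; // wait for the camera

  // destination: a band one third of the canvas height, vertically centred
  const bandH = height / 3;
  const bandY = (height - bandH) / 2;

  // source: scale so the video fills the canvas width, then take the
  // matching slice from the vertical centre of the frame (crop, not resize)
  const scaleFactor = width / cam.width;
  const srcH = bandH / scaleFactor;
  const srcY = (cam.height - srcH) / 2;

  // image(img, dx, dy, dWidth, dHeight, sx, sy, sWidth, sHeight)
  image(cam, 0, bandY, width, bandH, 0, srcY, cam.width, srcH);
}
```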

%#$&%#&!

Tommy’s phone won’t open %#%^#@^. Try as we might, our code worked well on Android, but not on iPhone, particularly Tommy’s iPhone. With the time box in shambles and our project in jeopardy, everything was on the table, including revisiting other ideas or generating new ones. Realizing that the majority of portable devices available to us were made by Apple, we swallowed our pride and began developing for laptops. Unfortunately, we didn’t have the ability or time to figure out a way to include both Android phones and laptops, so we went with laptops only.

Despite our worst fears, the laptops were great and added some new dimensions to the game. People could see themselves being captured and adjust their position and pose to assist in play. This interactive feedback element would not be possible with a phone’s ‘environment’ camera.

Day 05 [2017.10.24]: Playtesting

The playtest was extremely revealing and gave us a lot of insight into how to quickly resolve some immediate issues. We noticed a few main issues:

  • Our sketch did not work consistently on iOS; some phones worked, but most didn’t.
  • People were upset by not having a specific end goal; namely, they were confused about what to do after they framed the person up with the corresponding body part.
  • The 1-minute timer was too long, since it was easy and simple to find the person and the body part.

It also confirmed what we had hoped for:

  • The scuffle to locate the person and the body part resulted in a dance amongst players.
  • People had to negotiate with each other in order to find their body part.

Play Test

Day 06 [2017.10.26]: Refinement in Code and Game Concept

On our last day, we refined the game’s visual interface, from small details such as font size and stroke shade to adding a photo-capture feature.

The last major coding hurdle turned out to be fairly easy. Neither of us had made an app with multiple states or scenes; our code to this point relied on one loop for the entire experience. We needed a splash page to introduce and explain the game. We could have done it as a separate HTML launch page, but we wanted to try doing it in a single p5.js sketch. To start, Tommy created the launch page in one sketch and I finished the details on the main code. By using a simple ‘if’ statement tied to a button on the splash page, we were able to have users move cleanly from one state to the next. Huzzah!
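A minimal sketch of that state switch, with hypothetical names (gameState, runGame):

```javascript
let gameState = 'splash';
let playButton;

function setup() {
  createCanvas(windowWidth, windowHeight);
  playButton = createButton('PLAY');
  playButton.position(width / 2 - 40, height / 2);
  playButton.mousePressed(function () {
    gameState = 'play'; // the single 'if' gate flips here
    playButton.hide();
  });
}

function draw() {
  if (gameState === 'splash') {
    background(240);
    fill(0);
    textAlign(CENTER);
    textSize(24);
    text('Frame It Up: read the instructions, then press PLAY', width / 2, height / 3);
  } else {
    runGame(); // the original single-loop sketch, moved into its own function
  }
}

function runGame() {
  background(0);
  // ...camera, prompt and capture code goes here...
}
```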

The coding details of this project were fun mini-challenges. We attempted to make everything proportional to the display, such as the text and video size and placement. A fun example is that the button size is tied proportionally to the font size, which is tied proportionally to the overall number of pixels in the canvas. Another fun detail was randomness. The colours are all randomly generated, giving the game a fun look that’s different every time. However, in our tests, users complained that the text often blended in with the background and became difficult to read. We set some rules to enforce that the randomly generated colours have a specified minimum difference in hue. Changing the p5.js colour mode to a hue-based system instead of RGB made colour picking of this nature possible. Making the sounds random was a larger challenge than anticipated. Generating a random hue is one thing, but randomly selecting from a pool of sound clips is another. With sound, we wanted to generate a fun and chaotic reinforcement of the experience. We wanted each device to emit sounds distinct from the next. To achieve this, each device loads ten random sounds from a pool of fifty-one. At each sound cue, the code randomly selects one of these ten files for playback. Loading all fifty-one sounds would have increased the loading time and made the experience fairly buggy, considering there’s already a live video input in play. This seemed like a good compromise.
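Two of those details sketched together; the minimum hue gap and the sound file names are assumptions, not our exact values:

```javascript
const MIN_HUE_GAP = 90;   // assumed minimum hue difference, in degrees
let bgHue, textHue;
let localSounds = [];     // this device's ten sounds, drawn from the pool of fifty-one

function preload() {
  // each device loads its own random subset, so devices sound different (requires p5.sound)
  for (let i = 0; i < 10; i++) {
    const n = Math.floor(Math.random() * 51);
    localSounds.push(loadSound('sounds/laugh_' + n + '.mp3')); // placeholder file names
  }
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  colorMode(HSB, 360, 100, 100); // hue-based colour picking

  bgHue = random(360);
  textHue = random(360);
  // re-roll the text hue until it is far enough from the background hue,
  // measuring the distance around the colour wheel
  while (min(abs(textHue - bgHue), 360 - abs(textHue - bgHue)) < MIN_HUE_GAP) {
    textHue = random(360);
  }
}

function playRandomCue() {
  random(localSounds).play(); // pick one of this device's ten files at each cue
}

function draw() {
  background(bgHue, 60, 95);
  fill(textHue, 80, 90);
  textAlign(CENTER);
  textSize(width / 20); // proportional to the display
  text('FRAME IT UP', width / 2, height / 2);
}
```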

Playing with sounds

Finally, we changed the focusArray from a list of body parts to a list of gestures, which included positive, neutral and negative gestures. We decided that this would be more interesting, as people would have to negotiate even more with other players in order to capture their photograph. It acts as a consensual prompt, which mitigates the issues of privacy. We also decided to keep the 1-minute timer instead of speeding it up (as suggested by our playtest). Within the minute, the player must capture as many different gestural prompts as possible. Lastly, using gestural prompts instead of body-part hunting creates more of a dance, which was more fitting to our conceptual framework of contemporary dance.

 

Project Context

While doing some initial research for the project, we were immediately drawn to the relationships between kinaesthetics, human bodies, dance and choreography, and camera technology.

Although Frame It Up is a game, we were more interested in the choreographic outcomes of playing it. We found that the game was able to generate random acts of choreography, which materialized our interest in humans’ relationship with technology in the form of dance. We deliberately decided to use the camera as the main device connecting the players, because the act of taking someone’s picture is inherently violent (Sontag 1977), and we wanted to explore this violence through dance and play. We were informed by Jane Desmond’s idea that how we move, and how one moves in relation to others, comes from a place of desire (Desmond, p. 6), and by gestus, a theatre technique created by director Bertolt Brecht that understands gesture as an integral part of the human character, its wishes and desires (Baley 2004). We wanted to investigate how we move with and amongst each other when our violent technological devices have become both embedded within, and extensions of, how we express desire.

Although it wasn’t our original idea, carrying the laptops around intensely highlighted our increasingly posthuman bodies. The soundtracks we used were all compiled from laugh tracks, which call attention to human happiness, playfulness and desire, but also to human violence and brutality. We also looked to the works of choreographer Pina Bausch. Bausch’s work highlights the violence of men and the suffering and oppression of women to an incredibly uncomfortable degree. Her work “forces her audiences to confront discomfort: they are painful to look at but impossible to turn away from” (Avadanei, p. 123). Using Susan Sontag’s understanding of the camera as a weapon, dance theory and Pina Bausch’s work, our goal is for Frame It Up to be both a playful game and a tool for generating choreography that explores the relationship between privacy, human desire and technology.

 

Bibliography

Avadanei, Naomi J. “Pina Bausch: An unspoken explorations of the human experience.” Women & Performance: a journal of feminist theory, vol. 24, no. 1, 7 May 2014, pp. 123–127, doi:10.1080/0740770X.2014.894289.

Baley, Shannon. “Death and Desire, Apocalypse and Utopia: Feminist Gestus and the Utopian Performative in the Plays of Naomi Wallace.” Modern Drama, vol. 47, no. 2, Summer 2004, pp. 237–249, doi:10.1353/mdr.2004.0018.

Desmond, Jane, editor. Dancing Desires. The University of Wisconsin Press, 2001.

Sontag, Susan. On Photography. Picador, 1977.