
Cross the Dragon – An Interactive Educational Exhibit


Project Name: Cross the Dragon

Team Members: Norbert Zhao, Alicia Blakely, and Maria Yala

Summary:

Cross the Dragon is an interactive art installation that explores economic change in developing countries and the use of digital media to create open communication and increase awareness of economic investment from global powers in developing countries. The main inputs in the piece are a word find game on a touch interface and an interactive mat. When a word belonging to one of the four fields (Transport, Energy, Real Estate, or Finance) is found, a video is projected onto a touch-responsive mat. Touching the mat initiates a second video that responds to the found word. The interactive mat plays video through projection mapping. To interact with the mat again, one has to find another word. We left the information in the videos open to interpretation so as to keep it unbiased and to build a gateway to communication through art and digital gaming practices.

What we wanted to accomplish:

Through this interactive installation, the idea was not to impose preconceived notions about the educational information provided. The installation is designed to encourage positive thought through touch, infographic video, and play. Through this interface we can conceptualize and promote discussion of information that is not highly publicized, widely accessible, or generally discussed in Canada.

Ideation & Inspiration:

Ideation

This project was inspired by a story shared by a member of our cohort. She described how Chinese companies are building a new artificial island off the beach in downtown Colombo, her hometown, and are planning to turn it into Sri Lanka's new economic hub. At the same time, in the southern port of Hambantota, the Sri Lankan government borrowed more than $1 billion from China for a strategic deep-water port but couldn't repay the money, so it signed an agreement entrusting the management of the port to a Chinese state-owned company for 99 years.

For us, such news was new and shocking. With China's economic growth and increasing voice in international affairs, especially after the Belt and Road Initiative was launched in 2013, China began to carry out a variety of large investment projects around the world, especially in developing countries in Asia and Africa, where Chinese investment in infrastructure projects has peaked. We also discovered a series of reports from The New York Times, How China Has Become a Superpower, which contains detailed data about China's investment in other countries and project details.

This project therefore focused on the controversy around the topic: some people think these investments have helped local economic development, while others consider them neo-colonialism. From the beginning of concept development we knew this topic would have an awareness aspect. It was important to portray a topic that has a profound effect on the social and cultural lives and identities of people across the globe, and that is heterogeneous in the sense that it stems into other socioeconomic conditions. After discussion and data research, we decided to focus on China's growing influence, especially economic influence, in Africa.

Finally, we decided to explore this topic through interactive design. We came up with the idea of creating a mini-exhibition through which visitors could explore the story behind the topic by interacting with a game. When visitors first encounter the exhibition, they are given no detailed information about it, but after a series of game interactions, detailed information about the exhibition theme is presented in the form of intuitive visual design. The resulting self-exploration process gives visitors a deeper impression of the topic.

Inspiration

These three interactive projects were chosen because of how they combine an element of play and the need for discovery in an exhibition setting. They engage the audience both physically and mentally, which is something we aim to do with our own project.

Case Study 1 – Interactive Word Games

An interactive crossword puzzle made for the National Museum in Warsaw for their “Anything Goes” exhibit that was curated by children. It was created by Robert Mordzon, a .NET Developer/Electronic Designer, and took 7 days to construct.


Case Study 2: Projection Mapping & Touch interactions

We were interested in projection mapping and explored a number of projects that used projection mapping with board games to create interactive surfaces that combined visuals and sounds with touch interactions.


Case Study 3: Interactive Museum Exhibits

ArtLens Exhibition is an experimental gallery that puts you, the viewer, into conversation with masterpieces of art, encouraging engagement on a personal and emotional level. The exhibit features a collection of 20 masterworks of art that rotate every 18 months to provide new, fresh experiences for repeat visitors. The art selection and barrier-free digital interactives inspire you to approach the museum's collection with greater curiosity, confidence, and understanding. Each artwork in ArtLens Exhibition has two corresponding games in different themes, allowing you to dive deeper into understanding the object. ArtLens Exhibition opened to the public at the Solstice Party in June 2017.


Technology:

We combined two of our earlier projects, FindWithFriends and Songbeats & Heartbeats, for our final project. What drew us to these two projects were their interactions. We wanted to create an educational exhibition with a gamified component that encourages discovery, much like the Please Touch Museum.

Interactions:

We combined the touch interactions from the wordsearch & interactive mat.

Components:

P5, Arduino, PubNub, Serial Connection

Brainstorm


Team brainstorming the user flow and interactions


Refined brainstorm diagram showing user flow, nodes, and interactions

How it works:

The piece works like a relay race: one interaction on an iPad triggers a video projection onto an interactive mat. When a sensor on the mat is touched, it triggers a different projection showing the audience more data and information.

The audience is presented with a wordsearch game in a p5 sketch (SKETCH A) with four keywords, "Transport", "Energy", "Real Estate", and "Financial", representing the industries in which China has made huge investments. Once a word is found, e.g. "Transport", a message is published to PubNub and received by a second p5 sketch (SKETCH B), which plays a projection about transport projects. When the audience touches the mat, the sensor value (ON/OFF) is sent over an Arduino/p5 serial connection to SKETCH B, which stops the transport projection and displays more information about China's transport projects in different African countries.
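In outline, the relay between the two sketches might look like the following. This is a minimal sketch assuming the PubNub v4 JavaScript SDK; the keys, channel name, and uuid are placeholders rather than the project's actual configuration.

    // Shared setup in both sketches (placeholder keys).
    const pubnub = new PubNub({
      publishKey: 'pub-c-placeholder',
      subscribeKey: 'sub-c-placeholder',
      uuid: 'cross-the-dragon',
    });

    // SKETCH A: publish the industry whose word was just found.
    function sendIndustry(currIndustry) {
      pubnub.publish({
        channel: 'crossthedragon',
        message: { industry: currIndustry }, // e.g. "Transport"
      });
    }

    // SKETCH B: receive the industry and start its overview video.
    pubnub.addListener({
      message: function (event) {
        setupProjections(event.message.industry); // described in Step 2 below
      },
    });
    pubnub.subscribe({ channels: ['crossthedragon'] });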

Step 1: Sketch A – Wordfind game

The viewer's initial interaction with the "Cross the Dragon" exhibit begins with the wordfind game, created using p5.js. The gameboard is built from nested arrays that form the word find matrix. Each tile on the board is created from a Tile class with the following attributes: x, y coordinates; RGB color values; a color string description based on the RGB values; a size for its width and height; the booleans inPlay, isLocked, and isWhite; and a tile category that indicates whether the tile is for Transport, Finance, Real Estate, or Energy.

To create the gameboard, three arrays were used: one containing the letters for each tile; another containing values indicating whether a tile was in play or not, made up of 1's and 0's (tiles in play, i.e. tiles containing letters of the words to be found, were marked with 1's, and decoy tiles with 0's); and a last array indicating the tile categories using a letter, i.e. T, F, R, E, and O for the decoy tiles. The matrix was created by iterating over the arrays using nested for loops.


The arrays used to create the game board tile matrix of clickable square tiles
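Condensed to a 3×3 corner, the construction described above might look like this sketch, assuming a Tile class with the attributes listed earlier; the array contents are illustrative stand-ins for the full 11×11 game data.

    // Letters, in-play flags, and categories for each tile (sample data).
    const letters    = [['T','R','A'], ['N','S','P'], ['O','R','T']];
    const inPlayMap  = [[1, 1, 1],     [1, 1, 1],     [1, 1, 1]];     // 1 = word tile, 0 = decoy
    const categories = [['T','T','T'], ['T','T','T'], ['T','T','T']]; // T/F/R/E, O = decoy

    const tiles = [];
    const TILE_SIZE = 50;

    function buildBoard() {
      for (let row = 0; row < letters.length; row++) {
        for (let col = 0; col < letters[row].length; col++) {
          const t = new Tile(col * TILE_SIZE, row * TILE_SIZE, TILE_SIZE);
          t.letter = letters[row][col];
          t.inPlay = inPlayMap[row][col] === 1; // decoys stay false
          t.category = categories[row][col];
          tiles.push(t);
        }
      }
    }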


Generating the 11×11 game board and testing tile sizes

Once the tile sizes were determined, we focused on how the viewer would select the words for the four industries. The original Find With Friends game catered to multiple players, identifying each with a unique color. Here, however, there is only one input point, an iPad, so we decided to have just two colors show up on the game board: red to indicate a correct tile and grey to indicate a decoy tile. When the p5 sketch initializes, all tiles are generated as white and marked with the booleans inPlay and isWhite. When a tile is clicked and its inPlay value is true, it turns red. If its inPlay value is false, it turns grey.


Testing that inPlay tiles turn red when clicked

The image below shows testing of the discover button. When a word is found and the discover button is clicked, a search function loops through the gameboard tiles, counting the tiles that are inPlay and have turned red; a tally of the clicked tiles is recorded in four variables, one for each industry. There are 9 Transport tiles, 6 Energy tiles, 10 Real Estate tiles, and 7 Finance tiles. Once the loop completes, a checkIndustries() function checks the tallies. If all the tiles in a category have been found, the function sets a global variable currIndustry to the found industry and then calls a function to pass that industry to PubNub. When a tile is found to be in play and clicked, it is locked so that the next time the discover button is clicked, the tile is not counted again.
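A sketch of that discover-button pass, assuming global tallies that persist across clicks and the helper names used above (sendIndustry() is the PubNub publish step):

    // Global tallies, one per industry, persisting across discover clicks.
    let transport = 0, energy = 0, realEstate = 0, finance = 0;
    let currIndustry = null;

    function discoverPressed() {
      for (const t of tiles) {
        if (t.inPlay && !t.isWhite && !t.isLocked) { // red and not yet counted
          if (t.category === 'T') transport++;
          else if (t.category === 'E') energy++;
          else if (t.category === 'R') realEstate++;
          else if (t.category === 'F') finance++;
          t.isLocked = true; // never count this tile again
        }
      }
      checkIndustries();
    }

    function checkIndustries() {
      // Tile counts per industry: 9 Transport, 6 Energy, 10 Real Estate, 7 Finance.
      if (transport === 9)        currIndustry = 'Transport';
      else if (energy === 6)      currIndustry = 'Energy';
      else if (realEstate === 10) currIndustry = 'Real Estate';
      else if (finance === 7)     currIndustry = 'Finance';
      if (currIndustry) {
        sendIndustry(currIndustry); // publish to PubNub
        currIndustry = null;        // avoid republishing on the next click
      }
    }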


Testing that inPlay tiles are registered when found and that already found tiles are not recounted for the message sent to PubNub.

Step 2: Sketch B – Projection Sketch – Part 1

When the sketch initializes, a logo animation video, vid0, plays on the screen and a state variable, initialized as 0, is set to 1 in readiness for the next state, which plays video 1, a general information video on a found industry.

When the second p5 sketch receives a message from PubNub, it uses the string in the message body indicating the current industry to determine which video to play. The videos are loaded in the sketch's preload function and played in the body of the HTML page crossthedragon.html. During testing we discovered that we had to hide the videos using CSS and show them only when we wanted to play them, re-hiding them afterwards, because otherwise they would all be drawn onto the screen overlapping each other. When the sketch loads, videos are added to two arrays: one holding the initial videos and another holding the secondary videos that provide additional information. The positions in both arrays are the same for each industry: Transport at index 0, Energy at 1, Real Estate at 2, and Finance at 3.

Once a message is received, a function setupProjections(theIndustry) is called. The function takes the current industry from the PubNub message as an argument and uses it to determine which video should be played, setting the values of the globals vid1 and vid2 by pulling the matching entries from the two video arrays, e.g. if Transport was found, vid1 = videos1[0] and vid2 = videos2[0].

A function makeProjectionsFirstVid() is then called. This function stops the initial "Cross the Dragon" animation and hides it, then hides vid2 and plays vid1. It then updates a global variable state to 2 in readiness for the second, in-depth informational video.

Note: vid0 only plays when state is 0, vid1 only plays when state is 1, and vid2 only plays when state is 2.
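Put together, the state flow of Sketch B might look like the following sketch. The video variables and arrays follow the description above, and hide()/show()/play()/stop() are p5 MediaElement methods.

    let state = 0; // 0 = logo, 1 = overview pending, 2 = in-depth pending

    const industries = ['Transport', 'Energy', 'Real Estate', 'Finance'];

    function setupProjections(theIndustry) {
      const i = industries.indexOf(theIndustry);
      vid1 = videos1[i]; // general overview for the found industry
      vid2 = videos2[i]; // in-depth follow-up for the same industry
      makeProjectionsFirstVid();
    }

    function makeProjectionsFirstVid() {
      vid0.stop(); vid0.hide(); // stop the "Cross the Dragon" logo animation
      vid2.hide();              // keep the in-depth video hidden for now
      vid1.show(); vid1.play();
      state = 2;                // ready for the mat-triggered second video
    }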

Step 2: Sketch B – Projection Sketch – Part 2: Arduino over a serial connection

The second, in-depth video is triggered whenever a signal is sent over a serial connection from the Arduino when the viewer interacts with the touch-sensitive mat. Readings from the three sensors are sent over the serial connection to the p5 sketch. During testing we determined that using a higher threshold for the sensors had the desirable effect of reducing the number of messages sent over the serial connection, speeding up the p5 sketch and reducing system crashes. We set the code up so that messages were only sent when the total sensor value recorded was greater than 1000. The message was encoded in JSON format. The p5 sketch parses the message and uses the sensor indicator value, either 0 or 1, to determine whether to turn on the second video: 0 means OFF and the video is not triggered; 1 means ON and the video is triggered. The makeProjectionsSecVid() function starts the video: if the state is 2, vid1 is stopped and hidden, and vid2 is shown and played on a loop. An isv2Playing boolean is set to true and is used to determine whether to restart the video, preventing the sketch from jumping through videos if one is already playing.
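The p5 side of that handler could look like the sketch below, assuming the p5.serialport library and an Arduino that writes one JSON line per reading only when the summed sensor value exceeds 1000; the port name and the "touch" field are placeholders.

    const serial = new p5.SerialPort();
    serial.open('/dev/tty.usbmodem-XXXX'); // placeholder port name
    serial.on('data', gotSerialData);

    let isv2Playing = false;

    function gotSerialData() {
      const line = serial.readLine().trim();
      if (!line) return;                    // ignore empty reads
      let reading;
      try { reading = JSON.parse(line); }   // e.g. { "touch": 1 }
      catch (e) { return; }                 // skip partial lines
      if (reading.touch === 1) makeProjectionsSecVid();
    }

    function makeProjectionsSecVid() {
      if (state !== 2 || isv2Playing) return; // only once, after vid1 started
      vid1.stop(); vid1.hide();
      vid2.show(); vid2.loop();  // in-depth video plays on a loop
      isv2Playing = true;        // prevents jumping through videos
    }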

Electronic Development 

While choosing materials I decided to use a force-sensitive resistor (FSR) with a round, 0.5″-diameter sensing area. This FSR varies its resistance depending on how much pressure is applied to the sensing area: the harder the force, the lower the resistance. When no pressure is applied, its resistance is larger than 1MΩ, and it can sense applied force anywhere in the range of 100g to 10kg. To make running power along the board easier, I used an AC-to-DC converter that supplied 3V and 5V power along both sides of the breadboard. Since the FSR sensors are plastic, some of the connections came loose in transit, and one of the challenges was having to replace the sensors a few times. When this occurred I would follow up with quick testing through the Arduino serial monitor to make sure all sensors were active. To save time I soldered a few extra sensors to wires so the old ones could be switched out easily if they became damaged.


Materials for the Interactive Mat Projection

  • Breadboard
  • Jumper cables
  • Flex, Force, & Load Sensor x 3
  • YwRobot Power Supply
  • Adafruit Feather ESP32
  • Wire
  • 4×6 piece of canvas material
  • Optoma Projector
  • 6 x 10k resistors

Video Creation Process

From the database of investment relationships between China and Africa, we extracted information for the four most representative investment fields: transport, energy, real estate, and finance. Transport and real estate are very typical, because two famous parts of China's infrastructure investment in Africa are railway and stadium construction. Energy is also an important part of China's global investment. The finance field corresponds to the most controversial part of China's investment: when the recipient country cannot repay a huge loan, it needs to exchange other interests. Sri Lanka's port is a typical example.

Initially, we wanted to present the investment data in the four fields through infographics, but after discussion we decided that video was a more visual and attractive way to present it, so we made two videos for each field. When visitors find the correct word for a field, they are shown the general situation of China's investment in the world and in Africa in that field (video 1), including data, locations, times, and so on. When visitors touch the mat, the projector plays a more detailed video about the field (video 2), such as details of specific projects.

For video 1, we used Final Cut to animate infographics produced in Adobe Illustrator, adding representative project images for the field in the latter half of the video, so that visitors gain a general understanding of the field.

For video 2, we used Photoshop and Final Cut to edit representative project images for the field and then added keywords about each project to the images, so that visitors can have a clear and intuitive understanding of these projects.

The Presentation

The project was exhibited in a gallery setting in the OCAD U Graduate Gallery space. Below are some images from the final presentation night.


Setting up the installation


People interacting with the Cross the Dragon installation

Reflection and Feedback

Many of the members of the public who interacted with the Cross the Dragon exhibit were impressed by the interactions and appreciated the educational qualities of the project. Many stuck around to talk about the topics brought up by the videos, asking to know more about the projects, where the information came from, and how the videos were made. Others were more interested in the interaction alone, but most participants engaged in open-ended dialogue without being prompted. Overall, feedback was positive. People seemed really interested in changing the informational video after finding a word in the puzzle. Some participants suggested slowing down the videos so that they could actually read all the information in the text.

For future iterations of this project, we would like to explore projection mapping further so that we can make the interactive mat more engaging. We noticed that once people found out they could touch the mat, they tended to keep touching and exploring it. We had spoken about including audio, and text with animation, earlier in our brainstorming, and adding more touch-sensitive areas on the mat would be a good way to include these and create more interactions. It was also suggested that we project the videos onto a wall as well, so that people around the room would be included in the experience without having to be physically at the exhibition station.

References

Code Link on Github – Cross The Dragon

P5 Code Links:

Hiding & Showing HTML5 Video – Creative Coding

Creating a Video array – Processing Forum

HTML5 Video Features – HTML5 Video Features

Hiding & Showing video – Reddit JQuery

Reference Links:

[1] https://learn.adafruit.com/force-sensitive-resistor-fsr/using-an-fsr

[2] http://osusume-energy.biz/20180227155758_arduino-force-sensor/

[3] https://gist.github.com/mjvo/f1f0a3fdfc16a3f9bbda4bba35e6be5b

[4] http://woraya.me/blog/fall-2016/pcomp/2016/10/19/my-sketch-serial-input-to-p5js-ide

[5] https://www.nytimes.com/interactive/2018/11/18/world/asia/world-built-by-china.html

[6] http://www.sais-cari.org/

[7] http://www.aei.org/china-global-investment-tracker/


FindWithFriends

By Maria Yala

Creation & Computation – Experiment 4 / Networking


FindWithFriends is a collaborative web-app game built using p5.js and PubNub, a data stream network used to send and receive messages between players' devices. When the game starts, each player is presented with an identical game board made up of a matrix of clickable tiles and a list of words to find; each player is also assigned a random color to differentiate them from other players. Words are arranged on the board in varying directions: forwards, backwards, upwards, downwards, and diagonally. Players can play collaboratively or competitively. Every time a player clicks a tile, the tile changes to the player's assigned color. Once a player finds a word, they can lock their tiles to prevent other players from stealing tiles they have found. If a player clicks an unlocked tile, it turns white again. When the 'lock tiles' button is clicked, the player's score is calculated and drawn on the screen.

Background

For this project, I wanted to learn more about creating custom classes in JavaScript, as I didn't have much experience with them before. Additionally, I was drawn to the idea of working visually with objects in a matrix when we went over some of the examples in class. I wanted to challenge myself to learn more about custom classes and nested for-loops.

Ideation & Inspiration

My main inspiration was initially to create something to do with collaborative storytelling; however, since I was thinking of working with grids/matrices, I ended up choosing word games, particularly crosswords and word find puzzles. In the end, I settled on the idea of a collaborative word find game where each player is identified by a different color. This was inspired mainly by a word find game in a zine I had created and by the game WordsWithFriends.


Step 1 – The Tile Class

I created a custom class to represent the tile objects of the game board. In the beginning I made circular and square tiles using the ellipse() and rect() functions. Each tile had the following attributes: x and y coordinates and size dimensions. The Tile class also had a display function that, when called, would draw the tile at the object's x and y position.

Step 2 – The Matrix

I began by testing out creating a 3×3 matrix of square objects on the screen. This was done using a nested for loop that, on each inner iteration, would create a new tile, passing the x and y positions from the for-loop to a mapping function to generate a position on the screen. The matrix was restricted to 600 × 600 pixels, and these were the dimensions used to map coordinates.

An image showing testing of a 3×3, 15×15, and 30×30 matrix on a screen portion of 600×600 pixels.

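A minimal p5 version of that test might look like this (shown for the 3×3 case; the same loop was used at 15×15 and 30×30):

    const GRID = 3;        // also tested at 15x15 and 30x30
    const BOARD = 600;     // the fixed 600x600 board region
    const tiles = [];

    class Tile {
      constructor(x, y, size) { this.x = x; this.y = y; this.size = size; }
      display() { rect(this.x, this.y, this.size, this.size); }
    }

    function setup() {
      createCanvas(600, 600);
      for (let i = 0; i < GRID; i++) {
        for (let j = 0; j < GRID; j++) {
          // map grid indices to pixel positions inside the board region
          const x = map(i, 0, GRID, 0, BOARD);
          const y = map(j, 0, GRID, 0, BOARD);
          tiles.push(new Tile(x, y, BOARD / GRID));
        }
      }
    }

    function draw() {
      background(255);
      for (const t of tiles) t.display();
    }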

Step 3 – The Letters

I then created a second custom class to overlay letters over each tile. The Letter objects were also created using a nested for-loop iterating over an array of letters. Each Letter object had the following attributes: a letter and an x and y coordinate. The Letter class had one method, a display function that, when called, draws the letter on the corresponding tile. Below is the 3×3 sample array that was iterated over to generate a nested 3×3 array of Letter objects.

var test = ['A','P','J','X','E','I','C','O','W'];

An image showing the letters overlaid onto a 3×3 game board


Step 4 – Clicking on tiles

Upon success with drawing the tiles and letters onto the canvas, I began working on interacting with the objects on the screen so that a tile would change color when clicked. I began by creating a variable pcolor to hold the player's random color assignment, which is generated when the page is loaded. Using the mousePressed() function, I got the x and y position of the mouse when the user clicked and passed it to a new method in the Tile class, clickCheck(). This function uses the x and y coordinates of the player's click and the x and y coordinates of the tile, calculating the distance between the two to determine whether the player clicked within the tile's radius. If the click was within the radius, the color of the tile would change from white to the player's color. Here I also updated the Tile class, adding the clickCheck() function and color attributes, i.e. r, g, and b for RGB color mode. The nested for-loop that created the array of Tile objects was then updated to initially create tiles as white. Initially I was using mouseClicked() but changed it to mousePressed() because during testing I found that it worked on a laptop but not on the iPad.
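A sketch of that click handling, simplified to the white-to-color case described at this step; pcolor is assumed to be an object with r, g, and b values:

    class Tile {
      constructor(x, y, size) {
        this.x = x; this.y = y; this.size = size;
        this.isWhite = true;                 // tiles start out white
        this.r = 255; this.g = 255; this.b = 255;
      }
      clickCheck(mx, my, col) {
        // the click counts if it falls within half a tile of the center
        if (dist(mx, my, this.x, this.y) < this.size / 2 && this.isWhite) {
          this.r = col.r; this.g = col.g; this.b = col.b;
          this.isWhite = false;              // now owned by this player
        }
      }
      display() {
        fill(this.r, this.g, this.b);
        rect(this.x - this.size / 2, this.y - this.size / 2, this.size, this.size);
      }
    }

    function mousePressed() {
      for (const t of tiles) t.clickCheck(mouseX, mouseY, pcolor);
    }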

An image showing testing of clicking on tiles to change their color, with color randomly updating upon page refresh


Step 5 – Adding another player

Once basic game functionality was working for one player, I began integrating PubNub to allow for multiplayer functionality. I updated the mousePressed() function to publish a message to PubNub, and on receipt of a message back in the readIncoming() function, clickCheck() would be called. The message passed to and from PubNub carried information about the mouse x and y coordinates and the player color. These were then passed to the clickCheck() function, which would update the tiles accordingly.
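The round trip can be sketched like this, assuming a PubNub instance configured as in the class examples; channel and payload names are placeholders. The click itself only publishes, and the board is updated when the message comes back, so every player's screen, including the clicker's, changes the same way.

    function mousePressed() {
      // publish the click; don't touch the local board here
      pubnub.publish({
        channel: 'findwithfriends',
        message: { x: mouseX, y: mouseY, color: pcolor },
      });
    }

    function readIncoming(event) {
      const m = event.message;
      for (const t of tiles) t.clickCheck(m.x, m.y, m.color);
    }

    pubnub.addListener({ message: readIncoming });
    pubnub.subscribe({ channels: ['findwithfriends'] });

Calling clickCheck() only on receipt, and not in mousePressed(), is what later resolves the double-update bug described in Step 7.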

An image of a game screen showing a 2 player game where each player is a different color


Step 6 – Testing Tile Size When Matrix Size Increases

I changed the test array (as shown below) containing the letters, switching from the 3×3 grid to a 7×7 grid, to begin testing how the grid would look with words placed in different directions, and tested adjusting the size of the tiles so that the background would be covered.

Figuring out correct size for the tiles on a new, and larger game board


Testing the game using a game board with circular tiles and 3 players


I ended up removing the circular tiles as square tiles were more visually pleasing because they didn’t overlap with the other tiles.

Step 7 – The “Steal tiles” feature

Here I updated the Tile class, adding three new attributes: 'c', a string holding the color id of the tile (e.g. white would be 255255255); 'isLocked', a boolean checking whether a tile is locked; and 'isWhite', a boolean checking whether a tile is white or colored. I used the 'isWhite' variable to detect clicks on tiles: a tile is created as white, and when it is clicked, its color is changed and this variable is set to false. When a user clicks again on a tile they had already clicked, I compare the tile's 'isWhite' value and its current color id to determine whether it is being stolen or the click is simply an undo. If it is an undo click, the color reverts to white; if not, another player is stealing the tile. I had trouble implementing the undo click because I was calling the clickCheck() function twice, i.e. in my mousePressed() and again in my readIncoming() functions. This caused the color of the tile to change from the player's color to white and then remain white. I solved this by removing the call to clickCheck() in the mousePressed() listener.

An image showing testing the “undo click” feature and “steal tiles” feature

Step 8 – A “Lock tiles” feature

I used the 'isLocked' boolean to prevent a tile from being stolen by another player. I also added a button on the screen that, when clicked, would lock all the tiles that had the same color as the player. To do this I created three new methods: lock(), a function to pass the player color to the tile's lock function; updateLock(p), a function to update other players' screens, locking all tiles belonging to a particular color; and pubLock(), a function to publish a message indicating a lock has occurred. I also added a new boolean, 'lockPressed', used to determine what kind of message was being sent, i.e. a normal message or a lock message. If the lock id in a message was 0 it was a normal message; if it was 1 it was a lock message, and the readIncoming() function would call updateLock(p) for the player who initiated the lock.
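A sketch of the lock flow, reusing the placeholder channel above and extending readIncoming() from the previous step; the "lock" field plays the role of the lock id (0 = normal click message, 1 = lock message):

    function lockButtonPressed() {
      pubLock(); // announce that this player's tiles are now locked
    }

    function pubLock() {
      pubnub.publish({
        channel: 'findwithfriends',
        message: { lock: 1, color: pcolor }, // lock message, lock id = 1
      });
    }

    function readIncoming(event) {
      const m = event.message;
      if (m.lock === 1) {
        updateLock(m.color);       // lock that color's tiles on every screen
      } else {
        for (const t of tiles) t.clickCheck(m.x, m.y, m.color);
      }
    }

    function updateLock(p) {
      const id = '' + p.r + p.g + p.b;  // color id string, e.g. "255255255"
      for (const t of tiles) {
        if (!t.isWhite && t.c === id) t.isLocked = true;
      }
    }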

Step 9 – Final 15×15 matrix & Game Text (Hints)

I settled on a 15×15 grid for the final game and, using a word find puzzle from the Huffington Post, created a new array to hold the matrix. Text hints were drawn to the side of the game board with a 'lock tiles' button underneath. When choosing the theme for the puzzle, I wanted to pick a topic that was universal and a little controversial, so I ended up in the realm of politics and Donald Trump with the "Who Has Trump Offended?" puzzle from the Huffington Post. The 15×15 grid size was chosen as the best size that allowed legibility and precise clicks on a tablet and laptop using a fixed portion of the canvas.

"Who Has Trump Offended?" puzzle from The Huffington Post


The final 15×15 game board matrix array


The final 15×15 game board matrix


Step 10 – Adding a Score

I updated the code, adding a points array that is mapped to tiles the same way the letters are. In the points system I created, tiles in words positioned forwards or downwards were worth 1 point each, tiles in words going backwards were worth 2 points each, tiles in words positioned diagonally were worth 3 points each, and tiles in the hidden word were worth a bonus 2 points each. Tiles that were not part of the words were assigned 0 points. I added a global variable, score, to calculate a player's score based on locked tiles. Calculation was triggered when the lock button was clicked, and the score was then drawn on the screen in a large font colored in the player's color.
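The scoring pass might be sketched as follows, assuming each tile carries a points value from the points matrix and the color id convention from Step 7:

    let score = 0;

    function calculateScore() {
      const myId = '' + pcolor.r + pcolor.g + pcolor.b;
      score = 0;
      for (const t of tiles) {
        // points per tile: 1 forward/down, 2 backward, 3 diagonal,
        // 2 bonus for the hidden word, 0 for non-word tiles
        if (t.isLocked && t.c === myId) score += t.points;
      }
    }

    function drawScore() {
      fill(pcolor.r, pcolor.g, pcolor.b);
      textSize(64);
      text(score, 650, 300); // drawn beside the board in the player's color
    }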

The points matrix for the final 15×15 game board


Testing the score calculation – PRESS is 5 points, HISTORIANS is 10 points


Testing the game

Video link of test from the image below – Testing the game with 2 players playing on iPad & Laptop


End result of the test between two players


Presentation (Setup & the Experience)

For the presentation, I decided to use iPads instead of a combination of iPads and laptops, mostly to ensure mobility; I didn't want the players tied to one place. I provided five iPads, anticipating that people might form teams when playing and would pick up the iPads or move about with them. I did use one laptop, hooked up to a projector, so that onlookers would not be left out of the FindWithFriends experience. Projecting the game board on the wall ended up being beneficial to the presentation, as it heightened the game experience as people watched the tiles change color. Ultimately, the game experience changed when players realized they could steal each other's tiles; once words had been found, they proceeded to see who could get the most colored tiles on the screens. Below are some video and images from the presentation.


Video link to the presentation

Feedback & Future Plans

Some future adjustments I would like to make would be to add the other players' scores onto the individual screens so as to heighten the competitive aspect. I would also like to explore turning this into an installation piece. This is inspired by feedback from the cohort and by how the game turned from a simple word find into a different game when the players were notified that they could "steal" each other's tiles; it shifted from collaborative play to competitive play when I informed them of the lock button's functionality. Comments were also given about the choice of theme, Trump and politics. While playing, some players would spontaneously shout out things like "I found humanity" or "I'm stealing immigrants". The words being found have the potential to make people uncomfortable, and I would like to explore this further by playing with different contexts. It was also suggested that, because the game board was projected onto the wall, a new interaction could be having players pick tiles on the wall. This is also something I would like to explore in the future.

Links:

Code on github – FindWithFriends

Reference Links

https://github.com/DigitalFuturesOCADU/CC18/blob/master/Experiment%204/P5/pubnub/06A_LOCAL_ONLYcommonCanvas_animSpeed_inertia/sketch.js
https://randomcolor.lllllllllllllllll.com/
http://nikolay.rocks/2015-10-29-rainbows-generator-in-javascript
https://p5js.org/examples/color-color-variables.html
https://stackoverflow.com/questions/42101752/styling-buttons-in-javascript

https://p5js.org/reference/#/p5.Element/parent



ARTFUL PROTEST

By Maria Yala

Creation & Computation Experiment 3 / This & That


Project Description

Artful Protest is an interactive art installation, using p5 and Arduino over a serial connection, that invites participants to use art as a form of protest. It was inspired by this quote by Carrie Fisher:

“Take your broken heart, make it into art.”

The installation is a form of peaceful protest, designed around the idea that the freedoms we enjoy every day are not guaranteed and that we have to constantly fight to keep them. In the installation, participants are presented with a screen projecting current problems in the world, e.g. Black people being brutally murdered by police, families and children being separated at the US border, or innocent people killed in churches, synagogues, and schools. There is also a cardboard protest sign that controls the images being projected. If the blank physical sign is picked up, i.e. a person begins protesting, the projection on the screen changes to a digital protest sign animation. The digital protest sign changes depending on how high or low the physical cardboard sign is held. If the cardboard sign is put down, the initial screen is projected again, inviting the participants to protest again.

Presentation

Process Journal & Reflection

Initially, I wanted to recreate a voting booth experience to communicate the idea that people need to get active and vote to create or push for local and global change. However, upon further reflection and consultation with a few members of my cohort, I realized I was overthinking the idea. I decided to keep it simple and focus on the experience to be created, not on the code or technology. I started to look at the experience on a wide scale before narrowing down how I would execute it. I knew I wanted to keep the political angle, and I wanted something visual and interactive, where a viewer would become a participant. I was thinking of the Women's March and protest signs and came up with the idea of a peaceful protest that used art.

[ Protest signs ] + [ Making one’s voice heard ] + [ Peaceful protest ] = Artful Protest

Inspiration

Once I had a general direction, I began looking for inspiration. Below are some of the images and work from other artists that inspired Artful Protest.


‘Imperfect Health: The Medicalization of Architecture’ Exhibition


Type/Dynamics installation at the Stedelijk Museum Amsterdam


Interactive Art Installation by Akansha Gupta

I was inspired by the idea of hidden information being revealed and text being projected on the screen in the art pieces and installations above. I took elements of these ideas and used them in Artful Protest, where protest signs are hidden and then revealed depending on how high the physical protest sign is held.

Arduino Hardware & Code

For this project, a simple Arduino setup was used to collect information about how high the cardboard sign was held. I used an SR04 ultrasonic sensor to collect the proximity data. I found this sensor very finicky; it would return strange readings. To fix this problem, I used a suggestion found in Kate Hartman's code found here: using the NewPing library from the Arduino resource library. The library helps receive cleaner values from the proximity sensor by providing a maximum reading for the sensor. I set my maximum to 200, so the range of values returned was 0-200cm. Additionally, the library provides a function to return the distance calculated in cm. Artful Protest uses a serial connection in which proximity data received from the Arduino as input is sent over to p5.


Serial Connection

I chose to send the distance data over to p5 using JSON (JavaScript Object Notation). I had some problems using pretty print when sending JSON over serial: on the p5 side I would get an error about unexpected whitespace. I was able to resolve this by removing the pretty print function and using printTo instead, i.e. p5Send.printTo(Serial);

P5 Animations

On the p5 receiving side, the proximity data is used to determine what mode the installation is in. There are two modes: watching and protesting. Using the received data and a global boolean variable 'isProtesting', which is initialised as false, I use the distance data to determine which mode to set. If the distance is less than or equal to 30cm, the mode is 'watching' and 'isProtesting' is set to false. If the distance is greater than 30cm, the mode is 'protesting' and 'isProtesting' is set to true.

In the draw loop, I then use the 'isProtesting' boolean to determine which animation to "play". I use a global variable 'animation', initialized to 0. Using the sensor value from the Arduino, I determine a new value for 'animation' based on the height of the physical protest sign.
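In sketch form, the mode and animation selection reads roughly like this; distance arrives from the Arduino in centimeters (0-200), and the drawX() helpers are hypothetical names for the four screens described below.

    let isProtesting = false;
    let animation = 0;

    function updateMode(distance) {
      if (distance <= 30) {        // sign is down: watching mode
        isProtesting = false;
        animation = 0;
      } else {                     // sign is up: protesting mode
        isProtesting = true;
        if (distance < 80)       animation = 1; // "Yuge Feminist" sign
        else if (distance < 100) animation = 2; // "Dump Trump" sign
        else                     animation = 3; // "LOVE" particle sign
      }
    }

    function draw() {
      if (animation === 0)      drawInitialScreen();
      else if (animation === 1) drawYugeFeminist();
      else if (animation === 2) drawDumpTrump();
      else                      drawLove();
    }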


Sketches and initial idea designs

Animation 0 – Initial Screen.

The screen below is drawn when the variable animation = 0. It is the initial screen drawn when the mode is ‘watching’. The screen is drawn using text that is confined to the width and height of the screen.


Animation 1 – Little Sign, Yuge Feminist Screen

This screen is drawn when the distance recorded is less than 80cm. The animated part, "Yuge Feminist", is drawn by changing the color of the text from black to white and back, creating the illusion that the text appears out of and disappears into the black background. The colors are generated using a sine function to produce values within the range of white to black.


Animation 2 – Dump Trump Screen

This screen is drawn when the distance recorded is less than 100cm. The animation is created by toggling the drawing of two string variables, "Dump" and "Trump". To create the toggling effect, I use a counter variable, dumpCnt, which is incremented by 1 every time animation 2 is drawn. I then check whether dumpCnt is odd or even using the modulo function: if dumpCnt is even, "Dump" is drawn; if it is odd, "Trump" is drawn. When I got this animation working, I realized the frameRate was too fast to get the effect I wanted, so in the setup function I set the frameRate to 2. However, this ended up slowing down my other animations, so I resolved it by setting a different frameRate for each animation within the draw function.
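A sketch of that toggle, with the per-animation frame rate applied inside the draw path (function name as in the hypothetical helpers above):

    let dumpCnt = 0;

    function drawDumpTrump() {
      frameRate(2);             // slow only this animation down
      background(0);
      dumpCnt++;                // incremented on every frame of animation 2
      textAlign(CENTER, CENTER);
      fill(255);
      textSize(120);
      // even counts draw "DUMP", odd counts draw "TRUMP"
      text(dumpCnt % 2 === 0 ? 'DUMP' : 'TRUMP', width / 2, height / 2);
    }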


Animation 3 – Love is Love Screen

This screen is drawn when the distance recorded is 100cm or greater. It shows an animation of an explosion of balls forming the word 'LOVE'. This is done by tracing the font used to draw 'LOVE' on the screen; the tracing is then turned into points using the x and y coordinates. The animation was based on The Coding Train steering challenge, which can be found here. Each point is initially drawn at a randomly generated x, y coordinate, then moves to its target position, which is assigned according to the tracing of the original text. This is what creates the animation of the points exploding on the screen and moving to a set location. The points are created as objects of a Vehicle class, held in an array that is iterated over to draw them on the screen. Each vehicle has a target x, y coordinate, hue (color), starting position, velocity, acceleration, radius, max speed, and max force.

To assign colors to each point/vehicle, I set the color mode to HSL, and while iterating over the points as the vehicles get their targets assigned, I assign a color and increment a hue variable, causing the rainbow effect of the points.
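A condensed version of the whole animation might look like the sketch below. The font path is a placeholder, and the steering is reduced to a simple damped pull toward each target; the original follows the fuller Coding Train vehicle behavior with acceleration, max speed, and max force.

    let font;
    const vehicles = [];

    function preload() {
      font = loadFont('assets/example-font.otf'); // placeholder font file
    }

    function setup() {
      createCanvas(800, 400);
      colorMode(HSL);
      // trace the text into points and give each point a vehicle
      const pts = font.textToPoints('LOVE', 100, 280, 200, { sampleFactor: 0.2 });
      let hue = 0;
      for (const p of pts) {
        vehicles.push(new Vehicle(p.x, p.y, hue));
        hue = (hue + 2) % 360; // rainbow spread across the points
      }
    }

    function draw() {
      background(0);
      for (const v of vehicles) { v.update(); v.show(); }
    }

    class Vehicle {
      constructor(tx, ty, hue) {
        this.pos = createVector(random(width), random(height)); // random start
        this.target = createVector(tx, ty); // position traced from the text
        this.vel = createVector(0, 0);
        this.hue = hue;
      }
      update() {
        // damped pull toward the target (simplified "arrive" steering)
        const desired = p5.Vector.sub(this.target, this.pos).mult(0.05);
        this.vel.add(desired).mult(0.9);
        this.pos.add(this.vel);
      }
      show() {
        noStroke();
        fill(this.hue, 80, 60);
        circle(this.pos.x, this.pos.y, 8);
      }
    }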


Designing the protest sign

The protest sign was created using a sheet of cardboard. I chose cardboard as it is a material used a lot during protest marches; cardboard signs featured heavily during the 2017 Women's March, which was my main source of inspiration. I chose not to add a pole as a handle for the sign because I felt it restricted how a person might hold the sign. I created a small pocket at the back of the cardboard to hold my breadboard and sensor.

During testing, I realized the cardboard was quite heavy, and lifting it higher got significantly harder. I wasn't too bothered by this, as I felt it added another layer to the experience. During the presentation, I got a suggestion to tie this into the types of protest signs displayed, i.e. when addressing heavy issues, show those protest signs when the physical sign is held up highest.


The physical cardboard protest sign with pocket at the back for the breadboard

Future expansions

I would like to add more protest signs and possibly include an orientation sensor to detect other ways in which the sign is held. I also got the suggestion during critique that I could explore having multiple people interact and having that affect the protest signs shown. This would allow multiple people to join in on the interaction. I think this is an avenue I could explore further, perhaps adding more protest materials to the installation, not just one protest sign, and using an additional screen.

What I learned

I particularly enjoyed this experiment because it helped me simplify my work and focus on creating meaningful, organic interactions rather than showing off what a piece of tech can do. The quote below summarizes my findings 🙂

The artist should never contemplate making a work of art that is about something; a successful work of art can only ever be about nothing. The artist’s complete negation of intent thus creating a reflective surface into which the critic, curator or collector can gaze and see only himself – Sol LeWitt, Paragraphs on Conceptual Art, 1967

References & Resources

Artful Protest Code on Github

Code – Resources

Creative Coding – Text & Type

https://creative-coding.decontextualize.com/text-and-type/

Rainbow Paintbrush in p5

https://medium.com/@kellylougheed/rainbow-paintbrush-in-p5-js-e452d5540b25

The Coding Train – Steering Code Challenge

https://www.youtube.com/watch?v=4hA7G3gup-4

CC18 Digital Futures Class GitHub

https://github.com/DigitalFuturesOCADU/CC18/tree/master/Experiment%203

Images – Resources

https://www.newyorker.com/culture/jia-tolentino/the-somehow-controversial-womens-march-on-washington

https://www.dailymail.co.uk/news/article-5539595/Hundreds-fall-silent-outside-embassy-London-historic-anti-gun-protest-place-globally.html



The Emoji Ball

Creation & Computation : Experiment 2 (Multiperson)

by Carisa Antariksa, Nick Alexander, and Maria Yala

EmojiBall is a multiplayer physical ball game where the game pieces exert influence on the players. The components of the game are three EmojiBalls, three hoops, and an environment that affects the game’s outcome. The game is intended for a minimum of three players, and there is no necessary maximum number of players; however, each player must be assigned a hoop and an EmojiBall. The balls are reactive and have “moods” which will change based on how they are manipulated. Balls can be shaken, rolled, thrown, or otherwise handled in any way as the player decides. When a player scores with a ball in a “positive” mood they win a point, while scoring with a ball in a “negative” mood leads to an opponent losing points. The game is intended to be played in any space, regardless of size or surface. The distinct features of whatever space the game is played in will influence the available options players have with the balls, resulting in a unique game scenario every time.

Emojiball explanations

The Project:


Process

We began by brainstorming ideas: each team member gave ideas about what kinds of experiments would interest them, and then presented to the group why they were interested in the idea. We considered whether it would be a game or more of an art installation, and included ideas on what inputs and outputs the product would need.

Brainstorming

We then proceeded to come up with various products stemming from the set of ideas.

Initial ideas

After the brainstorming session, we were interested in two ideas: what would become EmojiBall (a ball game that would let players manipulate the mood of a ball and score points when they made the ball happy) and a wearable glove that would allow someone to send signals and secret messages to another person.

Sketches

We ended up choosing EmojiBall as it was the idea we were most excited about; it combined elements from each of our initial ideas: mood (Maria), emojis (Carisa), and ball games (Nick). EmojiBall would also allow us to explore a larger combination of inputs and outputs. Additionally, it would provide the most interaction for the players that wasn't completely tied to the device, as it incorporated a spatial component whereby the environment affected the balls' state. We planned on using light, sound, and motion sensors for inputs, and vibration sensors, LED lights, and speakers as outputs.

To further the collective concept we had another brainstorming session in which we went over the project and defined what would be needed to accomplish our task. During this session we also began to discuss possible game mechanics and form of the ball.

Our initial idea for EmojiBall was this: a game where you must manage the mood of three balls in order to score points. The game encourages players to use tactics that usually don't come into play in game spaces, such as manipulating the room's brightness, shouting, and even cuddling the game pieces.

Final Brainstorm

Once we had a general idea of what our project was, we decided to break tasks into chunks so that each of us would explore a different working of the project and teach it to the rest of the group. We decided to use a light sensor and a motion sensor as our inputs, and (green, yellow, red) LEDs and a speaker as our outputs.

Lights & Points

We decided that the ball would have three main mood states for our minimum viable product: Happy, Calm, and Angry. Each mood state would have a corresponding color: Happy (green LED), Calm (yellow LED), and Angry (red LED). Once we connected all the lights to the breadboard, we began by testing cycling through the three moods.

First, we tested the light sensor, setting up our breadboard so that when the light was bright (a reading of 500 and above) the green LEDs would turn on, and if the room was dim (a sensor reading under 500) the red LEDs would go on, the red indicating an angry ball and the green a happy ball.

Fritzing


Once we got the light sensor switching the ball's mood depending on the intensity of light, we wrote code that would alternate lighting each of the LEDs, mimicking the idea that the ball's mood changes over time, i.e. it would move from calm toward happy, back to calm, and then down to angry, in a looping fashion.


To affect the balls mood changing, we created a global variable, moodpoints , that would affect the ball’s mood. We decided that the ball’s moodpoints would max out at 1000 points and would be at least 0. Between 1000 moodpoints and 750 moodpoints, the ball would be happy. If the moodpoints were between 749 and 250, the ball would be in a calm state. And if the moodpoints were between 249 and 0, the ball would be in an angry state.

Sketchbook

Input from the light sensor was then used to affect the moodpoints: if the player moves the ball toward light they score 10 moodpoints, and if the player moves the ball to a darker environment they lose 10 moodpoints. A function, checkLight(), is called once per loop, where the light reading (dim or bright) determines whether the player scores or loses moodpoints. If the light reading is under 500, 10 points are lost; if it is over 500, 10 points are scored. Then, depending on the value of the moodpoints, a function is called, i.e. getHappy(), getCalm(), or getAngry(). Each of these functions turns on the corresponding light and plays an accompanying melody.

Orientation Sensor

After assembling the Adafruit BNO055 and downloading the libraries and driver, we installed and tested the sensorapi example supplied by Adafruit on their website. We experimented with the hardware and software to get a sense of what kind of readings it could supply and how they might be used for our purposes. We had worked with the sensor for some time before being informed by our instructors that it required a pair of 4.7kΩ pull-up resistors in order to be used safely. As it happened, we had been having trouble with our computers consistently detecting ports while the sensor was enabled, and installing the pull-up resistors solved this issue.

Our instructors also supplied us with demo code for the sensor which we experimented with but eschewed at that time as we were already comfortable with the code supplied by Adafruit.

Our next hurdle was determining how best to put the raw data returned by the orientation sensor into use. The sensor can return multiple types of data but we looked at it primarily for data it could return as a gyroscope and accelerometer. We noted that the sensor could return temperature and considered exploring temperature as an input to our device and as a game mechanic, but ultimately shelved it as being out of scope at this time.

In considering the application of the sensory data we considered the intended “personality” of the ball. We wanted a ball that likes to be moved gently, and gets angry if it is kicked, hit or thrown. For our purposes measuring acceleration seemed like the better choice over measuring rate of rotation. Acceleration could serve as a catch-all for fast movement, especially if the ball was tossed or otherwise moved in a way that prevented rotation. We noted that measuring acceleration would only return good data over very short periods – if the ball was, for example, thrown a great distance, it would accelerate at the beginning of the throw and at the end, but the period of time in the middle of the throw would likely not have enough of a rate of change in velocity to trigger our code. However, considering that we were working on a short timetable and demonstrating on a small scale, most interactions with the ball involving its velocity were likely to be brief – thus returning beneficial data nearly the entire time the ball would be in motion.

We instituted a threshold within the code of ±2 m/s² on any axis as triggering a change in moodpoints. The threshold itself is a variable, for quick adjustment during testing. Using the if (millis() - lastRead >= sampleRate) pattern suggested by Nick Puckett, we set the ball to test its acceleration twice per second. It was our intention that this low sample rate would discourage players from simply shaking the ball in order to anger it and instead encourage them to throw or roll the ball. As discussed below, this did not pan out in testing.

Melodies

Aside from the LED output, we decided that using the 8-ohm speaker to convey different tunes for happy, calm, and angry would be essential in enhancing the experience. Moods are often quickly associated with sound, and the strong connection between the two would play a large part in creating more effective game mechanics. Sound output would also bring these items to "life", since expressing feelings through sound personifies the balls.

To start, we identified how to read the notes in the pitches.h library used by the toneMelody example, which helped in creating custom melodies. There was also a difference in volume to note between the lower notes and the higher ones: the lower notes are much "quieter" than the higher ones. To fully utilize the potential of the default library, we researched different musical scales, from major to minor, and quickly realized that this library uses the major scale with sharp notes. Once the middle C of the major scale was found (NOTE_C4), playing around with the notes for happy and calm was easier.

Initially, we had a blue LED placed in the circuit for a “sad” mood and a sad tune was intended, but it was later scrapped. The choice to implement only happy, calm and angry was arbitrary and other moods could be implemented in future iterations of the game. 

There were a lot of instances tested with the toneMelody examples, such as:

  1. Altering the note durations between 2-16 and 1500-2000 overall to either speed up or prolong the notes in the melody sequence;
  2. Experimenting with the delay() within the tune itself, by either placing in a value or calling a 'pauseBetweenNotes' action;
  3. Placing a '0' between the notes (e.g. in the calm tune, the sequence became "NOTE_C4, 0, NOTE_C4, NOTE_G4, 0, NOTE_G4, NOTE_A4, 0, NOTE_A4, NOTE_G4") to pause briefly within the melody and create a more musical tune.

These were then verified and uploaded to a simple button-and-speaker circuit to repeatedly test the tunes and confirm they were suitable for the happy, calm, and angry outputs. These would activate as the color of the LED changed into the respective colors.


Testing different combinations of note sequences.

For each respective mood, we decided to go with a standard beeping alarm for angry rather than a custom one, a moderate tune for the calm mood, and a custom tune for the happy sound. With the high-pitched alarm came a sense of urgency that can affect player motivation, as opposed to what could have been a more robotic, angry melody. The calm tune was a familiar sound that stopped at just the right time to anticipate the change in mood the player is after. The happy tune was a custom sound indicating that the EmojiBall is communicating "I'm happy!", using a higher pitch at the end of the sequence. (This is well reflected in the main video above.)


Final note sequences used in the code.

These are then placed into the respective if statements, triggered once a certain threshold is reached for happy or angry. By defining the makeASound() function, each melody is called upon based on the mood changes. One minor problem we encountered in embedding this code was making sure to define the total number of notes in the sequence so they all played within the for() loop: within for (int thisNote = 0; thisNote < x; thisNote++), x needs to be replaced with the right number of notes for the specific mood, otherwise the sound comes out stifled. In the final code, the melodies synced well with the LED output, which allowed for a more "living" EmojiBall to be used in the game.

Initial Build


First complete circuit board. This one includes a voltage regulator (lower left) while subsequent builds did not.

The physical build of the balls was an exercise in trial and error. We felt it was important to solder our boards due to the physically demanding way they would necessarily be treated. None of us had soldered before and the first board, after being built and tested on a breadboard, took the better part of two days. After becoming familiar with the materials and apparatuses, making mistakes and undoing them, the first board was complete.


Early exploration of the final circuitry

Power management was an immediate issue. We explored several options for power before consulting with a peer, Omid Ettehadi, who gave us excellent advice and pointed us toward a simple 12V battery. He explained that the Arduino contains a built-in capacitor to regulate incoming voltage. He also suggested building a voltage regulator into our circuitry to regulate the voltage and take some strain off the chip; Omid warned us that it might overheat without an external regulator.

12V power source with voltage regulator

We built one prototype with the regulator and two without. While early testing showed no practical difference between the circuits, it eventually became clear that the balls without regulators drained battery power at a much higher rate than the one with it.

img_20181025_165117

img_20181024_165428
Whatever we used as a case had to be tough enough to withstand rough play but supportive enough to keep our components safe. We considered hamster balls for housing early on but were dissuaded by their expense, their lack of versatility, and advice from Reza Safaei, our fabrication guru. After discussing with Reza we settled on foam for our case, which could be easily shaped and hollowed to accommodate our components while preventing them from being jostled out of place. An added advantage of foam is its lightness, which preserves the ball’s portability and the player’s ease of use.

As a final touch, we covered our balls in a soft felt material, which gave them a pleasant tactile feel and encouraged players to be gentle with them. This had the added benefit of interacting well with the wood floor of our demonstration surface: the juxtaposition of hard and soft objects in the play space, and the wood’s amplification of the balls’ sounds, made for a pleasant experience that we did not explicitly design for.

img_20181026_145555

The three EmojiBalls


pasted-image-0
Bill of materials per unit

User Testing

Our user testing suffered as a result of our inexperience with fabrication. Our team’s familiarity with code allowed us to build a game engine whose parameters could be easily tweaked, and we planned to use testing to tune those parameters until the balls felt reactive and full of personality. However, we took longer than anticipated to complete a testable prototype and did not have enough time to test the product or the game to our satisfaction. Early iterative testing of the code on breadboards, before the components were soldered, pointed us in the right direction. We adjusted our thresholds and parameters to give the ball a sense of agency within the game and deliver immediate feedback to players: mood-point gains and losses from interaction were increased greatly from the values we had initially set, and the balls instantly felt more dynamic.
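To illustrate the kind of tunable mood logic we mean, here is a minimal sketch; all names, values, and thresholds are illustrative assumptions rather than our exact code:

// Hypothetical tuning constants: raising the gain/loss values was what
// made the balls feel more dynamic in testing.
const int HAPPY_THRESHOLD = 80;   // mood points above this -> happy
const int ANGRY_THRESHOLD = 20;   // mood points below this -> angry
const int SHAKE_PENALTY   = 15;   // mood lost when the ball is shaken
const int LIGHT_REWARD    = 10;   // mood gained when held under light

enum Mood { ANGRY, CALM, HAPPY };
Mood currentMood = CALM;
int moodPoints = 50;              // start neutral

void updateMood(bool wasShaken, bool inBrightLight) {
  if (wasShaken)     moodPoints = max(0, moodPoints - SHAKE_PENALTY);
  if (inBrightLight) moodPoints = min(100, moodPoints + LIGHT_REWARD);

  if (moodPoints > HAPPY_THRESHOLD)      currentMood = HAPPY;
  else if (moodPoints < ANGRY_THRESHOLD) currentMood = ANGRY;
  else                                   currentMood = CALM;
  // The LED color and makeASound() would be driven from currentMood here.
}

void setup() {}

void loop() {
  // In the real sketch, sensor reads would set these flags.
  updateMood(false, false);
  delay(100);
}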

Testing the game itself was another matter. Our intention was to keep the stated rules of the game simple and brief, letting interactions between players flow naturally from the behavior of the balls rather than from arbitrary rules. We found that players did not intrinsically understand the consequences of interacting with the balls in certain ways and had to have the parameters of the mood-shifting mechanic explained to them. At that point, they tended to discover a single action – shaking the ball or holding it under light – that most reliably caused the mood shift they wanted. We hoped that more adjustment or the addition of extra sensors might mitigate this, but we had prioritized fabrication over user testing and were not able to tune the ball and the rules of the game to the extent we had hoped.

Product Design + Next Steps

The goal of the EmojiBall project was to create game pieces that felt as if they had emotions and exerted their will on the players of the game.

The actual rules of the game were secondary to the interactions with the ball. We considered adding rules such as “no touching the ball you last scored with” to prevent continuous scoring, but we decided against new rules in order to keep players’ minds clear and encourage interactions arising organically from play. We wanted to keep the rule set broad so that players would not feel constrained and would be encouraged to try unusual things.

There are capabilities of the orientation sensor that we are not yet using and that could be explored in further iterations. Beyond letting us tweak gameplay through physical interaction with the ball, by measuring its current g-force or rate of rotation, the sensor can also measure ambient temperature. Affecting the temperature of the ball or the game space might make for an interesting game mechanic.
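For reference, reading those extra channels with the Adafruit_BNO055 library linked in the Resources looks roughly like this; the serial printout and sensor ID are our own illustrative choices:

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55);  // sensor ID 55, default I2C address

void setup() {
  Serial.begin(9600);
  if (!bno.begin()) {
    Serial.println("No BNO055 detected, check wiring");
    while (1);
  }
}

void loop() {
  // Linear acceleration (gravity removed): a proxy for how hard the ball was hit.
  imu::Vector<3> accel = bno.getVector(Adafruit_BNO055::VECTOR_LINEARACCEL);
  // Gyroscope: rate of rotation, useful for detecting rolling or spinning.
  imu::Vector<3> gyro = bno.getVector(Adafruit_BNO055::VECTOR_GYROSCOPE);
  // Ambient temperature in degrees C: the capability we have not used yet.
  int8_t temp = bno.getTemp();

  Serial.print("accel: ");  Serial.print(accel.magnitude());
  Serial.print("  gyro: "); Serial.print(gyro.magnitude());
  Serial.print("  temp: "); Serial.println(temp);
  delay(500);
}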

Foam was chosen for its lightness, its ease of use as a prototyping material, and its protective qualities. Early in testing, affixing the two foam halves together with dowels was adequate, but the increasingly rough play as we drew closer to the due date forced us to find a more secure closure; we settled on tape in the interest of time. Layering felt on the surface of the foam would also improve the mobility of these objects in the game. The felt had the further advantage of emphasizing the concept of a ball of emotion, a sentient “feeling” object. That change in tactile quality could affect the interactions between player and object, bringing about a sense of care for these balls that you are about to throw or roll into the hoop. With the code and core components now complete, another round of exploration and design is warranted to settle on the right material and closure system for the ball itself. Perhaps the ball could be designed as two spherical halves with grooves that fit into each other, or with one piece that screws over the other, locking the components in place.

To refine our existing prototypes, we would add voltage regulators to conserve battery life on the two prototypes that lack them. We would also continue to explore options for casings and ball sizes: how might the gameplay change if the balls bounced like basketballs or were as heavy as bowling balls? How might it change if the ball’s texture were further altered?

We also see a likely expansion of the project in testing more sensors. This is not necessarily a priority, but experimenting with different kinds of inputs (e.g. a microphone to allow coaxing into different moods) might refine the feeling of the ball as a discrete organism. Additionally, we would add RFID sensors to the goals and balls so that players are released from the burden of tracking who scored or lost points, where, and by how much. This way, players can focus on enjoying the game and interacting with each other. The ball could also be networked so that it reacts to stimuli outside the players’ control – for example, as suggested during the critique, it could change moods according to the weather.

img_4924

Resources

Critique Presentation Slides

EmojiBall Code on GitHub

Adafruit Orientation Sensor code on GitHub: https://github.com/adafruit/Adafruit_BNO055

Adafruit BNO055 Absolute Orientation Sensor. (n.d.). Retrieved from https://learn.adafruit.com/adafruit-bno055-absolute-orientation-sensor/arduino-code

Paul, E. (2018, June 19). Solfege: Why Do Re Mi Isn’t Just Child’s Play. Retrieved from https://www.musical-u.com/learn/solfege-do-re-mi-isnt-childs-play/

Friends-of-Fritzing Foundation. Fritzing. Retrieved from http://fritzing.org/ 

Circuit diagrams created with the Fritzing tool.

Play a Melody using the tone() function. (n.d.). Retrieved from https://www.arduino.cc/en/Tutorial/ToneMelody

All photographs taken October 2018.
