Excuse Me Accessories

Title: Excuse Me Accessories

Group: Savaya Shinkaruk, Tommy Ting, and Chris Luginbuhl


Project Description

Our product came out of brainstorming for our assignment, Portables / Multiples.

The goal of this experiment is to create robust interactive prototypes that we can make more than one of and that we want to travel with us. – Nick and Kate.

When coming up with ideas for this project, we came up with some really cool and interesting concepts, but none we personally wanted to test or use in our everyday lives. So, after lots of thinking over a few days, we came up with Excuse Me Accessories.

We go more into depth about the journey of our process in our blog below, but the overall description of our project is:

To help people with their productivity on any task or activity, we created the Excuse Me Accessories products. By using the Pomodoro Technique (work for 20 minutes, break for 5 minutes), people can follow this routine with their wearable or desktop accessory.

So, continue on to the rest of our page to read more about the Excuse Me Accessories team and our journey.


About team

Savaya Shinkaruk: Savaya Shinkaruk is a fashion stylist and journalist with a keen interest in blending components of the online fashion industry with design. She graduated with a BA in Communications in 2017 and is completing her MDes in Digital Futures at OCAD University.

Chris Luginbuhl: Chris Luginbuhl is an engineertist who likes to create new words and find new ways of using technology to help the world.

Tommy Ting: Tommy Ting is an artist and emerging video game designer, currently a first-year MFA student in Digital Futures at OCAD University studying game design and development.


BLOG: PROCESS JOURNAL

DAY ONE

DAY ONE OF OUR EXPERIMENT:

November 24, 2017

After everyone presented the projects for Experiment #4, Kate and Nick assigned us our group members and talked about the description for Experiment #5.

The three of us got paired together (Savaya, Chris, Tommy), and we decided to take the weekend to each come up with ideas for this project for Monday.

End of day one.

Thanks!


DAY TWO

DAY TWO OF OUR EXPERIMENT:

November 27, 2017

Today in class Kate went further into the project description for Experiment #5.

She went over the requirements and the deliverables for our project.

For this experiment we are to produce 3 copies of the prototype we are intending to code and design.

During class, each group was given roughly 45 minutes to come up with 5 ideas for their project. This is when we first started coming up with ideas for this experiment.

Here are the six ideas we came up with:

  • Night time to daytime t-shirt wear – exploring clothing that changes its appearance in light and dark environments – perhaps becoming more or less revealing, or adding lighting effects.
  • Linking Maps to Instagram to source the best Instagram photo opportunities in Toronto
  • Dance Wearables – clothes and accessories that produce sound and light effects in response to movement. Used for performance dance, and just dancing for fun.
  • Real life Social Media – wearing a small Eddystone/Physical Web/Puck.js beacon to broadcast your key interests via bluetooth around OCADU. You are alerted when someone within range (~10m) has overlapping interests. You then have to find the person and have a conversation to figure out your common interests.
  • Uncomfortable fashion – exploring fashion that places demands on the wearer, and negotiates with them to adjust their posture, movement and behaviour.
  • Now you see me, now you don’t – exploring the theme of wanting to be looked at or wanted to not be looked at in public. In particular, clothing that can tighten or loosen, fasten or unfasten, illuminate the wearer, or conceal.

The common thread we had for all of these ideas is that we want to make a wearable fashion item. It could be a shirt or an accessory. We are more interested in creating an accessory for this experiment.

Before class ended we had to pitch an idea (that we could change later on, and we did) to Kate so she knew the direction we were headed.

We talked to her about going with the theme of “now you see me, now you don’t”, and we wanted to explore this concept by creating a device that unlocks your garments at a certain time of day (like sunset), or a hat with lights that turn on at sunset.

These are some of the ideas we are running with, and we are going to meet on Wednesday, November 29, 2017 to settle on an idea.

End of day two.

Thanks!


DAY THREE

DAY THREE OF OUR EXPERIMENT:

November 29, 2017

Today we met in the DF lab to start to brainstorm more ideas for this experiment.

Here is a timelapse of us meeting and chatting about what we want to do:

The new idea we came up with is:

We are creating a wearable product: a navigation system that informs the user when they are travelling north. In addition, when daytime turns into night and people start to feel nervous walking around in the dark, our product will light up so the user feels safer when walking home.

We want to iterate this idea, but this is the initial concept we are going to start with and then build off of.

Some of the iterations we are looking to add are:

  • Have the navigation move in all directions: NORTH, SOUTH, EAST, WEST (like a compass)
  • Build the Puck.js on a magnet, so the user can easily move the item from their wrist, to neck, to finger.
  • For safety purposes: if the wearer encounters an unsafe situation, clicking the button will sound an alarm and blink the light so the attacker leaves.

We had to leave for class after making these notes and work on other assignments, so we are taking a break from this and will meet again tomorrow.

End of day three.

Thanks!


DAY FOUR

DAY FOUR OF OUR EXPERIMENT:

November 30, 2017

We met again today in the DF lab to go over the ideas we came up with yesterday.

We are still interested in our idea, so we are going to work on getting our proof of tech and proposal done for class tomorrow, as that is due.

We got to work on setting up the Puck.js (EMA) devices to Bluetooth.

Here is an image to show the pairing:

pairing-our-puck-js-device-to-bluetooth

To get our Puck.js (EMA) devices up and running, we followed the steps at this link to connect them to Bluetooth and start the tutorials on how to code: https://www.espruino.com/Puck.js+Quick+Start

Each Puck.js (EMA) device has a 4-digit number (from its MAC address), so we know which Puck.js (EMA) is linked to whose computer.

This is Savaya’s:

bluetooth-code

We are going to use the code from this link for our proof-of-tech code for class on Friday, Dec 1, 2017: http://forum.espruino.com/conversations/296979/?offset=25

Just kidding: in the end the code from the link above did not work, because we need to subtract the local magnetism of the area we are using the device in for it to calibrate, and we need to set which axis the device is mounted on. The original code was not robust enough.

Here is an image of our first try with the original code from the link above:

firsttrycompass

So, we have sourced new code, and here is the link: http://forum.espruino.com/conversations/297915/

We are playing around with this new code because it looks like it will be easier for us to set an axis with a direction.

map

In the end, we are using the code in the link above as a template and adding our own code to it. The code we are adding and changing helps calibrate for the axis of movement.
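As a rough illustration of the kind of calibration we mean (a hypothetical sketch, not the actual code in our repo), the idea is to subtract a fixed offset for the local magnetism, then convert the remaining magnetometer x/y readings into a heading:

```javascript
// Hypothetical sketch of compass calibration, not our actual WalkWear code.
// "offset" stands for the local magnetism we subtract before computing a heading.
function heading(mag, offset) {
  const x = mag.x - offset.x;
  const y = mag.y - offset.y;
  let deg = Math.atan2(y, x) * 180 / Math.PI; // angle of the field vector
  if (deg < 0) deg += 360;                    // normalise to 0-359 degrees
  return deg;
}

// Map a heading in degrees onto the four directions the device shows.
function cardinal(deg) {
  const dirs = ["NORTH", "EAST", "SOUTH", "WEST"];
  return dirs[Math.round(deg / 90) % 4];
}
```

On the real device the readings come from the Puck's magnetometer, and which two components you use depends on the axis the Puck is mounted on.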

Here is a link to the GitHub that has the code we worked on and Chris implemented for our project: https://github.com/ChrisLuginbuhl/WalkWear/blob/master/WalkWear.js

Here is a first trial video of the Puck.js (EMA) working with the code:

This code is what we will show for our proof of tech, due tomorrow (Fri, Dec 1). We are going to iterate on it as time goes on, but this is the starting point: it shows that when you move in a direction, the device tracks that and shows which direction you are headed – NORTH, SOUTH, EAST, WEST – as shown and talked about in the video above.

From there we started to work on our design ideas and noted what materials we will need for tomorrow.

Some of the design ideas we came up with for the Puck.js (EMA) are:

  • Original Idea: Magnetise the back side of the Puck.js (EMA) and have a matching magnet on the bracelet / watch, ring, and necklace. We like this magnet idea because it makes it easy for people to move the device from location to location without having to undo something and tie it up again. However, the issue is that the magnet will interfere with the compass too much, which would ruin the concept.
  • Prototype Idea: For Friday, Dec 1, we are going to do a prototype with velcro – which will showcase the same concept as the magnet, but won’t play with the calibration or compass.
  • Iteration Idea: We also like the idea of how Pop Sockets work. With Pop Sockets you can also purchase a car mount: you slide the Pop Socket into it, and it will hold your phone. We have attached images below to show what we mean:

 

pop-socket-mount popsocket

We also started to think of some working titles:

  • Wear and go
  • Navi – Direct
  • Walk Wear – WINNER!

For Friday Dec 1, here is a list of the materials we will need for our prototype:

  • Velcro
  • Wrist band – bring these products from home
  • Puck.js (EMA)

What is due for Friday Dec 1, 2017:

Proposal:

Working Title: Walk Wear

Group Members: Savaya Shinkaruk, Chris Luginbuhl, and Tommy Ting

Concept: For experiment number 5, Chris, Tommy, and Savaya are creating a wearable product: a navigation system that informs the user when they are travelling north. In addition, when daytime turns into night and people start to feel nervous walking around in the dark, our product will light up so the user feels safer when walking home.

Form: We are going to be making a wearable product that can be worn either on your wrist, hand, or neck. We are designing it so the device can be removed and then placed between your ‘bracelet / watch band’, necklace band, or ring band. For our prototype we are going to use velcro.

Electronic Component: https://docs.google.com/spreadsheets/d/1me4clmdyE9FGIMsQC5aUXlfaip62lC03WpXSjYtY48Q/edit#gid=0

Proof of tech:  https://youtu.be/vgKwKgTtEaA

Materials and production process for enclosure: The Puck.js (EMA) is already built with its own enclosure, so we are just building around it to make it into a device that can be worn on your wrist, neck, or hands.

We will work tomorrow in class on the next iteration steps of this product.

End of day four.

Thanks!


DAY FIVE

DAY FIVE OF OUR EXPERIMENT:

December 1, 2017

We had class today, where we handed in our proposal (shown above) and then had individual team meetings with Nick to go over our project.

When talking to Nick in class, he told us that we should try to focus on one thing, and only use what the Puck.js (EMA) provides rather than adding extra features and extra work for ourselves.

The big thing for us to focus on is the product: what our intention and goal is, and how we are going to make the Puck.js (EMA) look good and look like a finished object. Ultimately, we have to make it into something, because on the technical side, the input and output portion already does so much.

After talking to Nick, Savaya left because she was feeling under the weather, so Chris and Tommy met to chat about some new ideas for this project.

We planned to meet tomorrow to go over what Chris and Tommy talked about.

End of day five.

Thanks!


DAY SIX

DAY SIX OF OUR EXPERIMENT:

December 2, 2017

Today we met to go over what was talked about yesterday during class hours.

And in the end we decided to go with a new idea and concept for this experiment.

New concept:

We want to make a productivity assistant: a smart device that informs the user when it is time to take a break and when it is time to get back to work. Using green and red light, it signals when break time starts and ends. When it is time for a break, the user can either have a screen pop up showing their agenda of things to do, or have their display sleep for the allotted break time. To turn the device off, hold down the button.

For the display options we are looking at using this function for that side of our product: https://www.boastr.net/downloads/

Here is a list of some of the brainstorming ideas we came up with for this new concept:

  • Productivity app
  • Hardware
  • Assistance
  • Lifestyle
  • Not connected to anything – it’s a smart device (the Puck.js (EMA))
  • Does not turn anything on but turns things off
  • The goal is to disconnect – and you as the user have the choice of what that might be
  • Then when to go back to work
  • Just have the colours green and red – but before it shuts OFF it gives you a warning
  • 20 min on – 5 off and then 25 min off – the break / work schedule.
  • Keep it a wearable
  • Keep it all in the puck.js (EMA)

We are following the Pomodoro Technique for this system: work for 20 minutes, then take a 5-minute break, and repeat this for an hour; after an hour of this, you work for 20 minutes and then take a 20-minute break.
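As a quick sketch of how the first hour of that schedule can be expressed in code (illustrative only; the constants are our chosen times, and the function name is hypothetical):

```javascript
// Illustrative sketch of the first hour of our work/break schedule.
const WORK_MIN = 20;  // minutes of work per cycle
const BREAK_MIN = 5;  // minutes of break per cycle

// Given elapsed minutes within the first hour, return the current phase.
function phaseAt(minutes) {
  const cycle = minutes % (WORK_MIN + BREAK_MIN);
  return cycle < WORK_MIN ? "work" : "break";
}
```

On the device itself this logic runs on timers rather than on a clock, but the cycle is the same.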

What would the user use this device for:

We talked about the ways we would individually use this:

Savaya: I would use this not so much when I am working, but for when I am taking time off work to shop online.

Tommy: I would use this for when I am working and need a break to relax and then get back to work.

Chris: I would use this for focused computer time and remembering to take breaks.

In the end we all agreed people can use this productivity device for:  school / work / fun

For design options we would like to:

  • Have an option for a wearable / like a bracelet or watch look alike
  • Have an option for people to put this device on their desktop so they can see it

We want to have two options for people because we believe this will help when we put this product on the market, so people can choose how they want to use it and wear it as well!

Here is a sketch of the wearable design we are thinking of:

watch-sketch

For the wearable part of this device, we are looking to purchase watch bands like the ones below, and to 3D print a mount for the Puck.js (EMA) to sit in, with the watch bands attaching via watch band connectors (shown below).

Watch bands:

ordered-watch-bands

 

Watch band connectors:

idea-watch-band-connectors

We shopped around Spadina and Queen West at multiple brick-and-mortar stores to find the materials to make our product and to work out the size of what we need to 3D print for the Puck.js (EMA) to sit in / on. But we couldn’t find the right materials.

So we are ordering the items from Amazon, which should be here Monday.

We are ordering the watch bands shown above. 

We decided on these watch bands because they make it easier for people of all wrist sizes to wear this device without having to remove links.

However, we are not ordering watch band connectors, because we are going to 3D print a surface for the Puck.js (EMA) to sit in.

watchv3 watch-design watch-spring

New names/titles:

We had the title as Walk Wear, but are changing it now as we changed our concept.

  • Wear Away
  • Walk Away
  • Deskside
  • Deskside Breaktime
  • Deskside Wearable
  • Excuse Me Accessories = WINNER!

Now we are working on getting the first part of the code running: having the red and green lights respond to a button press, and having a button click turn the display off and on.
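A bare-bones sketch of that first step (a hypothetical toggle, simplified from what we are writing for the Puck.js): each button press switches between the work state (green light) and the break state (red light).

```javascript
// Hypothetical sketch: toggle work/break state on each button press.
// On the real Puck.js this would be wired up with setWatch on the button.
function makeToggle() {
  let state = "break";
  return function press() {
    state = state === "work" ? "break" : "work";
    return state === "work" ? "GREEN" : "RED"; // colour to light up
  };
}
```

Turning the display off and on is handled separately, on the computer side.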

Here is video of us turning Chris’s computer display off by clicking the Puck.js (EMA):

We are going to finish here today. We have ordered the watch parts we need for this wearable, and on Monday we will 3D print what we need, so that by Tuesday the wearable is made.

End of day six.

Thanks!


DAY SEVEN

DAY SEVEN OF OUR EXPERIMENT:

December 3, 2017

Today all three of us worked separately, but kept a communication line open via Facebook Messenger.

Today we spoke more about the design of the watch, the theme colours, and branding.

We decided to use these colours, this font, and stickers in our branding because we wanted to evoke a ‘kid friendly’ feeling. When people are on a break, we want them to feel free to do whatever they want, kind of like how kids act when they are playing with their toys.

We decided on this theme colour for the brand / product:

exp5-colour-theme

We also started to think about the design of the desktop holder for this product. Here are a couple ideas we came up with: ADD IMAGES

And we started to work on our branding too!

Here is one concept we are working on for the title:

branding

This is all we worked on today, independently, and communicated over Facebook Messenger.

End of day seven.

Thanks!


DAY EIGHT

DAY EIGHT OF OUR EXPERIMENT:

December 4, 2017

Today we re-connected in class and sent in our Testing Plan Proposal:

Testing Plan Proposal: Savaya, Chris, Tommy:

  • Preparations
    • What needs to be done ahead of time? Each person in our group needs 1 wearable and one desktop piece to either take home, to work, or leave at school – it is up to them where they want to test it. We just need to make sure the tester has both options – including the package given with Excuse Me Accessories.
    • Do you need extra batteries? There is the option to include extra batteries in the package each tester will receive.
    • What goes into your repair kit? During the trial period of testing, velcro and tape will be supplied in the package in case anything breaks – however, there is also a return receipt so the product can be sent back to Excuse Me Accessories to get fixed.
    • Be sure to take “before” photos. ADD IMAGE HERE.
  • The Plan
    • How will you document your own user experiences while using the device? Notes? Journaling? Photos? Video? Audio recording? Each of us (3 group members) will document our own experiences with Excuse Me Accessories by: Journaling, Photos, Video, Audio recording and a Questionnaire that we have created.
    • What will you do to ensure that your data collection methods are consistent for each group member? To ensure our data collection methods are consistent, we will give a Start and Finish time of testing that each group member has to follow. Also with answering a standard Questionnaire about the product.
    • For each group member, what are the dates & times of testing?
    • Savaya: Wednesday DEC 6 @ 9 AM – 4 PM:    AND Thursday DEC 7 @ 12 PM – 6 PM
    • Chris: Wednesday DEC 6 @ 9 AM – 4 PM:    AND Thursday DEC 7 @ 12 PM – 6 PM
    • Tommy: Wednesday DEC 6 @ 9 AM – 4 PM:    AND Thursday DEC 7 @ 12 PM – 6 PM
    • If there is a reason that (2) 6-hour testing periods don’t make sense, include a proposal for a reasonable equivalent for your device and get sign off from Kate. Not needed.
    • Will you be together or having your own experiences? We will each be having our own experience using this product.
    • Will you be doing your day-to-day activities or doing a special activity? We will be doing our day-to-day activities because we want this product to make sense for everyone and their day-to-day activities.
    • Any other details? For this testing period, we have agreed that we cannot work on this assignment itself.
  • End of Session Reports
    • You are required to create End of Session Reports. Create a survey / form using Google Forms for each group member to fill out at the end of their 6-hour testing periods. You will end up with 6 entries (3 users x (2) 6-hour testing periods.) Link to your form here. Each of us will complete the standard Questionnaire at the end of both of our testing periods, and take a video sharing our experience of what we liked and did not like.
  • After – Crunching the data & documentation
    • After the field testing, how will your team structure a debriefing conversation? Each of us will read each other’s Questionnaire and watch one another’s videos, and come up with solutions to the things they didn’t like. And then we will have a discussion group / thinking out loud session about what worked and did not, and what we should update.
    • What will you do with the data and media once you find it? We have decided to not work on this assignment during testing period because we want to make each of us use this product as though we had just bought it off the shelf. When it is a break time and there are important things to note we can write it down so we don’t forget, but that is it. The goal of this testing period is to also see and discover ways to use this device and product.
    • How will you display or present your observations & findings? We will present this by doing a blog post, and video presentation.
    • Be sure to visually document each prototype after testing is complete and make notes on what state they’re in. Done deal.

During class we also talked about our branding. We created stickers to go over the Puck.js (EMA) to dress it up. We will need to find clear adhesive stickers and change the opacity of some of the sticker colours so they don’t cover the green and red light.

We also worked after class on the BetterTouchTool software, to see how it works and play around with it, and also see what trigger makes the most sense for the user.

BetterTouchTool is a software application on Mac that allows you to create triggers and gestures that result in an action. We are playing around with it so that when your computer wakes up from sleep, your reminders app / page will pop up on your screen so you can see what you need to do.

better-touch-tool-commands

We are also going to meet after class in the DF lab around 6:00 PM to discuss what we need to do tomorrow and what we aim to finish.

End of day eight.

Thanks!


DAY NINE

DAY NINE OF OUR EXPERIMENT:

December 5, 2017

Today we worked on building our watches, finishing the code, and making our desktop accessories.

To start the day, we worked on ordering the proper stickers we need for our product design. We went to Staples and Michaels, but they didn’t have the right ones. So we ordered some from Amazon: https://www.amazon.ca/gp/product/B007Z7LQ54

And we then put our designs into the proper template, and we will print them off tomorrow!

We also purchased our desktop accessories and painted them!

painted-desktop-accessories desktop-accessories

From there, we put together our first watch!

 

The middle part is 3D printed, and the pink straps are the ones we purchased from Amazon.

The 3D-printed part of the watch is slightly too far apart where we need to attach the watch bands, but with soldering we were able to push the attachment points together. We might reprint these so the bands do not fall off when someone is wearing them.

While working on the code we also learnt more about how we will be using BetterTouchTool with the code.

We are using BetterTouchTool to pull up people's note-taking and reminder apps on their computer after it wakes up from sleep mode.

We are also using BetterTouchTool as a command to put your display to sleep by hitting Control-D. This is the hotkey.

With the code, we have it working so that when you press the button on the Puck.js (EMA), the light turns GREEN and a 20-minute timer starts (working time); after 20 minutes the light turns RED and it sends the hotkey (the BetterTouchTool command, Control-D) to put the screen display to sleep. A blinking green light means break time is over, but it will not turn your computer display back on.

We are still working on the code for a 5-minute break timer, and right now there is no way to turn the computer off altogether.
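The sequence above can be sketched roughly like this (a hypothetical outline; the io object is a stand-in for the Puck.js lights and the BetterTouchTool hotkey, which are not shown here):

```javascript
// Hypothetical outline of the timer sequence described above.
// io is a stand-in for the hardware: lights, blinking, and the Ctrl-D hotkey.
function startPomodoro(io, workMs, breakMs) {
  io.setLight("GREEN");            // work time starts
  io.after(workMs, () => {
    io.setLight("RED");            // work is over, break starts
    io.sendHotkey("ctrl-D");       // BetterTouchTool puts the display to sleep
    io.after(breakMs, () => {
      io.blink("GREEN");           // blinking green: break time is over
    });
  });
}
```

On the actual device, after would be a timer (like setTimeout) and the hotkey would be sent to the computer; the 5-minute break value is the part we are still adding.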

Here is a video of wearing the device and turning your display on after a break and off to take a break:

End of day nine.

Thanks!


DAY TEN

DAY TEN OF OUR EXPERIMENT:

December 6, 2017

Today we met at 10 AM in the DF Lab to finish building the watch accessories and update the code too.

Here is a video of how you would pair your Puck.js (EMA) to bluetooth / the code we are using:

We also wrote an instruction manual for users to follow and read so they know how to use their Excuse Me Accessory.

Today is the first day for our testing period!

Testing Period #1:

Duration: 12 PM – 6 PM (for all of us).

  1. Take before videos / pictures of where you are testing and what accessory you are testing with and how you will be testing it.
  2. Write notes during testing period about things you like / don’t like.
  3. Take video during the 6 hour period of what you are doing and how you are using it. (We won’t be filming the full 6 hour period, just intervals of it).
  4. Test these accessories however you would like to!
  5. At the end of the 6 hour period take a video of your findings.
  6. At the end of the 6 hour period also complete the survey: https://docs.google.com/forms/d/e/1FAIpQLSf7FQ9jBOA8BOaoMN2CQpqUsHTAXv-PGnW7K97MhyYRyEu6Ww/viewform

The dates and times of testing period #1 are as follows:

All of us will be testing independently for 6 hours today:

Savaya: Will be testing from home using both the wearable and desktop accessory

Tommy: Will be testing from school (in the DF lab) using both the wearable and desktop accessory

Chris: Will be testing from school (in the DF lab) and at home using both the wearable and desktop accessory

*Chris, Tommy, and Savaya will be using our Apple Computers.

We also decided that if we want to update any part of the code and test the new version, rather than waiting for tomorrow’s testing period, we will check in with one another every 2 hours to see how things are going, update the code then, and test it during Testing Period #1.

We have individually written our own notes and taken video of our first testing period, and in our blog post we have added links to each of our observations. (Links are shown in the Final Project Blog Post.)

Before testing started we worked on the setup and plan for Friday’s presentation of our product.

We took some of the required photos we need for our presentation on Friday.

After we got the code up and running, chatted about Friday, went over our testing schedule, and took some images of our product and device, we went on our own paths to test our product for the first time.

Notes on how things went for all of us will come soon!

End of day ten.

Thanks!


DAY ELEVEN

DAY ELEVEN OF OUR EXPERIMENT:

December 7, 2017

Today we met in the DF lab to take video for our final presentation on Friday.

Today is also the second day of our testing period!

Testing Period #2:

Duration: 12 PM – 6 PM (for all of us).

  1. Take before videos / pictures of where you are testing and what accessory you are testing with and how you will be testing it.
  2. Write notes during testing period about things you like / don’t like.
  3. Take video during the 6 hour period of what you are doing and how you are using it. (We won’t be filming the full 6 hour period, just intervals of it).
  4. Test these accessories however you would like to!
  5. At the end of the 6 hour period take a video of your findings.
  6. At the end of the 6 hour period also complete the survey: https://docs.google.com/forms/d/e/1FAIpQLSf7FQ9jBOA8BOaoMN2CQpqUsHTAXv-PGnW7K97MhyYRyEu6Ww/viewform

The dates and times of testing period #2 are as follows:

All of us will be testing independently for 6 hours today:

Savaya: Will be testing when walking around Toronto to run errands, using the wearable

Tommy: Will be testing at his gym using the wearable

Chris: Will be testing from home using both the wearable and desktop accessory

*Chris, Tommy, and Savaya will be using our Apple Computers.

We have individually written our own notes and taken video of our second testing period, and in our blog post we have added links to each of our observations. (Links are shown in the Final Project Blog Post.)

We also worked on our instruction manual:

  • To turn on, pair your device to Bluetooth: go to this website: https://www.espruino.com/ide/ then click the yellow square in the top left corner, choose a device number that hasn’t been taken, click PAIR, and wait for it to connect.
  • When you have to work the light will be GREEN.
  • When you have to break the light will be RED.
  • When the GREEN light is blinking it means break time is almost over.
  • When it is break time your screen will sleep.
  • To make your computer sleep go to BetterTouchTool and go to the tab that says Keyboards – then click +Add New Shortcut or Key Sequence and in Shortcut put ‘control – option – D’ then go to Trigger Predefined Action and select Sleep Display.
  • Working with BetterTouchTool, you can include commands to trigger an app to launch when your screen wakes up. Go to the previous action above, then click +Add New Shortcut or Key Sequence, select Trigger Predefined Action, and choose Open Application / File / Apple Script… in the Controlling Other Applications list.
  • To turn the device off, hold the EMA button for a few seconds until the light turns off.
  • You are given an extra battery
  • You are given stickers
  • Desktop accessory
  • Watch Band
  • EMA 3D printed holder
  • To turn it on you click the EMA once quickly

Here is an image of the battery being used:

battery

Here is the final instruction manual, which users would get in their package when purchasing this product:

https://docs.google.com/a/ocadu.ca/document/d/1zW0JyQokGSjLWK2eN_BKDi7QTTDJu2hzPR6c5kjh5lM/edit?usp=sharing

End of day eleven.

Thanks!


DAY TWELVE – THE REVEAL OF EXCUSE ME ACCESSORIES

DAY TWELVE OF OUR EXPERIMENT:

December 8, 2017

Today we are presenting our product to our class, and we are so excited about it!

Here is our code: https://github.com/ChrisLuginbuhl/WalkWear

Here is an image of people trying on our product: yiyi

Here is our video / presentation: https://vimeo.com/246383252

This is our final project description:

We created a product that assists people with their productivity by using the Pomodoro Technique. We designed and made Excuse Me Accessories, available as wearables and desktop accessories, so you can choose how you want to wear or use our device. In the delivery package you will receive: the EMA device, a 3D-printed EMA holder, a watch band, a desktop accessory, an extra battery, and stickers.

End of day twelve.

Thanks!


FINAL PROJECT BLOG POST

Project overview:

Project title: Excuse Me Accessories

Group: Savaya Shinkaruk, Chris Luginbuhl, Tommy Ting

Project description, including overview of object and intended context and users:

To help people with their productivity on any task or activity, we created the Excuse Me Accessories products. By using the Pomodoro Technique (work for 20 minutes, break for 5 minutes), people can follow this routine with their wearable or desktop accessory.

We tested this product in multiple ways because we want it to be usable however people choose. It is great to use at school or during work hours to help you focus on the tasks at hand, but it is also great when out running errands, to know how much time you are spending shopping in stores.

2-minute video presenting the portable & summarizing the field testing plan & results:

https://vimeo.com/246383252

Image of device on its own:

img_7131

img_7146

Images with device being worn / used by each group member:

portraits-1

portraits-2

portraits-3

Production materials

Design files: more are in the blog above. 

stickers stickers3 stickers2

Final Bill of Materials (spreadsheet including costs & suppliers):

https://docs.google.com/spreadsheets/d/1clJ1SG9paU3YRE_ewiWXgeS4Wr1ZE-hx4iZrrl9CGBc/edit?usp=sharing

Final circuit diagram:

watchv3

ema

pugk-js-device

Code:  https://github.com/ChrisLuginbuhl/WalkWear

User testing materials

User testing plan:

For both of our testing periods we stuck to this consistent formula – but each of us had the freedom to use the product however we liked.

Duration: 12 PM – 6 PM (for all of us each day).

  1. Take before videos / pictures of where you are testing and what accessory you are testing with and how you will be testing it.
  2. Write notes during testing period about things you like / don’t like.
  3. Take video during the 6 hour period of what you are doing and how you are using it. (We won’t be filming the full 6 hour period, just intervals of it).
  4. Test these accessories however you would like to!
  5. At the end of the 6 hour period take a video of your findings.
  6. At the end of the 6 hour period also complete the survey: https://docs.google.com/forms/d/e/1FAIpQLSf7FQ9jBOA8BOaoMN2CQpqUsHTAXv-PGnW7K97MhyYRyEu6Ww/viewform

The dates and times of testing period #1 are as follows:

All of us will be testing independently for 6 hours today (12 PM – 6 PM):

Savaya: Will be testing…*location

Tommy: Will be testing…*location

Chris: Will be testing…*location

*Chris, Tommy, and Savaya will be using our Apple Computers.

We have individually written our own notes and taken video of our first testing period – and in our blog post we have added links to each of our observations. (Links are shown in the Final Project Blog Post.)

Link to end-of-session report forms:

Here is our feedback from the survey we each completed after both of our trial runs:

Savaya:

Testing Period ONE: https://docs.google.com/forms/d/e/1FAIpQLSf7FQ9jBOA8BOaoMN2CQpqUsHTAXv-PGnW7K97MhyYRyEu6Ww/viewanalytics

Testing Period TWO: https://docs.google.com/forms/d/e/1FAIpQLSf7FQ9jBOA8BOaoMN2CQpqUsHTAXv-PGnW7K97MhyYRyEu6Ww/viewanalytics

Tommy:

Testing Period ONE: Shown in his Google Doc.

Testing Period TWO: Shown in his Google Doc.

Chris:

Testing Period ONE: https://docs.google.com/forms/d/1vjf0zSr_2iWvGgt04cNeQrbLx7yqAydsGli3gOBK2tU/edit#response=ACYDBNjmFDXXOQ6Hd-VSOB6JoJtaJ0wiu9POMmWGkvu1JetOWe62f7v11YIl_A

Testing Period TWO:

https://docs.google.com/forms/d/1vjf0zSr_2iWvGgt04cNeQrbLx7yqAydsGli3gOBK2tU/edit#response=ACYDBNheNu74TooUHnBuqjHPLqsVlysUGnVt5D8tffGapAItLMut3ALzoig1BQ

Link to data collected: NOT APPLICABLE – our data is our feedback shown in our notes and survey (above).

Photos, video, or other media gathered in the field:

Photos, video, or other media gathered in field is represented in our individual Google Docs.

Summary of the testing process: Information is included in our individual Google Doc Links.

Savaya: https://docs.google.com/document/d/1CRPUb4gkZqGOY5HT9sN97NA_C4V-HzqGcLCNI3hviHE/edit

Chris: https://docs.google.com/document/d/14ZC8tYcGWOSXoA-tu3VwBUTSjqD1SYjhLelLFA1RQII/edit

Tommy: https://docs.google.com/document/d/1r_Hzi_QUvadfG6nPVTHZBSKM4YTKxPf7xu96qmMWD58/edit

Reflections on your findings:

Information is included in our individual Google Doc Links and Survey links listed above or in our individual Google Docs.

Any summary materials presented at the critique: our desktop accessories, wearables, and stickers (images in the blog post above).

Code development notes:

Part of the reason for using Puck.js as our platform was that we wanted a compact, battery operated package.

The Puck.js is based on the Espruino architecture, an open source JavaScript interpreter for Arduino-like microcontrollers. It is programmed via a web IDE, which uses Web Bluetooth to connect to the Puck.

The use of JavaScript means this is an event-based development approach, which is a good fit for battery-operated projects that spend a lot of time waiting for something (a timer, a button press, a certain pin changing state, etc.).
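As a hedged illustration of that event-based style (not our actual device code): each phase simply schedules the next one with a timer and then lets the processor idle. `setTimeout` exists on both Espruino and ordinary JavaScript runtimes; the function names here are ours:

```javascript
// Event-driven phase cycling: no busy loop – each phase schedules the
// next with a timer. maxPhases keeps this demo finite.
function startCycle(onPhase, workMs, breakMs, maxPhases) {
  let count = 0;
  function enter(phase) {
    if (count >= maxPhases) return; // stop the demo after a few phases
    count++;
    onPhase(phase);
    const delay = phase === "work" ? workMs : breakMs;
    const next = phase === "work" ? "break" : "work";
    setTimeout(() => enter(next), delay);
  }
  enter("work");
}

// Short intervals so the demo finishes quickly; on the real device the
// delays would be 20 and 5 minutes.
const seen = [];
startCycle((p) => seen.push(p), 5, 5, 4);
setTimeout(() => console.log(seen.join(" -> ")), 60);
// work -> break -> work -> break
```

Between timer events the microcontroller can sleep, which is what makes this style battery-friendly.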

The use of JavaScript requires a change in mindset, and a change in libraries from what is normally used with Arduino, however! The web IDE does contain modules that function similarly to Arduino libraries, extending the functionality of the device and allowing access to hardware features. The modules are also pre-compiled, so they run more efficiently than equivalent hand-written JavaScript.

We used a Bluetooth HID library (HID = Human Interface Device, i.e. mouse, keyboard, etc.), which allowed the Puck to connect to Windows and Mac computers as a Bluetooth keyboard and send hotkeys that put the display or the computer to sleep.

Although JavaScript for Espruino is quite compact and concise, it was hard to develop in this environment because of:

  1. The relatively young age of this system – Espruino was first funded in 2013 and Puck.js was first funded in 2015.
  2. JavaScript normally runs in web browsers, so most of the examples and conventions available relate to screen-based devices.
  3. There are still some bugs in the web IDE. Once or twice we were able to fix a bug simply by removing code that was already commented out!

Here is an image showing the old / new firmware we used during our build process:

old-vs-new-firmware

Further considerations:

Although it is quick and easy to make updates to the code and flash devices with small fixes, it quickly became clear that version tracking would be important. There is no way to check what version of the code is running on a device once it has been flashed. For testing and debugging, it was important to use GitHub and not make spur-of-the-moment changes to code used in testing.
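One lightweight mitigation (a sketch of the idea, not something we shipped) is to bake a version string into the code itself, so a flashed device can at least report which build it is running; the names and version number below are hypothetical:

```javascript
// Hypothetical version stamp baked into the flashed code.
const CODE_VERSION = "0.3.1";

function versionBanner(deviceName, version) {
  return deviceName + " running code version " + version;
}

// On the Puck this would print to the Bluetooth console at startup,
// making the running version visible from the web IDE.
console.log(versionBanner("WalkWear", CODE_VERSION));
```

Bumping the constant in the same commit as each code change keeps the flashed device and the GitHub history in sync.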

References & related works:

Any references / support materials used for the project:

References to related articles, papers, projects, or other work that provide context for your project. Write about the relationship between your project and these references:

Do more and have fun with time management. (n.d.). Retrieved December 2, 2017, from https://cirillocompany.de/pages/pomodoro-technique

Misfit. (n.d.). Retrieved December 3, 2017, from https://misfit.com/fitness-trackers/misfit-shine

 

Title: Love Corner

Group: Savaya Shinkaruk and Emilia Mason


Project Description

Our product was created by thinking of a project for our Messaging and Notifications assignment.

For this project you will design and create a custom device that allows the members of your group to communicate in new ways.  Using the Wifi connection of the Feather M0, you will create standalone objects that communicate over the internet.  Each group must consider how these devices create a specific language that allows the things, people, or places on the ends of the network to communicate.  – Nick and Kate.

When we were first assigned this project Emilia wanted to make a Christmas gift for her boyfriend. She showed me some videos of concepts that work with couples who are in long distance relationships. And since both of us are in long distance relationships, we came up with our product Love Corner.

We go more into depth about the journey of our process in our blog, but the overall description of our project is:

We created a product for couples in long distance relationships: each partner wears a bracelet that vibrates and lights up based on the other's pulse.

For this to work, you put on the bracelet (which looks like a Power Ranger watch) and attach the pulse sensor to your finger; over WiFi, the data is sent between the two bracelets.

Even if you are not in a long distance relationship, but want to spice up your love life, test out Love Corner.

So, continue on to the rest of our page to read more about the ‘Love Corner’ team and our journey.


About team

Savaya Shinkaruk: Savaya Shinkaruk is a fashion stylist and journalist with a keen interest in wanting to blend components of the online fashion industry with design. She graduated with a BA in communications in 2017 and is completing her MDes in Digital Futures at OCAD University.

Emilia Mason: Emilia Mason is a content creator with a background in community-based work. She is currently an MDes student in Digital Futures at OCAD U and she is interested in using art, technology and design as tools for educational purposes.  


BLOG: PROCESS JOURNAL

DAY ONE

DAY ONE OF OUR EXPERIMENT:

November 13, 2017

Today we were introduced to our fourth assignment in our Creation and Computation class.

This assignment is to come up with and design a custom device that allows the both of us (Emilia and Savaya) to communicate in a new way.

During class hours, Emilia and I discussed some possibilities of what we could do for this assignment.

Emilia had a great idea based on Rafael Lozano-Hemmer's project Pulse Room, an interactive art installation in which pulse sensors set off light bulbs in the room to the rhythm of each participant's heart rate.

Here is a link to show his project: http://www.lozano-hemmer.com/pulse_room.php

We both loved this idea, so we decided to iterate on it a bit and take the concept in a slightly different direction.

We decided to do this because we thought it would be a cute gift for both of our boyfriends who do not live in Toronto haha.

Emilia came across this link showing how we could iterate on Rafael Lozano-Hemmer's project.

Here is the link: https://blog.arduino.cc/2015/06/18/your-first-wearable-with-gemma/

The above link is from the Arduino Blog where they have made a wearable pulse sensor.

We also took a look at this website: https://www.precisionmicrodrives.com/tech-blog/2016/05/16/how-drive-vibration-motor-arduino-and-genuino

It uses a Genuino rather than our board, but it is still helpful information for understanding how the motor is driven and what materials we will need to look into.

So, we took this idea and will make our project to be this:

You will have a bracelet that has a pulse sensor and your partner will have a large LED (light bulb) that will show the rhythm of your partner’s heart rate.

Our input being: pulse sensor

Our output being: 1 big LED

After we figured out our concept, we asked Kate and Nick about it and they approved of our idea.

We also asked Kate what kind of pulse sensor we should get. She said the one from Creatron is great.

We also asked, for design ease, if we could / should use the GEMMA, but she said we should stick with the Feather as it has WiFi – which is needed for this experiment.

So right after this class we went directly to Creatron to purchase one, but the gentleman there said they were sold out both in store and online.

We ended up going on Amazon to purchase one, but that was not going to work, as we would not receive one until December 4th.

attempt-to-order-pulse-sensor

We kept looking around for a pulse sensor. We called Creatron again – their site said it was available online – to ask if they could bring one in store, but they told us they were sold out. So we are still searching for one, as well as for the other materials and tools we need.

From there, we kept talking via text messages about what we would need for this project.

Here is what we talked about needing for the project:

  • 1 feather for the pulse sensor (input)
  • 1 feather for the LED (output)
  • Materials to make the bracelet
  • Materials to make a box for the LED

We did talk about making 2 sets of this, but because we figured it would be too much money to purchase two more feathers, we decided to make one set.

What we need to do going into this week:

  • Ask Sana, who used a light bulb (LED) for her third experiment, how she set it up.
  • Order our pulse sensor.
  • Do more research on how to get this working.
  • Decide who is going to work on the input and/or output.
  • Design the bracelet we want to make.

End of day one.

Thanks!


DAY TWO

DAY TWO OF OUR EXPERIMENT:

November 14, 2017

We met today to go over what we discussed yesterday about our project and we decided to change a few things.

After researching how to make a bracelet for this project – one that would look good but also be large enough to fit all the components needed for it to run – we decided to make some changes.

The changes we made and why:

  • We decided to scratch the idea of using a single large LED set up to give a light show of your partner’s heart rate.
  • We decided to use 3 small LED lights that would go on the bracelet and turn on and off based on your partner’s heart rate that is being sent.
  • We did this because we want you and your partner to each have a bracelet that gives a light show based on the other's heart rate, and we want to add a vibration component so you can also feel your partner's pulse.
  • In the end, our changes make this project more intimate, which is what we wanted for our long distance relationships.

Here are the sketches for our two ideas:  sketch-1

sketch-2

After we changed the look of our experiment we started to order the intended materials and tools to make this experiment happen.

What we need:

  • 2 pulse sensor kits
  • Vibrating mini disc motor
  • IN 4001 Diode
  • Resistor, ~200 Ω – 1 kΩ – we have these in our kit
  • 100 MAH Lipoly Battery – Having trouble sourcing
  • Heat Shrink Pack

Here is the new concept of our project:

You and your partner will EACH have a bracelet that includes a pulse sensor, a vibrating disc, a battery, and LEDs. Each bracelet both sends and receives a heart rate, so you can see your partner's pulse on the LEDs and feel it from the vibrating disc.

In the end what we are making will be a prototyped version of where we see this experiment going.

While researching where to source the materials, we came across a few snags – mainly with sourcing a battery.

Here is a link to show the recommended battery to get: https://www.adafruit.com/product/1570

Here is a link to show the battery we are going to get: http://www.canadarobotix.com/battery-chargers/battery-lithium-1000mah

We decided not to go with the recommended battery because the one we sourced lasts longer while having the same voltage.

Here is the email Emilia sent Nick and Kate about some battery dilemmas we are finding:

email

We ended up ordering most of our materials from Canada Robotix, which was also great because it was half the price.

Here is the list of what we ordered from Canada Robotix:

canada-robotix-orders

Here is what we purchased from Creatron: 

creatron

The last thing we need to get are the batteries. Which Kate and Nick will help us with hopefully before Friday, if not on Friday at the latest.

Kate was in the DF Lab when we were looking to order these batteries, so we asked her about what we were looking at, and she said to go for it! So we added the battery to our order.

Now we just have to start working on the code, and designing the bracelets.

We also sketched some ideas of how we want to design the bracelets.

Here is the sketch:

front-side-bracelet

back-side-bracelet

We just need to figure out the materials to use for this.

End of day two.

Thanks!


DAY THREE

DAY THREE OF OUR EXPERIMENT:

November 17, 2017

We re-connected today to start working further on our project.

All of our materials have arrived via mail, and we have purchased all the extended materials needed for coding and tool building.

However, we still need to purchase the bracelet materials needed to make them. But we want to start to work on the coding right now.

And also because we have class today with Kate, and it would be a great time to ask questions with problems we are having.

Here is an image of Emilia with our new tools and materials for building this product:

nov-17-new-materials

For our first stage of coding research we stuck to these two websites:

https://makezine.com/projects/make-29/beating-heart-headband/

https://pulsesensor.com/pages/code-and-guide

These websites gave us the direction we needed to set up our Arduino and get our pulse sensor working with an LED. And to also see how we can implement other outputs into our product.

We had some technical issues where our computers had trouble compiling for the board – but uploading the code a second time made the error disappear.

Here is a video of our first attempts to get our pulse sensor reading our pulse rate and having the LED blink. We are still testing to see our pulse rates to know where / how to turn the LED OFF and ON rather than just blinking.

We are also finding that when we want the LEDs to blink, it takes a while for them to catch onto the pulse 'beat', and then a few minutes to stop after you take your finger off the sensor. We are unsure why this is.

After we successfully got this portion working, we decided to call it a day and meet tomorrow to continue working on this experiment.

Some notes from class we took about our project based on what Kate taught us:

Facts:

A pulse sensor does not read the BEAT of the heart – it reads the blood flow at the spot you are testing.

We also learnt our battery is a great one to have because the Feather M0 can charge it too.

However, if it needs replacing, it is more difficult to source than just going to Shoppers Drug Mart to grab a pack of batteries. If this bracelet were commercialized, that would be something to think about – but there are ways around it, as the Feather can charge it.

We also learnt about NeoPixels. These are lights you can address individually, wired together in an array. We are thinking about swapping them in for our LED lights to make the bracelet look better. There is information on Canvas we will be looking at to learn how to make them work.

End of day three.

Thanks!


DAY FOUR

DAY FOUR OF OUR EXPERIMENT:

November 18, 2017

We met up in the DF lab to continue working on our project.

Our goals today are to:

  • Finish the coding for the pulse sensor and LED.
  • Find code for the vibrating motor.
  • Get these two things talking to one another.

Let's see how it goes! Continue reading to find out…

We had a few issues with the pulse sensor: we did not know whether the code we were using already had a programmed heart rate in it, or whether it was actually reading Savaya's.

On the other hand today, we also worked on getting the vibrating motor working.

We followed this link on how to solder the motor so it would work with our breadboard in a more condensed manner:

And we used the code from our Pages link in Canvas to see what code we need to get the motor working.

From this link here: https://canvas.ocadu.ca/courses/24263/pages/kit-connections-slash-code-output

After a few hours of soldering and coding, we got the vibrating motor to work! Just one of them right now! But it is working. They are very fragile so we have to be careful with how to solder this piece together.

Here is an image of the wired vibrating motor:

vibrating-motor-set-up-emilia

We are still confused with the pulse sensor code as to why there is movement when we are not touching the pulse sensor. So we are going to wait for class on Monday to ask Nick why this is happening.

Savaya went home to continue working on it and lost all the coding files she had been working on for the pulse sensor – so she started fresh.

In the end we will be using this code from GitHub: https://github.com/WorldFamousElectronics/PulseSensor_Amped_Arduino

End of day four.

Thanks!


DAY FIVE

DAY FIVE OF OUR EXPERIMENT:

November 20, 2017

We are back in action!

We have class today with Nick and we are going to ask him about the pulse sensor.

To simplify our rant yesterday, this is what we are unsure of:

  • Why is the serial plotter graphing movement when we are not touching the pulse sensor?
  • Is it because the code we are using has a pre-programmed heart pulse in it?

We kept working with the code from the link posted yesterday, but here it is again: https://github.com/WorldFamousElectronics/PulseSensor_Amped_Arduino

After we talked to Nick about our questions, he said there is no pre-programmed pulse in the code we are using; the serial plotter graphs that way because there is a lot of noise. It won't move like that once the sensor is built into a bracelet.

It calms down when you place your finger or wrist where there is a pulse on the pulse sensor.

So, here is a video to show the pulse sensor working with the proper code and LED: 

After that was up and functioning, we started to work on getting the pulse sensor working with PubNub. Roxanne H helped us to understand what code we needed and how to link it.

We got the examples working in class, but it is a new learning curve we are finding when we have to do it ourselves haha.

What we needed to understand for it to link was the pMessage being sent from the data the pulse sensor was gathering.

Along with that, the data to be published is produced in the void loop – the Signal > Threshold check – and then sent. To have it published, you add a call to “publishToPubNub”.
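Our device code is Arduino, but the publishing decision reduces to a small gate: only send a message when the raw signal crosses the threshold. Here is a hedged JavaScript sketch – the threshold value and function names are ours, for illustration:

```javascript
// Publish gating: send a message only when the raw reading exceeds the
// threshold. THRESHOLD here is an example value, not our tuned one.
const THRESHOLD = 550;

function shouldPublish(signal, threshold) {
  return signal > threshold;
}

function makeMessage(sender, signal) {
  return { from: sender, pulse: signal };
}

// In the loop: read the sensor, publish only on a beat.
const reading = 612; // sample raw sensor value
if (shouldPublish(reading, THRESHOLD)) {
  // This is where publishToPubNub would be called on the device.
  console.log(JSON.stringify(makeMessage("savaya", reading)));
}
```

Tagging each message with the sender's name is also what later let us tell whose pulse was arriving on which channel.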

Once we got this very small part working, we continued to work on getting the vibrating motor working.

The soldering has been an issue for us because the motors are so fragile! After hours upon hours of soldering on Saturday, it was working – but then it fell apart on Sunday. We went back to the soldering table, but it still was not working. So we are going to ask Reza to assist us with this issue – or else we will need to purchase a new one.

After we do this, we will need to then work on getting code to link the pulse sensor signal to the vibrating motor – as we already have the code from canvas working.

We also scheduled a meeting with Kate and Nick tomorrow at 3:15 PM to ask some questions about our project.

When we got both of the vibrating motors soldered, we started to work on connecting the vibrating motor to the same code the LED is running on. Aka our pulse sensor code.

We had Mudit help us with this.

img_9507

One of the things he noticed in our previous pulse sensor code was how we were using Signal as a variable.

He said that using Threshold as a variable helps define the number at which the LED and vibrating motor turn on and off.

img_9508

We got this working by adding the same loop code the LED is using but with the vibrating motor. And after doing this, it still was not working…until we realized we didn’t have it wired up correctly. We were missing the USB wire. As soon as we added that wire in – YAY it started working again!

We sadly didn’t get any video of this because then Savaya’s vibrating motor became unsoldered – so we will re-solder that tomorrow and get video.

But here is Emilia's vibrating motor working:

We decided to take a break from this; tomorrow we will set up our PubNub code with the code we sorted out today.

img_9506

And also to work on building our bracelet. We are deciding between two ideas – 3D printing one, or finding a band and adding some other pieces to it, like a pocket.

End of day five.

Thanks!


DAY SIX

DAY SIX OF OUR EXPERIMENT:

November 21, 2017

Today we are meeting at 11 AM to work on building our bracelets!

So we got our motor and pulse sensor code working together! But it did not happen without a few broken wires.

Both of our vibrating motors' wires broke, and Savaya's pulse sensor wire broke too. After talking to Nick and reading online, the most important thing for us to do is hot glue or electrical tape the wires so there aren't any pressure points.

Here is an image of the hot glue we put on our pulse sensors and vibrating motors: ADD IMAGE HERE.

After we went to Reza to get his assistance with soldering and glueing what we needed, we got working on figuring out why our motor was not working with our pulse sensor and LED code. We set it up the same way as our LED, but it was not working. We ended up getting Orlando to look at it, and he suggested we re-check our wiring, because he said our code should work.

And with re-wiring, we realized that our transistor was facing the wrong way; when we switched that, it started working! Thank you Orlando.

Here is a video to show this process working: ADD VIDEO HERE.

After this, we took a break and started to talk about how we wanted to make these bracelets. The idea we talked about is to 3D print a whole bracelet and / or 3D print a box where the Feather and battery will sit – kind of like a Power Ranger / Watch concept.

In the end we decided to do the Power Ranger / Watch concept!

We booked a session with Reza to 3D print our concept tomorrow, after class.

We also scheduled a meeting with Nick and Kate for tomorrow (November 22) at 3:15 PM to ask about our code to see if our PubNub is reading one another’s information.

End of day six.

Thanks!


DAY SEVEN

DAY SEVEN OF OUR EXPERIMENT:

November 22, 2017

Today we re-connected after our Research Methods class to start 3D printing the box for our bracelet, and to talk to Kate about our code.

The concept for our bracelet: a box that holds the Feather and our battery (with all our wires too) will sit on the top of the wrist, with a band (probably velcro) carrying the pulse sensor on the inside of the wrist.

Some of the challenges / things we decided to change about this concept are:

Our meeting with Kate:

At our meeting with Kate, we had two main questions – the code and our wearable concept.

She said our concept of the Watch and Power Ranger look is great! Which was awesome to hear.

With our code, she said we should include a BPM variable so it only reads in a specific time range rather than all the time – which we need to source. We have looked at past examples but cannot seem to find the proper library; we have tried multiple things from GitHub, but nothing seems to work. Kate said she would send us some code that should fix that, but for now we will do more research to see if we can find something that works.

img_9537

We are finding that the Adafruit code does not play well with our Feather board – and we cannot figure out how to change that. So we are looking for BPM code that does not come from Adafruit.
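The BPM idea itself is simple arithmetic: measure the inter-beat interval (IBI, the milliseconds between two threshold crossings) and convert it. Here is a hedged sketch in JavaScript (our device code is Arduino; the function name is ours):

```javascript
// BPM from the inter-beat interval: 60,000 ms per minute divided by the
// time between beats.
function bpmFromIBI(ibiMs) {
  return Math.round(60000 / ibiMs);
}

console.log(bpmFromIBI(800));  // beats 800 ms apart -> 75 BPM
console.log(bpmFromIBI(1000)); // beats 1000 ms apart -> 60 BPM
```

A BPM value computed this way only updates once per beat, which is exactly the "specific time range rather than all the time" behaviour Kate suggested.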

After our meeting, we wanted to go over our code with PubNub to make sure that is it reading what it should be.

So we broke it down.

Here is an image of the notes we took while breaking it down:

howtounderstnadpubnub

howtounderstandpubnub3

howtounderstandpubnub1

We broke it down by changing the values to our names, so we knew who was reading and sending what.

In the end we got it working and now know FOR SURE that Savaya’s pulse is showing on Emilia’s LED and vibrating motor, and vice versa!

Here is an image showing our channels sending the right information to one another:

pub-nub-channels

We are so happy haha.

So now we are going to work on researching BPM code to add in our code.

We are also going to meet tomorrow to work on our bracelet!

End of day seven.

Thanks!


DAY EIGHT

DAY EIGHT OF OUR EXPERIMENT:

November 23, 2017

Today we went to get the materials we need for our bracelet.

We stopped in at Creatron to purchase our enclosure boxes, went to a fabric store to purchase some velcro to use as the band, and went to Michaels to purchase some paint and stickers to make the enclosure boxes look good.

We also talked about our presentation and how we want it to look and work.

We decided to make ‘Love Corners’: we will sit on opposite sides of the room, each in a spot designed with a ‘romantic feel’ (pictures to come in tomorrow’s blog post), put on our Love Corner pulse bracelets, and read each other’s pulses.

The name Love Corner for our product also came from chatting about our presentation's look and appeal.

We got back to school and started to work on our BPM code and also get designing our bracelets.

For the BPM code, we sourced this code (link here: https://github.com/bmbergh/cheerios) and it worked! We then followed a YouTube video on how to code it, and added it into ours.

Here is the YouTube link: https://www.youtube.com/watch?v=gbk5T67KYcs

Once we added this in, we started to work on our bracelet. We need to condense our breadboard into a smaller one – which includes soldering our pieces and figuring out where we need to put our pins.

Here is an image showing that process, so we know how many things need to connect to GND and what goes to which pin:

figuringoutwhatconnectstogrnd_fornewbreadboard

Here is our Fritzing diagram before we soldered and condensed the board: pulsesensor_breadboard

pulsesensor_schematic

 

This helped us see where everything needs to go, and whether we need to add another + / – GND rail, because we have multiple things connecting there and might not have enough room on the smaller breadboard.

Here is our breadboard before transferring to a smaller breadboard version to be placed in the bracelet:

breadboard-close-up-emilia

breadboard-set-up-emilia

We also decided to spray paint our enclosure boxes pink and add stickers to it, to evoke that romantic vibe and feel.

Here is an image of our enclosure boxes:

enclosurebox

Here is a video of the enclosure boxes being painted:

Here is an image of the painted enclosure boxes and the stickers we are going to use to decorate them:

enclosureboxpink

We are having an issue with our vibrating motors: Savaya's is not working, and Emilia's is vibrating non-stop.

But we got our bracelets built and soldered and placed into the smaller proto-boards:

building

building1

finish-build-inside

img_9567

But we are going to take a break tonight, and work on it before class tomorrow and see if we can fix it, because it was working before we transferred the boards. But we are assuming it is a wiring issue.

One of Emilia’s wires broke too when we put the top on, so we have a soldering date tomorrow.

End of day eight.

Thanks!


DAY NINE: THE REVEAL OF LOVE CORNER

DAY NINE OF OUR EXPERIMENT:

November 24, 2017

Before the big finale of our presentation we had to solder and fix some of our items.

We are going to re-make Savaya's vibrating motor. The wiring on the breadboard was done correctly, so we think one of the other wires broke – but everything has been hot glued, so we cannot see what is happening.

So off to the solder table we go in DF lab.

We also had to lower the threshold in our code again because we could not get our pulses to go over 1000; with that change, the LED and vibrating motor would fire, and we could see that Emilia's board is working.

In the end, the first run did not work – and we know the wiring is correct, and same with the code, so we think it is Savaya's Pin 13. So we re-soldered to Pin 12 and changed the code.

And that did not fix the problem.

We were not able to sort it out, and for some reason our code stopped reading and sending fully. That was a huge bummer when we went to present to the class, but that's OK! We had a great idea and a strong prototype, and it was working well at points during our process and journey.

screen-shot-2017-11-24-at-2-42-50-pm

To take a look at our code click this link:

https://github.com/SavayaShinkaruk/experiment4

https://github.com/emiliamason/Experiment4

Here is an image to show the product:

img_3336

Here is a video to show you how to put on the bracelet:

What we learnt:

This project was a true learning curve on so many levels:

Writing code that takes a pulse-sensor reading and drives two different outputs, and using PubNub to send the input to be read by another device and vice versa. That was complicated!
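As an illustration of that send-and-receive flow, here is a minimal JavaScript sketch – not our actual code; the device names, message shape, and threshold number are all invented for this example. Each bracelet packages its wearer’s pulse reading as a message for the shared channel, and decides whether to fire its LED and motor when the partner’s message arrives.

```javascript
// Threshold the outputs trigger at. We lowered ours during testing
// when readings never went above 1000; 550 here is just a placeholder.
const PULSE_THRESHOLD = 550;

// Package a raw pulse-sensor reading to publish on a shared channel
// (with PubNub, this object would be the `message` you publish).
function makePulseMessage(deviceId, rawPulse) {
  return { from: deviceId, pulse: rawPulse };
}

// Decide whether this bracelet's LED and vibrating motor should fire
// for a message it received: ignore our own echoes, and only react
// when the partner's pulse reading crosses the threshold.
function shouldTriggerOutputs(message, myDeviceId, threshold = PULSE_THRESHOLD) {
  if (message.from === myDeviceId) return false;
  return message.pulse >= threshold;
}
```

With PubNub, each bracelet would publish `makePulseMessage(...)` on a shared channel and run `shouldTriggerOutputs(...)` inside its subscribe listener.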

Also, understanding the basics of how to handle pieces that will go in a wearable device. During our process we realized it would have been easier, more convenient and less expensive to have two sets of everything. One set to be used for the breadboard to make sure the code works and a second set to solder for the final device. This way the pieces (resistors, wires, sensors, motors, etc) wouldn’t be so worn out by the time we figured out the code, resulting in less soldering and less breaking of pieces.

Another valuable lesson from this project was to sketch where in the wearable device we want each piece to be located. This determines how each piece should be soldered, and it gives a better idea of what type of wire to use (stranded or solid core) and how long each wire should be.

Examples:

screen-shot-2017-11-27-at-3-15-34-pm

The first board we soldered had a significant amount of wire that could easily have been shorter and soldered on the backside. Having so much wire made closing the box almost impossible.

If we could do this again, instead of soldering the motor directly to the diode we would for sure solder longer stranded wires. This would give us the opportunity to place the vibrating motor on the wrist band, so the vibration would be felt more strongly on the skin. Having the vibrating motor inside the enclosure did not really allow the user to FEEL the pulse of the person wearing the other device.

End of day nine.

Thanks!


FINALE PROJECT BLOG POST

Lover Corner Product:

We created a product where couples who are in long distance relationships can wear a bracelet that will vibrate and light up based on their partner’s pulse.

For this to work, you put on the bracelet (which looks like a Power Ranger watch) and attach the pulse sensor to your finger; through Wi-Fi it will send the data to and from one another’s bracelets. (There is a video to show this process.)

Even if you are not in a long distance relationship but want to spice up your love life, test out Lover Corner.

Project Members: Savaya Shinkaruk and Emilia Mason

Code:

https://github.com/SavayaShinkaruk/experiment4

https://github.com/emiliamason/Experiment4

Supporting Visuals:

There are images and video of our process and journey in our blog post above.

Design Files:

There are images and video of our process and journey in our blog post above.

Project Context:

The Lover Corner bracelets were created to be an accessory couples can wear when in long distance relationships, or when couples are looking to feel a connection to their loved one. Savaya and Emilia created this product because they are both in long distance relationships and thought this accessory would be a way for each of them to connect with their boyfriends back home.

Through our understanding of Arduino, PubNub and product design we were able to create this prototyped version of Lover Corner bracelets.

We see this bracelet becoming less Power Ranger like and more sleek, so couples don’t need to wear it only in the privacy of their own home but can wear it out in public when they are missing their significant others. We also see it coming in other colour options, with styles for men, women, and unisex wearers.

With the goal of our given assignment – to create a product that sends and receives notifications – we took that concept out of a purely digital format like a screen and implemented it in a wearable.

Bibliography:

Brandy Morgan. (2015, November 21). BPM with an Arduino Tutorial [Video file]. Retrieved from https://www.youtube.com/watch?v=gbk5T67KYcs

Pulse Sensor. (n.d.). Pulse sensor servo tutorial. Retrieved November 17, 2017 from https://pulsesensor.com/pages/pulse-sensor-servo-tutorial

Hartman, K., Puckett, N. (2017).  KIT: Connections / Code – INPUT. Retrieved from OCAD University Creation and Computation Canvas website: https://canvas.ocadu.ca/courses/24263/pages/kit-connections-slash-code-input

Hartman, K., Puckett, N. (2017).  KIT: Connections / Code – OUTPUT. Retrieved from OCAD University Creation and Computation Canvas website: https://canvas.ocadu.ca/courses/24263/pages/kit-connections-slash-code-output

Precision MicroDrives. (2016).  How to drive a vibration motor with arduino and genuino. Retrieved from https://www.precisionmicrodrives.com/tech-blog/2016/05/16/how-drive-vibration-motor-arduino-and-genuino

Github [yury-g]. (March 24). Getting Advanced Code / PulseSensor & “Arduino”. Retrieved from https://github.com/WorldFamousElectronics/PulseSensor_Amped_Arduino

Github [bmergh]. (2016). Cheerios. Retrieved from https://github.com/bmbergh/cheerios

Pulse Sensor. (n.d.). Getting Started. Retrieved from https://pulsesensor.com/pages/code-and-guide

Arduino. (2015). Make your first wearable with arduino gemma. Retrieved from https://blog.arduino.cc/2015/06/18/your-first-wearable-with-gemma/

Rafael Lozano-Hemmer. (n.d.). Pulse Room. Retrieved from http://www.lozano-hemmer.com/pulse_room.php

Adafruit. (n.d.). Lithium Ion Polymer Battery – 3.7v 100mAh. Retrieved from https://www.adafruit.com/product/1570

Stern, B. (n.d.). Beating Heart Headband. Retrieved November 17, 2017, from https://makezine.com/projects/make-29/beating-heart-headband/

Earl, Bill. (2014). Using millis() for timing. Retrieved from https://makezine.com/projects/make-29/beating-heart-headband/

 

Underdress’d by Savaya Shinkaruk

Title: Underdress’d

By: Savaya Shinkaruk


Project Description

My product was created by thinking of a project for our Peripheral assignment.

“For this project you will work individually to create a new peripheral for your computer that customizes an input or output function specifically for you.  This could be a new input device that changes the way you interact with your digital world or a means of notifying you about physical or virtual events around you.  To achieve this, you will use P5.js in conjunction with a variety of web APIs and a USB connection to your controller to create your new device.  Beyond the intended functionality of your new peripheral, you should also consider its materiality and spatial relationship to you and your computer.”  – Nick and Kate.

When I was first assigned this project, I wanted to create something that would work in the fashion industry. During class we started talking about weather API and I got an idea. When getting dressed I run into issues where I am not wearing something 100% appropriate for the weather conditions – so I thought what if I could make something to help with this fashion problem.

I go more into depth about the journey of my process towards making my product in my blog, but the overall description of my project is:

My Underdress’d experiment is a project for people to work within their own closet to know what to wear for the day based on the temperature outside.

The goal of this project is for people to never be too warm or too cold again based on what they are wearing for the day.

The product revolves around the current temperature – I based mine on Toronto’s weather, as that is where I am living. To know which clothing item works for the current temperature, you have UID cards that act as ID labels for your clothing. So, open the website link, take your UID card and scan it on the scanner (which you have in your home), and the word Yes! will pop up on the screen if you should wear it; if it is a Nope!, you will get a blank screen.

And wow! You will never be underdress’d.

So, continue on to the rest of my page to read more about me and my  journey in creating Underdress’d.


About Me

Savaya Shinkaruk: Savaya Shinkaruk is a fashion stylist and journalist with a keen interest in wanting to blend components of the online fashion industry with design. She graduated with a BA in communications in 2017 and is completing her MDes in Digital Futures at OCAD University.


BLOG: PROCESS JOURNAL

DAY ONE

DAY ONE OF MY EXPERIMENT:

October 30, 2017

Today we were assigned our third experiment for this class, and although I am nervous with this assignment because I am working alone – I started to look for ideas through Google on what could be created.

This assignment is to come up with a new peripheral for your computer that customizes an input and output method.

Some of the ideas I am interested in exploring more of:

  • Link the weather on my computer to link to Instagram on my iPhone to give me ideas on what to wear.
  • Take a photo of outside with your iPhone and it will connect you to a hashtag of what to wear on Instagram.
  • Study party – where a timer goes off when it is time to stop studying and ‘party’
  • You share your pulse and it will give off a scene on your computer of what you are feeling. How much coffee you need??
  • Morning selfie to tell you how much coffee you need based on your motion.
  • Morning type in to see how much coffee you need based on the mood you input.
  • Moving display with a snack time feature when you stop typing after a bit.
  • Using a slider to input your price on Amazon or Ebay.

Notes on breaking down my ideas with input and output:

  • On Idea # 1: INPUT = voice command (Alexa or Google) OR a ‘what do I wear today’ button // Links to the weather through IF Statements // OUTPUT = Instagram
  • On Idea # 2: INPUT = photo // OUTPUT = Instagram
  • On Idea # 3: Need to do more research on.
  • On Idea # 4: INPUT = pulse sensor or slider?? // OUTPUT = Digital Scene
  • On Idea # 5: INPUT = photograph // OUTPUT = Digital Scene
  • On Idea # 6: INPUT = keyboard // OUTPUT = Digital Scene
  • On Idea # 7: INPUT = keyboard // OUTPUT = Digital Scene
  • On Idea # 8: INPUT = slider // OUTPUT = Ebay numbers

The bolded idea is what I like the most.

From there I talked with Finlay about how I can do this when it comes to coding, as I am new to this.

Here are his suggestions:

  • Create this as a way where you read your clothing (using RFID, not Instagram now) to see if it is ok to wear based on the current weather. // Get the reader at Creatron + stickers to do this and place on clothing.
  • Create a spreadsheet on Excel to have characteristics and YES and NO answers and statements based on the YES and NO temperature ranges I input.
  • Characteristics include: shoes // bottoms // tops // outerwear – and so on.
  • Use RFID reader as a way to keep inventory

He suggested it would be easier for me to do the coding with the RFID reader than to source a way for my weather API to link to an Instagram hashtag.

In class on Friday November 3 we will be learning more about how to use input and output sources for this project. And I will get Nick’s input on what he thinks about this.

End of day one.

Thanks!


DAY TWO

DAY TWO OF MY EXPERIMENT:

November 3, 2017

Today we are working with Nick in class to understand more about how to create this project.

I asked Nick and some classmates about my project ideas and what they think will work…

So, in the end I am going to do (in short form):

I will have a ‘what do I wear’ button that you click, which will bring you to your weather input, and through RFID it will tell you what clothing you can wear that day based on the weather.

Today I also started to work on my YES and NO answers and statements in an excel sheet.

Here is a link to the excel sheet:

https://docs.google.com/spreadsheets/d/16twY9lrF14V6dvcrtCDqIBHXuPN-m6WgZop_KFjVDyE/edit?usp=sharing

I also went to Creatron to purchase the needed materials: the RFID scanner and scanning cards.

Here are the newly purchased items, which I have never used before and will be incorporating into my experiment:

new-items

Now, because even 15 of the RFID cards were expensive, I only purchased that amount, so not everything in the Excel sheet will be available for this experiment. * I am not going to use all 15 in the end – I am going to use 12, so I also have some left over in case something happens.

The clothing items that are colour coded in RED are what will be available in this experiment, which is shown in the Excel sheet link above.

And all the clothing in this experiment is from my personal wardrobe.

End of day two.

Thanks!


DAY THREE

DAY THREE OF MY EXPERIMENT:

November 6, 2017

Today I worked on my sketches of how I want this experiment to look in the end and how I want it to function.

Here is an image of those sketches:

sketch-1

sketch-2

I also worked on getting my Weather API to work. And yay! Thank you to Feng – she helped me get it all working.

I needed to create a function where the information linked to the button grabbed the main.temp value consistently. This is what my code was missing.

I followed the information Nick gave us about the TTC code and used that same structure – which for the most part worked! I was just missing a few elements, like console.log.

For all the issues I seemed to have, once I added in a console.log statement it would work!

The research I did to get my Weather API working and functioning with P5 was from these links:

https://openweathermap.org/current#name

https://www.youtube.com/watch?v=ecT42O6I_WI&t=802s

I followed the YouTube video step by step – which got me all the correct information. I just needed to expand it a bit to include all the weather values, like main.temp and/or the max and min temperatures (which I didn’t end up including in the end). Through this process of expanding the original code, I learnt a lot about the need for variables. For something to run and gather the data needed, I need to create a variable for it. The function is the grouping of what data is being transported.
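To illustrate the variable-and-function idea, here is a minimal sketch assuming the response shape of OpenWeatherMap’s current-weather endpoint; the variable and function names are my own, not the project’s actual code:

```javascript
// A global variable holds the latest temperature, as described above.
let mainTemp = 0;

// Callback that receives the parsed JSON from the weather API and
// stores the current temperature; the console.log mirrors the
// debugging trick mentioned in the post.
function gotWeather(weatherJson) {
  mainTemp = weatherJson.main.temp;
  console.log("current temp:", mainTemp);
  return mainTemp;
}

// In p5.js this would be wired to the button roughly as:
//   loadJSON("https://api.openweathermap.org/data/2.5/weather" +
//            "?q=Toronto&units=metric&appid=YOUR_KEY", gotWeather);
```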

Here are the images of what the Weather API will show for information (BEFORE AND AFTER CLICKING THE BUTTON):

The numbers do not change that frequently as the weather does not update every second.

Image of BEFORE clicking “get current weather”:

before

Image of AFTER clicking “get current weather”:

after

End of day three.

Thanks!


DAY FOUR

DAY FOUR OF MY EXPERIMENT:

November 7, 2017

Now that I have my Weather API working to get the current weather and the max temp of Toronto’s current weather, I am off to work on getting my RFID scanner up and running.

When I tried to hook it up yesterday, the port on my Arduino disappeared, so I went and got a new Feather M0 – however, the original is working again today. So, I have a backup.

I originally found a PDF – RFID Quick Start Guide: Arduino – to get my Feather M0 talking to my RFID reader, but the code it indicated to use isn’t working for some reason. Since the setup of the Feather M0 to my RFID reader matches all the other charts I Googled, I followed this setup on my board.

Here is an image of what the setup should be on my board (i.e. what wires link to what):

setup-image

So, I am back to the drawing board to find different code, as that seems to be the issue now, because both my Feather M0 and my RFID reader light up when on the board.

In the end, from doing research, I found this code on this website:

NOTE: This website also confirmed I had my pin wiring correct.

https://create.arduino.cc/projecthub/Aritro/security-access-using-rfid-reader-f7c746


So… after all that process of trying to get my Feather to link to my RFID reader – it was a NO GO. That was a lot of time wasted, BUT also a learning curve, and it was good to know it was not the code in the end. It was just the board I was using. I tried switching the code to fit the Feather M0, but it would not function either – I think that is also because I was getting a WARNING about the data information in P5 (though Roxanne told me it shouldn’t really matter because it is just a warning) – but it started working when I switched to the Uno.

I asked some of the second years to help me and give me any insight, and with that I was given an Arduino Uno; all of a sudden the RFID reader worked and I was able to use my RFID/NFC Classic Card to get a UID number.

Here is a screenshot of the first card scan I did:

first-card-scan

Here is a link to the YouTube video I watched to get the card scanner to work:

https://www.youtube.com/watch?v=uihjXyMuqMY&t=166s

So now that I know the best thing to use is an Arduino Uno instead of my Feather M0, I am going to work on learning more about ‘if statements’ and how to link the weather number with the UID number on the card.

When searching through the GitHub zip file I downloaded (from the link below this paragraph), the best code for my project turned out to be the DumpInfo file.

Here is the original DumpInfo Raw file:

https://raw.githubusercontent.com/miguelbalboa/rfid/master/examples/DumpInfo/DumpInfo.ino

For the DumpInfo code, I changed it a bit to match the way this link showed it:

I did this because it was simpler, and I understood the goal of the code better. It is code that looks at two outcomes, “Access” and “Deny” – which in my case I will change to “YES” and “NO”.

http://mertarduinotutorial.blogspot.ca/2017/03/security-access-using-rfid-reader.html

Here is where I downloaded the GitHub zip file:

https://github.com/miguelbalboa/rfid

End of day four.

Thanks!


DAY FIVE

DAY FIVE OF MY EXPERIMENT:

November 8, 2017

Here is an image of the Fritzing diagram of the board that I will be using for this project:

fritzing_experiment3_bb

After doing some research about IF / ELSE statements last night, in general this is what I need to figure out:

This is what I will add into P5, written in a way I can understand what I am coding:

IF CARD ID AND TEMPERATURE = THE CORRECT RANGE OF TEMP THEN DISPLAY YES.

IF CARD ID AND TEMPERATURE DON’T EQUAL THE CORRECT RANGE OF TEMP THEN DISPLAY NO.
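Those two statements could be sketched in JavaScript roughly as follows – the card IDs and temperature ranges below are invented for illustration (in the real project they would come from my Excel sheet):

```javascript
// Each UID card maps to the temperature range its clothing item suits
// (these entries are made-up examples).
const cardRanges = {
  "163 227 174 139": { min: -5, max: 10 }, // e.g. a wool coat
  "42 17 250 3":     { min: 15, max: 35 }, // e.g. a summer dress
};

// IF the card and temperature fall in the correct range, display YES;
// otherwise display nothing (the blank screen stands in for NO).
function displayWord(cardId, temperature) {
  const range = cardRanges[cardId];
  if (range && temperature >= range.min && temperature <= range.max) {
    return "Yes!";
  }
  return "";
}
```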

In Arduino, I need to add in code to link my UID card number to the weather in P5. Here is a link to show the general code I needed to look into (which in the end I did NOT need to use):

https://www.arduino.cc/reference/en/language/structure/control-structure/else/

I also spoke to Finlay about my research for my IF/ELSE statements, and he said I shouldn’t have to do anything in Arduino – I should just make all my UID cards “YES”.

So let’s see how this goes later on…

Taking a break from doing the above ^ research, I started to work on getting the UID card numbers done.

Here is an image to show that process:

uid-card-number

uid-card-process

Presentation Ideas:

I also started to think about how I was going to show this assignment to the class, and I decided I need a rolling rack to bring in the clothing I have decided to use.

However, bringing in a rolling rack to school will be interesting.

So, I am also going to print a small picture of the item of clothing that matches the UID Card and stick it on that – I will do this either way if I do or don’t bring the clothing items in.

Here is an image to show what I am talking about with the cards:

uid-cards

Project / Product Names:

  • Suitable Dress
  • Weather Pants
  • Underdressed = WE HAVE A WINNER! I later changed it to Underdress’d because when I was designing the box to go over my wires and board, I bought a pack of stickers which didn’t have enough ‘e’ letters in it, so I changed the spelling.

Meeting with Nick:

Later on today at 5:30 PM I am meeting with Nick – so hopefully I will have my Arduino and P5 connecting.

I want to ask him about my IF statements, and see if what I have planned is the right direction to go.

Nick assisted in updating my IF statements to have my weather, text, and UID Card number talking to one another.

I also needed to add in a drawYes function to hold my IF statements, because I had them just in my draw function – which Roxanne H pointed out to me! So thank you.

Roxanne also pointed out that I needed to add a String function, like I had for my weather, into my Arduino code, because when trying to link the information it didn’t know what data to grab and send.

I needed to update my IF statements to link the inData code and the mainTemp code, which was done by adding in “==”.

So the new code (in simple terms) would be:

IF (ARDUINO DATA == CARD ID) THEN THE MAIN TEMP RANGE.
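A tiny sketch of that comparison (a hypothetical helper, not the project’s actual code): serial data often arrives with a trailing newline, so trimming before the == check avoids a silent mismatch.

```javascript
// Compare the string read from the Arduino's serial port against a
// known UID card number; trim() strips the newline/whitespace that
// serial reads often carry.
function cardMatches(inData, cardId) {
  return typeof inData === "string" && inData.trim() === cardId;
}
```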

In the end, the concept of my project is:

When wondering what to wear based on the current weather in Toronto, you use a card that is linked to a clothing item in your closet; you scan it and receive a YES on your screen if the item is appropriate to wear based on the weather.

Here is a video of the first trial run of this working:

End of day five.

Thanks!


DAY SIX

DAY SIX OF MY EXPERIMENT:

November 9, 2017

So yay! Code is working and I am feeling much better about this project. I am so happy I stuck through it, even though there were a few times I thought I would have to change it.

But I love the concept and idea I came up with – so I wanted to make it work.

And it does!

So, today I worked on making a box for my board and Uno so it is covered up, and also a way to show my UID Cards.

I booked an appointment with Reza so he could assist me in building something.

Here is an image of the sketch of how I want it to look:

sketch-of-presentation

Here are some images to show what I crafted with Reza – it still needs to look better: 

craft-one

craft-two

I also decided I won’t bring in my clothing; I will make a ‘style book’ where the UID cards will sit with the images of clothing on the cards.

Here is an image to show what I am talking about:

display

box-design-with-reza

I have been having issues with the ‘else Nope!’ branch in my code. I did what Nick suggested, where I type in all my if statements and then add the else Nope! branch at the end, but it would do one of two things: 1. If I scanned a Yes! item before a Nope! item, the Nope! would draw on top of the Yes! word, and 2. otherwise the word Nope! would always be showing.

So I tried adding a delay, but that would make it stop altogether. I also tried to write the same IF statements as I did for the Yes!, but for the Nope! – that didn’t work at all. I had Finlay look at it, and he was also confused as to why it would not show.

So I took a shortcut and decided that Nope! simply would not show on the screen. If an item is OK to wear, you will see a Yes! on the screen; otherwise the screen stays blank.
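One likely cause of the overlap, for the record: p5’s draw() runs every frame, so unless the canvas is cleared with background() at the top of draw(), each frame’s text is stamped on top of the last. Here is a minimal sketch of a pattern that avoids it – the helper name and card list are mine, not the project’s code:

```javascript
// Most recent UID received from the scanner (null before any scan).
let lastScan = null;

// Decide what single word the current frame should show.
function labelForFrame(scan, yesCards) {
  if (scan === null) return ""; // nothing scanned yet
  return yesCards.includes(scan) ? "Yes!" : "Nope!";
}

// In the p5 sketch this would be used roughly as:
//   function draw() {
//     background(255); // clear last frame's text first
//     text(labelForFrame(lastScan, yesCards), width / 2, height / 2);
//   }
```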

End of day six.

Thanks!


DAY SEVEN

DAY SEVEN OF MY EXPERIMENT:

November 10, 2017

Today is the day of presentations!

I am so excited to show what I created. I feel so proud and satisfied with what I was able to design and functionally create.

To take a look at my code go to this link: https://github.com/SavayaShinkaruk/Experiment3

To take a look at the website of my product go to this link: https://webspace.ocad.ca/~3164420/experiment3/indexcopy.html

Here is an image to show the final setup of my presentation of my product:

display2

For this presentation, I set it up like a desk in your room or office area – or wherever your clothing might be.

But for later down the line in this project, I would change it so there were no cards; the code would be linked to an article item number that you scan on a scanner in your closet, and you would get a colour-coded yes or no to answer your question – What Should I Wear?

I would also add in other weather tools like humidity and weather conditions too.

End of day seven.

Thanks!


FINALE PROJECT BLOG POST

Underdress’d:

My Underdress’d experiment is a project for people to work within their own closet to know what to wear for the day based on the temperature outside.

The goal of this project is for people to never be too warm or too cold again based on what they are wearing for the day.

The product revolves around the current temperature – I based mine on Toronto’s weather, as that is where I am living. To know which clothing item works for the current temperature, you have UID cards that act as ID labels for your clothing. So, open the website link, take your UID card and scan it on the scanner (which you have in your home), and the word Yes! will pop up on the screen if you should wear it; if it is a Nope!, you will get a blank screen.

And wow! You will never be underdressed.

Code:

https://github.com/SavayaShinkaruk/Experiment3

URL link:

https://webspace.ocad.ca/~3164420/experiment3/indexcopy.html

Supporting Visuals:

Here is my prototype video: 

Here are some images to highlight my journey to my final product Underdress’d:

There are more images of my process and journey in my blog post above.

uid-cards

display

Design Files:

Here is my Fritzing image:

fritzing_experiment3_bb

Project Context:

The Underdress’d product is a way for men and women to never be underdressed for the current weather where they are living. It is a way for you to never feel too cold or too warm when walking to work or school, or just going out with your friends.

Since I am new to coding, I did a lot of research on YouTube for basic P5 tutorials, and also Googled how an RFID scanner works and runs on Arduino.

During the brainstorming and idea-hatching steps of this project, I wanted to try and link something with fashion. I wanted to do this because it will look good in my portfolio to show technology I built and coded that could be used in the fashion industry. And so my Underdress’d product was created!

The goal of this assignment was to create a peripheral that customizes an input and output function. I am using Arduino and P5 for this project. For my product, you use your mouse to click the “Weather Now” button, which updates the main temperature to the current weather in Toronto. From there, you use a card as an item of clothing, scan it, and receive a Yes or No text message on your computer screen. So I am using an input of a mouse press and card scanner, and an output of API weather updates and text.

Bibliography:

OpenWeatherMap. (n.d.). Current Weather Data. Retrieved November 6, 2017, from https://openweathermap.org/current#name

Arduino: Project Hub. (n.d.). Security Access Using RFID Reader. Retrieved November 7, 2017, from https://create.arduino.cc/projecthub/Aritro/security-access-using-rfid-reader-f7c746

Arduino: Project Hub [Photograph]. (n.d.). Retrieved from https://create.arduino.cc/projecthub/Aritro/security-access-using-rfid-reader-f7c746

Github. (n.d.). Miguelbalboa/RFID Library for MFRC522. Retrieved November 7, 2017, from https://github.com/miguelbalboa/rfid

The Coding Train. (2015, October 30). 10.5: Working with APIs in Javascript – p5.js Tutorial [Video file]. Retrieved from https://www.youtube.com/watch?v=ecT42O6I_WI&t=802s

logMaker360. (2016, January 13). MF522 RFID Write data to a tag [Video file]. Retrieved from https://www.youtube.com/watch?v=uihjXyMuqMY&t=166s

Mer Arduino and Tech. (2017, March 26). Arduino Tutorial – Security Access Using RFID Reader (MFRC522) [Video file]. Retrieved from https://www.youtube.com/watch?v=3uWz7Xmr55c

Security Access Using RFID Reader (MFRC522).  (n.d.). Retrieved November 7, 2017, from http://mertarduinotutorial.blogspot.ca/2017/03/security-access-using-rfid-reader.html

Arduino. (n.d.). Else. Retrieved November 8, 2017, from https://www.arduino.cc/reference/en/language/structure/control-structure/else/

 

DOUGH NOT GAME BY MAX AND SAVAYA.

Title: Dough Not Game

Group: Savaya Shinkaruk and Max Lander


Project Description

Our game was created by thinking of a project for our Multiscreens assignment.

The goal of this experiment is to create an interactive experience for 20 screens. This could be 20 laptops lined up in a row, 20 phones laid out in a grid, or something of your own imagining. Possible inputs include camera or mouse. Possible responses will be reviewed in class. Students are responsible for developing their own conceptual framework. – Nick and Kate

When we were first assigned this project, we wanted to create something that would be fun but would also play with people’s emotions while working with a pre-designed art installation. We need an audience to create the image; with no audience there is no art piece.

We go more into depth about the journey of our process towards making our game in our blog, but the overall description of our project is:

We created a game where people interact together as much as they interact with their phones. The idea is an exhibit where you and others need to work together to create a large image – large enough that 20 screens are needed, because each phone being used will show a piece of the whole image.

To play the game, navigate to dough-not.glitch.me on your smartphone. Once it loads, make an erasing motion on your phone screen to reveal an image; once you have ‘scratched’ enough to see most of the image, shake your phone to stop the ellipses (circles) from hiding it again. Zoom in on the image if necessary to align the edges with your screen, and line up your phone with your fellow players’ complementary images.
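To show how a full image might be divided across the 20 phones, here is a small sketch: number the screens 0–19 and compute the rectangle of the source image each phone should draw. (The 4×5 grid shape is an assumption for illustration; the project only specifies 20 pieces.)

```javascript
// Compute which rectangle of the source image the phone numbered
// `index` (0..cols*rows-1) should display, reading left to right,
// top to bottom.
function tileForScreen(index, imgWidth, imgHeight, cols = 4, rows = 5) {
  const w = imgWidth / cols;
  const h = imgHeight / rows;
  const col = index % cols;
  const row = Math.floor(index / cols);
  return { x: col * w, y: row * h, w, h };
}
```

In p5.js, each phone could then draw its piece with `image(img, 0, 0, width, height, tile.x, tile.y, tile.w, tile.h)`.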

Be sure to test the game because there are lots of fun emotional factors!

So, continue on to the rest of our page to read more about the ‘Dough Not Game’ team and our journey.


About team

Savaya Shinkaruk: Savaya Shinkaruk is a fashion stylist and journalist with a keen interest in wanting to blend components of the online fashion industry with design. She graduated with a BA in communications in 2017 and is completing her MDes in Digital Futures at OCAD University.

Max Lander: N. Maxwell Lander is a photographer, designer, game-maker and hedonist. His work often blurs the line between disgust and desire, and involves a lot of fake and real blood. He enjoys making things that engage with gender, kink, violence, and neon. In particular, his work critically engages with masculinity in ways that range from the subtle and playful to brutal and unnerving.


BLOG: PROCESS JOURNAL

DAY ONE

DAY ONE OF OUR EXPERIMENT:

October 16, 2017

Today we were introduced to our second assignment in our Creation and Computation class.

This assignment is to come up with a digital concept that is both interactive and is shown on 20 digital screens. These screens can include an iPhone, computer, or iPad.  

During class hours, Max and I discussed some possibilities of what we could do for this assignment.

NOTE: Our research for this project mainly came from using Make: Getting Started with p5.js – testing code from there and re-testing it till it worked the way we wanted it to.

Also from the Reference page on the https://p5js.org/reference/ website – again, just testing and re-testing code till it worked the way we wanted.

Here are some initial ideas we started with:

  1. We want our digital screen to be a phone.
  2. Each person gets a piece of a full image on their phone screen.
  3. The interactive element is to have people walk around and look for their puzzle match so people can create a large image.
  4. The other interactive element is to have each person go to the link where their image will be and have to do a ‘scratch and sniff’ method to find out what their image is.

We liked the puzzle concept so much, that we decided to keep this concept but figure out a way to make it more challenging when it came to coding and designing this. And also a more frantic interactive element.

Here are things we need to remember when putting our project together:

  • This assignment has to be interactive on a digital and physical level – which we accomplished in our brainstorming ideas.
  • We have to code it in a way so people don’t get the same image every time.
  • We need to make it digitally challenging for us to code but for people to play too.

In the end our concept is:

You download the code – erase to see what image you get – then assemble a puzzle – with a frustrating factor (which we still needed to create).

We came up with a game where you interact with people in the room by using your phone to create a larger, full image. Each person in the room will have a different piece of the puzzle – and to figure out what your image is, you have to use your finger on your phone to make it appear.
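
A rough sketch of how this "scratch to reveal" mechanic can look in p5.js (the file name piece.png and the brush size are our placeholders, not the project's actual assets):

```javascript
// "Scratch to reveal": the canvas starts blank, and each finger-drag
// stamps the matching patch of the hidden image using p5's copy().
// piece.png and the brush size are placeholders.
let img;
const BRUSH = 40; // size of the revealed patch, in pixels

function preload() {
  img = loadImage('piece.png');
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  background(255); // blank white screen hides the image
}

// Pure helper: the square region revealed by one "scratch" at (x, y).
function brushRect(x, y, size) {
  return { x: Math.round(x - size / 2), y: Math.round(y - size / 2), w: size, h: size };
}

// mouseDragged also fires for touch-drags on a phone.
function mouseDragged() {
  const r = brushRect(mouseX, mouseY, BRUSH);
  // Copy that patch of the image onto the same spot on the canvas.
  copy(img, r.x, r.y, r.w, r.h, r.x, r.y, r.w, r.h);
  return false; // prevent default touch scrolling
}
```

This assumes the image has already been sized to match the canvas – fitting it to the screen turned out to be its own problem, as later posts describe.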

Additions to our concept:

  • Will the image you are trying to see stay put, so you don’t have to keep using your finger to get the image on your screen?
  • What will the frustrating factor be? One idea we really like is to have random ellipses popping up on the screen while you are erasing. As if the computer is fighting back.
  • A colour theme.
  • An image.

From here we both went home to research and test ways to make the design and coding more intricate, take the game to the next level, and make it look better visually.

Here are some sketches of the ideas we came up with:

img_3032

img_3031

End of day one.

Thanks!


DAY TWO

DAY TWO OF OUR EXPERIMENT:

October 20, 2017

Over the week Max and I were busy with other school projects – but kept in touch via Facebook when we had an idea or coded something new.

During our conversations we both felt we needed to add another element to the interactive portion of people ‘scratching’ to see the image on their phone screen – which we mentioned on day one when we talked about using random ellipses.

In class on October 20, 2017 we re-connected face to face about some of the ideas we thought about and started to get to work.

Things to think about / research to code / we want to incorporate:

  • Cut a full image into an even number of squares / cut it into 20 pieces OR cut it into fewer than 20 pieces so pieces appear more frequently.
  • Have a reload button so people can reload to get a new image if someone else already has their image.
  • Figure out a way to have an image fit to the screen.
  • Random versus sequential (x+1, x+2) when it comes to who gets what image.
  • How does the image stop?
  • List of images // each page refresh will place someone randomly in that list // and will then have a button to move sequentially through that list.
  • Pick a phone // ‘scratch and sniff’ process with that phone to figure out the image // see if it links to a buddy’s image // if not, press next image
  • We will use 20 phones // but talked about only using 10 // could change this later.
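
The "random versus sequential" idea in the list above can be sketched in plain JavaScript (function names are ours, not from the project code): each page load drops the player at a random position in the list of pieces, and the next-image button steps through the list sequentially, wrapping at the end.

```javascript
// Random start, sequential next: names here are our own sketch.
const NUM_PIECES = 20;

// On page load, drop the player at a random position in the list of pieces.
function randomStartIndex(numPieces) {
  return Math.floor(Math.random() * numPieces);
}

// Each press of the NEXT IMAGE button moves one step forward, wrapping around.
function nextIndex(current, numPieces) {
  return (current + 1) % numPieces;
}

// Stepping through the list from a random start visits every piece exactly once.
let idx = randomStartIndex(NUM_PIECES);
const visited = new Set();
for (let i = 0; i < NUM_PIECES; i++) {
  visited.add(idx);
  idx = nextIndex(idx, NUM_PIECES);
}
console.log(visited.size); // 20
```

Wrapping matters because a player who starts near the end of the list still needs to be able to reach the earlier pieces.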

Images to use:

    • Doughnuts.
    • Something poetic.
    • Something sensational.
    • We need an image that doesn’t have a lot of detail in it, so it will be easy for people to find their match.
    • So we decided on the: Doughnuts image – shown below.

doughnuts_crop

Testing period for some of the things we wanted to add / change / test:

P5 Code / what we looked at / what is working / what is not:

  • Background image – full image or to cover with a colour to punch through?
  • Adding and using JavaScript as well – so we can see how to load images from a different folder – as we are unsure if p5 can do so.
  • p5’s copy code – it copies whatever you tell it to – in our case we are telling it to copy the image we put in the background.
  • Loaded the image. Then when you click, it’s copying onto your canvas / it’s stamping (which is the copy code) the pattern of the image we loaded into the script. BUT because it’s not making the image full width, it’s not filling the window – but we can stretch it or find a generally sized image.
  • Trying to find an image that takes the size of the screen – like fit to content in InDesign.
  • Issue we keep seeing – it’s only copying at its default size. Why is this?
  • Maybe try to make a canvas and fit the image to that…
  • We are trying not to have to put another background colour over top of the image to then erase or copy – after testing, we can’t erase.
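
The fit-to-screen problem in the notes above (copy stamping the image at its default size instead of filling the window) comes down to computing a scaled destination rectangle. One possible helper, with names of our own choosing:

```javascript
// Compute destination dimensions that scale an image to fill the window
// while preserving its aspect ratio (like "fit to content" in InDesign).
// Helper name is ours, not from the project code.
function fitToScreen(imgW, imgH, winW, winH) {
  const scale = Math.max(winW / imgW, winH / imgH); // cover the whole window
  return { w: Math.round(imgW * scale), h: Math.round(imgH * scale) };
}

// In a p5 sketch the result would feed copy()'s destination rectangle, e.g.:
// const d = fitToScreen(img.width, img.height, width, height);
// copy(img, 0, 0, img.width, img.height, 0, 0, d.w, d.h);

console.log(fitToScreen(400, 300, 800, 800)); // { w: 1067, h: 800 }
```

Using Math.max covers the window with no gaps (edges may be cropped); Math.min would instead letterbox the image inside the window.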

Codes we are using / are wanting to use:

This list will keep being updated as we move through our project.

  • Random Background image code
  • Copy code
  • Random Ellipse code
  • Fill code
  • Stroke code
  • Random Background code
  • Button for next image code
  • Anti-Bounce code
  • Shake code
  • FrameRate code

Game names:

    • Digital scratch game
    • Puzzle scratch game
    • Puzzle party game
    • Scratch and party game
    • Erase and puzzle game
    • Scratch array game
    • Do Not game – Winner! – but we changed it… See what we changed it to as you read through the rest of our blog.

Ways to make the game visually interesting:

  • Play with the colour theme.
  • Add a background colour image – jk leave it white.
  • Make the circles a different colour / grow / change colour /
  • Have the copy be white.

After today and testing and trying our ideas – some things worked and other things didn’t. But we assigned jobs for each of us to work on over the weekend until we met on Monday.

Based on the ideas we had last week though, here is a video of the first trial run:

Max: work on figuring out a way to make the image fill the page without messing up the copy code.

Savaya: work on making it more visually appealing, play with the copy code and its colour, and work on adding another ‘frustrating’ element.

Here is a video to show the practiced colour theme of the random ellipses (not final):

End of day two.

Thanks!


DAY THREE

DAY THREE OF OUR EXPERIMENT

October 23, 2017

Today we worked on the blog a little bit more to showcase our process – both as a group and separately (always thinking of the group).

Here is an image to show the colour theme of the ellipses:

screen-shot-2017-10-23-at-12-57-29-pm

These colours were chosen from the Doughnut image using Adobe Color CC.

We also finalized the concept of the game:

The full image our class will be putting together – if they can – will be an image of doughnuts. We will be using 20 screens and each person should get a different image to create the final image.

You will download the code given to you by us. You will see a blank screen. Use your finger in an erase-like motion on your screen to try to see what piece of the image you get. If you get one that someone else already has, click the NEXT IMAGE button and try again until you get an image no one else has.

THE CATCH: as you are trying to see the image on your screen, another function (the random ellipses) is also deleting it – so keep swiping fast!! And will you ever be able to put together the complete image? Yes! When you have your image, shake your phone to stop the ellipses from copying over it.
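
The shake-to-stop behaviour can be sketched with p5's deviceShaken() callback (the threshold, frame rate, and ellipse sizes here are our guesses, not the project's final values):

```javascript
// "Shake your phone to stop the ellipses" using p5's deviceShaken().
// The threshold and sizes are our guesses, not the project's values.
let frozen = false;

function setup() {
  createCanvas(windowWidth, windowHeight);
  setShakeThreshold(30); // how hard the shake has to be to register
  frameRate(12);
}

function draw() {
  if (frozen) return; // once shaken, stop covering the image
  noStroke();
  fill(255);
  ellipse(random(width), random(height), random(10, 80));
}

// p5 calls this when the device's acceleration exceeds the shake threshold.
function deviceShaken() {
  frozen = true;
}
```

Because draw() keeps running after the shake, returning early is enough – the previously revealed pixels simply stay on the canvas.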

Image of the emotional goal for the player using the game:

img_3033

So, now that we have figured out the context of the game and are working to finalize the colour theme, we are seeing a few issues.

A couple problems we are seeing:

  • When we used the erase-like motion on the phone screen to see the image, the movement wasn’t smooth on either Android or iPhone – it was moving the whole screened image rather than letting you ‘scratch’ to see your image with no extra movement. But yay! We fixed it. (In CSS: position: fixed and overflow: hidden.)
  • The random ellipses are a little bit TOO crazy. We need to figure out a way to slow them down. (the discovery of frameRate!)
  • We need to re-size the doughnut image to make it larger so when people have their piece of the puzzle that small image will cover the screen of their phone.
  • When putting the ellipse code and the game code together, the scale of the ellipses was doing something weird to the image.
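
The ellipse fixes above can be sketched like this (the frame rate, sizes, and colour are our stand-ins, not the final values):

```javascript
// The "computer fights back" ellipses, slowed with frameRate so they
// only produce a mild panic. Sizes, colour, and rate are our choices.
function setup() {
  createCanvas(windowWidth, windowHeight);
  frameRate(12); // the discovery that tamed the TOO-crazy ellipses
}

// Pure helper: random centre and diameter for the next ellipse.
function randomEllipseParams(w, h, maxSize) {
  return {
    x: Math.random() * w,
    y: Math.random() * h,
    d: 10 + Math.random() * (maxSize - 10),
  };
}

function draw() {
  const e = randomEllipseParams(width, height, 80);
  noStroke();
  fill(255); // white, to blot the revealed image back out
  ellipse(e.x, e.y, e.d, e.d);
}
```

Lowering the frame rate slows the whole sketch, which is fine here: one ellipse per frame at 12 fps is exactly the pace of mild panic we wanted.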

The next step after solving these issues / steps to fix:

  • When cutting the images we realized we cannot cut it into 20 pieces because of the size. So – the glitch in our assignment would be that only 18 people would actually be a part of the whole image, but everyone would be a part of the game. – We decided not to do this – we made it work for 20 screens.
  • When sizing the image into pieces we should err on the smaller side, because people can enlarge the image afterwards to ‘connect’ the dots as closely as possible.

Here is an image to show how we are going to break up the whole image:

22851652_10155066211522897_429250688_o
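
Cutting the whole image into 20 pieces for 20 screens amounts to computing each piece's source rectangle in a grid. A small sketch, assuming a 5 × 4 layout (our guess from the 20-screen count):

```javascript
// Source rectangle for piece i of an image cut into a cols x rows grid.
// The 5 x 4 layout for 20 screens is our assumption, not the project's spec.
function pieceRect(i, imgW, imgH, cols, rows) {
  const w = Math.floor(imgW / cols);
  const h = Math.floor(imgH / rows);
  return { x: (i % cols) * w, y: Math.floor(i / cols) * h, w, h };
}

// Example: piece 7 of a 1000 x 800 image in a 5 x 4 grid.
console.log(pieceRect(7, 1000, 800, 5, 4)); // { x: 400, y: 200, w: 200, h: 200 }
```

Flooring the piece dimensions means a few edge pixels can be dropped when the image size doesn't divide evenly – which echoes the sizing trouble described above.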

End of day three.

Thanks!


DAY FOUR

DAY FOUR OF OUR EXPERIMENT 

October 24, 2017

Today we worked on turning our ‘get a new image’ tab on the page into an image, so when you scrolled over it with your finger it wouldn’t get highlighted – which it was doing when it wasn’t an image.

Here is an image of the New Image link:

newimage

Because the default p5 button doesn’t seem to have a built-in way to make it an image, we decided to change the CSS properties of all buttons to show the above image – this solution would not work if we were using multiple buttons, but we are not, so yay! If we were using multiple buttons, we imagine the way to do it would be to create the buttons in HTML and then look into how to link them to a p5 function.
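
A sketch of how the single p5 button can be given the New Image graphic (the file name newimage.png and the helper name buttonStyles are our placeholders):

```javascript
// Style p5's one button to show an image instead of a plain label.
// newimage.png and buttonStyles are our placeholders.
let btn;

// Pure helper: the CSS properties that turn the button into an image.
function buttonStyles(url) {
  return {
    'background-image': `url('${url}')`,
    'background-size': 'contain',
    'background-repeat': 'no-repeat',
    'border': 'none',
  };
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  btn = createButton(''); // empty label; the image is the whole button
  btn.size(120, 48);
  for (const [prop, val] of Object.entries(buttonStyles('newimage.png'))) {
    btn.style(prop, val);
  }
  btn.mousePressed(() => {
    // advance to the next puzzle piece here
  });
}
```

Styling through p5's .style() keeps everything in the sketch file, which only works cleanly because there is a single button to style.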

We also worked on finalizing the speed and look of the ellipses. They feel much faster on a phone than they do on a computer, so after fiddling around we decided to slow them down a little bit more, so as to produce only a mild panic.

We also decided to update the name of our game from The Do Not Game to The Dough Not Game.

End of day four.

Thanks!


DAY FIVE

DAY FIVE OF OUR EXPERIMENT

October 26, 2017

Today we worked on making our game look good and finalizing all the code and the flow of the game. We did this via Facebook, messaging whenever we thought of something we would need to add to make the interaction of the game smooth.

Here is an image to show how it will look once everyone has gotten an image on their phone:

22833414_10159549073055451_1429570892_o

We asked on our DF Grads 2019 Facebook page what kind of phones everyone had, to make sure the sizes of the pieces of the image would fit on people’s screens. Based on everyone’s answers, we are seeing that some people will have to enlarge their image on their phone screen to match their partners’.

Here is an image to show everyone responding to our question on Facebook:

img_3060

This information also reassured us that using 20 screens will work!

For the interaction portion of the game, we were talking about how it might be a little crazy for everyone to be running around looking for their puzzle partner – so we added a little tip to our instructions.

The tip: To make this process easier, try getting into groups or pairs so you can work on pieces of the puzzle together, rather than doing it individually and hoping you will find someone with the next image to match yours.

After all our brainstorming, coding, and hard work, here is a GIF to show you a preview of our game, The Dough Not Game:

End of day five.

Thanks!


DAY SIX – THE REVEAL OF THE DOUGH NOT GAME

DAY SIX OF OUR EXPERIMENT 

October 27, 2017

To take a look at our code click this link: The Dough Not Game Code

To play the game go to this link: http://dough-not.glitch.me

Here is an image of our branding:

dough-not

Here is the final video of everyone playing our game:

Here is an image to show everyone’s interaction with the game:

img_20171027_144917

Here is an image to show the whole image everyone created by playing the game:

Processed with VSCO with f2 preset

In the end there were not 20 phones being used, but it created a cool picture!

End of day six.

Thanks!


FINAL PROJECT BLOG POST

Dough Not Game:

The Dough Not Game is a fun art and digital installation in which you connect with others in a single room, and which also plays with your emotions.

To play The Dough Not Game: download the code that is shared with you. Do an erase-like motion on your phone screen to get a piece of a complete image. Once you have ‘scratched’ enough to see most of the image, shake your phone to stop the image from disappearing under the random ellipses. From there, find others playing the game and connect with them to match your pieces of the puzzle. Once you have found your puzzle partners, you will create a full image across your phone screens. Remember, you need up to 20 people to play this game.

Here is a tip to make this game a little more interactive and fun: try getting into groups or pairs so you can work on pieces of the puzzle together, rather than doing it individually and hoping you will find someone with the next image to match yours.

Take a picture and tag us in it to show us the image you made with your peers!

Project Members: Max Lander and Savaya Shinkaruk.

Project Context:

The Dough Not Game was designed and created to be a fun art and digital installation game for those bonding in a common space. It is now more common for people to come into a shared space and automatically go to their digital screens to connect online rather than face to face – so Max and Savaya created this game so people could connect with others face to face, while still needing a digital screen to make an interaction happen.

We also made it so people have to come together and work together to create a bigger picture – but they also need to work on their own ‘erasing and scratching’ skills to find a piece of the bigger image. Most of our research on how to create this game consisted of brainstorming fun game ideas, and then using our p5.js book and the online p5.js reference to make it happen.

During our brainstorming we both liked the games where you are given a mismatched complete image and have to move the pieces around to make the complete image. So we took this concept and went with it, and talked about how, through our understanding of the p5 code, we could do this. And so, The Dough Not Game was created.

With the goal of our given assignment – to create an interactive experience for 20 screens – we took that concept so you would have an interactive component on your own screen, but also so the piece would be interactive with other people in a common space. With that in mind we created something that was fun to make and play, and explored different areas of a new coding system (p5.js). With a broad concept, we thought of things that we would like to use and do when it came to making an interactive piece.

Code:

The Dough Not Game Code

URL link:

http://dough-not.glitch.me

Supporting Visuals and Design Files to show our process to our final code and URL link to our game are throughout the blog post above. ^^

Bibliography:

McCarthy, L., Reas, C., & Fry, B. (2015). Make: Getting Started with p5.js. (1st ed.). San Francisco, CA: Maker Media

p5.js. (n.d.). Reference. Retrieved October 16, 2017, from https://p5js.org/reference/

End of experiment.

Thanks!