Pocket Oracle

Roxanne Baril-Bédard & Sean Harkin

description

A small wooden box housing a WiFi-connected derivative of the old “Magic 8 Ball” novelty item.  At the press of a button, the unit randomly assembles a text-based bit of “advice” from a handmade API containing an ample vocabulary, for a possible 203,112,000 unique answers.  After downloading the JSON, the device can work offline.

It’s meant to be cryptic and mysterious, so that the user has to interpret what the oracle means, much like the seers of old. Users, pretty much anyone with an interest in the mysterious, can ask for advice whenever they hesitate between options during the day. The device needs to be portable so they can keep asking it questions throughout the day.

Because of the portability requirement, we wanted to design the device to be as small as possible. Although we wanted the final design to be small enough to fit on a keychain, we had to compromise between available time and component size. The device is still small enough to be easily portable and is very lightweight.

We want to test how useful the esoteric advice received from the Oracle is in everyday settings, and understand the limitations of the device.

 

The Device

portable oracle


Production 


final Bill of Materials (spreadsheet including costs & suppliers)

 

final circuit diagram


final code 

https://github.com/metanymie/portableguru

Journal

Day One (Monday, November 27)

Discussion in class regarding form and function of the project.  Two ideas seemed to fit the group’s vision:

  1. A derivative of the “Fitbit” wearable fitness/exercise monitor, but with included timers designed to help with HIIT type training and reps.
  2. A derivative of the novelty “Magic 8 Ball” which would pull “advice” or “answers” from random API sites and display them on a small (and wearable or “keychainable”) OLED screen.

After some back-and-forth, the simplicity of the “Magic 8 Ball” application appealed to all three of us and we decided to go with that.  Team to meet tomorrow approximately 16:00.

Day Two (Tuesday, November 28)

Roxanne went to Creatron to pick up the required OLED FeatherWings and extra stackable headers (in case of need) and to order the batteries. We soldered the assemblies, downloaded and installed the required libraries (Adafruit_SSD1306 and Adafruit_GFX), then ran the example code provided on the Adafruit Learn site (https://learn.adafruit.com/adafruit-oled-featherwing/usage).  Sean then successfully tested the battery we’ve chosen to use (his being the only unit with a battery so far).  We also decided that there should be a “no internet available” subroutine drawing from a preset array of “answers” (see below): if the “Guru” is connected to the web, it draws from a random selection of APIs for its answers; if no internet connection is available, it draws from its preset array.

Roxanne tried some of the available fonts to see what would work best. Most were too big for the screen. They also figured out how to wipe the previous answer from the screen.

Dec 1st

Roxanne: Trying to get io.adafruit working with the board

First, follow this tutorial to update the SSL certificates:

https://learn.adafruit.com/adafruit-feather-m0-wifi-atwinc1500/updating-ssl-certificates

Also install libraries: ArduinoHttpClient, Adafruit MQTT, Adafruit IO Arduino

Trying to make this tutorial work, I had to ask Nick for help: it is not explicit that you need the updated SSL certificates for both the ArduinoHttpClient and Adafruit MQTT libraries, so the compiler would not work (rather frustratingly).

Tried to make this tutorial work so we’d be connected to the internet via WiFi: https://learn.adafruit.com/adafruit-io-basics-digital-input/overview

Success! Able to send data from the Arduino to io.adafruit. It’s only random numbers, but the bridge is working!

Got it working with a slider and a button, sending a value on press. Getting there!

Nick says Adafruit IO isn’t useful for our purposes, so most of what we got working today served nothing. Instead, we must look for a tutorial on making an HTTP client, pointing it at a web address, taking the JSON message, and putting it in a JSON object.

Sean: We also spent some time working on the back-up option of preloaded responses on the Feather. We already had the basic functionality: a display screen asking if you need advice, the ability to push one of the three buttons, and a different response each time.

From here, we wanted to build a basic random function where any button push would pull from a list of responses.
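That fallback is simple to sketch in plain C++. The phrases below are placeholders rather than our actual preset list, and `offlineAnswer` is a hypothetical name; on the device the same logic runs inside the button handler.

```cpp
#include <cstdlib>
#include <string>
#include <vector>

// Hypothetical stand-ins for the preset "no internet available" answers;
// the real device keeps its own list in program memory.
const std::vector<std::string> kPresetAnswers = {
    "The path is unclear. Wait.",
    "Ask again when the moon is higher.",
    "Yes, but not for the reason you think."
};

// Any button press simply picks one preset answer at random.
std::string offlineAnswer() {
    return kPresetAnswers[std::rand() % kPresetAnswers.size()];
}
```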

Dec 5

Day 1 Build: The original idea was to build the casing from wood. We enjoyed the idea of a little wooden box you carry around and ask for advice, as well as the contrast of the organic wood with the digital OLED screen. However, when Sean went to build, he ran into some issues:

  • The wood we had available was mostly ply, and mostly low-grade (3-4 ply)
  • Even if we were to source better materials, the time constraints of the build would not allow for the quality we initially pictured for our product.

Tomorrow, we will either begin 3D printing the prototype cases or use acrylic. If we are unable to create the three casings in time tomorrow, Sean can quickly build another two plywood cases for testing purposes; with some work they could be brought to a fair standard before the presentation.

At last week’s class, Kate suggested that we remove the last answer from the screen after a certain amount of time, which seemed like a great idea to encourage interactivity with the user. Despite some small hiccups, we got this functionality into the code. 
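The timeout itself is just an elapsed-time check. A minimal sketch of the logic, with a hypothetical function name; on the device, `now` would come from `millis()`:

```cpp
// Returns true once the current answer has been on screen for at least
// timeoutMs milliseconds, at which point the display is wiped.
// Unsigned subtraction keeps the comparison correct even if the
// millisecond counter wraps around.
bool answerExpired(unsigned long now, unsigned long shownAt, unsigned long timeoutMs) {
    return now - shownAt >= timeoutMs;
}
```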

 

Dec 6

Day 2 of build went well. Sean had access to the Ultimaker, meaning he began printing a version. Even if we do not have time to print 3, he thought it might be interesting to see an example of how the MK1 case design might look printed.

Sean also experimented with some different materials. First he tried acrylic. Although it was easy to work with and joints were simple using adhesive bond, it shared the plywood’s weakness of being too brittle. Although there is little-to-no worry of it splintering or fracturing, Sean was cautious of components snapping during prolonged use. While waiting on the print, we decided to return to wood. We found some MDF in the workshop and began working. It was also around 1/4-inch thickness but seemed to work well. With some help from the esteemed Reza, Sean was able to build a fairly well-crafted box. He will return in the morning to complete the lid, and then all we have to do is upload the final version of the code and begin testing.

Roxanne set out to figure out a way for the Feather to make requests to a server. After many hours of code, the Feather is able to connect to the raw JSON file on their GitHub. They wrote the JSON using the visual tool Tracery (http://www.brightspiral.com/tracery/).

We didn’t succeed in splitting one string array into many strings so that the sentence structure would be generated from the origin array, as in Tracery proper. Instead, we hardcoded the sentence structure, telling the program which array to pull a random phrase from at each position. As a result, every advice sentence follows the same pattern.
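As a rough sketch of that hardcoded pattern: the word banks below are tiny placeholders (the real JSON grammar’s vocabularies are far larger, which is how the product of the bank sizes multiplies out to 203,112,000), and the function names are ours.

```cpp
#include <cstdlib>
#include <string>
#include <vector>

// Placeholder word banks; the real grammar lives in the JSON file.
const std::vector<std::string> kOpeners = {"You have to", "Have you tried to", "You could always"};
const std::vector<std::string> kVerbs   = {"not let go of", "fight", "implement"};
const std::vector<std::string> kTopics  = {"the answer", "life", "your mom"};
const std::vector<std::string> kClosers = {"You'll get hurt.", "Who cares.", "LOL"};

std::string pick(const std::vector<std::string>& bank) {
    return bank[std::rand() % bank.size()];
}

// Every sentence follows the same fixed pattern: opener + verb + topic + closer.
std::string advice() {
    return pick(kOpeners) + " " + pick(kVerbs) + " " + pick(kTopics) + " . " + pick(kClosers);
}
```

With four banks, the number of unique sentences is simply the product of the bank sizes.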

We also had a problem with the server address: since we put the JSON on GitHub, it was hard for the Feather to connect to a protected HTTPS server. We resolved it by using the website rawgit.com.

Dec 7

Sean finished the box today with some help from Reza. The 3D print, although functional, had a less pleasing aesthetic feel. He continued working with the MDF casings, as these seemed the most stable and easiest to work with. Originally we had planned a simple dowel lid that could be pulled off for maintenance and repair, but Reza suggested and helped build a magnetic lid – less stable, but much easier for getting access to the internal components. Similarly, we had thought of using rubber or foam for the button; however, Reza suggested adding some contrast with the wood in the form of colourful acrylic. With the addition of some hot glue, which acts as suspension, the button works effectively and consistently.

Overall, we were fairly happy with the final product’s shape. Going forward, we would definitely want to print our own board in order to cut down to a more convenient size, namely something that could attach easily to a keychain and be unobtrusive in a user’s pocket or bag.

Roxanne finished up the code and made the JSON create more interesting sentences. With all of the vocabulary they added, there is a total of 203,112,000 possible unique advice sentences. When writing the code, they were really inspired by oracles such as the Yi Ching, so the new sentences are more mystical than goofy.

Example sentences:

You have to not let go the answer vis-a-vis death . You’ll get hurt.

Have you tried not to see life and them . Who cares.

Have you tried to fight your mom vis-a-vis life … but it’s a waste of time.

I know! Just implement her with love . Alea jacta es!

You could always not say the advice concerning her … LOL

User Testing Materials

user testing plan

Prep

No extra supplies needed, possibly just a short time to recharge the battery during testing sessions. We also should not require any repair materials.

Plan

Testing will be recorded through a combination of photographs, video and journaling.

There may be some deviation in the recording methods, as they will depend on where we are while testing the product. However, we will aim to be as consistent as possible.

As of today, we’re planning on testing Wednesday, or possibly Thursday depending on build-time.

Since our devices are not in constant use, our plan is to test for 24 hours, using the device whenever we need to make a decision or need advice. The users (us) will make our own interpretations of the advice given and will record this process of interpretation and outcome.

The testing will be conducted during day-to-day activities. We will most likely be independent when testing, as the devices do not require interaction.

After

The debrief will be held as a discussion/write up session after testing, and before our blog-post is submitted.

The user photo/video journal will be added to the blog post, along with comments from the users.

link to end-of-session report forms

https://docs.google.com/forms/d/e/1FAIpQLScYSNOkyMNSGYy30nYR-r6JU89ptl06flzWJ_yRy-4hUoioBg/viewform?usp=sf_link

photos, video, or other media gathered in the field


summary of the testing process 

We set out to use the device for a day to see how useful it would be, and to offer its advice to other people we interacted with.

For Sean’s testing period, he spent around 20 hours testing the device on any and all sorts of decisions made during that time. He also conducted mini user tests in a social environment, asking friends and passers-by to try the device. Due to the short amount of testing time available, we were unable to hand the devices off to users for prolonged testing, so we conducted these sessions ourselves.

reflections on your findings

The results of the user testing were interesting, though they mostly confirmed what we already knew. Some users took to the philosophical and mysterious overtones of the experience very well, but many did not seem to understand their role in interpreting the advice. This is not necessarily a weakness of the product so much as evidence that we have a specific target market.

EXHIBITION


References & Related Works

references

Adafruit Feather M0 WiFi with ATWINC1500: Updating SSL Certificates: https://learn.adafruit.com/adafruit-feather-m0-wifi-atwinc1500/updating-ssl-certificates

ArduinoJson: Manual: http://arduinojson.org/doc/

What and where are the stack and heap?: https://stackoverflow.com/questions/79923/what-and-where-are-the-stack-and-heap

WiFi101 Library: https://www.arduino.cc/en/Reference/WiFi101

Arduino HTTP Client library: https://github.com/arduino-libraries/ArduinoHttpClient

RawGit: https://rawgit.com

 

Experiment 3: Palmistry Yi Ching


A Project By
Roxanne Baril-Bédard

Find it here, activated on click.
Here is the repo.

This project consists of a laser-cut acrylic box with a hand engraving and two small holes for photocells. When a hand is pressed onto the engraving and the photocells are completely obscured, a random number between 0 and 63 is generated. The number is looked up in two APIs’ JSON to get the hexagram character associated with it, along with its title and description, which are shown in the browser with a cute background.

Sketches and process


Box model


I made the box outline using this. The red lines are cut; the black areas are etched.

Process of my illustration with inspiration for palette


Pictures of the box and microcontroller’s circuit


Journal

I have decided to make a palmistry-type machine, like the ones in the entrances of theatres. I am interested in all types of divination, and I wanted to use the Yi Ching, which conveniently has characters I could print directly. I find the Yi Ching has really interesting descriptions and is not widely known, so it has a little more spice than all the tarot-themed projects I have seen in esoteric tech. I like the idea of having a computer tell you your fortune.

I also wanted a project where I could make cute visuals, so I decided to go for a physical sensor plus some p5 sketches and vector illustrations. I want it to be a kawaii overload. I thought of using a biosensor, but they were really expensive, so instead I decided to use a light sensor and design an acrylic box reminiscent of palmistry machines, inviting people to put their whole hand on the box for a more immersive experience, even if in the end it is only picking up light or the lack thereof; almost a button, but with a more immersive interaction.

I first worked on designing a box that I could laser cut. Since cutting takes time, I figured the final box should be done earlier rather than later, so I started with the Illustrator file I would send to the laser cutter.

Then I started making the light sensor work.

Once the box was cut, I glued it together. I wanted to put some glitter under the hand to give it a bit more pizzazz, but it did not show through the etched acrylic. I went for a cloud texture instead, to give it a dreamy feel.

My light sensors are working together. I just add their readings and divide the total by 2 to get the data I want to send to p5 byte-sized.
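A sketch of that combining step (the function name is ours). Note this assumes each reading has already been scaled to 0–255; raw 10-bit `analogRead` values (0–1023) would need a further divide-by-4 before being sent as a single byte.

```cpp
// Average the two photocell readings into one value to send over serial.
// Assumes each reading is already in the 0-255 range, so the average is too.
int byteFromSensors(int a, int b) {
    return (a + b) / 2;
}
```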

I had a hard time figuring out how to “talk” to the API’s JSON, but my friend helped me via screen sharing. It was hard for me to understand how to navigate an array and how to build a function; I still barely understand what the variable in the parentheses is for.

Another problem was finding one perfect API with all of the info I wanted on each hexagram (the description and the character of the Yi Ching hexagram selected via a randomly generated number), so I ended up pulling from two APIs instead.

for hexagram description:

https://raw.githubusercontent.com/la11111/willie-modules/master/texts_iching/iching2.json

for hexagram character:

https://cdn.jsdelivr.net/npm/i-ching@0.3.5/lib/data.json
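The combining logic is just a shared index into both payloads. A sketch with hypothetical names (in the real sketch, the arrays are parsed from the two JSON files above):

```cpp
#include <cstdlib>
#include <string>
#include <vector>

struct Hexagram {
    std::string character;    // from the i-ching data.json
    std::string title;        // from iching2.json
    std::string description;  // from iching2.json
};

// Both photocells covered -> draw a random number between 0 and 63.
int drawHexagram() {
    return std::rand() % 64;
}

// The same index n selects matching entries from both APIs' data.
Hexagram lookup(int n,
                const std::vector<std::string>& characters,
                const std::vector<std::string>& titles,
                const std::vector<std::string>& descriptions) {
    return {characters[n], titles[n], descriptions[n]};
}
```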

I finished by spending some time on an illustration and background. I also added a little bit of responsiveness: the website works for window sizes other than 1680 x 1080 if the window is in a 19:9 ratio, with the text resizing proportionally.

Experiment 1: Material Mad Libs – “Small Craft Warning”

Small Craft Warning

by Roxanne Baril-Bedard, Kylie Caraway & Dave Foster

Project Description

From our mad lib cards – proximity (sensor), DC motor (actuator), paper (material), and angry (adjective) – we created an animated diorama depicting a boat on an angry sea. We used two DC motors: one makes the cyclone revolve, and the other makes a small paper boat with a miniature sailor move up and down. As people get closer to the piece, the motors increase speed, giving the audience the almighty power to control the elements.

We decided on an agitated sea because we wanted something that expressed anger in a way that’s symbolic, but also playful and relatable in a representational way. As water is capable of taking on many forms, we wanted to experiment with various ways to form paper. The properties of paper we amplify are its lightness and malleability, or how it can be folded and cut to reach its volumetric potential.


Circuit Diagrams

Code

https://github.com/kyliedcaraway/DIGF-6037-001-Creation-Computation-Small-Craft-Warning-RoxanneBarilBedard-KylieCaraway-DaveFoster/blob/master/EXPERIMENT1_CREATIONCOMPUTATION_SMALLCRAFTWARNING_ROXANNEKYLIED.ino

Design Sketches


Finished Piece


Process Journal

Sept 29th: After learning how to use the Feather controller with Arduino, we began by testing the proximity sensor with the code provided by Kate and Nick. Once we downloaded the NewPing library, we were able to successfully operate the ultrasonic proximity sensor. Testing the distance numbers in the serial monitor, we changed the maxDistance it could pick up input from 200 cm to various numbers, in order to see how close someone would need to be to our sensor for it to operate. We kept the maxDistance at 200cm so only people in close proximity would be recognized by the sensor.

Next, we tested our DC motor using the code provided by Kate and Nick. One problem we were unaware of was the need for an additional USB cable for power, as the output provided directly by the Feather cannot supply enough power to run the motor. We successfully got our DC motor to run at 0, 128 and 255.

After successfully getting our proximity sensor and DC Motor to work in Arduino, we wanted to combine the two to react to one another. We decided to create a project that increases the motor speed based on proximity to the project. We wanted to create 3 phases: no movement (0), slower movement (128) and faster movement (255) based on a maxDistance of 200 cm. With Nick and Kate’s help, we were able to combine the two codes together and use “if” statements to allow the two pieces to “talk” to one another.


Once we were comfortable with a basic code, we began discussing our ideas for the project. Dave wanted to do straws with flags that said “go away”, Kylie wanted to create a character that spun, and Roxanne wanted to make something more abstract, such as a flower or grass. We broke after class to recollect our thoughts and ideas on where to go. At that point, we were pretty puzzled by how to use the DC motor: how to transform rotational power into a movement that would be interesting to harness.

Oct 3rd: Kylie and Roxanne met to implement a plan and begin working on diagrams, the code, and the blog post. After failing to make grass move (the paper was too flimsy and the movement was not noticeable), we changed our idea to a tropical storm. Kylie was inspired by paper animations, as depicted in the “fourteenballstoy” and “robives” resources (links below in our project context). We decided we wanted to create two different types of movement with two separate DC motors: one would spin an object, extending the reach of the motor’s rotation with a straw, and one would drive a cam system causing vertical motion. We decided to use the cam system to move a small boat up and down, and a simple connection to spin the “cyclone”. Roxanne did form research for the cyclone, cutting different plies of paper, and folded the paper boat, while Kylie began constructing and setting up the box for our scene (cutting holes for plugs and the proximity sensor, making shelves to hold the motors, soldering wire to the motors, and wedging the breadboard in without gluing it down).


Oct 4th: The next day, Dave reconstructed the box using foamcore and white glue. Kylie attempted to connect two DC motors, but after many hours of tinkering could not figure out how to get both to work at the same time. Nick fixed this in about 30 seconds: rather than rewiring the second DC motor on a new circuit, we connected it to the same circuit as the first one.

We had to wait for the glue to cure before testing the motors with the different diorama elements, being unwilling to cause an inadvertent self-destruct circuit (as it were). While the glue dried, Kylie and Roxanne drew various waves simulating a force 7 gale on the Beaufort scale (hence the project title “Small Craft Warning”) using two different shades of blue-gray paper, and cut them up to arrange on top of the project to make an “angry” sea. We also bought a “sea-sick” miniature person, like the ones used in architecture maquettes, to put in the boat. Its scale really allowed for dramatic intensity in the scene.


Oct 5th: With everything glued up and solid, it was time to test our piece. We didn’t know how strong we wanted the motor to run; when we tried a speed of 10 it wouldn’t spin at all. We were cautious because we did not want to break the fragile paper pieces. After trial and error, we ended up testing 255, the motor’s fastest speed, and surprisingly it was not too strong. We had some problems with the code: it wouldn’t activate from further than 10 cm, and the sensor was emitting a high-pitched beep that we didn’t really know what to do about. We tinkered with the code, doing A/B testing, and it seemed the motor wouldn’t start at too low a speed but could continue turning if first switched on at a higher speed. We also modified the thresholds of activation, thinking about the exhibition context and how we expect people to occupy the space and interact with the piece. People facing the piece from further than 80 cm don’t activate it, standing between 20 and 80 cm activates the motor at a speed of 160, and standing between 0 and 20 cm from the front activates it at a speed of 255.
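The final activation logic boils down to a mapping from the ultrasonic distance to a motor speed. A sketch with a hypothetical function name; treating a reading of 0 as “no echo / out of range” is our assumption about how the sensor library reports misses.

```cpp
// Final thresholds: beyond 80 cm -> off; 20-80 cm -> speed 160;
// 0-20 cm -> speed 255. A reading of 0 is taken to mean "no echo", so off.
int motorSpeedFor(unsigned int distanceCm) {
    if (distanceCm == 0 || distanceCm > 80) return 0;
    if (distanceCm > 20) return 160;
    return 255;
}
```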

Project Context

Reactive artworks always seem to pique the public’s interest. At UQAM’s fashion school in Montreal, one of the most famous designers and professors is Ying Gao, who focuses on interactive garments, often exploring emotions and other intangibles. In that spirit, we wanted to capture and question the feeling of power in relation to something activated by proximity. The viewer becomes the reason the sea is agitated, literally making waves through sheer proximity, which makes them question their agency. This reactive agitation looks like a form of agency too, putting the concept of agency as a whole into perspective.

Another artist that has inspired us is Zimoun. Located in Bern, Switzerland, he creates installations that evoke emotional, yet effortless soundscapes using raw materials and numerous motors. While we decided to create a visual rather than audible piece, Zimoun’s works depict an effective method of combining minimalism and simple components to create raw emotion.

After pondering the various permutations of the word “angry,” and following our artistic inspirations, we decided that rather than depicting direct “angry” gestures or movements, we should go with a more abstract appeal to the idea.  Various natural phenomena produce what are called “angry” scenes or situations.  We decided to fall back on the old sailor’s description of the sea being “angry” under stormy conditions (specifically around force 7 on the Beaufort scale, as a gale at this level triggers small craft warnings in most ports worldwide).  This would be achievable with available materials as a diorama with mobile elements that could hopefully capture the intensity of such a scene.

We were intrigued by the idea of creating an automaton using paper. Fourteen Balls Automata provided various pieces we used for both inspiration, as well as a guide to explore various techniques to make an automaton work properly. We also used Robives.com Design for Paper Animation as a resource for mechanisms in paper animations. We specifically used Rob Ive’s cam system guide to make our boat move up and down in our diorama.

Bibliography

ARCSTREET.COM. “‘FASHIONING THE INTANGIBLE’ : YING GAO CONCEPTUAL CLOTHING / NOV 14 – DEC 15, 2013 / ‘UQAM CENTRE DE DESIGN’ / MONTREAL, QUEBEC – Arc Street Journal / En Mode Art Fashion Design Style Music Architecture News.” Arc Street Journal, 18 Nov. 2013, www.arcstreet.com/article-fashioning-the-intangible-ying-gao-conceptual-clothing-nov-14-dec-15-2013-uqam-centre-de-121181666.html.

Huler, Scott. Defining the Wind the Beaufort Scale, and How a Nineteenth Century Admiral Turned Science into Poetry. Crown Publishers, 2004.

Ives, Rob. “Mechanism.” Robives.com Designing Paper Animations, edited by Rob Ives, 2017, www.robives.com/. Accessed 5 Oct. 2017.

Smith, Matt. “Automata.” Fourteen Balls Automata, edited by Matt Smith, www.fourteenballstoy.co.uk/index.htm. Accessed 5 Oct. 2017.

“Zimoun : Compilation Video 3.7 (2017).” Vimeo.com, uploaded by Zimoun, Aug. 2017, vimeo.com/7235817. Accessed 5 Oct. 2017.