What Is My Purpose?

Project Title: What Is My Purpose?

By: Nilam Sari

Project Description:

This project is a 5 × 5 × 5 inch wooden cube with a 3D printed robotic arm. A hammer-head-shaped piece is attached at the end of the arm. The arm repeatedly hits the top part of its own body, a sheet of clear acrylic. The robot appears to be harming itself.


I started this project by creating a timeline, because I thought I should be more organized with this project to meet the tight deadlines.


I modeled my design in Rhino3D to help me visualize the arm that needed to be fabricated with the 3D printer.


At first I created the arm to hold a chisel, but after printing and testing it with a servo, the servos couldn't handle the weight of the chisel, so I compromised with an IKEA knife. That didn't work either, so I compromised with this 3D printed hammer head that holds a 1-inch bolt.


At first I had trouble attaching the servo motors to the 3D printed parts, but Arsh suggested that I cut off the little servo extensions and fit them into the 3D printed parts instead of trying to make a hole that fits directly onto the servo, and it worked perfectly (thank you Arsh!).

Next, it was time to code the movements of the arm.

At first, I achieved the movements that I wanted, but they were too predictable; it felt too "robotic." So I randomized the movements within a range and got the results that I wanted.
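The randomization can be sketched like this, as a plain-C++ stand-in for Arduino's random(); the angle band here is illustrative, not the project's actual range:

```cpp
#include <cstdlib>

// Pick the servo's next target angle at random inside a band, so the
// hammering stays lively instead of perfectly periodic. The band
// endpoints passed in are made-up examples, not the project's values.
int nextAngle(int minDeg, int maxDeg) {
    return minDeg + std::rand() % (maxDeg - minDeg + 1);
}
```

On the Arduino itself this is just `random(minDeg, maxDeg + 1)` written to the servo with `servo.write(...)`.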

Then I worked on the wooden body. This went smoothly and was pretty uneventful. The lack of tools in the maker lab made the process take longer than it needed to, but I managed to finish it according to my timeline.


When I was done with the wooden body, I attached the arm onto it.

When I did a test run, it was doing okay; the only problem was the chisel/knife issue I mentioned above. Then I installed the bolt inserts at the corners of the top of the box to secure the 1/16-inch acrylic sheet.

At first I wanted this piece to be battery-powered, complete with an on/off button on the bottom. But when I tried a 9V battery, it wasn't strong enough to run the two servos. I asked Kate for help and learned that it's more about the current than the voltage. So I wired four AA batteries in series and tried again. Still not enough. Kate suggested adding another four AA batteries in parallel with the first four, and it worked!
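The arithmetic behind that fix can be made explicit (nominal figures for illustration only): cells in series add voltage, while parallel packs share the load and so add deliverable current.

```cpp
// Series cells add voltage: four 1.5 V AA cells give a 6 V pack.
float seriesVoltage(float cellVolts, int cells) {
    return cellVolts * cells;
}

// Parallel packs add deliverable current: two identical packs can
// source roughly twice the current at the same voltage, which is what
// the struggling servos needed.
float parallelCurrent(float packAmps, int packs) {
    return packAmps * packs;
}
```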

However, the batteries couldn't last long enough, and the servos started getting weaker after a couple of minutes. It was a hard decision, but I had to compromise and use a cable to power the robot from the wall outlet.

For the show, I originally wanted to add delay() calls so the servo motors would get breaks in between movements and not overheat. But with the delays added, the motors didn't run the same way they did without them.
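One non-blocking alternative to delay() is a millis()-based duty cycle, which keeps the loop running (and the motion profile intact) while still giving the motors scheduled rests. A sketch of the idea, with made-up durations:

```cpp
// Given the elapsed time in milliseconds, decide whether the arm is in
// its "work" phase or its cooling-off "rest" phase. Because nothing
// blocks, the movement code runs unchanged during the work phase.
// The 10 s work / 3 s rest split is illustrative.
bool shouldMove(unsigned long nowMs,
                unsigned long workMs = 10000,
                unsigned long restMs = 3000) {
    unsigned long phase = nowMs % (workMs + restMs);
    return phase < workMs;
}
```

In an Arduino loop() this would be `if (shouldMove(millis())) { /* move servos */ }`.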

Code: https://github.com/nilampwns/experiment5





Circuit Diagrams: 


Project Context:

We are constantly surrounded by technology that does tasks for us. When machines do not carry out their tasks properly, they are deemed broken. Can we co-exist with machines that are not designed to serve us humans? What kind of emotion is evoked by seeing a machine hitting itself over and over? Are we projecting our feelings onto non-living objects? These were the questions that lingered in my mind when I decided to create this robot.

I believe that robots have the potential to teach us humans how to empathize again. That belief is what drove me into graduate school, and it drives my life in general. The piece I created for Experiment 5 is in some way part of my long-term research: can robots teach us humans to empathize with a non-living being, and ultimately, with each other?

There have been multiple studies that ask participants about the possibility of a robot getting hurt, or where participants are asked to hurt a robot. As Kate Darling (2015) wrote in her research paper, "Subjects were asked how sorry they felt for the individual robots and were asked to chose which robot they would save in an earthquake. Based on the subjects preferring the humanoid robots over the mechanical robots, Riek et al. postulate that anthropomorphism causes empathy towards robotic objects."

People tend to empathize more with robots that look like them. I wanted to push my piece as far away from anthropomorphization as I could, and go even further by making a robot whose whole purpose is to hit itself. That's why I made the body a plain wooden cube with visible microcontrollers; the only thing that makes it look a bit anthropomorphized is the robotic arm.

The act of self-harm is jarring. It's emotional. It's a sensitive topic. But what does self-harm mean to a robot that cannot feel pain? It has no sensors that detect pain.

Though not about self-harm, Max Dean's piece "Robotic Chair" (2008) is a robotic chair that breaks itself apart into multiple pieces, then slowly searches for its missing pieces and puts itself back together again autonomously. Viewers' reactions were positive. "As stated further on the Robotic Chair website, the chair has been shown to elicit 'empathy, compassion and hope' in its viewers" (Langill, 2013).

I acknowledge that my approach is very philosophical. The title of the work, "What Is My Purpose?", is a question that even we humans have not yet answered. I hope to make people consider that it's okay for machines to not have a purpose, to not serve us, and still exist around us, and for us to still empathize with them. That way, maybe humanity could learn to empathize more with each other.

Exhibition Reflection:

The exhibition was lovely. I got to talk to so many people about my research, receive feedback and reactions, or just simply chat with them. Unfortunately, the servos on my piece overheated and gave out 30 minutes into the show. I thought it was the Arduino, so I unscrewed the piece to hit the reset button, but that didn't change anything. I let it cool down for a couple of minutes, but that didn't work either. I had a video of it running, so I showed that to people who were interested in seeing it. Thankfully, it was running when the show was at its busiest.

People told me they liked my work. I asked them what they liked about it, and a couple of them said they found it funny and silly. Some said they could feel the frustration of the robot. Some felt so empathetic that they felt bad watching the robot hit itself over and over. One person even said "idiot robot" at it. It was a mixed bag of reactions, but most people enjoyed the piece.


Darling, Kate, with P. Nandy and C. Breazeal. "Empathic Concern and the Effect of Stories in Human-Robot Interaction". Proceedings of the IEEE International Workshop on Robot and Human Communication (RO-MAN), 2015.

Dean, Max. “Robotic Chair”. 2008.

Langill, Caroline Seck. “The Living Effect: Autonomous Behavior in Early Electronic Media Art”. Relive: Media Art Histories. MIT Press. 2013.

Eternal Forms

Experiment 5: Proposal


Catherine Reyto, Jun Li, Rittika Basu, Sananda Dutta, Jevonne Peters

Project Description

"Eternal Forms" is an interactive artwork incorporating geometric elements in motion. The forms rotate faster as a user approaches from varying proximities. The elements are constructed with high precision and accurate measurements to generate an optical illusion of constant collision. The framework is built from a circular grid system, designed from several interlinking ellipses running across the common circumference of a central ellipse.

Parts / materials / technology list

  • Servo
  • Distance sensor
  • Pulley
  • Acrylic
  • Wood
  • Arduino – distance sensor and servo motor based coding
  • Laser Cutting

Work Plan

Base Circular Grid – Sets the basis of the design constructions.
Design Explorations
  • 24th Nov (Sun): Pattern exploration and design created based on the finalized patterns
  • 25th Nov (Mon): Design a network of connected illusions using the finalized patterns. Also, try out prototype laser cuts of the final pieces. Check out the weight and scaling optimizations.
  • 26th Nov (Tue): Work on the code – Arduino side. Also, try the servo functioning with the prototype model.
  • 27th Nov (Wed): Combine the servo and the sensor part of the experiment and check code functioning.
  • 28th Nov (Thu):  Create the final design laser cuts with the materials finalized – acrylic or wood or both. Additionally, creating the mounting piece that needs to go up on the wall.
  • 29th Nov (Fri): Iterate and work on creating multiple kinetic sculptures and make them interactive. Also, work on the display set-up of the installation.
  • 30th Nov (Sat): Trouble-shooting
  • 1st Dec (Sun): Trouble-shooting
  • 2nd Dec (Mon): Trouble-shooting
  • 3rd Dec (Tue): Final Presentation

Physical installation details

There will be multiple interactive artworks mounted on walls, where the user can observe the rotation speed change based on their proximity. The artworks will be animated by servo motors.
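The proximity-to-speed behaviour could be implemented as a simple linear mapping from the distance-sensor reading to a servo speed, mirroring Arduino's map() and constrain(); every number below is an illustrative placeholder, not a value from the project:

```cpp
// Clamp x into [inMin, inMax], then map it linearly to [outMin, outMax].
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    if (x < inMin) x = inMin;
    if (x > inMax) x = inMax;
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Closer viewer -> faster rotation: 10 cm maps to the fastest step,
// 200 cm to the slowest.
long speedForDistance(long cm) {
    return mapRange(cm, 10, 200, 180, 10);
}
```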

Resource List

10 Nails and hammer – For mounting the artwork.

Extension Cord

Power Plug

Wall space of 5ft x 3ft to mount the piece.


  1. Kvande, Ryan. “Sculpture 40″ – Award Winning Optical Illusion Sculptures”. Spunwheel, 2019, https://www.spunwheel.com/40-kinetic-sculptures1.html.
  2. Whitaker, Lyman. “Whitaker Studio”. Whitakerstudio.Com, 2019, https://www.whitakerstudio.com/.

Experiment 5 — Leave A Window Open For Me

1.Project title: Leave A Window Open For Me

2.Project description: 

For this project, I'm aiming to create an installation with a space inside a box built from mirrored acrylic sheets and wood. It is meant to be lined with mirrors to form a room of infinite reflections. The piece reflects my own experience in New York, when I was going through an emotional breakdown for a long period of time. I was suffering from insomnia and depression and had no motivation to do anything; the window was the thing I stared at the most, from day to night, from rain to shine.

I managed to step out of the emotional trap eventually, but looking out of the window has remained one of my daily routines. Except this time, things have changed.

A 3D printed bed will be placed in the middle of the installation to represent my room and my mental state. One side of the box will have a cutout playing the role of the window, with the LED matrix panel placed behind it.

The audience is expected to observe the installation through the peephole in the front board.

3.Parts / materials / technology list:

LED matrix panel, Arduino UNO, acrylic sheets with mirror backing, glue gun, 3D printer, laser cutting machine, transformer adapter, tool power cord, etc.

4.Work plan: 

Nov 21-22: Proposal & Discussion

Nov 23-24: Coding & Researching

Nov 25-26: Purchasing materials & Building small prototype & test

Nov 27-28: Coding & 3D printing & Laser cutting

Nov 29- Dec 3: Debugging & Assembly

Dec 4: Open show

5.Physical installation details: 

The small room in the graduate gallery that is usually used for screening films, since a darker space is preferable. A pedestal holding the installation will be placed against the wall.


Initial sketches:






Project Title – CityBLOCK

Team Member – Rajat Kumar

Kate Hartman & Nick Puckett

Project Description
CityBLOCK is a modular city-builder experience where the user builds their dream city by rearranging designed cubes that act as building blocks.

What makes Toronto so unique? As Canada's largest and most diverse city, it's home to a dynamic mix of tourist attractions, from museums and galleries to the world-famous CN Tower and, just offshore, the Toronto Islands. And Niagara Falls is just a short drive away.
This project will let you build your version of Toronto city.
Everyone has their own image of their city. When I came to Toronto for the first time, I was really surprised by the city's fusion of old and new architecture. I always wanted to show what I think about this city. I go to the harbourfront almost every day and admire the CN Tower, thinking about how it became the icon of the city. Other historical architecture, like the Flatiron Building and St. Lawrence Market, makes the city more unique and identifiable.
This is my take on representing Toronto from my perspective, and of course, the city builder is modular, so anything can be added later.



The Reactable is an electronic musical instrument with a tabletop tangible user interface that was developed within the Music Technology Group at the Universitat Pompeu Fabra in Barcelona, Spain by Sergi Jordà, Marcos Alonso, Martin Kaltenbrunner, and Günter Geiger.

What I like about the Reactable is that any physical object can become smart and can interact with, alter, and modify digital content.

The only input device used is a camera, which makes for a very clean setup for an open show.

Workflow

This setup is for tabletop interaction and requires a lot of effort to build a table and configure the camera with a projector.


  • reacTIVision 1.5.1
  • TUIO Client – Processing
  • OV2710 – Camera
  • Projector – Which supports the short-throw lens with a throw ratio of 0.6 and below.
  • Infrared LED -850nm IlluminatorMaterial
    • Wooden/Paper blocks.
    • Table 4.5ft*3.5ft
    • Projector Mount
      These were the minimum requirements I collected from several posts on the reacTIVision forum.

First of all, I wanted to try the reacTIVision software and break it down to understand how it works and what its constraints are. There is not much on the internet about it, so I just started with the documentation.

Work Plan

22nd – 23rd – Ideation, finalizing the concept
24th – 27th – Coding, configuring reacTIVision with Processing
27th – 30th – Testing and getting ready for the exhibition
1st – 2nd – Installation/setup
3rd – 4th – Exhibition

After the first meeting, my whole project got flipped. Initially I was thinking of making a tabletop interactive city builder; it became more awesome: tracking fiducial markers from the front and projecting onto a vertical plane. It was a smart move to hide the markers, making the aesthetics of the project cooler.

I removed the IR LEDs and the dedicated camera and started with a webcam to keep the setup clean.

Initial graphics and their coupled fiducial markers



In test 1, I tested reacTIVision's ability to detect a fiducial marker. I captured an image of the test marker, displayed it on my smartphone at different brightness levels, and reacTIVision recognized it in almost every run.

Next I tested a Logitech USB camera. At that point I didn't know how to configure the camera with reacTIVision (via its .xml config file). Without autofocus, it detected all the markers but gave very little depth range; with autofocus, it detected at most 4 markers and gave a larger depth range, but a marker placed near the camera went out of focus when I placed another marker farther away, limiting me to a very shallow interaction area.

After much testing with the web camera, I found the minimum distance between camera and marker should be 40-50 cm and the maximum 75-80 cm, giving me a nice region for interaction.

In test 3, I tested the lag between the user's input and the feedback on the screen. Tracking was pretty good, and the text stuck to the marker consistently.

In test 4, I tested with graphics. Initially nothing rendered on the screen, because another line of code drew a white background over the top of the image. After fixing that, a new challenge arose: aligning the images on one horizontal plane. The anchor point of an image is set to its center by default; I fixed this by offsetting each image against one common reference image. The distance between two images is also affected by the camera distance, which created the need for a fixed platform.
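The anchor-point fix amounts to converting a shared baseline into a per-image draw position; a minimal sketch of that arithmetic (names and values are hypothetical, not from the project's Processing sketch):

```cpp
// When a renderer anchors images at their centers, placing different-
// height images on one horizontal baseline means offsetting each draw
// position by half the image's height.
float drawYForBaseline(float baselineY, float imageHeight) {
    return baselineY - imageHeight / 2.0f;
}
```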


I selected some historical buildings from Toronto and made digital vectors of them for laser cutting. Nick suggested having multiple copies of each building, for example 4 CN Towers. I had assigned one marker to the CN Tower, so I made 4 identical markers, one per copy. This gave me multiple replicas of the same building, and to keep track of the buildings and their assigned markers I created a spreadsheet. Then I attached the markers to their respective buildings.

Internal Critique

Some valuable inputs from peers and mentors
1. A staircase-like platform will help the user interact with the BLOCK pieces in the intended way.
2. Add some animation or triggers to move the images so the project is more interactive.
3. A bigger TV will be more impactful for the project than the projector.

For 1, a quick fix would have been marking a region on the table, but some users held the buildings by the base and unknowingly hid the marker from the camera. So the staircase (laser cut the same day after the critique; thanks to Jignesh for helping me with it) turned out to be the perfect solution at the open show.

For 2 I did make the background animation.

While playing, the video was too laggy and too slow to process; even though it was only 5 MB, I was still getting too much lag, so I disabled the video.

For 3, I used a TV screen: the graphics were much brighter and sharper, and most importantly the setup got cleaner and more minimal.


Exhibition SetUP



Take-Aways from the Exhibition

Some people were so surprised that they looked for the sensor I was using to track the buildings. Some touched the table to check whether there was a magnetic field. Some thought I was using an algorithm that detects the shapes of the buildings (that kept me thinking for a while) and replicates them on the screen.
After I explained how cityBLOCK works, they complimented the project for being smart and clean in terms of setup.
There was an older couple, and the man was so happy playing with the blocks that his wife had to remind him they needed to see the other work as well. I think that was a success for my project as a simple interactive piece.

Challenges & Learnings:

The biggest hurdle for me was that I had no idea where to start. There were no proper resources, but I knew it could be done; all I needed was one example project to understand the communication between the camera and Processing.

Getting image graphics onto the screen when I showed a marker to the camera was the most time-consuming phase of the project. It turned out to be just a code issue: another white background was being drawn on top of the images.
Once I had the images working, the rest was just completing the tasks necessary for the project.






Project Title
Animated Robotics – Interactive  Art Installation

Project by Jignesh Gharat

Project Description

An animated robotic motion with a life-like quality, bringing emotive behaviour into the physical dimension. An object whose motion has living qualities, such as sentiments, emotions, and awareness, revealing a complex inner state, expressions, and behavioural patterns.
He is excited to peek outside the box and explore the surroundings, but as soon as he sees a person nearby he panics and hides back in the box, as if he is shy or feels unsafe. He doesn't like attention but enjoys staring at others. What if a blank canvas could try to express itself, instead of viewers projecting their own interpretations, emotions, beliefs, and stories onto it?


  • MacBook Pro
  • 1 Arduino UNO/Nano
  • 2-3 servo motors
  • VL53L0X laser distance sensor
  • LED bulb
  • Power bank, 20,000 mAh


  • Acrylic/Paper/3D printing (Periscope)
  • 4 Plywood 30” x 30”

Work plan

22nd – 23rd _Material procurement, storytelling, and ideation
24th – 27th _Code- Debugging, Calibrating distance sensor, Mockup, Testing
27th – 30th_Iterating, Fabrication, and Assembling
1st – 2nd_Installation/ setup
3rd – 4th_Exhibitions

Physical installation details

The head of the robot (a periscope) observes the surroundings and is curious to explore what is happening around it. When a viewer comes into range of the distance sensor, the robot quickly hides in the box, then peeks out. When no one is in the sensor's range, the robot pops out again.
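The hide/peek behaviour reduces to a threshold check on the distance reading; a minimal sketch (the 800 mm trigger range is an assumption, not the project's calibrated value):

```cpp
// Decide the robot's pose from a single VL53L0X-style reading in mm:
// a valid reading inside the trigger range means someone is close, so
// duck into the box; otherwise it is safe to peek out again.
enum class Pose { Hidden, Peeking };

Pose nextPose(int distanceMm, int triggerMm = 800) {
    bool viewerNearby = (distanceMm > 0) && (distanceMm < triggerMm);
    return viewerNearby ? Pose::Hidden : Pose::Peeking;
}
```

In practice a few consecutive readings would be averaged or debounced so sensor noise doesn't make the robot twitch.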

Exhibition Space

Preferably a silent art gallery space with a plinth. Spotlight on the plinth.

Interactive Canvas

  1. Project Title

Interactive Canvas (Working title) by Arsalan Akhtar


  2. Project Description


Ambient advertising is key to winning consumers today, and it encourages the discovery of new storytelling tools. The "walls" around us are an empty canvas that could tell a lot of interactive stories. Thus, I would like to make an interactive wall that tells a story through sound or a display when someone is in close proximity. The story is about the breach of digital privacy and the permissions we have granted the mobile apps we use daily in this digital age.


  3. Parts, Materials and Technology


  • Arduino micro
  • Resistors
  • Conductive Material
  • Projector
  • Piece of wood
  • Processing and Arduino Software
  • Photoshop for illustration


  4. Work Plan


  • Nov 20-22 : Low fidelity prototyping with arduino, conductive material and resistors
  • Nov 23-24: Work on processing to discover storytelling drawing
  • Nov 25-26: Procure wood and engrave storytelling assets
  • Nov 27-28: debugging of hardware and software
  • Nov 29-Dec1: debugging of physical installation
  • Dec 2: test run
  • Dec 3-4: Display


  5. Physical Installation

A rectangular piece of wood or paper (around 6 ft × 6 ft) attached or hung against the wall with the story engraved on it, facing a projector. Visitors can interact with features on the wall and learn the story. A flat wall such as the one in the DF gallery would be a great place.

  6. Resource list 


  • a bracket to hold the piece of canvas against the wall.


Thank you







Experiment 5 – proposal

by Nadine Valcin


Project description
Much of my work deals with memory and forgotten histories. I am constantly searching for new ways to portray the invisible past that haunts the present. (Un)seen is a video installation about presence/absence that uses proxemics to trigger video and sound. It recreates a ghostly presence appearing on a screen whose voice constantly beckons the viewer to get closer, but whose image recedes into the frame as the viewer tries to engage with it. Ultimately, the viewer is invited to touch the cloth it is projected on, but if they do, the ghost completely disappears from view, leaving an empty black screen.
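The proxemic trigger logic described above could be organized as three distance zones; a sketch of the state selection (zone boundaries are placeholders, not calibrated values):

```cpp
// Map a distance reading (cm) to the ghost's behaviour: far away it
// beckons, as the viewer approaches it recedes into the frame, and a
// touch (near-zero reading at the cloth) blanks the image entirely.
enum class GhostState { Beckoning, Receding, Vanished };

GhostState stateFor(int distanceCm) {
    if (distanceCm >= 0 && distanceCm < 5) return GhostState::Vanished;
    if (distanceCm < 150)                  return GhostState::Receding;
    return GhostState::Beckoning;
}
```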

With permission, I will be using unused footage from a previous project, consisting of closeups of a Black woman on a black background, and will be recording and mixing a new soundtrack.

Parts / materials / technology list
MacBook Pro
Arduino Nano
Distance sensors (2 or 3) HC-SR04 or KS102 1cm-8M Ultrasonic Sensors
King size bedsheet, hanging from rod
2 speakers (or 4?)

Work plan
22.11.19-24.11.19     Edit 3 video loops
24.11.19-25.11.19     Write ghost dialogue and research sound
26.11.19-27.11.19    Record and edit sound
22.11.19-27.11.19     Program distance sensors and interaction
27.11.19                       Mount bedsheet on rod
28.11.19-02.12.19    Testing and de-bugging
03.12.19-04.12.19    Presentation

Physical installation details
The ideal space for the installation would be Room 118 on the ground floor.
With permission, I will be using footage shot for another project, consisting of closeups of a Black woman on a black background. Ideally the image would be projected onto the sheet from the rear. This requires a dark space and enough room both behind and in front of the sheet. The mount for the sheet will be kept deliberately light. Metal wire can be used to hang the rod holding the sheet from the ceiling, but this would potentially require discreet hooks screwed or attached to the ceiling.

Set-up Option A


Set-up Option B

Resource list
Hand drill and other tools TBA
2 (or 4?) speakers
2 pedestals for the sensors (?)
Cables for the speakers
Power bar and electrical extension cords

– Can I have 4 speakers and have them play different sounds in pairs? I.e., the speakers behind the screen wouldn't play the same sound as the speakers in front of it.
– Do I actually need 3 distance sensors – 1 behind the screen for the functions triggered by people touching the screen and two mounted (possibly on pedestals) slightly in front of the screen at each side?
– Is it possible to hang things from the ceiling?
– Would a motion sensor also be useful to activate the installation when someone comes into the room?

Experiment 5 Proposal

Zero UI (Working Title)

For Experiment 5, I'd like to expand on tangible interfaces and explore Zero UI (invisible user interfaces): fully incorporating technology within a room (or household) through pressure-sensitive furniture and sensors with auditory feedback, elevating ordinary objects (a book) to create a more immersive experience instead of depending on screen-based ones. This experiment is an exploration in creating a multi-sensory reading experience, with content catered to the book's contents.

The experiment would involve a pressure-sensor chair that lights up a nearby lamp when the participant sits down. The pressure sensor may be installed directly on the chair or hidden in the design of a cushion or lining. The participant can pick up the book and read or flip through it, hearing the music referenced in the book playing from a speaker hidden away (possibly below or behind the chair). The audio would be mapped to the section of the book the participant is on.
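Mapping the book's sections to audio could be done by dividing the flex sensor's range into equal bands, one per track; a sketch under that assumption (all ranges are illustrative):

```cpp
// Convert a raw flex-sensor reading into a track index: the sensor's
// range is split into nTracks equal bands, so bending the book further
// (a reading closer to maxVal) selects a later track.
int trackForReading(int reading, int minVal, int maxVal, int nTracks) {
    if (reading <= minVal) return 0;
    if (reading >= maxVal) return nTracks - 1;
    return (reading - minVal) * nTracks / (maxVal - minVal);
}
```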


The book I'd like to use is still undecided, but ideally one with many musical references, such as Haruki Murakami's The Wind-Up Bird Chronicle, which begins with the protagonist listening to Rossini's The Thieving Magpie and refers to many other classical pieces. Another possibility is J.R.R. Tolkien's The Hobbit, with Howard Shore's music from the film franchise playing instead.

Project Main Components and Parts

  1. Arduino Nano
  2. Flex Sensor
  3. Pressure Sensor
  4. MP3 Shield (?)
  5. External Speakers
  6. Lightbulb and Wiring (Lamp)

Additional Components and Parts

  1. Chair (Supporting Prop)
  2. Fabric/Cushion (To Hide/Place Sensor)
  3. Book (Prop)
  4. Mat Rug (Prop To Hide Cables)

Workback Schedule

Fri, Nov 22 –  Proposal Presentation
Sat, Nov 23 –  Coding + Gathering Digital Assets + Building Lo-Fi Breadboard Prototype
Sun, Nov 24 – Coding + Gathering Digital Assets + Building Lo-Fi Breadboard Prototype
Mon, Nov 25 –  Coding + Creatron for Final Components
Tues, Nov 26 –  Presenting Progress of Lo-Fi Breadboard Prototype + Revisions
Wed, Nov 27 – Prop Purchasing
Thurs, Nov 28 – Laser Cutting Components and Coding
Friday, Nov 29 – Troubleshooting / Bug Fixes
Sat, Nov 30 – Troubleshooting / Bug Fixes
Sun, Dec 1 –  Troubleshooting / Bug Fixes
Mon, Dec 2 – Troubleshooting / Bug Fixes
Tues, Dec 3 – Final Critique
Wed, Dec 4 – Open Show

Physical Installation

Ideally I'd like to place the setup in the corner of a room with dimmer lighting, so the lamp's light is more visible when it turns on. The supporting objects in the setup would be the chair participants can sit on, with the sensor attached.



Resource List

  1. Chair and Side table
  2. Will need extension cords for power
  3. External speakers

Info for Open Show

Preferably displayed in the Grad Gallery room. I will just need an electrical outlet nearby or extension cord. We will need to book external speakers from AV.

Project Proposal: What is my purpose?

Project Title: What is my purpose?

Work by: Nilam Sari

Project Description: 

This project will be a small part of my thesis project. It is going to be a 5 × 5 × 5 inch wooden cube with microcontrollers inside that control an arm that repeatedly stabs the cube with a chisel, slowly chipping away its body. The purpose of this project is to evoke an emotional reaction from its viewers.

Parts / Materials / Technology list: 

  • Wooden case (hard maple)
  • 3D printed arm parts
  • Chisel
  • Arduino Uno
  • 2 Servo motors

Work Plan:


Physical Installation Details:

The work will be battery-powered, autonomously moving, and non-interactive, sitting on top of a pedestal. There will be an on/off switch.

Resource List:

One narrow pedestal to hold a 5 × 5 × 5 in cube at display height (around hip height).

Information for Open Show:

Would like to borrow the pedestal from the graduate studies. Work can be shown anywhere in the gallery or the 7th floor space.