Category Archives: P2: Tools of Collaboration

Catching Photons

Anas Raza | Firaas Khan

At the beginning of the course, we developed an interest in the potential of robotic arms to produce impressive light trails using light painting techniques, and in how still images of light-painted trails could be turned into a stop-motion animation with human interaction woven into the narrative. Project 2 gave us exactly this opportunity: a co-creative environment made of light painting, human interaction, and robotic arm movement. Our narrative follows a human actor attempting to catch the light, an ironic premise given that light painting can only be seen through the camera lens and not with the naked eye. We also aimed to integrate human-robot interaction into the stop-motion animation to create a sense of playfulness and immersiveness. The collaborative nature of the project, which brought together disciplines such as photography, robotics, and storytelling, exemplified the potential of co-creation and interdisciplinary collaboration in artistic and creative endeavours.

Video Link

Process work

To create light painting, you’ll need a few key components: a darkened space, a camera set up to take long-exposure shots, a movable light source, and a willingness to play and explore. The process becomes more sophisticated when the light source is moved by a robotic arm to create precise movements.

pxl_20230412_195858769

The first step was to darken the process room by covering any openings through which light could enter. The second step was to set the stage for the robot’s performance, making sure the robot performed in front of a background that was as close to pure black as possible.

 

pxl_20230412_165727622-2

 

In the next step, we set up the camera on a tripod with a remote shutter release, an exposure time of 5 to 10 seconds, an aperture of f/10, and an ISO of 100 to keep colour noise to a minimum.

 

 

ezgif-com-video-to-gif

In the next step, we moved the light source with the robotic arm, with the path dictated by a Processing sketch. We had expected to move the arm along a vector path, but our limited experience writing Processing code was a big obstacle, so we ended up using mouse coordinates as our live data to move the arm along the desired path. Our idea was to write letters and draw shapes to create a stop-motion animation sequence. We achieved this by placing an image in the Processing sketch and moving the mouse cursor over the letterforms, using the mouseMoved() function to publish the data to the robotic arm.
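
As a rough sketch of this approach (not the exact course code), a Processing sketch along these lines publishes the cursor position on every mouse move; the sendToRobot() helper, the image file name, and the target ranges are hypothetical stand-ins for the course’s data-sender setup:

PImage guide;  // reference image of the letter to trace

void setup() {
  size(800, 600);
  guide = loadImage("letter.png");  // hypothetical file name
}

void draw() {
  background(0);
  if (guide != null) image(guide, 0, 0, width, height);
}

void mouseMoved() {
  // map screen coordinates into a placeholder robot range
  float rx = map(mouseX, 0, width, -200, 200);
  float ry = map(mouseY, 0, height, -200, 200);
  sendToRobot(rx, ry);
}

void sendToRobot(float x, float y) {
  // stand-in for the actual publisher (e.g. a socket write)
  println(x + ", " + y);
}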

ptop

The image shows the robotic arm holding a light source and moving according to the mouse-coordinate data coming from the Processing sketch.

 

digitalstillshot

 

The image on the left shows the result of the light source’s motion captured with a long exposure, i.e. light painting.

 

 

7j010x

The image on the left shows the result of stitching the still images together to create a stop-motion animation effect.

 

anas-light

In this step, we tried to create the illusion of catching light. We created a grid (a table of cells) in Processing, with each cell corresponding to the light scribble’s position in one frame.
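
As a rough sketch of this table idea (the cell counts are illustrative), each cell of the grid stands for one animation frame, and the mouse position inside the active cell gives that frame’s scribble position; the println() stands in for publishing to the arm:

int cols = 6, rows = 4;  // illustrative grid size

void setup() {
  size(900, 600);
}

void draw() {
  background(0);
  float cw = width / (float) cols, ch = height / (float) rows;
  stroke(80);
  for (int i = 0; i <= cols; i++) line(i * cw, 0, i * cw, height);
  for (int j = 0; j <= rows; j++) line(0, j * ch, width, j * ch);
  // highlight the cell (frame) the mouse is currently in
  int ci = constrain(int(mouseX / cw), 0, cols - 1);
  int cj = constrain(int(mouseY / ch), 0, rows - 1);
  noFill();
  stroke(255, 200, 0);
  rect(ci * cw, cj * ch, cw, ch);
}

void mouseMoved() {
  float cw = width / (float) cols, ch = height / (float) rows;
  int ci = constrain(int(mouseX / cw), 0, cols - 1);
  int cj = constrain(int(mouseY / ch), 0, rows - 1);
  // position within the cell, normalized 0..1 -- the per-frame
  // scribble position that would be published to the arm
  float u = (mouseX - ci * cw) / cw;
  float v = (mouseY - cj * ch) / ch;
  println("frame " + (cj * cols + ci) + ": " + u + ", " + v);
}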

ezgif-com-crop

 

 

The image on the left shows mouse movements in a single cell to create each frame of the above animation.

img0075

 

The photo on the left is a single-frame example of using the above approach.

 

 

 

In post-production, we stitched all the frames together to create the video:

https://www.youtube.com/watch?v=c2fbgxyvzds

 

Project 2 – Robot Tools

Robot Conductor

Shipra Balasubramani. Mona Safari. Zaheen Sandhu. Ricardo Quiza

img_3541

Description

For Project 2, we wanted to explore the relationship between a human and an orchestra conductor in the form of a robot. During a performance, a conductor serves as a messenger for the composer. Their responsibility is to understand the music and convey it through gestures (hand movements) so transparently that the musicians in the orchestra understand it perfectly.

For Dev 4 we worked with this idea in mind and treated the experiment as a prototype for the final project. We tried to understand the hand movements of a conductor, how they are reflected in the musical composition, and whether the robot would be able to mimic those movements.

This exploration helped us in understanding the first few steps of playing with sending data and state machines. This was then further developed and refined for the final output of Project 2. The intent was to be able to mimic the movements of a conductor while synchronizing the movements to a classical sound piece.

Case Study

YuMi takes center stage in Pisa

A case study we looked at while exploring was ABB’s robot YuMi, whose performance was developed by capturing the movements of maestro Andrea Colombini through a process known as lead-through programming. The robot’s arms were guided to follow the conductor’s motions meticulously; the movements were then recorded and fine-tuned in software. Taking inspiration from this example, we wanted to see how a robot could be put in charge of conducting a performance.

screenshot-2023-04-21-at-2-39-10-pm

Blob Opera

Blob Opera is a machine learning experiment by artist David Li, hosted on Google Arts & Culture, that lets you create your own festive song inspired by opera. Users guide the pitch and vowel sound of four festive blobs, which transform musical ideas into harmonies. The interactive website also offers prerecorded popular tracks based on location; currently there are 6 preloaded locations and a total of 22 prerecorded soundtracks. Users can record their own compositions or the preloaded tracks, although recordings cannot be downloaded, only shared as a video link tied to the browser.

screenshot-2023-04-21-at-2-39-00-pm

Storyboards, Process and Fabrication

darkness  screenshot-2023-04-21-at-2-39-30-pm

For this project, we wanted to create a collaborative experience between one human user and one UR10E Robot Arm. Keeping this in mind, we decided to work with the idea of the robot as an Orchestra Conductor, although it is the human user who would be controlling the robot from the laptop.

The idea was to create a program in Processing where the user scrolls the mouse over the four notes/buttons seen on the screen. The interaction is designed so that the user can scroll over these notes to create their own symphony. In response, a state machine is activated and the robot arm moves to predefined positions, making the experience feel as though the robot is conducting the orchestra.

During the development process, we decided to create miniature tools and props for the scene, which resulted in creating a more dramatic effect. We were also keen on understanding how the robot’s movements can be manipulated if it held a miniature baton. Keeping these aspects in mind we developed a storyboard to frame the scene and understand lighting. This also helped us in building the props required for the scene.

While going through this process, we were also keen on giving the robot some character. To add to the mood and experience, we decided to step away from traditional classical sound pieces and instead work with Blob Opera (Google Arts & Culture). Blob Opera is an interesting tool that allows the user to compose and generate sound pieces while adjusting the pitch and vowel sound as required. We found its character very appealing, and it gave us the opportunity to build our own notes for this interactive experience. After understanding the basics of this method, we created a program in Processing to send mouse data and trigger both the note and the motion of the robot arm holding the baton, as sketched below.
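
A minimal sketch of the interaction, with four zones standing in for the notes; playNote() and sendPose() are hypothetical placeholders for triggering the sound and publishing the arm’s predefined pose, and the pose values are illustrative, not real targets:

int activeNote = -1;
float[][] poses = {
  {0.2, 0.8}, {0.4, 0.6}, {0.6, 0.6}, {0.8, 0.8}  // placeholder pose targets
};

void setup() {
  size(800, 200);
}

void draw() {
  float w = width / 4.0;
  for (int i = 0; i < 4; i++) {
    fill(i == activeNote ? color(255, 180, 0) : color(60));
    rect(i * w, 0, w, height);
  }
}

void mouseMoved() {
  int note = constrain(int(mouseX / (width / 4.0)), 0, 3);
  if (note != activeNote) {
    activeNote = note;
    playNote(note);                            // trigger the note
    sendPose(poses[note][0], poses[note][1]);  // move the arm to its pose
  }
}

void playNote(int n) {
  println("note " + n);  // placeholder: the sound was driven by Blob Opera
}

void sendPose(float x, float y) {
  println("pose " + x + ", " + y);  // stand-in for the robot publisher
}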

screenshot-2023-04-21-at-2-37-43-pm  screenshot-2023-04-21-at-2-39-20-pm  screenshot-2023-04-21-at-2-37-34-pm  screenshot-2023-04-21-at-2-36-28-pm  screenshot-2023-04-21-at-2-36-42-pm

Final Code

Processing Code – https://github.com/zaheensandhu/Robot-Conductor.git

Final Video

Final Video on Youtube

final shot.mp4

Project 2

Mr. WakeMeUp

–By Nicky Guo

screenshot-2023-04-21-at-9-28-27-am screenshot-2023-04-21-at-9-28-17-am

Description

In Project 2, I decided to make a robot program that wakes up people who are studying or working at a table and are about to fall asleep or have fallen asleep. Specifically, a huge hammer of my own design and assembly is attached to the robot arm’s gripper; by sending data to the robot arm, it moves from a static standby state to an active one and then swings from top to bottom onto the sleeping person. The initial inspiration came from the fact that I often accidentally fell asleep in the middle of studying, which prevented me from completing the tasks I had planned in time. So I hope that in the near future there can be smart technology that helps me stay awake in a gentle way. My research/design question is how I can make a robot wake users up in a gentle, enchanting, and effective way. Such a robot program should not wake people exactly like an alarm clock, because that would be boring and startling, and the user would become uncomfortable and stop using it. Therefore, with the design goals of being fun and gentle but effective, I began my design and production process.

Video Link:

https://drive.google.com/file/d/17NBxIJuY0x2BwphfcK32KRLxijJPxZ3K/view?usp=share_link

 

Process 

Design of the physical tool

The first step was the production of the physical tool: the hammer. For it to wake the user effectively, the hammer is designed to be very large; the larger striking surface heightens the user’s sense of being hit. As for materials, I chose a soft material for the head of the hammer, so it could wake the user without hurting them or being more than they could take, and a hard material for the handle, so the hammer could be attached to the gripper more securely. I used a rolled-up blanket as the head of the hammer, inserted a cardboard wallpaper-roll tube as the handle, and taped the joint to fix it in place.

screenshot-2023-04-21-at-9-17-19-am screenshot-2023-04-21-at-9-17-32-am

Motion design

The next step was to design the trajectory of the robot arm and hammer. I wanted it to be a little more interesting than a simple, rigid move-and-hit. To do this, I added some anthropomorphic elements, imagining the robot as a treacherous but lovable character. Before the hammer falls on the user’s head, the robot performs a set of preparatory moves; if the user wakes up during these, they still have a chance to avoid being hit by the hammer.

Programming

For the programming, I used the method of sending data to the robot, referencing the example in the course module. Moving the mouse into a different numbered area triggers a different robot behaviour: area 1 represents the stationary standby state, area 2 causes the robot arm to start moving until the hammer falls, and area 3 causes the robot to return to its original position. Due to time and technical limitations, this prototype can only be controlled by the mouse for now, but in my vision of the finished robot, it would automatically detect the user’s state and then perform the sequence on its own.
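
As an illustrative sketch of this control scheme (not the course’s exact data-sender code), the mouse position selects one of three numbered areas, and a hypothetical sendCommand() stands in for publishing to the arm:

int lastArea = 0;  // 1: standby, 2: swing the hammer, 3: return home

void setup() {
  size(600, 200);
}

void draw() {
  background(30);
  float w = width / 3.0;
  for (int i = 0; i < 3; i++) {
    fill(i + 1 == lastArea ? color(200, 60, 60) : color(70));
    rect(i * w, 0, w, height);
    fill(255);
    text("area " + (i + 1), i * w + 10, 20);
  }
}

void mouseMoved() {
  int area = constrain(int(mouseX / (width / 3.0)), 0, 2) + 1;
  if (area != lastArea) {
    lastArea = area;
    sendCommand(area);  // trigger the matching robot behaviour
  }
}

void sendCommand(int a) {
  println("area " + a);  // stand-in for the data sender
}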

 

Reference

https://canvascloud.ocadu.ca/courses/6182/pages/data-sender-setup-+-processing?module_item_id=359888

 

Project 2

screen-shot-2023-04-19-at-10-25-24-pm 

Summary

Project theme: Sweep the Duck

Nature of interaction: mainly collaborative with one human involved, and the human is in charge. Slightly self-competitive, from the robot’s perspective.

Focus: sending data to the robot

Physical tool designed: sweeper tool, using the robot’s gripper as part of the tool.

Related motion/interaction: left to right to sweep, open to close to obtain prize.

Question investigated:  Can robots play to win?

VIDEO: https://ocadu.yuja.com/v/digfs5005501project2dchoi

 

Project Description

Project 2 is called ‘Sweep the Duck’, and it centres on the themes of youthfulness, fun, and teamwork. The question we sought to answer is whether robots can play to win; more specifically, whether a robot can perform a specific movement and get accurate results repeatedly. In Sweep the Duck, using the fish attached to your arm, also known as your sweeper, the goal is to sweep the duck across the pond so it reaches the sunflower pad. If the duck touches the sunflower pad, you progress to the next level, where the pad is slightly farther away. If you win three levels, you get a prize at the end of the game. The game is meant to be friendly, with positive reinforcement encouraged through assistive interaction; there are no losses. In this setting, the robot acts as the player, and the human acts as the game referee and assistive coach. The game is self-competitive and collaborative in nature. From this project, we were able to deduce that robots can perform specific movements accurately, but not necessarily precisely: the outcome of a specific movement depends not only on the robot but also on supporting factors in its environment.

Process

Inspiration

The themes for project 2 are inspired by children’s fishing games, mini golf, and play-to-win games at the Canadian National Exhibition (CNE).

screen-shot-2023-04-19-at-6-58-46-pm

Elements and characters in the setting

The robot

screen-shot-2023-04-19-at-8-45-26-pm

The main character in the story. Aims to win a prize by sweeping the duck toward the sunflower pad. Does not always have precise aim.
The human

screen-shot-2023-04-19-at-8-44-22-pm

The secondary character in the story. Aims to help the robot win the game, by re-positioning the duck. Gives the robot their prize at the end.

 

The duck

screen-shot-2023-04-19-at-8-19-48-pm

The secondary ‘character’ in the story. Aims to reach the sunflower pad with the effort from the robot.
The sweeper arm tool

screen-shot-2023-04-19-at-8-24-03-pm

Rather than creating a separate individual tool, I decided to create a tool that acts as an attachment to the robot’s arm/gripper. I was inspired by intelligent robots as seen in movies and video games (below), which have these capabilities built in.

screen-shot-2023-04-19-at-10-31-08-pm

 

The sunflower

screen-shot-2023-04-19-at-8-46-59-pm

Is the designated goal for the duck and robot.
The setting

Blue-green tissue paper colours aim to mimic water in a pond.
The robot’s prize

screen-shot-2023-04-19-at-8-41-35-pm

A box of sour candy. (In this story, the robot loves candy, particularly sour tasting flavours).

The set-up

Figures from left to right: A) Robot’s fish sweeper to propel the duck, B) Successfully brings duck to the sunflower pad, C) Human brings robot its prize, D) Robot takes its prize.

 screen-shot-2023-04-19-at-7-35-41-pm

Cinematography

As the themes for this project centred around youthfulness, fun, and collaboration/teamwork, sound effects were used to support the scene’s approach. The camera angle is from the perspective of the human, with a realistic feel. Slight post-production effects were used to help put the clips together and create the scene atmosphere, by using iMovie video editing software. As the cinematography was filmed by one person, it was difficult to coordinate multiple parts in one continuous filming. As such, filming was broken down into chunks. The minor editing post-production helped to bring all the filmed clips together.

Behind the scenes – the process and exploration

Figures from left to right: A) Attempting to sweep the duck to the sunflower, B) Protective stop from robot’s software, C) Duck landed successfully on sunflower via touchpad control, D) Multiple attempts of duck landing on sunflower via offline animation control, E) robot picking up its prize with gripper activation.

screen-shot-2023-04-19-at-7-17-31-pm

Script file: https://ocadu.yuja.com/m/sweepduckdotscript

Key challenges and insights during the process

  1. It was difficult to film as one person, using one hand, while balancing the timing of the filmed clips versus the robot’s movements. In future production, I hope to utilize access to a supportive device, such as a tripod, to assist me with filming a solo project.
  2. Grip vs blundering. Sometimes the duck would get stuck right underneath the sweeper/robot’s gripper, due to the speed of the robot’s movement or due to the texture of the scene’s man-made setting. This triggered the protective stop from the robot’s software. This was overcome by strategic positioning of the duck. In future production, I would like to consider more slippery textures to utilize as a strategy to reduce friction between objects.

 

References (images)

  1. Fishing Game with Hook and Reel. (n.d.). Giant Tiger. https://www.gianttiger.com/products/fishing-game-with-hook-and-reel?variant=40336447209533
  2. Play Tiny Fishing – Reel in a legendary fish | Coolmath Games. (n.d.). https://www.coolmathgames.com/0-tiny-fishing
  3. Stortz, M. (2022, August 20). 35 games at the CNE ranked from easiest to hardest. blogTO. https://www.blogto.com/sports_play/2019/08/games-cne-toronto/
  4. (2023, April 13). The Best Mini Golf In Toronto Is Just A Putt Away (Our Top 7 Picks) – Indie88. Indie88. https://indie88.com/mini-golf-toronto/
  5. Robotic arm Anime Mecha, weapon, game, electronics, fictional Character png | PNGWing. (n.d.). https://www.pngwing.com/en/free-png-yncih
  6. HD wallpaper: Anime, Metal alchemist, Blond, Robot, Arm, Weapon, one person | Wallpaper Flare. (n.d.). https://www.wallpaperflare.com/anime-metal-alchemist-blond-robot-arm-weapon-one-person-wallpaper-hvhpi
  7. Xuan, Y. Z. (2019, December 26). From The Bottom. Pinterest. https://www.pinterest.ca/pin/from-the-bottom–802414858595500565/

Project 2 – Group 5

My Robot Coach

Group 5: Wentian Zhu, Ellie Huang, Jiamin Liu, Yifan Xu


Group5

Bounty Hunter on Cloud: https://www.youtube.com/watch?v=jXso3idTFWA

In this project, we investigated the potential of robot arms to enhance sports and gaming training in a simulated environment. In the physical world, expertise in sports and games usually entails repetitive exercise and access to appropriate equipment. Unfortunately, such resources may not be accessible to everyone, which is where robot arms come in as a valuable aid. By generating virtual scenes and offering support for sports or game-based training, robot arms can be of tremendous assistance to a wider population.

Through the use of technologies such as Arduino, WebSocket, and Unity, we were able to demonstrate how robot arms can create an immersive and engaging training experience that promotes hand-eye coordination and quick thinking. However, challenges such as optimizing the communication delay between the software components still need to be addressed to improve the overall gaming experience.

In addition, the feedback on the game’s representation of guns and shooting from a political perspective highlights the potential for exploring the concept further. By tying it into current societal, cultural, and political issues, we can create a thought-provoking experience that engages players in a dialogue about violence, human society, technology, and ethics.

 

Workflow

wwwww

The Arduino converts the sensor data into JSON format and sends it to the WebSocket server, which in turn relays it to Unity. Unity then parses the received JSON into C# objects.

 

Arduino Diagram and Container Design

w

Environment Setup

ezgif-com-video-to-gif

User Flow

wewew

 

Design Process

Step 1: Connect both Arduino and Unity to the WebSocket

In Step 1, we connected Arduino and Unity to the WebSocket for real-time data transfer, used a light sensor and a laser pointer to trigger animations, and added a score system, a reset button, and an LED light strip for feedback. This created an engaging game experience that responds promptly to user actions.

ezgif-com-video-to-gif-1

 

Step 2: Refinement

In Step 2, we refined the scene design and created storylines to add depth to the game. We also started to consider the design of the Arduino container and the cable organization to enhance the user experience. Lastly, we changed the scoreboard to a reward system to make the game more engaging and encourage players to continue playing.
ezgif-com-video-to-gif-23222

Step 3: Final Adjustment

In Step 3, we laser-cut the monster target and added ambient light to create a more immersive gaming experience, then programmed the robot arm to respond to the user’s movements and refined the design of the Arduino container. These improvements made for a more seamless and enjoyable game and helped us reach a wider audience, including gamers and professional athletes looking to improve their reaction time and critical thinking skills.

323232

 

Link to presentation slides: https://docs.google.com/presentation/d/1UARgt_aAtIAvsono_Bd3K7zpYPW3exxm33bZj-n9tMY/edit?usp=sharing

Github Link: https://github.com/NarrowSpace/ArduinoUnityWebsocket

 

Credits:

Unity Assets:

Magic Effects Free: https://assetstore.unity.com/packages/vfx/particles/spells/magic-effects-free-247933

Free Pixel Font – Thaleah: https://assetstore.unity.com/packages/2d/fonts/free-pixel-font-thaleah-140059

Pixel Art Icon Pack – RPG: https://assetstore.unity.com/packages/2d/gui/icons/pixel-art-icon-pack-rpg-158343

Simple Heart Health System: https://assetstore.unity.com/packages/tools/gui/simple-heart-health-system-120676

LowPoly Environment Pack: https://assetstore.unity.com/packages/3d/environments/landscapes/lowpoly-environment-pack-99479

Adventurer Blake: https://assetstore.unity.com/packages/3d/characters/humanoids/adventurer-blake-158728

Character Cactus: https://assetstore.unity.com/packages/3d/characters/creatures/character-cactus-32933

Sound Effects:

Minecraft Villager Hurt Sound Effect: https://www.youtube.com/watch?v=1wJsOoUYKyY

“8 Bit World!” Fun Upbeat Chiptune Game Music by HeatleyBros: https://www.youtube.com/watch?v=VijZQa6hT9U&list=LL&index=6

3 2 1 0 Countdown With Sound Effect | No Copyright | Ready To Use: https://www.youtube.com/watch?v=DiUGv1vsuSU

WIN sound effect no copyright: https://www.youtube.com/watch?v=rr5CMS2GtCY

Image:

Comic speech bubble with vs text: https://www.freepik.com/free-vector/comic-speech-bubble-with-vs-text_14201655.htm#query=vs&position=21&from_view=search&track=sph

Big win surprise banner in comic style:

https://www.freepik.com/free-vector/big-win-surprise-banner-comic-style_4192021.htm#page=3&query=win%20title&position=4&from_view=search&track=ais

Project 2: Naughty Robotcat – Group 1

Group members: Siyu Sun, Yueming Gao, Maryam Dehghani, Victoria Gottardi

Investigation

  • Brainstorming 

To generate ideas for a project involving interaction, we considered the type of interaction, physical tools related to motion and interaction, and how to interact. This led us to the initial idea of a pet robot. Given that our equipment was limited to a robotic arm, we decided to model the actions of a cat’s paw and incorporate playful movements.

As we progressed, this concept evolved and became more refined. Ultimately, we settled on the idea of a cat’s paw that displays movement and behaviour that the viewer can communicate with and develop a connection to.

miro-brainstorming

 

  • Pet Therapy

Pet therapy, also known as animal-assisted therapy (AAT), is a type of therapy that involves trained animals interacting with people. The animals are trained to interact calmly and gently with people and are often used in hospitals, nursing homes, and schools to provide emotional support and companionship to patients and students.

  • Research question: Is it possible to create an interactive and collaborative Naughty Robocat, and can all of our separate components work together to create a fun, inviting, and somewhat stress-free work environment?

References

  • The Loona Robot: This is an adorable robotic pet. Some people find comfort in interacting with robotic animals, as they can provide a sense of companionship without the potential allergies, mess, or responsibility of caring for a live pet. (https://keyirobot.com/products/loona)

the-loona-robot

  • PARO Therapeutic Robot: This is an advanced interactive robot developed by AIST, a leading Japanese industrial automation pioneer. It allows the documented benefits of animal therapy to be administered to patients in environments such as hospitals and extended care facilities where live animals present treatment or logistical difficulties. (http://www.parorobots.com/index.asp)

paro


Procedure

  • Laser Cut for Cat paw

To make the robotic arm resemble a cat’s paw, we began by altering its appearance. Using a laser-cutting machine and an acrylic craft felt sheet, we made a glove-shaped covering in the form of a cat’s paw, which we then placed over the robotic arm.

lasercut01 lasercut02 microsoftteams-image-4

  • Interactive Interface – Arduino & Robot

To increase engagement with the audience, we decided to include a touch sensor to initiate the arm’s movements. The Arduino was set up under the table, and the sensor was hidden under the sheet.

diagram-of-set-up

We designed different movements for two robot states: in state 1, the cat tentatively observes its environment; in state 2, the cat acts on the environment. A sketch of how the touch sensor drives the states follows below.

robotstate-and-touchsencor
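
On the Processing side, a minimal sketch of this setup might read touch events over serial and toggle between the two states; the serial message format and the sendState() publisher here are assumptions, not our exact code:

import processing.serial.*;

Serial port;
int state = 1;  // 1: observing tentatively, 2: acting on the environment

void setup() {
  size(200, 200);
  printArray(Serial.list());                        // find the Arduino's port
  port = new Serial(this, Serial.list()[0], 9600);  // port index is machine-specific
  port.bufferUntil('\n');
}

void draw() {
  // simple on-screen indicator of the current state
  background(state == 1 ? color(60, 60, 120) : color(220, 140, 0));
}

void serialEvent(Serial p) {
  String msg = p.readStringUntil('\n');
  if (msg != null && trim(msg).equals("1")) {  // assumed "touched" message
    state = (state == 1) ? 2 : 1;              // toggle between the two states
    sendState(state);
  }
}

void sendState(int s) {
  println("robot state: " + s);  // stand-in for publishing to the arm
}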

  • Working environment 

Ultimately, we aimed to simulate a real-life setting by creating a suitable environment and positioning a person in a workspace, with a robotic arm in place of a live cat. By incorporating familiar cat behaviours and making slight adjustments to the arm’s appearance, we endeavoured to evoke a sense of owning a pet similar to the real-life experience.

microsoftteams-image-2 microsoftteams-image-2-copy

As we expand the range and diversity of the movements, we may observe more interactions between the robot and humans than exist in traditional robot-human relationships.

  • To Mimic a cat 

In the next phase, we focused on modifying the movements of the robotic arm to emulate the desired actions. Our goal was to replicate the natural behavior of a cat, while also incorporating playful and mischievous movements, such as touching an object and suddenly knocking it over with a swift motion.

  • Three movements of robot cat 

Slapping the cup

moment1_slap-the-cap

Interrupting work

 moment2_interrupting-work

Patting a new toy

 moment3_pat-the-pig

 


Challenges

  • We had to consider how the robot arm’s movements would affect our created environment, as some movements did not go as planned (e.g., knocking over a plant and hitting the laptop).

smallsizehumanrobot-project2-backstage01

  • The pig was too soft to be recognized by the touch sensor, so we had to adapt by taping a hard object (in this case a USB stick) on the bottom of it. Even then, it took many tries to get the USB stick to touch the sensor successfully.

microsoftteams-image-3smallsize


Conclusion

For this assignment, we considered the use of animal therapy robots, such as PARO, the therapeutic robot. However, we aimed to develop an animal robot that could potentially help people in the workplace. Our cat robot, “The Naughty Robocat,” would serve as a companion in the workplace, helping to destress the environment and provide workers with a fun, healthy distraction and break from their work as they interact with the Robocat.

As we considered the interactions that The Naughty Robocat would perform, we began to address some important questions. The established interactions would be collaborative between us, the Robocat, and a touch sensor attached to an Arduino. Currently, the project requires one human in a simulated work environment, while other humans behind the scenes preplan the Robocat’s movements across the two robot states; these movements are then carried out when the human in the work environment interacts with the touch sensor while working.

In future work, our project can be extended to explore other aspects, such as designing more interactions between humans and robotic cats, investigating whether robotic cats can have different reactions to users’ actions, and exploring the possibility of playing games together between robotic cats and humans. Researching these issues can help us delve deeper into the relationship between humans and robots.


Video URL: https://vimeo.com/819194839

Robot script URL: https://github.com/vicisnotokay/NaughtyRobocat_Script

Arduino/processing code URL: https://github.com/vicisnotokay/TouchSensorCode

Dev 4 – Group 6: Anas Raza, Faraas Khan

g3headerdev4

Light painting with the robotic arm caught our attention at the beginning of the course. We are fascinated by the glowing effect of light patterns synced with sound effects, and this is what we are trying to produce for our final project. At this stage, we are investigating how to move the robotic arm along a vector path, for example, how to draw a rectangle in Processing. The rect() function draws the rectangle completely and renders it all at once, but we are trying to figure out how to draw one point at a time. We expect to find approaches that output one point of a pattern at a time; for example, in the image below the point moves along a path, outputting an x-value each frame. In order to publish the x and y values of a moving point to the robotic arm, we need the xy coordinates at each frame. The provided code that sends x and y values on each click is helpful, but we might change it to respond to mouse movement so we can control the arm freely.

Another approach is to feed sin() values to the arm to move it along a smooth path.
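
As a sketch of what we are aiming for (illustrative values, not working project code), the rectangle’s perimeter can be parameterized so that a single point advances along the path each frame; each frame’s (x, y) is what would be published to the arm, and sin() values could be fed through the same loop for a smooth curve:

float t = 0;          // position along the perimeter, 0..1
float speed = 0.002;  // how far the point advances per frame

void setup() {
  size(400, 400);
  background(0);  // drawn once, so the points accumulate like a light trail
}

void draw() {
  float x0 = 100, y0 = 100, x1 = 300, y1 = 300;  // rectangle corners
  PVector p = pointOnRect(t, x0, y0, x1, y1);
  stroke(255, 220, 0);
  point(p.x, p.y);        // one point per frame...
  // println(p.x, p.y);   // ...and these are the values to publish
  t = (t + speed) % 1;
}

// map a parameter 0..1 to a point on the rectangle's perimeter
PVector pointOnRect(float u, float x0, float y0, float x1, float y1) {
  float w = x1 - x0, h = y1 - y0;
  float d = u * 2 * (w + h);
  if (d < w) return new PVector(x0 + d, y0);  // top edge
  d -= w;
  if (d < h) return new PVector(x1, y0 + d);  // right edge
  d -= h;
  if (d < w) return new PVector(x1 - d, y1);  // bottom edge
  d -= w;
  return new PVector(x0, y1 - d);             // left edge
}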

What you learned

  • The moveL function of the robotic arm is not suitable for smooth uninterrupted motion.
  • mouseMoved() is our best bet at this point, since we were unable to find a solution to output the individual coordinate values of each point in a rectangle or any other vector path.

Process work

The first step is to make the room completely dark: we placed black cardboard panels over all the windows and set up a black backdrop.

pxl_20230412_195858769  pxl_20230412_200228627

Some text runs

2 2 1 3

Dev 3 – Group 6: Anas Raza, Faraas Khan

headd3-g6

Investigation

What is it you are trying to figure out?

Our main goal at this stage is to test sending data to and receiving data from the robot arm. Building on this, we are investigating visual effects that can be generated with robotic arm data, including visceral effects that emerge from synchronizing the arm’s physical motion with on-screen graphical media, and whether this combination creates a unique and immersive experience. A potential application is an interactive installation that incorporates both visual and participatory physical elements. In the next stage, we hope to explore having the robot arm respond to human movement (if circumstances allow).

Documentation of process and results

For this exploration we are using an existing p5.js code example and replacing the motion/interaction coordinates with the X and Y of the robot arm. Because of the arm’s orientation, we have mapped the arm’s Y-value to the screen’s X-coordinate and the arm’s Z-value to the screen’s Y-coordinate.

// arm Y (-260..200) drives the sketch's horizontal position
let rx = map(rposY, -260, 200, 100, windowWidth - 100);
// arm Z (455..875) drives the vertical position, reversed so a higher arm is higher on screen
let ry = map(rposZ, 455, 875, windowHeight - 100, 100);

dev3-g6

In case the GIF doesn’t work:
https://drive.google.com/file/d/1t-Iz9S-UbgfYsIG6ERZCywWqvDcqUoWX/view?usp=sharing

How did it go?

This experiment went well, we think. We had the opportunity to practice preliminary yet crucial steps: establishing server connections, calibrating data, and exploring creatively.

Learning Experience

Every interaction with the robotic arm is a learning experience at this stage; just knowing how to control the robot externally opens up many possibilities for us. The most important lesson in this part of the assignment was figuring out data mapping: the published values are very different from the graphics’ coordinates, and data calibration is the most critical part of any project where two different digital systems interact. We are using the map() function to calibrate data in this setting, but as the project becomes more complex, constraining incoming or outgoing values may also be needed. We hope to learn more about controlling the robot arm as we progress to the next stages.

Forked code: https://editor.p5js.org/anasraza/sketches/S-1tIFDG2