Experiment 3 Final Prototype – Cam Gesture

Michael Shefer – 3155884

Andrew Ng-Lun – 3164714

Rosh Leynes – 3163231

Cam Gesture

Project Description

For this project we set out to create a physical interface where the user's hand movements drive manipulations of a live image on screen. The user wears two gloves fitted with stretch-sensing fabric whose resistance changes as the fingers flex. An Arduino reads the three analog sensors and passes their values to a P5 sketch displaying a webcam feed, where they drive various manipulations: increasing the quantity of boxes/pixels, increasing and decreasing the size of the boxes/pixels, and changing the intensity of the stroke. We originally intended the final sensor to add multiple filters to the screen, but trouble with the code forced us to adapt.
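Our actual code is linked below, but as a rough sketch of the communication layer: assuming the Arduino prints the three analog readings as one comma-separated line (e.g. "512,300,780") and the sketch loads the p5.serialport library (with its p5.serialserver bridge running), the values can be read in P5 like this. The port name is a placeholder.

```javascript
// Hypothetical sketch of the Arduino-to-P5 serial link, assuming the
// p5.serialport library and an Arduino printing one "a,b,c" line per loop().
let serial;
let sensors = [0, 0, 0]; // latest readings from the three stretch sensors

function setup() {
  createCanvas(640, 480);
  serial = new p5.SerialPort();
  serial.open('/dev/tty.usbmodem1411'); // placeholder: your Arduino's port
  serial.on('data', () => {
    const line = serial.readLine().trim();
    if (!line) return;                        // wait for a complete line
    const parts = line.split(',').map(Number);
    if (parts.length === 3 && parts.every(n => !isNaN(n))) {
      sensors = parts;                        // keep only full, numeric readings
    }
  });
}
```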

The project went through various alterations compared to its initial stage. We first set out to connect the Arduino to TouchDesigner, intending to build a matrix of pixels, initially for an animation and later for a real-time webcam feed, that could be easily manipulated with the hands. The concept was that as your hands opened up, the pixels would expand, giving the illusion of control through the physical interface. We had to alter this idea quickly after running into various challenges transferring the sensor values from the Arduino into TouchDesigner, and at that point we switched to P5, which was more familiar to us.
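Although the TouchDesigner version never stabilized, the underlying pixel-matrix idea carried over directly to P5: sample the webcam feed on a coarse grid and draw one box per sample, with the sensor values controlling box count, box size, and stroke. Below is a minimal sketch of that effect, not our exact code, with the glove readings hard-coded in place of the serial input shown above.

```javascript
// Minimal p5.js sketch of the box-matrix webcam effect. The "sensors"
// array stands in for the live glove readings (0-1023 from the Arduino).
let video;
let sensors = [512, 512, 512]; // placeholders for the serial values

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide(); // we draw the feed ourselves as boxes instead
}

function draw() {
  background(0);
  video.loadPixels();

  const cols = floor(map(sensors[0], 0, 1023, 8, 64)); // sensor 1: box count
  const step = width / cols;
  const boxSize = map(sensors[1], 0, 1023, 2, step);   // sensor 2: box size
  stroke(255);
  strokeWeight(map(sensors[2], 0, 1023, 0, 4));        // sensor 3: stroke

  // One box per grid cell, colored by the webcam pixel at the cell's corner.
  for (let x = 0; x < width; x += step) {
    for (let y = 0; y < height; y += step) {
      const i = 4 * (floor(y) * video.width + floor(x));
      fill(video.pixels[i], video.pixels[i + 1], video.pixels[i + 2]);
      rect(x, y, boxSize, boxSize);
    }
  }
}
```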

As for materials, we used two gloves, one black and one white, for a simple presentation that blends with the dark gray sensors and offsets the multicolored alligator clips, along with three pieces of stretch-sensing fabric attached to the fingertips, because we wanted to emphasize different combinations of hand movement.

The initial stage of our project, in which we constructed a matrix of cubes to build an animation.


This later progressed into using the matrix of cubes to construct an image from the webcam feed. The farthest we got at this stage was briefly reading sensor values within TouchDesigner; it was never able to produce a consistent image of the webcam feed.

Pictured above is the first build of our glove. Initially all the sensors were spread across the fingertips of one glove, but we decided to split them between two gloves because over time it became difficult to manipulate certain functions.

This was our first attempt at reconstructing the TouchDesigner concept in P5.

Pictured above is the final build of the two gloves used for the critique.

Provided below are videos of the first prototype glove working with the P5 code:

https://drive.google.com/file/d/1VB5bvoKFE_A9Cye1EFV6X5dcEQyWHXO9/view?usp=sharing

https://drive.google.com/file/d/1cQFlUNgzErpail8aY0hJIDDKHp0MO3oQ/view?usp=sharing

https://drive.google.com/file/d/1PVzw5WnK9ABVWVlx3PQaTlcyjAMcEINa/view?usp=sharing

https://drive.google.com/file/d/1NY4Zog-s1ACMlxkXvsuDUniqPl_C3X_z/view?usp=sharing

Project Context

When given the project, we immediately wanted to utilize body movement that would have a direct relationship with the display. While looking for inspiration we came across Leap Motion, a company that specializes in VR and, more specifically, in tracking hand and finger motions as sensor input. From their portfolio we took the idea of having individual finger sensors perform different functions.

Code with comments and references

https://github.com/NDzz/Atelier/tree/master/Experiment-3

https://github.com/jesuscwalks/experiment3

 
