I made this sketch while learning how to use TouchDesigner. This sketch uses a shape and noise to give life to an inanimate object.
See video – https://youtu.be/aIQ-i0PxcXg
Sketch 5: Metamorphosis
For Sketch 5, I wanted to explore the idea of using dynamic particles to create visuals. For this sketch I worked with an Arduino, a potentiometer, and TouchDesigner. The piece responds to the potentiometer: the Arduino reads its value, which directly controls the number of particles created. The lower the value, the fewer the particles; the higher the value, the more particles are used to create the 3D form.
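A minimal Arduino sketch of the kind described above, shown as an illustration rather than the project's actual code: it reads the potentiometer and prints one value per line over serial so that TouchDesigner's Serial DAT can map it to a particle count. The analog pin (A0) and baud rate (9600) are assumptions.

// Read a potentiometer and send its value over serial for TouchDesigner.
const int potPin = A0;        // assumed wiring: potentiometer wiper on A0

void setup() {
  Serial.begin(9600);         // assumed baud rate; must match the Serial DAT
}

void loop() {
  int potValue = analogRead(potPin);   // 0-1023 from the potentiometer
  Serial.println(potValue);            // one reading per line for easy parsing
  delay(50);                           // limit the send rate
}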
Full Video: https://ocaduniversity-my.sharepoint.com/:v:/g/personal/shiprab_ocadu_ca/EasRl5PkM3VKmo9KMI0EPLMBib9L9bQbbcHZo6NUCNyl4g?e=Q9446n
Screen Recording: https://ocaduniversity-my.sharepoint.com/:v:/g/personal/shiprab_ocadu_ca/EaF-HtemO_xKvHKQlyDLOvABroV8h0e_1NfIV0gMd2zoSg?e=bwJBgw
References
3D shapes & particles : https://www.youtube.com/watch?v=Y3qoGoY1NyQ
Connecting Arduino to TouchDesigner : https://derivative.ca/community-post/tutorial/how-use-touchdesigner-arduino-together-beginner-tutorial/65273
For this sketch, I wanted to experiment with creating an almost glittery, polychromatic effect by playing with the effect of the pitch and roll values on my Processing sketch. The Processing sketch was derived from Daniel Shiffman’s “Purple Rain” coding tutorial, which I used to practise the application of object classes and movement. I then tweaked the code to read the pitch and roll values from the Arduino and use them to alter the colour and size respectively.
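A minimal Processing sketch of that mapping, written as an illustration rather than the code in the repository linked below: it assumes the Arduino sends "pitch,roll" pairs over serial, and the serial port index, value ranges, and the stand-in falling lines (in place of the Purple Rain Drop objects) are all assumptions.

import processing.serial.*;

Serial port;
float pitch = 0;
float roll = 0;

void setup() {
  size(640, 360);
  colorMode(HSB, 255);
  port = new Serial(this, Serial.list()[0], 9600);  // assumed port index and baud
  port.bufferUntil('\n');
}

void draw() {
  background(20);
  float hue = map(pitch, -90, 90, 0, 255);   // pitch -> colour
  float len = map(roll, -90, 90, 5, 40);     // roll -> size of each drop
  stroke(hue, 200, 255);
  strokeWeight(3);
  // stand-in for the Purple Rain Drop objects
  for (int x = 20; x < width; x += 40) {
    float y = (frameCount * 4 + x * 7) % height;
    line(x, y, x, y + len);
  }
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line == null) return;
  float[] vals = float(split(trim(line), ','));
  if (vals.length == 2) {
    pitch = vals[0];
    roll = vals[1];
  }
}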
The act of moving the Arduino to create these visuals felt like playing with a kaleidoscope or snow globe, and I would have liked to experiment more with the direction and speed of the particles as well.
The code I used for the Arduino was similar to the one we practised in class.
Here’s the code for the sketch: https://github.com/anushamenon/Sketch-5
For this sketch, I attempted to use p5.party to create a p5.js sketch that many users can interact with at the same time from different browsers. My attempt wasn’t very successful, but I will continue working to figure it out.
Here is the code:
https://editor.p5js.org/mufaromukoki/sketches/74c8zJoHM
In this project, I wanted to run a GIF on the OLED screen. To do this, I converted individual frames of the image into code and added them to my sketch at a smaller size. I used a 128×64 OLED screen and set the GIF size to 30×30. In this way it is possible not only to convert images into code, but also to display them in animated form.
To start, I split the GIF file into 27 separate frames and resized them. For this I used websites and free software such as IrfanView and ezgif.com, and then converted the batch of frames into code with the javl.github.io/image2cpp web tool.
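As a rough illustration of the idea (not the code in the repository linked below), a sketch along these lines can page through the image2cpp byte arrays with the Adafruit_SSD1306 library; the library choice, I2C address, and the zero-filled placeholder frames are assumptions, with the real byte data pasted in from image2cpp.

#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>

Adafruit_SSD1306 display(128, 64, &Wire, -1);

const int FRAME_W = 30;
const int FRAME_H = 30;

// Each 30x30 frame exported by image2cpp is 120 bytes; paste the real data here.
const unsigned char frame0[120] PROGMEM = { 0 };
const unsigned char frame1[120] PROGMEM = { 0 };
// ...frames 2 through 26 would follow in the full sketch...

const unsigned char* const frames[] = { frame0, frame1 };
const int FRAME_COUNT = sizeof(frames) / sizeof(frames[0]);

void setup() {
  display.begin(SSD1306_SWITCHCAPVCC, 0x3C);  // assumed I2C address
  display.clearDisplay();
  display.display();
}

void loop() {
  for (int i = 0; i < FRAME_COUNT; i++) {
    display.clearDisplay();
    display.drawBitmap(49, 17, frames[i], FRAME_W, FRAME_H, SSD1306_WHITE);  // centred frame
    display.display();
    delay(80);  // controls the animation speed
  }
}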
Code on Github: https://github.com/DehghaniMaryam/Create-Computation-Sketch5/commit/9536b98cd03288b1548245be5860dd28a67f18fd
I am trying to design a simple quiz game related to my idea, but I am still exploring how to make it.
I use the keyPressed() function to let the audience type in a message.
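A minimal Processing illustration of the keyPressed() idea (not the code in the link below): typed characters are collected into an answer string and drawn to the screen, with Backspace and Enter handled; the layout and text are placeholders.

String answer = "";

void setup() {
  size(600, 200);
  textSize(32);
}

void draw() {
  background(20);
  fill(255);
  text("Your answer: " + answer, 40, height/2);
}

void keyPressed() {
  if (key == BACKSPACE) {
    if (answer.length() > 0) answer = answer.substring(0, answer.length() - 1);
  } else if (key == ENTER || key == RETURN) {
    answer = "";                 // submit/clear; checking the answer would go here
  } else if (key != CODED) {
    answer = answer + key;       // append the typed character
  }
}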
Code link:https://github.com/YuemingGaoMINGMING/MINGMING/blob/main/sketch_5.pde
References:
1.https://processing.org/tutorials/typography
2.https://processing.org/examples/button.html
For this sketch, I was testing out how to make Processing recognize a series of .png files and display them in sequential order, so that it looks like an animated image.
I used imageCount, which you can set up to match the naming format of the files Processing draws its data from, so you don’t have to manually type each filename into the code. This is especially helpful if you have a lot of images.
I also used mousePressed, so that the lines would “grow” while you held down the mouse button and stop growing when you released it.
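A short Processing sketch of that technique, written as an illustration rather than the repository code: the frame count, the "frame0001.png" naming pattern, and the single growing line are assumptions, but it shows imageCount driving the filenames and mousePressed controlling the growth.

int imageCount = 12;          // number of .png frames in the data folder (assumed)
PImage[] frames;
float grow = 0;               // how far the line has grown
int frame = 0;

void setup() {
  size(640, 360);
  frames = new PImage[imageCount];
  for (int i = 0; i < imageCount; i++) {
    frames[i] = loadImage("frame" + nf(i + 1, 4) + ".png");  // frame0001.png, frame0002.png, ...
  }
  frameRate(12);
}

void draw() {
  background(255);
  frame = (frame + 1) % imageCount;
  image(frames[frame], 0, 0);
  if (mousePressed) {
    grow = min(grow + 2, height);   // grow only while the button is held
  }
  stroke(0);
  line(width/2, height, width/2, height - grow);
}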
Github of code:
https://github.com/geipanda/Sketch5_ImageSequence
Resources:
– This example on Processing helped me to figure out the imageCount function.
– Animation created by me.
For this sketch I wanted my Arduino board to communicate with p5.js. I used the p5.serialcontrol application to have the two communicate through a single serial port.
I decided to map the brightness of my LED to the sketch in the p5.js web editor, setting 0 as the lowest brightness level and 255 as the highest.
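A small p5.js illustration of that 0-255 mapping (not the author's exact sketch): it assumes the p5.serialport library is loaded alongside the p5.serialcontrol app, that the mouse position stands in for whatever drives the brightness, and that the Arduino reads each incoming byte and passes it to analogWrite() for the LED; the port name is a placeholder.

let serial;

function setup() {
  createCanvas(400, 400);
  serial = new p5.SerialPort();
  serial.open('COM3');            // replace with the port shown in p5.serialcontrol
}

function draw() {
  // map the mouse position to the 0 (off) - 255 (full brightness) range
  let brightness = round(map(mouseY, 0, height, 0, 255));
  serial.write(brightness);       // one byte per frame sets the LED level
  background(brightness);         // mirror the LED level on the canvas
}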
My next step was to try and link Arduino with Ableton. I wanted to use the potentiometer to control the volume or frequency of a sound.
I downloaded the Ableton Live app, set up the Connection Kit, and uploaded the StandardFirmata sketch to my Arduino board after editing the Boards.h file to make it compatible with the Arduino Nano.
The method I used didn’t work; I’m hoping to figure it out at a later stage.
Here are the files (StandardFirmata + updated Boards.h file)
Using the connection kit
This simple sketch moves text across the OLED screen, the way billboards and advertising screens work in everyday life.
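A minimal Arduino illustration of that marquee effect (not the sketch linked below), assuming a 128×64 I2C SSD1306 display and the Adafruit_SSD1306 library; the message text, I2C address, and scroll speed are placeholders.

#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>

Adafruit_SSD1306 display(128, 64, &Wire, -1);

String message = "HELLO WORLD ";
int x;   // current x position of the text

void setup() {
  display.begin(SSD1306_SWITCHCAPVCC, 0x3C);
  display.setTextSize(2);
  display.setTextColor(SSD1306_WHITE);
  display.setTextWrap(false);          // let the text run off the edge
  x = display.width();                 // start just off the right side
}

void loop() {
  display.clearDisplay();
  display.setCursor(x, 24);
  display.print(message);
  display.display();

  x -= 2;                                        // slide left like a billboard
  int textWidth = message.length() * 12;         // ~12 px per character at size 2
  if (x < -textWidth) x = display.width();       // wrap around and repeat
  delay(30);
}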
Code:
https://github.com/Nickyggggg/Arduino-Project/blob/f03de97b2bdf9bdff0f0c6b362fe41a509b12282/Sketch%205
Don’t forget to turn up the volume while watching!
The first project, named “Noise Generating,” was done in Pure Data with OSC, or Open Sound Control, a protocol that, like MIDI, transmits music-related data so that musicians can perform on instruments. Unlike traditional MIDI, OSC data is sent over a network, so we can transmit it wirelessly from our iPhones or other devices. For this project, I was inspired by the “Interactive Music” workshop held by the Digital Innovation Hub at the Toronto Public Library (TPL).