In this sketch, I used an optical sensor with an Arduino and a Serial DAT to bring light data into TouchDesigner. The data from the light sensor was divided into four ranges using an “if” statement, giving the four categories ‘DARK’, ‘DIM’, ‘LIGHT’ and ‘BRIGHT’.
I also attempted to bring the data in via TouchOSC and a CHOP In operator. I had difficulty figuring out how to single out a channel and connect it to a specific parameter like Birthrate or Turbo Speed.
Link to the video: https://youtu.be/ttQAbockRaA
I made this sketch while learning how to use TouchDesigner. It uses a shape and noise to give life to an inanimate object.
See video – https://youtu.be/aIQ-i0PxcXg
For this sketch, I wanted to experiment with creating an almost glittery, polychromatic effect by playing with the effect of the pitch and roll values on my Processing sketch. The Processing sketch was derived from Daniel Shiffman’s “Purple Rain” coding tutorial, which I used to practise applying object classes and movement. I then tweaked the code to read the pitch and roll values from the Arduino and alter the colour and size respectively.
The act of moving the Arduino to create these visuals felt like playing with a kaleidoscope or snow globe, and I would have liked to experiment more with the direction and speed of the particles as well.
The code I used for the Arduino was similar to the one we practised in class.
Here’s the code for the sketch: https://github.com/anushamenon/Sketch-5
For this sketch, I attempted to use p5.party to create a p5.js sketch that many users can interact with at the same time from different browsers. My attempt wasn’t very successful, but I will keep working to figure it out.
Here is the code:
In this project, I wanted to run a GIF on the OLED screen. To do this, I converted individual frames of an image into code and embedded them in the sketch at a smaller size. I used a 128×64 OLED screen and set the GIF size to 30×30. In this way, it is possible not only to convert images into code, but also to display them in animated form.
First, I split the GIF file into 27 separate frames and resized them, using free tools such as IrfanView and ezgif.com. I then batch-converted the frames into code with javl.github.io/image2cpp.
Code on Github: https://github.com/DehghaniMaryam/Create-Computation-Sketch5/commit/9536b98cd03288b1548245be5860dd28a67f18fd
I am trying to design a simple quiz game related to my idea, but I am still exploring how to build it.
I use the keyPressed() function to let the audience type in a message.
For this sketch, I was testing how to make Processing recognize a series of .png files and display them in sequential order, so that they look like an animated image.
I used an imageCount variable, which you can set so that Processing recognizes the format of the filenames it is drawing data from. That way you don’t have to manually type each file into the code itself, which is especially helpful if you have a lot of images.
I also used mousePressed, so that the lines would “grow” while the mouse button was held down and stop growing when it was released.
Github of code:
– This example on Processing helped me to figure out the imageCount function.
– Animation created by me.
For this sketch I wanted my Arduino board to communicate with p5.js. I used the p5.serialcontrol application to have the two communicate through a single serial port.
I decided to map the brightness of my LED to the sketch in the p5.js web editor, setting 0 as the lowest brightness level and 255 as the highest.
My next step was to try and link Arduino with Ableton. I wanted to use the potentiometer to control the volume or frequency of a sound.
I downloaded Ableton Live, set up the Connection Kit, and uploaded the StandardFirmata sketch to my Arduino board after editing the Boards.h file to make it compatible with the Arduino Nano.
The method I used didn’t work; I hope to figure it out at a later stage.
Here are the files (StandardFirmata + updated Boards.h file)
Using the connection kit
This simple sketch moves text across the OLED screen, the way billboard and advertising screens work in everyday life.