Midterm Assignment

Group: Zoey Lu, Sara Hosseini



Zoey and I wanted to use the song Frozen by Madonna & Sickick for this project because of its beats and minimal lyrics, and we thought it could be very cool to create an audio visualization with it. After looking for inspiration in the p5.js references and coming up with some ideas of our own, we started to pick and choose which ones we wanted to include and what our final goal was; for example, I had the idea of a galaxy theme with stars because I thought it would suit the song. I've been struggling with JavaScript because it's not a language I use a lot, but Zoey was a huge help with this and I learned so much from her. I would love to work with her again. 🙂

Links: https://editor.p5js.org/tong.lu2002/sketches/hzsI-Nh82



Team Member: Zoey Lu, Sara Hosseini

Links: https://editor.p5js.org/tong.lu2002/sketches/hzsI-Nh82

The main goal of this project is to have multiple scenes of audio visualization and to use the amplitude to control the elements in each scene. The .js file comprises four main components: scene-change functions, scene-drawing functions, music amplitude functions, and other helper functions (e.g., fullscreen). We used the AnimationWithScenes file from the lecture as the framework, adapted attributes from AudioPlaybackWithAmplitude into it, and then added scenes into the framework. The first part is the global variables, values that must not be reset. The second part is the scene-change functions, where all the scenes are organized into drawSceneName() functions. The third part is the preload function, where the song, shader, and font are loaded before the sketch starts. The fourth part is the setup function, where variables and arrays are given stable initial values so they can be used in later calculations without being reset. The fifth part is the draw function; besides assigning dynamic values to the variables, it contains an 80-second loop that displays the scenes. The last part is where all the scene-drawing functions are located.
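The 80-second loop that picks which scene to draw can be sketched in plain JavaScript like this. The scene count (four) and equal 20-second durations are illustrative assumptions, not the project's exact values:

```javascript
// Sketch of the scene-switching idea: elapsed time is wrapped into an
// 80-second cycle, and the cycle is divided evenly among the scenes.
const LOOP_SECONDS = 80;
const SCENE_COUNT = 4; // assumed; the real sketch may differ

function sceneForTime(elapsedSeconds) {
  const t = elapsedSeconds % LOOP_SECONDS; // wrap into the 80-sec loop
  return Math.floor(t / (LOOP_SECONDS / SCENE_COUNT)); // index 0..3
}
```

In a p5.js draw() function, the returned index would select which drawSceneName() function to call each frame.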

We started with interesting examples from the p5.js reference for each scene, such as particles (https://p5js.org/examples/simulate-particles.html) and analyze (https://p5js.org/reference/#/p5.FFT/analyze). Then we built each scene in a separate sketch file, since that is easier to test than the whole scene-change framework. When we got a satisfying result, we imported the scene back into the main framework. During this process, we found that our code could be simplified further, which was very helpful and made the entire structure organized and clear. Then we made sure the scene-change function was working correctly and altered a few numbers to make the final result fit into the framework. We made a lot of creative decisions along the way. The first draft of the scenes was very basic; as we got more comfortable with p5.js, we started to add more elements to them. For example, scene2 (the third scene) originally only had sound waves. As we were testing it in the main framework, we began to add the dots and the cube into the functions.

The most challenging part had to be placing the elements properly on the canvas. We spent a lot of time debugging and trying different numbers. Testing the scenes and the shaders also required a lot of time as our code became lengthy; one small change to a variable could break the entire sketch. We felt a sense of accomplishment when we saw the shaders we made in The Book of Shaders editor applied to the scenes, and it gave us a lot of confidence to keep exploring new possibilities. There are still unsolved problems, such as the canvas currently only being able to switch between fullscreen and full window (windowWidth × windowHeight), rather than between fullscreen and a fixed size (e.g., 800×600). Overall, we really like the final result.

For this midterm project, I worked together with Sara Hosseini. She had a wonderful idea about audio visualization and chose the song Frozen – Madonna vs. Sickick. We found that we had similar interests and goals for this project, so we decided to collaborate. I learned a little bit of JavaScript last semester, so I took the job of building the scenes and shaders and then integrating them into the framework. For the first scene, I used the particle example from p5.js to imitate a galaxy, and I made a gradient shader to simulate the shift in light. For the second scene, I used lyrics from the song combined with falling particles as snow; sadly, the text cannot be affected by the shader, only the color of the snow can be changed. For the third scene, I used the resetShader() function to isolate an object from the sound spectrum, and I added randomly generated dots whenever the amplitude is higher than 0.20. The last scene is a simple audio visualization, yet it is interesting to see the sound waves on the canvas. It is much more difficult to make shaders in p5.js, since they are no longer displayed on a large screen but on a small element (e.g., a rect or circle). Sara gave me a lot of motivation and inspiration. We worked really well together along the way, and I hope we can work together again in future assignments.
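The amplitude-threshold idea in the third scene can be sketched as a small helper. The 0.20 threshold comes from the description above; the dot count and scaling factor are illustrative assumptions:

```javascript
// Dots are only generated when the measured amplitude exceeds 0.20
// (in p5.js, the level would come from p5.Amplitude's getLevel()).
const AMPLITUDE_THRESHOLD = 0.2;

function dotsToSpawn(level) {
  if (level <= AMPLITUDE_THRESHOLD) return 0;
  // Scale the excess amplitude into a small dot count (assumed factor).
  return Math.ceil((level - AMPLITUDE_THRESHOLD) * 50);
}
```

Each frame, the sketch could call this with the current level and draw that many randomly placed dots.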

Midterm assignment

Link to our animation: Here


Group member: Doris GAO, Aimee CHEN

This is a group project with Aimee Chen. We developed and edited a shader based on our previous weekly assignment. We wanted to create a shader animation with color changes, so, in a nutshell, we animated the shader based on the Color Harmony theme from weekly homework 4. First of all, inspired by the films of John Whitney, we wanted the shader animations to have some connection to one another instead of being separate. So we chose color as the connection between them and applied different shapes to make the animation interesting.

For this project, although we were building on weekly homework 4, we integrated what we had already learned, and by combining that knowledge we came up with the pattern we wanted. For example, I defined a specific color set; instead of keeping it static, I used "i += 1" to control the color-changing rule, so that the color blocks can move or switch at a specific frequency. I also used the professor's noise example to create the color beat and to create waves. These are the two methods I used most often to create these shaders, but there are several more useful ones, such as distance(), mix(), mod(), floor(), trigonometric functions, and so on. While the project was still going on, we had the honor of communicating with Professor Jeremy and learned to use if statements to set conditions for parameters. For example, in our last animation, we used an if statement on u_time to control the timing and sequence of the animation playback. After the good part, it was time to face the hard part: when using fragment coordinates, there were some texture errors, but after switching to texture coordinates with Jeremy's help, the problems were solved.
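The "i += 1" cycling rule can be illustrated with a plain-JavaScript sketch: an index advances at some frequency and wraps over a small palette with modulo. The palette values here are placeholders, not the colors actually picked from Photoshop:

```javascript
// Placeholder palette; in the shader these would be vec3 color constants.
const palette = ["#2b2d42", "#ef233c", "#edf2f4", "#8d99ae"];

// Wrap any step count back into the palette, like mod() in GLSL.
function colorAt(step) {
  return palette[step % palette.length];
}

let i = 0;
function nextColor() {
  const c = colorAt(i);
  i += 1; // advance the cycle each frame (or on each beat)
  return c;
}
```

Driving `i` from time (or u_time) rather than incrementing per call gives the same effect with a controllable frequency.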

Overall, our final product wasn't far from what we expected, and the color set we picked out of Photoshop suited it just right. In this project, I finished the shaders in the Book of Shaders editor and improved them after discussion with Aimee. We went from a simple animation change at the beginning to a more complex one later, so this animation also reflects my growth. Thanks to the samples and lectures provided by the professor, I learned so many new functions, and it was interesting to combine the new functions with old knowledge or to adjust the parameters.

Week 7 (3D & Vertex Shaders)





I attempted to create 3D objects and make them move in P5.js in my midterm project, but this week's material gave me a better understanding of some of the concepts. For example, in the second experiment, I tried to use bump mapping to change the surface of the object, which is related to the exposure, that is, the lightness and darkness of the image; I also applied the same image to a sphere in C4D to understand the concept better.

Midterm Assignment

This is a collaborative work with Yue Lin. Our shader uses 3D objects, and I learned a lot about how 3D shaders work in P5.js in this week's study. To emphasize the emotions in different relationships, the shaders are varied and, ideally, interactive with the viewer. The two main organs are the hand (touch) and the ear (sound), and we simulate these two sensations through mouse and microphone interaction. The object's size changes with the microphone volume level and the mouse movement. Switching the scene to another object while rotating the current one is a scene-switching method I often used when creating 3D animations, and I use it here as well. We shifted the noise texture coordinates to make the geometry's surface appear noisy and set the object's colour in the frag file. This is something I had not experienced before with P5.js.
We explored feasible shaders and features together; I was mainly responsible for the colour and texture changes, and Yue Lin was mainly responsible for the interaction and for organizing the code. I am still trying to add more shaders in p5, and we will try to add more content to enrich the visual experience if feasible.
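Mapping the microphone level to object size can be sketched with a linear map, in the spirit of p5's map(). The input and output ranges here are illustrative assumptions, not the project's tuned values:

```javascript
// Linear remap from [inMin, inMax] to [outMin, outMax], like p5's map().
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Mic levels roughly span 0..1; scale the object between 0.5x and 2x
// (assumed range for illustration).
function objectScale(micLevel) {
  return mapRange(micLevel, 0, 1, 0.5, 2.0);
}
```

In a p5.js sketch, `micLevel` would come from p5.AudioIn's getLevel(), and the result would feed scale() before drawing the geometry.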

Link: https://editor.p5js.org/panjiaximj/full/8KQ4SgoxW

Joycelyn Wong – week 7 hw



While building 3D geometries in p5.js, I tried to create a sense of space by scaling up and rotating a square to mimic a room and add a change of perspective. I also placed a square surround around two floating, rotating spheres and a torus to create an eerie atmosphere.



In bump mapping, I explored displacement with the step function, which created a laggy, pixelated distortion on my image. The distortion only appears when the slider value is between 0 and 1. For the image itself, I turned up the exposure and vibrance to make the colors more defined and distinct, maximizing the bump mapping effect.
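GLSL's step() is easy to restate in plain JavaScript, and doing so shows why step-based displacement looks laggy and pixelated: it snaps a smooth input to hard 0/1 jumps instead of interpolating. The 10-pixel offset below is an illustrative assumption:

```javascript
// GLSL's step(edge, x): 0.0 below the edge, 1.0 at or above it.
function step(edge, x) {
  return x < edge ? 0.0 : 1.0;
}

// Displace a coordinate only where the slider value crosses the edge;
// there is no in-between state, hence the abrupt, blocky distortion.
function displace(coord, slider) {
  return coord + step(0.5, slider) * 10.0;
}
```

Swapping step() for smoothstep() would interpolate across the edge and soften the effect.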

Joycelyn Wong – Midterm assignment

Link: https://editor.p5js.org/jowongg/full/pfnocqp2f

For the midterm assignment, I decided to develop multiple mirror shaders based on the shader knowledge I have learned in class, including HSB color, time, shape functions, grids, layers, and color theory.

In the first mirror shader, I incorporated the negative color and the HSB circular wave from the week 2 homework to create a pinkish pixelated effect on the live video. Animation was also applied to gradually reduce the size of the cells to establish the pixelating effect.


In the second mirror shader, I wanted an effect like frosted glass. I experimented with the distance function to distort the live video with a vague circular effect. There are two layers in the shader: one is a grid of the distorted live video, and the other is a full-size distorted live video whose color is divided by the grid layer's. It surprisingly created a bubbly cyan glass effect. Animation was applied to reduce the size of the cells so that a clear reflection of the live video is gradually revealed.


In the third mirror shader, I applied the light–dark contrast theory from week 4. I used time and fract() to allow random rows or cells to take the darkest brightness in the grid. The grid is formed by the negative-color live video with a pair of tertiary colors – dark purple and bright pink. I added a full-size live video in the background for some visual variety.
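The time-plus-fract() trick for darkening random rows can be sketched in plain JavaScript. The hash constants are the common GLSL pseudo-random idiom, and the 0.2 cutoff is an illustrative assumption:

```javascript
// GLSL's fract(x): the fractional part of x.
function fract(x) {
  return x - Math.floor(x);
}

// Derive a pseudo-random value per row from the row index and the time
// uniform; rows whose value falls under the cutoff go fully dark.
// Returns 0.0 (darkest) for selected rows, 1.0 (unchanged) otherwise.
function rowBrightness(row, time) {
  const r = fract(Math.sin(row * 12.9898 + time) * 43758.5453);
  return r < 0.2 ? 0.0 : 1.0;
}
```

Because the value depends on time, the set of darkened rows keeps shifting as the shader animates.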


In the fourth mirror shader, I used snoise to distort the live video. I added an animated rectangle in the middle to create interaction between the static animation and the live video.


Throughout the assignment, there were obstacles, such as not knowing how to work with color in p5.js and code that did not work as expected. Fortunately, I was able to solve them during the midterm workshop period and develop the unexpected code into surprises.

Midterm: Absolute incineration and deceneration in interference land.


LINK TO VIDEO:  https://youtu.be/g5R802LlbLA

TOUCHDESIGNER FILE: https://github.com/AtharvaJ3110/Shader-art/blob/main/triangle%20field%20shader.7.toe

GLSL CODE: https://github.com/AtharvaJ3110/Shader-art/blob/main/Absolute%20incineration%20and%20deceneration%20in%20interference%20land.%20GLSL%20code

‘Absolute incineration and deceneration in interference land.’ is a music video / real-time digital mirror art piece in which I further explore distance fields, which I began exploring in week three.

In this piece, I feed values from the camera input, which is a vec4, into the distance field when I draw the field, creating a cool glitchy moiré effect that displays some interesting fluid-dynamics-like patterns resembling burning fire.
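The core idea of perturbing a distance field with an external value can be sketched in a few lines. Here `cam` stands in for one channel of the camera's vec4, and the circle radius and scaling factor are illustrative assumptions:

```javascript
// Signed distance from point (x, y) to a circle of the given radius:
// negative inside, zero on the boundary, positive outside.
function circleSDF(x, y, cx, cy, radius) {
  return Math.hypot(x - cx, y - cy) - radius;
}

// Offsetting the distance by the camera value warps the field's contour
// lines per pixel, which is what produces moire-like interference when
// the field is rendered as bands or rings.
function perturbedField(x, y, cam) {
  return circleSDF(x, y, 0, 0, 0.5) + cam * 0.25;
}
```

In the GLSL version, the same offset would be applied per fragment with the live camera texture, so the interference pattern moves with the video.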


I learnt a lot about the integration of GLSL into TouchDesigner and distance fields.

I would love to turn this into an art installation in the future, or use the shader as real-time visuals for concerts.

Thank you for reading!



The original idea I had for my shader was to code a pattern of shapes in the p5.js editor's sketch.js file and then have the shader projected onto the shapes coded in the sketch file. But after many experiments and hours spent trying to clip the shader's frag and vert files onto the JavaScript file, I realized that the way I had coded the file would never work for what I wanted. So I decided to use the Animation file from the midterm templates to create a simpler shader that already works with the ellipses and rectangles.

As I originally wanted the shader to clip onto the sine and cosine lines I created, I later realized that I couldn't have both an animated curve and my shader: when I activated the shader in the code, the animation would freeze. My original code is inspired by sound waves, more specifically the visual aspect of sound curves.

The final version of my shader is pretty plain, as it was only made once I decided I had spent too much time on my original code, which wouldn't work with my planned shaders. In the end, my final shader uses basic shapes and some mix() functions to create the gradients within the shapes.
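GLSL's mix() is just linear interpolation, and blending two colors channel by channel is what produces gradients like the ones described above. The endpoint colors below are illustrative placeholders:

```javascript
// GLSL's mix(a, b, t): linear interpolation between a and b.
function mix(a, b, t) {
  return a * (1 - t) + b * t;
}

// Blend two RGB colors channel-by-channel; t = 0 gives `from`,
// t = 1 gives `to`, and values in between form the gradient.
function gradientColor(t) {
  const from = [255, 0, 128]; // placeholder pink
  const to = [0, 128, 255];   // placeholder blue
  return from.map((c, i) => mix(c, to[i], t));
}
```

In the fragment shader, `t` would typically come from a texture coordinate, so the blend sweeps across the shape.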

Original Shader 

Final Assignment Shader