
Mosaic is an application designed by Emanuele Mazza that combines Live Coding with Visual Programming to streamline the creative process without sacrificing artistic vision. Built on openFrameworks, the open source creative coding toolkit founded by Zachary Lieberman, Theo Watson, and Arturo Castro, it supports a wide variety of programming languages, including C++, GLSL, Python, Lua, and Bash. As seen in my patch above, it works by connecting blocks of sonic and visual data, making the data flow easy to follow. Because Mosaic allows for complex possibilities while remaining accessible, it suits both beginners and advanced users who want to create a wide range of real-time audio-visual compositions.
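For anyone who has never patched before, the block idea is easy to picture in code. Here is a toy C++ illustration of my own (not Mosaic's actual internals): each object is a block that transforms data, and a patch is just a chain of connected blocks.

```cpp
#include <functional>
#include <iostream>
#include <vector>

// Toy data-flow patch: each "object" transforms a buffer of samples,
// and the patch is an ordered chain of connected objects.
using Block = std::function<std::vector<float>(std::vector<float>)>;

int main() {
    std::vector<Block> patch = {
        // "signal" object: stand in for a sound or video source
        [](std::vector<float> in) { for (auto& s : in) s = 0.5f; return in; },
        // "gain" object: scale whatever arrives at its input
        [](std::vector<float> in) { for (auto& s : in) s *= 2.0f; return in; },
    };

    std::vector<float> buffer(4, 0.0f);
    for (auto& block : patch) buffer = block(buffer);  // data flows block to block

    for (float s : buffer) std::cout << s << ' ';      // prints: 1 1 1 1
    std::cout << '\n';
}
```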

The diagram above outlines how Mosaic hybridizes the Live Coding and Visual Programming paradigms to reinforce feedback and promote human-machine interaction. Its page, linked here, quotes Henri Bergson: "The eyes see only what the mind is prepared to comprehend." As someone whose comprehension of programming is limited, I am grateful for applications like Mosaic that let me create projects I can actually understand.
Having previously worked with Max/MSP, I found Mosaic's interface buggy but still intuitive and easy to use. For my project, I wanted to create audio-reactive visuals that convey feelings of nostalgia and the loss of memory. I found an old video of my father recording a song he had made for my brother and me. In Mosaic, I navigated to 'Objects' and then 'Texture' to find all the nodes that could manipulate and export video.
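Under the hood, a texture object like the video player corresponds to only a few lines of the openFrameworks C++ that Mosaic is built on. This is a sketch of the general idea with an assumed file name, not Mosaic's actual source:

```cpp
#include "ofMain.h"

// Roughly what a "video player" texture object does: decode frames
// and hand them on as a texture to be drawn or further processed.
class ofApp : public ofBaseApp {
public:
    ofVideoPlayer player;

    void setup() override {
        player.load("dad_singing.mp4");  // hypothetical file name
        player.play();
    }
    void update() override { player.update(); }  // advance to the next frame
    void draw() override {
        player.draw(0, 0, ofGetWidth(), ofGetHeight());
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```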

As seen above, I juggled various concepts and imported multiple videos to explore how Mosaic can warp and blend textures in service of whatever concept I landed on. I really liked how the video grabber blended my live image from the MacBook camera with the video of my father singing, conveying how memories stay with us even as we change and grow. Because Mosaic can only play sound and video separately, I extracted the audio from the video using VLC media player and then focused on how I wanted to manipulate the audio to convey a sense of loss.
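For reference, that camera-and-video blend can be approximated outside the patcher in openFrameworks C++; the file name and the fixed mix amount here are my assumptions, and Mosaic's own blending object may work differently:

```cpp
#include "ofMain.h"

// Blend a live camera feed with a video file, approximating
// the video grabber + video player patching inside Mosaic.
class ofApp : public ofBaseApp {
public:
    ofVideoGrabber camera;
    ofVideoPlayer  memory;
    float mix = 0.5f;  // 0 = all camera, 1 = all video (assumed slider)

    void setup() override {
        camera.setup(1280, 720);         // MacBook camera
        memory.load("dad_singing.mp4");  // hypothetical file name
        memory.play();
    }
    void update() override {
        camera.update();
        memory.update();
    }
    void draw() override {
        ofEnableAlphaBlending();
        ofSetColor(255, 255, 255, (int)(255 * (1.0f - mix)));
        camera.draw(0, 0, ofGetWidth(), ofGetHeight());
        ofSetColor(255, 255, 255, (int)(255 * mix));
        memory.draw(0, 0, ofGetWidth(), ofGetHeight());
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```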

As seen above, I used the compressor and bit crusher objects to add distortion to the sound file so that I could lower or amplify the distortion in real time by lowering the threshold and moving the slider. The entire time, I was reflecting on how, if I were using a platform that only allowed written code, like TidalCycles, I would have to write out these effects manually; in Mosaic, I could drag and drop the objects I wanted and simply connect them to control the audio the way I wanted to.
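To make that comparison concrete, this is roughly what a text-only workflow would demand for just the bit crusher: a standard quantize-and-downsample sketch in plain C++, not Mosaic's exact object. The bits and downsample parameters stand in for the sliders I was dragging.

```cpp
#include <cmath>
#include <cstdio>

// Classic bitcrusher: quantize amplitude to 2^bits steps and
// hold samples to fake a lower sample rate.
void bitcrush(float* buffer, int n, int bits, int downsample) {
    float steps = std::pow(2.0f, (float)bits);  // amplitude resolution
    float held = 0.0f;
    for (int i = 0; i < n; ++i) {
        if (i % downsample == 0) {              // sample-and-hold rate reduction
            held = std::round(buffer[i] * steps) / steps;
        }
        buffer[i] = held;
    }
}

int main() {
    float signal[8] = {0.10f, 0.30f, 0.50f, 0.70f, 0.60f, 0.40f, 0.20f, 0.00f};
    bitcrush(signal, 8, 3, 2);                  // 3-bit depth, half the rate
    for (float s : signal) std::printf("%.3f ", s);
    std::printf("\n");
}
```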

The most difficult part of my project was figuring out how to connect the visual components with the audio so that I could manipulate the blended video of myself and my father as I increased or decreased the distortion. I really liked the audio analyzer object because, as seen by its yellow input and green output, it allowed me to do just that, and as a bonus, it manipulated the video with whatever sound was playing in real time, so I could speak into the Mac microphone and the video would distort even further. This strengthened the concept of my project: I could speak about memory loss and watch the video degrade in response.

The audio analyzer object converted the sound into numerical data that could then be converted back into visual data, and I blended visual distortion with the video of my father by controlling the sliders, as seen above. I really loved how accessible the controls were, allowing me to manipulate the video and sound in real time according to the demands of the performance.
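Put together, the whole sound-to-number-to-visual chain fits in one small openFrameworks sketch. This is my reconstruction under assumed file names and scaling values, not the patch itself: each microphone buffer is reduced to an RMS level, and that number jitters the video's draw position the way the analyzer's output drove my distortion controls.

```cpp
#include "ofMain.h"

// Sound -> number -> visual, in miniature: microphone RMS
// (the analyzer's "numerical data") warps how the video is drawn.
class ofApp : public ofBaseApp {
public:
    ofVideoPlayer memory;
    float rms = 0.0f;

    void setup() override {
        memory.load("dad_singing.mp4");  // hypothetical file name
        memory.play();

        ofSoundStreamSettings settings;  // listen to the MacBook microphone
        settings.setInListener(this);
        settings.sampleRate = 44100;
        settings.numInputChannels = 1;
        settings.numOutputChannels = 0;
        settings.bufferSize = 512;
        ofSoundStreamSetup(settings);
    }

    void audioIn(ofSoundBuffer& input) override {
        float sum = 0.0f;
        for (size_t i = 0; i < input.getNumFrames(); ++i) {
            sum += input[i] * input[i];
        }
        rms = std::sqrt(sum / input.getNumFrames());  // louder voice -> bigger number
    }

    void update() override { memory.update(); }

    void draw() override {
        // Louder speech -> larger jitter, standing in for heavier distortion.
        float amount = ofMap(rms, 0.0f, 0.3f, 0.0f, 40.0f, true);
        memory.draw(ofRandom(-amount, amount), ofRandom(-amount, amount),
                    ofGetWidth(), ofGetHeight());
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```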
The finalized video and audio components of the project can be viewed here and here respectively. By manipulating the video and audio live with my voice and the Mosaic controls, as seen in the footage, I was able to convey concepts like memory loss and nostalgia. I really loved the creative potential that Mosaic's Visual Programming offers, and I will definitely continue to use the application for personal projects in the future.