The difference between live coding and traditional programming is funny. Essentially they are the same: predefined functions, known variables, and different ways of manipulating those variables that change how things move from input to output. The difference lies in how malleable that output is; the chance to completely change a function's output with a single keyboard symbol, producing a string of outputs the function's creator never exactly planned, is what makes it even more interesting.

“Rejected the use of prewritten code and structures” in favor of “acting in the moment, responding to context… developing a structure as we work, continually creating and resolving tension”

ALGOBABEZ (Shelly Knotts and Joanne Armitage)

Live coding shifts the definition of expected, planned output toward a different form of adapting to the audience's response. Not a coded reaction, not an expected reaction, but a way of letting a human determine the next course of action, combining the viewer's reaction with the artist's to create an instant change that belongs to both. It's like the authors said,

Yet, for all that computing neglects—those bodies hunched over keyboards and mice, at their desk-and-chair sets in the offices of the world—the live coding performer is unavoidably embodied (“made flesh”)

Live coding brings the person manipulating the code to life in a way traditional coding does not.

Something else that stood out to me in this chapter is the comparison of live coding to a performance. If we think about dance, then traditional dance is the same as traditional programming, while live coding is more contemporary: an unpracticed, free-form dance performance with code. The dancers know the moves, the programmers know the code, but neither knows the trajectory of the performance.

liveness as a connecting principle for exploring the relation between live coding (performing with programming languages) and live art (performing with actions).

In what sense is live coding “live”? Live coding is live in that it is an active conversation between the coder, the machine, the audience, and everything else that surrounds and permeates the setting at hand; live coding is live in that it proceeds in a constant state of spontaneity and is characterized by its resistance to being defined and boxed. Live coding is an amorphous creature, very much alive in that it is never stagnant and craves change. It materializes not just as lines of code or musical notes; live coding is a representation of the relationships that manifest between everything that is present and around the performance itself.

As described in the excerpt, live coding by nature resists a singular definition—and in turn, a singular liveness. It thus demands a nuanced understanding that takes an array of perspectives into account: “The interdisciplinary nature of live coding . . . requires that its very liveness be understood from more than one epistemological and ontological perspective” (159). Live coding is unique in that it transcends spatiotemporal conventions. A performance may exist and be experienced in the present, but certain sequences and samples may be prescripted and prerecorded. Then there is the concept of the undetermined future, which live coding embraces in all its uncertainty. Live coding also creates a platform upon which the physical and digital coexist; it lives in the in-between. 

What emerges from a live coding performance is vastly different for each individual that experiences it, whether they be the live coder on the stage or a member of the audience. Live coding is thus an indeterminate manticore that assumes a different form to all and can only be defined through this ambiguity. Live coding is alive, thriving and pulsating in the bricolage of the predetermined and the yet-unborn.

Performance scholars Matthew Reason and Anja Mølle Lindelof's underlining of the various intersections that live coding exists between helped me internalize the multiplicities inherent in this medium we have been practicing. Reason and Lindelof situate live coding between music studies and media studies, and extend this into the various definitions of the word “live” in each field. They note how music studies views liveness in terms of recording, while media studies views it with a focus on transmission. Hence, the exploration of “liveness” we have been doing over this semester is a nascent concept, with spontaneity, improvisation, and performance elements borrowed from multiple disciplines.

The paper later highlights the somatic and corporeal nature of live coding as well. Viewing our performances through a kinaesthetic lens, the authors highlight our physical interaction with the machine: “a sensorimotor movement vocabulary of micro adjustments, changes, and shifts performed in the frantic keystrokes, in the shuttling of the cursor around the screen, in the flash points of activation and execution”. This brought me back to our discussions about adding a performance element to our final showcase. While traditional instruments like the piano or violin lend a formality to music performances akin to an orchestra, live coding can perhaps be seen in the same paradigm as DJing. Many DJs construct a performance around otherwise kinesthetically simple actions, like pushing faders and turning knobs, successfully transforming what would be a mundane performance into stadium-selling events.

In the chapter on Live Coding’s Liveness(es), the author discusses the concept that “for some live coders, nothing is saved, recorded, or archived in support of future replaying: the performance both begins and ends with the blank screen/slate.” This idea prompted me to consider the role of archiving in live coding, especially given its spontaneous real-time and potentially ephemeral nature.

Recently, I came across an essay in the book “Electronic Superhighway: From Experiments in Art and Technology to Art after the Internet” that discussed performative archiving. The author critiques the notion of ephemerality as an excuse for relinquishing control and suggests that “materiality” is a more fitting term, one that evolves over time and through performance. The essay also reflects on how materiality was represented in museums, particularly during the 1990s and 2000s, noting the practice of listing the materials used in net artworks on wall labels, treating digital materials as physical substances.

I found it fascinating that even something as live and real-time as live coding can evoke a sense of materiality. This prompted me to think about the complexities of defining and preserving works in digital realms or those of a performative nature. When we view live coding as an art form, can it truly be considered as such without being archived? Is it the very act of documenting it that solidifies its significance within the discourse?

Video Demo

Sound Composition

Both the sound and visual design were built around the main idea of our project — a horror-esque live coding performance. It was definitely unconventional for a live coding performance, but the idea stuck, as our group was more excited about the audience's reactions than about the performance itself.

We borrowed many sound elements from horror movie soundtracks — ambient noise, ominous drums, and banging doors. The performance starts with some ambient noise (an ‘ab’ sound played with a tweaked class example), thumping sounds (jvbass), and crows (crows) cawing in the background. We then transition to a few different main ‘melodies’: a haunting vibraphone sound (supervibe), footsteps (a custom sample), and a few other samples that sounded unnerving to us.
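As a rough sketch of how these opening layers might be stacked in TidalCycles (illustrative patterns, not our exact performance code):

-- illustrative sketch of the opening layers
d1 $ slow 4 $ s "ab" # room 0.6 # gain 0.8                 -- ambient bed
d2 $ s "jvbass*2" # lpf 400                                -- low thumps
d3 $ sometimesBy 0.3 (# speed 0.9) $ s "crows" # pan rand  -- crows drifting around the stereo field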

The buildup happens through our custom ‘footsteps’ sample. We speed up the footsteps as we fade out the other sound elements, to make it sound like someone is running away from something. At its peak, we transition to the jumpscare, before fading to a scene that's reminiscent of TV static. The sound for the TV static scene is our interpretation of TV white noise through TidalCycles, and we made good use of the long-sample examples that were provided to us in class:
d2 $ jux rev $ slow 8 $ striateBy 32 (1/4) $ s "sheffield" # begin 0.1 # end 0.3 # lpf 12000 # room 0.2 # cut 1
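The footsteps buildup itself was mostly a matter of re-evaluating with higher densities; a hedged sketch of the idea (the sample name and numbers are illustrative):

-- buildup sketch: each re-evaluation packs the steps tighter as the other layers fade out
d4 $ s "footsteps*4" # gain 1.1
d4 $ s "footsteps*8" # gain 1.1
d4 $ fast 2 $ s "footsteps*8" # gain 1.1  -- the peak, right before the jumpscare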

Visual Composition

The visuals are based on an image of a hallway with a door at the end. The main idea was to zoom in toward the door as the jumpscare approaches. We intensified the image with Hydra by adding dark spots, saturating the colors, and creating a flickering effect as the performance progressed. We end on a shot of the zoomed-in door before transitioning to the jumpscare.
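A minimal Hydra sketch of this treatment, assuming the hallway image has already been loaded into s0 (the values are illustrative, not the ones we performed with):

// hallway sketch: slow zoom, oversaturation, dark moving spots, occasional flicker
src(s0)
  .scale(({time}) => 1 + time * 0.01)                  // creep toward the door
  .saturate(1.5)                                       // intensify the colors
  .mult(osc(4, 0.1, 1).thresh(0.5), 0.3)               // dark drifting patches
  .brightness(() => (Math.random() > 0.9 ? -0.3 : 0))  // occasional flicker
  .out()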

Getting the jumpscare video to work (please don't visit the video.src unprepared) took us a surprising amount of time, as we struggled to find a place to host the video. Uploading it to Google Drive didn't work because of CORS errors, and uploading to YouTube didn't work either. We finally found a host that worked: Imgur.

// create a video element and route it into Hydra once it can play through
var video = document.createElement('video');
video.loop = false;
video.crossOrigin = "anonymous"; // required so Hydra can read pixels from another origin
video.autoplay = true;
video.src = 'https://i.imgur.com/uFj91DQ.mp4';
video.oncanplaythrough = function() {
  s0.init({ src: video, dynamic: true }); // dynamic: true keeps the texture updating each frame
  src(s0).out();
};

After the jumpscare, we transition to TV static, which reminded us of what you might see after being caught by a monster in a video game, or on a TV with no signal.


Work Division

Jun and Aadhar primarily worked on TidalCycles while Maya worked on Hydra, the same arrangement we had for the drum circle project last week. However, we found ourselves giving a lot more feedback on each other's parts instead of working on them in isolation, and we think that helped create a more cohesive project. A big part of our project was repeatedly practicing and coordinating our synchronous changes for the jumpscare: TidalCycles had to trigger the jumpscare sound at the same moment Hydra triggered the jumpscare video for the performance to feel coherent. We decided to use Hydra's ‘scale’ as an internal indicator among ourselves to signal when to get ready. When the scale of the room reaches 32, both of us trigger the jumpscare sequence.
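On the Hydra side, the cue looked roughly like stepping up the zoom by re-evaluating with larger scale values (a sketch, assuming the hallway image is in s0):

// each re-evaluation doubles the zoom: 8 -> 16 -> 32
src(s0).scale(16).out()
src(s0).scale(32).out() // at 32, both performers fire the jumpscare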

Hydra (Nicholas):

We decided to give our drum circle project a floaty underwater-esque energy. When messing around with video inputs, I found a GIF of jellyfish swimming in the Hydra docs as a starting point.

One of the main things I hated about this GIF was that it didn't loop perfectly. Having it as the focal point of the visual part of the performance didn't look good. As a result, I decided to put it inside a wave of color derived from an oscillator with several kinds of modulation applied.

Having the colors come over the screen in a wave made the looping more seamless and created the opportunity for us to add something in the background. Buzz Lightyear was the perfect candidate: a perfectly looping, ominous, and slightly funny GIF.

Placing the sad Buzz Lightyear GIF below the flood of colors adds to Buzz's expression of hopelessness. I found this a bit funny, and by effectively combining two different sources to manipulate, it also gave us more opportunities to mess around and improvise during the performance.
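A hedged sketch of the layering, assuming the jellyfish GIF is already loaded into s0 and the Buzz GIF into s1 (the oscillator and modulation values are illustrative):

// Buzz in the background, jellyfish keyed over him, a color wave washing over the loop seam
src(s1)
  .layer(src(s0).luma(0.1))                                   // key out the jellyfish GIF's dark background
  .blend(osc(8, 0.05, 1.4).modulateScale(noise(2), 0.4), 0.5) // modulated wave of color
  .out()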

TidalCycles (Ian, Chenxuan, Bato):

Rather than splitting our roles into strict parts, we decided to freely work on the audio together and see what came out of it. This unrestrained kind of jamming had yielded satisfactory results for our previous meetings, and we thought it would be best to run with what had been effective for us. That’s the spirit of live coding, after all—spontaneous inspiration!

We started by laying out a simple combination of chords alternating between two patterns using the supersaw synth, which sounded quite airy and gave off house music vibes. Ian added a nice panning effect to it, which gave it a sense of dimension and made it sound more dynamic. On top of this, we added a catchy melody that fit with the chords using the “arpy” sample, which was then made glitchy with the jux (striate) function.
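A minimal TidalCycles sketch of this layer (the voicings and melody are illustrative, not our exact notes):

-- two alternating supersaw chords with slow panning, plus a glitchy arpy melody
d1 $ slow 2 $ n "<[0,4,7] [5,9,12]>" # s "supersaw" # pan (slow 4 sine) # room 0.4
d2 $ jux (striate 8) $ n "0 4 7 12 9 7 4 2" # s "arpy" # gain 0.9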

We also experimented with a few drum/bass patterns that went along with the melodies. An attempt that went particularly well saw us pair the "drum" samples with a random number generator and a low-pass filter driven by a sine oscillator. This combination allowed for constant variation in both the sounds themselves and the way they sat in the mix, which kept things adequately erratic and exciting.
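Roughly, the idea looked like this (a sketch with illustrative values):

-- random drum hits through a sine-swept low-pass filter
d3 $ s "drum*8" # n (irand 8) # lpf (range 200 2000 $ slow 4 sine)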

We tried to match the atmosphere of the audio with the visuals. This meant choosing instruments that fit a floatier vibe—the supersaw synth was a surprisingly decent choice—and giving the sound more of that underwater feel by adding some reverb (the "room" effect) and spatial dimension (the "pan" effect). The drum line with the oscillating lpf values also lent the overall audio a fluid, mesmerizing quality that we felt went along with the aquatic theme.

SOUND COMPOSITION

For the sound composition, we used a chopped sample of “Magnetic” by ILLIT as the main reference. Inspired by the arpeggio in its intro, we sliced different parts of this sample to create a new composition with original rhythms. The music starts with a reverbed piano sound, using TidalCycles functions like jux rev and off, and slowly builds up by adding elements like drums and hi-hats. The main melody is composed of different ‘beep’ sounds sliced from different parts of the arpeggio sample, creating a glitchier aesthetic. We mostly focused on building a combination of sounds that work well together by carefully curating samples and trying different techniques for handling the long sample.
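A hedged sketch of that long-sample handling, assuming the chop is loaded as a custom sample called "magnetic" (the slice indices are illustrative):

-- reverbed piano intro, then glitchy beeps sliced out of the arpeggio
d1 $ jux rev $ slice 8 "0 2 4 6" $ s "magnetic" # room 0.5
d2 $ off 0.25 (# speed 2) $ slice 16 "3 7 11 15" $ s "magnetic" # gain 0.9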

VISUALS COMPOSITION

For the visuals, we started with an image, as we did for our first group project. We then worked on adding different effects that transformed our piece from a red building into a variety of shapes and effects.

We ran into some struggles with image loading, and thanks to the professor, we got this code that loads the image onto the screen:

image = document.createElement('img')
image.crossOrigin = "anonymous" // required so Hydra can read the pixels cross-origin
image.src = "https://blog.livecoding.nyuadim.com/wp-content/uploads/photo-1711873317754-11f6de89f7ae-scaled.jpg"
loaded = () => {
   s0.init({ src: image }) // feed the image into Hydra's s0 source
   src(s0).out()
   console.log("Image loaded");
}
// the image may already be cached, so handle both cases
if (image.complete) {
   loaded()
} else {
   image.addEventListener('load', loaded)
}

Don't forget to let Flok load before running anything! You can check from the browser console.

We started by using .modulatePixelate(osc(10), 200).saturate(0)

where modulatePixelate shows interesting effects and saturate(0) removes the colors, which we reintroduce later with colorama by choosing colors and gradients different from the ones in the image. We also use mult and blend to add styles, remove parts of the screen, and vary the effect, repeating this for our main audio interaction alongside other effects.
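Put together, the chain looks roughly like this (a sketch, assuming the building image is in s0; the colorama, mult, and blend values are illustrative):

// pixel distortion, desaturation, then our own colors and masking
src(s0)
  .modulatePixelate(osc(10), 200) // oscillator-driven pixel distortion
  .saturate(0)                    // strip the original colors
  .colorama(0.3)                  // reintroduce shifting gradients
  .mult(shape(4, 0.6), 0.8)       // remove parts of the screen
  .blend(o0, 0.2)                 // soften transitions with feedback
  .out()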

We’ve played with gradually reducing the numbers in the pixelate function to simplify our visuals. Our plan is to incorporate it as the final function, making it easier for the next group to take over from us.

WORK DIVISION

The work divided naturally into audio and visuals when we first met, with Fatema and Marta working on visuals and Jeongin working on audio; we found that that's where our interests lay and how it worked best for our group. Marta and I worked on different visuals and met to try various effects and see different possibilities for how we could develop different styles. Then we worked with Jeongin to align our effects, adding different interactions and ccv audio reactivity to match the visuals to the audio and form a set style for our live coding piece.