After reading this article, I watched a performance by Derek Bailey that combines live guitar with dance. The sound Bailey makes is completely different from the typical impression we have of the guitar: there are no delicate chords, and it feels more like a series of experiments on the instrument. Though it is not exactly a pleasure to listen to, I still appreciate how he combined guitar, percussion, and dance in a single performance and made everything out of nothing. Improvisational performances like this also happen in jazz sessions, where the players jam together, communicating with each other only through their instruments. Though they use traditional instruments, I think they have something in common with live coders: the enthusiasm for creating new patterns in performance and the willingness to make mistakes.

My impression of DJ performances used to be a performer standing behind a big DJ controller, mixing tracks and creating effects during the performance. However, things seem to be different nowadays: many DJs are actually playing preset audio and visuals, so the role they play is more like that of a conductor who lets the audience dance to the beats. It is interesting that computer programs are pulling people toward two extremes: completely repeatable music made from preset patterns, and completely random music made from random functions. Though performances with pre-recorded audio can still be exciting, I think the spirit of jamming should be celebrated, and that is what live coders are doing, just as Bailey did in his performance. The computer is the tool and the approach, but the spirit is what really matters.

This paper covers varied insights into topics including what “live” means in a live performance, the roles the performer and the laptop play in a set, and what improvisation looks like in a live coding performance. The part that really attracted me is the fifth chapter, in which the author interprets computer-based improvisation through old-school free improvisation, which is deeply rooted in the history of jazz. Jazz developed its own methods for unplanned performance, shaped by shared history and physical instruments. When transferred to a digital system, these methods must be adapted, for example by introducing uncertainty with random functions or by playing with different parameters to create unpredictable sound effects.
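To make that concrete, here is a minimal Hydra sketch (assuming the standard hydra-synth environment, where any parameter can be a function that is re-evaluated every frame): the fixed version replays identically every time, while a single Math.random() makes the output unrepeatable.

// fixed parameters: the visual loops exactly the same way every time
// osc(10, 0.1, 1).rotate(0.5).out(o0)

// randomized parameters: re-evaluated each frame, never the same twice
osc(() => 5 + Math.random() * 25, 0.1, 1)
  .rotate(() => Math.random() * 0.2)
  .out(o0)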

The chapter also brings up the idea of the computer as a partner rather than just a tool. In many jazz settings, each musician has a voice in the group’s shared sound. The author proposes that a laptop or other digital device can also “respond” in real time, acting like an extra player. This suggests that computer-based improvisation is not strictly about striking keyboards or entering code; in the NIME field, it can also mean performers manipulating external devices that adjust the code running on the computer. This reminds me of a possible application: controlling the patterns with EEG signals from both the performers and the audience, so that everyone can engage and create something from the continuous brain activity that reflects how they feel about the performance.
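As a rough sketch of that EEG idea (purely hypothetical: eegAlpha() below is a stand-in that synthesizes a slowly drifting 0-1 signal, not a real EEG API), a continuous brain-derived value could be mapped onto Hydra parameters the same way the cc[] values are used later in this post:

// hypothetical stand-in for a normalized EEG band-power stream (0-1);
// a real setup would read from an actual headset instead
let phase = 0;
const eegAlpha = () => {
  phase += 0.01;
  return 0.5 + 0.5 * Math.sin(phase);
};

// map the continuous signal onto the visuals, re-evaluated every frame
osc(() => 10 + eegAlpha() * 30, 0.1, 1)
  .modulate(noise(3), () => eegAlpha() * 0.3)
  .out(o0)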

Reading about “the instrumental impulse” in this paper reminded me of the psychological concept of perceptual narrowing: a cognitive phenomenon in which, for efficiency, humans develop schemas that help them make fast decisions but, in doing so, lose access to a broader range of possibilities. When I was studying surrealism, I learned that surrealists believed one way to return to authenticity was through automatism, letting the mind wander freely, uncensored. This mirrors the spirit of improvisation, where intuition precedes logic and the unknown is welcomed. I believe this applies to live coding as well; in a broader sense, improvisation becomes a tool not just for performance, but for rediscovering what expression and creativity can mean beyond established norms.

I was especially moved by the author’s framing of the computer as a romantic, expressive instrument—not just a cold rational machine. Just like traditional instruments, laptops in live coding are personalized by each performer’s unique techniques and approaches. In this way, every computer becomes a partner, emotionally bonded with the artist. This echoes how traditional musicians develop intimate relationships with their instruments, shaping their identity over time.

Lastly, I found the contrast between live coding and tools like Ableton or Radial insightful. While those tools prioritize accessibility and user-friendly design, live coding embraces the rawness of syntax and structural logic. From my perspective, neither is better or worse; they simply differ in intention. Live coding reflects the artist’s passion for exploring the pure beauty of the machine, representing a more experimental, deeply embodied relationship with technology, one that reimagines both the machine and the human in the act of creation.

Inspired by a p5.js sketch I had created during Intro to IM:

P5 Code:

let baseSize = 40;
// create a p5 instance, feed its canvas into Hydra as source s0,
// and hide the raw canvas so only the processed output is shown
let p5 = new P5();
s0.init({ src: p5.canvas });
p5.hide();
p5.frameRate(30);

p5.setup = () => {
  p5.createCanvas(400, 400);
};

p5.draw = () => {
  p5.background(220, 220, 220, 40);

  // cc[] holds the MIDI CC values streamed from the Tidal patterns below,
  // normalized to 0-1; fall back to 0 until CC messages arrive
  let cc0 = (typeof cc !== 'undefined' && cc[0] != null) ? cc[0] : 0;
  let cc1 = (typeof cc !== 'undefined' && cc[1] != null) ? cc[1] : 0;

  // determine mode:
  // Mode 0: Grid of ellipses (cc0 < 0.33)
  // Mode 1: Rotating lines (0.33 ≤ cc0 < 0.66)
  // Mode 2: Central blob (cc0 ≥ 0.66)
  let mode = cc0 < 0.33 ? 0 : cc0 < 0.66 ? 1 : 2;

  // map cc1 to color values
  let r = p5.map(cc1, 0, 1, 50, 255);
  let g = p5.map(cc1, 0, 1, 100, 200);
  let b = p5.map(cc1, 0, 1, 150, 255);

  p5.noStroke();

  if (mode === 0) {
    // mode 0: Grid of ellipses
    let spacing = p5.map(p5.mouseX, 0, p5.width, 20, 50);
    let size = baseSize * (1 + cc1) + p5.map(p5.mouseY, 0, p5.height, 0, 20);
    for (let x = 0; x < p5.width; x += spacing) {
      for (let y = 0; y < p5.height; y += spacing) {
        p5.fill(
          p5.random(r - 20, r + 20),
          p5.random(g - 20, g + 20),
          p5.random(b - 20, b + 20),
          120
        );
        p5.ellipse(x, y, size, size);
      }
    }
  } else if (mode === 1) {
    // mode 1: Rotating lines
    p5.push();
    p5.translate(p5.width / 2, p5.height / 2);
    let numLines = 12 + Math.floor(cc1 * 12) + Math.floor(p5.map(p5.mouseX, 0, p5.width, 0, 6));
    let step = p5.TWO_PI / numLines;
    let speed = 0.02 + p5.map(p5.mouseY, 0, p5.height, 0, 0.05);
    for (let i = 0; i < numLines; i++) {
      let angle = i * step + p5.frameCount * speed;
      let len = 100 + cc1 * 100;
      p5.stroke(r, g, b, 150);
      p5.strokeWeight(2);
      p5.line(0, 0, len * p5.cos(angle), len * p5.sin(angle));
    }
    p5.pop();
  } else if (mode === 2) {
    // mode 2: Central blob multiple rotated polygons
    p5.push();
    p5.translate(p5.width / 2, p5.height / 2);
    let copies = 6 + Math.floor(cc1 * 6) + Math.floor(p5.map(p5.mouseX, 0, p5.width, 0, 4));
    let step = p5.TWO_PI / copies;
    for (let i = 0; i < copies; i++) {
      p5.push();
      p5.rotate(i * step + p5.frameCount * 0.01 + p5.map(p5.mouseY, 0, p5.height, 0, 0.05));
      p5.fill(r, g, b, 180);
      let sides = 5 + Math.floor(cc1 * 5) + Math.floor(p5.map(p5.mouseY, 0, p5.height, 0, 3));
      let radius = 50 + cc1 * 50;
      p5.beginShape();
      for (let j = 0; j < sides; j++) {
        let a = p5.map(j, 0, sides, 0, p5.TWO_PI);
        p5.vertex(radius * p5.cos(a), radius * p5.sin(a));
      }
      p5.endShape(p5.CLOSE);
      p5.pop();
    }
    p5.pop();
  }
};

// post-process the p5 canvas in Hydra: wobble it with noise
// and blend in a slowly rotating oscillator
src(s0)
  .modulate(noise(3, 0.2), 0.05)
  .blend(osc(10, 0, 1).rotate(0.1), 0.1)
  .out(o0);

render(o0);

// hush()

Tidal:

d1 $ slow 2 $ s "house(4,8)" # gain 0.9

-- ccn "0" and ccn "1" drive cc[0] and cc[1] in the p5 sketch above
d2 $ stack [ s "drum(4,8) realclaps(4,8)" # speed 4, ccv "<64 0 127 0>" # ccn "0" # s "midi" ]

d3 $ stack [ s "popkick(8,16)" # speed 3 , ccv "<32 96 127 64>" # ccn "1" # s "midi" ]

d5 $ stack [ s "pad(4,8)" # speed 1.5, ccv "<127 64 32>" # ccn "2" # s "midi" ]

d5 silence

Output (please disregard the audio quality, my microphone was not working properly):

Hydra Code:

// create a p5 WEBGL canvas, route it into Hydra as s0, and hide the raw canvas
let p5 = new P5();
p5.createCanvas(p5.windowWidth - 100, p5.windowHeight - 100, p5.WEBGL, p5.canvas);
s0.init({src: p5.canvas});
src(s0).out();
p5.hide();

p5.angleMode(p5.DEGREES);
p5.colorMode(p5.HSB);
p5.stroke(199, 80, 88);
p5.strokeWeight(3);
p5.noFill();
p5.pixelDensity(1);

let r = p5.width / 3;
let rotationSpeed = 0.8;
let hueShift = 0;

// [deformation depth, theta frequency, phi frequency] of the parametric
// surface; overwritten from cc[0..2] on every frame
const settings = [0.75, 5, 5];

p5.draw = () => {
  p5.clear();
  p5.orbitControl();
  p5.rotateX(p5.frameCount * rotationSpeed);
  p5.rotateY(p5.frameCount * rotationSpeed);

  // update the surface parameters from incoming CC values;
  // the || and ?: fallbacks also fire when a CC value is exactly 0
  settings[0] = cc[0] || 0.75;
  settings[1] = cc[1] ? cc[1] * 10 : 5;
  settings[2] = cc[2] ? cc[2] * 10 : 5;

  p5.rotateX(65);
  p5.beginShape(p5.POINTS);
  for (let theta = 0; theta < 180; theta += 2) {
    for (let phy = 0; phy < 360; phy += 2) {
      let x = r * (1 + settings[0] * p5.sin(settings[1] * theta) * p5.sin(settings[2] * phy))
              * p5.sin(theta) * p5.cos(phy);
      let y = r * (1 + settings[0] * p5.sin(settings[1] * theta) * p5.sin(settings[2] * phy))
              * p5.sin(theta) * p5.sin(phy);
      let z = r * (1 + settings[0] * p5.sin(settings[1] * theta) * p5.sin(settings[2] * phy))
              * p5.cos(theta);
      p5.stroke((hueShift + theta + phy) % 360, 80, 88);
      p5.vertex(x, y, z);
    }
  }
  p5.endShape();

  hueShift += 0.1;
}

src(s0)
  .modulatePixelate(noise(()=>cc[2]*20,()=>cc[0]),20)
  .out()

src(s0).modulate(noise(4, 1.5), 0.6).mult(osc(1, 10, 1)).out(o2)

render(o2)

src(o2)
  .modulate(src(o1).add(solid(0, 0), -0.5), 0.005)
  .blend(src(o0).add(o0).add(o0).add(o0), 0.1)
  .out(o2)

render(o2)


hush()

TidalCycles Code:

d1 $ ccv "0 40 80 127" # ccn "0" # s "midi"

d2 $ slow 4 $ rotL 1 $ every 4 (# ccv (fast 2 (range 127 0 saw))) $ ccv (segment 128 (range 127 0 saw)) # ccn "1" # s "midi"

d3 $ slow 4 $ ccv (segment 128 (slow 4 (range 127 0 saw))) # ccn "2" # s "midi"

d4 $ slow 4 $ note "c'maj d'min a'min g'maj" # s "superpiano" # legato 1 # sustain 1.5

d5 $ slow 4 $ note "c3*4 d4*4 a3*8 g3*4" # s "superhammond:5"
  # legato 0.5

xfade 4 $ "bd <hh hh*2 hh*4> sd <hh [hh bd]>"
  # room 0.2

once $ s "auto:3"

hush

Link to the video.

Hydra Code:

let p5 = new P5()
s0.init({src: p5.canvas})
p5.hide()

// No need for setup
p5.noFill()
p5.strokeWeight(10)
p5.stroke(255)

// Shape positions
let shapePositions = [
  { x: p5.width / 4, y: p5.height / 2, size: 200 },
  { x: (p5.width / 4) * 3, y: p5.height / 2, size: 200 }
]

p5.draw = () => {
  p5.background(0)

  // First triangle
  p5.triangle(
    shapePositions[0].x, shapePositions[0].y - shapePositions[0].size / 2,
    shapePositions[0].x - shapePositions[0].size / 2, shapePositions[0].y + shapePositions[0].size / 2,
    shapePositions[0].x + shapePositions[0].size / 2, shapePositions[0].y + shapePositions[0].size / 2
  )

  // Second square
  p5.rectMode(p5.CENTER);
  p5.square(shapePositions[1].x, shapePositions[1].y, shapePositions[1].size);

  if (cc[1] == 1) {
    p5.triangle(
      p5.width / 2, p5.height / 2 - (400 * cc[0] + 200 * p5.noise(cc[0])) / 2,
      p5.width / 2 - (400 * cc[0] + 200 * p5.noise(cc[0])) / 2, p5.height / 2 + (400 * cc[0] + 200 * p5.noise(cc[0])) / 2,
      p5.width / 2 + (400 * cc[0] + 200 * p5.noise(cc[0])) / 2, p5.height / 2 + (400 * cc[0] + 200 * p5.noise(cc[0])) / 2
    )
  } else {
    p5.square(
      p5.noise(cc[0] * 2) * p5.width,
      cc[0] * p5.height,
      200
    )
  }
}


// three variations, evaluated one at a time; each .out() replaces the chain on o0
src(s0).modulate(osc(15, 0.3, 1.2), 0.1).mult(noise(2, 0.8)).diff(src(o1)).out();
src(s0).modulate(gradient().rotate(1.5), 0.5).mult(osc(8, 0.2, 2)).diff(src(o1)).out();
src(s0).modulate(noise(150, 1.2), 0.7).mult(osc(25, 5, 30)).diff(src(o1)).out();

// Feedback effects
src(s0).modulate(noise(4, 1.5), 0.6).mult(osc(1, 5, 1)).out(o2);
src(o2)
  .modulate(src(o1).add(solid(0, 0), -0.5), 0.005)
  .blend(src(o0).add(o0).add(o0).add(o0), 0.1)
  .out(o2)

render(o2)


hush()

Tidal Code:

-- Sound
d1 $ s "bleep" # room 0.6 # gain 1
d2 $ s "~ cp" # gain 1.2 # room 0.3

d3 $ sometimes (# velocity 0.6) $ iter 4 $ struct "<t(4,8) t(4,8,1)>" $ s "cp"

d4 $ s "superpiano" >| note (scale "major" ("[1 3 2 8 9 6]") + "8") # room 0.4 # gain 2

-- CC
-- from cycle 12 of every 32, restructure the CC stream with Euclidean patterns
d5 $ whenmod 32 12 (struct "<t(4,8) t t(4,8,1) t>")
    $ ccn "0*64" # ccv (slow 3 (range 10 127 tri))
    # s "midi"

d6 $ whenmod 32 12 (# ccv "127") $ ccn "1*128" # ccv 0 # s "midi"

hush

Also, hi! Apologies for uploading this late; my laptop was not cooperating.