One point I found particularly compelling is the idea that artificial nature can evoke emotional responses similar to those triggered by real nature, especially when designed thoughtfully. The text discusses how even simulated environments—like those in theme parks or controlled greenhouses—can elicit feelings of wonder, peace, or introspection, blurring the line between authenticity and imitation. This struck a chord with me because it challenges the often rigid dichotomy we hold between “real” and “fake” nature. For instance, I visited the greenhouses at the World Expo in Shanghai and found myself genuinely moved by the scale, the attention to detail, and the atmospheric design of the Cloud Forest greenhouse. The text itself is very open to the emotional legitimacy of artificial experiences, which I appreciated. It doesn’t dismiss artificiality as hollow or deceptive but rather explores how intention, design, and sensory engagement can make even synthetic environments meaningful.

However, it’s also important to recognize that the reading raises deeper questions about sustainability and our evolving relationship with nature: if artificial nature can provide emotional satisfaction, will society start to invest more in simulations and less in preserving real ecosystems? Or could well-designed artificial nature actually help foster a deeper appreciation of, and urgency to protect, what’s left of the natural world?

It was fascinating to read about two artists who took polar opposite approaches to what it means to “live code”: where the liveness lies, and what it takes for something to count as live coding. I personally thought I was more similar to Deadmau5 because, like him, I like to have a structure or demo of what’s going to happen, even if it’s not entirely concrete, especially for the music. Unlike Hydra, where the stakes of improvising your visuals are lower, I think the audio should have a clear build-up and a storytelling arc for the audience; after all, it’s much more noticeable when the different audio elements don’t harmonize.

“The liveness in live coding is fulfilled through a performer’s activity in generating the sound, rather than a performer’s presence as a figurehead in a spectacle.”

I thought the above quote was quite interesting, because it summed up the writer’s belief that the essence of live coding lies in the performer actually performing and thinking on the spot about the composition, rather than just hitting the play button, which is what Deadmau5’s performances amount to according to the writer. This made me question how much I am really “live coding,” because although I’ve been tweaking things on the spot, I still had the majority of the code planned out before my performances. Did this mean that I wasn’t fulfilling that “live” aspect of live coding either?

Finally, the writer’s conclusion that live coding is a practice that opens us up to the “unbounded exploration of the musical potential of materials” made me realize that one of the most important mindsets to bring to live coding is not being afraid of making mistakes, which are bound to happen, especially if I respect live coding’s liveness and improvise more on the spot.

After reading this article, I watched a performance by Derek Bailey. It is a combination of live guitar playing and dance. The sound Derek Bailey makes is completely different from the typical impression we have of the guitar; it doesn’t rely on delicate chords and feels more like a series of experiments on the instrument. Though it isn’t exactly easy listening, I still appreciate how he combined guitar, percussion, and dance in a single performance and made everything out of nothing. Similar improvisation happens in jazz sessions, where the players jam together, communicating with each other only through their instruments. Though they use traditional instruments, I think they have something in common with live coders: the enthusiasm for creating new patterns in performance and not being afraid to make mistakes.

My impression of DJ performances used to be a performer standing behind a big DJ controller, mixing tracks and creating effects live. However, things seem to be different nowadays. Many DJs are actually playing preset audio and visuals, so the role they play is more like a conductor who lets the audience dance to the beats. It is interesting that computer programs are pushing people toward two extremes: creating completely repeatable music from preset patterns, and creating completely random music with random functions. Though performances with pre-recorded audio can still be exciting, I think the spirit of jamming should be celebrated, and that is what live coders are doing, just like what Bailey did in his performance. The computer is the tool and the approach, but the spirit is what really matters.

This paper covered varied insights into topics including what “live” means in a live performance, the roles the performer and the laptop play in a set, and what improvisation looks like in a live coding performance. One part that really attracted me is the fifth chapter, in which the author interprets computer-based improvisation through the lens of old-school free improvisation, which is deeply rooted in the history of jazz. Jazz developed its own methods for unplanned performance, shaped by shared history and physical instruments. When transferred to a digital system, these methods have to be adapted, for example by introducing uncertainty with random functions or by playing with different parameters to create unpredictable sound effects.
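
As a minimal sketch of what that adaptation can look like in TidalCycles (the sample names and probabilities below are placeholders of my own, not taken from the paper), random functions can destabilize an otherwise fixed pattern:

-- a fully planned pattern: identical on every cycle
d1 $ s "bd sd bd sd"

-- the same idea with uncertainty injected per event
d2 $ sometimesBy 0.3 (# crush 4)      -- 30% of events get bit-crushed
   $ s "arpy*8"
   # speed (range 0.5 2 rand)         -- playback speed drawn at random
   # pan rand                         -- random stereo position
   # n (irand 8)                      -- random sample index 0-7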

The chapter also brings up the idea of the computer as a partner rather than just a tool. In many jazz settings, each musician has a voice in the group’s shared sound. The author proposes that a laptop or other digital device can also “respond” in real time, acting like an extra player. This suggests that computer-based improvisation is not strictly about striking keys or typing code; in the NIME field, it can also mean performers manipulating external devices that adjust the code running on the computer. This reminds me of a possible application: controlling the patterns with EEG signals from both the performers and the audience, so that everyone can engage and create something from the continuous brain activity that reflects how they feel about the performance.
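
This is purely speculative, but as a rough TidalCycles sketch: assuming the EEG signal were bridged into Tidal as an external control value named "eeg" (a hypothetical name), normalized between 0 and 1, it could modulate pattern parameters directly:

-- hypothetical: "eeg" is an external control value bridged into Tidal, normalized to 0-1
d1 $ s "arpy*8"
  # lpf (range 200 4000 $ cF 0.5 "eeg")   -- the brain signal opens and closes the filter
  # room (cF 0 "eeg")                     -- and adds more reverb as it rises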

Reading about “the instrumental impulse” in this paper reminded me of the psychological concept of Perceptual Narrowing—a cognitive phenomenon where, for efficiency, humans develop schemas that help make fast decisions but, in doing so, lose access to a broader range of possibilities. When I was studying surrealism, the surrealists believed one way to return to authenticity was through Automatism, letting the mind wander freely and uncensored. This mirrors the spirit of improvisation, where intuition precedes logic and the unknown is welcomed. I believe this can be applied to live coding, or, more broadly, that improvisation becomes a tool not just for performance but for rediscovering what expression and creativity can mean beyond established norms.

I was especially moved by the author’s framing of the computer as a romantic, expressive instrument—not just a cold rational machine. Just like traditional instruments, laptops in live coding are personalized by each performer’s unique techniques and approaches. In this way, every computer becomes a partner, emotionally bonded with the artist. This echoes how traditional musicians develop intimate relationships with their instruments, shaping their identity over time.

Lastly, I found the contrast between live coding and tools like Ableton or Radial insightful. While those tools prioritize accessibility and user-friendly design, live coding embraces the rawness of syntax and structural logic. From my perspective, neither is better or worse—they just differ in intention. Live coding reflects the artists’ passion for exploring the pure beauty of the machine, representing a more experimental, deeply embodied relationship with technology—one that reimagines both the machine and the human in the act of creation.

Inspired by a p5.js sketch I had created during Intro to IM:

P5 Code:

let baseSize = 40;
let p5 = new P5();
s0.init({ src: p5.canvas });
p5.hide();
p5.frameRate(30);

p5.setup = () => {
  p5.createCanvas(400, 400);
};

p5.draw = () => {
  p5.background(220, 220, 220, 40);

  let cc0 = (typeof cc !== 'undefined' && cc[0] != null) ? cc[0] : 0;
  let cc1 = (typeof cc !== 'undefined' && cc[1] != null) ? cc[1] : 0;

  // determine mode:
  // Mode 0: Grid of ellipses (cc0 < 0.33)
  // Mode 1: Rotating lines (0.33 ≤ cc0 < 0.66)
  // Mode 2: Central blob (cc0 ≥ 0.66)
  let mode = cc0 < 0.33 ? 0 : cc0 < 0.66 ? 1 : 2;

  // map cc1 to color values
  let r = p5.map(cc1, 0, 1, 50, 255);
  let g = p5.map(cc1, 0, 1, 100, 200);
  let b = p5.map(cc1, 0, 1, 150, 255);

  p5.noStroke();

  if (mode === 0) {
    // mode 0: Grid of ellipses
    let spacing = p5.map(p5.mouseX, 0, p5.width, 20, 50);
    let size = baseSize * (1 + cc1) + p5.map(p5.mouseY, 0, p5.height, 0, 20);
    for (let x = 0; x < p5.width; x += spacing) {
      for (let y = 0; y < p5.height; y += spacing) {
        p5.fill(
          p5.random(r - 20, r + 20),
          p5.random(g - 20, g + 20),
          p5.random(b - 20, b + 20),
          120
        );
        p5.ellipse(x, y, size, size);
      }
    }
  } else if (mode === 1) {
    // mode 1: Rotating lines
    p5.push();
    p5.translate(p5.width / 2, p5.height / 2);
    let numLines = 12 + Math.floor(cc1 * 12) + Math.floor(p5.map(p5.mouseX, 0, p5.width, 0, 6));
    let step = p5.TWO_PI / numLines;
    let speed = 0.02 + p5.map(p5.mouseY, 0, p5.height, 0, 0.05);
    for (let i = 0; i < numLines; i++) {
      let angle = i * step + p5.frameCount * speed;
      let len = 100 + cc1 * 100;
      p5.stroke(r, g, b, 150);
      p5.strokeWeight(2);
      p5.line(0, 0, len * p5.cos(angle), len * p5.sin(angle));
    }
    p5.pop();
  } else if (mode === 2) {
    // mode 2: Central blob multiple rotated polygons
    p5.push();
    p5.translate(p5.width / 2, p5.height / 2);
    let copies = 6 + Math.floor(cc1 * 6) + Math.floor(p5.map(p5.mouseX, 0, p5.width, 0, 4));
    let step = p5.TWO_PI / copies;
    for (let i = 0; i < copies; i++) {
      p5.push();
      p5.rotate(i * step + p5.frameCount * 0.01 + p5.map(p5.mouseY, 0, p5.height, 0, 0.05));
      p5.fill(r, g, b, 180);
      let sides = 5 + Math.floor(cc1 * 5) + Math.floor(p5.map(p5.mouseY, 0, p5.height, 0, 3));
      let radius = 50 + cc1 * 50;
      p5.beginShape();
      for (let j = 0; j < sides; j++) {
        let a = p5.map(j, 0, sides, 0, p5.TWO_PI);
        p5.vertex(radius * p5.cos(a), radius * p5.sin(a));
      }
      p5.endShape(p5.CLOSE);
      p5.pop();
    }
    p5.pop();
  }
};

// p5 canvas as a Hydra source: slight noise warp, with a faint rotating oscillator blended on top
src(s0)
  .modulate(noise(3, 0.2), 0.05)
  .blend(osc(10, 0, 1).rotate(0.1), 0.1)
  .out(o0);

render(o0);

// hush()

Tidal:


d1 $ slow 2 $ s "house(4,8)" # gain 0.9

-- ccn 0 and ccn 1 feed cc[0] and cc[1] in the p5 sketch above
d2 $ stack [ s "drum(4,8) realclaps(4,8)" # speed 4, ccv "<64 0 127 0>" # ccn "0" # s "midi" ]

d3 $ stack [ s "popkick(8,16)" # speed 3, ccv "<32 96 127 64>" # ccn "1" # s "midi" ]

d5 $ stack [ s "pad(4,8)" # speed 1.5, ccv "<127 64 32>" # ccn "2" # s "midi" ]

d5 silence

Output (please disregard the audio quality, my microphone was not working properly):

Hydra Code:

let p5 = new P5();
p5.createCanvas(p5.windowWidth - 100, p5.windowHeight - 100, p5.WEBGL, p5.canvas);
s0.init({src: p5.canvas});
src(s0).out();
p5.hide();

p5.angleMode(p5.DEGREES);
p5.colorMode(p5.HSB);
p5.stroke(199, 80, 88);
p5.strokeWeight(3);
p5.noFill();
p5.pixelDensity(1);

let r = p5.width / 3;
let rotationSpeed = 0.8;
let hueShift = 0;

const settings = [0.75, 5, 5];

p5.draw = () => {
  p5.clear();
  p5.orbitControl();
  p5.rotateX(p5.frameCount * rotationSpeed);
  p5.rotateY(p5.frameCount * rotationSpeed);

  // map incoming MIDI cc values (sent from Tidal) onto the shape parameters
  settings[0] = cc[0] || 0.75;
  settings[1] = cc[1] ? cc[1] * 10 : 5;
  settings[2] = cc[2] ? cc[2] * 10 : 5;

  p5.rotateX(65);
  // deformed sphere: sample the surface in spherical coordinates (theta, phy) and draw it as points
  p5.beginShape(p5.POINTS);
  for (let theta = 0; theta < 180; theta += 2) {
    for (let phy = 0; phy < 360; phy += 2) {
      let x = r * (1 + settings[0] * p5.sin(settings[1] * theta) * p5.sin(settings[2] * phy))
              * p5.sin(theta) * p5.cos(phy);
      let y = r * (1 + settings[0] * p5.sin(settings[1] * theta) * p5.sin(settings[2] * phy))
              * p5.sin(theta) * p5.sin(phy);
      let z = r * (1 + settings[0] * p5.sin(settings[1] * theta) * p5.sin(settings[2] * phy))
              * p5.cos(theta);
      p5.stroke((hueShift + theta + phy) % 360, 80, 88);
      p5.vertex(x, y, z);
    }
  }
  p5.endShape();

  hueShift += 0.1;
}

// pixelate the p5 source, with the noise driven by cc[2] and cc[0]
src(s0)
  .modulatePixelate(noise(()=>cc[2]*20,()=>cc[0]),20)
  .out()

// warp the source with noise and tint it with a slow oscillator
src(s0).modulate(noise(4, 1.5), 0.6).mult(osc(1, 10, 1)).out(o2)

render(o2)

// feedback pass on o2: modulate it with o1 and blend in a brightened copy of o0
src(o2)
  .modulate(src(o1).add(solid(0, 0), -0.5), 0.005)
  .blend(src(o0).add(o0).add(o0).add(o0), 0.1)
  .out(o2)

render(o2)


hush()

TidalCycles Code:

d1 $ ccv "0 40 80 127" # ccn "0" # s "midi"

d2 $ slow 4 $ rotL 1 $ every 4 (# ccv (fast 2 (range 127 0 saw))) $ ccv (segment 128 (range 127 0 saw)) # ccn "1" # s "midi"

d3 $ slow 4 $ ccv (segment 128 (slow 4 (range 127 0 saw))) # ccn "2" # s "midi"

d4 $ slow 4 $ note "c'maj d'min a'min g'maj" # s "superpiano" # legato 1 # sustain 1.5

d5 $ slow 4 $ note "c3*4 d4*4 a3*8 g3*4" # s "superhammond:5"
  # legato 0.5

xfade 4 $ "bd <hh hh*2 hh*4> sd <hh [hh bd]>"
  # room 0.2

once $ s "auto:3"

hush

Link to the video.