Below is my live coding practice for this week.

Demo 1

// feedback
src(s0).mult(osc(10,0,1)).out()
osc(2,0.,1).scale(0.5).modulate(noise(3,0.01),1).out(o1)
src(o2).modulate(src(o1).add(solid(1,1),-0.5),.005).blend(src(o0).add(o0).add(o0).add(o0),0.25).out(o2)
render(o2)

let p5 = new P5()

s0.init({src: p5.canvas})
p5.hide()

let hearts = []
let colors = ["#edbba8", "#e66f3c", "#c6b6d5", "#f1d147", "#a4cd98", "#95accb"]

class Heart {
  constructor(p) {
    this.p = p 
    this.x = p.random(p.width)
    this.y = -20
    this.r = p.random(0.5, 1.2)
    this.dy = p.random(1, 3)
    this.c = p.random(colors)
  }

  display() {
    this.p.push()
    this.p.translate(this.x, this.y)
    this.p.fill(this.c)
    this.p.noStroke()
    this.p.beginShape()
    for (let i = 0; i < this.p.TWO_PI; i += 0.1) {
      let x = 16 * Math.pow(Math.sin(i), 3) * this.r
      let y = (13 * Math.cos(i) - 5 * Math.cos(2 * i) - 2 * Math.cos(3 * i) - Math.cos(4 * i)) * -this.r
      this.p.vertex(x, y)
    }
    this.p.endShape(this.p.CLOSE)
    this.p.pop()
  }

  fall() {
    this.y += this.dy
  }
}

p5.draw = () => {
  p5.clear() 
  if (p5.frameCount % 10 == 0) {
    hearts.push(new Heart(p5))
  }

  for (let i = hearts.length - 1; i >= 0; i--) {
    hearts[i].display()
    hearts[i].fall()

    if (hearts[i].y > p5.height + 20) {
      hearts.splice(i, 1)
    }
  }
}

src(s0)
  .modulate(noise(3), 0.1)
  .out()

Demo 2

What struck me immediately, even before fully reading, was the visual experience of the PDF itself. The text isn’t clean or stable; it’s fragmented, stretched, interrupted by blocks of symbols and noise. It feels like the document is actively “glitching” as you read it. The manifesto performs glitch theory rather than just describing it. The layout forces me to slow down, to become aware of reading as a mediated process. It almost resists being consumed in a normal, linear way.

This made me realize that the glitch here is highly embodied in the medium. The scattered typography, the visual noise, and the interruptions act like breaks in a signal, constantly pulling me out of passive reading. In a way, I felt like I was navigating a system that was slightly broken but also strangely expressive. That tension between frustration and curiosity felt intentional.

In livecoding, the process is visible and unstable: errors, unexpected outputs, or crashes become part of the performance rather than something to hide. In line with Menkman’s idea, the glitch is not a failure but a moment where the system reveals itself. When code behaves unpredictably during a live set, it creates a kind of raw, real-time interaction between the artist, the machine, and the audience.

I think both glitch aesthetics and livecoding challenge the idea that digital media should be smooth and optimized. Instead they expose the underlying structure: the code, the errors, the limits. They make the medium feel alive. I was browsing Albert and ran into a class named “Experiments in the Future of Producing/Performing” which, according to the description, encourages students to hack music/visual software and practice software abuse in order to challenge conventional recorded music/visual products. It says, “Sound (and other kinds of art) is an unstable art form.” Reading this manifesto made me more aware of how much I usually expect technology to “just work,” and how much potential there is when it doesn’t.

It also makes me question my own creative practice: am I just using tools as intended, or am I willing to push them to the point where they break and become something else?

I gradually built up the piece from my first week’s demo and wanted to experiment with combining acoustic and electronic sounds. I used mostly built-in sound sources, and the piece consists of two major parts, though it does not follow a strict structure. Opening with a piano loop that feels nostalgic, I gradually add more electronic sounds that keep getting layered on top of one another. Then it transitions to the second stage, which feels like an expansion, growing more chaotic as some random piano is added. Eventually it vanishes into a supersine cluster sound.

tidal:


piano_opening1 = slow 1.5
  $ n "c4 g4 d5 ds5 [d5,bf5] ~ ~ ~ ~ ~ ~ ~ c6 ~ ~ ~"
  |+ n 7
  # s "superpiano"
  # velocity "0.8 0.6 0.75 0.45 0.5 ~ ~ ~ ~ ~ ~ ~ 0.7 ~ ~ ~"
  # detune 0.1
  # gain 0.7
  # lpf 2000
  # lpq 0.2         -- filter smooth
  # sustain 5
  # room 0.95
  # size 0.7
  # pan (slow 4 sine)

piano_opening2 = slow 1.5
  $ n "f4 a4 d5 ds5 f5 ds5 d5 a4 g5 ~ ~ ~ a5 ~ ~ ~"
  |+ n 7
  # s "superpiano"
  # velocity "0.8 0.6 0.55 0.45 0.5 0.45 0.35 0.6 ~ ~ ~ 0.7 ~ ~ ~"   -- strike
  # detune 0.1
  # gain 0.8
  # lpf 2000
  # lpq 0.2         -- filter smooth
  # sustain 5
  # room 0.95
  # size 0.7
  # pan (slow 4 sine)

hush

d1 $ piano_opening1
d1 $ piano_opening2

d1 $ qtrigger $ filterWhen (>=0) $ seqP [
   (0, 3, slow 1.5 $ n "c4 g4 d5 ds5 [d5,bf5] ~ ~ ~ ~ ~ ~ ~ c6? ~ ~ ~"
     |+ n 7 # velocity "0.8 0.6 0.75 0.45 0.5 ~ ~ ~ ~ ~ ~ ~ 0.7? ~ ~ ~"),
  (3, 6, slow 1.5 $ n "f4 a4 d5 ds5 f5 ds5 d5 a4 g5 ~ ~ ~ a5 ~ ~ ~"
     |+ n 7 # velocity "0.8 0.6 0.75 0.45 0.5 0.45 0.35 0.6 ~ ~ ~ 0.7 ~ ~ ~")
 ]
 # s "superpiano"
 # detune 0.1
 # gain 0.8
 # lpf 2000
 # lpq 0.2
 # sustain 2
 # room 0.8
 # size 0.9
 # pan (slow 4 sine)

d1 silence

hush

d1 silence

d2 $ slow 2 $ n "[a6|e6]*8"
  |+ n 5
  # detune 1.2
  # gain 0.4
  # s "supersaw"
  # decay 0.2
  # sustain 5
  # distort 1.2

d2 silence

d5 $ jux (rev) $
  struct "t*16?" $
  n (irand 36 + 24)
  # s "supervibe"
  # bandf (range 500 8000 $ rand) 
  # bandq 15
  # hpf (range 0 2000 $ rand)
  # sustain 0.05
  # detune 1.2
  # room 0.8 # size 0.9
  # gain (range 0.3 0.5 $ rand)
  # pan rand
  # squiz 5

d5 silence

d3 $ s "superstatic*16"
    # n (irand 12)
    # sustain 0.01
    # accelerate 20
    # pitch1 2000
    # room 0.9
    # gain (range 0.15 0.3 $ rand)

d3 silence

d4 $ slow 4 $
    n "c4 g4 d5 f5"
    |+ n 5
    # s "supersiren"
    # sustain 8
    # voice 0.3
    # hpf 500
    # lpf 2000
    # detune 0.2
    # room 0.9 # size 0.9
    # gain 0.4 
    # distort 2.0
    

d4 silence
d1 silence


d7 $ slow 4 $ s "moog:2"
  >| note (scale "<minor>" ("[<-7>,0,2]*<1>") + "[c5,c6]")
  # legato 1
  # lpf 500
  # lpfbus 1 (segment 64 (slow 4 (range 100 3000 saw)))
  # lpq 0.4
  # gain 0.4

d7 silence

d8 $ fast 4 $ sometimesBy 0.4 (shuffle 8) $ s "simplesine"
  >| note ((scale "pelog" "<-2 .. 10>"))
  # hpf 1200
  # lpf 3000
  # room 0.4
  # gain 0.5

d8 silence

hush

d1 silence

d10 $ qtrigger $ filterWhen (>=0)
  $ cat [
      repeatCycles 2 $ scramble 8 $ n "c4 g4 d5 ds5 [d5,bf5] ~ ~ ~ ~ ~ ~ ~ c6? ~ ~ ~"
        |+ n 7 # velocity "0.8 0.6 0.75 0.45 0.5 ~ ~ ~ ~ ~ ~ ~ 0.7? ~ ~ ~",
      repeatCycles 2 $ scramble 8 $ n "f4 a4 d5 ds5 f5 ds5 d5 a4 g5 ~ ~ ~ a5 ~ ~ ~"
        |+ n 7 # velocity "0.8 0.6 0.75 0.45 0.5 0.45 0.35 0.6 ~ ~ ~ 0.7 ~ ~ ~"
  ]
  # s "superpiano"
  # detune 0.1 # gain 0.7 # lpf 2500 # sustain 2 # room 0.9 # size 0.9 # pan 0.4

d10 silence

d11 $ qtrigger $ filterWhen (>=0)
  $ cat [
      repeatCycles 2 $ scramble 8 $ fast 2 $ n "c4 g4 d5 ds5 [d5,bf5] ~ ~ ~ ~ ~ ~ ~ c6? ~ ~ ~"
        |+ n 19 # velocity "0.8 0.6 0.75 0.45 0.5 ~ ~ ~ ~ ~ ~ ~ 0.7? ~ ~ ~",
      repeatCycles 2 $ scramble 8 $ fast 4 $ n "f4 a4 d5 ds5 f5 ds5 d5 a4 g5 ~ ~ ~ a5 ~ ~ ~"
        |+ n 19 # velocity "0.8 0.6 0.75 0.45 0.5 0.45 0.35 0.6 ~ ~ ~ 0.7 ~ ~ ~"
  ]
  # s "superpiano"
  # detune 0.1 # gain 0.6 # lpf 2500 # sustain 2 # room 0.5 # size 0.6 # pan 0.7

d11 silence

hush

hydra:

shape(4, 0.7, 0.01)
  .color(0.6, 0.7, 1)
  .repeat(2, 2)
  .modulateScale(osc(3, 0.1).rotate(1.5), 0.5)
  .layer(
    src(o0)
      .mask(shape(4, 0.9, 0.5))
      .scale(() => 1.01 + (0.5 * 0.05))
      .rotate(() => Math.sin(time * 0.1) * 0.02)
      .luma(0.2, 0.1)
  )
  .modulatePixelate(
    noise(20, 0.5),
    () => 0.5 * 1024
  )
  .brightness(() => -0.15 + (0.5 * 0.1))
  .contrast(1.2)
  .scrollX(() => Math.sin(time * 0.2) * 0.05)
  .out();

shape(4, 0.7, 0.01)
  .color(0.6, 0.7, 1)
  .repeat(2, 2)
  .contrast(1.5)
  .modulateScale(osc(3, 0.1).rotate(() => time * 0.2), 0.5)
  .layer(
    src(o0)
      .mask(shape(4, 0.9, 0.5))
      .scale(() => 1.01 + Math.sin(time) * 0.02)
      .rotate(() => Math.sin(time * 0.1) * 0.05)
      .luma(0.2, 0.1)
  )
  .modulatePixelate(
    noise(20, 0.5),
    () => [1024, 512, 128, 2048][Math.floor(time % 4)]
  )
  .brightness(() => -0.15 + Math.random() * 0.05)
  .hue(() => Math.sin(time * 0.05) * 0.1)
  .scrollX(() => Math.sin(time * 0.2) * 0.05)
  .out();

hush()

Reading about Ryoichi Kurokawa made me think differently about how scale and control work in audiovisual art. What stood out to me most was his idea of “de-naturing” nature to uncover structures that are usually invisible. He breaks natural phenomena down into data, sound and image to create works that feel at once precise and unstable, which makes the experience feel alive rather than fixed.

I was also drawn to how Kurokawa thinks about time. He sees his work as “time design,” constantly adjusting and reshaping pieces across performances and installations. His slow, evolving approach contrasts with how quickly technology usually moves – he doesn’t seem interested in novelty for its own sake, but in letting ideas develop gradually, the way nature does.

Another aspect I found compelling is that Kurokawa doesn’t confine himself to a single platform or format. He moves fluidly between concerts, installations, sculptures and data-driven visuals, choosing the medium based on what the idea requires rather than forcing the work into one system. This flexibility reinforces his larger interest in scale and transformation, and it made me reflect on how artistic practice doesn’t have to be tied to one tool or discipline to remain coherent.

Background: What is Pure Data?

Pure Data (Pd) is an open-source visual programming environment primarily used for real-time audio synthesis, signal processing, and interactive media. Programs in Pd, known as patches, are built by connecting graphical objects that pass messages and audio signals between one another. Unlike text-based programming languages, Pd emphasizes signal flow and real-time interaction, allowing users to modify systems while they are running.

Pd belongs to the family of visual patching languages derived from Max, originally developed by Miller Puckette at IRCAM. While Max/MSP later became a commercial platform, Pure Data remained open source and community-driven. This has led to its widespread use in experimental music, academic research, DIY electronics, and new interfaces for musical expression (NIME) projects.

One of Pd’s major strengths is its portability. It can run not only on personal computers, but also on embedded systems such as Raspberry Pi, and even on mobile devices through frameworks like libpd. This makes Pd especially attractive for artists and researchers interested in standalone instruments, installations and hardware-based performances.

Personal Reflection

I find Pure Data to be a really engaging platform. It feels similar to Max/MSP, which I have worked with before, but with a simpler, more playful interface. The ability to freely arrange objects and even draw directly on the canvas makes the patch feel less rigid and more sketch-like, almost like thinking through sound visually.

Another aspect I really appreciate is Pd’s compatibility with microcontrollers and embedded systems. Since it can be deployed on devices like the Raspberry Pi (though I haven’t tried this yet), it allows sound systems to exist independently of a laptop. This makes Pd especially suitable for experimental instruments and NIME-style projects.

Demo Overview: What This Patch Does

For my demo, I built a generative audio system that combines rhythmic sequencing, pitch selection, envelope shaping, and filtering. The patch produces evolving tones that are structured but not entirely predictable, demonstrating how Pd supports algorithmic and real-time sound design.

1. Timing and Control Logic

The backbone of the patch is the metro object, which acts as a clock. I use tempo $1 permin to define the speed in beats per minute, allowing the patch to behave musically rather than in raw milliseconds.

Each metro tick increments a counter using a combination of f and + 1. The counter is wrapped using the % 4 object, creating a repeating four-step cycle. This cycle is then routed through select 0 1 2 3, which triggers different events depending on the current step.

This structure functions like a step sequencer, where each step can activate different pitches or behaviors. It demonstrates how Pd handles discrete musical logic using message-rate objects rather than traditional code.
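Since Pd patches don’t translate directly to text, here is a rough JavaScript sketch of the same counter-and-wrap logic (the function and variable names are mine, not Pd’s):

```javascript
// Rough JS analogue of the Pd message flow:
// [metro] tick -> [f]/[+ 1] counter -> [% 4] wrap -> [select 0 1 2 3] routing.
let counter = 0;

function tick() {
  const step = counter % 4; // like [% 4]: wrap into a four-step cycle
  counter += 1;             // like [f] + [+ 1]: increment on each bang
  return step;              // [select 0 1 2 3] would route on this value
}

// Eight "metro" ticks cycle through steps 0..3 twice.
const steps = Array.from({ length: 8 }, () => tick());
// steps -> [0, 1, 2, 3, 0, 1, 2, 3]
```

Each returned step would fire a different branch of the patch, just as each outlet of select triggers a different pitch or behavior.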

2. Pitch Selection and Control

For pitch generation, I use MIDI note numbers that are converted into frequencies using mtof. Each step in the sequence corresponds to a different MIDI value, allowing the patch to cycle through a small pitch set.

By separating pitch logic from synthesis, the patch becomes modular: changing the melodic structure only requires adjusting the number boxes feeding into mtof, without touching the rest of the system. This reflects Pd’s strength in modular thinking, where musical structure emerges from the routing of simple components.
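The conversion mtof performs is standard equal temperament, with A4 (MIDI note 69) tuned to 440 Hz; a minimal sketch:

```javascript
// What [mtof] computes: MIDI note number -> frequency in Hz,
// twelve-tone equal temperament, A4 (MIDI 69) = 440 Hz.
function mtof(midi) {
  return 440 * Math.pow(2, (midi - 69) / 12);
}

mtof(69); // 440 (A4)
mtof(60); // ~261.63 (middle C)
```

Changing the number boxes feeding mtof changes the pitch set; the synthesis chain downstream never needs to know.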

3. Sound Synthesis and Shaping

The core sound source is osc~, which generates a sine wave based on the frequency received from mtof. To avoid abrupt changes and clicks, pitch transitions are smoothed using line~, which interpolates values over time.

I further shape the sound using:

  • expr~ $v1 * -1 and cos~ to transform the waveform
  • lop~ 500 as a low-pass filter to soften high frequencies
  • vcf~ for resonant filtering driven by slowly changing control values

Amplitude is controlled using *~, allowing the signal to be shaped before being sent to dac~. This ensures the sound remains controlled and listenable, even as parameters change dynamically.
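The interpolation line~ performs can be sketched at control rate (real line~ runs per audio sample; this simplified helper of my own just shows the linear ramp idea):

```javascript
// Simplified sketch of [line~]: glide from the current value to a target
// over a duration, instead of jumping instantly (which would click).
function makeRamp(start) {
  let from = start, to = start, t0 = 0, dur = 0;
  return {
    // like sending a "target time" message to [line~]
    set(target, ms, now) { from = this.at(now); to = target; t0 = now; dur = ms; },
    at(now) {
      if (dur <= 0 || now >= t0 + dur) return to; // ramp finished
      const frac = (now - t0) / dur;              // 0..1 progress
      return from + (to - from) * frac;           // linear interpolation
    },
  };
}

const ramp = makeRamp(440);
ramp.set(880, 100, 0); // glide 440 -> 880 Hz over 100 ms
ramp.at(50);           // 660, halfway through the ramp
ramp.at(200);          // 880, ramp complete
```

Feeding the ramped value to the oscillator frequency is what smooths the pitch transitions between steps.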

Randomness and Variation

To prevent the output from becoming too repetitive, I introduced controlled randomness using random. The random values are offset and scaled before being sent to the filter and envelope controls, creating subtle variations in timbre and movement over time.

This balance between structure (metro, counters, select) and unpredictability (random modulation) is central to generative music, and Pd makes this relationship very explicit through visual connections.
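The offset-and-scale step is just a linear mapping from a raw random value into a usable parameter range; a small sketch (the range values here are illustrative, not the exact ones in my patch):

```javascript
// Map a raw 0..1 random value into a parameter range,
// e.g. a filter cutoff between 500 and 2500 Hz:
// scale by the range width, then offset by the minimum.
function randomIn(min, max, rng = Math.random) {
  return min + rng() * (max - min);
}

// With a deterministic "rng" the mapping is easy to check:
randomIn(500, 2500, () => 0.5); // 1500
```

In the patch the same arithmetic happens with + and * objects between random and the filter/envelope inlets.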

Conclusion

Overall, Pure Data feels less like a traditional programming environment and more like a living instrument, where composition, performance and system design happen simultaneously. This makes it especially relevant for experimental music, live performance and research-based creative practice.

(Somehow I can’t do recording with internal device audio… I’ll try recording it using my phone sometime)

I think the author’s understanding of “feel” in music is significant in the context of computational music. If musical feel emerges from microscopic deviations in timing and intensity, then computational systems expose a clear tension: computers prioritize precision, while groove often depends on subtle imprecision. Strict quantization can therefore erase the traces of bodily presence that make rhythms sound relaxed or “in the pocket”, even when they are metrically correct.

This tension becomes especially clear in live coding. While live coding environments often emphasize algorithmic structure and real-time precision, they also introduce a performative and temporal dimension that reopens space for musical feel. Rather than encoding a fixed groove in advance, live coding unfolds in time, making timing decisions visible and responsive. Small changes in code, like shifting delays, densities, or rhythmic patterns, function less as abstract instructions and more as gestures comparable to microtiming adjustments in embodied performance.
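The idea of microtiming as small deviations from a quantized grid can be sketched numerically (a hypothetical humanize helper of my own, not taken from any particular live coding system):

```javascript
// Nudge perfectly quantized onset times (ms) by small random offsets,
// reintroducing the "feel" that strict quantization erases.
function humanize(onsets, maxShiftMs, rng = Math.random) {
  // rng() in 0..1 -> offset in -maxShiftMs..+maxShiftMs
  return onsets.map(t => t + (rng() * 2 - 1) * maxShiftMs);
}

// A straight 16th-note grid at 120 BPM (125 ms per 16th)...
const grid = [0, 125, 250, 375];
// ...with up to +/-10 ms of deviation per onset:
humanize(grid, 10, () => 0.75); // [5, 130, 255, 380]
```

A few milliseconds either way is enough to shift a rhythm from mechanical to relaxed, which is exactly the scale of deviation the author describes.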