My first impression of the reading was that it looked weird, but after reading a bit it started to make sense.
The syntax of the text itself breaks away from the conventions of standard writing. I felt disoriented just reading it, and sensed the unexpected on every page, which constantly pushes the reader to break out of their expectations.
As the author says, each noise is a disruption that ruins something, yet it creates new potentials and energies for new formations. The glitch is a disruption, yet also a new opportunity.
I feel it's easier to follow a convention because we want things to go the way we predict. The glitches in life are scary. Having a glitch sometimes makes me feel like the self I was so certain of suddenly breaks into pieces, leaving nothing to hold on to for the next moment. The same goes for making art, especially when I'm new to something: I want things to go as expected, with a nice, solid structure, so that what I create stays in my hands.
But reading the manifesto made me realize that a glitch also opens something new: it opens a new structure and constantly generates enormous possibilities. Like feedback, the constant change itself becomes the manifesto that leads us, rather than the convention.
In my composition project, I tried to work with different structures to craft a piece. I recently worked on a live-coding piece where I made one main build and then started another. Figuring out how to bring it to a proper ending turned out to be harder than I expected. I tried fading elements in and out, layering piano, bass, and percussion, and even syncing chords with claps using MIDI, but it never quite lined up the way I imagined.
Some parts worked better than others, and the misalignments actually gave the piece some unexpected character. It was a good reminder that live coding isn’t just about making things “perfectly in sync”—it’s about experimenting, feeling how layers interact, and learning from what doesn’t go as planned.
Even though the final MIDI sync didn’t work out, the process taught me a lot about structure, timing, and how to play with musical ideas in real time.
start = do {
d1 $ qtrigger $ filterWhen (>=0) $ seqP [
(3,6, s "hh*1 hh*2 ~808:3" # gain (slow 4 (range 1.25 1.5 saw))),
(6,8, s "[808mc:1(5,8), hh*1? hh*2 ~808:3]" # gain (slow 4 (range 1.25 1.5 saw))),
(8,30, s "[<808mc:2(7,8) 808mc:1(5,8)>, hh*2? hh*4 ~808:3]" # gain (slow 4 (range 1.25 1.5 saw)))];
d6 $ qtrigger $ filterWhen (>=0) $ seqP [
(0,3, ccv "<0 31 64 127>" # ccn "0" # s "midi"),
(3, 6, slow 2 $ ccv (segment 128 $ range 0 127 tri) # ccn "1" # s "midi"),
(6, 8, struct "t(5,8)" $ ccv (segment 128 $ range 0 127 tri) # ccn "1" # s "midi"),
(8, 30, struct "<t(7,8) t(5,8)>" $ ccv (segment 128 $ range 0 127 tri) # ccn "1" # s "midi"),
(3,30, ccv "0 0 0 127" # ccn "0" # s "midi"),
(5,8,struct "t(5,8)" $ ccv (range 127 0 tri) # ccn "3" # s "midi"),
(8,30,struct "<t(7,8) t(5,8)>" $ ccv (range 127 0 tri) # ccn "3" # s "midi"),
(4, 50, struct "~ ~ ~ t" $ ccv "<5 60>" # ccn "4" # s "midi")
--(5, 9, fast 2 $ ccv "0 64 127 64" # ccn "1" # s "midi"),
-- (9, 50, ccv (slow 2 $ range 0 127 saw) # ccn "1" # s "midi")
];
-- is there a way to fade the beats in?
hush;
d2 $ qtrigger $ filterWhen (>=0) $ slow 4 $ s "superpiano*4" # note "[c4,e4,g4,b4] [a3,c4,e4,g4] [f3,a3,c4,e4] [g3,b3,d4,a4] [d4,f4,a4,c5,e5]"+"c3" # legato 2.8 # room 0.8 # lpf (range 600 800 (slow 4 saw))
# speed (range 1.995 3.005 (slow 10 saw)) # pan (range 3 9 (slow 10 saw))
}
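One possible answer to the fade-in question in the comment above (a rough sketch, assuming the standard Tidal functions and a generic `hh`/`808` sample pattern rather than the exact pattern in the block): ramp the gain up over a few cycles with `range` and `saw`, or crossfade the pattern in on its channel with `xfadeIn`.

```haskell
-- fade the beats in over 8 cycles by ramping gain from 0 up to 1.2
d1 $ s "hh*2 808:3" # gain (slow 8 (range 0 1.2 saw))

-- or crossfade from whatever is currently on channel 1 into the
-- new pattern over 4 cycles
xfadeIn 1 4 $ s "hh*2 808:3"
```

The `gain` ramp keeps the pattern's structure audible from the start, while `xfadeIn` mixes the old and new patterns, which can be smoother mid-performance.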
build = do {
xfade 2 $ qtrigger $ filterWhen (>=3) $ s "superpiano*8" # note "[c4,e4,g4,b4] [a3,c4,e4,g4] <[f3,a3,c4,e4] [g3,b3,d4,a4]> [d4,f4,a4,c5,e5]"+"c3" # legato 2.8 # room 0.8 # lpf (range 600 800 (slow 4 saw))
# speed (range 1.995 3.005 (slow 10 saw)) # pan (range 3 9 (slow 10 saw));
d3 $ qtrigger $ filterWhen (>=0) $ seqP[
(0,5, s "tabla*8" # n (irand 12) # room "0.4" # hpf 300),
-- (1,5, every 8 (fast 0.1) $ s "tabla2*8" # n (irand 46) # room "0.4"),
(5, 15, stack [s "tabla*8" # n (irand 12) # room "0.4",
s "jungbass*5" # note "1 0 1 0 0" # speed 0.25 # room 0.7]),
(15, 30, s "jungbass*8" # note "1 0 1 0 0" # speed 0.8 # room 0.8 # gain 2)];
}
build2 = do {
xfade 1 $ qtrigger $ filterWhen (>=0) $ s "[808sd:7*3, <808bd:3*2, 808bd:1*4>, hh*2]" # room "0.25" # nudge 0.2 # speed (range 1 1.5 (slow 2 saw)) # gain 2;
d2 $ qtrigger $ filterWhen (>=0) $ seqP[
(0,5, s "superhoover" >| note (scale "major" "<[0 4 0] [0 6 0]>" + "c4") # room 0.8 # gain 0.8 # legato 2.5 # speed (range 1.4 1.5 (slow 2 saw)) # lpf 400),
(5,15, s "superhoover" >| note (scale "major" "<[0 5 6] [0 7 0]>" + "c4") # room 0.8 # gain 0.8 # legato 2.5 # speed (range 1.4 1.5 (slow 2 saw)) # lpf 500),
(15,20, s "superhoover" >| note (scale "major" "<[0 6 7] [7 6 0]>" + "c4") # room 0.8 # gain 0.8 # legato 2.5 # speed (range 1.4 1.5 (slow 2 saw)) # lpf 600)
] # hpf (slow 4 (100*saw + 100)) # speed (slow 4 (range 1 4 saw));
d6 $ qtrigger $ filterWhen (>=0) $ seqP[
(0,15, ccv (range 0 64 tri) # ccn "1" # s "midi"),
(0,30, ccv (range 0 32 tri) # ccn "2" # s "midi")]
}
hush
start
build
do
xfade 1 silence
xfade 2 silence
xfade 6 silence
build2
xfade 3 silence
xfade 1 silence
hush
shape(3, () => 0.1 + cc[1] * 0.2, 0.2).rotate(() => cc[0] * 3.14)
.invert(0.9).modulatePixelate(noise(() => 2 + cc[3]*3, 0.5), 100, 1).color(0.65, 0.86, () => 2 - cc[4]*5).scale(1, () => window.innerHeight/window.innerWidth, 1).out()
shape(3, () => 0.1 + cc[1] * 0.2, 0.2)
.repeat(2, 2)
.rotate(() => cc[0] * 3.14)
.modulatePixelate(noise(() => 2 + cc[3]*3, 0.5), 100, 1)
.invert(0.9)
.color(
() => 0.65 + Math.sin(time) * 0.1,
() => 0.86 + Math.cos(time*1.2) * 0.1,
() => 2 - cc[4]*5
)
.repeatX(4) // 4 across
.repeatY(2) // 2 down
.scale(1, () => window.innerHeight/window.innerWidth, 1)
.out()
// reactive Hydra visual for build2
osc(30, () => cc[1], 0)
.modulatePixelate(
shape(4, 0.9)
.rotate(() => cc[1] * Math.PI),
100, 1
)
.repeat(8, 8)
.color(3, 5, 8) // color channel multipliers
.hue(() => cc[2]*0.2)
.colorama(() => cc[2]*0.8)
.blend(o0, 0.5)
.out(o0)
osc(30, ()=>cc[0], 0).modulatePixelate(shape(4, 0.9).rotate(() => cc[2] * 3.14)).repeat(8, 8).color(3,5,8).hue(()=>cc[1]*0.2).colorama(() => cc[2] * 0.8).blend(o0,0.5).out()
Sitting in the audience for the 3D piece Ink, I was shocked at how particles of the liquid drops gather and disperse into flashing vapor until they gather back to form the calligraphy characters standing in solid shape. They gather, disperse, and form different routes of writing and shapes.
The writer says that, like the two hangers of coats left in his studio, “Kurokawa organizes his work according to two conceptual hangers: his most widely synaesthesia and the deconstruction of nature”. Different from how writers mix different senses as an expression, I'm really intrigued by how Kurokawa “syncs” the visuals and audio together to create the satisfaction of matching and the perfection of synchronization, while still going beyond patterns and the comfort of being organized. In his “Octfall”, I felt fulfilled seeing the visual switch from one screen to another, zooming in its size, with a bass drum beat and a glitch. There's an “oh wow” feeling running through my body as I see how the visual change and the audio change synchronize in a way we don't initially recognize.
Kurokawa's notion of the “deconstruction of nature” reads like a disorienting feeling generated through the visuals' changes and the defamiliarization of naturalistic objects and concepts we touch in our everyday life. He leaves time for suspension, repetition, and sudden shifts of pace; it's like the objects and elements depicted in his pieces go through a transformation as the audio triggers the changes. I don't really know how he builds up these changes, but just watching them, not even in real life, fills me with awe. It feels like he's tearing nature apart by distorting it a little; as he proceeds with one little twist after another, it explodes into a different world.
The precision involved in using tech to create art is so high. With different software systems to trial and cables carried through water vessels, I can only imagine the hardship of installing such a sophisticated art piece. It must be a cool yet stressful job to handle these installations.
Halfway through the reading, I realized I should turn on some beats to understand what “microtiming” actually feels like.
I never really thought about music through the lens of rhythm before. Whether in piano notes or in the lyrics of operas and pop songs, I didn't realize how the beats, the tempo, and the rhythm magically tap and lead my body around the space. Music has always been a mystical yet intimidating realm for me. I feel its magical power in sudden shifts of atmosphere, mood, emotion, and dynamic human interaction, but being able to “know music” or “create music” has always seemed tightly linked to talent and gift. It seems like you need to learn piano for nine years, or guitar, or know all the instruments, to actually create something that can be called a piece of music.
This reading, though, made me realize it's actually not impossible for someone like me to try something. I don't have a drum set, so I clapped and stomped to recreate and emphasize the magical “backbeat” following the author, and wow, it's actually so simple to create something you can vibe with. I really appreciate how the reading brings music down from the podium of “the blue hall” for me. With a table, a chair, and hand claps, it's actually not that hard to create a rhythm by following some guidance on beat patterns.
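That clapped-and-stomped backbeat can be sketched in a couple of lines of TidalCycles (a minimal sketch, assuming the default `bd` and `cp` SuperDirt samples): the kick lands on beats 1 and 3, and the clap marks the backbeats, 2 and 4.

```haskell
-- backbeat sketch: kick on beats 1 and 3, clap on the backbeats 2 and 4
d1 $ stack [ s "bd ~ bd ~"
           , s "~ cp ~ cp" ]
```

Muting the `bd` layer and clapping along to just the `cp` pattern is a quick way to feel where the backbeat sits.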
Groove music is very improv-based: testing a sudden variation and deviating from the original pattern by layering it with another beat or tone brings so much more texture. It's like improv in theater, where we want unexpected tensions and events. Something needs to happen. And I realize it's very important in music too. The reading says that the “variety of expressive timing against an isochronous pulse contains important information about the inner structure of the groove.” It's the patterns that don't go as expected that bring the human aspect into music.
With a live coding platform, we can recreate this delayed beat or sudden deviation in our programming language and bring the human body into dance moves.
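A rough TidalCycles sketch of that microtiming idea (my own example, not from the reading): push one layer slightly off the grid with `nudge`, or swing events against a straight pulse with `swingBy`.

```haskell
-- a straight hi-hat pulse against a slightly delayed clap layer:
-- the claps land a touch late, giving a laid-back feel
d1 $ stack [ s "hh*8"
           , s "~ cp ~ cp" # nudge 0.02 ]

-- or swing the hats themselves: shift every second eighth note
d2 $ swingBy (1/8) 8 $ s "hh*8"
```

Small `nudge` values (on the order of tens of milliseconds) keep the groove human; large ones start to sound like a mistake, which is sometimes the point.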
Real-time generation of code becomes part of an aesthetic sense, part of an improvisational performance, like coming up with a plot, a movement, or a piece of music on the spot in a theatrical improv sense. Our code replaces the bodies used in performance, and to me it feels like a constant reminder of the creator's existence.
I find it so interesting how the author says that “coding and everyday life are drawn together in ways that become imperceptible”. When I'm using technological devices, I find myself judging whether a piece of software is user-friendly by how much technical exposure its designer allows, and I tend to consider user interfaces that contain code as revealing the “backstage of a performance”, like an error that is not supposed to appear.
Revealing the code thus brings a sense of defamiliarization that tears down the whole illusion of the stage setting, exposing that it's a show and that this is how we get it to work. Live coding becomes such a fun and pure act that doesn't serve a commercial purpose like big tech. It's like showing the camera operator and the whole tech crew behind a film.
Yet one more thing that struck me is the “improvise” aspect of live coding, and the spontaneity of it. “A live coding musician is like an improvising composer, able to transform the whole structure of the piece with a few keystrokes.” And YESS, I find it so cool that the computer programs I'd learned to see as “dull and systematic” can be transformed into an instrument with no predetermined behavior, where our emotions and impulses have full control. We're allowed to be present through our computers and to live and feel something real with them.
Exciting!