Group Jam 1:
Group Jam 2:
Solo Jam 1:
Solo Jam 2:
I did not expect a paper about shamanism to make me think about my Hydra and TidalCycles setup, but here we are. Shanken’s framing of technoshamanism as a merger of ancient trance-inducing technologies with digital tools actually maps onto what live coding feels like from the inside. There is something about writing patterns in real time and watching sound emerge from syntax that feels less like programming and more like tuning into something. You are not always fully in control, and that is kind of the point.
The part that stuck with me most was Pauline Oliveros. Shanken describes her relationship to music as bodily and preconscious, and her ideal AI chip as something that could “perceive the spiritual connection and interdependence of all beings.” That is a wild ask for a piece of hardware, but I think I get what she means. Deep listening is never passive but a practice of expanding your attention until the boundaries between you and the sound start to blur.
What I am still sitting with is whether the technology actually enables that expanded consciousness or just simulates it. I am curious, and that gap feels worth exploring further.
Group Jams:
Solo Sessions:
I think the Deadmau5 vs Bailey comparison is a really clever way to frame what live coding is. Deadmau5 is being honest about something a lot of performers pretend is not happening, and there is something refreshing about that. For me the whole point of live coding is that something could go wrong, and that tension is what makes it feel alive. A perfectly pre-planned show with synced lights and video is impressive in its own way, but it does not give me that feeling.
The Bailey section hit different because I actually feel like that philosophy is closer to how I think about using Hydra and TidalCycles. I do not always know what a pattern is going to sound like or what a modulation value is going to do to a visual until I try it, and I think that exploratory quality is what makes it feel like playing an instrument rather than just operating software. I have had some of my best moments in live coding sessions when the code did something I did not expect and I just leaned into it instead of fixing it.
Jam 1:
Jam 2:
Group Jam:
Rosa Menkman’s Glitch Studies Manifesto is an interesting read, though it can feel overwritten at times. That said, one argument genuinely stuck with me: the idea that a glitch is not just a mistake but a moment that exposes what a technology actually is beneath its polished surface. When something breaks, you suddenly see the system, its assumptions, and its limits. I think that framing makes a lot of sense.
Working in TidalCycles and Hydra, I think about this a lot. When I’m live coding and something goes wrong, like a pattern firing off rhythm or a Hydra function producing something completely unexpected, there is this brief moment of panic but also genuine curiosity: what just happened, and why did it do that? Menkman would probably call that the acousmatic quality of glitch, where you are confronted with an output you cannot fully trace back to a source. In my project I was deliberately layering tabla samples with glitchy electronic sounds, and the tension between those two things felt like exactly what she was describing. The interruption becomes the point.
Where I’m less convinced is her claim that intentional glitch art still counts as true glitch. Once you are designing the error, you have already domesticated it. There is a real difference between a glitch that surprises you and one that you planned; the shock is what makes it what it is.
For this composition I created an audiovisual piece using TidalCycles for sound and Hydra for visuals.
The piece is structured into five sections:
intro
buildup
breakdown
beatdrop
outro
Each section is triggered manually in Tidal by evaluating a named do block. The sound is built around tabla samples layered with glitch percussion, with each section adding or removing layers to create a sense of tension and release. The visual side was built in Hydra with four scenes that match the compositional sections (the intro and outro share a scene). I chose tabla specifically because I wanted a sound that felt personal and culturally grounded, and pairing it with glitch electronics felt like an interesting contrast between something organic and something digital.
Composition Video:
Code Snippet:
setcps (130/60/4) -- 130 BPM, 4 beats per cycle

-- each section is a named do block, evaluated manually during the performance
intro = do
  d1 $ slow 2 $ degradeBy 0.6 $ s "tabla" # n (irand 4) # gain 0.8 # room 0.95 # size 0.9 # speed 0.5
  d2 $ degradeBy 0.8 $ s "glitch" # crush 12 # gain 0.6 # pan rand
  d3 $ s "space" # gain 0.5 # room 0.5 # size 0.5 # speed 0.5
  d8 $ ccv 0 # ccn 0 # s "midi" -- tells Hydra to show scene 0

buildup = do
  d1 $ s "tabla*2" # n (irand 8) # gain 1.0 # room 0.6 # speed (choose [1, 1.5, 0.75])
  d2 $ whenmod 4 3 (stutter 2 0.25) $ s "glitch*4" # crush 8 # gain 0.75 # pan rand
  d3 $ every 4 (fast 2) $ s "tabla" # n (irand 8) # gain 0.9 # room 0.8 # speed 0.75
  d4 $ degradeBy 0.8 $ s "glitch" # crush 6 # gain 0.7 # speed (choose [1, 2])
  d8 $ ccv 42 # ccn 0 # s "midi" -- scene 1

breakdown = do
  d1 $ slow 2 $ degradeBy 0.4 $ s "tabla" # n (irand 4) # gain 1.1 # room 0.95 # size 0.9 # speed (choose [0.5, 0.75])
  d2 $ whenmod 8 7 (density 4) $ s "glitch*2" # crush 4 # gain 0.85 # pan rand
  d3 $ silence
  d4 $ every 2 (# speed 0.5) $ s "glitch" # crush 3 # gain 0.9 # room 0.7
  d8 $ ccv 84 # ccn 0 # s "midi" -- scene 2

beatdrop = do
  d1 $ s "tabla*4" # n (irand 16) # gain 1.3 # room 0.2 # size 0.3 # speed (choose [1, 1.5, 2, -1])
  d2 $ whenmod 4 3 (stutter 4 0.125) $ s "glitch*8" # gain 0.9 # crush 6 # speed (choose [1, 2, -1]) # pan rand
  d3 $ every 3 (fast 2) $ s "tabla" # n (irand 16) # gain 1.1 # room 0.9 # size 0.8 # speed (choose [0.5, 1, 2])
  d4 $ whenmod 8 7 (density 4) $ s "glitch" # crush 8 # gain 0.8 # pan rand
  d5 $ fast 2 $ s "tabla*2" # n (irand 8) # gain 1.2 # crush 5 # speed (choose [1, -1, 2])
  d8 $ ccv 127 # ccn 0 # s "midi" -- scene 3

outro = do
  d1 $ degradeBy 0.7 $ slow 2 $ s "tabla" # n (irand 4) # gain 0.7 # room 0.95 # size 0.9 # speed 0.5
  d2 $ degradeBy 0.8 $ s "glitch" # crush 12 # gain 0.5
  d3 $ s "space" # gain 0.4 # room 0.5 # size 0.5 # speed 0.5
  d4 $ silence
  d5 $ silence
  d8 $ ccv 0 # ccn 0 # s "midi" -- back to scene 0

stop_it = do
  d1 $ silence
  d2 $ silence
  d3 $ silence
  d4 $ silence
  d5 $ silence
  d8 $ silence

-- evaluate these one at a time to move through the piece
intro
buildup
breakdown
beatdrop
outro
stop_it
// latest CC values received from TidalCycles (via the d8 "midi" channel)
ccActual = [0,0,0,0,0,0,0,0]
whichVisual = 0

// one scene per compositional section: intro/outro, buildup, breakdown, beatdrop
visuals = [
()=>{noise(2,0.1).color(0.2,0.1,0.5).modulate(osc(4,0.01,0.5),0.2).scale(1,innerHeight/innerWidth).add(src(o0).scale(1.005).brightness(-0.02),0.92).out(o0)},
()=>{shape(()=>Math.floor(Math.sin(time)*4+5),0.3,0.02).scale(1,innerHeight/innerWidth).color(0.8,0.2,1).modulate(osc(20,0.05,2),0.4).rotate(()=>Math.sin(time*0.5)*3).add(src(o1).scale(0.99),0.85).out(o1);src(o1).blend(o0,0.3).out(o0)},
()=>{osc(60,0.1,3).color(1.5,0.2,0.8).pixelate(()=>Math.floor(Math.random()*32+4),()=>Math.floor(Math.random()*32+4)).modulate(noise(8,1.2),0.5).diff(src(o0).scale(1.005)).add(src(o0).brightness(-0.3),0.7).out(o0)},
()=>{noise(()=>Math.sin(time)*8+10,0.4).color(2,0.5,1.5).modulate(osc(30,0.2,1).rotate(()=>time*0.3),0.6).scale(1,innerHeight/innerWidth).add(src(o0).scale(1.008).rotate(0.02).color(1.5,0.8,2),0.92).out(o0)}
]
visuals[0]() // start on the intro scene

// Hydra calls update every frame: switch scenes when a new CC value arrives
update = ()=>{
  if(whichVisual != ccActual[0]){
    whichVisual = ccActual[0]
    visuals[whichVisual]()
  }
}

// listen for CC messages and scale the 0-127 value down to a scene index 0-3
navigator.requestMIDIAccess().then(midiAccess=>{
  midiAccess.inputs.forEach(input=>{
    input.onmidimessage=msg=>{
      const [status,cc,value]=msg.data
      if((status&0xf0)===0xb0){ // 0xb0 = control change
        ccActual[cc]=Math.round(value/127*3)
      }
    }
  })
  console.log('MIDI connected!')
}).catch(err=>console.log('MIDI error:',err))