I have always thought of technology as a source of disconnect: despite its ability to connect people and overcome distances, I believe its modern-day use has led to a detachment from our realities. I have connected with works that use digital media before, but there is still a kind of disconnect in how we experience them, where media can start to feel distant or not entirely real. Reading this challenged that assumption of mine in a way I did not expect. When the text describes how these practices can create expanded or layered forms of awareness, particularly the idea of “seeing at once both inward realities and the outward surfaces of the world,” it made me reconsider whether technology has to produce that sense of separation. Even though I am still unfamiliar with many of the concepts being discussed, several of the examples stood out to me as a representation of what technology is capable of achieving: instead of isolating us from reality, it can create a kind of dual consciousness, where it does not pull us away from the world but allows us to engage with it in a more complex and connected way.

Entering a state where I connect with my work is something I have felt while live coding. During such moments, everything seems to work smoothly, and the creation process is less about exerting effort to craft something and more about bringing to life something that emerges on its own. Until reading the text, I had never seen this phenomenon as relevant to anything beyond the process of coding itself. This realization made me wonder what else can be done with live coding and how its possibilities might extend to creating something that remains open-ended. Rather than serving as a fixed representation of my concept, a creation can remain in progress and continue evolving through time. Although this idea is somewhat unexpected, it also sounds intriguing, since it brings a whole new perspective on what I may create and achieve. The ability to create a state of consciousness for others intrigued me in particular, showing me the possibilities that can come out of live coding.

The article establishes that a performance, once live, goes through either “composition or facilitation,” depending on where the artist places themselves on the continuum between the two. Placing live coding on this continuum poses some conceptual challenges, particularly in defining where liveness and instrumental agency truly reside in such performances. Live coding embodies real-time composition while simultaneously relying on pre-existing computational structures that we often practice and reference, which complicates a clear distinction between creation and execution. Reading the opposing perspectives about liveness and performer activity raised in the article further complicates this distinction, as it becomes unclear whether the value of a performance lies in visible real-time decision-making or in the premeditated design of a cohesive audiovisual experience. While the article had a clear preference, I do find myself wondering whether one is truly more valuable than the other, or whether this hierarchy is itself a product of traditional assumptions about what counts as an “authentic” performance and what doesn’t. To an extent, the distinction feels less like a measure of artistic value and more like a reflection of audience expectations, where immediacy is equated with authenticity. I do agree, though, that revealing the process by displaying the code and the inner workings of the performance as it is being created does build a connection with the audience, even if they do not fully comprehend the technical language shown to them. This visibility reinforces the perception of real-time decision-making, aligning live coding more closely with traditional notions of liveness as active creation rather than mere execution.

We are taught to chase perfection when writing code or creating a program: to write functions that act when things fail, layered with more functions that act when the first layer of defence fails. Over the years, such practices have ensured that users never get a look at the other side of the wall even when things do not go as planned, practically endangering the glitch. I never gave that a second thought until I read Rosa Menkman’s Glitch Studies Manifesto, which revealed an upside to a glitch in a program that I would not have considered before. Describing noise as a “disturbance, break or addition within the signal of useful data” highlighted how it is usually thought of as a useless disturbance we need to get rid of, ignoring its potential to reveal hidden structures and challenge the norms entrusted upon us by modern technology. Menkman uses the glitch as a critique of this move away from noise, which has created a consumerist culture where we constantly crave newer devices that are more deprived of noise than the ones that came before them.

While I do still understand the desire for noiseless devices, this way of thinking did help me appreciate noise more. And with the rising demand for older digital cameras and other vintage devices I see on social media now, I’d say there is an overall shift toward noise in an attempt to reclaim the technological sphere. Artists taking the concept of the glitch and constructing it into their work is another form of reclaiming control over not just technology but also economic and political hierarchies. Taking a glitch and encoding it into a work can be seen as creating structure out of an unstructured phenomenon, where you create “a new protocol after shattering an earlier one”. However, the discussion of how this is not the case for viewers, who still experience glitch art as an unexpected disturbance, highlights how much the world is about perception: where the creator of the work now sees structure, the viewer still experiences the essence of the glitch. Similarly, every interaction with technology, whether an orderly one or one muddled with noise, is shaped by the perception of the person who decides what a perfect program is.

The approach I took to creating the composition was a reverse-engineered build-up: I found sounds and combinations that I enjoyed, and after identifying the ones that sounded cohesive, joined them into a composition. I wanted the work to have a more whimsical, alien-like energy when choosing the sounds, as I noticed several of the dirt samples had a high-pitched noise that went really well together. There was a main part that I thought tied the composition together, which ended up being my A section that repeats. Around it I integrated the other parts, creating an A-B-A-C pattern. I experimented with several of the techniques we learnt in class to find which ones fit my work, which helped a lot in getting the hang of them. The part I focused on the most was creating a visual and audio fade-out, which was initially a transition but ended up being a great way to end the work on a high instead. You can see it at the end, when the screen fades to white with a rising sound matching the rise in brightness in part C.
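The fade-out is driven by a MIDI control value: Tidal ramps cc channel 5 from 0 to 127 while Hydra blends the running scene into a full-screen white shape by the same amount. A minimal sketch of that coupling (Hydra code; it assumes the `cc[]` array is populated by the ccv/ccn MIDI bridge used in the full patch below, and the `osc` chain is just a stand-in for the main visual):

```
// Tidal side (part C): ramp cc channel 5 from 0 to 127 over several cycles
//   d1 $ slow 4 $ ccv (segment 128 $ range 0 127 saw) # ccn "5" # s "midi"

// Hydra side: blend the scene into a full-screen white shape
osc(30, 0.01, 1)                      // stand-in for the main visual chain
  .blend(shape(4, 100), () => cc[5])  // cc[5] near 0 → scene, near 1 → white
  .out(o0)
```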

Hydra Code

render()

shape(100, ()=> 0.65-cc[3])
  .scale(1,1,() => window.innerWidth / window.innerHeight)
  .scroll(0.5,()=> -0.5+cc[1])
  .invert(()=>cc[0])
  .modulateKaleid(voronoi(55))
  .colorama(1.2).posterize(4).saturate(0.7).contrast(6)
  .blend(
    shape(4, 100),()=>cc[5]
  )
  .blend(osc(30,0.0001,0.75)
      .colorama(0.5)
      .mult(osc(20,0.0001,0.2).modulate(noise(3,1)).rotate(0.4))
      .modulateScrollY(osc(2)).modulate(osc().rotate(),.011), ()=>cc[8]
  )
  .rotate(11,()=>cc[6]/127)
  .blend(shape(10,0.5).modulatePixelate(osc(25,0.5),100).colorama(0.5),()=>cc[7])
  .out(o0)

shape(100, ()=> 0.65-cc[3])
  .scale(1,1,() => window.innerWidth / window.innerHeight)
  .scroll(0.5,()=> -0.5+cc[1])
  .invert(()=>cc[0])
  .modulateKaleid(osc(55))
  .colorama(1.2).posterize(4).saturate(0.7).contrast(6)
  .blend(
    shape(4, 100),()=>cc[5]
  )
  .out(o0)

shape(100, ()=> 0.65-cc[3])
  .scale(1,1,() => window.innerWidth / window.innerHeight)
  .scroll(0.5,()=> -0.5+cc[1])
  .invert(() => 1 - cc[0])
  .color(() => 0.2 + cc[2], 0, 1)
  .modulateKaleid(osc(55),0.1,1)
  .blend(
    shape(4, 100),()=>cc[5]
  )
  .out(o1)


shape(100, ()=> 0.65-cc[3])
  .scale(1,1,() => window.innerWidth / window.innerHeight)
  .scroll(0.5,()=> -0.5+cc[1])
  .invert(() => 1 - cc[0])
  .color(() => 0.2 + cc[2], 0, 1)
  .modulateKaleid(osc(55),0.1,1)
  .blend(
    shape(4, 100),()=>cc[5]
  )
  .out(o2)

shape(100, ()=> 0.65-cc[3])
  .scale(1,1,() => window.innerWidth / window.innerHeight)
  .scroll(0.5,()=> -0.5+cc[1])
  .invert(()=>cc[0])
  .modulateKaleid(osc(55),0.1,1)
  .colorama(1.2).posterize(4).saturate(0.7).contrast(6)
  .blend(
    shape(4, 100),()=>cc[5]
  )
  .out(o3)

hush()

Tidal Code

--A
d1 $ stack[
  s "bass1 ~ <bass1 dr_few:2> ~" # room 0.4,
  s "bd bd*4 bd ~" # room 2,
  s "hh*8" # gain 0.75,
  ccv "0 127" # ccn "0" # s "midi",
  ccv (segment 128 $ range 0 127 saw) # ccn "1" # s "midi",
  s "feelfx*8" # note (scale "major" "1 1 3 5"-24) # room 1,
  s "tink:1 tink:2 tink:4 tink:5" # room 2,
  ccv "0 30 60 127" # ccn "2" # s "midi",
  ccv "0 60" # ccn "8" # s "midi"
]

xfade 1 silence

--B
d1 $ stack [
  ccv "60" # ccn "8" # s "midi",
  s "pluck:2 pluck:3",
  ccv "0 127" # ccn "6" # s "midi"
]

d1 $ qtrigger $ seqP [
  (0, 10, ccv "0 127" # ccn "6" # s "midi"),
  (0, 10, ccv "0 0 0 60 60 0 0 0" # ccn "7" # s "midi"),
  (0, 10, ccv 0 # ccn "8" # s "midi"),
  (0, 4, s "feelfx*8" # note (scale "sine" "0 0 0 2 2 0 0 0"-24) # room 0.8),
  (0, 4, s "pluck:2 pluck:3" # distort 0.2 # room 0.8),
  (4, 6, s "feelfx*8" # note (scale "sine" "0 0 0 2 2 0 0 0"-24) # room 0.6),
  (4, 6, s "pluck:2 pluck:3" # distort 0.2 # room 0.6),
  (6, 8, s "feelfx*8" # note (scale "sine" "0 0 0 2 2 0 0 0"-24) # room 0.4),
  (6, 8, s "pluck:2 pluck:3" # distort 0.2 # room 0.4),
  (8, 10, s "feelfx*8" # note (scale "sine" "0 0 0 2 2 0 0 0"-24) # room 0.2),
  (8, 10, s "pluck:2 pluck:3" # distort 0.2 # room 0.2)
]

--A

d2 $ stack[
  s "bass1 ~ <bass1 dr_few:2> ~" # room 0.4,
  s "bd bd*4 bd ~" # room 2,
  s "hh*8" # gain 0.75,
  ccv "0 127" # ccn "0" # s "midi"
]

d1 $ stack[
  ccv (segment 128 $ range 0 127 saw) # ccn "1" # s "midi",
  s "feelfx*8" # note (scale "major" "1 1 3 5"-24) # room 1,
  s "tink:1 tink:2 tink:4 tink:5" # room 2,
  ccv "0 30 60 127" # ccn "2" # s "midi"
]

--C

d2 $ stack [
  "reverbkick!4" # room 1.2,
  "realclaps*8" # room 1,
  fast 8 $ ccv (segment 128 $ range 0 127 tri) # ccn "3" # s "midi"
]


d1 $ qtrigger $ filterWhen (>0) $ seqP [
  (0, 2, s "juno:9" # room 0.2),
  (2, 4, s "juno:9" # room 0.65),
  (4, 6, s "juno:9" # room 0.85),
  (6, 8, s "juno:10" # room 1),
  (6, 10, slow 4 $ ccv (segment 128 $ range 0 127 saw) # ccn "5" # s "midi"),
  (8, 10, fast 2 $ s "juno:9*16" # room 1.2)
]

xfade 2 silence

hush

d1 $ stack [
  ccv 0 # ccn "0" # s "midi",
  ccv 0 # ccn "1" # s "midi",
  ccv 0 # ccn "2" # s "midi",
  ccv 0 # ccn "3" # s "midi",
  ccv 0 # ccn "4" # s "midi",
  ccv 0 # ccn "5" # s "midi",
  ccv 0 # ccn "6" # s "midi",
  ccv 0 # ccn "7" # s "midi",
  ccv 0 # ccn "8" # s "midi"
]

Demo