I wanted to do something with gifs using p5, but decided to stick to the basics first. In this session I finally tried working with gifs. It was pretty cool, but I still don't understand why this gif ended up leaving marks across the canvas when that never happened with my other gifs.
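One guess about the "marks": p5 only clears the canvas when you ask it to, so if nothing repaints the background each frame, every frame's pixels stay on top of the previous ones and a moving gif smears trails. A minimal sketch of the idea below (this runs in the browser/p5 web editor, not on its own; "myloop.gif" is a placeholder filename, not my actual file):

```javascript
let gif;

function preload() {
  // p5's loadImage() can load animated gifs; "myloop.gif" is hypothetical
  gif = loadImage("myloop.gif");
}

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(220);            // comment this out and the gif leaves trails
  image(gif, mouseX, mouseY); // redraw the gif at the cursor every frame
}
```

So if another sketch drew its gif at a fixed position, the accumulation wouldn't be visible, which might explain why only this one left marks.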
The first thing you notice is that the document itself doesn’t look like a normal academic paper. It’s filled with visual noise, fragmented text, and off-margin layouts that act like a glitch. This non-traditional format perfectly sets up her argument that we shouldn’t just ignore the “black box” of our computers, and it makes the paper itself a demonstration of the glitch aesthetics it describes.
Menkman describes the glitch as an “exoskeleton of progress,” saying these technical interruptions are actually “wonderful” experiences. When a computer fails, we move from a “negative feeling” to an “intimate, personal experience” where the system finally shows its “inner workings and flaws”. Usually, we use technology so fast that it feels transparent, but a glitch breaks that flow and forces us to be “shocked, lost and in awe” of what the machine is actually doing.
One of the most interesting points she makes is that a glitch is “ephemeral”. The second you “name” a glitch or understand how it was created, its “momentum” is gone. It stops being a mysterious rupture and just becomes a new set of conditions or a “domesticated” tool. For Menkman, the real value of computer noise isn’t in fixing it, but in that brief moment of “devastation” where a “spark of creative energy” shows us that the machine can be something more than what it was programmed to be.
I began the composition with a simple drum-and-snare pattern, initially aiming for an Afrobeat feel, but I ended up with something completely different that sort of sounded good to me, so I chose to follow that direction. The structure follows Intro → A → Rise → B → A → B. The composition is organized with timed sequences and multiple channels so I can control individual layers. MIDI patterns drive the Hydra visuals, but this was quite challenging because my entire visual setup is built from videos; getting them to sync with the MIDI pattern was, and still is, very difficult.
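One small thing that helped me think about the sync problem: MIDI values (velocity, CC) arrive as integers from 0 to 127, while visual parameters like playback rate or opacity usually want small continuous ranges. A generic mapping helper like the one below could sit between the MIDI input and the Hydra parameters; this is an illustrative sketch, not the code from my actual setup:

```javascript
// Scale a raw MIDI value (0-127) into an arbitrary [min, max] range,
// clamping out-of-range input first so a stray value can't break a visual.
function midiToRange(value, min = 0, max = 1) {
  const v = Math.min(127, Math.max(0, value)); // clamp to valid MIDI range
  return min + (v / 127) * (max - min);        // linear scale into [min, max]
}

// e.g. drive a video playback rate between 0.5x and 2x from a MIDI CC:
midiToRange(0, 0.5, 2);   // slowest rate
midiToRange(127, 0.5, 2); // fastest rate
```

The nice part of keeping this as a pure function is that the mapping can be tweaked (or made exponential) without touching the MIDI or video code.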
What I find most interesting is Kurokawa’s statement, “Nature is disorder. I like to use nature to create order… I like to denature.” I have always thought of nature as a form of order — ecosystems functioning in balance, patterns repeating, and life sustaining itself without human interference. In contrast, I tend to see human beings as the source of disruption and chaos. That is why his perspective feels so unexpected to me. By describing nature as disorder, he shifts the responsibility of structure and organization onto the artist. It suggests that order does not simply exist waiting to be admired; it can be constructed through interpretation and design.
LocoMotion is a live coding language designed specifically for dance and choreography. Instead of generating sound or abstract visuals, it controls 3D avatars in a virtual space, allowing performers to write code that directly produces movement. At its core is the dancer function, which creates an avatar that can be positioned using x, y, and z coordinates. Movement phrases are selected with animation, their speed adjusted with dur, and procedural motion can be generated using functions like osc, which creates continuous oscillation. Because the system updates instantly when code is evaluated, choreography can be composed, modified, and improvised in real time.
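Based only on the functions named above (dancer, the x/y/z coordinates, animation, dur, and osc), a LocoMotion phrase might look roughly like the sketch below. The syntax here is approximated for illustration, so the real LocoMotion documentation should be consulted before using it:

```
-- one avatar, placed left of centre, drifting continuously along z;
-- "salsa" is a hypothetical animation name, dur stretches its speed
dancer { x = -2, y = 0, z = osc 0.25, animation = "salsa", dur = 2 }
```

Because evaluating a line like this takes effect immediately, changing a single number (say, the osc rate) mid-performance is itself a choreographic decision.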
Within the wider live coding community, LocoMotion expands the artform into embodied movement. Live coding culture emphasizes transparency, improvisation, and the visibility of code during performance. LocoMotion contributes to this culture by treating code as a choreographed performance or score, positioning programming not as a hidden technical tool, but as an expressive, performative medium that merges computation with dance.
As someone who listens to Afrobeats almost daily, I found this paper surprisingly affirming of things I’ve felt for a long time. The author’s idea that simple, repetitive patterns can carry an entire universe of expression immediately made sense to me, because that is exactly how Afrobeats works on me as a listener. Many songs use minimal harmony or looping rhythms, yet they never feel empty; instead, the groove itself feels alive.
What stood out most was the discussion of microtiming and how tiny shifts in timing can completely change the emotional feel of a rhythm. When I listen to Afrobeats, I’m often responding less to melody and more to how the beat sits—how relaxed, smooth, or slightly off-grid it feels. The paper helped me realize that this “feel” is not accidental but deeply tied to the body, movement, and cultural listening practices. Reading this made me more aware of why Afrobeats feels so natural and immersive to me.
As a computer scientist, I’m most comfortable coding privately and presenting the finished product afterward. We’re trained to show our best selves—clean code, intentional outcomes, and working solutions. Live coding will challenge this trained instinct by making the process public, exposing not only what works but also mistakes, hesitation, and uncertainty.
Watching code evolve in real time turns programming into a way of thinking out loud rather than a finalized performance. The messiness becomes part of the work, making software feel alive and human.
I also wonder how much traditional music theory actually feeds into live coding. While theory may shape the structures in the background, live coding seems driven more by responsiveness and experimentation. It feels less about following musical rules and more about negotiating them in the moment. For me, music theory in live coding functions as something flexible—useful when needed, but never fixed—allowing spontaneity and interaction to take the lead.