LocoMotion is a live coding language designed specifically for dance and choreography. Instead of generating sound or abstract visuals, it controls 3D avatars in a virtual space, allowing performers to write code that directly produces movement. At its core is the dancer function, which creates an avatar that can be positioned using x, y, and z coordinates. Movement phrases are selected with animation and their speed is adjusted with dur, while procedural motion can be generated with functions like osc, which creates continuous oscillation. Because the system updates instantly when code is evaluated, choreography can be composed, modified, and improvised in real time.
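To give a rough sense of how these pieces fit together, the sketch below puts a single dancer on stage, drives its horizontal position with an oscillator, and loops one of the built-in movement phrases. It uses only the functions named above (dancer, x, z, animation, dur, osc), but the property-list syntax, the animation index, and the idea that osc takes a rate argument are my own assumptions and may not match current LocoMotion exactly, so treat it as illustrative rather than copy-paste-ready.

    dancer { x = osc 0.25, z = -2, animation = 1, dur = 4 }

Read loosely: osc 0.25 would sweep the avatar's x position back and forth, z places it in depth, animation selects a movement phrase, and dur stretches or compresses how quickly that phrase plays, so re-evaluating the line with different numbers immediately changes the choreography.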

Within the wider live coding community, LocoMotion expands the art form into embodied movement. Live coding culture emphasizes transparency, improvisation, and the visibility of code during performance. LocoMotion contributes to this culture by treating code as a choreographed performance or score, positioning programming not as a hidden technical tool, but as an expressive, performative medium that merges computation with dance.

Code

Video Demo

As someone who listens to Afrobeats almost daily, I found this paper surprisingly affirming of things I’ve felt for a long time. The author’s idea that simple, repetitive patterns can carry an entire universe of expression immediately made sense to me, because that is exactly how Afrobeats works on me as a listener. Many songs use minimal harmony or looping rhythms, yet they never feel empty; instead, the groove itself feels alive.

What stood out most was the discussion of microtiming and how tiny shifts in timing can completely change the emotional feel of a rhythm. When I listen to Afrobeats, I’m often responding less to melody and more to how the beat sits—how relaxed, smooth, or slightly off-grid it feels. The paper helped me realize that this “feel” is not accidental but deeply tied to the body, movement, and cultural listening practices. Reading this made me more aware of why Afrobeats feels so natural and immersive to me.

As a computer scientist, I’m most comfortable coding privately and presenting the finished product afterward. We’re trained to show our best selves: clean code, intentional outcomes, and working solutions. Live coding will challenge that instinct by making the process public, exposing not only what works but also mistakes, hesitation, and uncertainty.

Watching code evolve in real time turns programming into a way of thinking out loud rather than a finalized performance. The messiness becomes part of the work, making software feel alive and human.

I also wonder how much traditional music theory actually feeds into live coding. While theory may shape the structures in the background, live coding seems driven more by responsiveness and experimentation. It feels less about following musical rules and more about negotiating them in the moment. For me, music theory in live coding functions as something flexible—useful when needed, but never fixed—allowing spontaneity and interaction to take the lead.