I chose Sardine because I am more comfortable with Python, and I thought it would be interesting to use it for live coding. It caught my attention because it is relatively new and, especially, because it is an open-source project. I like open-source projects because they let developers build communities and collaborate on ideas, and I wanted to be part of that community while trying Sardine out.

Sardine was created by Raphaël Maurice Forment, a musician and self-taught programmer based in Lyon and Paris. It was developed around 2022 as part of his PhD dissertation in musicology at the University of Saint-Étienne.

So, what can we create with Sardine? You can create music, beats, and almost anything related to sound and musical performance. By default, Sardine uses the SuperDirt audio engine: it sends OSC messages to SuperDirt to trigger samples and synths, which allows for audio manipulation.

Sardine requires Python 3.10 or newer.

How does it work?

Sardine follows a Player and Sender structure.

  • Pa is a player and it acts on a pattern.
  • d() is the sender and provides the pattern.
  • * is the operator that assigns the pattern to the player.

Syntax Example:
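To make this concrete, here is a minimal sketch of the Player syntax, loosely adapted from the Sardine documentation (the sample names 'bd', 'sn', and 'hh' come from SuperDirt's default library, and the exact pattern notation may vary between versions):

```python
# d() builds a pattern for SuperDirt; * hands it to the player Pa.
Pa * d('bd, sn')                 # alternate a kick and a snare

# Keyword arguments map to SuperDirt parameters (playback speed,
# amplitude, ...) and p sets the period between events in beats.
Pb * d('hh', speed=2, p=0.25)

# Stop everything when you are done.
silence()
```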

Isn’t the syntax simple? I found it rather straightforward to work with. After working with SuperDirt, it looks similar and is even easier to understand.

There are two ways to generate patterns with Sardine:

  1. Players, a shorthand syntax built on top of @swim functions
  2. @swim functions themselves

Players can also generate complex patterns, but you quickly lose readability as things become more complicated.
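For example, a denser Player line might look something like the sketch below (the values are illustrative; shape, speed, and amp are standard SuperDirt controls):

```python
# One Player line stacking several patterned arguments at once:
# it still plays, but it is much harder to read at a glance.
Pa * d('bd, hh, sn, hh', speed='1, 2', shape=0.5, amp=0.8, p=0.25)
```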

The @swim decorator allows multiple senders, whereas Players can only play one pattern at a time.
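Here is a minimal @swim sketch, loosely based on the examples in the Sardine documentation (the function name and samples are mine; D() is the SuperDirt sender used inside swim functions, i is the iterator that advances through the patterns, and again() reschedules the function so it keeps running):

```python
@swim
def drums(p=0.5, i=0):
    # Two senders fire on every pass: a kick/snare line and hi-hats.
    D('bd, sn', i=i)
    D('hh', speed=2, i=i)
    # Reschedule drums so it runs again after p beats.
    again(drums, p=0.5, i=i + 1)

# Stop everything with silence() when you are done.
```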

My Experience with Sardine

What I enjoyed most about working with Sardine is how easy it is to set up and start creating. I did not need a separate text editor because I could interact with it directly from the terminal. There is also Sardine Web, where I can write and evaluate code easily.

Demo Video:

My Code:

(The Claude visual is from https://github.com/mugulmd/Claude/.)

What I liked working with Sardine:

  • Easy to set up and get started
  • Very well-written documentation, even though there are not many tutorials online
  • Easy to synchronize with visuals by installing Claude, another open-source tool for synchronizing visuals with audio in a live-coding context; it has a Sardine extension that lets you control an OpenGL shader directly from Sardine

Downsides:

  • Not many tutorials available online

Resources:

While I’m neither a regular listener of Afrobeats nor a musician, the moment I hear it I want to move my body. For me, it’s not the melody but the steady, undeniable pulse that feels so alive. The reading offered an explanation for this: Afro-diasporic music is built on multiple interlocking rhythmic patterns that make it inseparable from dance. Microtiming explains it further: musicians place notes slightly early or late relative to the strict pulse, delivering the musician’s feel to the listeners. This subtle human variance is what turns a static rhythm into a groove. Therefore, one can conclude that the human presence itself makes the music feel alive.

This led me to a fascinating question as we shift in class to creating our own beats. We’re building beats not by striking a drum, but by coding. Once the pattern is programmed, it’s static unless we go back and edit it. So, can a beat made this way ever truly feel alive? Can we even create something as subtle and human as microtiming from a keyboard? The reading’s conclusion points toward an exciting answer. Artists aren’t just trying to perfectly imitate human timing with machines. Instead, they are forging a new continuum between body and technology. Expression no longer comes only from a musician’s hands on an instrument, but from the creative dialogue between human intention and digital process. I believe this is where my experience with live coding would fit in. Performing with my own programmed beats, I realize that making them feel alive doesn’t rely on sound alone. The interaction becomes key: by showing the lines of code that create the rhythm, the audience witnesses the architecture of the groove. This transparency can turn a static pattern into a dynamic, embodied experience.

The reading states that “Live coding is about making software live.” As an aspiring software engineer, my experience with coding has always involved writing an algorithm, testing it, debugging it, and deploying it. Essentially, I have always programmed software to execute tasks that, once deployed, perform the same actions repeatedly for users. However, I realize that throughout my programming experience, I have never once treated the software as a live entity with which I could interact in real time. The software was always pre-programmed, static. Therefore, I find the concept of live coding (where the software feels alive and can be interacted with during a performance) a fascinating way to blend artistic expression with such a rigid field as programming.

To demonstrate this liveness, the reading also points out that live coding is similar to live music performance. The real, social, and spiritual experience of music happens in the moment of performance, with the presence of the musician and the audience. Live coding embodies the same principle: it is a performance where the creation of music, through writing code, happens live, in front of an audience. There is no pre-recorded track; the music is generated in real time from the performer’s actions. Thinking about learning live coding seems a bit intimidating to me at this point. However, I find the idea of learning the algorithms and methods to manipulate my laptop screen to express my ideas truly exciting.