I chose Sardine because I am more comfortable with Python, and I thought it would be interesting to use it for live coding. It caught my attention because it is relatively new and, especially, because it is an open-source project. As a developer, I like open-source projects because they let us build communities and collaborate on ideas, and I wanted to be part of that community while trying it out.

Sardine was created by Raphaël Maurice Forment, a musician and self-taught programmer based in Lyon and Paris. It was developed around 2022 (I am not entirely sure of the exact date) as part of his PhD dissertation in musicology at the University of Saint-Étienne.

So, what can we create with Sardine? You can create music, beats, and almost anything related to sound and musical performance. By default, Sardine utilizes the SuperDirt audio engine. Sardine sends OSC messages to SuperDirt to trigger samples and synths, allowing for audio manipulation.

Sardine requires Python 3.10 or above.

How does it work?

Sardine follows a Player and Sender structure.

  • Pa is a Player; it performs a pattern.
  • d() is the Sender; it provides the pattern.
  • * is the operator that assigns the pattern to the Player.

Syntax Example:
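The bullet points above come together in a one-liner like this (a minimal sketch of my own; 'bd' and 'sn' are standard SuperDirt drum samples and p sets the period, assuming SuperDirt is running — check the Sardine docs for exact pattern syntax):

```python
# Pa is the Player, d() is the Sender, * assigns the pattern
Pa * d('bd sn', p=0.5)   # alternate kick and snare every half beat

# stop the player when you are done
silence(Pa)
```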

Isn’t the syntax simple? I found it rather straightforward to work with. Especially after working with SuperDirt, it looks similar and even easier to understand.

There are two ways to generate patterns with Sardine:

  1. Players, a shorthand syntax built on top of @swim functions
  2. @swim functions

Players can also generate complex patterns, but you quickly lose readability as things become more complicated.

The @swim decorator allows multiple senders, whereas Players can only play one pattern at a time.
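A @swim function is a regular Python function that reschedules itself, and inside it you can call several senders at once. A minimal sketch (the D sender and again() recursion follow the pattern shown in the Sardine documentation; the function name and samples here are my own illustration):

```python
@swim
def drums(p=0.5, i=0):
    D('bd sn', i=i)               # first sender: kick and snare
    D('hh', i=i)                  # second sender: hi-hats layered on top
    again(drums, p=0.5, i=i + 1)  # reschedule itself every half beat
```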

My Experience with Sardine

What I enjoyed most about working with Sardine is how easy it is to set up and start creating. I did not need a separate text editor because I can interact with it directly from the terminal. There is also Sardine Web, where I can write and evaluate code easily.

Demo Video:

My Code:

(The Claude visual is used from https://github.com/mugulmd/Claude/)

What I liked working with Sardine:

  • Easy to set up and get started
  • Very well-written documentation, even though there are not many tutorials online
  • Easy to synchronize with visuals by installing Claude (Claude is another open-source tool for synchronizing visuals with audio in a live-coding context. It has a Sardine extension and allows you to control an OpenGL shader directly from Sardine.)

Downsides:

  • Not many tutorials available online

Resources:

Background & Functionality

Gibber is a browser-based live coding environment developed by Charlie Roberts and JoAnn Kuchera-Morin with their team at UCSB as an accessible platform for live music and audiovisual programming. Many existing live coding tools create obstacles for learners: they must be installed, they require specialized programming languages, and their setup procedures are intricate. Gibber addresses these challenges by running entirely in the web browser using plain JavaScript, so users can start coding immediately with nothing to install. Its terse syntax wraps the Web Audio API, letting users build oscillators, FM synthesis, granular synthesis, audio effects, and musical patterns with minimal code. The timing system supports both sample-accurate scheduling and musical time values, and the Seq and Score objects cover everything from freeform musical patterns to organized compositions. Gibber also combines visual rendering with real-time code collaboration, using CRDTs to keep shared editing consistent throughout a performance.
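To give a flavor of that terse syntax, here is a small sketch in Gibber's JavaScript (Synth and the .seq() calls are the same constructs used in my performance code later in this post; 'bleep' is one of Gibber's built-in presets):

```javascript
// sequence notes and drums in musical time
syn = Synth('bleep')
syn.note.seq( [0, 7, 14], 1/8 )   // scale degrees, one every eighth note

kick = Kick()
kick.trigger.seq( 1, 1/4 )        // trigger the kick every quarter note
```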

Gibber’s design makes it effective both as a teaching instrument and as a performance tool. Its simplified syntax lets beginners achieve musical results quickly while still allowing advanced users to experiment with complex ideas. Because it runs in any web browser, it suits classrooms, workshops, and online group performances. The integrated graphics system lets artists create audio-responsive visuals, interactive drawings, and multimedia shows from a single coding environment, while the pattern sequencers, modulation tools, and synthesis options support styles ranging from rhythmic beat-making to experimental sound design. The collaborative features further distinguish Gibber: multiple performers can code together in real time, sharing musical ideas through common code rather than synchronizing audio streams. Altogether, this flexible design makes Gibber both a learning platform and a space to practice and create with others in electronic art.

Gibber Performance

Performance Title: Better Late Than Never

https://drive.google.com/file/d/1YBxah5ff8GHTpRIVB2YY5V2pk-isBFyo/view?usp=sharing

Final Code:

kick = Kick().trigger.seq( 3, 1/2 )

bass = FM('bass.sub').note.seq( 0, 1/32 ).index.seq( 2 )
hat = Hat().trigger.seq( 1, 1/32 )


bass = FM('bass.sub')
  .note.seq( [0,-2,-5,-7,-9,-12], 1/16 )
  .index.seq( Rndf(5,12) )

lead = FM('lead')
  .note.seq( [7,9,12,14], 1/8 )
  .index.seq( Rndf(3,10) )

pad = Synth('pad')
  .chord.seq( [[0,4,7,11]], 1 )

Graphics.quality = 'high'
Graphics.animate = true

Background(Vec3(.1))
Light(Vec3(2,2,3), Vec3(1))

Union2(
  Sphere(1).translate(-1.5,0,0).material('red'),
  Sphere(1).material('blue'),
  Sphere(1).translate(1.5,0,0).material('yellow'),
  Plane().material('yellow')
).render()


Union2(
  Sphere(1).translate(-4.5,0,0).material('red'),
  Sphere(1).translate(-3.5,0,0).material('green'),
  Sphere(1).translate(-2.5,0,0).material('blue'),
  Sphere(1).translate(-1.5,0,0).material('purple'),
  Sphere(1).translate(-0.5,0,0).material('yellow'),
  Sphere(1).translate(0.5,0,0).material('cyan'),
  Sphere(1).translate(1.5,0,0).material('magenta'),
  Sphere(1).translate(2.5,0,0).material('orange'),
  Sphere(1).translate(3.5,0,0).material('white'),
  Sphere(1).translate(4.5,0,0).material('gray'),
  Plane().material('yellow')
).render()

I researched P5LIVE, a collaborative live coding platform for p5.js that runs in the browser. Live coding is an art practice where the process is the performance. You write and change code in real time, and the visuals update immediately, often with the code visible too. P5LIVE was created by Ted Davis, a media artist and educator in Basel, originally for a Processing Community Day event where people wanted to live code visuals during a DJ party.

What I like about P5LIVE is that it treats live coding as a social activity. It lowers friction by running in the browser, and it makes collaboration feel natural through links and shared rooms. It is not just an editor. It is a space where teaching, performance, and experimentation overlap. Instead of coding being private and finished, P5LIVE encourages coding as something collective and ongoing.

P5LIVE’s key feature is COCODING, which works like Google Docs for code. You create a room link, others join, and everyone edits the same sketch together in real time while the visuals run locally for each person. It also includes classroom and performance features like lockdown mode, chat, and SyncData, which lets people share live inputs like MIDI or mouse data with the group. In my demo, I will show the instant feedback loop and a basic COCODING session.
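The instant feedback loop in my demo comes from a sketch like this (a minimal p5.js example of my own; tweaking a value and re-running the sketch updates the visuals immediately):

```javascript
// minimal p5.js sketch for showing P5LIVE's instant feedback
function setup() {
  createCanvas(windowWidth, windowHeight)
}

function draw() {
  background(0, 20)  // low alpha leaves motion trails
  fill(255)
  circle(width / 2 + sin(frameCount * 0.05) * 200, height / 2, 50)
}
```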

LocoMotion is a live coding language designed specifically for dance and choreography. Instead of generating sound or abstract visuals, it controls 3D avatars in a virtual space, allowing performers to write code that directly produces movement. At its core is the dancer function, which creates an avatar that can be positioned using x, y, and z coordinates. Movement phrases are selected with animation, their speed adjusted with dur, and procedural motion can be generated using functions like osc, which creates continuous oscillation. Because the system updates instantly when code is evaluated, choreography can be composed, modified, and improvised in real time.
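Putting those pieces together, a LocoMotion phrase might look roughly like this (a sketch paraphrased from the functions named above; I have not verified the exact property syntax, so treat it as illustrative only):

```
-- create an avatar, oscillate it side to side, and slow the movement phrase
dancer { x = osc 0.25 * 4, animation = "waving", dur = 2 }
```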

Within the wider live coding community, LocoMotion expands the artform into embodied movement. Live coding culture emphasizes transparency, improvisation, and the visibility of code during performance. LocoMotion contributes to this culture by treating code as a choreographed performance or score, positioning programming not as a hidden technical tool, but as an expressive, performative medium that merges computation with dance.

Code

Video Demo

Overview

Punctual is a live coding platform that runs in your browser and handles both audio and visuals. The creator says it was inspired by a way of using SuperCollider called JITLib, but uses a newer and more economical notation. Because it is browser-based, you do not need to install anything, which makes it much more accessible to start experimenting and learning. David Ogborn, the creator, also provides tutorials and guides on YouTube, which make it easier to understand and explore the platform.

Wider Context

Punctual is really accessible because it runs straight in a web browser, so you don’t have to deal with complicated setups like installing SuperCollider or Hydra. The documentation is actually a great starting point; you can take the example code, run it, and see how every parameter affects both the sound and the visuals. This makes it easier to understand what each function does before you start making your own pieces. That’s important because tools that require lots of setup can feel intimidating, but with Punctual, the barrier is lower, which makes live coding feel more approachable and encourages experimentation.

How it Works (+ My Observations)

One thing to note is that there’s no scrolling feature; you have to move the cursor with the arrow keys if your code gets long. Unlike Hydra, where Shift+Enter runs just the current line, in Punctual, Shift+Enter seems to reevaluate all the code in the editor, which is convenient but can feel a bit limiting for live coding because you can’t isolate just one line. Still, the documentation and tutorials are super helpful: you can paste a line, run it, then add another line and run again, and you get a sense of how each parameter affects the audio and visuals as the code is reprocessed.

My Code

osc 440 * lftri 3 * 0.7 >> audio;
[unipolar (1 - sqrt(fx*fx + fy*fy)/0.5) * 0.7, 0, 0] >> add;

saw 33 * 0.25 >> audio;
[unipolar (0.05 / (sqrt(fx*fx + fy*fy) + 0.01)), 0, 0] >> add;

osc [220, 261, 311] * 1.1 >> audio;
[unipolar (lftri 5 * (sin(fx*10) * sin(fy*10))), 0, 0] >> add;

osc [330, 440, 523] * lftri 5 * 0.6 >> audio;
[unipolar (lftri 5 * fx * 5), 0, unipolar (lftri 5 * fy * 5)] >> add;

lpf 120 1 (saw 55) * lftri 0.15 >> audio;
[unipolar (0.5 - abs(sqrt(fx*fx + fy*fy) - 0.3)), 0, 0] >> add;

osc [196, 220, 233] * 1.05 >> audio;
[unipolar (lftri 4 * lftri 3 * 0.35), 0, 0] >> add;

osc 110 * lftri 1.5 * 0.5 >> audio;
[unipolar (lftri 1.5 * 0.5), 0, unipolar (lftri 0.7 * 0.3)] >> add;

osc 87 * 0.3 >> audio;
[unipolar (0.6 - sqrt(fx*fx + fy*fy)), 0, 0] >> add;

Video Demo:

https://drive.google.com/file/d/1RrrvHwDGgeVrWRvA2tjVaBR-KlmWS0Cg/view?usp=drive_link

Sonic Pi was created by Sam Aaron, a programmer and live coding artist who developed it during his post-doctoral research at the University of Cambridge. He created the platform in hopes of making programming more accessible to children. It was developed in conjunction with the Raspberry Pi Foundation as an easy-to-run program that could work on weaker computers, such as the Raspberry Pi.

It began as just an educational tool but evolved into a platform used both for learning and for performing. This dual purpose is one of the things I found different about Sonic Pi: it was developed for a wide range of users, meaning its features must be simple enough for children to operate, but also technical enough for live coding artists.

The program is easy to run, as the only software you need to download is Sonic Pi itself. The above-mentioned philosophy behind its feature set is also why tutorials are built directly into the software.

The program runs similarly to TidalCycles, in that you type code and evaluate it. However, one core difference between Sonic Pi and TidalCycles is that instead of evaluating line by line, each run executes the whole code in the current buffer. Like TidalCycles, though, Sonic Pi doesn’t generate any sound itself; it uses SuperCollider for sound generation, albeit with SuperCollider built into the program when you download it.
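For example, a basic Sonic Pi buffer is built from Ruby-style live_loops, and re-running the buffer hot-swaps the loops in place (live_loop, sample, and sleep are core Sonic Pi commands; :bd_haus and :sn_dolf are built-in samples):

```ruby
# a kick on every beat and a snare on the off-beats
live_loop :kick do
  sample :bd_haus
  sleep 1
end

live_loop :snare do
  sleep 0.5
  sample :sn_dolf
  sleep 0.5
end
```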

Strudel represents a significant evolution in accessible music live coding, porting the sophisticated pattern language of TidalCycles from Haskell to JavaScript for entirely browser-based performance.

Origins and Development

The project emerged in early 2022 when Alex McLean (creator of TidalCycles) began porting Tidal’s pattern representation to JavaScript. Developer Felix Roos discovered this early work and built a complete browser system around it. After intensive collaborative development, Strudel was formally presented at the 2023 International Conference on Live Coding in Utrecht, establishing it within the “Uzulangs” family of Tidal-inspired environments.

Key Differences from TidalCycles

While Strudel faithfully preserves Tidal’s cyclic time model and pattern operations, several distinctions matter:

No installation required. Unlike TidalCycles, which demands Haskell, SuperCollider, and SuperDirt setup, Strudel runs immediately in any modern browser. This dramatically lowers the entry barrier for newcomers and educational contexts.

JavaScript, not Haskell. The syntax feels familiar to web developers, though the underlying pattern concepts remain consistent with Tidal’s approach.
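For instance, a Strudel pattern reads as chained JavaScript method calls on a mini-notation string (note, sound, and lpf are core Strudel functions; this should run as-is at strudel.cc):

```javascript
// four notes per cycle, played with a filtered sawtooth
note("c3 e3 g3 b3").sound("sawtooth").lpf(800)
```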

Flexible output routing. Strudel includes WebAudio synthesis directly, but can also drive MIDI hardware, send OSC to SuperCollider/SuperDirt, connect via WebSerial, or route to CSound, making it adaptable to various workflows.

How It Works

Strudel’s REPL transpiles code into Pattern objects using the Acorn parser and the Escodegen code generator. A scheduler queries these patterns at regular intervals, generating musical events (called “Haps”) while maintaining Tidal’s characteristic approach: events compress into fixed cycle lengths, enabling dense polymetric structures without tempo changes.
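To make the cycle model concrete, here is a toy Python sketch (my own simplification, not Strudel’s actual code) of how querying a pattern over a time span yields Haps whose durations compress so the whole sequence fits one cycle:

```python
from fractions import Fraction

def sequence(*values):
    """Tidal-style: n values share one cycle, each lasting 1/n of it."""
    n = len(values)
    def query(begin, end):
        haps = []
        cycle = int(begin)
        while cycle < end:
            for i, value in enumerate(values):
                onset = Fraction(cycle) + Fraction(i, n)
                if begin <= onset < end:
                    # a Hap: (start, stop, value)
                    haps.append((onset, onset + Fraction(1, n), value))
            cycle += 1
        return haps
    return query

# four events compress into one cycle; eight would also fit in one cycle
pat = sequence("bd", "sn", "hh", "sn")
for begin, end, value in pat(Fraction(0), Fraction(1)):
    print(f"{begin} -> {end}: {value}")
```

Adding more values never slows the pattern down; each event just gets a smaller slice of the cycle, which is what makes dense polymetric layering possible without tempo changes.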

The result is a practical tool for algorave performance, classroom teaching, and studio sequencing that preserves TidalCycles’ creative philosophy while embracing web accessibility.

Explore at strudel.cc.

Short Demo