Alda is a “text-based music composition programming language” developed by Dave Yarwood in 2012. A software engineer with a background in classical music, Yarwood sought a way to link his two interests and create an accessible audio programming language. The result was Alda, which he designed to give both musicians without programming knowledge and programmers without musical knowledge a simple, unique method of composition.

Alda allows users to write and play music using just the command line and a text editor. The interactive Alda REPL gives immediate feedback as lines of code are typed and entered at the command line. Longer scores can be written in a text file, which can then be fed to and played by Alda (also via the command line). Staying true to this purpose of accessibility, the musical notation is kept simple: notes are represented by their respective letters (a~g), octaves are set and switched using o1~o7, rests are simply r, + and - denote sharps and flats, and so forth.
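For example, a tiny snippet like this (made up for illustration, not from my project) is enough to get a melody out of a piano:

piano:
  o4 c8 d e f+ g4 r8 > c2.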

There is a noticeable lack of a “live” component in Alda compared to the other coding platforms on the list. Alda can only output exactly what it is given, and it does so in a finalized form (the output cannot be edited in real time). The REPL is perhaps a little more interactive in this respect, as the focus is on maintaining a quick exchange of input and output. Alda does, however, support writing music programmatically, since Alda code can be generated by programs in other languages. Even so, Alda itself has no innate functions for algorithmic composition or live coding.
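For instance, here is a hypothetical sketch (the file name and note list are my own inventions) of a few lines of Ruby assembling an Alda score and handing it to the alda CLI:

# hypothetical sketch: generate an Alda score from Ruby, then play it
# (assumes the alda CLI is installed and on the PATH)
notes = %w[c d e f+ g a b].map { |n| "#{n}8" }.join(" ")
File.write("scale.alda", "piano:\n  o4 #{notes} > c4\n")
system("alda", "play", "--file", "scale.alda")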

I personally had a lot of fun experimenting with Alda. Following the tutorial on the Alda website, I was able to mess around with the REPL with ease: the simple code and instant output make it easy to test things out and understand how they work. With a grasp of the commands and methods gained from that session, I was able to put together longer and more complex scores using multiple instruments in text files.

I should note that I do have some classical training from my years of playing cello, and I am familiar with sheet music and musical notation. However, I still struggle with letter notation (A-G), as I learned music through the solmization syllable system (do re mi). This complicated my efforts to make things sound the way I wanted, and it made me feel that one would need at least some knowledge of music theory and notation to really use Alda to its full potential. Regardless, there is no denying that Alda makes composition much less daunting, in both appearance and experience, for people who might not be used to reading or writing sheet music.

For my mini project with Alda, I attempted to recreate the opening portion of the beautiful instrumental for Where the Flower Blooms, a track by rapper and producer Tyler, the Creator. The text file that I wrote up is as follows:

(key-sig! '(a major))

violin "violin-1":
(pp) o5 f1 | f | f | f |
f | f | f | f |

violin "violin-2":
o4 r8 a+ > f < a+ > g < a+ > f < a+ | r a+ > f < a+ > g < a+ > f < a+ |
r g > e < g > f < g > e < g | r f > c < f > d+ < f > c < f |
o5 b8 f e d+ b f e d+ | b f d < c > b f d < c |
a g f c a g f c | a g f c a g f c |

viola:
o4 r8 f b f > c < f b f | r f b f > c < f b f | e a e b e a e | d+ a d+ b d+ a d+ |

cello:
(pp) o3 c1 | c- | b6 d b d b d | b d b d b d |
c1 | c- | >b | b |

piano "piano-1":

(p) o3 r1 | f/>c-/f | <f/a/>f | f/>c-/f |
(mf) d+4./f/b/>d+  d+8/f/b/>d+ d+ <b4. | e/g-/b e/g-/b | c/f/a/>c <c/f/a/>c | c/f/a/>c <c/f/a/>c |
o4 e/g


piano "piano-2":

r1 | r1 | r1 | r1 |
(mf) o3 c2 c4. >c8 | <c-2 c- | <b/>f r8 f4 <b8 | b2/>f <b/>f |
o3 c1

And here is a short video of what it sounds like:

Sonic Pi is a live coding platform created by Sam Aaron at the University of Cambridge Computer Laboratory in 2012 for the Raspberry Pi, a low-cost programmable computer. Since it was originally created to make programming on the Raspberry Pi more accessible, it is beginner-friendly, with a friendly UI and documentation.

Sonic Pi is also popular among live coding DJs and is used by multiple artists. DJ_Dave is one of the artists who actively use Sonic Pi to create electronic music. Below is a live-coded track she created using Sonic Pi.

I also experimented with Sonic Pi and created an edit of a track – Free Your Mind by Eden Burns. I also used some custom samples of bass drums and claps for the beat. The following is the code:

s = [path_to_main_track] # this is the path to the music file
ukgclap = [path_to_sample_dir]
ukghh = [path_to_sample_dir]

use_bpm 124

##| start
in_thread do
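  # start: and finish: are fractions (0-1) of the sample's total length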
  sample s, start: 0.6542207792, finish: 0.7045454545, amp: 0.5, decay: 3
  sleep 8*4
  set :beat, 1
end

live_loop :sbd do
  sample s, onset: 0, amp: 1.6
  sleep 1
end

##| play beat
sync :beat
live_loop :clap do
  sleep 1
  sample ukgclap, "ukgclap_8", amp: 0.7
  sleep 1
end

live_loop :shh do
  sleep 0.5
  sample ukghh, "ukghh_1", amp: 0.7
  sleep 0.5
end

##| intro
in_thread do
  sample s, start: 0.163961039, finish: 0.2142857143, amp: 0.85
  sleep 8*4
  set :mainBeat, 1
end

##| play main
live_loop :main do
  sync :mainBeat
  sample s, start: 0.3022727273, finish: 0.4029220779
  sleep 8*16
end

Here, I chopped up different parts of the track and reordered them over a coded house beat built from the custom samples. The hardest part was getting set and sync to work properly across the threads. I also had to calculate the bars manually to figure out the sleep times. I wonder if there is a better way to do this.
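For reference, here is a minimal sketch of the set/sync pattern that eventually worked for me; the cue name :drop and the built-in :bd_haus sample are stand-ins rather than pieces of my actual edit:

use_bpm 124

# sleep counts beats, so a 4/4 bar is 4 beats at any bpm
in_thread do
  sleep 4 * 4    # wait four bars
  set :drop, 1   # broadcast an event other threads can wait on
end

# sync: makes the loop wait for :drop before its first iteration
live_loop :kick, sync: :drop do
  sample :bd_haus
  sleep 1
end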

While the last block was about halfway through playing, I also executed the following code in another buffer of Sonic Pi:

s = [path_to_main_track] # this is the path to the music file

2.times do
  with_fx :reverb, phase: 0.2, decay: 5, mix: 0.75 do
    sample s, start: 0.6542207792, finish: 0.7045454545, amp: 0.5, decay: 3, rpitch: 1.3
    sleep 8*4
  end
end

It plays the vocal in a somewhat distorted way, as it overlaps with the vocals that were already playing from the main block.

For my research project, I looked into several languages before I stumbled upon LiveCodeLab. This platform, created by the team of Davide Della Casa and Guy John, is a breath of fresh air for anyone wishing to experiment with digital creation without the inconvenience – “anyone” and “without inconvenience” being the key terms. Both creators have backgrounds in Computer Science and, as per my research, have worked at tech giants like Meta, Amazon, and Google. Let’s look deeper into what they’ve developed.

The Concept and The Language

What sets LiveCodeLab apart is its simplicity. The platform is written in JavaScript and runs directly in the browser’s JS runtime. They’ve crafted a language, LiveCodeLang, that’s so straightforward it feels like you’re having a conversation rather than writing lines of code. And the magic happens as you type: 3D visuals start taking shape and sound clips start playing, all in real time. It’s like watching your ideas come to life right before your eyes. Rendering happens at 60 fps, and the sound plays at an adjustable bpm.
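As a taste of the language, here is a tiny sketch pieced together from the site’s tutorials (so the exact commands may differ slightly):

// a spinning stack of boxes, redrawn every frame as you type
fill red
5 times
  rotate
  box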

The Website: livecodelab.net

I was pleasantly surprised to find that LiveCodeLab works right in your browser. No need to download any fancy software or set up complicated environments. Just fire up your browser, head to livecodelab.net, and you’re ready to dive into the world of creative coding. What’s even cooler is that the visuals and sounds keep playing as long as you stay on the tab. They won’t distract you if you change tabs, but will still be there when you come back later.

Cool Features and Areas for Improvement

One thing I love about LiveCodeLab is its accessibility. You don’t need to be a coding whiz to appreciate what it can do. As Guy John puts it,

"You don't need to understand the code to appreciate it, much like you don't need to play guitar to enjoy a guitar performance."

It’s a welcoming space where anyone, regardless of their background, can unleash their creativity and see where it takes them.

On top of that, it’s open-source. That means anyone can contribute to its development and help make it even better. Whether it’s adding more shapes and visuals or finding ways to sync animations with sound beats, the possibilities are endless. Everything about how to build on top of LiveCodeLang is also a part of its GitHub documentation.

As great as the platform is, there are areas that could use some polish. More shapes and visuals would be great, and syncing animations with sound could take things to the next level.

However, even as it is, LiveCodeLab is a shining example of how simplicity and creativity can go hand in hand.

I opted for the live coding platform Improviz. The setup was quite straightforward: it required me to download the necessary files from GitHub onto my computer, launch Improviz via the command terminal, and then code within Improviz’s web-based editor. As I dived into the documentation, it became clear that the platform is primarily designed with beginner users in mind. The syntax it uses is similar to that of Processing or p5.js, focusing primarily on the creation of simple 3D shapes. A feature that particularly stood out to me was the ability to use personal images and GIF animations as textures for the shapes, which allows for unique customization and visual appeal. Improviz also comes with a selection of pre-installed textures and materials that are both visually appealing and add to the creative possibilities. The syntax of Improviz is straightforward and intuitive, making it accessible for beginners, yet it offers enough functions to create amazing live art.

Here’s a simple live art piece I made using Improviz. My idea was to have some geometric shapes with textures and materials changing colors dynamically, and to make them move a little. I did this using move, rotate, and the sine function, which changes with time. The full code is on the left.
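The core of it looks something like this (a simplified sketch of the idea rather than my full code, assuming Improviz’s move, rotate, fill, and cube builtins and its time variable):

background(20, 20, 30)

t = time

move(sin(t), cos(t), 0)
rotate(t, t / 2, 0)
fill(128 + 127 * sin(t), 80, 200)
cube(2)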

Coding was always scary to me. Projects like Mercury make it increasingly intuitive and fun! My research project explores Mercury, a live coding environment created by Timo Hoogland in Max 8. What caught my attention is that Mercury’s syntax is very easy to understand and uses “clear descriptive names” for functions.

Hoogland created Mercury as part of his Masters in Music Design at the HKU University of the Arts Utrecht. His focus was on creating a minimal, user-friendly language for teaching (& creating) algorithmic composition, electronic music, and more. Mercury has two versions: one for the browser, and one running on Max/MSP. I focused on Mercury Playground, the browser version.

The user interface is quite intuitive, and the GitHub documentation is extensive. I was able to quickly and easily learn the basics of this environment and begin creating my own music. The first step I took was to follow the tutorials. They are short, concise, and very informative. From the tutorials and the performance examples, I began building up my own tunes. It was so much fun! Below are the interfaces for both the web version & the Max version.

Here’s a composition I’ve been working on:


set tempo 148
list cutoffs [200 400 700 1000]
list qs [0.3 0.3 0.3 0.3 0.8]

new sample kick_deep time(1/4)
new sample bell time(1/8) gain(0.8) play(0.85)
new sample hat_909_short time(1/16) play(0.95) gain(0.5)

new synth saw time(1/8) fx(filter low cutoffs qs) shape(1 80) note(C3) gain(1.2)
new synth saw time(1/4) fx(filter low cutoffs qs) shape(1 80) note(F4) play(0.65)

set scale minor c

list baseline repeat([8 -2 0 5] 4)
new synth saw note(baseline 0) shape(off) time(1/4) slide(1/16) gain(1) fx(reverb)

A very positive thing I found about Mercury is the ease with which one can add Hydra code and collaborate with others on projects.

GLICOL (an acronym for “graph-oriented live coding language”) is written in Rust. With a bit of visuals and an amazing working environment, let’s get those beats popping… I did not actually master the program, but… we try…

So, important things first,

Where did this begin

Qichao Lan, the developer of this platform, is a computer musician and audio programmer, probably in his early 30s, specializing in Rust and web audio, live coding, and music AI. His inspiration behind Glicol, as described by the University of Oslo (where he did his PhD), is as follows:

“Qichao Lan wanted to make music live coding accessible to anyone. As part of his PhD, he made a music language and an app easily accessible in a web browser.”

Silje Pileberg – RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion

He wanted to make it more accessible for people to understand music, all while making it fun and enjoyable. It is true that today many people surround themselves with music without knowing where it originated; he was trying to connect these two dots. As you read on, you will see that, to me, Glicol tells the story of someone who genuinely wants to teach you something about what he loves. He wants to share his passion, make people connect, innovate, try, create, and, most importantly, understand. He made use of his education and his passion, and he is not much older than we are. Glicol is made using the Rust programming language and JavaScript, and you can use both JS and meta-programming to code within the environment. It’s cool; go play around.

We Innovate

https://glicol.org/

I wanted to talk here about (1) the website, (2) the documentation, (3) the references, and (4) the experience.

Website

Honestly, in my opinion, the website has this welcoming appeal: the colors, the language, the jokes, and the way the different parts of the website are organized all make it very inviting. How often do you see that!

I also really liked that the website lets you play the audio and animation in a square on the page while reading the explanations, so you see the output on the spot, almost like an upgraded version of the “run code” button that would otherwise open a new page to run the code.

Documentation

Glicol has, essentially, the readme (the required documentation), a demo, and a tour on the website.

The tour explains concepts almost as if you were in class listening to your professor, following steps to genuinely understand each of the functions being used and what they mean. It is definitely one of the nicest presentations of these topics that I have seen.

The demo, on the other hand, did not have as much explanation, but it had combinations that could be used to code. It is nice to be able to see how the pieces of code go together, especially as someone new to a live coding environment, or new to music development or programming.

One issue I had with what was on the website is that there simply wasn’t enough of it… I guess the references (see the references section) exist and explain most of the needed concepts, but it still felt incomplete in a way.

References

The references they had were really, really cute, but I wish they could be looked up somewhere other than the console (some document or another), because as fun as it is to read them in the console once, I don’t want to do it every time I forget.

Basically, references are shown in the console in three ways:

  • You type d() to get the keywords available
  • You type h() to get a specific explanation for any keyword
  • You press Alt+D with the cursor on the keyword you need help with, and it will show short help for that keyword

I honestly think Alt+D is an amazing addition; the other two, not so much, if I am being honest.

Experience

After exploration, attempts at understanding, and a failed attempt to replicate a heartbeat using a sine wave, here is what we got…

WebEditor

// Hello world note:
o1: sin 440

// playing around
~o1: speed 4.0 >> seq 60 _ 50_2 _ >> sp \808bd;

~a: choose 60 60 0 0 72 72
~o2: speed 2.0 >> seq 60 ~a _ 48_67 >> sp \blip

~o3: speed 3.0 >> seq _ 10 _ _~a >> sp \808bd

// combining examples with my code
~gate: speed 2.0 >> seq 60 _60 _~a 48;
~a: choose 48 48 48 72 0 0 0
~amp: ~gate >> envperc 0.001 0.1;
~pit: ~gate >> mul 200
~o4: saw ~pit >> mul ~amp >> lpf ~mod 5.0
~mod: sin 2 >> mul 1.3 >> add -0.2

out: mix ~o.. >> plate 0.1

// js example
~x1: seq 60 >> sp \##s()# // random pick
~x2: seq _60 >> sp \##s(0, 20)# // random pick range 
~x3: seq _ _60 >> sp \##s(42)# // select
~x4: seq _60 _ >> sp \##s("in")# // filter word
out2: mix ~x.. >> plate ##1/10#

// basic fm notes 300sin(0.2)+1000
~fm: sin 0.2 >> mul 300.0 >> add 1000.0
out3: sin ~fm >> mul 0.1

// final output
~b: choose 100 60 20 0 70 40
out4: sin 0.6 >> mul 5 >> add -2.4 >> speed 2.1 >> seq ~b _ ~b

Blooper

So-called “live coding,” as declared by the RITMO Centre of the University of Oslo. (https://www.hf.uio.no/imv/english/research/news-and-events/events/disputations/2022/lan.html)

I just wanted to talk about how I got to using GLICOL. I want to say: know the dependencies before you use any open-source software, please.

I first started with Mosaic. Mosaic needs a lot more dependencies than I thought it would, especially for a Windows user. The documentation is also a little hard to follow; if you do not have someone who already knows how to make it work explain it, it is not easy to figure out. After I spent hours between downloads, errors, and 500 different “instructions” websites, the Mosaic ofx project was still not working, and I decided it was not worth the time to keep going with it.

Switching was hard, because Mosaic seemed like a really powerful platform, one with a learning curve, but one that also allows you to play around without fully knowing what is going on. This intro video explains what I am trying to say > Mosaic Intro.


I chose to learn a bit about the LiMuT platform for the research project. It’s a web-based live coding platform built on JavaScript, WebGL, and WebAudio, with a syntax similar to TidalCycles. The platform, according to its creator, Stephen Clibbery, was inspired by FoxDot, a Python-based live coding platform. He started it as a personal project in 2019, and since he has been the sole contributor, the documentation is not as developed as that of other platforms. I reached out to him through LinkedIn to gather more insight into the platform, but he has not responded as of yet.

The strength of the platform is definitely its ease of setup, since any modern browser can run it at https://sdclibbery.github.io/limut/. The platform supports visuals and audio, but there are not many ways to link the two. Many of the visual tools are fixed, and there is not a lot of customization that can be done. However, LiMuT is nonetheless a powerful live coding tool that lets the user perform flexibly. I decided to explore by diving into the documentation and tweaking bits and pieces of the examples, looking into how the pieces interact with each other.