What is Alda?

Alda is a music composition programming language that offers a simple, intuitive syntax for writing musical scores as plain text. It was designed for musicians with no programming experience and programmers with no music experience, providing an accessible way to compose music through code without much prior experience in either.
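
To get a feel for how simple that syntax is, here is a small illustrative example (my own sketch, not taken from Yarwood's documentation) that plays an ascending C major scale on piano:

    piano:
      o4 c8 d e f g a b > c

Each letter is a note, o4 sets the starting octave, the 8 marks an eighth note (following notes reuse the last duration), and > steps up an octave.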

The Creator: Dave Yarwood

Alda was created and developed by Dave Yarwood, a musician and programmer with a background in bassoon performance and music composition. Fed up with graphical music composition tools like Sibelius, Yarwood took matters into his own hands and created a musical programming language. He described existing graphical notation software as feeling more like graphic design software than a music composition environment. What he wanted was a lower-level, distraction-free way to compose music, something with the modularity and flexibility of command-line tools. He began working on Alda in 2012, and the project took about four years to develop.

Why Alda?

The motivation behind Alda stemmed from Yarwood’s desire to make music composition and development easier. He viewed existing graphical notation software as an IDE (Integrated Development Environment) for music; more often than not, he found himself distracted by the interface rather than focused on the composition itself. Alda, by contrast, lets musicians compose scores in simple text, making it easier to focus on musical structure instead of navigating complex UI elements, memorizing hotkeys, and hunting for shortcuts.

By removing graphical distractions, it offers a more direct and expressive way to write music, much like writing sheet music by hand but with the benefits of digital tools and the ability to play back the composition in real time.

Influences and Design Philosophy

While developing Alda, Yarwood explored existing text-based notation systems but found none that exactly fulfilled his needs. However, several languages and frameworks influenced Alda’s design:

  • ABC and LilyPond – Both are markup-based tools designed for generating sheet music, inspiring Alda’s approach to notation.
  • Music Macro Language (MML) – Provided an expressive syntax for defining music programmatically.
  • Clojure – Yarwood leveraged Clojure to hook into MIDI synthesizers, allowing Alda to generate sound from its text-based notation.

How Alda Works

Alda is a hybrid of a markup language and a functional programming language. Composers write musical scores in plain text, which are then parsed and evaluated as Clojure code. This approach keeps the notation flexible, making it easy to compose, edit, and experiment with musical ideas in a streamlined and efficient manner.
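
As a rough sketch of that hybrid character (my own example, not from the post), note events read like markup while attributes such as tempo are written as Lisp-style expressions:

    piano:
      (tempo! 120)        # tempo! (with the bang) applies to the whole score
      o4 c1/e/g           # a C major chord held for a whole note

    violin:
      o5 g4 r8 g8 a4 g4   # each part plays concurrently from the start of the score

Here / stacks notes into a chord, r is a rest, and the parenthesized expression is evaluated rather than treated as notation.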

Demo

Mini composed song

Slide Deck

Microstructures of Feel, Macrostructures of Sound

In this reading, the author mentions that emotions play a huge part in music. He questions how something as complex as emotion can be conveyed through something as simple as a drumbeat. This reminds me of the famous quote:

“To play a wrong note is insignificant; to play without passion is inexcusable.” – Ludwig van Beethoven.

My music mentors over the years have emphasized how important body language is; one of them would always tell us that people don’t come to shows only to listen to music, but also to watch, so your body language has to be performative and in sync with your playing. By mastering the art of physical performance, you make your music better. When musicians allow themselves to be immersed in the music, more often than not they unconsciously start adding accents and decorative elements that enhance the music and give it a richer texture. This applies to each instrument differently; on the drums, it comes through in the motion of the hands and the intensity and speed of contact with the drum.

Another important idea the author raises is the importance of rests. I am once again reminded of a famous quote:

“The music is not in the notes, but in the silence in between” – Wolfgang Amadeus Mozart.

The presence of rests creates rhythm and beats; without rests there would be no music, only noise. Based on my years as an orchestra chair, I can vouch for the importance of rests. If a rest is miscounted, players come in at the wrong time, the beat is off, the accents land on the wrong notes, the emotion evoked in the audience is completely thrown off, and everyone loses their sync. To create a groovy beat and evoke feelings in listeners, you have to pay immense attention to the rests. In the context of percussive beats, rests hold the power to make a beat more or less interesting; shifting a rest slightly can add grooviness.

This makes me wonder how all of this compares to computer-generated music: can we get computers to convey these emotions? By randomizing or deliberately offsetting rests and beats, could we achieve the same results as human-performed music? This raises bigger questions in my mind about whether human emotions are as simple as calculated tricks that can be programmed into a computer.
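
In Alda's terms, the difference is easy to sketch. The example below is my own, and it assumes the General MIDI percussion mapping, under which o2 c triggers a bass drum; in the second bar the rests are rearranged so the second kick lands an eighth note late:

    midi-percussion:
      # bar 1: straight feel, kick on beats 1 and 3
      o2 c4 r4 c4 r4 |
      # bar 2: the same kick pushed onto the off-beat by shifting the rest
      c4 r4 r8 c8 r4

Even this tiny displacement changes how the pattern feels, which is the kind of micro-level choice the author seems to be describing.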

What is live coding?

It’s safe to say that any programmer would agree: coding in front of an audience is incredibly challenging. It’s inevitable that you’ll miss a semicolon or misspell a function name, leading to bugs. My first thought when I heard about live coding was “That sounds insane.” The thought of facing an audience and writing code in real time felt daunting. I didn’t mind the improvisation part, because I’ve done it before as a musician. However, unlike playing an instrument, where mistakes can sometimes blend into the performance, coding errors are glaringly obvious. On the other hand, unlike other kinds of performances, live coding lets performers create something insanely cool from something as boring and abstract as code.

Displaying the code on the screen during the performance fosters a deeper connection between the performer and the audience. Without the code visible, audiences can question the authenticity of the improvisation; they might think the performers are just triggering pre-prepared tracks. By removing that barrier, the performer transforms the experience from a mere spectacle into an interactive, shared journey. In my opinion, the coolest thing about such a show is not the visuals or the audio but witnessing how performers converse with the machine in real time. Live coding not only spotlights that abstract interaction with the machine, but also opens up the conversation about what kinds of art and creative self-expression are possible using the rawest forms of technology, which is actually kind of insane.