I remembered Shreya asked about the off function I used in my performance. Here it is. You can play with the numbers and it does some very interesting stuff; you are basically offsetting the time.
d3 $ jux rev $ off 0.25 (|+ n 12) $ off 0.125 (|+ n 7) $ n "" # sound "supermandolin" # legato 4
d4 $ ccv (stitch "" 127 0) # ccn 0 # s "midi"
I personally did not like this reading much. I felt it was very abstract, and the speaker, Paul Miller, goes back and forth about improvising and the digital community, giving us so many different examples and ways to think about it that the discussion loses its main focus. Miller talks about magic, film, memory, recordings, economics, and commerce, but never gives a clear sense of how any of it fits into the context of improvisation. There is too much to keep track of in the article, which I believe digresses from the main point, confuses the reader, and makes the dialogue hard to follow. The moderator and Iyer keep trying to bring Paul back and tie this to the idea of the digital community.
Having said that, there are a few lines I found particularly interesting. After reading the article twice, the definition of improvisation that I took away was, “Improvisation is navigating an informational landscape.” This is why learning how to just be in this world is also classified as a primal level of improvisation – quite an interesting idea actually! Putting it into the digital context, Miller says, “For me as an artist, digital media is about never saying that there’s something that’s finished. Once something’s digital, anything can be edited, transformed, and completely made into a new thing.” I never thought about digital media that way. In this case, there is so much scope for improvising in a digital culture. Everything online is a record, and everyone listens to records, which are essentially just documents or files. Improvising with this cascade of information is what converts raw data into something useful, and that is what makes it a form of improvisation in the digital community.
This is the code from my performance the other day! I’m not entirely sure if this is the correct way of making functions in Tidal, but it seemed to work pretty well for me. I made functions for both the sound and the visuals in Tidal so that it would be easier to control both at once. In Hydra, I set it up so that the parameters I wanted to change were linked to the functions I made in Tidal.
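A minimal sketch of that idea (not the actual performance code): a helper defined with let, here called withCC just for illustration, that plays a pattern and sends a control value out over MIDI at the same time:

let withCC pat = stack [ n pat # sound "supermandolin" # legato 4, ccv (segment 16 (range 0 127 (slow 4 sine))) # ccn 0 # s "midi" ]

d1 $ withCC "0 7 12 5"

On the Hydra side, the parameter you want to move can then read the incoming value, for example with something like () => cc[0], assuming the usual MIDI-to-Hydra bridge that exposes a cc array.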
Another thing that could be helpful is writing code on multiple lines at once: just press shift and click where you want to type, and voilà! This was especially helpful when I wanted to add effects to the sound and have them reflected in the visuals.
FoxDot is a Python programming environment created in 2015 by Ryan Kirkbride. It is designed with the aim of helping people combine coding and audio to create live coding performances made up mainly of musical elements.
On the technical side, the installation of FoxDot is fairly simple. All it takes is a few commands in the terminal and an installation step within SuperCollider. A single command then takes you to the user interface of FoxDot within Python.
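For reference, the install steps described in the FoxDot documentation look roughly like this (assuming Python and SuperCollider are already set up):

pip install FoxDot    # in the terminal
# inside SuperCollider: Quarks.install("FoxDot"), then run FoxDot.start
python -m FoxDot    # opens the FoxDot editor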
The language FoxDot adopts actually resembles Python in many ways. It is also built around the concept of patterns, and with the functions it provides for manipulating them, we can easily build patterns for percussion and simple note sequences.
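A couple of lines in the style of the standard FoxDot examples give a feel for this; pluck is one of its built-in synths and play maps characters to drum samples:

p1 >> pluck([0, 2, 4, 7], dur=1/2)    # a simple repeating note pattern
d1 >> play("x-o-")    # kick, hi-hat, snare, hi-hat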
However, it does not have a ready-to-use web interface. With the limited number of functions in the library, there are also limited variations in the rhythms we can make. Moreover, the platform itself has not had a major update for a long time, which means we miss out on a lot of the newer developments in live coding when using it.
“Improviz is a live-coding environment built for creating visual performances of abstract shapes, blurred shades, and broken GIFs.”
When I was choosing a platform from the list, the caption for Improviz mentioned something about how it’s based on using and abusing basic shapes. This caught my attention as I am always interested in seeing how far you can push basic structures to create significantly more complex visuals.
Improviz is built in Haskell and interacts directly with OpenGL. It was developed by David Guy John, who goes by ‘rumblesan’. The platform is considered to be a fork of another platform, LiveCodeLab, which rumblesan also took part in developing. Still, Improviz stands out to users because it is much faster than LiveCodeLab and, most importantly, it offers the option to load custom textures, materials, models, … which gives a lot of room for personalization and experimentation.
It has a web version (you can find it here), but it lacks the ability to load all the custom objects I mentioned previously. Therefore it is recommended to download the latest release from GitHub, run it from the terminal, and use an editor (like Atom) alongside it.
The language used reminds me a lot of p5.js. It has quite a simple syntax where you call functions that either draw shapes to the screen or change those shapes in some way. “By default, the only thing that actually changes in Improviz is the time variable, which tracks the number of seconds that have passed since the program was started.” So bringing the time variable into the code is a good way to get some cool variations and effects.
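A tiny sketch of that idea, using basic functions from the Improviz reference (background, fill, rotate, cube); feeding time into rotate keeps the cube spinning as the seconds tick up:

background(0, 0, 0)
fill(255, 0, 100)
rotate(time)
cube(2)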
Here’s one of the demos I experimented with using Improviz:
I chose Motifn as my live coding platform. It is a platform that focuses on creating “visualized” music by utilizing “quite a lot of third parties”, such as Tone.js, and it has received support from live coding artists, namely Alex McLean and Charlie Roberts, the creators of the live coding software Tidal Cycles and Gibberwocky respectively. Later on, I did find quite a lot of similarities between the language used in Motifn and in Tidal Cycles, perhaps because they share ideas from the same live coding community, even though Motifn itself is built on JS while Tidal Cycles is built on Haskell.
It basically provides two modes for coders to play around with. One is “FUN”, which is for coding on the web, and the other is “DAW”, which is for connecting MIDI to play music. I mainly tried the “FUN” mode as it is more convenient (but definitely more risky lol).
In my exploration, I found that this platform is indeed quite user-friendly, as it contains music examples as well as interactive tutorials for coders to self-study. The tutorials not only teach users how to create melodies from notes and rhythms with drums, but even cover JavaScript basics. It is definitely very helpful for newbies like me. Still, this is not a highly integrated platform like GarageBand on iOS (which I have played around with quite a lot); it still requires users to have some fundamental coding knowledge. Nevertheless, I believe it would not be “live coding” if it did not involve users writing the code themselves.
The UI is also user-friendly. Users can manually adjust the width of the three sidebars based on their preferences or needs. Here, I really want to point out the note tracks this platform provides, which are absolutely one of its best features. Similar to the tracks in GarageBand, the system displays a track for each instrument; users can check each note by clicking on it, and in return, the corresponding part of the code is highlighted by the cursor.
For the practical usage part, I first learned the basic structure of programming on this platform and found that it contains three parts: 1) define a function (like function song()), 2) a “let” declaration for each track (let ss = songStructure({), and 3) return the song. Then I learned the details of how to create melodies and rhythms and how to arrange the music structure.
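Put together, that three-part shape looks roughly like the skeleton below; this is only a sketch based on the tutorial steps above, not verbatim Motifn code, and the track details are left as a placeholder comment:

// 1) define a function for the song
function song() {
  // 2) a "let" declaration per track would go here
  let ss = songStructure({
  })
  // 3) return the song so the platform can play it
  return ss
}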
In terms of the content we can create on this platform, I feel like there are a lot of very interesting effects to be made, for example the “ties” and the “vibrato”, which can trigger very funny sounds. Although it is true that this platform does not provide many preset synths for users to choose from, it is still possible to use it to create absolutely great pieces if you know quite a lot about composition. Like this one: