Hello, here is the code from my live demo today!

This is the code for hydra:
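// cc[] holds the MIDI control values streamed from the Tidal patterns below (normalized to 0-1 in the usual Tidal-to-Hydra MIDI setup)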
voronoi(2,()=> cc[0]*5,0.3).color(2,0,50).out(o0)
src(o0).modulate(noise(()=>cc[0]),0.005).blend(shape(),0.01).out(o0)
hush(o0)
shape(5, .5,.01).repeat(()=> cc[2]*5,()=> cc[2]*4, 2, 4).layer(src(o0).mask(o0).luma(.1, .1).invert(.2)).color(()=> cc[1]*20,()=> cc[1],5).modulate(o1,.02).out(o0)
// .scrollY(10,-0.01)
// .rotate(0.5)
hush()

This is the code for tidal:
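-- ccv sets the control value (0-127) and ccn the controller number; s "midi" streams them out so Hydra can read them as cc[0], cc[1], cc[2]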
d7 $ (fast 2) $ ccv "" # ccn "0" # s "midi"
d1 $ sound "electro1:8*2" # gain 1.4
d2 $ sound "electro1:11/2" # gain 1.8
d3 $ sound "hardcore:8" # room 0.3
d4 $ sound "hardcore:0*8"
d5 $ sound "reverbkick odx:13/2" # gain "1 0.9" # room 0.3 -- odx *2
d6 $ sound "arp" # gain "1.2"
d8 $ ccv (segment 128 (fast 2 (range 127 60 saw))) # ccn "1" # s "midi"
d9 $ ccv "" # ccn "2" # s "midi"
hush
d4 silence

The order I followed is based on the line numbers; a screenshot is attached below!

Hello everyone!

I remembered Shreya asked about the off function in my performance. Here it is. You can play with the numbers and it does some very interesting stuff. You are basically offsetting time: each off layers a delayed copy of the pattern (shifted by 0.25 and 0.125 of a cycle here) and applies the given function to that copy, so the echoes come back transposed up 12 and 7 semitones.

d3 $ jux rev $ off 0.25 (|+ n 12) $ off 0.125 (|+ n 7) $ n "" # sound "supermandolin" # legato 4
d4 $ ccv (stitch "" 127 0) # ccn 0 # s "midi"

I personally did not like this reading much. I felt it was very abstract, and the speaker, Paul Miller, goes back and forth about improvising and the digital community, giving us so many different examples and ways to think about it that the discussion loses its main focus. Miller talks about magic, film, memory, recordings, economics, and commerce, but never clearly ties them back to the context of improvisation. There is too much to keep track of and follow in the article, which I believe digresses from the main point, confuses the reader, and makes the dialogue hard to follow. The moderator and Iyer keep trying to bring Paul back to tie this to the idea of the digital community.


Having said that, there are a few lines I particularly found interesting. After reading the article twice, the definition of improvisation that I understood was, “Improvisation is navigating an informational landscape.” This is why learning how to just be in this world is also classified as a primal level of improvisation – quite an interesting idea actually! Putting it into the digital context, Miller says, “For me as an artist, digital media is about never saying that there’s something that’s finished. Once something’s digital, anything can be edited, transformed, and completely made into a new thing.” I never thought about digital media that way. In this case, there is so much scope for improvising in a digital culture. Everything online is a record and everyone listens to records, which are just documents or files essentially. But improvising with this cascade of information is what converts raw data into useful information – making it a form of improvisation in the digital community.

This is the code from my performance the other day! I'm not entirely sure if this is the correct way of making functions in tidal, but it seemed to work pretty well for me. I made functions for both the sound and the visuals in tidal so that it would be easier to control both at once. In hydra, I set it up so that the parameters I wanted to change were linked to the functions I made in tidal.


Another thing that could be helpful is writing code on multiple lines at once: just press shift and click where you want to type, and voilà! This was especially helpful when I wanted to add effects to the sound and have them reflected in the visuals.


Hope this helps!

//hydra 
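// cc[0], cc[1], and cc[2] are driven by the amount, colour, and wobble patterns in the Tidal code below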

blobs = ()=> osc(()=> cc[2]*30+15, .01, 1).mult((osc(20, -.1,1)).modulate(noise(()=>(cc[0])*5, cc[0])).rotate(1)).posterize(()=> cc[1]*5).pixelate(200,200)

hush()

blobs().out()


--tidalCycles

--------------------------  functions --------------------------
beep_beep_bop = s "arpy*2 arpy?" # n "c5 <g4 c4> a5" -- ? fast
some_beats = s "bd*4" -- g1.2
more_beats = s "hh*8"
deeeep = s "<bass3(5,8) techno:1(3,8)>"
hehe = note "<g5 e5 c5> [<c4 g4> a4] <g4 g5>? c4" # s "sine"
deep = s "techno:1*2 techno:1?" -- krush8
noisy = s "bleep*4" # speed 0.5 -- fast
genocide = note "<g1 e2 c3> [<c1 g1> a2] <g1 g2>? c2" # s "arpy" # squiz 2 # krush 9
-------------- VISUALS -----------------------------------
amount = struct "t*2 t?" $ ccv ((segment 128 (range 127 0 saw))) # ccn "0" # s "midi"
colour = struct  "<t(5,8) t(3,8)>" $ ccv ((segment 128 (range 127 30 saw))) # ccn "1" # s "midi"
wobble = struct "t [t*2] t? t" $ ccv ((segment 128 (range 127 30 saw))) # ccn "2" # s "midi"
--------------------------  functions end ----------------------

hush

d1 $ every 4 (fast 2) $ beep_beep_bop
d2 $ every 4 (fast 2) $ amount
d3 $ deeeep
d4 $ colour
d5 $ more_beats
d6 $ noisy # gain 0.8
d7 $ hehe
d8 $ wobble


hush


FoxDot is a Python programming environment created in 2015 by Ryan Kirkbride. It is designed with the aim of helping people combine coding and audio to create live coding performances that are mainly made up of musical elements.

On the technical side, installing FoxDot is fairly simple. All it takes is a few commands in the terminal plus an installation step inside SuperCollider; a single command then takes you to the FoxDot user interface, which runs on Python.
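For reference, the usual steps look roughly like this (a sketch, assuming Python, pip, and SuperCollider are already installed):

pip install FoxDot            # install the FoxDot package from PyPI
# inside SuperCollider, install the quark and start the sound engine:
#   Quarks.install("FoxDot")
#   FoxDot.start
python -m FoxDot              # launch the FoxDot editor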

The language FoxDot adopts actually resembles Python in many ways. It is also built around the concept of patterns, and with the functions it provides for manipulating them, we can easily build patterns for percussion and simple notes.
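For instance, a couple of lines in that style (a minimal sketch; the notes and rhythm are placeholders I made up):

# a melodic pattern on the built-in pluck synth; dur sets the note length
p1 >> pluck([0, 2, 4, 7], dur=0.5)
# percussion from a character pattern: x = kick, o = snare, - = hi-hat
d1 >> play("x-o-")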

However, it does not have a ready-to-use web interface. With the limited number of functions in the library, there are also limited variations to the rhythms we can make. Moreover, the platform itself has not had any major updates in a long time, which means we miss out on a lot of the newer developments in live coding when using it.

“Improviz is a live-coding environment built for creating visual performances of abstract shapes, blurred shades, and broken GIFs.”
When I was choosing a platform from the list, the caption for Improviz mentioned something about how it’s based on using and abusing basic shapes. This caught my attention as I am always interested in seeing how far you can push basic structures to create significantly more complex visuals.

Improviz is built in Haskell and interacts directly with OpenGL. It was developed by David Guy John, who goes by ‘rumble-san’. The platform is considered a fork of another platform, LiveCodeLab, which rumble-san also took part in developing. Still, Improviz stands out as it is much faster than LiveCodeLab and, most importantly, offers the option to load custom textures, materials, models, and more, which gives a lot of room for personalization and experimentation.

It has a web version (you can find it here), but it lacks the ability to load all the custom objects I mentioned previously. Therefore it is recommended to download the latest release from GitHub, run it from the terminal, and use an editor (like Atom) alongside it.

The language used reminds me a lot of p5.js. It has quite a simple syntax where you call functions that either draw shapes to the screen or change those shapes in some way. “By default, the only thing that actually changes in Improviz is the time variable, which tracks the number of seconds that have passed since the program was started.” So introducing the time variable into your code is a good way to get some cool variations and effects.
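Even something this small already animates (a minimal sketch; cube is just one of the built-in shapes):

rotate(0, 0, time)
  cube(1 + sin(time))

The cube spins continuously while its size pulses with the sine of the clock.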


Here’s one of the demos I experimented with using Improviz:


And, here’s the code I used for it:

background(30)
t = time/2
rectSize = 2

// keep drawing over previous frames instead of clearing the screen
paintOver()

texture(:art4)

// axis offset: ax swings between -3 and 3; each block below pushes along a different axis
ax = sin(time) *3

move(0,0,0.5*ax*0.4)
  rotate(0,0,time*2)
  rectangle(rectSize*sin(t))

move(0,0,-0.5*ax*1.5)
  rotate(0,0,time)
  rectangle(rectSize*sin(t)*0.3)

move(0,-0.5*ax*2,0)
  rotate(0,0,time*0.5)
  rectangle(rectSize*sin(t)*2)

move(0,0.5*ax*4,0)
  rotate(0,0,time*2)
  rectangle(rectSize*sin(t)*1.5)

move(0.5*ax*2,0,0)
  rotate(0,0,time*0.2)
  rectangle(rectSize*sin(t))

move(-0.5*ax*3,0,0)
  rotate(0,0,time*0.4)
  rectangle(rectSize*sin(t)*0.7)


I chose Motifn as my live coding platform. It focuses on creating “visualized” music, builds on quite a lot of third-party libraries like Tone.js, and received support from live coding artists, namely Alex McLean and Charlie Roberts, the creators of the live coding software Tidal Cycles and Gibberwocky respectively. Later on, I did find quite a lot of similarities between the language used in Motifn and in Tidal Cycles, although Tidal itself is built on Haskell while Motifn’s language is JavaScript.

It basically provides two modes for coders to play around with. One is “FUN”, which is for coding on the web, and the other is “DAW”, which connects to MIDI to play music. I mainly tried the “FUN” mode as it is more convenient (but definitely more risky lol).

In my exploration, I found that this platform is indeed quite user-friendly, as it contains music examples as well as interactive tutorials for coders to self-study. The tutorials not only teach users how to create melodies from notes and rhythms from drums, but even cover JavaScript basics. It is definitely very helpful for newbies like me. Still, this is not a highly integrated platform like GarageBand on iOS (which I have played around with quite a lot); it still requires users to have fundamental coding knowledge. Nevertheless, I believe it would not be “live coding” if it didn’t involve users writing the code themselves.


The UI is also user-friendly. Users can manually adjust the size of the three sidebars based on their preferences or needs. Here, I really want to point out the note tracks this platform provides, which are absolutely one of its best features. Similar to the tracks in GarageBand, the system displays a track for each instrument; users can check each note by clicking on it, and in return, the corresponding part of the code is highlighted by the cursor.

For the practical usage part, I first learned the basic structure of programming on this platform and found that it contains three parts: 1) define a song function (like function song ()), 2) a “let” declaration for each track (like let ss = songStructure({ ... })), and 3) return the song, as sketched below. Then I learned the details of how to create melodies and rhythms and how to arrange the music structure.
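Pieced together, the skeleton looks roughly like this (a rough sketch based only on the three parts above, not Motifn’s exact API; the fields inside songStructure are omitted):

function song () {
  // 2) one "let" per track; songStructure takes a configuration object
  let ss = songStructure({ /* sections and tracks go here */ })
  // ... define melodies, drum rhythms, and the arrangement here ...
  // 3) return the assembled song
  return ss
}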

In terms of the content we can create on this platform, I feel there are a lot of very interesting effects to make, for example, the “ties” and the “vibrato”, which can trigger very funny sounds. Although it’s true that this platform doesn’t provide many preset synths to choose from, it is still possible to use it to create absolutely great pieces if you know quite a lot about composition. Like this one:

And here is a demo I made: