Design P_2_2_1_01 (Herman Schmidt et al., 2018) from the generative design book caught my eye, and I decided it would be a good base for the work I was planning to make with p5. Abstract patterns like the one seen below felt like a great start to the project. I took the algorithm that draws the patterns and customized the code to respond to MIDI events instead of mouse position.
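As a rough illustration of that change (a minimal sketch only; the handler and parameter names here are my own, hypothetical ones, and the actual project may have wired MIDI differently): where the book's version reads mouseX/mouseY to drive the pattern, a Web MIDI handler can store the latest CC value and let the draw loop read that instead.

// Hypothetical sketch: swapping mouse position for MIDI CC input in p5.js.
let ccValue = 0; // latest MIDI CC value, 0-127

function setup() {
  createCanvas(600, 600);
  // Web MIDI is only available in browsers that support it (e.g. Chrome).
  navigator.requestMIDIAccess().then((midi) => {
    for (let input of midi.inputs.values()) {
      input.onmidimessage = (msg) => {
        const [status, cc, value] = msg.data;
        // 0xB0 = control change on MIDI channel 1
        if ((status & 0xf0) === 0xb0) ccValue = value;
      };
    }
  });
}

function draw() {
  background(255);
  // Where the original used mouseX, drive the drawing density with the CC value.
  let density = map(ccValue, 0, 127, 2, 40);
  for (let i = 0; i < density; i++) {
    line(random(width), random(height), random(width), random(height));
  }
}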

There are four patterns in Tidal. One is a pattern of higher-pitched sounds, another is a lower-sounding pattern where I experimented with distortion using the someCycles function, and the other two patterns control the MIDI output.

Here is a demo of the visuals controlled by TidalCycles:

I drew inspiration from Hans Zimmer’s work on Dune, with its industrial yet desert-like sounds and deep bass. The sounds for the evil side of the track were also inspired by the Sardaukar chant from the epilogue of each of the movies. The hero side draws on the aesthetics of the soundtracks from games I played growing up.

The story being told is of a battle between evil and good, where good wins after the beat drops (hopefully on time..). I have attached a video of the composition.

Looking at the video one more time, I realise that the cursor is visible in the middle of the screen. This is evidence that the p5.js visualization was shown as a video, with no live p5 instance created during the performance.

The background visual is made with p5.js by taking Muse Mind Monitor EEG readings of me playing chess and using the time-series data for each type of brain wave (alpha, delta, theta), which are correlated with different types of emotions. The wave values lost a lot of information and context in the .csv export. The color of each particle is an aggregation of the wave types, while the size is dictated purely by delta waves, which are associated with high stress. The ring in the middle pulses to the heart-rate data from the headband.

The code below was used to generate the p5.js video:

let brainData = [];
let headers = [];
let particles = [];
let currentIndex = 0;
let validData = [];
let heartBeatTimer = 0;
let lastHeartRate = 70; //default heart rate

class Particle {
  constructor(x, y, delta, theta, alpha) {
    this.pos = createVector(x, y);
    this.vel = p5.Vector.random2D().mult(map(delta, -2, 2, 0.5, 3));
    this.acc = createVector(0, 0);

    this.r = map(delta, -2, 2, 50, 255);
    this.g = map(theta, -2, 2, 50, 255);
    this.b = map(alpha, -2, 2, 50, 255);

    this.size = map(abs(delta), 0, 2, 2, 10);
    this.lifespan = 255;
  }

  update() {
    this.vel.add(this.acc);
    this.pos.add(this.vel);
    this.acc.mult(0);
    this.vel.add(p5.Vector.random2D().mult(0.1));
    this.lifespan -= 2;
  }

  applyForce(force) {
    this.acc.add(force);
  }

  display() {
    noStroke();
    fill(this.r, this.g, this.b, this.lifespan);
    ellipse(this.pos.x, this.pos.y, this.size, this.size);
  }

  isDead() {
    return this.lifespan < 0;
  }
}

function preload() {
  brainData = loadStrings("mindMonitor_2025-02-26--20-59-08.csv");
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  colorMode(RGB, 255, 255, 255, 255);

  headers = brainData[0].split(",");
  // filter the dataset for valid rows
  validData = brainData.slice(1).filter((row) => {
    let cells = row.split(",");
    return cells.some((cell) => !isNaN(parseFloat(cell)));
  });

  console.log("Total valid data rows:", validData.length);
}

function draw() {
  //trailing effect for particles
  background(0, 20);

  if (validData.length === 0) {
    console.error("No valid data found!");
    noLoop();
    return;
  }

  let currentLine = validData[currentIndex].split(",");

  let hrIndex = headers.indexOf("Heart_Rate");
  let heartRate = parseFloat(currentLine[hrIndex]);

  if (!isNaN(heartRate) && heartRate > 0) {
    lastHeartRate = heartRate;
  }

  let beatInterval = 60000 / lastHeartRate;

  heartBeatTimer += deltaTime;

  if (heartBeatTimer >= beatInterval) {
    heartBeatTimer = 0;

    let deltaIndex = headers.indexOf("Delta");
    let thetaIndex = headers.indexOf("Theta");
    let alphaIndex = headers.indexOf("Alpha");

    let delta = parseFloat(currentLine[deltaIndex]) || 0;
    let theta = parseFloat(currentLine[thetaIndex]) || 0;
    let alpha = parseFloat(currentLine[alphaIndex]) || 0;

    let particleCount = map(abs(delta), 0, 2, 5, 30);
    for (let i = 0; i < particleCount; i++) {
      let p = new Particle(
        width / 2 + random(-100, 100),
        height / 2 + random(-100, 100),
        delta,
        theta,
        alpha
      );
      particles.push(p);
    }

    currentIndex = (currentIndex + 1) % validData.length;
  }

  push();
  noFill();
  stroke(255, 0, 0, 150);
  strokeWeight(3);
  let pulseSize = map(heartBeatTimer, 0, beatInterval, 100, 50);
  ellipse(width / 2, height / 2, pulseSize, pulseSize);
  pop();

  for (let i = particles.length - 1; i >= 0; i--) {
    particles[i].update();
    particles[i].display();

    if (particles[i].isDead()) {
      particles.splice(i, 1);
    }
  }
}

function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
}

I encapsulated the Tidal code so that I would only have to execute a few lines during the performance, which didn’t go the way I wanted :( . In a way, I feel that when it comes to a live coding performance, having a few refactored functions ready while working with code blocks live gives a better setup for performing and debugging on the fly.

Below is the tidal code:


-----------------hehe----

setcps (135/60/4)

-- Harkonnen type beat
evilPad = slow 8
  $ n (cat [
    "0 3 7 10",
    "3 6 10 13",
    "5 8 12 15",
    "7 14 17 21"
  ])
  # s "sax"
  # room 0.8
  # size 0.9
  # gain (range 0.7 0.9 $ slow 4 $ sine)

-- Sardaukar type beat
evilBass = slow 8
  $ sometimesBy 0.4 (rev)
  $ n "0 5 2 7"
  # s "sax:2"
  # orbit 1
  # room 0.7
  # gain 0.75
  # lpf 800
  # shape 0.3

-- Giedi Prime type beat
evilAtmosphere = slow 16
  $ sometimes (|+ n 12)
  $ n "14 ~ 19 ~"
  # s "sax"
  # gain 0.8
  # room 0.9
  # size 0.95
  # orbit 2

-- Sardaukar chant type pattern, chopped
evilPercussion = slow 4
  $ striate "<8 4>"
  $ n (segment 16 $ range 6 8 $ slow 8 $ perlin)
  # s "speechless"
  # legato 2
  # gain 1.2
  # krush 4

-- Sardaukar chant type pattern 2, chopped
evilVoice = slow 4
  $ density "<1 1 2 4>/8"
  $ striate "<2 4 8 16>/4"
  $ n (segment 32 $ range 0 6 $ slow 8 $ sine)
  # s "speech"
  # legato (segment 8 $ range 2 3 $ slow 16 $ sine)
  # gain (segment 32 $ range 0.8 1.2 $ slow 4 $ sine)
  # pan (range 0.3 0.7 $ rand)
  # crush 8

hush

evilRhythm = stack [
  s "~ arp ~ ~" # room 0.5 # krush 9 # gain 1.2,
  fast 2 $ s "moog2 moog2 moog2 moog3 moog:1 moog2" # room 0.7 # krush 5,
  fast 4 $ s "wobble8" # gain 0.8 # lpf 1200
]

d1 $ evilRhythm
hush

-- Build-up to drop
evilBuildUp = do {
  d1 $ qtrigger $ filterWhen (>=0) $ seqP [
    (0, 1, s "moog:1*4"), (1, 2, s "moog:1*8"),
    (2, 3, s "moog:1*16"), (3, 4, s "moog:1*32")
  ] # room 0.3 # krush 9 # lpf (slow 4 (3000*saw + 200));
  d2 $ qtrigger $ filterWhen (>=0) $ seqP [
    (0, 1, s "bass1:7*4"), (1, 2, s "bass1:8*8"),
    (2, 3, s "bass1:9*16"), (3, 4, s "bass1:9*32")
  ] # room 0.3 # lpf (slow 4 (1000*saw + 100)) # speed (slow 4 (range 1 4 saw)) # gain 1.3;
  d3 $ qtrigger $ filterWhen (>=0) $ seqP [
    (0, 4, evilVoice # gain (slow 4 (range 0.8 1.5 saw)))
  ];
  d4 $ qtrigger $ filterWhen (>=0) $ seqP [
    (3, 4, s "crash:2*16" # gain (slow 1 (range 0.7 1.3 saw)) # room 0.9)
  ]
}

d1 $ silence
d2 $ silence
d3 silence
d4 silence

theDrop = do {
d1 silence;
d2 silence;
d3 silence;
d4 silence
}

heroBassDrum = s "[808bd:3(3,8), 808bd:4(5,16)]" # gain 1.3 # room 0.4 # shape 0.4 # krush 9

heroSnare = s "~ sd:2 ~ sd:2" # room 0.6 # gain 1.1 # squiz 0.8 # krush 6

heroHiHats = fast 2 $ s "hh*4" # gain (range 0.8 1.0 $ rand) # pan (range 0.2 0.8 $ slow 3 $ sine) # krush 5

heroTom = s "~ ~ ~ [~ lt:1 lt:2 lt:3*2]" # gain 1.0 # room 0.5 # speed 0.9 # krush 9

heroCymbal = s "[~ ~ ~ crash:4]" # gain 0.95 # room 0.7 # size 0.8

heroicFill = s "[feel:14, [~ ~ ~ feel:64], [~ ~ ~ ~ crash]]"
# gain 1.2
# room 0.7
# speed (range 1.5 0.9 $ saw)
# crush 8

dramaticEntrance = do {
d1 $ s "808bd:3 ~ ~ ~ ~ ~ ~ ~" # gain 1.4 # room 0.9 # size 0.9;
d2 $ s "~ ~ ~ crash:4" # gain 1.3 # room 0.9;
d3 silence;
d4 silence
}

heroPattern = do {
d1 $ stack [
heroBassDrum,
heroSnare,
heroHiHats
];
d2 $ stack [
heroTom,
heroCymbal
] # shape 0.3;
d3 $ s "sine:3" >| note (scale "mixolydian" ("0,4,7") + "c4")
# gain 0.65 # room 0.7;
d4 silence
}

heroExpanded = do {
d1 $ stack [
heroBassDrum # gain 1.4,
heroSnare # gain 1.2,
heroHiHats # gain 1.1
];
d2 $ stack [
heroTom,
heroCymbal # gain 1.1
] # shape 0.3;
d3 $ s "sine:3" >| note (scale "mixolydian" ("0,4,7") + "")
# gain 0.75 # room 0.7 # lpf 3000;
d4 $ s "[~ ~ ~ [feel:6 feel:7]]" # gain 0.9 # room 0.6
}

heroEpic = do {
  d1 $ stack [
    s "[808bd:3(3,8), 808bd:4(5,16)]" # gain 1.4 # room 0.4,
    s "~ sd:2 ~ [sd:2 sd:4]" # room 0.6 # gain 1.2,
    fast 2 $ s "hh*4" # gain (range 0.9 1.1 $ rand)
  ];
  d2 $ stack [
    s "~ ~ mt:1 [~ lt:1 lt:2 lt:3*2]" # gain 1.1 # room 0.5,
    s "[~ ~ ~ crash:4]" # gain 1.0 # room 0.7
  ] # shape 0.4;
  d3 $ s "sine:3" >| note (scale "mixolydian" ("0,4,7,9") + "")
    # gain 0.8 # room 0.8 # lpf 4000;
  d4 $ s "feel:6(3,8)" # gain 0.9 # room 0.6 # speed 1.2
}

evilIntro = do {
d1 $ evilPad;
d2 silence;
d3 silence;
d4 silence
}

evilBuilds = do {
d1 $ evilPad;
d2 $ evilBass;
d3 $ evilAtmosphere;
d4 silence
}

evilIntensifies = do {
d1 $ evilPad;
d2 $ evilBass;
d3 $ evilAtmosphere;
d4 $ evilPercussion
}

d9 $ ccv "0 40 60 120" # ccn "0" # s "midi"

evilFullPower = do {
d1 $ evilPad # gain 0.6;
d3 $ evilVoice;
d4 $ evilPercussion # gain 1.3
}

-- for evil full power
d6 $ s "~ arp ~ ~" # room 0.5 # krush 9 # gain 1.2
d7 $ fast 2 $ s "moog2 moog2 moog2 moog3 moog:1 moog2" # room 0.7 # krush 5
d8 $ fast 4 $ s "wobble8" # gain 0.8 # lpf 1200

evilIntro
evilBuilds
evilIntensifies
evilFullPower

evilBuildUp
theDrop

dramaticEntrance
heroPattern
heroExpanded
heroEpic

hush

Below is the hydra code:

s0.initVideo("/Users/janindu/Documents/liveCoding/p5_demo/p5_js_eegViz.mp4")

src(s0)
  .blend(o1, () => cc[1])
  .out()

src(o0)
  .blend(src(o0).diff(s0).scale(0.99), 1.1)
  .modulatePixelate(noise(2, 0.8).pixelate(160, 160), 1024)
  .out(o1)



src(o0)
  .blend(
    src(o0)
      .diff(s0)
      .scale(() => 0.99 + 0.1 * Math.sin(time * 0.1))
      .rotate(() => cc[1] * 0.1),
    () => 1.1 + cc[1] * 0.5
  )
  .modulatePixelate(
    noise(() => 2 + cc[1] * 3, () => 0.01 + cc[1] * 0.05)
      .pixelate(() => 16 - cc[1] * 8, () => 16 - cc[1] * 8),
    () => 1024 * (1 - cc[1] * 0.5)
  )
  .modulate(
    o0,
    () => cc[1] * 0.3
  )
  .kaleid(() => Math.floor(cc[1] * 4) + 1)
  .colorama(() => cc[1] * 0.1)
  .out(o1)


//---
hush()

I’m really happy something came out of the EEG pipeline, but I honestly feel that that time could have been better spent relying on noise to generate the visuals.

But nevertheless, I’m happy I could (pun intended..) leave my heart and soul on the screen.

Also, this is how it looks from p5.js:

Ryoichi Kurokawa’s work is centered around two concepts: synesthesia and the deconstruction of nature. This is evident in most of Kurokawa’s work – watching ad/ab Atom in fullscreen is an astronomically more fulfilling experience than having it play in a different tab. The structures, images, colors, and flow of Kurokawa’s work exhibit the sense of randomness that we see in nature. It is almost as if Kurokawa holds the reins to this randomness, as his pieces don’t succumb to entropy but hold a meaningful flow.

For example, in ad/ab Atom, Kurokawa starts with microscopic particles and then descends into smaller and smaller scales, down to subatomic particles. Visually, the particle patterns grow more chaotic and sometimes wave-like, yet stay controlled in their presentation. We also hear the main audio backbone of the piece getting sharper, with more intentional compositions being presented.

What intrigues me most about Kurokawa’s work is his control of flow to explore the themes he has in mind. Nature and the environment around you are very hard to control and to use for expressing motifs. Kurokawa states that he does this intentionally when he calls his work time design. Field recordings have proven unpredictable in nature; laying graphics and animation on top of them helps Kurokawa control what the viewer experiences. In my opinion, in most of his work – be it a visual representing a snake shedding its skin, a highway, or even a goldfish – we are looking at a metaphysical (like chaos) or, conversely, a material (like an atom) artifact through different viewpoints or lenses. Kurokawa’s work demonstrates how flow can be used as a tool for artistic expression in an audio-visual piece.

Devine Lu Linvega currently lives on a sailboat and can be found lurking in Discord chat rooms, hinting at possible bugs and easter eggs in his work. He created ORCA, along with its companion programs (such as the synthesizer Pilot, which we will be using), under the design collective Hundred Rabbits, aboard his sailboat.

What is ORCA?

Orca is a visual programming environment for making music. Except there are no words, loops, or long lines of code: the IDE is simply a grid of placeholder characters. The program does not produce sound on its own; instead it emits MIDI events, OSC messages, and UDP packets to communicate what sound should be played. I use Pilot, a synthesizer that Devine’s team created, to produce my sounds. Even then, I could not send MIDI events to Pilot. I could send them to SuperCollider, but that approach did not work, as SuperCollider began picking up persistent background noise through the same port. Hence I used UDP packets with the “;” operator to create sounds.
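For illustration, here is a minimal sketch of that UDP route, assuming Pilot is listening on its default UDP port (49161, which is also ORCA’s default UDP output) and accepts its short text commands; the exact message format should be checked against the Pilot README:

// Hypothetical Node.js sketch: send a note to the Pilot synth over UDP.
// Assumes Pilot's default UDP port (49161) and its short text command
// format (e.g. "84c" = channel 8, octave 4, note C) - check the README.
const dgram = require("dgram");

const socket = dgram.createSocket("udp4");
const message = Buffer.from("84c");

socket.send(message, 49161, "localhost", (err) => {
  if (err) console.error(err);
  socket.close();
});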

How does ORCA work?

The interaction between code and computer flows through “bangs”, a concept borrowed from Max/MSP and Pure Data and their live-programming notion of immediate execution. These bangs differ from the complex blocks of code found in traditional programming languages like C.

The bangs are driven by frames, which are discrete time steps. Each frame represents one “tick” of the sequencer’s clock (its rate is determined by the BPM). Every frame, the grid is updated and evaluated. Uppercase operators in ORCA execute automatically on every frame, ensuring a continuous, rhythmic flow of actions.
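As a toy illustration (my own sketch, not ORCA’s actual engine), the frame model boils down to a loop that re-evaluates the grid at a BPM-derived rate, with something like the clock operator reading the frame counter:

// Toy sketch of ORCA's frame model (illustrative only, not the real engine).
const bpm = 120;
const msPerFrame = 60000 / bpm / 4; // four frames per beat, an assumption

let frame = 0;

// A stand-in for the C (clock) operator: outputs modulo of the frame.
function clockOperator(mod) {
  return frame % mod;
}

setInterval(() => {
  frame++;
  // Every frame the whole grid would be re-evaluated; here we just
  // print the clock output, which in ORCA would drive downstream cells.
  console.log(`frame ${frame}, clock(8) = ${clockOperator(8)}`);
}, msPerFrame);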

ORCA’s IDE draws inspiration from early-2000s games like Dwarf Fortress.

Operators
There are 26 alphabetical operators, A-Z, plus 8 special operators. Some operators familiar from what we have seen in TidalCycles are:
C clock(rate mod): Outputs modulo of frame.
D delay(rate mod): Bangs on modulo of frame.
U uclid(step max): Bangs on Euclidean rhythm.
: midi(ch oct note velocity*): Sends a MIDI note.
; pitch(oct note): Sends a pitch byte out (as a UDP message to the synthesizer).
Also interestingly, the operators N, S, E, and W, which represent the cardinal directions, can be used to send bangs in their respective directions. Devine shows that this works like sending marbles to collide with an object.

The above operators produce bangs; combined with other operators such as add and variables, we can make these bangs come into contact with events that we have created. An example MIDI event is “:94g..”, which, following the midi operator’s (ch oct note) arguments, plays note G in octave 4 on MIDI channel 9.
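To make that layout concrete, here is a hypothetical helper (not part of ORCA, just an illustration of the cell format as described above) that decodes such an event using the (ch oct note velocity) argument order:

// Hypothetical decoder for an ORCA midi event like ":94g.." (illustration only).
function decodeMidiCell(cells) {
  // cells is the text to the right of the ':' operator, e.g. "94g.."
  const [ch, oct, note, velocity] = cells.split("");
  return {
    channel: parseInt(ch, 36), // ORCA reads values in base 36 (0-9, a-z)
    octave: parseInt(oct, 36),
    note: note.toUpperCase(), // 'g' -> note G
    velocity: velocity === "." ? null : parseInt(velocity, 36), // '.' = empty cell
  };
}

console.log(decodeMidiCell("94g..")); // { channel: 9, octave: 4, note: 'G', velocity: null }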

The tool’s roots lie in a misunderstanding Devine had about Tidal. ORCA tries to offer an accessible way to do generative programming without requiring a large programming background. The tool in a way gamifies the experience of making music, in contrast to coding it. As ORCA does not compile, live coders will not face syntax or compilation errors while experimenting with the code.

However, the packaging of ORCA may seem inaccessible to some: the browser-based version requires WebMIDI to be enabled and configured, and the locally run version on Node requires building from the terminal. Furthermore, ORCA is a very “cool” tool to use in front of an audience, as the interface between computer and programmer transcends that of traditional programming languages. From a performance perspective, ORCA can also connect to Unity, enabling interaction with 3D graphics. ORCA represents a paradigm shift in live coding by proving that constraints breed creativity. As Devine Lu Linvega states: “It’s for children. The documentation fits in a tweet”, yet professionals use it for everything from IDM sets to interactive installations. ORCA is both an entry point and an advanced tool in the ecosystem of computational art.

Below is a demo of ORCA in action.

Many ORCA users are technically proficient, employing the tool for solo experimentation with synths or DAWs, yet the platform thrives on community involvement and input. An engaged live coding community collaborates on Discord and Reddit, sharing techniques and directly interacting with creator Devine Lu Linvega, who actively participates in discussions and guides the ecosystem’s evolution through hints and open dialogue.

A central argument of the text is that subtle variations in rhythm, such as varying the position of the snare on a backbeat, are what make music human. The text examines this ‘humanness’ of music as it arises in performance variation, musical expression, and microtiming. This may be why a recording artist such as Ye can make heavily electronic, sample-heavy music that still sounds very human (a caveat to this argument is that Kanye also uses the human voice as an instrument in his tracks).

The interplay of the human element and technology is a key theme throughout the text. While technology can create precise, “robotic” rhythms, the text notes that this absence of microtiming can also be a powerful artistic choice. The deliberate use of “robotic” rhythms can suggest a disembodied, futuristic ideal, or can be a way to signify on technology and history. However, the author also explores ways in which technology is being used to capture and manipulate the nuances of human performance, such as through sampling and the manipulation of recorded sound, in an attempt to retain that human element in music.

A large part of music performance in the modern era is simply the presence of the artist and their interaction with their art. For example, a mainstream artist such as Playboi Carti will most probably recite the ad-libs and dance to his own songs playing in the background. This model of performance as the act of vibing with one’s own work extends to mainstream DJs.

The text shows that the “human” element in music has capacity for expression, which can be either mimicked or intentionally avoided as a stylistic choice.

Software engineering, or computer science as a practice, teaches us to keep code abstracted and encapsulated so it is production ready. While the average college coding assignment lives in one file with a large number of ‘cout << "here";’s or ‘printf("heree2");’s humanising the otherwise cold language of computers, code bases in the industry are surprisingly bland.

Live coding is an attempt to liberate the tools we use to interact with computers. It is also an attempt to regain autonomy from the strict regimes and practices we have established for communicating through computers. In a sense, the Turing Complete User, as defined in the text, diminishes with every iteration that a thought goes through a computer.

Furthermore, the transparency of the creative process in live coding adds to its punk nature, as it combats the encapsulated, abstracted, product-oriented software and code we have become accustomed to. In addition, the novelty of using code, in contrast to other mediums of electronic artistic expression, is largely refreshing.