I started this project with one goal in mind: incorporating one of my favorite memes—the “Is that hyperpigmentation?” meme that’s been going viral recently. I didn’t have a specific or fixed vision from the start; instead, I approached it by experimenting over and over again until I found something that felt right.
One thing I really love is experimental music production. Lately, I’ve been listening to a lot of NewJeans and NMIXX, and I wanted to try a “switch-up” style beat—where the entire vibe of the music shifts suddenly while still transitioning smoothly between phases. I’ve composed music like this before, but never through coding.
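That switch-up idea can be sketched with TidalCycles' transition functions. This is a generic illustration rather than part of my final piece, and it assumes the default SuperDirt sample library:

```haskell
-- phase 1: a dark, bass-heavy groove
d1 $ s "bass1*2 [~ bass1]" # gain 1.2

-- phase 2: evaluate later in the performance; xfade crossfades
-- smoothly from whatever d1 was playing into a different vibe
xfade 1 $ fast 2 $ s "casio*4" # room 0.8

-- or switch abruptly: jumpIn' waits 2 cycles, then cuts instantly
jumpIn' 1 2 $ s "glitch*8" # speed 1.5
```

xfade blends the old and new patterns over a few cycles, while jumpIn' waits a set number of cycles and then cuts in one step, which is closer to the abrupt half of a switch-up.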
For this project, I imported the “Is that hyperpigmentation?” and “It is fantastic” lines from the original meme as samples. Initially, I wasn’t planning on incorporating a beat drop, but after last week’s class, I was inspired to experiment with the beat drop example we studied. I wouldn’t necessarily call what I created a traditional beat drop—it’s more of a buildup leading to an underwhelming yet oddly satisfying drop (at least in my opinion). I do think I could have executed the buildup better, as I struggled to align the audio with my initial vision despite multiple attempts. However, I really like how the buildup transitions into the final section.
For the visuals, I wanted them to match the vibe of the music while also conveying emotion. The piece starts with a simple line that moves to the beat, set against a dark background to complement the bass-heavy intro. During this section, I subtly tease the “Is that hyperpigmentation?” and “It is fantastic” samples—just enough to keep the audience intrigued and a little confused. As the composition progresses, I introduce hi-hats, a Casio sample, and glitch effects layered with drums. When the Casio sample speeds up, I color in the lines and shift them to light blue, reinforcing the energy shift.
Next comes the buildup, featuring the phrase “It is fantastic” repeating and intensifying, ultimately cutting out to silence just before the full sample plays. At this point, the sound transitions into an ethereal yet mysterious atmosphere, and the visuals suddenly become vibrant and overwhelming—which I love. I also introduce a simple, looping “It is fantastic,” which enhances the vibe. This is followed by a complex beat sequence before the piece gradually winds down, ending with the full sample playing over silence alongside the “hyperpigmentation” drawing from the original meme.
Tidal Code:
-- Part 1
sTart = do
d16 $ s "~ ~ ~ bass1" # room 5 # legato 1 # gain 1
d2 $ ccv "127 ~ ~ 0 " # ccn "0" # s "midi"
sTart
once $ s "tastic" # gain 0.5
d15 $ s "bleep" # room 0.6 # gain 1
once $ s "pigment" # gain 0.5
d9 $ s "~ cp" # gain 1.2 # room 0.3
boop_boop = do
d1 $ qtrigger $ filterWhen (>= 0) $ fast 4 $ s "casio" <| n (run 2) # room 0.8
d2 $ ccv "0 70 90 127" # ccn "0" # s "midi"
boop_boop
-- Part 2
d3 $ degradeBy 0.01 $ sound "[hh*8, hh!2]" # gain "1.5 0.75 0.75 1.5 0.6 0.9 0.9" # speed 2
glitchity_boop = do
d8 $ qtrigger $ filterWhen (>= 0) $ sound "<glitch:5 glitch:3> bd bd [~ bd]" # gain 2 # room 0.01
d4 $ ccv "<2 127> 30 40 [~ 10]" # ccn "0" # s "midi"
glitchity_boop
-- part 3
tASTIC = do
d3 $ qtrigger $ filterWhen (>=0) $ seqP
[ (0, 1, s "tastic:2*4")
, (1, 2, s "tastic:2*8")
, (2, 3, s "tastic:2*16")
, (3, 4, s "tastic:2*32")
]
# room 0.3
# hpf (slow 4 (1000 * saw + 100))
# speed (slow 4 (range 1 4 saw))
# gain 2
# legato 0.5
d4 $ qtrigger $ filterWhen (>=0) $ seqP
[ (4, 5, s "tastic") ]
# gain 2
# room 0.3
d8 $ silence
tASTIC
hush
-- part 4
d1va_Bo0ts = do
d10 $ qtrigger $ filterWhen (>= 0) $ slow 2 $ s "superzow" >| note (scale "<minor hexSus major>" ("[<-5 -3 -1 1> 0,2,4,8] * <1 8 16 32>") + "[f5,f6,f7]") # legato 1 # lpfbus 1 (segment 1000 (slow 4 (range 100 3000 saw)))
d5 $ struct "[t(1,2) t(2,4) t(4,10) t(10,16)]" $ ccv (segment 16 (slow 1 (range 120 0 saw))) # ccn "1" # s "midi"
d1va_Bo0ts
once $ s "pigment" # room 1.5 # gain 5
d13 $ qtrigger $ filterWhen (>= 0) $ fast 2 $ s "tastic" # gain 3 # room 1.5 # legato 0.5 # gain (range 1 1.2 rand)
d1 $ qtrigger $ filterWhen (>= 0) $ s "[bleep(5,16), cp(1,4), feel(7,8), bass1(9,16)]" # legato 0.2 # gain 2
d1 silence
d8 silence
d10 silence
eNding = do
once $ qtrigger $ filterWhen (>= 0) $ s "tastic" # gain 2
hush
eNding
Dorothy Feaver’s article on Ryoichi Kurokawa introduced me to an artist who thinks like both a scientist and a poet. Kurokawa’s ability to dissect natural phenomena and rebuild them into immersive audiovisual experiences is captivating. While I found the technical aspects of his work slightly intimidating, the underlying themes of synesthesia and “denaturing” nature resonated with me.
The article paints a vivid picture of Kurokawa’s process, from the isolated Berlin studio to the collaborations with his producer. His concerts, like syn_, aim to fuse sound and image, creating a total sensory experience. The concept of “denaturing” nature, using data to reveal hidden structures, is equally intriguing.
What struck me most was Kurokawa’s almost architectural approach to time and space. He’s not just creating art; he’s designing experiences that challenge our perception. The article left me pondering the relationship between technology and nature, and how artists like Kurokawa can help us see the world in new ways. While I’m not sure I fully grasp the technicalities, I’m definitely curious to experience Kurokawa’s work firsthand and get lost in his synthesized sensory landscapes.
LiMuT is a live coding platform that combines both audio and visuals in a web browser. Created by Stephen Clibbery and inspired by FoxDot, it aims to make creative coding more accessible by running entirely in any modern browser with no installation required. This allows users to experiment with generative music and real-time graphics seamlessly, making it a powerful tool for both live performances and creative exploration.
The platform’s strength lies in its easy setup, as it can be run directly from any modern browser at https://sdclibbery.github.io/limut/. While it supports both visuals and audio, there are limited options for integrating the two. Many of the visual tools are fixed, offering little room for customization. Despite this, LiMuT remains a powerful live coding tool that provides flexibility for performance. I decided to dive into the documentation, experimenting with and tweaking parts of the examples to better understand how each element interacts with the others.
LiMuT uses its own custom live coding language, which is primarily based on text-based patterns to generate music algorithmically. It draws some similarities from other live coding languages like TidalCycles and Sonic Pi, but its syntax and structure are unique to LiMuT.
The language itself is highly pattern-driven, meaning you write sequences that define musical events (like notes, rhythms, effects, etc.) and then manipulate these patterns in real-time. The language isn’t directly derived from a general-purpose programming language but instead is designed specifically for musical composition and performance.
Here’s how it stands out:
Pattern-based syntax: You define rhythms, pitches, and effects in a highly modular, shorthand format.
Time manipulation: LiMuT provides commands that adjust tempo, duration, amplitude, and other musical properties in real time.
Easy to navigate: The interface is laid out so that moving between the editor, examples, and documentation is straightforward.
Reading about microtiming made me think differently about rhythm and how much subtle timing variations shape the way we experience music. I’ve always felt that some songs just “hit different,” but I never really considered how small delays in a drumbeat or a slightly rushed note could create that feeling. The discussion on African and African-American musical traditions, especially how groove emerges from microtiming, reminded me of songs that make me want to move even if I’m just sitting still. It’s fascinating how something so precise—down to milliseconds—can make music feel more human.
The idea of being “in the pocket” stood out to me, especially in relation to genres like funk, hip-hop, and R&B, where rhythm feels alive and interactive. I’ve noticed that in a lot of my favorite songs, the backbeat isn’t rigid but slightly laid back, creating that smooth, effortless vibe. It also makes me think about live performances versus studio recordings—sometimes, a live version feels more engaging because it has those natural imperfections that quantized beats remove. This makes me appreciate how rhythm isn’t just about keeping time but about shaping emotion and energy.
This chapter also made me reflect on how technology influences our sense of groove. With so much music being produced digitally, there’s a balance between precision and feel. Some tracks use quantization to sound perfect, but others intentionally keep human-like imperfections to maintain a groove. I’ve noticed how producers add swing to drum patterns in genres like lo-fi hip-hop, recreating the organic feel of live drumming. It’s interesting to see how microtiming isn’t just a technical detail but a crucial part of musical expression, bridging tradition and innovation in ways I hadn’t fully appreciated before.
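Out of curiosity, I sketched what this might look like in TidalCycles (again assuming the default SuperDirt samples): swingBy delays the later events within each slice of the cycle to create swing, and the nudge parameter pushes individual hits off the grid by a few milliseconds:

```haskell
-- swing: delay the off-beats of an 8-step hi-hat pattern
d1 $ swingBy (1/8) 4 $ s "hh*8"

-- microtiming: nudge each hit randomly by up to ~20ms for a looser feel
d2 $ s "bd sn bd sn" # nudge (segment 4 $ range 0 0.02 rand)

-- a laid-back backbeat: push only the snare slightly behind the beat
d3 $ s "bd [~ sn] bd [~ sn]" # nudge "0 0.03 0 0.03"
```

The difference is subtle on its own, but layered against a quantized kick it produces exactly the kind of “in the pocket” feel the chapter describes.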
To me, it feels like a raw, unfiltered conversation with technology—where code isn’t just something you write and execute but something you shape and negotiate with in the moment. It reminds me of DJing or vinyl scratching, where the act of creation is as important as the final output, and every adjustment is part of the performance.
There’s something rebellious about it, too. Most coding environments push precision, control, and pre-planned logic, but live coding thrives on unpredictability, improvisation, and even failure. The screen isn’t just a workspace—it’s a canvas, a stage, an instrument. The audience isn’t just watching; they’re witnessing thought unfold in real time. It challenges the idea that programming has to be hidden, polished, or even “correct.” Instead, it embraces the process, the trial and error, the glitches that become part of the art.
For me, live coding is exciting because it breaks down the usual walls between artist and machine, between logic and emotion. It’s proof that code isn’t just functional—it can be expressive, performative, even poetic. It makes technology feel more human, more alive.