How to Program Electronic Music That Plays as You Code It

Live coders create musical algorithms on stage. Here’s how to become one

Live coding is a type of performance art in which the performer creates music by programming and reprogramming a synthesizer as the composition plays. The synthesizer code is typically projected onto walls or screens for the audience to inspect as they listen to the unfolding sound. Live coding events are sometimes known as algoraves, and at these events it’s common to see visualizations of the evolving music projected alongside the code. Often, these visualizations are created by a second performer manipulating graphics software in tandem with the live coder.

After attending a few algoraves in New York City (musically, the results tend to fall along a spectrum from ambient soundscapes to pounding electronic dance music, with a few detours into more experimental domains), I decided to look a little closer at the software the performers were using. I wanted to see if I could come up with my own hardware spin on creating visualizations. While I’m not yet ready to take to the stage, the results have been fun. I’d recommend that any reader interested in music or sound art try live coding, even if they have no experience playing a traditional musical instrument.

The most popular software for live coding appears to be Sonic Pi. This is an open source project originally created by Sam Aaron for the Raspberry Pi, although it is also available for Windows and macOS. Sonic Pi’s basic interface is a text editor. Apart from some performance-specific buttons, such as for starting and stopping a piece of music, it looks pretty much like any integrated development environment (IDE), in this case for a version of the Ruby language. Like Python, Ruby is an interpreted language that can run interactively. The Ruby-powered Sonic Pi IDE provides a friendly front end to the powerful SuperCollider sound-synthesis engine, which has been used for over two decades as the basis of many electronic music and acoustic research projects.
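To give a flavor of what a performer types into that editor, here is a minimal sketch in the style of Sonic Pi code. In the real IDE, `play` and `sleep` are built-in commands (`play` sounds a MIDI note number or a symbol like `:c4`, and `sleep` waits a number of beats); in this illustrative snippet, `play` is stubbed out so the code runs as plain Ruby and simply reports the notes it would sound.

```ruby
# Stub of Sonic Pi's built-in `play` command, so this sketch runs as
# plain Ruby. It records and prints the notes it would have sounded.
PLAYED = []
def play(note)
  PLAYED << note
  puts "playing MIDI note #{note}"
end

# A C-major arpeggio: 60 is middle C, 64 is E, 67 is G.
# In Sonic Pi you would add `sleep 0.5` between notes to space them
# half a beat apart; timing is omitted in this stubbed version.
[60, 64, 67].each do |note|
  play note
end
```

In the actual Sonic Pi environment, wrapping a phrase like this in a `live_loop` keeps it repeating, and the performer can edit the body and re-evaluate the code while the loop continues to play.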

You could create a piece of music by typing a complete list of notes into the IDE, selecting a software-defined musical instrument plus any desired effects, such as reverb, and just having Sonic Pi play the tones. But this would eliminate the fun at the heart of live coding: a collaboration between performer and computer, in which the performer continually shapes algorithms but leaves the work of actually determining what note to play next up to those algorithms. Sonic Pi takes care of keeping everything in sync so that the music never misses a beat.
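That division of labor can be sketched in a few lines: instead of a fixed note list, the code picks each note at random from a scale, much as a Sonic Pi live loop might with an expression like `(scale :e3, :minor_pentatonic).choose`. The version below is plain Ruby for illustration, with the scale written out as MIDI note numbers; the helper name `next_note` and the seeded random generator are my own choices, not part of Sonic Pi.

```ruby
# E3 minor pentatonic (E, G, A, B, D) as MIDI note numbers.
E_MINOR_PENTATONIC = [52, 55, 57, 59, 62]

# The "algorithm" the performer shapes: here, simply a uniform random
# choice from the scale. A live coder might later reweight it, constrain
# leaps, or swap in a different scale -- all while the music plays.
def next_note(rng)
  E_MINOR_PENTATONIC.sample(random: rng)
end

# Generate an eight-note phrase. Seeding the generator makes the
# sketch reproducible for testing; a performance would not seed it.
rng = Random.new(42)
notes = 8.times.map { next_note(rng) }
puts notes.inspect
```

The performer's edits change the rule; the rule, not the performer, picks the notes, and Sonic Pi's scheduler keeps each choice landing exactly on the beat.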