Generative Sequences

Idea

I came up with this week’s idea immediately and did not deviate from it much: what if a step sequencer had a personality of its own and didn’t just stay set to whatever the user programmed? Some immediate questions arose from the idea, each of which shaped what I ended up implementing:

Q: What personalities might notes (particles) have?

Q: How might notes get added or removed?

Q: What specific implementation challenges might arise from this idea?

For melodic content, I also planned early on to use the Ryukyu scale, a Japanese pentatonic collection built from the scale degrees Do, Mi, Fa, Sol, and Ti. Back in my youth, when I was learning about synthesis, I got a cartridge for my Nintendo DS called Korg DS-10. It is not a video game but rather an emulator of multiple Korg products. One of those emulations was of the Kaossilator, basically a touchpad where one can map musical variables (like pitch and amplitude) to an X-Y axis. The Kaossilator had tons of built-in scales that I experimented with, and for some reason the Ryukyu scale always stuck with me.
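For the curious: spelled from C, those degrees come out to C, E, F, G, B. A minimal sketch of the collection as Tone.js-style note names (the octave is my choice for illustration, not necessarily what the project uses):

```js
// Ryukyu scale spelled from C (Do, Mi, Fa, Sol, Ti).
// The octave here is illustrative only.
const ryukyuScale = ['C4', 'E4', 'F4', 'G4', 'B4'];
```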

Korg DS-10 (image from Amazon)

Implementation

This sketch is a bit more complicated than last week’s, but because I overcame numerous hurdles then (e.g. sequencing, 2D list processing), it all felt manageable. The big challenge in working with p5.js and Tone.js together is managing the large structures that drive both visuals and audio: the two are closely related, but they get called at different times and rates. (I wonder if anyone has come up with an ideal way of arranging code across these two libraries?)

My solution this week was a Particle class with a static variable, Particle.positions, a 2D array of 0s and 1s that is fed into the Tone.Sequence and updated as particles get added and move around the screen. This works well because the audio is drawn directly from the positions of objects on screen, and code in both particle.js and sketch.js has access to the same variable. I also created a Grid class for animating the step sequencer grid, and it proved useful in other ways too: its get_position() function returns pixel coordinates for a given grid position, and its get_radius() function can be used for animating particles.
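The post doesn’t include the source inline, so here is a hypothetical sketch of that shared-state arrangement under my own assumptions (the row/step counts, the C voicing of the scale, and the PolySynth voice are all invented for illustration, not taken from the actual code):

```js
// Hypothetical sketch: Particle.positions is the single source of truth
// that both the p5.js draw loop and the Tone.js sequencer read from.
const NUM_ROWS = 5;    // one row per Ryukyu scale degree (assumed)
const NUM_STEPS = 16;  // columns in the step sequencer (assumed)

class Particle {
  // Static 2D array of 0s and 1s, visible to both particle.js and sketch.js.
  static positions = Array.from({ length: NUM_ROWS }, () =>
    new Array(NUM_STEPS).fill(0)
  );

  constructor(row, step) {
    this.row = row;
    this.step = step;
    Particle.positions[row][step] = 1; // register with the sequencer
  }

  moveTo(row, step) {
    // Clear the old cell and claim the new one as the particle wanders.
    Particle.positions[this.row][this.step] = 0;
    this.row = row;
    this.step = step;
    Particle.positions[row][step] = 1;
  }

  die() {
    Particle.positions[this.row][this.step] = 0;
  }
}

// The sequencer walks the columns; any row holding a 1 triggers its pitch.
const scale = ['C4', 'E4', 'F4', 'G4', 'B4']; // Ryukyu from C
const synth = new Tone.PolySynth(Tone.Synth).toDestination();

const seq = new Tone.Sequence((time, step) => {
  for (let row = 0; row < NUM_ROWS; row++) {
    if (Particle.positions[row][step] === 1) {
      synth.triggerAttackRelease(scale[row], '16n', time);
    }
  }
}, [...Array(NUM_STEPS).keys()], '16n');

// Grid helpers in the spirit of get_position() and get_radius():
class Grid {
  constructor(w, h) {
    this.cellW = w / NUM_STEPS;
    this.cellH = h / NUM_ROWS;
  }
  // Pixel coordinates for the center of a grid cell.
  get_position(row, step) {
    return { x: (step + 0.5) * this.cellW, y: (row + 0.5) * this.cellH };
  }
  // A drawing radius that fits inside one cell (the 0.4 is arbitrary).
  get_radius() {
    return Math.min(this.cellW, this.cellH) * 0.4;
  }
}
```

To actually hear anything, the sequence would still need seq.start(0) plus Tone.Transport.start() after a user gesture, per browser autoplay rules.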



Result

I think this came out pretty well, and it is my most musically satisfying project yet. Because particles die out, the user has to press buttons quickly if they want the music to build; the instrument does not just play on its own but requires constant input, like any real instrument. And because particles lose amplitude as they die, new particles function as accented notes. Particles are added with a skewed random function, so they tend toward lower pitches, which makes the occasional high pitch stand out. I also put some effort into getting the reverb, panning, and amplitude just right.
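As a rough illustration of those last two behaviors (both are my guesses at the technique, not the project’s actual code):

```js
// Skew: squaring a uniform random biases the result toward 0, so lower
// rows (lower pitches, in the voicing sketched above) come up more often.
function skewedRowIndex(numRows) {
  return Math.floor(Math.pow(Math.random(), 2) * numRows);
}

// Decay: if each particle tracks its age in steps, velocity can fall off
// so freshly added particles read as accents. Tone.js takes velocity as
// the fourth argument to triggerAttackRelease(). The decay rate is assumed.
function velocityFor(age, decayPerStep = 0.08) {
  return Math.max(0, 1 - age * decayPerStep);
}
```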

One potential issue is the rate at which things change: notes die quickly and can sometimes move around a lot, which makes it hard to lock into an enjoyable pattern once you happen upon one. If this were meant to emulate, say, a Steve Reich piece, the notes should hardly move at all (perhaps once every couple of measures). But attention spans on the internet are much shorter, so maybe the faster pace is right for the circumstances.

Future Ideas

Depending on how the next few projects go, I may be interested in upgrading this one for the midterm.

Play it. (Supported in Chrome.)

View the source on GitHub.
