Code of Music
Melody
Generative Sequences
Idea
I came up with this week’s idea immediately and did not deviate from it much: What if a step sequencer had a personality of its own and didn’t just stay set to whatever the user programmed? Some immediate questions arose from this idea, paired here with the answers I came up with and implemented:
Q: What personalities might notes (particles) have?
- I came up with three personalities chosen randomly from a list at particle birth: ['none', 'horizontal', 'vertical']. As is likely obvious, particles either do not move at all, move up or down (change pitch), or move side to side (change rhythmic position).
Q: How might notes get added or removed?
- I’ve been really into the idea of single-input music making recently. As in my past projects, notes are added with a single button press and positioned with some (targeted) randomness. Notes have “lifespans”: each one plays a certain number of times before becoming inactive, and as they “die” they become quieter.
Q: What specific implementation challenges might arise from this idea?
- For one thing, I needed to make sure that notes never overlap with each other: each one checks whether the position it is about to move to is occupied before it moves. There were also numerous bookkeeping challenges in keeping track of the states of all the particles. (I sketch these behaviors in code just after this list.)
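To make this concrete, here is a minimal, hypothetical sketch of the particle behaviors described above. The class shape, the 5×16 grid size, and names like tryMove and lifespan are my illustrative assumptions, not the project’s actual code:

```javascript
// Hypothetical sketch of the particle behaviors described above.
// Particle.positions is a shared 2D occupancy grid (rows = pitches, cols = steps);
// the 5x16 size is an assumption for illustration.
class Particle {
  static positions = Array.from({ length: 5 }, () => new Array(16).fill(0));

  constructor(row, col) {
    this.row = row;
    this.col = col;
    // Personality is chosen randomly at birth.
    const personalities = ['none', 'horizontal', 'vertical'];
    this.personality = personalities[Math.floor(Math.random() * personalities.length)];
    this.lifespan = 8; // plays this many times before becoming inactive
    Particle.positions[row][col] = 1; // claim this cell
  }

  // Move only if the target cell exists and is unoccupied.
  tryMove(newRow, newCol) {
    if (Particle.positions[newRow]?.[newCol] === 0) {
      Particle.positions[this.row][this.col] = 0;
      Particle.positions[newRow][newCol] = 1;
      this.row = newRow;
      this.col = newCol;
    }
  }

  // Called each time the particle sounds; amplitude fades toward 0 as it dies.
  play() {
    this.lifespan = Math.max(this.lifespan - 1, 0);
    return this.lifespan / 8; // amplitude scalar in [0, 1]
  }
}
```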
For melodic content, I also planned early on to use the Ryukyu scale, a Japanese pentatonic collection containing the notes (Do, Mi, Fa, Sol, Ti). Back in my youth, when I was learning about synthesis, I got a cartridge for my Nintendo DS called Korg DS-10. It is not a video game but rather an emulator of multiple Korg products. One of those emulations was of the Kaossilator, basically a touchpad where one can map musical variables (like pitch and amplitude) to an X-Y axis. The Kaossilator had tons of scales built in that I experimented with. For some reason, the Ryukyu scale always stuck with me.
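For reference, with Do rooted on C (the root here is my assumption), the Ryukyu collection can be written as a simple note array:

```javascript
// Ryukyu scale (Do, Mi, Fa, Sol, Ti) rooted on C4 -- the root choice is my assumption.
const ryukyuScale = ['C4', 'E4', 'F4', 'G4', 'B4'];
```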
Implementation
This sketch is a bit more complicated than last week’s, but because I overcame numerous hurdles last week (e.g. sequencing, 2D list processing), it all felt manageable. The big challenge in working with p5.js and Tone.js is managing the big structures that control visuals and audio; the two are related, but their code is called at different times. (I wonder if anyone has come up with an ideal method of arranging their code with these two libraries?) In this week’s solution, I created a Particle class which has a static variable Particle.positions, a 2D array of 0’s and 1’s that is fed into the Tone.Sequence and updated as particles get added and move around the screen. This is useful because the audio is drawn directly from the positions of objects on screen, and code written in both particle.js and sketch.js has access to this variable. I also created a Grid class for animating the step sequencer grid, but I found it useful in other ways too: its get_position() function returns pixel coordinates based on grid position, and its get_radius() function can be used for animating particles.
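To show how these pieces could fit together, here is a rough sketch building on the Particle and scale sketches above. The Sequence callback and the Grid method bodies are my guesses at the general shape, not the project’s actual code:

```javascript
// Hypothetical glue between the shared Particle.positions grid, Tone.js, and p5.js.
const synth = new Tone.PolySynth().toDestination();

// Step through the 16 columns; play every row with a 1 in the current column.
const seq = new Tone.Sequence((time, col) => {
  Particle.positions.forEach((row, i) => {
    if (row[col] === 1) {
      synth.triggerAttackRelease(ryukyuScale[i], '16n', time);
    }
  });
}, [...Array(16).keys()], '16n');
seq.start(0);
Tone.Transport.start(); // the sequence only runs once the transport is started

// Sketch of the Grid helpers described above.
class Grid {
  constructor(rows, cols, cellSize) {
    this.rows = rows;
    this.cols = cols;
    this.cellSize = cellSize;
  }

  // Pixel coordinates of a cell's center, computed from its grid position.
  get_position(row, col) {
    return {
      x: col * this.cellSize + this.cellSize / 2,
      y: row * this.cellSize + this.cellSize / 2,
    };
  }

  // A radius that pulses over time, usable for animating particles.
  get_radius() {
    return (this.cellSize / 2) * (0.8 + 0.2 * Math.sin(Tone.now() * Math.PI));
  }
}
```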
Result
I think that this came out pretty well, and it is my most musically satisfying project yet. Because particles die out, the user has to press buttons really fast if they want the music to build; the instrument does not just play on its own but requires constant input, like any real instrument. And because particles lose amplitude as they die, new particles function as accented notes. Particles are added with a skewed random function, so they tend toward lower pitches, which gives the rarer high pitches more weight (one possible skew is sketched below). I also put a bit of effort into getting the reverb, panning, and amplitude just right.
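One simple way to implement such a skew (an assumption about the approach, not necessarily the project’s actual function):

```javascript
// Squaring a uniform random number in [0, 1) biases the result toward 0,
// so lower row indices (lower pitches) are chosen more often.
function skewedRow(numRows) {
  return Math.floor(Math.pow(Math.random(), 2) * numRows);
}
```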
One potential issue is the rate at which things change: notes die pretty fast and can sometimes move around a lot, which can make it hard to lock into an enjoyable pattern once one happens upon it. If this were to emulate, say, a Steve Reich piece, then notes should hardly move (perhaps once every couple of measures). But on the internet our attention spans are much shorter, so maybe this is good given the circumstances.
Future Ideas
Depending on how the next few projects go, I may be interested in upgrading this one for the midterm.
- Because of the way I have loaded in sounds, it would not be too hard to change scales in real time. Maybe, once the user has played enough notes, it will automatically switch to a new scale. (Or, more intense: gradually begin drawing from a second pitch collection…)
- It would be awesome if the background changed as the piece progresses. Maybe there is some way to represent on the grid how many particles have died at each point?
- I know I’ve been into single button inputs, but a low-hanging fruit would be to add mouse click support for specific notes. I could also add a few sliders around the interface to control some of the features, like tempo, particle lifespan, and chance for movement.
Play it. (Supported by Chrome.)