Web-based MIDI Editor and Synthesizer

A screenshot of the notes application showing the MIDI editor with some drawn and selected notes

I'd been exploring the WebAudio API and the various things that people have made with it. I was blown away by awesome demos such as this Karplus-Strong synthesis demo and this Collection of WebAudio effects, and I wanted to try it out for myself.

MIDI Editor

The MIDI editor interface is rendered as an SVG element. Its entire layout is calculated in Rust, and it is drawn/updated via shim functions exported from JavaScript. I went with SVG because it allows elements to be resized, deleted, and created individually without forcing the entire UI to re-render. Since the UI usually only changes slightly with each modification (one note added, etc.), this keeps performance very good in the common case: we don't have to constantly re-render every frame.
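
To give a feel for that boundary, here's a minimal sketch of how Rust-computed layout might be pushed through JavaScript shims; the shim names and the wasm-bindgen wiring shown here are illustrative rather than the project's actual API:

use wasm_bindgen::prelude::*;

#[wasm_bindgen]
extern "C" {
    // Hypothetical JS shims that create/update/delete individual SVG
    // <rect> elements representing notes.
    fn create_note_elem(x: f64, y: f64, width: f64, height: f64) -> usize;
    fn update_note_elem(dom_id: usize, x: f64, width: f64);
    fn delete_note_elem(dom_id: usize);
}

/// Layout is computed in Rust; only the one affected element is touched.
pub fn draw_note(line_ix: usize, start_beat: f64, length_beats: f64) -> usize {
    let px_per_beat = 20.0;
    let line_height = 16.0;
    create_note_elem(
        start_beat * px_per_beat,
        line_ix as f64 * line_height,
        length_beats * px_per_beat,
        line_height,
    )
}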

Raw mouse and keyboard events are passed to Rust and handled entirely there. Keyboard shortcuts are implemented for a variety of actions such as moving notes, copy-pasting selections, and playing back the current composition. The editor also maintains a set of selected notes, built up either by clicking and dragging while holding shift or by control-clicking individual notes. The goal was to make the editing experience as efficient as possible and give users access to higher-level methods of manipulating the note data.
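
As a rough sketch (the key bindings, state fields, and helper names here are assumptions, not the real code), the keyboard handling boils down to matching on the raw key plus modifier flags:

/// Hypothetical sketch of the editor's keyboard dispatch; the real state
/// also tracks dragging, playback position, and more.
pub struct EditorState {
    selected_notes: Vec<usize>,
    clipboard: Vec<usize>,
}

impl EditorState {
    pub fn handle_key_down(&mut self, key: &str, control: bool, _shift: bool) {
        match (key, control) {
            // Copy the current selection
            ("c", true) => self.clipboard = self.selected_notes.clone(),
            // Move selected notes up/down a line
            ("ArrowUp", _) => self.move_selected_notes(-1),
            ("ArrowDown", _) => self.move_selected_notes(1),
            // Start/stop playback of the composition
            (" ", _) => self.toggle_playback(),
            _ => (),
        }
    }

    fn move_selected_notes(&mut self, _line_diff: i32) { /* ... */ }
    fn toggle_playback(&mut self) { /* ... */ }
}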

Note Data Representation

Note data is stored internally in skip lists, one for each note line. I chose this data structure because the majority of operations are random insertions and deletions, triggered when users modify notes in the middle of a composition. It also supports "stabbing" queries, where the goal is to find either the note that intersects a given beat or, if there is no intersection, the notes that bound it on either side.
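
To illustrate what a stabbing query returns (the types and names here are mine, and a plain sorted slice stands in for the skip list), it looks something like this:

/// A note on a single line, stored by (start, end) beat.
#[derive(Clone, Copy, Debug)]
pub struct NoteBox { pub start: f64, pub end: f64 }

/// Result of a "stabbing" query at a given beat.
pub enum Stab {
    Hit(NoteBox),
    /// No intersection: the notes (if any) bounding the beat on either side.
    Miss { prev: Option<NoteBox>, next: Option<NoteBox> },
}

pub fn stab(line: &[NoteBox], beat: f64) -> Stab {
    // Notes on a line never overlap, so a partition point is enough here.
    let ix = line.partition_point(|n| n.end < beat);
    match line.get(ix) {
        Some(n) if n.start <= beat => Stab::Hit(*n),
        next => Stab::Miss {
            prev: if ix > 0 { Some(line[ix - 1]) } else { None },
            next: next.copied(),
        },
    }
}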

Along the same lines, I created a really neat text-based representation of the skip list, which is printed to the console in debug mode:

|15, 16|--------------------->|21, 21|--------------------->x
|15, 16|->|17, 17|----------->|21, 21|--------------------->x
|15, 16|->|17, 17|----------->|21, 21|->|22, 23|----------->x
|15, 16|->|17, 17|->|19, 20|->|21, 21|->|22, 23|->|25, 26|->x
|15, 16|->|17, 17|->|19, 20|->|21, 21|->|22, 23|->|25, 26|->x

Synthesizer Settings UI

The UI for the synthesizer controls is implemented using react-control-panel, my React port of the control-panel project. Changing values affects the synthesizer live; updates are applied to each of the underlying voices individually.
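
On the Rust side, that per-voice application could look roughly like this; the shim name and parameter scheme are placeholders:

use wasm_bindgen::prelude::*;

#[wasm_bindgen]
extern "C" {
    // Hypothetical JS shim that sets a parameter on one Tone.JS voice.
    fn set_voice_param(voice_ix: usize, param: &str, value: f64);
}

/// Apply a control-panel change to every underlying voice individually.
pub fn apply_setting(voice_count: usize, param: &str, value: f64) {
    for voice_ix in 0..voice_count {
        set_voice_param(voice_ix, param, value);
    }
}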

Synthesizer

The first part of the application is the synthesizer, built using the Tone.JS library. Tone.JS is a thin-ish wrapper over WebAudio that includes a bunch of handy stuff like a polyphonic synth manager (which lets you play more than one note at the same time) and pre-built audio effects like filters. Since I used Rust and WebAssembly to implement the core of the application and Tone.JS is a JavaScript library, it was necessary to set up some shims for calling into Tone.JS from Rust. Once that was finished, however, notes could be played by simply calling a function.
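
The shims end up looking something like this from the Rust side; the exported names are illustrative, and the MIDI-to-frequency conversion is just the standard equal-temperament formula:

use wasm_bindgen::prelude::*;

#[wasm_bindgen]
extern "C" {
    // Hypothetical JS shims wrapping Tone.JS's triggerAttack/triggerRelease
    // on a specific voice of the polyphonic synth.
    fn trigger_attack(voice_ix: usize, frequency: f64);
    fn trigger_release(voice_ix: usize);
}

/// Play a note by frequency; MIDI note 69 is A4 = 440 Hz.
pub fn play_midi_note(voice_ix: usize, midi_number: u32) {
    let frequency = 440.0 * 2.0_f64.powf((midi_number as f64 - 69.0) / 12.0);
    trigger_attack(voice_ix, frequency);
}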

As it turned out, the default polyphonic synthesizer built into Tone.JS had some issues when used to play notes dynamically, such as when users selected them in the editor. Voices (the underlying monophonic synths) were getting re-used before the notes they last played had finished, leading to permanently playing notes and audio artifacts. To get around this, I implemented my own polyphonic synth state manager in Rust that always uses the least-recently-used voice first. I combined this with a static scheduling algorithm that pre-calculates the optimal order in which to use voices, maximizing the time between a voice's release and its next attack. This is used in the playback feature to schedule attacks/releases on individual voices all at once.
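
A minimal sketch of the least-recently-used voice selection, simplified from what the real manager needs (it also has to track scheduled releases and the static playback schedule):

/// Simplified voice-manager sketch: pick whichever voice was released
/// longest ago so that re-use clobbers playing notes as rarely as possible.
pub struct VoiceManager {
    /// For each voice, the timestamp at which it was last released.
    last_released: Vec<f64>,
}

impl VoiceManager {
    pub fn new(voice_count: usize) -> Self {
        VoiceManager { last_released: vec![f64::NEG_INFINITY; voice_count] }
    }

    /// Returns the index of the least-recently-used voice.
    pub fn pick_voice(&self) -> usize {
        self.last_released
            .iter()
            .enumerate()
            .min_by(|(_, a), (_, b)| a.partial_cmp(b).unwrap())
            .map(|(ix, _)| ix)
            .unwrap()
    }

    pub fn note_released(&mut self, voice_ix: usize, now: f64) {
        self.last_released[voice_ix] = now;
    }
}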

Saving + Exporting

Whenever they are played, compositions are serialized into a binary format, Base64-encoded (all from Rust), and then saved into the browser's localStorage. Saved compositions are loaded during application initialization if they exist. In the future, the goal is to allow import/export of MIDI files and perhaps even input from MIDI keyboards using WebMIDI.
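
Sketched roughly (the binary layout, the localStorage key, and the shim are assumptions of mine; the Base64 step uses the base64 crate's standard engine):

use base64::Engine as _;
use wasm_bindgen::prelude::*;

#[wasm_bindgen]
extern "C" {
    // Hypothetical JS shim that writes to the browser's localStorage.
    fn write_local_storage(key: &str, value: &str);
}

#[derive(Clone, Copy)]
pub struct Note { pub line_ix: u8, pub start: f32, pub end: f32 }

/// Serialize notes into a compact binary layout, Base64-encode, and save.
/// (Illustrative format: 1 byte line index + two little-endian f32s per note.)
pub fn save_composition(notes: &[Note]) {
    let mut buf = Vec::with_capacity(notes.len() * 9);
    for note in notes {
        buf.push(note.line_ix);
        buf.extend_from_slice(&note.start.to_le_bytes());
        buf.extend_from_slice(&note.end.to_le_bytes());
    }
    let encoded = base64::engine::general_purpose::STANDARD.encode(&buf);
    write_local_storage("saved_composition", &encoded);
}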

Future Work

This project is still very much a WIP, and there's a lot left to do. For example, scrolling/zooming compositions is still unimplemented (and may be a possible performance bottleneck). As previously mentioned, import/export to MIDI is missing as well. A help guide, more UI controls for things like BPM, more ergonomic playback behavior, and a variety of other things are also missing. My goal with this isn't to create an all-inclusive web-based music production environment; I want to create an effective MIDI editor and synthesizer capable of allowing users to write and play back compositions. I may embed it into a larger application or add more features later on.