During our residency at STEIM from the 11th to the 15th of November, we developed a series of tools to make our collaboration more comfortable. Our first aim was to project the code of two live coders on the same screen, so we built a kind of “code display server” to which we could send our code, dividing the screen into two regions (top/bottom). After the residency we rewrote the classes and architecture to be more usable by other people and by larger ensembles; it is now simpler, more flexible, and can adapt to other ensembles’ needs. We also put a MIDI input visualization in the middle of the screen, as a splitter between the two code regions. It turned out to look really nice, and it also showed on screen what Anne was doing at the MIDI keyboard.
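The basic idea can be sketched in a few lines of SuperCollider. This is only an illustrative sketch, not the actual classes from our repository: each performer forwards evaluated code over OSC to a shared display machine, tagged with the screen region it belongs to (the address, port, OSC path and function names here are assumptions).

```supercollider
// Hypothetical sketch of sending code to a shared "code display server".
// The address/port and the "/code" OSC path are illustrative assumptions.
~display = NetAddr("192.168.0.10", 57120);

// Send a code string together with a region tag; the display server
// would render it into that performer's screen region (top or bottom).
~sendCode = { |code, region = \top|
    ~display.sendMsg("/code", region, code);
};

~sendCode.("{ SinOsc.ar(440) * 0.1 }.play;", \bottom);
```

On the display side, an `OSCdef` listening on "/code" would receive the region tag and the code string and draw them into the corresponding half of the window.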
In the first rehearsals it became clear that we needed an easy way to record what Anne was playing on the piano and the toy piano, so we wrote the LiveBuffer. The problem was that recording and playing live at the same time caused clicks when overdubbing on the same buffer, so we built a dual-buffer tool in which the buffer being played is never the buffer being recorded, and the two buffers swap automatically whenever a new recording is required.
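The dual-buffer idea is essentially ping-pong buffering. Here is a minimal sketch of that technique in SuperCollider, under the assumption of a single mono input; the names (`~recBuf`, `~playBuf`, `\rec`) are illustrative and not the actual LiveBuffer API:

```supercollider
// Sketch of the dual-buffer (ping-pong) idea: record into one buffer
// while playing from the other, and swap roles for each new take.
s.waitForBoot {
    ~bufA = Buffer.alloc(s, s.sampleRate * 8); // two 8-second buffers
    ~bufB = Buffer.alloc(s, s.sampleRate * 8);
    ~recBuf = ~bufA;   // the buffer currently being recorded into
    ~playBuf = ~bufB;  // the buffer currently being played from

    SynthDef(\rec, { |buf|
        // record the live input once, then free the synth
        RecordBuf.ar(SoundIn.ar(0), buf, loop: 0, doneAction: 2);
    }).add;

    SynthDef(\playbuf, { |buf|
        Out.ar(0, PlayBuf.ar(1, buf, loop: 1));
    }).add;

    // swap the roles of the two buffers before starting a new recording
    ~swap = {
        var tmp = ~recBuf;
        ~recBuf = ~playBuf;
        ~playBuf = tmp;
    };
};
```

Because playback always reads from the buffer that is not being written, overdub-style clicks from reading and writing the same memory are avoided.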
We also wanted to catch the MIDI events performed live on the keyboard, so we wrote a tool for capturing these events and using them directly in our code. We didn’t use it at the concert, but it turned out quite nice to play with and may be useful in other setups.
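Capturing live MIDI for later reuse can be sketched with a `MIDIdef`. Again, this is only a hedged illustration of the general approach, not the exact tool from the repository:

```supercollider
// Sketch: collect note-on events from a live keyboard into a list,
// so the notes can be reused directly in code afterwards.
MIDIClient.init;
MIDIIn.connectAll;

~captured = List.new;

MIDIdef.noteOn(\capture, { |vel, num|
    ~captured.add([num, vel]); // store [note number, velocity] pairs
});

// later, e.g., replay the captured pitches as a pattern:
// Pbind(\midinote, Pseq(~captured.collect(_[0]), inf)).play;
```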
The other tools are just wrapper classes that keep the code readable by avoiding clutter. All these tools for SuperCollider3 are available on GitHub.
The residency was very productive and we are looking forward to doing this again in the future. Only through this kind of intensive collaboration in an ensemble can one really see what is missing and what one would like to do while live coding but would find too complicated to do live. The concert turned out really well; thanks to everyone who came. Stay tuned.