DUO X (Laura Carmichael, clarinet, and Naomi Sato, saxophone/sho) worked on a new phase of our staged concert program “Urban Doldrums”, which incorporates live electronics and video.
During our STEIM residency our aims were to stage our concert program, integrating new wireless controllers (TouchOSC on the iPhone) into our performance practice; to rehearse and fully test six pieces of electro-acoustic music coordinated by a new “master patch” in MAX/msp; and to test, test, and test again all the necessary equipment in the four-channel environment. We worked in Studio 3, where we had plenty of space to simulate a staged environment, including using the projector to rehearse with video as well as audio.
Our goal was to get the technology working to the point where it is almost invisible to the audience, so that the focus returns to the instrumental performer and to the sound and visual aspects of the performance. By integrating the controllers into our program of electro-acoustic music, we eliminated technical set-up changes between the pieces we perform, gained more control within the pieces in which we improvise, and are no longer tied to standing at the computer or interrupting the flow of the performance to accommodate technical demands.
Efficiency of set-up:
We needed to streamline our set-up from one that required the assistance of two technicians during a performance and about eight hours of sound checking to one we could use completely independently, setting up and sound checking within two hours, for the purposes of touring. Prior to this phase of our work we simply could not perform in many venues: the technical demands of the program were too great, and the necessary personnel, support and time were not available.
Our expectation was to use a new “Master Patch” in MAX/msp that would control all six separate compositions and their accompanying video: starting and stopping them, allowing levels to be adjusted, and triggering the soundscapes that play between the pieces. Each individual piece has its own patch; previously we had to open and close each one, and each had to be mixed or adjusted manually. We needed two technicians helping us to run the sound and build the set-up. We also did not have all the equipment we needed: different pieces required separate audio interfaces and computers. All of this needed to be standardized, simplified, made to fit our equipment, and made to work 100% reliably.
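The routing idea behind such a master patch can be sketched in a few lines of code. The sketch below is purely illustrative: the addresses, piece names, and handler logic are invented for this example, and the actual patch was built in MAX/msp, not Python. It shows how OSC-style messages from a wireless controller might be dispatched to start and stop pieces and set their levels from a single point of control.

```python
# Hypothetical sketch of a "master patch" dispatcher: OSC-style
# addresses from a controller (e.g. TouchOSC) are routed to handlers
# that start/stop pieces and adjust levels. All names are invented
# for illustration; the real patch was written in MAX/msp.

class MasterPatch:
    def __init__(self, pieces):
        self.pieces = pieces                        # piece id -> title
        self.playing = None                         # currently running piece
        self.levels = {pid: 1.0 for pid in pieces}  # output gain per piece

    def handle(self, address, value):
        """Route one OSC-style message such as ('/piece/2/start', 1)."""
        parts = address.strip("/").split("/")
        if parts[0] == "piece":
            pid = int(parts[1])
            if parts[2] == "start" and value:
                self.playing = pid
                return f"start {self.pieces[pid]}"
            if parts[2] == "stop" and value:
                self.playing = None
                return f"stop {self.pieces[pid]}"
            if parts[2] == "level":
                self.levels[pid] = float(value)
                return f"level {self.pieces[pid]} = {value}"
        return "unhandled"

patch = MasterPatch({1: "Piece One", 2: "Piece Two"})
print(patch.handle("/piece/2/start", 1))    # start Piece Two
print(patch.handle("/piece/2/level", 0.8))  # level Piece Two = 0.8
print(patch.handle("/piece/2/stop", 1))     # stop Piece Two
```

Because every message passes through one dispatcher, a single controller page on the phone can drive the whole evening without opening and closing individual patches.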
MAX/msp programmer Luc van Weelden made the patch for us. Here you can see a two-minute video describing the interface between the iPhone TouchOSC application and MAX, with a couple of examples of how we used it in performance:
Our expectations were fully met.
We started the residency with a beta version of the set-up, and during the residency we fully tested and tweaked it.
Now, with the MAX/msp Master Patch and TouchOSC on the iPhone as a wireless controller, the set-up is standardized, and from our perspective as performers (end users) it is fast and reliable. Everything runs off one laptop (except for the piece that is designed to use two computers in a LAN). We can build the whole set-up ourselves and, if necessary, perform without any additional personnel or engineer.
We achieved our goal of getting used to running the whole program in this new environment: we were able to identify and solve the places where the musical and theatrical flow was interrupted by technical difficulties. In other words, timing choices are no longer dictated by technical issues; the technical aspects support the musical and theatrical ‘story’ rather than dictating it.
We also achieved something we had not anticipated: we developed, tested and standardized our own protocol for building the set-up on tour:
We “practiced” tearing down and setting up our equipment from scratch several times, getting faster at it and noting small details that proved essential for reliable, repeatable results. In the process we developed a solid check-list and protocols for sound checks, which we have applied since: for example, the exact order in which to open and set up certain patches, build networks, and change computer settings and equipment so that everything works reliably and efficiently.
In February 2010 this program was performed at the ‘Experimental Sound, Art & Performance Festival 2010’ at Tokyo Wonder Site, and at Theater Kikker in Utrecht.
Our performance won a prize at the festival in Tokyo for innovation and quality of performance. We have been invited to return for a feature concert next season.
We will continue performing this program for audiences in Europe, Japan and North America next season, both at contemporary music festivals and on adventurous music series.
Simulating the entire performance set-up, with video and in a four-channel audio environment, was crucial for our work. We found a few technical problems with our audio set-up and the programming while getting everything to work in the exact configuration we would use in concerts. Access to very good audio equipment meant that we could accurately hear and see our results and adjust accordingly.
We also discovered several small but crucial practical details regarding our interface with the computers and controller (see list below). All of these were resolved during the residency period.
You can see a trailer of the finished performance on Vimeo: http://www.vimeo.com/10964989