NoiseFold > Development Residency

NoiseFold performing NoiseFold 2.0 with guest musician David Bithell at the Merrill Ellis Intermedia Theatre in Denton, Texas. This three-screen work runs on a three-laptop network.


NoiseFold was formed in late 2005, when Cory Metcalf and I began collaborative development of an integrated audio-visual performance instrument and synthesis system. Central to our approach is the simultaneous sonification and visualization of data produced by real-time mixing and processing of geometric equations. In brief, we manipulate virtual 3D objects that emit sound, as a form of tightly integrated audio-visual performance. While this work is very much a continuation of earlier interests in fusing cinema, performance, sound and music, it was also hastened by our close association with Steina Vasulka, a seminal first-generation video artist and former Artistic Director of STEIM, who was in residence during the mid-1990s. We were early adopters of the IMAGE/ine platform developed at STEIM under Steina’s guidance, and she and I subsequently co-taught the first university-level studio course to use the software.
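NoiseFold’s actual instrument is built in Max/MSP/Jitter, but the core idea of a geometric form that “emits sound” can be sketched in a few lines. The following is a minimal illustration only, not our implementation: it samples the radii of a lobed contour defined by a simple equation, normalizes one scan of the contour into a single wavetable cycle, and plays it back, so deforming the geometry directly reshapes the waveform (all function names and constants here are hypothetical).

```python
import math

def contour_radii(n=256, lobes=5, depth=0.4):
    """Sample the radius of a closed contour r(theta) = 1 + depth*sin(lobes*theta).
    Changing lobes/depth deforms the shape -- and, below, the sound."""
    return [1.0 + depth * math.sin(lobes * 2 * math.pi * i / n) for i in range(n)]

def radii_to_wavetable(radii):
    """Center and normalize the radii to [-1, 1] so one scan of the
    contour becomes one cycle of an audio wavetable."""
    mean = sum(radii) / len(radii)
    centered = [r - mean for r in radii]
    peak = max(abs(c) for c in centered) or 1.0
    return [c / peak for c in centered]

def render(wavetable, freq=220.0, sr=44100, dur=0.01):
    """Phase-accumulate through the wavetable at a given frequency,
    so the geometry is literally the oscillator."""
    n = len(wavetable)
    out, phase = [], 0.0
    for _ in range(int(sr * dur)):
        out.append(wavetable[int(phase) % n])
        phase += freq * n / sr
    return out
```

In the real system this mapping runs continuously, so gestural changes to the 3D object are heard as they are seen.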

Cory and I began our multi-part residency at STEIM in July 2010 and returned in July 2011 (both visits are covered in this blog). Our initial work focused on implementing a distributed system built in Max/MSP/Jitter that expanded our performance environment from two to three networked screens. The result was NoiseFold 2.0, which garnered an honorable mention in the Vida 13.0 Art and Artificial Life International Awards. Various movements from the project were presented by STEIM at OT301 in Amsterdam, and the full work has been performed in museums, concert halls, theaters and arts laboratories in the USA and Europe. The conceptual and technical scope of the project is outlined in this video document.

Throughout our first residency, Cory and I laid the groundwork for a series of related project components that we then undertook in my research laboratory at the University of North Texas. Central to these discussions was the sense that we needed to expand our range of sensor instrumentation, pursue the addition of acoustic improvisers in our performances and, ultimately, redesign our “nFolder” software from the ground up to yield a system that can reconfigure itself on the fly. We were further encouraged in this direction by the critical insights of STEIM Director Dick Rijken and Artistic Director Takuro Mizuta Lippit (aka DJ Sniff). In the intervening year between the first and second residencies we rethought our use of infrared sensors and, with the help of Ben Johansen at the University of North Texas, came up with a simple clamp-on gooseneck design that is easy to reconfigure. The 3-point IR system pictured here is the basis of a very stripped-down approach to X-Y-Z motion capture. NoiseFold also began integrating acoustic musicians into our existing audio-visual vocabulary. Our work with Andrew May (composer, programmer, violinist) and David Bithell (composer, trumpeter and performance artist) resulted in effective performances in which acoustic instruments augment and/or control aspects of both sound and image.
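The logic behind the stripped-down 3-point approach is simply to treat each gooseneck-mounted IR ranger as one axis of a capture volume. As a rough sketch, assuming a typical analog IR distance sensor with an approximately inverse voltage-to-distance response (the constants below are placeholders, not calibrated values from our rig), each reading is linearized and lightly smoothed before being mapped to a control axis:

```python
def ir_voltage_to_cm(v, k=27.0, offset=0.1):
    """Approximate inverse-law response of a typical analog IR ranger.
    k and offset are hypothetical -- real values come from calibration."""
    return k / max(v - offset, 1e-6)

class SmoothedAxis:
    """One-pole low-pass filter to tame sensor jitter before mapping."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.value = None

    def update(self, x):
        self.value = x if self.value is None else self.value + self.alpha * (x - self.value)
        return self.value

def xyz_from_sensors(vx, vy, vz, filters):
    """Treat the three sensors as the X, Y and Z axes of a crude capture volume."""
    return tuple(f.update(ir_voltage_to_cm(v)) for f, v in zip(filters, (vx, vy, vz)))
```

The appeal of this arrangement is that re-aiming a gooseneck simply redefines an axis; nothing in the software has to change.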

Our return to Amsterdam for the second phase of work focused on further experimentation with instrumentalists and on laying the groundwork for the new modular software. We worked closely with cellist Frances Marie Uitti and recorder player Anna Stegmann. For the most part we were occupied with integrating the musicians into existing works and experimenting with parameter mapping of acoustic control data. Cory designed a real-time data-capture module and augmented our ability to route sensor controls, expanding the gestural and video capabilities of our existing nFolder software, programmed in Max 4.6. In the process, we composed a new movement for Anna Stegmann and re-orchestrated a number of existing movements for Frances Marie. We gave a second concert at OT301 with these two wonderful musicians, an event made possible by the intrepid support of composer, producer and curator Ivo Bol.
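“Parameter mapping of acoustic control data” follows a common pattern: extract a control signal from the live instrument (here, a simple RMS loudness envelope), then remap it into the range of a synthesis or video parameter. This Python sketch illustrates the pattern only; the actual routing lives in our Max patches, and the “fold depth” target below is a hypothetical parameter name:

```python
import math

def rms(frame):
    """Root-mean-square amplitude of one audio frame -- a basic
    control signal extracted from a live instrument."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def map_range(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly remap a control value into a parameter's range,
    clamping at the ends (a typical mapping stage)."""
    if in_hi == in_lo:
        return out_lo
    t = min(max((x - in_lo) / (in_hi - in_lo), 0.0), 1.0)
    return out_lo + t * (out_hi - out_lo)

# e.g. drive a (hypothetical) mesh-fold depth from the cello's loudness:
frame = [0.5 * math.sin(2 * math.pi * 440 * n / 44100) for n in range(512)]
fold_depth = map_range(rms(frame), 0.0, 0.7, 0.0, 1.0)
```

Much of the rehearsal time goes into choosing the input ranges and curves so that a musician’s natural dynamics produce legible visual change.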

David Stout 2011

Infrared sensors fitted to gooseneck clamps can be mounted in various configurations to facilitate simple gesture capture

The front control page of the original nFolder software, with approximately 12 control pages encompassing various video, audio and data-processing parameters

Performance Review (OT301, Amsterdam, 21 July 2011)

