Shane Mecklenburger > Game Engine as Virtual Instrument / Collaborator

I worked at STEIM on a Research Fellowship from the University of North Texas from June to August 2010.  I’m very grateful for the generous technical & creative support from everyone at STEIM over this extended period.  As a non-musician dipping his toe into instrument design & sound production, I was very out of my element, but STEIM’s creative advisors and technical staff bridged the gap, helping me explore a new arena and consider my project from many viewpoints.
I was able to prototype a wireless interface that manipulates live sound and image simultaneously for performance.  I also investigated the potential for game engine software to operate as an instrument / interface and a virtual collaborator.  Even with this extended period of research I feel I’ve barely scratched the surface of possibilities for game-generated/assisted improvisation and I’m very excited to explore and expand on the basic system I developed.
I’ll briefly describe my project, give some details on the results of my research, and offer my preliminary conclusions for artists interested in exploring this creative process.  Below are images from pwn3d, a media performance I gave at the Dallas Museum of Art’s C3 Theater last spring.
The old joystick controller.
[vimeo 11978160]
Screen capture from one game environment in the performance.
pwn3d unpacks the First Person Shooter (FPS) genre, currently the most popular form of media entertainment.  In the performance I attempt to control a set of artist-created FPS games whose refusal to function “properly” derails the performance into a technical catastrophe in which I’m the hapless victim.  The train wreck’s engineered audio / video glitches weave the drama and conflict of the creative process with that of American Western cinema’s gunslinger.
One goal of the residency was to replace my previous joystick control with a wireless gun interface.  In past performances, controlling game environments with a joystick tethered my body to a table.  At STEIM I replaced the joystick with a pair of Wii-based pistols to control the FPS.  This frees me to move my body through the space and perform “gunslinging theatrics” that reinforce the performance’s other references to classic American Western cinema.
New Wii pistols.
Unity 3D.
A second goal was to explore and develop the simulation game environment as a musical instrument.  I quickly learned that Unity 3D has severe limitations for sound processing.  Like most game engines, Unity is entirely sample-based in its audio handling, offers dismally few options for sound manipulation, and spatializes in-game sounds only in stereo, with no support for 5.1 surround.  I’m now considering re-building the project in Valve’s Source Engine, which spatializes in 5.1.
STEIM helped me work around some of these limitations by moving the audio processing onto its own system with LiSa and JunXion.  Frank Baldé’s miraculous customizable “magic wand” (as I call it) for JunXion & LiSa has revolutionized the sonic dimension of my performance.  Bluetooth connection limits require separate controllers and separate computers for audio and video, but the expanded possibilities provided by STEIM’s robust live audio processing software are well worth it.  Once I’ve ported to the Source Engine I plan to experiment with processing audio and video on a single machine; until then, separating them onto two dedicated devices works well as a short-term solution.
Media installation & performance artists such as Kurt Hentschläger, Julian Oliver, Brody Condon, and JODI have shown that game engine software holds great promise as a system that can take on a “life of its own,” as both virtual collaborator and interface.  However, such dynamic interactions have been almost exclusively visual because of the visual bias of the current generation of game engine software, which is designed for complex visual simulation and only rudimentary audio.  For non-programmer media artists, a separate audio processing setup is a necessity.  Even so, game engines’ visual and haptic physics simulations remain useful and under-explored as instruments/interfaces that generate dynamic input for dedicated audio processing software and hardware.
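To make that last idea concrete: one simple way a game could feed dynamic input to a dedicated audio machine is by sending OpenSound Control (OSC) messages over UDP, which software like JunXion and LiSa can work with.  The sketch below is purely illustrative and is not part of my performance system; the host, port, and address name are invented, and only a minimal one-float OSC message is encoded.

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying a single float argument."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    # Address pattern, then the type-tag string ",f", then a big-endian float32.
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

def send_physics_value(sock: socket.socket, host: str, port: int,
                       address: str, value: float) -> None:
    # Fire-and-forget UDP: the audio machine maps the value to a sound parameter.
    sock.sendto(osc_message(address, value), (host, port))

# Hypothetical example: each frame, report a projectile's speed from the
# game loop to an audio computer listening on port 9000.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_physics_value(sock, "127.0.0.1", 9000, "/game/projectile/speed", 12.5)
```

Because UDP is connectionless, the game keeps running even if the audio machine drops off, which suits a live-performance setup where either side may crash (or be made to crash on purpose).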
Hartelijk Bedankt:
Dick Rijken
Jilt Van Moorst
Frank Baldé
Mark L. Franz
Daniël Schorno
Kristina Andersen
Takuro Lippit
Nico Bes
Gijs Molenar
Shane Mecklenburger
