I came to the STEIM orientation with the expectation of gaining a deeper understanding of the commercial software they have developed, and of presenting to the staff several artistic projects I currently have in development.
I have used JunXion in the past, but only for basic operations in mapping HID hardware. During my stay I was able to create several new patches for use in an upcoming performance titled “Still Life in Real Time”, which will premiere at the Denver Museum of Nature and Science in April of 2010. For this project, I am tracking the position and 3-axis orientation of several objects, which together constitute a “Still Life” composition, both visually and sonically.
The aim of the project itself is to interrogate the possibilities of real-time, structured improvisational, audio-visual composition, while incorporating the symbolic potential of the interface. The traditional Still Life, and in particular Dutch Golden Age Still Life paintings, provided me with an art historical locus and context for referencing this potentially symbolic quality of objects.
The technical aspects of the project are driven by this concept of “interface as content” and the ways in which the tangible interface and the embodied activities of the performer come to convey meaning. The technical problems, then, become those of creating tangible relationships between the objects, projected through sound and image by selective mappings.
JunXion provides a way to create these mappings in a manner that allows for rapid prototyping. So that’s what I did during my time at STEIM. I learned some of the deeper capabilities of the tool, which allowed me to explore some functionality that is key to the success of the project. I will outline these briefly below:
- Creating switches without “buttons”. I need to be able to tell the system to enable or disable certain functions or actions without clicking buttons, because clicking feels like an unnatural movement in the setup. I was able to track large differences in the accelerometer data and turn them into a kind of switch, by storing, checking, and updating a variable that denotes whether the switch is on or off.
- Tracking the individual object orientations via the 3-axis accelerometer data transmitted wirelessly from each object. Using JunXion, I map each axis to one or more MIDI CCs and scale the data using JunXion’s Tables. This data is then forwarded on to audio and video parameters that respond to the movement of the objects.
- Determining the relative positions of the objects while they sit on the plinth or pedestal. I was able to do some quick testing using JunXion’s built-in object tracking, but it seems I may have to use something else for the production version. I set up a quick sketch in Processing that was able to detect blobs, but found that actually tracking the blobs over time is more complex. I need to manage the blobs precisely enough that the system can determine how they are related to one another spatially on a 2D surface. I am using a simple setup of frosted glass with a camera underneath, letting the system know when the objects are sitting on top of the glass and where they are in relation to one another: the composition.
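The “switch without buttons” idea above can be sketched in a few lines. This is a hypothetical illustration, not JunXion’s actual patch logic: a large jump in accelerometer magnitude toggles a stored on/off state, and the threshold value is illustrative.

```python
class ShakeSwitch:
    """Toggle a stored state when the accelerometer magnitude jumps sharply."""

    def __init__(self, threshold=1.5):
        self.threshold = threshold  # minimum jump in magnitude to count (illustrative)
        self.prev = None            # last magnitude seen
        self.on = False             # stored switch state

    def update(self, x, y, z):
        """Feed one 3-axis accelerometer reading; return the current switch state."""
        mag = (x * x + y * y + z * z) ** 0.5
        if self.prev is not None and abs(mag - self.prev) > self.threshold:
            self.on = not self.on   # large difference flips the switch
        self.prev = mag
        return self.on
```

Small, steady movements leave the state alone; only a deliberate shake flips it, which is what keeps the gesture from feeling like a button press.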
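The axis-to-CC mapping in the second item amounts to clamping and rescaling each accelerometer axis into the 7-bit MIDI CC range. A minimal sketch, assuming a linear mapping and an illustrative input range of ±2 g (JunXion’s Tables allow arbitrary, including nonlinear, curves):

```python
def accel_to_cc(value, in_min=-2.0, in_max=2.0):
    """Scale one accelerometer axis reading to a MIDI CC value (0-127).

    in_min/in_max are assumed bounds for illustration; a JunXion Table
    would define this mapping graphically and need not be linear.
    """
    clamped = max(in_min, min(in_max, value))          # keep within the input range
    return round((clamped - in_min) / (in_max - in_min) * 127)
```

A reading at rest (0 g on a horizontal axis) lands near the middle of the CC range, and out-of-range spikes are pinned to 0 or 127 rather than wrapping.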
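The gap the third item describes, between detecting blobs and actually tracking them, is essentially a frame-to-frame matching problem. A simplified sketch of one common approach, greedy nearest-neighbour matching of blob centroids (the distance threshold is an illustrative pixel value, not something from the Processing sketch):

```python
import math

def match_blobs(prev, curr, max_dist=50.0):
    """Carry blob identities across frames by nearest-neighbour matching.

    prev: dict of blob id -> (x, y) centroid from the last frame
    curr: list of (x, y) centroids detected in the current frame
    Returns dict of id -> (x, y); unmatched detections get fresh ids.
    """
    next_id = max(prev, default=-1) + 1
    assigned = {}
    unused = list(curr)
    for bid, (px, py) in prev.items():
        if not unused:
            break
        # nearest unmatched detection to this previously known blob
        best = min(unused, key=lambda c: math.hypot(c[0] - px, c[1] - py))
        if math.hypot(best[0] - px, best[1] - py) <= max_dist:
            assigned[bid] = best
            unused.remove(best)
    for c in unused:               # brand-new blobs get new identities
        assigned[next_id] = c
        next_id += 1
    return assigned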
So this is mostly what I worked on, but I also played through some “what-if” scenarios with the other Orientation attendees, namely Berit Greinke and Jeffrey Roberts. Berit was working on translating color to sound for her textile installations/performances, and Jeffrey was working on augmenting his live performance with a traditional Chinese string instrument called the Gu-Chin.
The discussions I had with Kristina and Robert were very helpful in giving me some perspective on my current work. This includes the Still Life compositions as well as my work with the Solar Wind Harp, a software instrument (hardware version under development) that can “play” the solar wind in real time via data from the NASA ACE satellite. I hope to construct a large installation/performance version of the Solar Wind Harp with the assistance of STEIM staff in the future.
Most important to me was the interaction with the other orientation attendees. Hearing about the work they are doing, and the contexts within which they are working, provided a breath of fresh air, and a breadth of fresh ale to wash down some hard-to-swallow pills that seemed to be blurring my vision.
I have included links to brief videos of one of the experiments with Jeffrey Roberts playing Gu-Chin, with one of my WiiMotes attached to his arm. The WiiMote is sending controller data via JunXion to a patch in Ableton that is manipulating his live sound through various frequency-domain dynamics effects processors. The other link is to a video of Jeffrey and Robert van Heumen playing together in the STEIM Studio 3.
Jeffrey Roberts and David Fodel
Jeffrey Roberts and Robert van Heumen