One of the challenges we still face as electronic musicians is how to be as quick and responsive as acoustic instrumentalists, especially when dealing with hyper-speed players like Peter Evans. Sam Pluta came to STEIM last year to work on a new software instrument setup in SuperCollider. That residency resulted in his new album with trumpeter Peter Evans and violinist Jim Altieri.
At STEIM, we have seen many attempts in live electronics and improvised music to create a cohesive dialogue between acoustic and electronic performers. George Lewis’s Voyager was probably one of the earliest, and STEIM’s Joel Ryan and Robert van Heumen have devoted their whole musical practice to this field.
Typically we try to solve the issue by adding gestural and expressive controllers, but recent residents Icarus are part of the Live Algorithms for Music project, which removes the physicality of the electronic musician completely. The laptop is left on stage only to respond to an audio input from another performer, based on preprogrammed machine listening functions and generative algorithms.
[vimeo 7803614 500 281]
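To make the machine-listening idea concrete, here is a minimal sketch of that listen-and-respond loop. It is not taken from Icarus’s or Sam’s actual code (which runs in environments like SuperCollider on live audio); the frame data, threshold, and scale are all hypothetical stand-ins. The computer follows the amplitude of incoming audio and, when a performer plays above a threshold, answers with a generatively chosen note.

```python
import random

def rms(frame):
    """Root-mean-square amplitude of one audio frame (a list of samples)."""
    return (sum(s * s for s in frame) / len(frame)) ** 0.5

def listen_and_respond(frames, threshold=0.1, scale=(60, 62, 64, 67, 69)):
    """A toy machine-listening loop: for each incoming frame, trigger a
    generated note event when the input amplitude crosses the threshold,
    and stay silent otherwise."""
    events = []
    for i, frame in enumerate(frames):
        if rms(frame) > threshold:
            # Generative response: pick a MIDI pitch from a scale at random.
            events.append((i, random.choice(scale)))
    return events

# Two hand-made frames stand in for quiet and loud moments of live input.
quiet = [0.01] * 64
loud = [0.5] * 64
print(listen_and_respond([quiet, loud, quiet, loud]))
```

A real system would replace the amplitude follower with richer listening functions (onset, pitch, and timbre detection) and the random scale choice with a more musical generative process, but the control flow is the same.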
I think we have succeeded in connecting the body to the computer, and we have endless possibilities for processing and generating data, but we still struggle with the design of complex relationships between the performer and the machine, of the kind traditional instrumentalists have with their instruments. Generally this process has been tedious and highly personalized. However, we do feel that tools can be made to help create multi-dimensional mappings in an intuitive and musical way.
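One common approach to multi-dimensional mapping is preset interpolation: a single gesture drives many synthesis parameters at once by morphing between hand-tuned states. The sketch below is only an illustration of that idea, not any tool STEIM has built; the parameter names and values are hypothetical.

```python
def interpolate_presets(a, b, t):
    """Linearly interpolate every synthesis parameter between two presets.
    One controller value t in [0, 1] moves many parameters together."""
    return {k: a[k] + t * (b[k] - a[k]) for k in a}

# Hypothetical presets, each mapping parameter names to values.
soft = {"cutoff": 400.0, "grain_size": 0.20, "feedback": 0.1}
harsh = {"cutoff": 4000.0, "grain_size": 0.02, "feedback": 0.8}

# A single fader sweep (t) morphs the whole timbre at once.
print(interpolate_presets(soft, harsh, 0.5))
```

The musical appeal is that the performer rehearses one expressive dimension (the sweep) while the instrument designer decides, offline, what that dimension means across the whole parameter space.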
We hope we can investigate this more in the coming years and continue to support projects like Sam’s.