From June 6 through 21, I explored the concept of using the Internet as a means of creating reverberance for physical drums. Using Node.js, Max/MSP, and p5.js, I was able to take in a real-time audio signal from contact microphones mounted on the drums, process that signal for threshold detection (a drum strike), send data to an Amazon web server when a strike is detected, pass that data to any connected device running a webpage, and generate real-time audio and visuals on the connected mobile device.
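The threshold-detection step can be sketched as a small function: a strike fires when the amplitude crosses a threshold, with a refractory period so a single hit does not trigger several events. This is a minimal sketch in JavaScript, assuming amplitude samples normalized to 0..1; in the piece itself this detection happened in Max/MSP, so the function and parameter names here are illustrative only.

```javascript
// Hypothetical sketch: detect a drum strike from a stream of
// amplitude samples (0..1) by threshold crossing, with a
// refractory period so one hit fires only one event.
function makeStrikeDetector(threshold, refractoryMs) {
  let lastStrike = -Infinity; // time of the most recent strike

  return function detect(amplitude, nowMs) {
    if (amplitude >= threshold && nowMs - lastStrike >= refractoryMs) {
      lastStrike = nowMs;
      return true; // strike detected: send an event to the server
    }
    return false;
  };
}

// Example: a detector that ignores re-triggers within 100 ms.
const detect = makeStrikeDetector(0.5, 100);
```

The refractory period matters because a single physical hit produces many samples above the threshold as the drumhead rings; without it, one strike would flood the server with events.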
The resulting output is that when the drums are played, anyone connected to the website with a smartphone will hear sound and see visuals through their device. The sound and visuals are all produced in real time on the device, creating a type of resonance through the Internet. Just as playing a drum in a concert hall sends sound waves through the hall that bounce off surfaces to create reverberance, this process creates that reverberance by sending data from the performance location to a server hosted in Oregon, which then disseminates it back to any device viewing the webpage at http://andrewblanton.com/node.html
The system premiered at STEIM on Monday, June 20, and on June 24 at the Soundwave biennial at the Fort Mason Chapel in San Francisco.
I want to thank Yevette Granata for helping work out the conceptual details of this work, as well as Neal Riley for technical help!