Robin presents a performance/installation piece that visualizes, in real time, a database built from live radio input. The input is simply the radio dial: whenever the dial changes, the system automatically makes a new entry (a sample) in the database.
He also describes the mapping strategy for the motion-graphics element:
- X position proportional to loudness
- angle theta proportional to brightness
- radial distance from axis proportional to noisiness
- Mode selector simply permutes the mapping between the polar co-ordinates and the three spectral features
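The mapping above can be sketched roughly as follows. This is a hypothetical reconstruction, not Robin's actual code: the feature order, normalization to [0, 1], and output ranges are all assumptions.

```python
import itertools
import math

# The three spectral features, in an assumed fixed order.
FEATURES = ("loudness", "brightness", "noisiness")

# Each "mode" is one permutation of the assignment of the three
# features to the three display parameters (x, theta, radius).
MODES = list(itertools.permutations(range(3)))

def map_features(loudness, brightness, noisiness, mode=0,
                 x_range=800.0, max_radius=200.0):
    """Map three normalized spectral features to (x, theta, radius).

    `mode` selects which permutation of features drives which parameter;
    mode 0 is the mapping listed in the notes.
    """
    values = (loudness, brightness, noisiness)
    i, j, k = MODES[mode % len(MODES)]
    x = values[i] * x_range              # horizontal position
    theta = values[j] * 2.0 * math.pi    # angle around the axis
    radius = values[k] * max_radius      # radial distance from the axis
    return x, theta, radius
```

In mode 0 this gives loudness → x, brightness → theta, noisiness → radius; switching the mode just re-wires which feature drives which parameter, without changing the features themselves.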
Metaphor of ‘heat’ as an interactive system behavior - i.e.: over time, how much ‘energy’ has been put into the system changes its behavior. All the input parameters are coupled, so there are no strictly linear interactions.
What Andy calls ‘biasing’ is what I think I call ‘modal thresholds’ - that is, the system’s behavioral mappings change when some key parameter (or combination of parameters) reaches some point. I believe Robin’s using the overall heat of the system in this way. In any case, a great analogy - the frog in the boiling water: put a frog into boiling water and it’ll jump right out, but put a frog into cold water and heat the water slowly, and it’ll never notice. I think the implication is that the people using the installation are the frogs.
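A minimal sketch of how such a heat accumulator with a modal threshold might work. The gain, decay rate, threshold, and mode names here are all my assumptions, not details from the talk.

```python
class HeatSystem:
    """Accumulate input 'energy' over time and switch modes at a threshold."""

    def __init__(self, gain=1.0, decay=0.95, threshold=10.0):
        self.heat = 0.0
        self.gain = gain            # how much each input event heats the system
        self.decay = decay          # per-step cooling factor (slow leak)
        self.threshold = threshold  # the modal threshold

    def step(self, input_energy):
        """Feed one frame of input energy; return the current behavioral mode."""
        self.heat = self.heat * self.decay + input_energy * self.gain
        # Modal threshold: once accumulated heat crosses the threshold,
        # the system switches to a different behavioral mapping.
        return "hot" if self.heat >= self.threshold else "cold"
```

Because the heat rises gradually with sustained input, a user never sees a single input cause the mode change - which is exactly the slow-boiled frog effect.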
Future work planned involves applying physical models to the gestural input, perhaps making it slightly more understandable or ‘transparent.’
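One common way to apply a physical model to gestural input is to pass the raw signal through a damped spring, so the output moves with plausible inertia. This is only an illustrative sketch of that general idea - the stiffness, damping, and timestep values are assumptions, and nothing here is claimed to be Robin's planned approach.

```python
class SpringFilter:
    """Damped-spring smoothing of a raw gestural input signal."""

    def __init__(self, stiffness=40.0, damping=10.0, dt=1.0 / 60.0):
        self.pos = 0.0   # modeled position shown to the user
        self.vel = 0.0   # modeled velocity
        self.k = stiffness
        self.c = damping
        self.dt = dt     # assumed 60 Hz update rate

    def update(self, target):
        """Pull the modeled position toward the raw input value."""
        # Semi-implicit Euler integration of a spring-damper system.
        accel = self.k * (target - self.pos) - self.c * self.vel
        self.vel += accel * self.dt
        self.pos += self.vel * self.dt
        return self.pos
```

The intuition for ‘transparency’: because the output obeys familiar physics (it overshoots, settles, and never teleports), observers can more easily connect the performer's gesture to the resulting motion.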
Some good questions at the end about the intention behind mappings - not how and what, but why choose certain mappings over others. Do we approach these problems through mathematical modeling (e.g. the heat system described by Robin) or through a more empirical, user-centered approach?