After two years of hard work, we might say that I master the Instruments and the Interfaces. All of them? Probably not. Certainly the two I built during my Master of Research at STEIM.
My artistic research focused on the design of interactive musical systems and instruments for electroacoustic improvisation. During the study period I developed two projects:
- InMuSIC – an interactive system for electroacoustic improvisation
- Melotronica – a hybrid, modified melodica
The research was grounded in an embodied-cognition approach to music practice. Corporeal intentionality was a key concept for investigating musical expressiveness. The aim was to compose instruments that combine the acoustic and electronic dimensions through sonic and gestural interdependences.
The Melotronica is a modified melodica with sensors mounted on and inside the body of the instrument. The sensors control, in real time, DSP software (Max/MSP) that manipulates the acoustic sound and generates electronic timbres. The Melotronica is thus a hybrid interface that, by tightly integrating the acoustic and electronic dimensions, allows the two to be manipulated simultaneously.
The Melotronica was developed with the support of Chi Ha Ucciso Il Conte (Nicolò Merendino). The instrument uses the Ipson64 OSC interface developed by Lex van den Broek.
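The routing from raw sensor readings to synthesis parameters can be illustrated with a small sketch. It is written in Python rather than Max/MSP, and the OSC-style addresses, sensor ranges and parameter names are invented for illustration; they are not the actual Melotronica mapping:

```python
# Hypothetical sketch of sensor -> DSP parameter routing.
# Addresses and ranges are illustrative, not the real Ipson64 mapping.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map a raw sensor reading to a synthesis parameter."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

# One handler per OSC-style address, like a routing table in a patch.
ROUTES = {
    "/melotronica/slider": lambda v: ("filter_cutoff_hz", scale(v, 0, 1023, 80.0, 8000.0)),
    "/melotronica/key/12": lambda v: ("grain_density", scale(v, 0, 1, 0.0, 50.0)),
}

def dispatch(address, raw_value):
    """Return the (parameter, value) pair a message would set, or None."""
    handler = ROUTES.get(address)
    return handler(raw_value) if handler else None
```

In the actual instrument this routing happens inside the Max/MSP patch; the table-of-handlers shape simply mirrors how `route`-style objects dispatch incoming messages by address.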
The Melotronica is equipped with forty sensors. Most of them are located inside the body of the instrument and are used to detect keyboard activity. The remaining sensors were placed on the body of the Melotronica using a custom support. A clip-on condenser microphone is also mounted on the instrument.
A custom support was designed to allow immediate access to the sensors located on the body of the instrument. The thumb of the hand that holds the instrument can therefore directly interact with the switches, the buttons and the slider. The support was also designed to hold the Ipson64 board. This approach aimed at a relatively embedded design: a compact instrument that could be easily transported.
The information detected by the various sensors of the Melotronica is therefore used either to directly control specific sound-synthesis processes (a straightforward cause-and-effect relationship) or to influence stochastic sonic processes (the emergence of autonomous musical behaviours).
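The distinction between the two mapping strategies can be sketched as follows. This is a minimal illustration assuming a sensor value normalised to [0, 1]; the parameter and event names are invented:

```python
import random

def direct_map(sensor):
    """Cause-and-effect mapping: the sensor value sets the
    synthesis parameter deterministically."""
    return {"ring_mod_freq": 40.0 + sensor * 400.0}

def stochastic_map(sensor, rng):
    """Stochastic mapping: the sensor only biases a random process.
    Higher activity makes dense electronic events more likely,
    but never fully determines them."""
    dense = rng.random() < sensor
    return {"event": "burst" if dense else "rest"}
```

Calling `direct_map(0.5)` always yields the same parameter value, while repeated calls to `stochastic_map` with the same sensor reading can differ, which is where autonomous-sounding behaviour emerges.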
InMuSIC is an Interactive Musical System designed for electroacoustic improvisation (clarinet and live electronics). The system relies on a set of musical interactions based on the multimodal analysis of the instrumentalist’s behaviour (upper-body motion tracking and audio features analysis). The aim is to investigate compositional and performative strategies for the establishment of a musical collaboration between the improviser and the system.
During the performance the system analyses both the sounds and the movements articulated by the musician. The electronic interventions proposed by InMuSIC are generated by correlating the two analyses.
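One simple way to picture this correlation step, offered as a sketch with invented feature names rather than InMuSIC's actual analysis, is to compare two feature streams, e.g. per-frame loudness and upper-body motion energy, and let the result select the kind of intervention:

```python
def pearson(xs, ys):
    """Correlation between two equally long feature streams."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def choose_intervention(loudness, motion_energy, threshold=0.7):
    """Hypothetical decision rule: if sound and gesture move together,
    respond imitatively; otherwise introduce contrasting material."""
    r = pearson(loudness, motion_energy)
    return "imitate" if r > threshold else "contrast"
```

When sound and gesture evolve together the sketch picks an imitative response; when they diverge, a contrasting one.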
The notion of interaction investigated within this research focuses on the development of processes capable of generating their own musical conditions during an improvised session. In other words, the attempt is to design a system that, by interpreting outside contributions, composes/improvises music in a dialogical modality. The InMuSIC interactive framework is inspired by the spontaneous and dialogical interactions that characterise human improvisation. The intention is to give the system an autonomous nature, inspired by the human ability to focus, act and react differently under diverse musical conditions. Within each specific performance, the close collaboration between the musician and InMuSIC should enable the constitution and emergence of specific musical forms.
The generation, modification and temporal organisation of new sonic materials are established by negotiating between the musical behaviour of the performer and the system's internal procedures. To facilitate the development of a spontaneous musical act, the platform should therefore be able to express different degrees of musical adaptiveness (e.g. imitation/variation) and independence (e.g. contrast/discontinuity).
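These degrees of adaptiveness and independence can be pictured as weights over a small behaviour repertoire. The following is a hypothetical sketch; the behaviour names and the weight curve are invented and do not describe InMuSIC's internal procedures:

```python
import random

BEHAVIOURS = ["imitation", "variation", "contrast", "discontinuity"]

def behaviour_weights(adaptiveness):
    """adaptiveness in [0, 1]: high values favour imitation/variation,
    low values favour contrast/discontinuity. The curve is invented."""
    a = max(0.0, min(1.0, adaptiveness))
    return [a, 0.8 * a, 1.0 - a, 0.8 * (1.0 - a)]

def pick_behaviour(adaptiveness, rng):
    """Draw one behaviour, biased by the current adaptiveness."""
    return rng.choices(BEHAVIOURS, weights=behaviour_weights(adaptiveness))[0]
```

A continuous adaptiveness value lets the system drift between following the performer and asserting its own material, rather than switching between two hard modes.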
InMuSIC relies on the analysis and comparison of sonic and motion qualities, identifying and processing abstracted expressive hints from the performer. The attempt to compose and explore sonic and gestural interdependences is the foundation of the interactive paradigm under investigation. The framework composed to frame and shape the musical interactions therefore aims to take into account, in addition to the sonic dimension, fundamental performative and expressive aspects complementary to sound production.