Instrumentality, perception and listening in crossadaptive performance
Crossadaptive processing describes situations where one performer’s output affects the audio processing of another, imposing direct modulation on the sound of another performer’s instrument. This is done by analysing the acoustic signal, extracting expressive features, and creating modulation vectors that can be mapped to audio processing parameters. Crossadaptive performance can be situated among the performance practices of the audio-processing musician, augmented (acoustic) instruments, live algorithms, group improvisation and interconnected musical networks. The addition of crossadaptive processing to these musical practices raises questions of agency and instrumentality. Performance with crossadaptive techniques produces complex behaviours that are difficult for either the performer or the listener to describe. This paper covers issues of transparency and technical language, and of instrument and ensemble learning. For the performer, a shared ensemble identity may emerge; for the listener, we discuss the role of intention and emergent musical behaviour.
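The signal chain described in the abstract (analyse one performer's signal, extract an expressive feature, map it to a processing parameter on another performer's signal) can be illustrated with a minimal sketch. This is not the authors' implementation: the feature (frame-wise RMS), the effect (a one-pole lowpass) and the mapping range are illustrative assumptions only.

```python
# Hypothetical crossadaptive sketch: the RMS envelope of performer A's
# signal modulates the cutoff of a one-pole lowpass applied to performer
# B's signal (louder A -> brighter B). All names and ranges are assumed.
import numpy as np

def rms_envelope(signal, frame=512):
    """Frame-wise RMS: one value per frame, a simple expressive feature."""
    n = len(signal) // frame
    frames = signal[:n * frame].reshape(n, frame)
    return np.sqrt(np.mean(frames ** 2, axis=1))

def crossadaptive_lowpass(mod_src, carrier, sr=44100, frame=512,
                          min_hz=200.0, max_hz=8000.0):
    """Filter `carrier` with a one-pole lowpass whose cutoff tracks the
    normalised RMS envelope of `mod_src` (the modulation vector)."""
    env = rms_envelope(mod_src, frame)
    env = env / (env.max() + 1e-12)          # normalise feature to 0..1
    out = np.zeros(len(carrier))
    y = 0.0
    for i, x in enumerate(carrier.astype(float)):
        # map the feature to the processing parameter, per frame
        level = env[min(i // frame, len(env) - 1)]
        cutoff = min_hz + level * (max_hz - min_hz)
        a = 1.0 - np.exp(-2.0 * np.pi * cutoff / sr)  # one-pole coefficient
        y += a * (x - y)
        out[i] = y
    return out
```

In a real-time setting the analysis and mapping would run per block rather than over whole buffers, but the structure (feature extraction, normalisation, parameter mapping) is the same.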
Citation : Emmerson, S., Baalman, M. and Brandtsegg, O. (2018) Instrumentality, perception and listening in crossadaptive performance. ICLI 2018 - International Conference on Live Interfaces (Porto), June 14–16 2018. pp. 86-95
Research Group : Music, Technology and Innovation - Institute for Sonic Creativity (MTI2)
Research Institute : Music, Technology and Innovation - Institute for Sonic Creativity (MTI2)
Peer Reviewed : Yes
- Leicester Media School