Shaping and Exploring Interactive Motion-Sound Mappings Using Online Clustering Techniques
Machine learning tools for designing motion-sound relationships often rely on a two-phase iterative process, where users must alternate between designing gestures and performing mappings. We present a first prototype of a user-adaptable tool that aims at merging these design and performance steps into one fully interactive experience. It is based on an online learning implementation of a Gaussian Mixture Model supporting real-time adaptation to user movement and generation of sound parameters. To allow both fine-tuned modification tasks and open-ended improvisational practices, we designed two interaction modes that let users either shape or guide interactive motion-sound mappings. Considering an improvisational use case, we propose two example musical applications to illustrate how our tool might support various forms of corporeal engagement with sound, and inspire further perspectives for machine learning-mediated embodied musical expression.
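The abstract's core technique, online adaptation of a Gaussian Mixture Model that jointly models motion features and sound parameters, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the stochastic EM-style update, the `OnlineGMM`, `update`, and `predict_sound` names, the fixed `learning_rate`, and the feature layout are all assumptions, and sound generation is shown here as Gaussian Mixture Regression over the joint model.

```python
import numpy as np
from scipy.stats import multivariate_normal


class OnlineGMM:
    """Joint (motion, sound) Gaussian Mixture Model updated one frame at a time.

    Hypothetical sketch: component count, learning rate, and feature layout
    (motion dims first, sound dims last) are illustrative assumptions.
    """

    def __init__(self, n_components, n_motion, n_sound, learning_rate=0.05, seed=0):
        rng = np.random.default_rng(seed)
        d = n_motion + n_sound
        self.n_motion = n_motion
        self.weights = np.full(n_components, 1.0 / n_components)
        self.means = rng.normal(scale=0.1, size=(n_components, d))
        self.covs = np.stack([np.eye(d) for _ in range(n_components)])
        self.lr = learning_rate

    def _responsibilities(self, x, dims):
        # Posterior probability of each component given x over the chosen dims.
        probs = np.array([
            w * multivariate_normal.pdf(x, mean=mu[dims], cov=cov[np.ix_(dims, dims)])
            for w, mu, cov in zip(self.weights, self.means, self.covs)
        ]) + 1e-12  # guard against underflow before normalizing
        return probs / probs.sum()

    def update(self, frame):
        # One stochastic-EM step: each component drifts toward the new joint
        # (motion, sound) frame in proportion to its responsibility, so the
        # mapping adapts in real time while the user keeps moving.
        all_dims = np.arange(self.means.shape[1])
        r = self._responsibilities(frame, all_dims)
        self.weights = (1 - self.lr) * self.weights + self.lr * r
        for k in range(len(self.weights)):
            eta = self.lr * r[k]
            d = frame - self.means[k]
            self.means[k] += eta * d
            # Convex combination keeps the covariance positive definite.
            self.covs[k] = (1 - eta) * self.covs[k] + eta * np.outer(d, d)

    def predict_sound(self, x_motion):
        # Gaussian Mixture Regression: condition the joint model on the
        # observed motion features to generate sound parameters.
        m = self.n_motion
        r = self._responsibilities(x_motion, np.arange(m))
        out = np.zeros(self.means.shape[1] - m)
        for k, rk in enumerate(r):
            mu, cov = self.means[k], self.covs[k]
            gain = cov[m:, :m] @ np.linalg.inv(cov[:m, :m])
            out += rk * (mu[m:] + gain @ (x_motion - mu[:m]))
        return out
```

In a performance loop, one would call `update` on each joint (motion, sound) frame while the user shapes or guides the mapping, and `predict_sound` on live motion features to drive the synthesizer; the learning rate then controls how quickly the mapping follows the user, which is how the sketch collapses the design and performance phases into a single interactive loop.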
NIME 2017, May 2017, Copenhagen, Denmark. Proceedings of the 17th International Conference on New Interfaces for Musical Expression. https://hal.archives-ouvertes.fr/hal-01577806