EDITE: Hugo SCURTO
Identity
Hugo SCURTO
Academic status
Thesis in progress...
Subject: Music Interaction Design Through Co-Exploration
Thesis supervision:
Laboratory: unknown
Neighborhood
Blue ellipse: doctoral student, yellow ellipse: PhD holder, green rectangle: permanent staff member, yellow rectangle: HDR. Green line: thesis advisor, blue line: thesis director, dashed line: mid-thesis evaluation committee or thesis defense committee.
Scientific publications
oai:hal.archives-ouvertes.fr:hal-01577806
Shaping and Exploring Interactive Motion-Sound Mappings Using Online Clustering Techniques
International audience
Machine learning tools for designing motion-sound relationships often rely on a two-phase iterative process, where users must alternate between designing gestures and performing mappings. We present a first prototype of a user-adaptable tool that aims at merging these design and performance steps into one fully interactive experience. It is based on an online learning implementation of a Gaussian Mixture Model supporting real-time adaptation to user movement and generation of sound parameters. To allow both fine-tune modification tasks and open-ended improvisational practices, we designed two interaction modes that either let users shape, or guide, interactive motion-sound mappings. Considering an improvisational use case, we propose two example musical applications to illustrate how our tool might support various forms of corporeal engagement with sound, and inspire further perspectives for machine learning-mediated embodied musical expression.
NIME 2017: Proceedings of the 17th International Conference on New Interfaces for Musical Expression, May 2017, Copenhagen, Denmark. https://hal.archives-ouvertes.fr/hal-01577806
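The online-GMM approach described in the abstract above can be illustrated with a short Python sketch. This is not the authors' prototype: the class name OnlineGMM, the learning rate lr, the stochastic EM update rule, and the toy motion/sound features below are illustrative assumptions only. The sketch fits a joint Gaussian mixture over concatenated (motion, sound) frames one sample at a time, and generates sound parameters from motion via the conditional mixture mean (Gaussian mixture regression), so that designing and performing the mapping happen within the same interactive loop.

import numpy as np

class OnlineGMM:
    # Minimal sketch (not the authors' tool): a joint Gaussian Mixture Model over
    # concatenated (motion, sound) vectors, updated one frame at a time with a
    # stochastic EM step, and queried with Gaussian mixture regression to produce
    # sound parameters from motion alone.
    def __init__(self, n_components, dim_motion, dim_sound, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.K, self.dm = n_components, dim_motion
        self.d = dim_motion + dim_sound
        self.lr = lr  # learning rate of the online updates
        self.weights = np.full(self.K, 1.0 / self.K)
        self.means = rng.normal(scale=0.1, size=(self.K, self.d))
        self.covs = np.stack([np.eye(self.d) for _ in range(self.K)])

    def _responsibilities(self, x):
        # Posterior probability of each component for one joint sample x.
        logp = np.empty(self.K)
        for k in range(self.K):
            diff = x - self.means[k]
            cov = self.covs[k] + 1e-6 * np.eye(self.d)  # small ridge for stability
            inv = np.linalg.inv(cov)
            _, logdet = np.linalg.slogdet(cov)
            logp[k] = (np.log(self.weights[k])
                       - 0.5 * (diff @ inv @ diff + logdet + self.d * np.log(2 * np.pi)))
        p = np.exp(logp - logp.max())
        return p / p.sum()

    def partial_fit(self, motion, sound):
        # One online EM step on a single (motion, sound) frame.
        x = np.concatenate([motion, sound])
        r = self._responsibilities(x)
        for k in range(self.K):
            eta = self.lr * r[k]
            diff = x - self.means[k]
            self.weights[k] += self.lr * (r[k] - self.weights[k])
            self.means[k] += eta * diff
            self.covs[k] += eta * (np.outer(diff, diff) - self.covs[k])
        self.weights /= self.weights.sum()

    def generate_sound(self, motion):
        # Expected sound parameters given a motion frame (Gaussian mixture regression).
        dm = self.dm
        out, total = np.zeros(self.d - dm), 0.0
        for k in range(self.K):
            mu_m, mu_s = self.means[k, :dm], self.means[k, dm:]
            S_mm = self.covs[k][:dm, :dm] + 1e-6 * np.eye(dm)
            S_sm = self.covs[k][dm:, :dm]
            diff = motion - mu_m
            inv = np.linalg.inv(S_mm)
            _, logdet = np.linalg.slogdet(S_mm)
            w = self.weights[k] * np.exp(
                -0.5 * (diff @ inv @ diff + logdet + dm * np.log(2 * np.pi)))
            out += w * (mu_s + S_sm @ inv @ diff)
            total += w
        return out / max(total, 1e-12)

# Toy usage: every incoming frame both adapts the mapping and can drive sound,
# so the design and performance phases take place in a single interactive loop.
gmm = OnlineGMM(n_components=3, dim_motion=2, dim_sound=1)
rng = np.random.default_rng(1)
for _ in range(500):
    m = rng.random(2)           # stand-in for a motion feature frame
    s = np.array([m.sum()])     # stand-in for a target sound parameter
    gmm.partial_fit(m, s)
    _ = gmm.generate_sound(m)   # a sound output is available at every frame
print(gmm.generate_sound(np.array([0.4, 0.6])))

The two interaction modes mentioned in the abstract (shaping versus guiding the mapping) are not modeled here; one could imagine them as policies deciding whether an incoming frame updates the mixture or only queries it.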
oai:hal.archives-ouvertes.fr:hal-01577815
VIMOs: Enabling Expressive Mediation and Generation of Embodied Musical Interactions
International audience
This paper presents a recently-started doctoral project towards a new framework, called Virtual Intelligent Musical Objects (VIMOs). VIMOs aim at combining features of interactive machine learning tools, autonomous agents, and collaborative media into the creation of motion-based, user-adaptable, shareable interactive music systems. We propose two models for stylistic motion learning and generation under a "design through performance" interactive workflow. We discuss further human- and computer-related research challenges involving expressiveness rendering and social musical interaction, as well as novel artistic and educational applications to be led within the scope of VIMOs.
4th International Conference on Movement Computing, 2017, London, United Kingdom. https://hal.archives-ouvertes.fr/hal-01577815
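The VIMOs abstract does not specify its motion models, so the following Python sketch is a purely hypothetical stand-in for "stylistic motion learning and generation under a design-through-performance workflow": a first-order Markov model over quantized motion frames that is updated while the user performs and can generate stylistically similar continuations at any moment. The class name MotionMarkovModel, the quantization into 8 states, and the sine-based toy motion signal are all assumptions made for illustration.

import numpy as np

# Purely hypothetical stand-in (the VIMOs models are not specified in the abstract):
# a first-order Markov model over quantized motion frames, updated while the user
# performs and able to generate stylistically similar continuations at any time.
class MotionMarkovModel:
    def __init__(self, n_states, seed=0):
        self.n = n_states
        self.counts = np.ones((n_states, n_states))  # Laplace-smoothed transition counts
        self.rng = np.random.default_rng(seed)
        self.prev = None

    def observe(self, state):
        # Learn from one quantized motion frame (an integer in [0, n_states)).
        if self.prev is not None:
            self.counts[self.prev, state] += 1.0
        self.prev = state

    def generate(self, start, length):
        # Sample a motion continuation from the learned transition probabilities.
        seq, s = [start], start
        for _ in range(length - 1):
            probs = self.counts[s] / self.counts[s].sum()
            s = int(self.rng.choice(self.n, p=probs))
            seq.append(s)
        return seq

# Toy usage: quantize a live motion feature into 8 states, learn while performing,
# and request a generated continuation at any point in the loop.
model = MotionMarkovModel(n_states=8, seed=1)
phase = 0.0
for _ in range(300):
    phase += 0.2
    state = int((np.sin(phase) * 0.5 + 0.5) * 7.999)  # stand-in for a quantized motion feature
    model.observe(state)
print(model.generate(start=4, length=10))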