The proposed project aims to enhance Human Motion–Computer Interaction through a multidisciplinary approach combining experimental psychology, music technology and computational modelling. Firstly, the project examines skilled activities, in particular music performance, in order to understand fundamental cognitive and psychological aspects of control and expression in human motion. It develops computational models of motor control and expressive variation built from music performance data collected in psychophysical studies. Secondly, the project addresses the implementation of these models in Digital Musical Instruments (DMIs), creating a new type of digital instrument based on sensorimotor learning mechanisms. The resulting DMI is then assessed through a user study in which exploration and engagement are tested over several sessions. The project thereby contributes to two largely uncharted research areas. Firstly, it advances the fundamental understanding of sensorimotor learning processes by considering complex human motion, specifically motion in music performance. Secondly, it represents an original application of computational modelling by modelling expressive musical gestures and transferring these models to interactive systems.