Community Research and Development Information Service - CORDIS


METAGESTUREMUSIC Report Summary

Project ID: 283771
Funded under: FP7-IDEAS-ERC
Country: United Kingdom

Mid-Term Report Summary - METAGESTUREMUSIC (Meta-Gesture Music: Social, Embodied, Interactive Musical Instruments)

The MetaGesture Music (MGM) project set itself the ambitious goal of examining the visceral, embodied relationships we have with music through cultural, technological, and physiological means. The MGM team is led by Prof Atau Tanaka, a leading composer and researcher in the area of gestural, interactive music systems. With the ERC funding, he established a new full professorship in Media Computing at Goldsmiths, University of London. Joining Atau is post-doctoral fellow Dr Baptiste Caramiaux, who received his PhD from IRCAM, Centre Pompidou, conducting research on machine learning techniques to recognise and characterise corporeal movements in musical gesture. Two PhD students are funded by MGM: Alessandro Altavilla, whose thesis looks at sound as a medium for embodied experience, and Marco Donnarumma, who extends Tanaka’s original work on muscle interfaces as musical instruments.

The MGM team works within a new research unit at Goldsmiths, the Embodied AudioVisual Interaction unit (EAVI). This situates MGM’s musical research in the broader contexts of human-computer interaction (HCI), virtual reality, brain-computer interfaces, and gaming. The team are key stakeholders in the design and management of laboratory and studio facilities in Goldsmiths’ iconic Ben Pimlott building and the new St James Hatcham centre. The Goldsmiths Digital Studios (GDS) have audio/video recording, motion capture systems, and physical computing hardware. The Sonics Immersive Media Labs (SIML) is a surround sound/image projection space with 360º seamless video projection (7,680 pixels wide) and 12.2 surround sound.

Our research is experimental and relies on creative practice and participatory methods. Technologies for gesture recognition by machine learning are validated in user trials where subjects demonstrate their ability to reproduce gestures from a vocabulary and to perform expressive variations. Sensing systems that detect physiological signals of muscle tension are used to capture hand and arm gestures in musical performance. Different sensing modes are used in multimodal configurations, exploiting complementarities between electromyogram and mechanomyogram sensing, as well as differences between biosignal sensing and motion sensing. We carry out group activities exploring “sonic incidents” from everyday life as a way to drive brainstorming of novel corporeal interactions with sound in our environment. These interactive systems are taken on stage for performance in concerts in front of live audiences.
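To give a flavour of the kind of gesture-vocabulary recognition described above, the sketch below classifies a sensor trace by dynamic time warping (DTW) against a set of labelled templates. This is a minimal illustration of one common template-matching approach, not the project’s actual algorithm (the MGM team’s machine-learning work is more sophisticated); the gesture names and signal shapes here are hypothetical.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible alignments.
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def classify(query, templates):
    """Return the label of the template nearest to the query under DTW."""
    return min(templates, key=lambda label: dtw_distance(query, templates[label]))

# Hypothetical two-gesture vocabulary: each template is a sensor-amplitude trace.
t = np.linspace(0, 1, 50)
templates = {
    "lift": np.sin(np.pi * t),       # single smooth peak
    "flick": np.sin(4 * np.pi * t),  # fast oscillation
}

# A time-stretched, slightly noisy performance of "lift" still matches,
# because DTW aligns sequences of different lengths and tempos.
rng = np.random.default_rng(0)
query = np.sin(np.pi * np.linspace(0, 1, 70)) + 0.05 * rng.normal(size=70)
print(classify(query, templates))  # → lift
```

The warping step is what makes this robust to the expressive variations mentioned above: a performer may execute a gesture faster or slower than the template without changing its classification.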

The first phase of the project has led to outputs at important conferences in the field, such as ACM CHI (Computer-Human Interaction) and New Interfaces for Musical Expression (NIME). We have also presented our work at related ERC project events, such as Prof Georgina Born’s (ERC AdG MusDig) Music, Digitisation, Mediation: Towards Interdisciplinary Music Studies (Oxford). Our researchers have taken part in the European Sound Studies Association conference. With a book chapter contribution to the MIT Press book Sonic Interaction Design (Franinovic and Serafin), we took part in the book launch at the Zurich Academy of Arts. We organised and hosted the NIME2014 conference in London, with 300 international delegates and a three-day programme of scientific papers as well as public concerts in clubs and concert halls across London.

Reported by

United Kingdom