CORDIS - EU research results

Sensorimotor Foundations of Communicative Musicality

Periodic Reporting for period 2 - MUSICOM (Sensorimotor Foundations of Communicative Musicality)

Reporting period: 2023-03-01 to 2024-08-31

Humans across different ages and societies often mark social and communicative behavior with joint musical activities. Examples span from caregivers singing to babies to regulate their emotional state, up to large groups of perfect strangers singing in unison. While these collective actions have been shown to be universal across human societies and to augment prosocial behavior, it remains largely unclear which neuro-cognitive mechanisms underlie such widespread human “musicality”. Uncovering these mechanisms would yield important societal benefits, not only shedding light on the biological origins of musical behavior, but also enabling applications in education and health, notably in contexts where music is successfully used to promote learning and wellbeing. The current project, MUSICOM, aims to identify the behavioral and neural mechanisms that underlie human musicality. Through a series of experiments involving individuals or dyads, composed of either adults or infants, we are searching for the roots of musicality in humans’ spontaneous movements, produced in response to music or while making music, across different musical contexts including dyadic dance, infant-caregiver interaction, and joint music making. Pairing fine-grained descriptions of full-body kinematics with recordings of neural activity and physiological measures, we aim to characterize the biological bases of human communicative musicality.
The work conducted from the beginning of the project until now can be summarized by the following points:

1. We developed the e-music box, a crucial piece of technology supporting several of the proposed experiments (as detailed in the ERC proposal). In short, this is an electronic instrument permitting participants to make music even if they do not have musical training. This e-music box has been developed by relying entirely on in-house resources.
2. We have developed analytical tools used to decode movement primitives from complex kinematics produced by participants freely dancing to music, or making music, with or without a human partner.
3. The above tools have allowed us to identify specific movements that are driven by music perception, and other movements that serve communicative functions, being driven instead by the actions of a dancing or performing partner.
4. We have also developed analytical tools to decode neural activity from freely moving participants, either dancing or making music. We were able to characterize specific patterns of electrophysiological activity that respond to music, mark self-generated movements, respond to the movements produced by a partner, or even tag the synchrony between one’s own movements and the partner’s. Notably, we were able to tease apart all of these activities even when they occur simultaneously.
5. Supported by a Marie Curie project following up on the main goals of MUSICOM, we have provided evidence that both infants and non-human primates neurally track specific musical features, such as musical timing, and generate expectations about them while listening to music.
6. We have published two review articles summarizing the state of the art of our field and anticipating some of the crucial steps forward that our project aims to take.
Points 2-5 from the list above represent scientific progress beyond the state of the art.
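To give a concrete flavor of the kind of analysis described in point 2, the sketch below decomposes a kinematic recording into a small set of movement primitives with non-negative matrix factorization. This is an illustrative example on synthetic data, not the project's actual pipeline: the choice of method, the number of primitives, and all parameters are assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF

# Synthetic stand-in for full-body kinematics: 500 time samples x 20
# non-negative joint-velocity channels. Real data would come from
# motion capture of participants dancing or making music.
rng = np.random.default_rng(0)
true_primitives = rng.random((4, 20))   # 4 underlying movement primitives
activations = rng.random((500, 4))      # their time-varying weights
kinematics = activations @ true_primitives  # observed signal

# Factorize the kinematic matrix as kinematics ~= W @ H, where H holds
# a small set of movement primitives and W their activation time courses.
model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(kinematics)  # (500, 4) activation time courses
H = model.components_                # (4, 20) movement primitives

reconstruction = W @ H
error = np.linalg.norm(kinematics - reconstruction) / np.linalg.norm(kinematics)
print(f"relative reconstruction error: {error:.3f}")
```

Because the synthetic data are exactly rank 4 and non-negative, the factorization reconstructs them almost perfectly; on real kinematics, the number of components and the reconstruction quality would need to be assessed empirically.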
By the end of the project, we aim to further apply these tools and notions to characterize:
- Infant-caregiver real-time musical interactions.
- How musical expertise shapes body movements and neural activity to support communicative joint music making and listener-directed music performance.
- The causal relationship between neural and behavioral markers of real-time interactions.