How musical rhythm moves humans: functional mechanisms of entrainment and perception-action coupling

Periodic Reporting for period 2 - Rhythm and Brains (How musical rhythm moves humans: functional mechanisms of entrainment and perception-action coupling)

Reporting period: 2020-10-01 to 2022-03-31

Music is a creative practice common to all human societies, with increasingly recognized pro-social and therapeutic effects. In particular, the rhythms of music powerfully compel us to move the body, showcasing the remarkable ability of humans to perceive and produce rhythmic inputs. What are the neural mechanisms underlying this ability for musical rhythm? Identifying the neural mechanisms underlying rhythm perception provides a unique opportunity to uncover the roots of this universal nonverbal means of communication, both across the individual lifespan and over the course of evolution. Moreover, understanding how the ability for musical rhythm maps onto the brain is expected to provide an answer to the debated hypothesis that motor brain regions play an active role in perception. Finally, addressing this question is timely given the growing interest in music-assisted practices for the rehabilitation of sensory and motor disorders caused by brain damage.

The overarching goal of this research program is to advance our knowledge of how the human ability for musical rhythm maps onto the brain. Specifically, we aim to test whether this mapping critically relies on motor brain regions. We also aim to determine whether these neural processes develop early in the infant brain, in particular prior to the acquisition of abilities such as language and sensorimotor synchronization (e.g. the ability to tap the hand in time with a periodic drum beat). Finally, we aim to clarify the extent to which these processes are subject to neural plasticity when sensory or motor function is impaired, and whether training with musical rhythm could boost sensory-motor coupling and thereby help restore the impaired function in patients.
The beat, that is, the perception of temporal periodicities in music, can be considered a cornerstone of musical rhythm perception and production. Even when music is not periodic, humans mentally organize the constituent acoustic rhythms according to internal periodic pulse-like beats. These periodic beats are used to coordinate body movement in time with the music, for example when we spontaneously bob the head or clap the hands along with it.
Crucially, beat perception is considered a high-level perceptual process rather than a direct response entirely determined by the physical properties of the rhythmic sensory input. For example, the perceived beats are not necessarily marked explicitly in music. Many musical scenarios across cultures use highly complex rhythmic patterns in which the perceived beats are not physically cued by prominent acoustic onsets. Yet, people can spontaneously feel the beat with these complex rhythms and move to it. Moreover, a single musical rhythm can lead to different perceived sets of beats and, in turn, various rhythmic patterns can give rise to similar perceived beats. These examples illustrate the remarkable flexibility of the association between rhythmic input and internally perceived beat.
A promising approach to capturing the brain processes underlying beat perception in music is electroencephalography (EEG), which offers the advantage of measuring brain activity at the millisecond scale. In particular, the combination of EEG with frequency-tagging has proven successful at objectively relating the rhythmic input to its processing in the brain.
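To illustrate the frequency-tagging logic, the sketch below (a minimal example on simulated data, not the project's actual analysis pipeline; the frequencies and signal parameters are assumptions chosen for illustration) computes the amplitude spectrum of an EEG-like signal and reads out the amplitude at an assumed beat frequency:

```python
import numpy as np

def amplitude_spectrum(signal, fs):
    """One-sided FFT amplitude spectrum of a 1-D signal."""
    n = len(signal)
    amp = np.abs(np.fft.rfft(signal)) / n  # scaled so a unit sine peaks near 0.5
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, amp

# Simulated 64 s "EEG-like" signal: components at a hypothetical beat
# frequency (1.25 Hz) and stimulus rate (2.5 Hz), buried in noise.
fs = 256.0
t = np.arange(0, 64, 1 / fs)
rng = np.random.default_rng(0)
signal = (0.5 * np.sin(2 * np.pi * 1.25 * t)   # beat-frequency component
          + 1.0 * np.sin(2 * np.pi * 2.5 * t)  # stimulus-rate component
          + rng.normal(0.0, 1.0, t.size))      # background noise

freqs, amp = amplitude_spectrum(signal, fs)
beat_bin = np.argmin(np.abs(freqs - 1.25))
print(f"amplitude at the beat frequency: {amp[beat_bin]:.3f}")
```

With a sufficiently long recording, the frequency bins fall exactly on the beat-related frequencies, so a peak in the spectrum at the beat frequency can be compared objectively between the acoustic input and the recorded brain response.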
Based on this methodological development, key outcomes of the research project so far include the replication and extension of evidence that beat perception is related to selective activity of neural populations at the beat frequencies. This neural activity cannot be trivially explained by bottom-up responses to rhythmic stimuli, as it is also observed when the beat frequencies are not physically prominent, or even present at all, in the acoustic stimulus, thus controlling for acoustic confounds. Moreover, this prominent neural activity at the beat frequencies cannot be fully explained by processing in the peripheral and subcortical relays of the ascending auditory pathway, as demonstrated using biologically plausible models. This suggests that this neural activity might take place at the cortical level. Finally, the neural activity elicited by an auditory rhythm is modulated by the prior auditory rhythmic context: it persists longer when the rhythmic input gradually changes from regular to complex than when it changes from complex to regular. Importantly, this context effect is also reflected in behavioral data from tapping the beat along with the sequences, corroborating the view that this selective neural activity at the beat frequencies is functionally relevant. Together, these findings move us forward in our understanding of the neural underpinnings of musical rhythm: they indicate that beat perception is related to selective neural activity at the beat frequencies, and that this selective neural activity relies on high-level cortical processes that can be shaped over the short term, especially when the rhythmic input is highly complex.
The project progresses beyond the state of the art in several ways. First, it offers an original framework that helps integrate and organize the diversity of studies on beat perception. This framework organizes these studies into four levels based on the sets of processes that would be necessary for beat perception depending on the rhythmic complexity of the input and output. For example, when tapping the finger to metronomic sounds, the prominence of beat periodicity is high in both the input and output (i.e. a strictly periodic acoustic input and, in the best cases, close to perfectly periodic tapping). Producing a periodic output in response to a periodic input is thus likely to rely on lower-level neural processes than other scenarios, such as producing periodic finger tapping in response to highly irregular rhythmic patterns. In the latter case, the rhythmic input does not contain a set of prominent periodicities that can serve as a cue to perceive periodic beats, hence requiring higher-level processes for the beats to be perceived.
Building this four-level theoretical framework requires, in the first place, a measure of the prominence of beat periodicities in these signals. To achieve this, the project proposes a new method to quantify and compare the prominence of periodicities corresponding to the perceived beats across a variety of signals, such as acoustic input, electrophysiological data and body movement.
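One common convention in frequency-tagging analyses is to quantify the prominence of beat-related periodicities as the spectral amplitude at each beat-related frequency after subtracting the mean amplitude of surrounding frequency bins, which corrects for broadband noise. The sketch below is a hedged illustration of that idea, not the project's published method; the function name, neighborhood sizes and summation over frequencies are assumptions:

```python
import numpy as np

def beat_prominence(signal, fs, beat_freqs, n_neighbors=5, gap=1):
    """Illustrative prominence score: summed noise-corrected amplitude
    at the given beat-related frequencies. The noise level at each
    target frequency is estimated from neighboring bins, skipping
    `gap` bins on each side to avoid spectral leakage."""
    n = len(signal)
    amp = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    score = 0.0
    for f in beat_freqs:
        i = int(np.argmin(np.abs(freqs - f)))
        lo = amp[max(i - gap - n_neighbors, 0):max(i - gap, 0)]
        hi = amp[i + gap + 1:i + gap + 1 + n_neighbors]
        noise = np.mean(np.concatenate([lo, hi]))
        score += amp[i] - noise
    return score
```

Because the same score can be computed on any one-dimensional signal sampled in time, the prominence of beat periodicities could in principle be compared directly between, say, a sound's amplitude envelope, the EEG it elicits, and a recording of body movement.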
This measure of the prominence of beat periodicity is critical to understanding how a rhythmic input is processed and represented in a variety of outputs, such as brain data. Therefore, through direct comparison of the prominence of beat periodicities across the rhythmic acoustic input and neural activity collected from a large set of electrodes, either placed on the head surface (non-invasive surface EEG in healthy volunteers) or implanted within the grey matter of the brain (invasive intracerebral EEG in epileptic patients), this project is expected to progress beyond the state of the art by providing unique insight into how beat perception maps onto the human brain. Evidence on this brain mapping is expected to help us uncover the rich set of neural processes underlying beat perception in the human species and beyond, and how these processes are shaped by individual development and learning.
Combining signals of acoustic rhythm, brain activity and body movement to understand beat perception