Periodic Reporting for period 1 - RHYTHMSYNC (Rhythm synchronization between music and spoken language)
Reporting period: 2017-04-01 to 2019-03-31
This project investigated how rhythms in music and language interact by exploiting the fact that the perception of auditory rhythm is often spontaneously accompanied by synchronized rhythmic motor behavior. When we listen to music, we often tap a finger or a foot to the beat. Likewise, when we speak, our hands and head move in synchrony with the prosody of our utterances. How does rhythmic synchronization between language and music – two highly modular cognitive abilities – occur? How does the mind resolve the differences between the basic elements carrying rhythm in language (consonants vs vowels) and in music (beats)? How is the production of musical rhythm synchronized to the speech signal, and does this differ from the way the production of speech or song is synchronized to musical melodies?
This project therefore had the following Objectives:
1) To investigate rhythm perception by looking at how rhythm interacts in two highly modular cognitive abilities – i.e. language and music.
2) To exploit rhythm synchronization as a direct online measure for comparing rhythm in language and music.
3) To determine how the human mind resolves rhythmic conflicts between speech and music.
4) To explore how the predisposition for rhythm synchronization interacts with developmental factors by studying infants at different developmental stages.
The project also investigated rhythm synchronization in infants of different ages, ranging from 5 to 10 months of age. These ages are interesting because they correspond to different developmental milestones. For example, 5-month-old infants produce babbling sounds, are not thought to know many words, and cannot yet fully control their hands and legs. By 7 months of age, infants have acquired considerable finger dexterity for handling objects with agility, already know many common nouns, and have developed considerable rhythmic knowledge. Finally, at the end of the first year, infants begin to utter their first words, begin to walk, and are thought to master the rhythms of their native tongue. It is therefore notable that synchronized rhythmic behavior was observed already in the youngest age group. Developmental changes to rhythm synchronization could consequently only be detected in a graded manner, with stronger results observed for older infants and adults. This suggests that synchronized rhythm perception is fundamental to our cognitive repertoire from the earliest stages of development.
The results of this project show that rhythm perception in music and spoken language can emerge from the same neuro-cognitive resources. This is important because it explains how early exposure to music and regular musical training can enhance our linguistic skills. It also suggests that, depending on the language we speak – different languages have different rhythms – we may be selectively enhancing or hindering our ability to perceive rhythm in music. The results matter further because if our ability to synchronize to rhythm is linked to our linguistic abilities – for example, dyslexic children are known to be worse at beat synchronization – then evidence for shared rhythmic processing between language and music may help to uncover the roots of language- and music-related pathologies. The novel method developed for studying rhythm synchronization may therefore provide the basis for a valuable tool for detecting language, music, and general cognitive pathologies, already in young infants who cannot be instructed to complete linguistic, musical, or cognitive tasks.