Community Research and Development Information Service - CORDIS


UNDER CONTROL Report Summary

Project ID: 323961
Funded under: FP7-IDEAS-ERC
Country: Spain

Mid-Term Report Summary - UNDER CONTROL (Mechanisms of cognitive control and language learning)

How do humans manage to learn language? Most research in this field has focused on how the different types of linguistic knowledge (sounds, words, grammar...) are acquired. However, to achieve this goal, humans must sort out what information is relevant, and this is not always trivial. Worse still, information that is important in one language may be irrelevant in another. For instance, in some languages a change in the position of the stress changes the meaning of words, as with the Spanish words “sábana” (meaning bed sheet) and “sabána” (meaning savannah). Other languages, such as French, do not have this property: stress is meaningless at the word level, as all words are stressed on the last syllable. Given that the acoustic properties characterizing stress (duration, intensity and frequency) are present in any speech signal, how does the listener’s brain manage to extract and organize the relevant information?

The present project aims to understand the relationship between domain-general mechanisms, in particular attention, and language learning. To this end we are comparing individuals across a selected set of tasks tapping different attention and language mechanisms. We are studying how language learning is shaped by attention in infants who are learning their first language, but also in adults who have learned a second language. We are also comparing individuals who grow up in a monolingual environment with bilinguals, as the latter have been claimed to develop specific attentional mechanisms to deal with the need to sort and classify their two languages.

We have collected behavioural, electrophysiological and brain imaging measurements from more than 130 proficient bilingual adults while they performed different linguistic and non-linguistic tasks. Although the data are still being analysed, the results so far suggest that four different factors underlie individual differences in second language learning. The first refers to the linguistic capacities of the individual, in particular the ability to perceive and produce speech, whether native or non-native. A second factor refers to the predictive capacity of the brain when processing auditory information. The third and fourth factors refer to the capacity to integrate audio-visual information and to sensory-motor abilities, respectively. In the following months we hope to unveil other properties of the brains of good and poor second language learners.
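The report does not state which statistical method was used to identify these four factors, but latent factors of this kind are commonly extracted with dimensionality-reduction techniques such as principal component analysis. The following is only an illustrative sketch on synthetic scores: the participant count matches the report, while the task count, loadings and noise level are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical task scores for 130 participants on 8 tasks; in the real
# study these would be the behavioural/EEG/imaging measures.
n_participants, n_tasks = 130, 8
latent = rng.normal(size=(n_participants, 4))    # 4 underlying abilities
loadings = rng.normal(size=(4, n_tasks))         # how each task loads on them
scores = latent @ loadings + 0.1 * rng.normal(size=(n_participants, n_tasks))

# PCA: centre the data, then eigendecompose the covariance matrix.
centred = scores - scores.mean(axis=0)
cov = np.cov(centred, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]                # largest variance first
explained = eigvals[order] / eigvals.sum()       # variance explained per component
```

Because the synthetic data are generated from four latent abilities with little noise, the first four components account for almost all of the variance, which is the kind of pattern that motivates a four-factor description.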

To understand how infants decipher the language code, we are creating a large database with the results of hundreds of infants across several studies investigating how they learn their native language (or languages), how they perceive and pay attention to the world, and how they process social information. We plan to use sophisticated analyses to identify the relationship between the development of their cognitive and social abilities and the acquisition of language. Although we will not be able to obtain comprehensive answers until more data have been collected and analysed across experiments, the results of the individual studies are already providing new and exciting knowledge.

We have observed that during the first two years of life, bilingual infants pay more attention to the mouths of people than to their eyes; moreover, they are less able to shift their attention from the mouth to the eyes when new information appears in the eye region of a character. This increased attention to the mouth is not restricted to linguistic stimuli: bilingual infants look more at the mouth than monolingual infants do even when they see non-speaking faces, for instance laughing or crying ones.

Our results have also revealed what is different in infants’ brains when they listen to their native language. We have observed differences between monolinguals and bilinguals in oscillatory changes in the theta band range (4-8 Hz) when they hear a native language (but not when they hear an unfamiliar language). The bilinguals’ higher values in theta band synchronization could reflect higher cognitive demands compared to monolinguals.

Finally, several of our studies provide converging evidence that, from very early in life, infants are sensitive to the social (group membership) cues that language provides. Infants seem to treat preferentially the actions and information that individuals speaking their own language perform or provide.
Language thus seems to be an important cue that triggers an increase in attention to incoming information, thereby likely increasing learning from individuals with whom we share culture in general, and language in particular.
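The theta-band measure mentioned above is a quantity computed from the EEG spectrum: the power of the signal within 4-8 Hz. The report does not describe its analysis pipeline, so the sketch below only illustrates the general idea on a synthetic signal, using a simple FFT-based power estimate; the sampling rate and signal composition are invented for the example.

```python
import numpy as np

def band_power(signal, fs, f_lo=4.0, f_hi=8.0):
    """Mean spectral power of `signal` within [f_lo, f_hi] Hz (theta by default)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

# Synthetic 2-second "EEG" trace sampled at 250 Hz: a strong 6 Hz theta
# component plus a weaker 20 Hz beta component.
fs = 250
t = np.arange(0, 2, 1.0 / fs)
eeg = np.sin(2 * np.pi * 6 * t) + 0.3 * np.sin(2 * np.pi * 20 * t)

theta = band_power(eeg, fs)             # power in 4-8 Hz
beta = band_power(eeg, fs, 13.0, 30.0)  # power in 13-30 Hz, for comparison
```

Comparing such band-power values between conditions (native vs. unfamiliar language) or groups (monolingual vs. bilingual infants) is the kind of contrast behind the theta-band synchronization finding; real EEG pipelines would add filtering, artefact rejection and averaging over trials and electrodes.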


Eva Martin (Head of the Research Services)
Tel.: +34935422078
Record Number: 187732 / Last updated on: 2016-08-23