Periodic Reporting for period 1 - AVISSO (Audiovisual Speech Segmentation and Oscillations)
Reporting period: 2016-05-02 to 2018-05-01
- Implementation of the experimental paradigms (including task design, creation and rating of the stimuli, and programming of the task for the EEG and fMRI studies).
- EEG data acquisition and analyses.
- fMRI piloting.
- Collaborations with colleagues on different related projects.
- Student supervision, tutoring of problem-based lectures, and grading.
- Preparation and submission of manuscripts for publication.
- Securing a 4-year postdoctoral fellowship (Sir Henry Wellcome postdoctoral fellowship, UK), following the present fellowship.
Main results:
The main results of the project can be summarized as follows:
(1) When semantic content was degraded, listeners successfully relied on corresponding visual and auditory prosodic features to decide whether the two modalities were presented synchronously or asynchronously. In contrast, when lip-movement information was removed with a blurred mask over the speaker's face, listeners were still able to match the visual and auditory modalities, but only when the audiovisual temporal alignment was intact. These results established that listeners not only rely on visual-auditory correspondence at the syllabic time scale, but also extract the temporal structure conveyed at the slower delta time scale by prosodic information to successfully process multimodal speech.
(2) At the neural level, we found a specific increase of delta (1-3 Hz) power when the visual and auditory modalities were presented asynchronously as compared to synchronously. This increase in power may reflect greater difficulty for listeners in extracting the temporal structure of prosody when the visual and auditory speech inputs are misaligned.
(3) In line with our hypothesis, the increase in delta power was found over a left fronto-central sensor area, which may reflect the convergence of visual and auditory delta information to generate high-level temporal predictions that improve audiovisual speech processing (i.e., engaging the supplementary motor area, SMA).
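To illustrate the delta-band power measure referred to above, the following is a minimal, hypothetical sketch of computing average spectral power in the 1-3 Hz band with a plain periodogram. It is not the project's actual analysis pipeline (EEG studies typically use dedicated toolboxes); the sampling rate and the toy signals are assumptions for demonstration only.

```python
import numpy as np

def band_power(signal, fs, fmin=1.0, fmax=3.0):
    """Average periodogram power of `signal` within [fmin, fmax] Hz."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)          # frequency axis in Hz
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * n)  # simple periodogram
    mask = (freqs >= fmin) & (freqs <= fmax)
    return psd[mask].mean()

# Toy demo (hypothetical data): a 2 Hz component falls in the delta band,
# a 10 Hz component does not.
fs = 250                        # assumed EEG sampling rate in Hz
t = np.arange(0, 4, 1 / fs)     # 4 s of samples
delta_like = np.sin(2 * np.pi * 2 * t)
alpha_like = np.sin(2 * np.pi * 10 * t)

print(band_power(delta_like, fs) > band_power(alpha_like, fs))  # True
```

A contrast such as asynchronous vs. synchronous conditions would then compare this band power across trials, rather than across toy sinusoids as here.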
Exploitation and dissemination:
Conference Presentations:
1. Branzi, F. M., Biau, E., Martin, C. D., & Costa, A. (2017). Bilingual lexical access is triggered by the intention to speak: behavioral and ERP/EEG evidence. Dutch Neuroscience Meeting, June 15-16, Lunteren (The Netherlands). Poster presentation.
2. Branzi, F. M., Biau, E., Martin, C. D., & Costa, A. (2017). Bilingual lexical access is triggered by the intention to speak: behavioral and ERP/EEG evidence. Cognitive Neuroscience Society (CNS) Annual Meeting, March 25-28, San Francisco (USA). Poster presentation.
3. Biau, E. (2017). Beat gestures in audiovisual speech: Prosody extends to the speaker’s hands. Invited speaker at the Max Planck Institute, Nijmegen, Gesture Centre (The Netherlands).
4. Biau, E. (2016). Beat gestures and speech processing: When prosody extends to the speaker’s hands. SNL (London), sensorimotor speech symposium at UCL. Talk.
Manuscripts:
1. Biau, E., Fromont, L., & Soto-Faraco, S. (2017). Beat gestures and syntactic parsing: An ERP study. Language Learning.
2. Fromont, L. A., Soto-Faraco, S., & Biau, E. (2017). Searching high and low: Prosodic breaks disambiguate relative clauses. Frontiers in Psychology, 8:96.
3. Biau, E., & Kotz, S. A. Lower beta: a central coordinator of temporal prediction in multimodal speech (in review).
4. Schwartze, M., Brown, R. M., Biau, E., & Kotz, S. A. Timing the “magical number seven”: effects of temporal structure on verbal working memory (in review).
5. Biau, E., Gunter, T., & Kotz, S. A. Audiovisual speech processing relies on multimodal prosody integration (in preparation).
6. Schultz, B. G., Biau, E., & Kotz, S. A. Frame rate control during audiovisual presentation in EEG paradigms: a new toolbox (in preparation).
7. Biau, E., Schultz, B. G., Schwartze, M., & Kotz, S. A. Mind the Gap: oscillatory correlates of temporal predictions across modalities (in preparation).
8. Schultz, B. G., Biau, E., Schwartze, M., & Kotz, S. A. Oscillatory correlates of temporal predictions across modalities in a sensorimotor task (in preparation).