Community Research and Development Information Service - CORDIS

Studying multisensory integration

In many situations, what people see is affected by what they hear, and vice versa. An EU-funded study explored the effects of multisensory perception.
The project 'Neural correlates of predictive mechanisms in multisensory perception' (PREDICTIVENEUROSENS) developed a series of experiments to test how people integrate multisensory information. The study was based on the hypothesis that the brain integrates visual and auditory cues to predict future events.

In an earlier study, this ability was illustrated by an experiment showing that sounds synchronised with a visual target made that target easier to detect. Other research also provided evidence for multisensory integration and for the functional flexibility of cortical regions. These previous findings set the stage for the PREDICTIVENEUROSENS study.

Researchers trained groups of participants to discriminate between red and green random-dot kinematograms. One group was trained with auditory information congruent with the visual cues, one group received only the visual cues, and one group heard sounds unrelated to the visual cues. Results showed that participants trained with congruent auditory and visual cues performed significantly better than those in the other groups.
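The three-condition design described above can be sketched as a small simulation. The accuracy rates below are purely illustrative assumptions, not the study's actual data; they merely encode the reported ordering (congruent training outperforming the other two conditions).

```python
import random

random.seed(0)

# Hypothetical per-trial accuracy rates for each training condition
# (illustrative assumptions, not values reported by the study).
P_CORRECT = {"congruent": 0.85, "visual_only": 0.70, "unrelated": 0.68}

def simulate_group(condition, n_trials=500):
    """Simulate n_trials of red/green kinematogram discrimination
    and return the group's observed accuracy."""
    p = P_CORRECT[condition]
    correct = sum(random.random() < p for _ in range(n_trials))
    return correct / n_trials

accuracies = {cond: simulate_group(cond) for cond in P_CORRECT}
```

Under these assumed rates, the simulated congruent group reliably scores highest, mirroring the qualitative result the study reports.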

Investigators also found that those given the congruent cues learned and retained the information differently. Based on brain activity in post-experimental recall assessments, it became clear that the group trained with congruent information used more integrated neural pathways to perform the task than the other groups did. This finding shows that the brain optimises and stores information about a visual task by recruiting other brain areas, providing further evidence of the brain's capacity to make use of the full range of sensory information.

A second study was performed with the help of patients with schizophrenia. They were asked to watch desynchronised audiovisual speech, report what they heard, and judge the synchrony of the sound and the person speaking. Patients displayed an impairment in timing audiovisual information but were not impaired in their comprehension of audiovisual speech.
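A synchrony-judgment task of this kind is typically analysed by varying the stimulus onset asynchrony (SOA) between audio and video and measuring how often each offset is judged synchronous. The sketch below, with hypothetical toy responses rather than the study's data, computes such a curve and a crude estimate of the temporal window within which audio and video are bound together.

```python
from collections import defaultdict

def synchrony_curve(trials):
    """trials: iterable of (soa_ms, judged_synchronous) pairs, where
    judged_synchronous is 1 or 0. Returns {soa: proportion judged
    synchronous}."""
    counts = defaultdict(lambda: [0, 0])
    for soa, sync in trials:
        counts[soa][0] += sync
        counts[soa][1] += 1
    return {soa: s / n for soa, (s, n) in sorted(counts.items())}

def binding_window(curve, threshold=0.5):
    """Width (ms) of the SOA range judged synchronous at least
    `threshold` of the time -- a rough temporal binding window."""
    inside = [soa for soa, p in curve.items() if p >= threshold]
    return (max(inside) - min(inside)) if inside else 0

# Toy responses: negative SOA means the audio leads the video.
trials = [(-200, 0), (-200, 0), (-100, 1), (-100, 0),
          (0, 1), (0, 1), (100, 1), (100, 1), (200, 0), (200, 1)]
curve = synchrony_curve(trials)
```

A widened binding window on such a measure is one way a timing impairment like the one reported here could show up, while speech comprehension, scored separately, remains intact.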

These findings suggest that while timing is essential to multisensory processing, our conscious representation of timing entails separate mechanisms. These findings can be used as a basis for further research and development on sensory substitution devices that take advantage of cross-sensory mapping in the brain.

Related information


Multisensory integration, multisensory perception, brain dynamics, neural correlates, predictive coding, auditory cues, visual cues, neural pathways, timing, simultaneity, multisensory processing, MEG, attention, supramodal processing, automaticity