Community Research and Development Information Service - CORDIS

Final Report Summary - BRAINSENSE (Multisensory processing in cortical networks underlying the formation of supramodal percepts)

In this project, we propose to test the hypothesis that heteromodal interactions in cortical areas classically defined as modality-specific play a role in multisensory integration and in the emergence of supramodal object perception. More specifically, we investigate whether interactions between sensory modalities shape early sensory representations of a unimodal stimulus towards a more invariant representation of the object, one that can be triggered by a variety of combinations of inputs according to multisensory evidence and the behavioral context. As a concrete example, the idea is to evaluate whether the auditory cortex representation of a tone resembling that of a bell becomes more similar to the representation of the real bell tone when a picture of a bell is shown at the same time. Using awake mice as a model system, the project covers the auditory, visual, tactile and olfactory modalities in order to assess whether early cross-modal interactions in the cortex correspond to a generic computational process required for multimodal processing, or rather to atypical integrative processes subserving strong natural constraints in the perception of certain modality pairs. We also plan to use optogenetic perturbations of the intracortical heteromodal connections linking primary sensory areas to assess their role in the cross-modal representation of multisensory stimuli.
During the reporting period, we first set up, as planned, the apparatus and protocols necessary to perform the experiments. This included:
1/ A two-photon microscope setup to perform GCaMP6-based two-photon calcium imaging or tetrode recordings of large neuronal populations in mouse cortex during wakefulness.
2/ A multichannel electrophysiology setup to record large neuronal populations in cortical areas that are not optically accessible.
3/ An optogenetic setup.
4/ Stimulation devices and protocols for audition, vision, olfaction and whisker sensing.
We have also started to perform awake two-photon imaging experiments to assess multimodal interactions during the perception of looming and receding stimuli, in particular in the visual and auditory cortices.
We found that the visual cortex displays strong auditory responses which, for the most part, add linearly to purely visual responses. However, in a non-negligible fraction of neurons (~10%), auditory-visual stimulation leads to non-additive responses that signal a variety of auditory-visual coincidences. We also found that this cross-modal code is asymmetric between vision and audition, with a lack of visual responses in the auditory cortex. Our results indicate that specific representations of the auditory-visual scene are built very early on in the visual system of the mouse, challenging current models of multisensory processing.
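The additivity analysis described above can be illustrated with a minimal sketch: compare each neuron's response to combined auditory-visual stimulation against the sum of its unimodal responses, and flag neurons whose bimodal response deviates from that linear prediction. All variable names, the simulated response values and the deviation threshold below are hypothetical, chosen only to mirror the ~10% non-additive fraction reported; this is not the project's actual analysis pipeline.

```python
import numpy as np

# Hypothetical simulated data: per-neuron mean responses (arbitrary units)
rng = np.random.default_rng(0)
n_neurons = 100
r_visual = rng.normal(1.0, 0.3, n_neurons)    # visual-only responses
r_auditory = rng.normal(0.5, 0.2, n_neurons)  # auditory-only responses

# Most neurons sum linearly; give ~10% a supra-additive component
r_av = r_visual + r_auditory
nonadditive = rng.random(n_neurons) < 0.1
r_av[nonadditive] += 0.8

# Additivity index: deviation of the bimodal response from the linear sum
additivity_index = r_av - (r_visual + r_auditory)
is_nonadditive = np.abs(additivity_index) > 0.5  # hypothetical threshold
print(f"{is_nonadditive.mean():.0%} of neurons flagged as non-additive")
```

In real recordings the threshold would be replaced by a statistical test (e.g. comparing trial-by-trial bimodal responses against the summed unimodal distributions), but the core comparison — bimodal response versus linear sum — is the same.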
We hope that, together with optogenetic manipulations of the intercortical connections potentially underlying these multisensory interactions and the investigation of similar effects in the olfactory and tactile cortical pathways, we will reach by the end of the project a greatly extended description of multisensory processing in the brain that could, in the long term, inspire new intelligent sensing technologies.
