DynaSens Report Summary

Project ID: 646657
Funded under: H2020-EU.1.1.

Periodic Reporting for period 1 - DynaSens (Understanding the neural mechanisms of multisensory perception based on computational principles)

Reporting period: 2015-11-01 to 2017-04-30

Summary of the context and overall objectives of the project

The fact that we are equipped with multiple sensory modalities, such as vision, hearing or touch, provides considerable benefits for perception. Depending on the current needs and the composition of the sensory environment, we can selectively integrate information across the different senses. For example, the adaptive combination of multiple sensory inputs increases our ability to identify objects in a sensory scene or to communicate with one another in a noisy environment. At the same time, our senses can fool us when information is incorrectly combined across the senses, such as during the ventriloquist illusion. While our brain efficiently handles these multiple sensory inputs, we still have a limited understanding of the neural mechanisms that underlie multisensory integration. This project seeks to advance our knowledge of how the brain processes its sensory environment by linking the underlying brain mechanisms with specific multisensory computations and with perception.
Understanding the neural mechanisms underlying multisensory integration is important not only to reveal how the brain creates a coherent and unified percept of our environment, but also to pinpoint mechanisms that may be relevant for perceptual deficits that emerge during aging or in specific clinical conditions. For example, specific deficits in multisensory perception, more so than in unisensory perception, appear to emerge in the elderly and to be present in individuals with autism. Furthermore, knowledge about the perceptual and neural principles underlying multisensory processing is also relevant for the design of human-computer interfaces, as these pose specific challenges for the combination of multiple sensory cues from the environment.
This project aims to advance our basic understanding of the neural mechanisms underlying multisensory integration by addressing the following timely questions: What are the neural processes transforming multiple sensory inputs into a unified representation guiding behaviour? How does the brain control the dynamic weighting of multiple inputs and assign these to either a single cause or to multiple causes? Which perceptual and neural processes are affected in the multisensory deficits seen in autistic individuals or the elderly? We address these questions by combining neuroimaging in human volunteers with advanced statistical analysis of brain activity, in conjunction with modelling approaches that link brain activity to specific processes involved in multisensory integration.
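To make the question about single versus multiple causes concrete, the widely used Bayesian causal-inference formulation (stated here purely as an illustration of the computational framework, not as a method or result of this project) computes the posterior probability that the auditory and visual measurements x_A and x_V arise from one common cause, and then averages the 'common cause' and 'separate causes' estimates in proportion to that probability:

\[
P(C{=}1 \mid x_A, x_V) = \frac{P(x_A, x_V \mid C{=}1)\,P(C{=}1)}{P(x_A, x_V \mid C{=}1)\,P(C{=}1) + P(x_A, x_V \mid C{=}2)\,\bigl(1 - P(C{=}1)\bigr)},
\qquad
\hat{s} = P(C{=}1 \mid x_A, x_V)\,\hat{s}_{C=1} + \bigl(1 - P(C{=}1 \mid x_A, x_V)\bigr)\,\hat{s}_{C=2}.
\]

Here C denotes the number of underlying causes, and \hat{s}_{C=1} and \hat{s}_{C=2} are the optimal estimates under the single-cause and separate-causes interpretations, respectively.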
We hope that by following this agenda we can provide a more principled and comprehensive understanding of how the brain handles and merges multiple sensory inputs, and pave the way towards a framework for addressing the pressing problems associated with multisensory perceptual deficits seen in cognitive disorders and across the life span.

Work performed from the beginning of the project to the end of the period covered by the report and main results achieved so far

Our ongoing work focuses on the brain mechanisms underlying the dynamic weighting of sensory information arriving via the auditory and visual senses. We focus on different sensory scenarios, defined by different types of stimuli. In the last period we have implemented studies involving visual and acoustic motion cues, judgements about the location of brief stimuli, ratings of temporal flicker, as well as speech. For each kind of stimulus we implemented psychophysical tasks that require judgements about specific sensory attributes, such as locating a stimulus or comprehending a specific word. We then combined these behavioural tasks with high-resolution neuroimaging of brain activity, implemented via either electroencephalography or magnetoencephalography, both of which provide non-invasive measurements of electrical brain activity.
To analyse these measurements of brain activity, and to extract and disentangle the different neural processes involved in either the analysis of uni-sensory (sensory-specific) information or in the merging of sensory information, we used sophisticated statistical and model-based analyses. To achieve this ambitious aim we have developed two analysis methodologies that link single-trial measures of brain activity with specific sensory attributes, with model-based predictions about the sensory integration process, or with the participant's behavioural choice on each trial. We are currently applying these to the different sensory paradigms under consideration to disentangle processes mediating cue integration from other uni- and multisensory processes and from choice-related activity, in order to map the transformation from sensory inputs to perception. This should provide a detailed understanding of the neural mechanisms underlying cue integration ('where' and 'how'). In addition, this should reveal those brain structures that control the dynamic modality weighting, i.e. those areas mediating top-down control over sensory selection and integration.
The results so far indicate a cascade of processes: uni-sensory information is processed first, within the first 100 ms following stimulus onset, and is dynamically adjusted according to the reliability of each sensory input. Shortly thereafter, between 100 and 300 ms depending on the type of stimulus, multisensory information is combined and weighted according to each modality's reliability. This integration process already emerges in high-level, task-relevant sensory cortices.
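The reliability-weighted combination described above is commonly formalised as maximum-likelihood ('forced fusion') cue integration. The snippet below is a minimal sketch of that textbook computation, not the project's analysis code; the function name and all values are hypothetical.

```python
# Minimal sketch of reliability-weighted (maximum-likelihood) cue integration.
# Each cue's weight is proportional to its reliability, i.e. the inverse of its variance.
import numpy as np

def fuse(est_aud, est_vis, sigma_aud, sigma_vis):
    """Combine two noisy estimates, weighting each by its reliability (1 / variance)."""
    rel_aud = 1.0 / sigma_aud ** 2
    rel_vis = 1.0 / sigma_vis ** 2
    w_aud = rel_aud / (rel_aud + rel_vis)              # weight grows with the cue's reliability
    fused = w_aud * est_aud + (1.0 - w_aud) * est_vis
    fused_sigma = np.sqrt(1.0 / (rel_aud + rel_vis))   # fused estimate is at least as precise as either cue
    return fused, fused_sigma

# Example: a precise auditory location estimate (sigma = 2 deg) dominates a blurred visual one (sigma = 4 deg)
print(fuse(est_aud=10.0, est_vis=2.0, sigma_aud=2.0, sigma_vis=4.0))
```

Under this scheme the fused estimate is pulled towards the more reliable cue, and its variance is never larger than that of either cue alone, which is the behavioural signature typically tested in such paradigms.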

Progress beyond the state of the art and expected potential impact (including the socio-economic impact and the wider societal implications of the project so far)

By comparing the mechanisms of sensory integration across different tasks we begin to see and disentangle common mechanistic patterns as well as task- or stimulus-specific processes, a distinction that has often been neglected in previous work. Unravelling the cascade of uni- and multisensory processes underlying perception will pave the way to pinpointing those processes that are affected in conditions where sensory integration fails and behavioural performance consequently declines, such as in the elderly or in specific clinical conditions. Finally, in the context of this work we have also established advanced procedures for model-based and information-theoretic data analysis that can be used by other researchers and that may also prove useful in the context of very different scientific questions.
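To give a flavour of the information-theoretic analyses mentioned above, the snippet below shows a simple plug-in estimate of the mutual information between a stimulus attribute and a single-trial response amplitude. It is a minimal sketch on simulated data, not the project's actual pipeline, and all names and values are hypothetical.

```python
# Minimal sketch: plug-in mutual information (in bits) between a binary stimulus
# attribute and a binned single-trial EEG/MEG response amplitude.
import numpy as np

def mutual_information(stimulus, response, n_bins=4):
    """Estimate MI between a discrete stimulus and a continuous response via equipopulated binning."""
    edges = np.quantile(response, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
    resp_binned = np.digitize(response, edges)                 # bin the response into n_bins classes
    joint, _, _ = np.histogram2d(stimulus, resp_binned,
                                 bins=[np.unique(stimulus).size, n_bins])
    p_joint = joint / joint.sum()
    p_stim = p_joint.sum(axis=1, keepdims=True)
    p_resp = p_joint.sum(axis=0, keepdims=True)
    nz = p_joint > 0                                           # avoid log(0) for empty cells
    return float(np.sum(p_joint[nz] * np.log2(p_joint[nz] / (p_stim @ p_resp)[nz])))

# Simulated example: 200 trials in which the response carries a weak stimulus signal plus noise
rng = np.random.default_rng(0)
stim = rng.integers(0, 2, 200)
resp = 0.8 * stim + rng.normal(size=200)
print(mutual_information(stim, resp))
```

In practice, estimates of this kind call for bias correction and permutation-based significance testing, and would be computed per sensor and time point to trace when and where stimulus information is reflected in the neural signal.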