Periodic Reporting for period 1 - MultisensoryIntegration (Multisensory Integration in Time and Space)
Reporting period: 2016-03-11 to 2018-03-10
Physiological studies of sensory perception have historically focused on isolated sensory modalities. However, humans constantly combine multimodal sensory cues to better understand their environment, such as matching lip movements to auditory cues during a conversation. Mounting evidence suggests that brains are optimized to process naturalistic sensory cues, yet despite the ethological significance of audiovisual integration, little is known about its underlying neural mechanisms. Thus, establishing how and where multisensory cues are integrated has wide-ranging importance for understanding the principles of sensory processing.
Several competing models have attempted to explain how sensory information, correlated in space or time, is combined within or across sensory modalities. For example, whether sensory information is exchanged between primary sensory cortices remains a controversial topic. Without recordings of cortical activity during multisensory integration, these controversies will remain unresolved. My proposed research will provide critical experimental data to constrain existing models of sensory processing and inform future research.
Furthermore, deficits in combining multisensory cues have been linked to a number of psychological disorders, including autism and schizophrenia. Although it has been established that autistic individuals show decrements in recognizing audiovisual temporal correlations, until we understand how and where the brain integrates this information, clinical progress will be limited. By establishing a rodent model of temporal multisensory integration, my work will create a new tool to probe the mechanistic underpinnings of these disorders.
Thus, my proposed research both advances our basic understanding of sensory processing and bears direct clinical relevance to existing disorders.
I proposed to characterize the role of cortex in both the spatial and temporal aspects of multisensory integration by exploiting the advanced behavioural repertoire and recording techniques available in the mouse model system. This project relies on the completion of three key objectives:
1) Train mice to perform multisensory spatial and temporal integration tasks.
2) Characterize cortical regions which respond to audiovisual correlations in space and time.
3) Determine which cortical regions are required for multisensory behaviours.
Using behavioural training chambers, I have trained mice to perform simple tasks that demonstrate a basic level of audiovisual integration. Thus, although I utilized only nine months of the originally scheduled twenty-four-month grant period, I have made substantial progress toward the completion of my first objective: to train mice to perform multisensory spatial and temporal integration tasks.
In addition, I have used two-photon microscopy to record from thousands of neurons in the visual cortex of multiple mice. These data are still preliminary, but they will reveal to what extent this classically unisensory region is involved in processing multisensory stimuli. This is a key step toward the completion of my second objective: to characterize cortical regions which respond to audiovisual correlations in space and time.