Human visual perception is one of the best-studied areas of research on the human mind. However, 99% of that research is concentrated on the central region making up less than 1% of our visual field. This is the region that is mapped onto the fovea, where vision is best. Yet information from the peripheral parts of a scene is highly important: mediated by attention and eye movements, it is essential for guiding us through our environment. In the brain, the foveal and peripheral parts of the visual field undergo vastly different processing regimes. Since objects normally do not change their appearance whether we view them foveally or peripherally, our visual system must integrate and calibrate peripheral information gathered before an eye movement with foveal information gathered after it.
We plan to address these processes in four series of experiments. First, we will study the perception of basic visual features, such as orientation, numerosity and colour, across the visual field, and their integration in peripheral and foveal vision across eye movements. Second, we will investigate how this integration is supported by attention and memory resources. Third, since the integration requires learning and plasticity, we will track changes across the life span and study how healthy subjects can learn to compensate for artificial alterations of peripheral and foveal vision. Fourth, we will explore whether we can manipulate the integration process to optimally guide eye movements in complex natural search tasks.
The project will provide insights into how the brain achieves a stable and homogeneous representation of the visual environment despite ever-changing sensory input and the inhomogeneity of processing across the visual field. We will reveal the basic learning mechanisms that allow a continuous calibration of peripheral and foveal vision, and that could, in the long run, be used for behavioural training of patients suffering from vision impairments.