
Calibration and integration of peripheral and foveal information in human vision

Periodic Reporting for period 4 - PERFORM (Calibration and integration of peripheral and foveal information in human vision)

Reporting period: 2020-10-01 to 2021-06-30

Contrary to our daily experience, visual processing is not homogeneous across the visual field. Visual contrast sensitivity and acuity peak at the center of the visual field, the so-called fovea, and decline towards the periphery. Foveal vision is used to identify small spatial details, for instance letters during reading. Peripheral vision is essential to guide us through the environment. Despite these large differences in visual processing between the periphery and the fovea, objects normally look the same whether we view them foveally or peripherally. We investigated how the visual system calibrates and integrates peripheral information before an eye movement with foveal information after an eye movement. Our results indicate that attended, task-relevant information can be integrated by a resource-limited mechanism. This integration can be nearly optimal and follows computational principles similar to those governing the integration of information from different sensory modalities.
Combining psychophysical experiments and computational modelling, we were able to show that peripheral information before an eye movement and foveal information after an eye movement are combined close to the statistical optimum. This means that the visual system has access to the relative quality of peripheral and foveal information and weights them accordingly. We found near-optimal integration performance for low-level visual features, such as orientation and color, as well as for high-level visual features, such as numerosity, suggesting that integration across eye movements is a general principle of visual processing. High-level features are integrated even when low-level features are changed during the eye movement, indicating that integration operates on an abstract rather than a pictorial representation. Integration performance is impaired by distracting attention and by increasing memory load, suggesting that integration across eye movements relies on limited resources of attention and memory. However, integration is not constrained to the eye movement target but can also occur flexibly at other, task-relevant locations in the visual field.
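
For readers unfamiliar with the computational benchmark, the sketch below shows the reliability-weighted (maximum-likelihood) integration rule that "statistically optimal" refers to here: each estimate is weighted by its inverse variance, and the combined estimate is more precise than either input. The function name and all numerical values are illustrative, not data from the project.

```python
import numpy as np

def integrate(mu_peripheral, sd_peripheral, mu_foveal, sd_foveal):
    """Combine a peripheral (pre-saccadic) and a foveal (post-saccadic)
    estimate, weighting each by its reliability (inverse variance)."""
    r_p = 1.0 / sd_peripheral ** 2       # reliability of the peripheral estimate
    r_f = 1.0 / sd_foveal ** 2           # reliability of the foveal estimate
    w_p = r_p / (r_p + r_f)              # weight given to the periphery
    mu = w_p * mu_peripheral + (1 - w_p) * mu_foveal
    sd = np.sqrt(1.0 / (r_p + r_f))      # integrated estimate is more precise
    return mu, sd, w_p

# Example: a noisy peripheral preview and a sharper foveal view of an orientation.
mu, sd, w_p = integrate(mu_peripheral=43.0, sd_peripheral=4.0,
                        mu_foveal=45.0, sd_foveal=2.0)
print(f"integrated estimate: {mu:.1f} deg, sd: {sd:.2f}, peripheral weight: {w_p:.2f}")
```

In this illustration the foveal view is twice as precise, so it receives four times the weight, which is the pattern expected of an observer with access to the relative quality of the two signals.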

Interestingly, we observed one exception where peripheral and foveal information were not optimally weighted: under dim lighting conditions, participants trusted foveal information more than peripheral information, even though the foveal information was inferred rather than veridical, because the fovea lacks rod photoreceptors. This suggests that the visual system is overconfident in its own inferences.
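
To make this deviation concrete, the short sketch below contrasts the peripheral weight an optimal observer would assign under dim light with a hypothetical observed weight that favours the fovea; the noise values and the observed weight are illustrative assumptions, not measured results.

```python
# Illustrative comparison of optimal vs. observed weighting (hypothetical numbers).
# Under dim light the fovea carries no rod signal, so its "estimate" is inferred;
# an optimal observer should therefore strongly down-weight it.
sd_peripheral, sd_foveal = 2.0, 6.0      # assumed sensory noise (arbitrary units)
w_p_optimal = (1 / sd_peripheral ** 2) / (1 / sd_peripheral ** 2 + 1 / sd_foveal ** 2)
w_p_observed = 0.5                       # hypothetical: fovea trusted too much
print(f"optimal peripheral weight: {w_p_optimal:.2f}, observed: {w_p_observed:.2f}")
```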

With respect to the calibration of peripheral and foveal information, we found that there can be small mismatches in appearance between the periphery and the fovea. These appearance differences are taken into account in perception across eye movements: stimulus changes during an eye movement are easier to detect if they are inconsistent with the typical appearance difference than if they are consistent with it. This indicates that the visual system predicts the foveal appearance based on peripheral information and generates an expectation of the typically experienced difference between peripheral and foveal appearance. Furthermore, visual processing after an eye movement is strongly affected by visual stimulation before the eye movement and is optimized for the uptake of new information.
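
As a simple illustration of this prediction account, the sketch below assumes the expected foveal appearance equals the peripheral appearance plus the typical peripheral-to-foveal shift, so that equally large physical changes are more salient when they run against that shift; all quantities are invented for illustration.

```python
# Illustrative values only: a typical peripheral-to-foveal appearance shift and a
# pre-saccadic (peripheral) feature value on some arbitrary appearance scale.
typical_shift = -1.5
peripheral_value = 30.0
expected_foveal = peripheral_value + typical_shift   # predicted foveal appearance

for change in (+3.0, -3.0):              # same change size, opposite directions
    post_saccadic = peripheral_value + change
    surprise = abs(post_saccadic - expected_foveal)  # deviation from expectation
    print(f"change {change:+.1f}: deviation from expectation = {surprise:.1f}")
```

The change that opposes the typical shift deviates more from the expectation and would therefore be easier to detect, matching the asymmetry described above.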
Our results revealed multiple interactions of peripheral and foveal vision and showed that the weighting of peripheral and foveal information is accurate and statistically optimal under some but not all conditions. Deviations from the optimal weighting are particularly interesting because they have the potential to challenge the dominant Bayesian model of perception.