The retinae as windows to the brain: An oscillatory vision

Periodic Reporting for period 3 - OscillatoryVision (The retinae as windows to the brain: An oscillatory vision)

Reporting period: 2019-03-01 to 2020-08-31

Several sophisticated image processing circuits have been discovered in animal retina, many of which manifest massive neural synchrony. A major insight is that this type of synchrony often translates to high-frequency activity on a macroscopic level, but electroretinography (ERG) has not been tapped to examine this potential in humans. Bolstered by our compelling results combining ERG with magnetoencephalography (MEG), this project will address several open questions with respect to human visual processing:

1) Could variable retinal timing be linked to intrinsic image properties and pass on phase variance downstream to visual cortex? Our data suggests the retina responds to moving gratings and natural imagery with non-phase-locked high gamma oscillations (>65 Hz) just like visual cortex, and that slower ERG potentials exhibit strong phase locking within stimuli but large phase variance across stimuli.

2) Do such retinal gamma band responses, both evoked and induced, directly drive some cortical gamma responses? Pilot data suggest that they can, as revealed by retinocortical coherence, our novel ERG-MEG mapping technique.
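The report does not specify how retinocortical coherence is computed, but at its core any coherence measure quantifies the consistency of phase and amplitude coupling between two signals across frequencies. A minimal sketch, using SciPy's standard Welch-based coherence on synthetic ERG-like and MEG-like signals that share an 80 Hz component (all signal parameters here are illustrative, not taken from the project's data):

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 1000.0                      # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)     # 10 s of data

# Shared 80 Hz "gamma" component plus independent noise in each signal
shared = np.sin(2 * np.pi * 80 * t)
erg = shared + rng.standard_normal(t.size)        # retinal-like channel
meg = 0.5 * shared + rng.standard_normal(t.size)  # cortical-like channel

f, coh = coherence(erg, meg, fs=fs, nperseg=1024)
idx = np.argmin(np.abs(f - 80))  # frequency bin closest to 80 Hz
print(f"coherence at ~80 Hz: {coh[idx]:.2f}")
```

The coherence spectrum peaks at the shared 80 Hz component and stays near chance elsewhere, which is the basic signature a retinocortical mapping of this kind would exploit.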

3) Several kinds of motion have now been shown to elicit massive synchrony in mammalian retina circuits. Does this also result in macroscopic high-frequency activity? If so, our experiments will finally reveal and characterize motion detection by the human retina.

4) Do efferent pathways to the retina exist in humans? We discovered that the ERG exhibits eyes-closed alpha waves strikingly similar to the classic EEG phenomenon and, leveraging our retinocortical coherence technique, that this activity is likely driven by contralateral occipital cortex. If so, can retinal responses be influenced by ongoing cortical activity?

Characterizing retinocortical interaction represents a complete paradigm shift that will be imperative for our understanding of neural synchrony in the human nervous system and enable several groundbreaking new avenues for research.
ERG responses were examined with respect to the corresponding responses of the thalamus and visual cortex, as reconstructed with MEG. We implemented a novel neuroimaging strategy, combining beamforming with the Hilbert transform, to examine high-frequency responses in bands ranging from 55 Hz to 145 Hz. The first cortical responses appear at 27 ms at ~115 Hz, lagging the corresponding retinal oscillatory potential by 8 ms.
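The core signal-processing step behind this strategy, band-pass filtering followed by a Hilbert transform to obtain the amplitude envelope in each band, can be sketched in a few lines. This is a generic illustration with made-up parameters (a 115 Hz burst at 0.5 s), not the project's actual pipeline, which additionally involves beamformer source reconstruction:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(1)
fs = 1000.0
t = np.arange(0, 2, 1 / fs)

# Synthetic trial: a 115 Hz burst between 0.5 and 0.6 s buried in noise
burst = np.sin(2 * np.pi * 115 * t) * ((t >= 0.5) & (t < 0.6))
x = burst + 0.3 * rng.standard_normal(t.size)

def band_envelope(x, fs, lo, hi, order=4):
    """Band-pass filter, then take the Hilbert amplitude envelope."""
    b, a = butter(order, [lo, hi], btype="bandpass", fs=fs)
    return np.abs(hilbert(filtfilt(b, a, x)))

env = band_envelope(x, fs, 105, 125)   # one band of a 55-145 Hz sweep
peak_time = t[np.argmax(env)]
print(f"envelope peaks at {peak_time:.3f} s")
```

Repeating this over a set of overlapping bands yields a time-frequency picture of high-frequency activity; zero-phase filtering (filtfilt) matters here because response latencies on the order of milliseconds are the quantity of interest.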

Some studies suggest that the processing of dark stimuli may occur more quickly by taking advantage of greater neural resources in the visual system. In a second experiment, longer light pulses of about half a second were employed. In the cortex but not in the retina, high frequency responses occurred more quickly with transitions from light to dark compared to transitions from dark to light. Interestingly, while dark-to-light transitions involved a wide range of frequencies (55-195 Hz in the retina, and 55-145 Hz in the cortex), light-to-dark transitions were restricted to the 75-95 Hz frequency band in both retina and cortex.

The rapid timing of these responses across various conditions, together with their sequential appearance first in the retina and then in the cortex, supports the view that such high-frequency modulations reflect the precise timing of information handling throughout the human nervous system. These responses occurred much earlier than classic visual evoked responses arising from the retina or the cortex.

Measuring ERG together with MEG thereby provides a more informative measure of information processing at each stage of the visual pathway. It may furthermore constitute a potential strategy to uncover disturbances of the visual pathway in disease, not only in disorders of vision but also as a diagnostic of systemic abnormalities relevant to many neurological and psychiatric disorders.

We have also made significant strides in contributing to open source software for MEG/EEG analysis. Group member Britta Westner has been porting and improving MEG/EEG source reconstruction methods in MNE-Python, a rapidly growing open source toolbox. This included several beamformer variants, among them the Hilbert beamformer method developed in our group to reconstruct amplitude and phase information across frequency bands. We have found it particularly well suited to high gamma band responses (75-150 Hz), which have been notoriously difficult to track down with other techniques.
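To make the beamformer idea concrete: the LCMV family of methods mentioned here computes, for each candidate source location, a spatial filter that passes activity from that location with unit gain while suppressing everything else. A minimal NumPy sketch of the textbook weight formula w = C⁻¹l / (lᵀC⁻¹l), with a made-up leadfield and synthetic data (not MNE-Python's implementation, which handles regularization, orientations, and noise normalization):

```python
import numpy as np

rng = np.random.default_rng(2)
n_sensors, n_samples = 10, 5000

# Hypothetical leadfield: how a unit source projects onto the sensors
l = rng.standard_normal(n_sensors)

# Sensor data = source activity through the leadfield, plus sensor noise
source = np.sin(2 * np.pi * 0.01 * np.arange(n_samples))
data = np.outer(l, source) + 0.1 * rng.standard_normal((n_sensors, n_samples))

# LCMV weights: w = C^-1 l / (l^T C^-1 l), minimizing output power
# subject to the unit-gain constraint w^T l = 1
C = np.cov(data)
Cinv = np.linalg.inv(C)
w = Cinv @ l / (l @ Cinv @ l)

recovered = w @ data               # estimated source time course
print("unit gain w·l =", round(float(w @ l), 6))
```

The Hilbert beamformer then applies exactly the band-filter-plus-envelope step described earlier to these reconstructed source time courses, band by band.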

Group member Sabine Leske furthermore developed a new method for removing electrical interference from our MEG/EEG recordings. The noise originating from electrical cables and equipment has been a nuisance since the early days of neurophysiological recordings, and techniques for managing it are nearly as old. AC power is ubiquitous and couples strong sinusoidal interference into sensitive electrophysiological recordings. Increasingly, MEG and EEG are reaching into higher frequencies that traverse the fundamental power line frequency (50 Hz in most of the world and 60 Hz in North America) and its harmonics (multiples of the fundamental frequency). As a first line of defense, modern laboratories aim to simply shield the recording environment as much as possible from such interference, but with varying degrees of success depending on the setting (e.g. an urban hospital) and whether electrical equipment needs to be in the recording room for stimulus presentation. Furthermore, there is growing interest in “field” recordings (including sleep studies, brain-computer interfaces, and neural prosthetics), where effective shielding is simply not feasible. Our method is now included in a leading open source toolbox, FieldTrip, for the immediate benefit of the MEG/EEG community, and a manuscript describing the technique is in revision.
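Since the manuscript describing the method is still in revision, the details are not given here; but to illustrate the general family of spectral approaches to this problem, one can interpolate the amplitude spectrum across the line-frequency bins while preserving phase, rather than notching them out entirely. The following is a deliberately simplified sketch of that idea (single harmonic, whole-recording FFT, illustrative bandwidths), not the group's actual algorithm:

```python
import numpy as np

def interpolate_line_noise(x, fs, f_line=50.0, half_bw=1.0, pad=5.0):
    """Replace FFT amplitudes within half_bw of f_line by the mean
    amplitude of neighboring bins, keeping the original phase.
    Real methods also handle harmonics and nonstationary noise."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    dist = np.abs(freqs - f_line)
    target = dist <= half_bw
    neighbors = (dist > half_bw) & (dist <= half_bw + pad)
    X[target] = np.mean(np.abs(X[neighbors])) * np.exp(1j * np.angle(X[target]))
    return np.fft.irfft(X, n=x.size)

rng = np.random.default_rng(3)
fs = 1000.0
t = np.arange(0, 4, 1 / fs)
clean = rng.standard_normal(t.size)              # broadband "neural" signal
noisy = clean + 5 * np.sin(2 * np.pi * 50 * t)   # strong 50 Hz interference

denoised = interpolate_line_noise(noisy, fs)
```

Unlike a notch filter, this leaves plausible broadband power at 50 Hz instead of a spectral hole, which matters precisely when the signals of interest traverse the line frequency and its harmonics.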

Finally, group member Tommy Clausner led the design of a camera array and associated software for photogrammetric reconstruction of research participants’ heads together with MEG/EEG sensor positions. This involved the construction of a geodesic dome that simultaneously photographs the head from multiple angles while the volunteer wears either an EEG cap or MEG fiducial markers. The software, Janus3D, then derives the positions of the MEG/EEG sensors relative to the volunteer’s head and matches them to the volunteer’s MRI. This ultimately increases the accuracy of MEG/EEG source estimates and their correspondence with an individual’s brain anatomy.
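The matching step at the end of such a pipeline is, at its core, a rigid-body alignment: given the same landmark points expressed in the photogrammetry frame and in the MRI frame, find the rotation and translation relating them. A minimal sketch using the standard Kabsch algorithm, with made-up point sets (Janus3D's actual registration procedure is not described here and likely involves additional surface matching):

```python
import numpy as np

def rigid_align(A, B):
    """Kabsch algorithm: rotation R and translation t mapping
    points A onto points B (rows are 3-D points)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

rng = np.random.default_rng(4)
pts_photo = rng.standard_normal((6, 3))      # e.g. fiducials, camera frame

# Simulate the MRI frame with a known rotation and translation
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
pts_mri = pts_photo @ R_true.T + np.array([10.0, -5.0, 2.0])

R, t = rigid_align(pts_photo, pts_mri)
err = np.max(np.abs(pts_photo @ R.T + t - pts_mri))
print(f"max alignment error: {err:.2e}")
```

Errors in this transform propagate directly into source localization error, which is why photographing all landmarks simultaneously, rather than digitizing them one by one, can improve accuracy.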
We have initiated a line of research investigating how retinal and cortical dynamics interact. Our initial experiments suggest that many high-frequency responses of visual cortex may indeed be driven by retinal activity. We are now working to extend these results and have started to investigate more complex visual processing, including the perception of digital photographs of natural scenes and of moving synthetic patterns, as well as visual system activity during slow-wave sleep. We expect that, as in the animal retina, we will find that neurons of the human retina perform the initial processing in all of these situations, and that this processed information is communicated to different regions of visual cortex through oscillations.

We have furthermore made improvements to MEG/EEG methodology that can equally benefit the technique at large, namely, improved removal of powerline interference, localization of sensors with a camera array, and implementation of novel brain source mapping algorithms in the leading open source Python toolbox for MEG/EEG analysis.