Our ability to intuitively tell whether an object is near or far away is not something we typically think about. “When we look out the window, it is immediately clear that the street in front of us is closer than the clouds in the sky,” notes MouseDepthPrey project coordinator Mark Hübener, research group leader in the Synapses – Circuits – Plasticity Department at the Max Planck Institute of Neurobiology, Germany. “Our brain’s visual system does this automatically. It seems effortless, but there are many underlying neural processes that make this possible.” Our eyes’ retinas transform light into neural signals, which the brain then processes to extract information, helping us, among other things, to tell how far away an object is. “We have a good handle on understanding how these sources of depth information make it from the eye to the brain,” says Hübener. “What we are less sure of is how these signals are combined and processed to provide us with instantaneous depth perception, by just looking.”
Investigating depth perception
The MouseDepthPrey project, undertaken with the support of the Marie Skłodowska-Curie Actions programme, sought to understand how various depth cues are processed in the brain. “To tackle this question, we decided to rely on the mouse,” explains Drago Guggiana Nilo, the postdoctoral fellow who worked on the project. “We needed a way to ask the mouse how far it thinks objects are.” To achieve this, the team harnessed a strong natural behaviour in the mouse: prey capture. Mice readily hunt crickets and other insects for food, relying heavily on vision to do so, most likely using depth perception. The researchers set up a high-speed, 12-camera video system to track the mouse’s position within an arena and used this information to create a virtual visual environment from the perspective of the mouse. Software developed for gaming helped make the environment as realistic as possible, and a professional animator was commissioned to render a highly detailed cricket, serving as virtual prey. A miniature, head-mounted microscope recorded the mouse’s brain activity during prey capture.
Brain behaviour revealed
This innovative virtual reality experiment enabled the team to obtain incredibly detailed characterisations of prey capture behaviour. “We learned that although prey capture is an innate behaviour, mice still improve their performance over days,” says Guggiana Nilo. “Detailed analysis of hundreds of hunting sequences demonstrated that mice follow a surprisingly stereotypical pattern. This suggests that mice succeed by tiring the cricket.” While preliminary, the data indicate that distance-to-cricket information is present at the level of the primary visual cortex, the first brain region to combine information from both eyes. Moreover, distance information seems to be a network property rather than something represented in particular neurons. “We are now continuing the project into its main goal: understanding depth cue integration in the mouse,” notes Hübener. “This includes developing 3D mouse goggles that will stimulate both eyes independently. This is basically the same technique that enables us to see depth in 3D movie theatres.” Beyond providing insight into a biological phenomenon, Hübener believes that understanding depth perception will benefit a range of emerging fields, including remote collaborative work, where virtual and augmented reality technologies play an increasingly central role. “We are confident that biology-based improvements can help alleviate current imperfections in this technology,” he adds.
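The binocular cue that 3D goggles exploit can be illustrated with a short sketch. This is not code from the project; it is a minimal Python illustration of standard stereo triangulation, where the depth of a point is recovered from the disparity between its positions in the two eyes’ (or cameras’) images, assuming illustrative values for focal length and inter-eye baseline.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Estimate distance Z = f * B / d to a point seen by two eyes or cameras.

    focal_length_px: focal length in pixels (illustrative value)
    baseline_m: separation between the two viewpoints, in metres
    disparity_px: horizontal shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Nearer objects produce larger disparity, hence smaller estimated depth.
near = depth_from_disparity(focal_length_px=800, baseline_m=0.01, disparity_px=40)
far = depth_from_disparity(focal_length_px=800, baseline_m=0.01, disparity_px=4)
print(near, far)  # near object ~0.2 m away, far object ~2.0 m away
```

Goggles that stimulate each eye independently work by controlling exactly this disparity, presenting each eye a slightly shifted image so the visual system infers a particular depth.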
MouseDepthPrey, brain, visual, neural, retinas, depth, light, visual cortex, 3D, virtual reality, augmented reality