
PLAsticity of perception in real and virtual spaCES

Periodic Reporting for period 1 - PLACES (PLAsticity of perception in real and virtual spaCES)

Reporting period: 2023-05-01 to 2025-04-30

Space is the substrate of our lives, and the perception of space is the basis for our actions. We learn to act in space and adapt our perception when sensory access to space is altered, for example by injury or degeneration, but also during healthy aging, for instance when we develop the need to wear glasses. PLACES investigates how our perceptual space changes with sensorimotor interactions. We study how the representation of space in the brain is calibrated by eye, arm and body movements, and we use virtual reality (VR) to simulate altered sensory states.

We think of VR as both a tool for research and a technology to benefit society in the future. As people increasingly use VR for work, pleasure and social interaction, it is imperative to understand how sensorimotor interactions in VR affect spatial perception in VR and in RR, the Real Reality of actual space. Since perceptual space has been shown to be plastic under sensorimotor interactions, altered sensorimotor contingencies in VR are expected to shape perception in VR and potentially in RR, which might pose problems for long-term VR exposure. We will use VR as a tool to better understand the mechanisms, limitations and consequences of perceptual plasticity. At the same time, knowledge about perceptual mechanisms and their reliance on sensorimotor contingencies will be essential for the continuing development of VR as a technology. For example, limitations and illusions of self-motion perception have been employed to increase the perceived range of walkable space in VR beyond the physical space limitations of RR. Consumer-grade VR head-mounted displays are now typically equipped with eye trackers to monitor the gaze of the user. While this holds enormous potential for future applications, knowledge about the interactions of the oculomotor system with full 3D space perception and with the spatial direction of attention is required to fully exploit this potential.

The work in PLACES is characterised by a deep link between real and virtual spaces, which we call the RR/VR cycle. We will use this link to study the control of eye, arm and self-motion in the brain at the perceptual and physiological level, to study perceptual plasticity in VR and RR based on novel sensorimotor contingencies that can only be introduced in VR, and to advance the technological use of VR for social interaction and for simulation processes in the development of RR vision aids.
Work performed in PLACES covers these specific aims:
- to use cutting-edge VR technology and machine learning to advance scientific knowledge of the neural mechanisms underlying spatial perception, action and plasticity
- to apply scientific knowledge about spatial perception and sensorimotor plasticity to advance VR technology and enhance VR applicability
- to understand, predict and use action intentions of users in VR
- to understand how usage of VR interacts with perceptual and sensorimotor states in VR and RR
- to translate research findings into applied fields in vision aids and social telepresence

In WP1 we study multi-sensory interaction in the control of walking and reaching in humans and in non-human primates to compare the functional specializations of the “vision-for-action” systems in the brain and better understand their specifics and commonalities. We use VR setups and machine learning to further investigate how visual spatial maps in the brain combine visual input with non-visual information to provide spatial representations that can be used to guide reaching and walking.
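As an illustration of how such cue combination is commonly modelled (a minimal sketch with hypothetical numbers, not an analysis from the project), maximum-likelihood integration weights each cue by its reliability, i.e. its inverse variance:

```python
import numpy as np

def integrate_cues(mu_vis, sigma_vis, mu_body, sigma_body):
    """Maximum-likelihood integration of two noisy position estimates.

    Each cue is modelled as a Gaussian; the optimal combined estimate
    weights each cue by its inverse variance (its reliability).
    """
    w_vis = sigma_body**2 / (sigma_vis**2 + sigma_body**2)
    mu = w_vis * mu_vis + (1 - w_vis) * mu_body
    sigma = np.sqrt((sigma_vis**2 * sigma_body**2)
                    / (sigma_vis**2 + sigma_body**2))
    return mu, sigma

# Hypothetical example: a reach target seen at 10 cm (visual noise 1 cm)
# and felt proprioceptively at 12 cm (noise 2 cm); vision dominates.
mu, sigma = integrate_cues(10.0, 1.0, 12.0, 2.0)
print(f"combined estimate: {mu:.2f} cm, sd {sigma:.2f} cm")  # 10.40 cm, sd 0.89 cm
```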

WP2 focusses on gaze-based VR: the possibilities opened up by gaze tracking in VR headsets. Many headsets nowadays include eye tracking as a standard feature, and information about the user's gaze will enable new capabilities for VR applications. Eye tracking in VR will allow practitioners to exploit gaze-based illusions and properties of spatial perception and its plasticity to develop novel interfaces and VR experiences. We aim to apply scientific knowledge about spatial perception and sensorimotor plasticity to advance VR technology and enhance VR applicability.
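To make the idea of gaze-based interfaces concrete, the sketch below shows one basic building block: turning a head-relative gaze direction into a world-space ray and using it to select the object a user is looking at. The function names and bounding-sphere scene are hypothetical illustrations, not part of any headset API:

```python
import numpy as np

def gaze_ray_world(head_pos, head_rot, gaze_dir_local):
    """Transform a gaze direction from head coordinates into a world-space ray.

    head_pos: (3,) head position in world coordinates
    head_rot: (3, 3) rotation matrix from head to world coordinates
    gaze_dir_local: (3,) unit gaze direction reported in head coordinates
    """
    direction = head_rot @ gaze_dir_local
    return head_pos, direction / np.linalg.norm(direction)

def first_hit(origin, direction, centers, radii):
    """Return the index of the nearest object whose bounding sphere the
    gaze ray hits, or None (a simple gaze-based selection primitive)."""
    best, best_t = None, np.inf
    for i, (c, r) in enumerate(zip(centers, radii)):
        oc = c - origin
        t = oc @ direction                         # closest approach along ray
        if t > 0 and np.linalg.norm(oc - t * direction) < r and t < best_t:
            best, best_t = i, t
    return best

# Hypothetical example: gaze straight ahead from the origin hits object 0.
origin, direction = gaze_ray_world(np.zeros(3), np.eye(3),
                                   np.array([0.0, 0.0, 1.0]))
centers = [np.array([0.0, 0.0, 2.0]), np.array([1.0, 0.0, 2.0])]
print(first_hit(origin, direction, centers, radii=[0.3, 0.3]))  # 0
```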

WP3 is concerned with understanding, predicting and using action intentions to produce fluid and seamless vision in VR. This is a central methodological issue: collecting movement data and displaying the corresponding visual feedback takes time, producing a display lag that may lead to noticeable imperfections and possibly cybersickness. The human sensorimotor system suffers from similarly inevitable delays, but the brain accounts for them by predicting upcoming actions and their sensory consequences. By predicting the action intentions of the user we aim to overcome the lag, just as the brain does, and increase the fluidity of the sensorimotor experience.
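As a minimal illustration of the principle (a constant-velocity extrapolation sketch, not the project's actual predictor), a tracked position can be projected forward by the expected display lag:

```python
import numpy as np

def predict_position(samples, timestamps, lag):
    """Extrapolate a tracked position forward by the display lag.

    A constant-velocity (dead-reckoning) predictor: estimate velocity from
    the two most recent tracker samples and project it `lag` seconds ahead.
    More elaborate predictors (Kalman filters, learned models of user
    intention) follow the same pattern with better state estimates.
    """
    p1, p0 = np.asarray(samples[-1]), np.asarray(samples[-2])
    dt = timestamps[-1] - timestamps[-2]
    velocity = (p1 - p0) / dt
    return p1 + velocity * lag

# Hypothetical example: head moving right at 0.5 m/s, 20 ms display lag.
samples = [(0.000, 1.6, 0.0), (0.005, 1.6, 0.0)]
timestamps = [0.00, 0.01]
print(predict_position(samples, timestamps, lag=0.02))  # x ≈ 0.015
```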

WP4 starts from the hypothesis that extended experience of a virtual environment, and frequent alternation between real and virtual environments, is likely to create novel sensorimotor mappings between physical and perceptual space. Users need to adapt to novel sensorimotor contingencies in VR and to switch effortlessly back and forth between mappings and environments. We want to understand how usage of VR interacts with perceptual and sensorimotor states in VR and RR, i.e. to characterise the plastic processes in the VR/RR cycle. We will do this by conducting adaptation studies that follow experimental procedures established in psychology and by applying them between VR and RR environments.
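Such adaptation experiments are commonly characterised with trial-by-trial state-space models of error-driven learning. A minimal sketch with illustrative parameter values (not fitted project data):

```python
def simulate_adaptation(perturbation, retention=0.98, learning_rate=0.15,
                        trials=60):
    """Trial-by-trial state-space model of sensorimotor adaptation.

    On each trial the motor system observes the error between its output
    and the (perturbed) feedback and updates an internal compensation
    state: x[t+1] = retention * x[t] + learning_rate * error[t].
    """
    x, errors = 0.0, []
    for _ in range(trials):
        error = perturbation - x   # residual visual error on this trial
        errors.append(error)
        x = retention * x + learning_rate * error
    return errors

# Hypothetical example: adapting to a 10-degree cursor rotation; the
# error decays roughly exponentially as the internal state compensates.
errors = simulate_adaptation(10.0)
print([round(e, 2) for e in errors[:5]], "...", round(errors[-1], 2))
```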

WP5 aims to connect the project with applications. The research on visual perception and sensorimotor interaction in PLACES is strongly related to applied issues in optometry, ophthalmology and VR displays. WP5 is therefore tasked with translating research findings into applied fields in vision aids and social telepresence. We look at the opportunities offered by electronic lenses for automatic focal spectacles and by novel VR display technology, monitor the state of HMD technology and its extension to augmented reality (AR) displays, and aim to establish a virtual research institute to conduct research within the virtual realm, provide a space for communicating research findings to the general public, and raise awareness about the data that can be collected by eye and action tracking in VR.
Work has started in all work packages, and results are being obtained throughout the funding period. Early results in the first year concerned searching for targets in a VR scene by active movements of the eyes and head. They showed that saliency, i.e. the tendency of some objects to pop out of a scene, is weaker when the scene is brought into view by an active eye-head movement than when it is displayed on a computer screen.
In another study we found that the direction of covert attention, without any eye movements, had significant effects on the dynamics of motor control in goal-directed, vision-guided arm movements. These results may help to better understand the neural basis of asymmetrical neurological disorders such as hemispatial neglect.
For the development of gaze-based VR we have devised methods to predict the occurrence of eye movements, and we have developed and compared methods to induce eye blinks. Both blinks and eye movements make it possible to manipulate the VR scene without the user noticing and without causing discomfort.
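For illustration, a simplified version of such a mechanism (a sketch, not the project's actual pipeline) detects saccades online with a velocity threshold; a scene manipulation applied during the detected saccade exploits saccadic suppression and is likely to go unnoticed:

```python
import numpy as np

SACCADE_THRESHOLD = 30.0  # deg/s, a commonly used velocity criterion

def is_saccade(gaze_deg, timestamps):
    """Velocity-threshold saccade detector on the two latest gaze samples.

    gaze_deg: list of (azimuth, elevation) gaze angles in degrees
    timestamps: matching sample times in seconds
    During a saccade, visual sensitivity is suppressed, so a scene shift
    applied at that moment is likely to go unnoticed by the user.
    """
    d = np.subtract(gaze_deg[-1], gaze_deg[-2])
    dt = timestamps[-1] - timestamps[-2]
    speed = np.linalg.norm(d) / dt
    return speed > SACCADE_THRESHOLD

# Hypothetical example: a 2-degree jump within one 4 ms sample
# (500 deg/s) is classified as a saccade.
print(is_saccade([(0.0, 0.0), (2.0, 0.0)], [0.000, 0.004]))  # True
```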
Figure: The connection between virtual and real space that forms the basis of the approach in PLACES