Periodic Reporting for period 1 - PLACES (PLAsticity of perception in real and virtual spaCES)
Reporting period: 2023-05-01 to 2025-04-30
- to use cutting-edge VR technology and machine learning to advance scientific knowledge of the neural mechanisms underlying spatial perception, action and plasticity
- to apply scientific knowledge about spatial perception and sensorimotor plasticity to advance VR technology and enhance VR applicability
- to understand, predict and use action intentions of users in VR
- to understand how usage of VR interacts with perceptual and sensorimotor states in VR and in the real world (RR)
- to translate research findings into applied fields in vision aids and social telepresence
In WP1 we study multi-sensory interaction in the control of walking and reaching in humans and in non-human primates, comparing the functional specializations of the brain's “vision-for-action” systems to better understand their commonalities and differences. We use VR setups and machine learning to further investigate how visual spatial maps in the brain combine visual input with non-visual information to provide spatial representations that can be used to guide reaching and walking.
WP2 focuses on gaze-based VR, i.e. the possibilities opened up by gaze tracking in VR headsets. Many headsets now include eye tracking as a standard feature, and information about the user's gaze enables new capabilities for VR applications. Eye tracking in VR allows practitioners to exploit gaze-based illusions and properties of spatial perception and its plasticity to develop novel interfaces and VR experiences. We aim to apply scientific knowledge about spatial perception and sensorimotor plasticity to advance VR technology and enhance its applicability.
WP3 is concerned with understanding, predicting and using action intentions to produce fluid and seamless vision in VR. This is a central methodological issue: collecting movement data and displaying the corresponding visual feedback takes time, producing a display lag that can lead to noticeable imperfections and possibly cybersickness. The human sensorimotor system suffers from similarly inevitable delays, but the brain compensates for them by predicting upcoming actions and their sensory consequences. By predicting the action intentions of the user we aim to overcome the lag, just as the brain does, and increase the fluidity of the sensorimotor experience.
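The principle of overcoming display lag by prediction can be illustrated with a minimal sketch: extrapolate the most recent tracker samples forward by the known lag, so the rendered feedback matches where the user will be rather than where they were. This is only a constant-velocity illustration of the idea; the function name, data layout and the project's actual (learned) predictors are assumptions, not PLACES' implementation.

```python
# Minimal sketch of latency compensation by constant-velocity extrapolation
# of a tracked 3-D position. Hypothetical helper, not the project's method.

def predict_position(samples, lag_s):
    """Extrapolate the latest tracked position forward by lag_s seconds.

    samples: list of (timestamp_s, (x, y, z)) tracker readings, oldest first.
    """
    (t0, p0), (t1, p1) = samples[-2], samples[-1]
    dt = t1 - t0
    # Finite-difference velocity estimate from the last two samples.
    vel = tuple((b - a) / dt for a, b in zip(p0, p1))
    # Project the position forward to the moment the frame will be displayed.
    return tuple(p + v * lag_s for p, v in zip(p1, vel))

# Hand moving at 1 m/s along x, sampled at 100 Hz, compensating 20 ms of lag:
samples = [(0.00, (0.0, 0.0, 0.0)),
           (0.01, (0.01, 0.0, 0.0))]
predicted = predict_position(samples, 0.02)  # roughly (0.03, 0.0, 0.0)
```

Real predictors must of course handle noisy tracking and abrupt changes of intent, which is precisely where learned models of the user's action intentions improve on naive extrapolation.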
WP4 starts from the hypothesis that extended experience of a virtual environment and frequent alternation between real and virtual environments is likely to create novel sensorimotor mappings between physical and perceptual space. Users need to adapt to novel sensorimotor contingencies in VR and to be able to switch effortlessly back and forth between mappings and environments. We want to understand how usage of VR interacts with perceptual and sensorimotor states in VR and RR, i.e. to characterise the plastic processes in the VR/RR cycle. We will do this by conducting adaptation studies that follow experimental procedures established in psychology, applying them between VR and RR environments.
WP5 aims to connect the project with applications. The research on visual perception and sensorimotor interaction in PLACES is strongly related to applied issues in optometry, ophthalmology and VR displays. WP5 is therefore tasked with translating research findings into applied fields in vision aids and social telepresence. We examine the opportunities offered by electronic lenses for automatic focal spectacles and novel VR display technology, monitor the state of HMD technology and its extension to augmented reality (AR) displays, and aim to establish a virtual research institute to conduct research within the virtual realm, provide a space for communicating research findings to the general public, and raise awareness about the data that can be collected by eye and action tracking in VR.
In another study we found that directing covert attention, without any eye movements, had significant effects on the dynamics of motor control in goal-directed, vision-guided arm movements. These results may help to better understand the neural bases of asymmetrical neurological conditions such as hemispatial neglect.
For the development of gaze-based VR we have developed methods to predict the occurrence of eye movements, and we have developed and compared methods to induce eye blinks. Both blinks and eye movements make it possible to manipulate the VR scene without the user noticing and without causing discomfort.
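Gaze-contingent scene changes of this kind are typically locked to moments when visual sensitivity is suppressed, such as during a saccade. As a hedged illustration, the sketch below implements a simple velocity-threshold (I-VT style) saccade detector; the threshold value and function names are assumptions for illustration, not the project's prediction models.

```python
# Hypothetical sketch of a velocity-threshold saccade detector: samples where
# gaze speed exceeds a threshold are flagged, and scene manipulations can be
# scheduled inside those flagged intervals. Thresholds here are assumptions.

SACCADE_THRESHOLD_DEG_S = 30.0  # gaze speeds above this count as saccadic

def detect_saccades(gaze_deg, rate_hz, threshold=SACCADE_THRESHOLD_DEG_S):
    """Return per-sample flags marking saccadic samples.

    gaze_deg: sequence of horizontal gaze angles in degrees.
    rate_hz:  sampling rate of the eye tracker in Hz.
    """
    flags = [False]  # the first sample has no velocity estimate
    for prev, cur in zip(gaze_deg, gaze_deg[1:]):
        velocity = abs(cur - prev) * rate_hz  # deg/s between samples
        flags.append(velocity > threshold)
    return flags

# 250 Hz trace: fixation, a rapid 8-degree gaze shift, then fixation again.
trace = [0.0, 0.0, 0.1, 2.0, 5.0, 8.0, 8.0, 8.1]
print(detect_saccades(trace, 250.0))
```

In a VR application, the frames flagged as saccadic are the window during which the scene can be updated with a low chance of the user detecting the change.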