
Achieving perceived spatial presence and ego-motion

Under the auspices of the POEMS project, sophisticated and robust computer models and simulation software were developed for a lean, elegant and cost-effective virtual reality set-up.

Digital Economy

By adopting a user-centred approach, the POEMS project focused on cognitive and perceptual aspects to improve performance of spatial tasks in virtual environments. Building on user needs and experiences, the project produced reliable multi-level measurement methods for perceived spatial presence and ego-motion. Aided by these methods, optimal multimodal parameters and cross-modal, synergistic interactions were established and modelled. The new virtual environment set-up simulates ego-motion by combining auditory, visual and vibration cues rather than actually moving the observer.

The Applied Acoustics Department of Chalmers University of Technology in Sweden developed two types of visual computer models. The first comprises complete geometrical models that allow real-time exploration of the virtual environment; the second consists of 'roundshots' of real places (360° panoramic photographs). The roundshot models range widely, from open spaces to interior rooms, and combine extremely high graphical fidelity with low complexity, which makes them computationally undemanding. The geometrical models are more complex, but they offer unrestricted navigation of large spaces with any kind of motion.

In the POEMS models database, the visual computer models are paired with acoustic counterparts, namely the HeadScape and BinScape models. The high acoustic fidelity HeadScape model can only be used in conjunction with the visual roundshot models, as it is limited to rotational motions. The BinScape models offer lower acoustic fidelity but allow any kind of motion. In cases where full user interaction is unnecessary, or very high fidelity acoustic simulation is required, the 'Walkthrough convolver' lets the user render acoustic walkthroughs offline. A novel tool was also realised to investigate the importance of idiosyncratic head-related auditory cues in acoustic motion simulation.
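The offline rendering idea can be made concrete with a short sketch. The code below is not the POEMS 'Walkthrough convolver' itself; it is a minimal illustration, assuming pre-computed left/right room impulse responses, of how high-fidelity acoustics can be baked into a walkthrough by convolution. All signal lengths, decay constants and the sample rate are fabricated for the example.

```python
import numpy as np

def render_binaural(dry, ir_left, ir_right):
    """Convolve a dry mono signal with a left/right impulse-response pair.

    Offline rendering trades interactivity for fidelity: the room
    acoustics are applied by convolution instead of being simulated
    in real time.
    """
    left = np.convolve(dry, ir_left)    # full convolution, len = N + M - 1
    right = np.convolve(dry, ir_right)
    return np.stack([left, right])      # shape (2, N + M - 1)

# Toy input: a unit impulse ("click") and two fabricated exponential decays.
fs = 48_000                             # sample rate (Hz), assumed
dry = np.zeros(fs // 10)                # 100 ms of silence...
dry[0] = 1.0                            # ...starting with a click
t = np.arange(fs // 100) / fs           # 10 ms impulse responses
ir_l = np.exp(-400.0 * t)               # left-ear decay (made up)
ir_r = 0.8 * np.exp(-500.0 * t)         # right-ear decay (made up)

stereo = render_binaural(dry, ir_l, ir_r)
print(stereo.shape)                     # (2, 5279): 4800 + 480 - 1 samples
```

Because everything is precomputed, arbitrarily expensive impulse responses can be used; the cost is that the listener's path through the space is fixed at render time.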
The Head-Related Transfer Function (HRTF) acquisition system and software supports personalised simulation scenarios, rather than relying on generic ones. Offering a very high degree of control over both timing and feature simplification, the system is well suited to running the psychophysical experiments needed for the project. For more information on the project, visit: http://www.poems-project.info/
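To illustrate why head-related cues are idiosyncratic, here is a minimal sketch using the Woodworth spherical-head approximation: the interaural time difference (ITD) depends on the listener's head radius, exactly the kind of parameter that an individual HRTF measurement personalises. This is not the POEMS acquisition software; the function names, the default head radius and the fixed level difference are all illustrative, and a real HRTF additionally encodes spectral (pinna) cues that this broadband skeleton omits.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, air at ~20 degrees C

def itd_woodworth(azimuth_rad, head_radius=0.0875):
    """Interaural time difference (seconds) from the Woodworth
    spherical-head model: (r / c) * (theta + sin(theta)).
    head_radius is the listener-specific ("idiosyncratic") parameter."""
    return (head_radius / SPEED_OF_SOUND) * (azimuth_rad + np.sin(azimuth_rad))

def spatialise(mono, azimuth_deg, fs=48_000, head_radius=0.0875):
    """Crude binaural rendering of a mono signal: delay and attenuate
    the ear farther from the source. Positive azimuth = source to the
    right, so the left ear receives the delayed, quieter copy."""
    az = np.deg2rad(azimuth_deg)
    delay = int(round(abs(itd_woodworth(az, head_radius)) * fs))
    near = np.concatenate([mono, np.zeros(delay)])
    far = np.concatenate([np.zeros(delay), mono]) * 0.7  # fixed ILD, illustrative
    if azimuth_deg >= 0:
        return np.stack([far, near])   # left ear is the far ear
    return np.stack([near, far])

binaural = spatialise(np.ones(100), 90.0)  # source hard right: ~31-sample ITD at 48 kHz
```

Swapping in a different head radius shifts the delay, which is a one-parameter caricature of what measuring a listener's own HRTFs achieves.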
