CORDIS - EU research results

SONICOM - Transforming auditory-based social interaction and communication in AR/VR

Project description

Transforming virtual experiences through immersive audio

Sound is an integral part of the human experience. As one of the most important ways of sensing and interacting with our environment, sound plays a major role in shaping how we perceive the world. In virtual or augmented reality (VR/AR), spatially correct audio is vital to delivering an immersive virtual experience, yet acoustic VR/AR presents many challenges. Using the power of artificial intelligence, the EU-funded SONICOM project aims to deliver the next milestone in immersive audio simulation. The goal is to design the next generation of 3D audio technologies, provide tailored audio solutions and significantly improve how we interact with the virtual world.

Objective

Immersive audio is our everyday experience of being able to hear and interact with sounds around us. Simulating spatially located sounds in virtual or augmented reality (VR/AR) must be done in a unique way for each individual and currently requires expensive and time-consuming individual measurements, making it commercially unfeasible. Furthermore, the impact of immersive audio beyond perceptual metrics such as presence and localisation is still an unexplored area of research, specifically when related to social interaction, entering the behavioural and cognitive realms. SONICOM will revolutionise the way we interact socially within AR/VR environments and applications by leveraging methods from Artificial Intelligence (AI) to design a new generation of immersive audio technologies and techniques, specifically looking at personalisation and customisation of the audio rendering. Using a data-driven approach, it will explore, map and model how the physical characteristics of spatialised auditory stimuli can influence observable behavioural, physiological, kinematic, and psychophysical reactions of listeners within social interaction scenarios. The developed techniques and models will be evaluated in an ecologically valid manner, exploiting AR/VR simulations as well as real-life scenarios, and developing appropriate hardware and software proofs-of-concept. Finally, in order to reinforce the idea of reproducible research and to promote future development and innovation in the area of auditory-based social interaction, the SONICOM Ecosystem will be created, which will include auditory data closely linked with model implementations and immersive audio rendering components.
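To illustrate what "simulating spatially located sounds" involves in practice: binaural 3D audio is typically rendered by convolving a mono source with a pair of head-related impulse responses (HRIRs), one per ear. The sketch below is not from the SONICOM project itself; it uses synthetic placeholder HRIRs (a simple interaural time and level difference), whereas real systems rely on the measured, per-listener HRTF datasets whose costly acquisition the project aims to replace with AI-driven personalisation.

```python
import numpy as np

fs = 48_000                           # sample rate in Hz
t = np.arange(fs) / fs                # 1 second of audio
mono = np.sin(2 * np.pi * 440 * t)    # mono source signal (440 Hz tone)

# Placeholder HRIRs for a source located to the listener's left:
# the right ear receives the sound slightly later and quieter.
hrir_left = np.zeros(256)
hrir_left[0] = 1.0                    # direct path to the near (left) ear
hrir_right = np.zeros(256)
hrir_right[30] = 0.6                  # ~0.6 ms interaural delay at 48 kHz

# Binaural rendering: convolve the mono signal with each ear's HRIR,
# then stack the two channels into a (samples, 2) stereo buffer.
left = np.convolve(mono, hrir_left)
right = np.convolve(mono, hrir_right)
binaural = np.stack([left, right], axis=1)
print(binaural.shape)
```

Personalisation enters because real HRIRs depend on the shape of each listener's head and ears; using someone else's filters degrades localisation accuracy, which is why generic (non-individual) rendering falls short.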

Call for proposal

H2020-FETPROACT-2018-2020


Sub call

H2020-FETPROACT-2020-2

Coordinator

IMPERIAL COLLEGE OF SCIENCE TECHNOLOGY AND MEDICINE
Net EU contribution
€ 1 829 708,75
Address
SOUTH KENSINGTON CAMPUS EXHIBITION ROAD
SW7 2AZ LONDON
United Kingdom


Region
London > Inner London – West > Westminster
Activity type
Higher or Secondary Education Establishments
Total cost
€ 1 829 708,75

Participants (10)