
Adaptation, learning and training for spatial hearing in complex environments

Periodic Reporting for period 2 - ALT (Adaptation, learning and training for spatial hearing in complex environments)

Reporting period: 2018-01-01 to 2019-12-31

In everyday listening, humans are exposed to multiple concurrent auditory and speech stimuli in complex, continuously changing environments. To correctly extract relevant information, they adapt their processing to the specifics of the current scene, learn from previous experience, and generalize to new settings. The current project advanced our understanding of the mechanisms that the brain uses to achieve such adaptation and learning in complex listening environments.
The main objectives of this research program were to 1) combine behavioral experiments, neural imaging, and computational modeling to study neural adaptation to reverberation for speech processing, sound localization and attentional processing in multi-talker environments, 2) develop a brain training computer game module that can enhance the listening capabilities of normal-hearing and hearing-impaired listeners, and 3) organize workshops for students and young researchers to provide them with cognitive neuroscience training related to the consortium research activities.
The main goal of the exchanges was to develop a program of collaborative research, clinical applications, workshops, and training activities that allowed the groups to combine the specific local expertise, specialized experimental equipment, and technical and methodological know-how available to only one of the partners. In addition, the early-stage researchers of the collaborating groups were trained on, and familiarized with, the equipment, methods, and facilities of the partner institutions, which led to long-term collaborations among the groups.
The results of the grant allow us to better understand the mechanisms underlying adaptation and learning in complex auditory environments. They represent important advances in basic auditory neuroscience, and they also help us better address the needs of special populations, for example through more effective brain-training applications for the rehabilitation of people with auditory processing deficits, the development of new prosthetic devices, and new virtual-reality technologies. Finally, the workshops, training, and exchanges of expertise have strengthened the European Research Area and established new long-term collaborations between EU and US researchers.
The work performed achieved all the key objectives of the project, and the main conclusions are described below, separately for each work package.
WP1 - Adaptation to room reverberation for speech perception
We performed two studies that examined how the brain's adaptation to new acoustic environments influences speech perception and the learning of new phonetic categories. In the first study we showed that normal-hearing listeners benefit from previous exposure to speech in a consistent room, but that when the room is switched, adaptation to the previous room has a negative impact on speech understanding in the new room. In the second study we showed, using a simple computer game, that the brain can learn new phonetic categories spontaneously, i.e. without direct training, when they are presented in different reverberant environments, but not when they are presented in a single environment. Fig. 1 illustrates the results of the second study.
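As a purely illustrative sketch of how a speech token can be presented in several simulated rooms, the following Python snippet convolves a placeholder token with synthetic room impulse responses of different reverberation times. The function names, parameter values, and stand-in signals are ours for illustration only; the actual rooms, stimuli, and game used in the study are not reproduced here.

import numpy as np
from scipy.signal import fftconvolve

def synthetic_rir(fs, rt60, length_s=0.5, seed=0):
    # Exponentially decaying noise as a crude stand-in for a measured room impulse response.
    rng = np.random.default_rng(seed)
    t = np.arange(int(length_s * fs)) / fs
    return rng.standard_normal(t.size) * np.exp(-6.91 * t / rt60)   # -60 dB decay at rt60

def add_reverb(dry, rir):
    # Convolve the dry signal with the room response and normalize the peak.
    wet = fftconvolve(dry, rir)
    return wet / np.max(np.abs(wet))

fs = 16000
token = np.random.randn(int(0.4 * fs))            # placeholder for a 400-ms speech token
rt60s = [0.3, 0.8, 1.5]                           # illustrative reverberation times (s)
variants = [add_reverb(token, synthetic_rir(fs, rt60, seed=i))
            for i, rt60 in enumerate(rt60s)]      # the same token in three "rooms"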
WP2 - Adaptation to room reverberation for distance and horizontal perception
We examined adaptation processes in the perception of auditory distance and of horizontal sound source location. In the WP2 studies on distance perception, we 1) identified the brain region encoding the distance of sound sources (Fig. 2), and 2) showed that auditory distance perception is calibrated by vision, such that the vision-induced adaptation depends on whether the visual adaptor stimuli are closer or farther than the auditory stimuli. In the WP2 studies on horizontal sound localization, we performed a series of experiments showing that the representation of horizontal sound source locations is strongly adaptive, depending on the preceding stimulation and on the attentional state of the listener, and we identified some electrophysiological correlates of these adaptive/attentional processes.
WP3 - Spectral cues for spatial hearing: Objective measures & modeling
We studied and modeled the auditory processing of monaural spectral cues in spatial hearing. We also implemented an electroencephalography (EEG) system at the OEAW. We focused on the perception of sound externalization, i.e. whether a sound is perceived inside or outside of the head, by means of psychoacoustic tests and objective measures (EEG), considering both original and modified listener-specific head-related transfer functions (HRTFs). We showed that the brain is more sensitive to sounds approaching us than to sounds moving away from us (Fig. 3).
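For readers unfamiliar with HRTF-based rendering, the sketch below shows, in Python, the basic operation of placing a mono signal at a virtual direction by convolving it with a left/right head-related impulse response pair. The impulse responses here are placeholders that we invented for illustration; the WP3 experiments used measured, listener-specific HRTFs and a rendering pipeline not shown here.

import numpy as np
from scipy.signal import fftconvolve

def render_binaural(mono, hrir_left, hrir_right):
    # Convolve a mono signal with a left/right head-related impulse response (HRIR)
    # pair to place it at the corresponding virtual direction.
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    return np.stack([left, right], axis=0)       # 2 x N binaural signal

fs = 44100
noise = np.random.randn(fs)                      # 1 s of noise as the source signal
hrir_l = np.zeros(256); hrir_l[0] = 1.0          # placeholder impulse responses; a real experiment
hrir_r = np.zeros(256); hrir_r[20] = 0.6         # would use measured, listener-specific HRTFs
binaural = render_binaural(noise, hrir_l, hrir_r)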
WP4 - Auditory brain training game
We developed a real-time spatial audio module for an auditory brain-training game and performed several studies showing how games with the new module can be used to train normal-hearing and hearing-impaired listeners to improve their spatial auditory perception. The Listen game with the audio module (Fig. 4) can train horizontal sound localization (distinguishing sounds coming from the left vs. the right). It is aimed at normal-hearing listeners, who might benefit in everyday tasks such as identifying dangerous situations by hearing, e.g. an approaching car. A modification of the game was used to train listeners to combine the acoustic cues for horizontal localization, which is particularly useful for hearing-impaired listeners. Finally, we developed another game that can improve listeners' ability to learn new phonemes in a foreign language.
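As a minimal, hedged sketch of how a left-vs-right training stimulus can be lateralized with the two main acoustic cues to horizontal localization, the Python snippet below applies an interaural time difference (ITD) and an interaural level difference (ILD) to a noise burst. The function name, cue magnitudes, and stimulus are illustrative assumptions and do not describe the actual implementation of the Listen audio module.

import numpy as np

def lateralize(mono, fs, itd_s=0.0006, ild_db=6.0, side="left"):
    # Delay and attenuate the far-ear channel so the sound is heard toward `side`.
    delay = int(round(itd_s * fs))                        # interaural time difference in samples
    gain = 10 ** (-ild_db / 20)                           # interaural level difference as linear gain
    near = np.concatenate([mono, np.zeros(delay)])        # leading, louder ear
    far = np.concatenate([np.zeros(delay), mono]) * gain  # delayed, quieter ear
    channels = (near, far) if side == "left" else (far, near)
    return np.stack(channels, axis=0)                     # [left, right]

fs = 44100
burst = np.random.randn(int(0.2 * fs))                    # 200-ms noise burst
stimulus = lateralize(burst, fs, side="right")            # one left-vs-right training trial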
WP5 - Workshops
We organized two workshops that allowed dissemination of the project results to a wide audience and provided the ALT consortium with the opportunity to invite experts from various fields of cognitive neuroscience to present their newest results (Fig. 5).
The main impact of the project is the increased research capacity of the EU partners and the strengthening of the European Research Area, thanks to the transfer of knowledge and expertise from the US partners and to the world-wide experts who participated in the workshops. Specifically, substantial new expertise was gained in EEG and fMRI neuroimaging methods, in brain-training game development, and, more generally, in the cognitive neuroscience knowledge and methodology presented at the workshops.
Progress beyond the state of the art: The results of this grant advance our understanding of the adaptive and attentional processes influencing auditory spatial perception, particularly speech perception, phonetic speech learning, horizontal and distance localization, and spectral cues for virtual audio. The spatial audio module and its use in brain-training games provide a new tool for improving the perception of normal-hearing and hearing-impaired listeners, as well as a tool for basic auditory neuroscience research. With it, we are continuing to investigate brain-training games that can be used to improve spatial perception in hearing-impaired listeners wearing cochlear implants. We plan to continue collaborating with our US partners who are developing brain-training applications for populations with special needs, such as people with auditory processing deficits (OHSU), and for the general population, with the aim of enhancing cognitive and perceptual abilities (UCR Brain Game Center).
Fig. 1 (L) Game used for implicit training of new phonemes. (R) Improvement in performance.
Fig. 2 Brain areas encoding the distance of sound sources in front of the listener.
Fig. 3 Evoked electrical brain potentials showing increased sensitivity to approaching sounds.
Fig. 4 The “Listen” game for which the real-time spatial audio module was developed.
Fig. 5 Attendees of the 2nd ALT workshop organized in Kosice on 3-5 June 2019.