Cognitive Control of a Hearing Aid


Nearly 7% of the European population is classified as hearing impaired. Existing hearing aids improve sensitivity but cannot pick out a weak voice among many, a capability essential for effective social communication.


Healthy ears have the remarkable ability to extract weak sounds from a noisy environment, but this ability is degraded by ageing. A number of recently developed technologies can improve the signal-to-noise ratio of standard hearing aids. However, there is no easy way to direct the device to process the sounds the user wishes to attend to.

Turning brain signals into acoustic control

The EU-funded COCOHA project addressed this problem by deriving a control signal directly from the user’s brain. More specifically, project scientists used brain signals recorded with electroencephalography (EEG) to steer the acoustic analysis hardware. “The idea was to match intention signals picked up by EEG electrodes on the scalp or within the ear canal to acoustic sources within the environment,” explains project coordinator Dr Alain de Cheveigné.

COCOHA brought together a multidisciplinary team of basic scientists and industrial partners across Europe. The teams had to tackle multiple methodological and technological challenges associated with cognitive control and acoustic processing. Brain signals suffer from high electrode noise as well as motion and muscle artefacts, making them hard to interpret.

The COCOHA solution comprises an acoustic front end, which extracts clean streams corresponding to sources in the environment, and a cognitively controlled back end, which decodes the user’s intent into a control signal used to steer acoustic processing towards the source the individual wishes to attend to.

Scientists performed behavioural and EEG experiments with normal-hearing listeners to better understand auditory attention. They presented a mixture of two voices and asked subjects to attend to one or the other while their EEG signals were matched against each voice.
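The article does not give COCOHA’s exact algorithms, but matching EEG signals to an attended voice is commonly done in this literature with a linear “backward” stimulus-reconstruction model. The sketch below is purely illustrative and runs on synthetic data: a ridge-regularised least-squares decoder maps time-lagged EEG channels to a speech envelope, and the candidate stream whose envelope correlates best with the reconstruction is taken as the attended one. All names, lag counts, and parameters are assumptions, not the project’s published settings.

```python
import numpy as np

def build_lagged(eeg, n_lags):
    """Stack time-lagged copies of each EEG channel: (samples, channels*lags)."""
    n_samples, n_channels = eeg.shape
    X = np.zeros((n_samples, n_channels * n_lags))
    for lag in range(n_lags):
        X[lag:, lag * n_channels:(lag + 1) * n_channels] = eeg[:n_samples - lag]
    return X

def train_decoder(eeg, envelope, n_lags=16, ridge=1e-3):
    """Ridge-regularised least squares mapping lagged EEG to the attended envelope."""
    X = build_lagged(eeg, n_lags)
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ envelope)

def attended_stream(eeg, envelopes, weights, n_lags=16):
    """Pick the speech stream whose envelope best matches the EEG reconstruction."""
    recon = build_lagged(eeg, n_lags) @ weights
    corrs = [np.corrcoef(recon, env)[0, 1] for env in envelopes]
    return int(np.argmax(corrs))

# Synthetic demo: 8 noisy EEG "channels" that follow envelope A, not B.
rng = np.random.default_rng(0)
env_a = rng.standard_normal(2000)
env_b = rng.standard_normal(2000)
eeg = env_a[:, None] + 0.5 * rng.standard_normal((2000, 8))

w = train_decoder(eeg, env_a)
print(attended_stream(eeg, [env_a, env_b], w))  # -> 0 (the attended stream)
```

In practice the decoder is trained on labelled attention data and evaluated on held-out segments; the two-voice experiment described above provides exactly that kind of training material.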
As Dr de Cheveigné outlines, “this provided us with cues on how the brain processes particular sounds and suggested ways of decoding these signals.” Based on these data, researchers developed algorithms and computational methods for decoding brain signals into control signals.

The future of cognitive control of hearing aids

“Undoubtedly, a better understanding of attentional processes and methods to alleviate hearing impairment by cognitive control are the most important achievements of COCOHA,” states Dr de Cheveigné. In a first, COCOHA demonstrated that brain signals can be decoded in real time and used to steer a hearing aid. In addition, the team investigated eye gaze as a complementary means of hearing aid control.

During the project, several prototypes were built to test acoustic processing through a wireless network of microphones, signals measured from the ear canal, pupillometry, and EEG-based signals. At this stage, however, the solution is not fast or ergonomic enough for a commercial product. Researchers are confident that further development of the brain-decoding methodology and a reformulation of the acoustic signal processing will significantly boost performance.

Given that hearing disabilities can lower quality of life, lead to social exclusion, and increase costs for society, enabling hearing-impaired individuals to pick out sounds in noisy environments is the way forward to ‘natural communication’.
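The article does not specify how the decoded intent becomes a steering signal. One plausible, purely illustrative scheme (not COCOHA’s documented method) smooths the decoder’s per-window decisions before converting them into per-source gains, so that a single misclassified EEG window does not make the hearing aid switch sources abruptly:

```python
import numpy as np

def steer_gains(decisions, n_sources=2, alpha=0.1, floor=0.1):
    """Turn per-window attention decisions into smoothed per-source gains.

    alpha controls how quickly the device follows a new decision; floor keeps
    unattended sources faintly audible rather than muting them entirely.
    """
    p = np.full(n_sources, 1.0 / n_sources)  # smoothed attention estimate
    gains = []
    for d in decisions:
        target = np.eye(n_sources)[d]
        p = (1 - alpha) * p + alpha * target   # exponential smoothing
        g = floor + (1 - floor) * p / p.max()  # attended source at full gain
        gains.append(g.copy())
    return np.array(gains)

# Decoder output per EEG window: attention shifts from source 0 to source 1.
decisions = [0] * 20 + [1] * 20
g = steer_gains(decisions)
```

With these (assumed) settings the gain on the newly attended source ramps up over a couple of seconds’ worth of windows rather than jumping instantly, a trade-off between responsiveness and stability that any such control loop has to make.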


COCOHA, acoustic, hearing aid, cognitive control, brain signal, EEG
