CORDIS
EU research results

Audio-Visual System for the Blind Allowing Visually Impaired to See Through Hearing

Using sounds to see, in a first-of-its-kind audiovisual system for the visually impaired

EyeSynth uses sonic landscapes to take advantage of the brain’s interpretation power, in helping the visually impaired to better understand their environment in unprecedented detail.

HEALTH

© EyeSynth
The World Health Organization estimates that 285 million people globally live with vision impairment, and with an ageing population and sight-damaging illnesses such as type 2 diabetes, this figure is growing fast. With no cure for many of these conditions, visually impaired people rely on assistive aids such as canes or guide dogs. However, while these help users avoid obstacles, they do not help them better understand their environment.

The EU-supported project EyeSynth has developed an audiovisual system for the visually impaired, consisting of glasses with embedded cameras that record the surroundings in 3D. Via a connected microcomputer, the collected data is converted into abstract sounds that convey spatial information.

A radically different solution

The starting point for the EyeSynth system was not the technology, but the brain. “Our system does not describe the environment in words. We provide raw spatial information, and the user's brain decodes that. Co-design with our testers was critical to developing this user-friendly and effective interface,” says project coordinator Mr Antonio Quesada.

The team ran a large number of tests, calibrating the underlying algorithm based on user feedback, until users were able to identify shapes and spaces, as well as measure depth and locate objects accurately. “Designed to be intuitive, the system takes users on average between 45 minutes and an hour and a half to master well enough to distinguish simple objects or move around the office avoiding obstacles,” says Quesada.

The system is composed of two main elements: the glasses and the image processing unit, a small CPU the size of a smartphone with an internal battery that powers the system; a standard USB power bank can also be connected. The result is a lightweight pair of glasses that allows ease of movement. There are two high-quality (60 frames per second) image processing modes.
In Tracking Mode, only the image’s ‘central column’ is analysed, and the user scans from left to right, much as with a white cane. In Full Panoramic Mode, the whole scene is represented simultaneously, providing much more sonic information and making scanning unnecessary. In both modes, the analysis distance is adjustable between 0.8 and 6 metres to suit daily needs; crucially, this range is well suited to detecting obstacles in the street.

The 3D data is translated into real-time acoustic representations reminiscent of ocean sounds, with their composition and timbre corresponding to the recorded shapes. These sounds are transmitted directly through the bones of the head, leaving the ears free to listen. “We decided not to use spoken language, partly because it is difficult to describe phenomena verbally, it would be too distracting, and we would have had to make different language versions,” says Quesada.

As testing primarily involved blind users, the team is undertaking a medical study with ophthalmology specialists at the Provincial Hospital of Castellón in Spain to work with low-vision patients. Results so far have been encouraging.

Empowering technology

EyeSynth extends the autonomy and independence of people with visual impairment, avoiding the problems and accidents associated with physical barriers while helping the user navigate unknown locations. The team aims to have the first batch in shops this summer. “We will continue to upgrade the system with new features such as face or text recognition,” says Quesada. “To keep this working as empowering technology which helps in education, employment, leisure, and ultimately improves lives, we will continually listen to our users.”
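The article does not disclose EyeSynth’s actual sonification algorithm. As a rough illustration of the general idea, converting depth and horizontal position into audio parameters, here is a minimal Python sketch. All function names, the linear loudness falloff, and the pan mapping are assumptions for illustration only, not the project’s method; only the 0.8–6 metre analysis range comes from the article.

```python
# Illustrative sketch only: maps one row of a depth image (in metres)
# to simple audio parameters, in the spirit of a "full panoramic" mode.
# Closer obstacles -> louder; horizontal position -> stereo pan.

MIN_RANGE_M = 0.8   # adjustable analysis range quoted in the article
MAX_RANGE_M = 6.0

def depth_to_amplitude(depth_m):
    """Closer objects produce louder sound; out-of-range depths are silent."""
    if depth_m < MIN_RANGE_M or depth_m > MAX_RANGE_M:
        return 0.0
    # Linear falloff: 1.0 at the near limit, 0.0 at the far limit.
    return (MAX_RANGE_M - depth_m) / (MAX_RANGE_M - MIN_RANGE_M)

def column_to_pan(col, width):
    """Map a pixel column to stereo pan in [-1.0 (left), +1.0 (right)]."""
    return 2.0 * col / (width - 1) - 1.0

def sonify_row(depth_row):
    """One (pan, amplitude) pair per column of the depth row."""
    width = len(depth_row)
    return [(column_to_pan(c, width), depth_to_amplitude(d))
            for c, d in enumerate(depth_row)]

# Example: an obstacle 1 m away on the left, open space elsewhere.
row = [1.0, 1.0, 5.5, 5.5, 7.0]
print(sonify_row(row))
```

A tracking mode in this sketch would simply call `depth_to_amplitude` on the central column alone, relying on the user’s head movement to scan the scene.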

Keywords

EyeSynth, blind, visual impairment, cameras, acoustic, sonic, sonar, 3D, sounds, glasses, smartphone

Project information

Grant agreement ID: 757202

Status

Closed project

  • Start date

    1 February 2017

  • End date

    31 January 2019

Funded under:

H2020-EU.2.1.1.

H2020-EU.2.3.1.

  • Overall budget:

    € 1 478 950

  • EU contribution

    € 1 035 265

Coordinated by:

EYESYNTH SL