Panda Guide

Periodic Reporting for period 1 - Panda Guide (Panda Guide)

Reporting period: 2019-01-01 to 2019-04-30

Visual impairment entails various risks, often leaving people reluctant to venture beyond the security of their homes and trapping them in a vicious circle. Sight loss is linked to an increased risk of depression, reduced quality of life, declining physical health, decreased mobility, and higher rates of poverty than in the general population.
Visually impaired people face many challenges in their daily lives, from navigating outside well-known environments, to engaging in relationships, practicing a profession, or even accessing modern technology, which is largely based on touchscreens. Imagine for a moment navigating an unfamiliar environment without the use of your eyes, choosing which clothes to wear when you cannot see colors, exchanging banknotes when you cannot read the numbers, or typing on a touchscreen when you rely on Braille to decipher visual content.
PANDA will disrupt the market by offering a multifunctional device that not only provides guidance but also helps users learn, earn a living, socialize, and have fun. PANDA replaces many of the single-task assistive devices currently available, none of which can provide virtual sight (e.g. magnifying devices, guide dogs, Braille, text-to-speech software). Our closest competitors' solutions are based on visual enhancement through augmented reality, which relies on heavy, cumbersome, unaesthetic equipment and targets partially sighted people only.

We believe everyone deserves to see, and we are therefore developing a ground-breaking wearable technology called "PANDA Guide" that will allow all visually impaired people (both partially sighted and blind) to navigate, socialize, and engage in all daily life activities. PANDA is worn as a headset placed around the neck, with a discreet earpiece. Its high-resolution camera captures the surrounding environment in real time in full 3D (depth perception). Images are pre-processed by the microcontroller, then processed by AI algorithms and converted into intuitive sounds, based on a combination of abstract sounds and semantic speech. The sounds are transmitted via bone conduction so that the ears remain free to hear the surrounding environment. The wearer can thus see through their ears as fully sighted people see with their eyes. Our headset has already been tested by over 200 visually impaired people.
SME Instrument Phase 1 aimed at validating the vocal instruction patterns (as our headset relies on an audio-only interface), extending our existing IP portfolio, and initiating Europe-wide market uptake.