Project description
Innovative haptic communication model to address deafblindness
Deafblindness is a condition that significantly restricts a person's ability to communicate and access information. Currently, approximately 2.5 million individuals in the EU are affected. However, innovative technological advancements rarely cater to individuals with profound dual impairments in vision and hearing. The EU-funded SUITCEYES project proposes an approach to haptic communication through adaptable and intelligent soft interfaces, designed according to the specific needs of users. The project aims to combine technologies such as smart textiles, sensors, semantic technologies, image processing, face and object recognition, machine learning, and gamification. It seeks to overcome challenges related to perceiving the environment, exchanging semantic content, and fostering learning, while enriching life experiences with joy and fulfilment.
Objective
Useful ICT innovations are continuously developed, improving the quality of life for many people. However, such solutions do not typically include people with severe dual vision and hearing impairments, at times coupled with cognitive disability. Deafblindness is a grave condition; though rare at birth, it can be acquired through various causes. An estimated 2.5 million people in the EU are deafblind. Limited communication is a major problem for this group, and one that SUITCEYES will address in a novel way. The benefits are not limited to this group: the solution will also scale to other areas.
SUITCEYES proposes a new, intelligent, flexible and expandable mode of haptic communication via soft interfaces. Based on user needs and informed by disability studies, the project combines smart textiles, sensors, semantic technologies, image processing, face and object recognition, machine learning, and gamification. It will address three challenges: perception of the environment; communication and exchange of semantic content; and learning and joyful life experiences. SUITCEYES will extract the inner structure of high-dimensional environmental and linguistic cues and map it to low-dimensional spaces, which are then translated into haptic signals. It will also use image processing to map environmental data for enriched semantic reasoning. SUITCEYES’ intelligent haptic interface will help users learn activation patterns through a new medium. With this interface, users will be able to take a more active part in society, improving their possibilities for inclusion in social life and employment.
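To make the cue-to-haptics idea concrete, the following is a minimal illustrative sketch, not project code: it assumes that each sensing frame yields a high-dimensional cue vector (e.g. object-recognition scores and proximity readings), reduces it to a low-dimensional space with a simple PCA-style projection, and maps each component to the drive level of a hypothetical vibrotactile actuator. The feature and actuator counts, the linear reduction, and the logistic mapping are all assumptions for illustration; the actual SUITCEYES mapping may differ.

```python
# Illustrative sketch: high-dimensional environmental cues -> low-dimensional
# space -> haptic actuator intensities. All dimensions and methods here are
# assumptions, not the SUITCEYES implementation.
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 64    # hypothetical size of one cue vector per sensing frame
N_ACTUATORS = 6    # hypothetical number of vibrotactile motors in the garment

# Stand-in for previously recorded cue vectors (e.g. from field trials).
history = rng.normal(size=(500, N_FEATURES))

# Fit a linear low-dimensional projection (PCA via SVD) on past observations.
mean = history.mean(axis=0)
_, _, components = np.linalg.svd(history - mean, full_matrices=False)
projection = components[:N_ACTUATORS]   # one principal direction per actuator


def cues_to_haptics(cue_vector: np.ndarray) -> np.ndarray:
    """Project a new cue vector and squash each component into a 0..1 drive level."""
    low_dim = projection @ (cue_vector - mean)
    return 1.0 / (1.0 + np.exp(-low_dim))   # logistic squashing to actuator range


# Example: one new frame of cues mapped to six actuator intensities.
print(cues_to_haptics(rng.normal(size=N_FEATURES)))
```

In such a scheme, learning the haptic "vocabulary" amounts to users associating recurring low-dimensional activation patterns with situations in the environment, which is consistent with the project's aim of teaching activation patterns through a new medium.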
The solution will be developed in a user-centred iterative design process, with frequent evaluations and optimisations. The users’ learning experiences will be enriched through gamification and mediated social interactions. The proposed solution will take into account potential differences in levels of impairment and user capabilities, and adapt accordingly.
Fields of science
Programme(s)
Funding Scheme
RIA - Research and Innovation action
Coordinator
50190 Boras
Sweden