
Smart, User-friendly, Interactive, Tactual, Cognition-Enhancer that Yields Extended Sensosphere - Appropriating sensor technologies, machine learning, gamification and smart haptic interfaces

Periodic Reporting for period 2 - SUITCEYES (Smart, User-friendly, Interactive, Tactual, Cognition-Enhancer that Yields Extended Sensosphere - Appropriating sensor technologies, machine learning, gamification and smart haptic interfaces)

Reporting period: 2019-07-01 to 2021-06-30

The overall objective of SUITCEYES is to improve the level of independence and participation of persons with deafblindness and to enhance their communication, perception of the environment, knowledge acquisition, and conduct of daily routines. SUITCEYES proposes a new, intelligent, flexible and expandable mode of haptic communication via soft interfaces. It brings user needs and policy studies together with disability studies, haptics, psychophysics, smart textiles, sensor technology, semantics, face and object recognition, image processing, and gamification in order to address the following three challenges: (a) perception of the environment; (b) communication and exchange of semantic content; (c) joyful learning and life experiences. The haptic communication interface is a positive step in supporting people with deafblindness to achieve more of their aspirations. The availability of improved communication solutions (i.e. communication at a distance rather than, for example, person-to-person tactile signing) can enhance the user's ability to be a more active participant in society, and hence society at large will benefit from the contributions of the intended users of the technologies developed in the project.

The overall objectives were:
- To extend users’ independent perception of the physical surrounding environment
- To develop a haptic interface for informing of the physical surrounding environment
- To extend users’ communication capabilities by facilitating communication or extending the range of haptic vocabulary at their disposal
- To capture, translate and semantically represent environmental cues
- To facilitate learning and extend fun life experiences through gamification and affective computing
- To improve the circumstances in which the users find themselves
- To identify and raise awareness of the priorities and aspirations of participating deafblind people for leading active and fulfilling lives, and the barriers to achieving these
- To identify good practice in policy frameworks across the participating countries and make recommendations for where policy could be improved
Within the SUITCEYES project:
a) A large user study was conducted to develop an understanding of whether and how new haptic technologies might be useful in everyday life and learning situations, and of the types of opportunities and barriers that affect their availability and use by persons with deafblindness. Based on these findings, a set of user requirements was identified and use-case scenarios were formulated. Additionally, the relevant national policy and legal frameworks were examined in 5 EU countries. These findings in turn fed into and informed other tasks in the project and related decisions.
b) Sophisticated algorithms were developed for real-time object, face, and scene detection and recognition.
c) Semantic representation and reasoning were developed, and experiments were conducted with haptic solutions that can mimic social haptic communication.
d) Sensors were reviewed, and the most relevant options were identified and used.
e) Prototypes were developed that detect obstacles and obstacle-free spaces, along with smart textile prototypes that accommodate the related sensors and actuators in a comfortable, discreet, and well-presented fashion.
f) Gamified scenarios were identified and experimented with.
g) All efforts were tested in psychological experiments for relevance and effectiveness (although Covid-19 imposed restrictions on the levels of user participation).
h) Other innovations included: one-to-many haptic communication; a tactile board that enables sending haptic messages to the user by drawing the haptic patterns on a touch screen, by sound, or by text; and a haptogram design tool kit that allows the use of a large set of actuators and the design of sophisticated haptic patterns, or haptograms (a minimal representation sketch follows this list).
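
The haptogram design tool kit is described only at a high level above. As a purely illustrative sketch in Python (with hypothetical names, not the project's actual data model), a haptogram can be thought of as a named sequence of timed frames, each mapping actuator positions on a grid to vibration intensities:

from dataclasses import dataclass, field

@dataclass
class Frame:
    """One time step of a haptogram: actuator (row, col) -> intensity in [0, 1]."""
    duration_ms: int
    intensities: dict[tuple[int, int], float] = field(default_factory=dict)

@dataclass
class Haptogram:
    """A haptic pattern played on a grid of vibrotactile actuators."""
    name: str
    rows: int
    cols: int
    frames: list[Frame] = field(default_factory=list)

    def play(self, driver) -> None:
        """Send each frame to a (hypothetical) actuator driver in sequence."""
        for frame in self.frames:
            driver.set_all(frame.intensities)
            driver.wait(frame.duration_ms)

# Example: a short left-to-right sweep across a 4x4 actuator grid, the kind
# of pattern a design tool could let a user draw on a touch screen.
sweep = Haptogram(name="sweep_right", rows=4, cols=4)
for col in range(4):
    sweep.frames.append(
        Frame(duration_ms=150, intensities={(row, col): 1.0 for row in range(4)})
    )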
Regarding law and policy, WP2 has clarified the situation of people with deafblindness and other disabilities across 5 EU countries in relation to new technologies. The need for protection from exploitation and manipulation was highlighted, together with specific measures to ensure that these groups can take up opportunities as users of new technologies. For example, exemptions to permit the use of face recognition and biometric identification have been recommended to policymakers, together with the need for continuing involvement of people with disabilities and the scope for further technical development. The work has been linked to current developments to ensure it is timely and relevant.

Regarding visual analysis (WP3), we have developed algorithms for real-time detection and recognition of objects, faces, and scenes with high accuracy, using sophisticated dimensionality reduction and machine learning approaches that run both on powerful servers and on an onboard Raspberry Pi device. With respect to semantics, we have developed a full-fledged ontology-based model that can flexibly represent various aspects of the environment and respond to users' queries in natural language, which can then be converted to haptograms and conveyed to the user. Moreover, we have successfully integrated the two components, which also constitutes significant progress beyond state-of-the-art approaches to the perception of environmental cues.
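
The report does not name the specific models used, so the following Python sketch shows only the kind of real-time detection loop that a Raspberry Pi-class device can run; the use of OpenCV's DNN module with a pretrained MobileNet-SSD is an assumption for illustration, not the project's actual pipeline:

import cv2

CONF_THRESHOLD = 0.5

# Pretrained Caffe MobileNet-SSD; the file names are placeholders.
net = cv2.dnn.readNetFromCaffe("MobileNetSSD_deploy.prototxt",
                               "MobileNetSSD_deploy.caffemodel")

cap = cv2.VideoCapture(0)  # onboard camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    # Resize to the network's 300x300 input and normalize.
    blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)),
                                 scalefactor=0.007843, size=(300, 300),
                                 mean=127.5)
    net.setInput(blob)
    detections = net.forward()  # shape: (1, 1, N, 7)
    for i in range(detections.shape[2]):
        confidence = float(detections[0, 0, i, 2])
        if confidence < CONF_THRESHOLD:
            continue
        class_id = int(detections[0, 0, i, 1])
        # Box coordinates are relative; scale them back to the frame size.
        x1, y1, x2, y2 = (detections[0, 0, i, 3:7] * [w, h, w, h]).astype(int)
        # In a pipeline of this kind, detections would be handed to the
        # semantic layer and ultimately conveyed to the user as haptic feedback.
        print(f"class {class_id} at ({x1},{y1})-({x2},{y2}), conf {confidence:.2f}")
cap.release()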

WP4 selected and implemented sensors for location and depth sensing and computer vision that are suitable for being worn, and developed a modular mini-computer-based system for using them; it implemented a ROS- and MQTT-based communication system to enable modular communication with sensors and processors both on and off the garment; and it mounted and tested the sensors and algorithms developed.
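As an illustration of the MQTT side of this design, the following minimal Python sketch (using the paho-mqtt client; the broker address and topic names are invented for the example, not taken from the project) shows how an off-garment processor could subscribe to readings published by an on-garment sensor node:

import json
import paho.mqtt.client as mqtt

BROKER = "localhost"  # hypothetical: e.g. a broker on the garment's mini-computer

def on_message(client, userdata, msg):
    # Handle a depth-sensor reading published by an on-garment node.
    reading = json.loads(msg.payload)
    print(f"{msg.topic}: obstacle at {reading['distance_m']} m")

client = mqtt.Client()  # paho-mqtt 1.x style constructor
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe("suitceyes/sensors/depth")  # hypothetical topic name

# A sensor node elsewhere would publish readings the same way:
client.publish("suitceyes/sensors/depth", json.dumps({"distance_m": 1.2}))

client.loop_forever()  # dispatch incoming messages to on_message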

With regards to psychophysical testing (WP6):
- We completed and published 2 extensive literature overviews (one on thermal devices and one on haptic communication devices) to provide all partners (and the international haptics community) with the necessary background knowledge; a third overview, on haptic navigation, has been made available internally and will be submitted to a journal at a later stage.
- In several studies we gathered fundamental knowledge on how well humans can perceive the vibrotactile patterns envisioned for use with the HIPI.
- We produced a list of recommendations for the design of haptic vibration patterns.
- We built and tested a first version of an app that can be used to learn social haptic communication (SHC), and received positive feedback from a person with deafblindness.

WP7 achieved the following results and impacts:
- We now have a much better understanding of the concept of play and playfulness among persons with deafblindness.
- We derived concepts for gamified scenarios tailored to individuals with deafblindness, designed to enrich experiences, gamify learning of the HIPI, and enhance social experiences.
- We implemented several gamified concepts for individuals with deafblindness (e.g. the Easter-Egg-Hunt and Follow-your-Partner scenarios).
- In several studies we gathered knowledge on how persons with deafblindness can playfully interact with vibrotactile patterns.
- We developed the tactile board, a multimodal augmentative communication device for individuals with deafblindness (a minimal sketch of its text input mode follows this list).
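
As a purely illustrative Python sketch of the text input mode (with hypothetical names; the report does not describe the board's actual implementation), a text message can be translated into a playable sequence of haptograms through a simple lookup table:

# Hypothetical character-to-haptogram mapping; a real table would cover the
# full alphabet and possibly whole words or phrases.
HAPTOGRAM_TABLE = {
    "a": "haptogram_a",
    "b": "haptogram_b",
    " ": "pause",
}

def text_to_haptograms(message: str) -> list[str]:
    """Translate a text message into a playable sequence of haptogram names."""
    return [HAPTOGRAM_TABLE[ch] for ch in message.lower() if ch in HAPTOGRAM_TABLE]

print(text_to_haptograms("ab a"))  # ['haptogram_a', 'haptogram_b', 'pause', 'haptogram_a']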

In WP8, dissemination and exploitation, we reached a broad target audience of stakeholders (research, industry, interest groups, the general public) through channels such as the website, newsletters, a social media presence, and participation in events and fairs. This helped increase awareness of the project.