
Socially Pertinent Robots in Gerontological Healthcare

Periodic Reporting for period 3 - SPRING (Socially Pertinent Robots in Gerontological Healthcare)

Reporting period: 2023-06-01 to 2024-05-31

In the past five years, social robots have been introduced into public spaces such as museums, airports, shopping malls, banks, company showrooms, hospitals and retirement homes, to mention a few examples. In addition to classical robotic skills such as navigation and the grasping and manipulation of objects, i.e. physical interactions, social robots must be able to communicate with people in the most natural way possible, i.e. cognitive interactions.
What if robots could take on the repetitive tasks involved in receiving the public? There are already forms of artificial intelligence capable of interacting with humans. While “butler” robots can provide the weather forecast or give directions, they are not able to execute complex social tasks autonomously, such as escorting users around a building. To carry out such tasks, a social robot must be capable of perceiving and distinguishing signals emitted by different speakers, understanding these signals and identifying that they are addressed to the robot, and then reacting accordingly. This is a daunting challenge, because it requires numerous perceptual abilities and a capacity for machine learning in order to support autonomous decision-making. SPRING's overall objective is to meet this challenge.
But how do we enable a robot to identify, from a set of conversations, which request is addressed to it; to understand that it is being asked where a person may sit; to look around and find a vacant seat; to determine a path to accompany the speaker to their seat while avoiding other patients and staff on the premises; and then to perceive the relevance of offering distraction in the form of conversation? There are numerous technological hurdles to overcome in order to accomplish this type of complex task. For movement, SPRING opted for reinforcement learning. To determine its speed, approach angle and other movement parameters, the robot is trained by a system that measures how closely the action actually taken matches the optimal action and attributes “rewards” for successful outcomes. This training phase lets the robot encounter a wide variety of possible situations in full autonomy, without human intervention to correct its trajectories. Once placed in real conditions, the robot continues to learn and to identify the optimal action for each situation. This opens up the possibility of its use in a hospital setting. That is the aim of the second phase of SPRING, which started in 2022 and ended in May 2024: to validate the use of the robot in a hospital and to assess its impact on users and their habits, as well as its acceptability. Entrusting even simple social tasks to a robot is nevertheless far from innocuous and raises numerous ethical and organisational issues, which are also addressed within the project.
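The reward-driven training loop described above can be illustrated with a toy example. The following minimal Python sketch is not SPRING code and uses entirely hypothetical states, actions and rewards; it shows tabular Q-learning on a small grid, where the agent is rewarded for reaching a destination, penalised for bumping into a person, and gradually learns a short, collision-free path.

```python
import random

# Toy grid world: the "robot" must reach a goal cell while avoiding an occupied cell.
# All names and numbers are illustrative placeholders, not SPRING code.
GRID = 5
GOAL = (4, 4)
OCCUPIED = (2, 2)                              # e.g. a person standing in the way
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]   # right, left, down, up

def step(state, action):
    """Apply an action, clip to the grid, and return (next_state, reward, done)."""
    nxt = (min(max(state[0] + action[0], 0), GRID - 1),
           min(max(state[1] + action[1], 0), GRID - 1))
    if nxt == OCCUPIED:
        return state, -1.0, False    # penalise bumping into the person
    if nxt == GOAL:
        return nxt, 10.0, True       # reward reaching the destination
    return nxt, -0.1, False          # small cost per move encourages short paths

Q = {((x, y), a): 0.0 for x in range(GRID) for y in range(GRID)
     for a in range(len(ACTIONS))}
alpha, gamma, eps = 0.5, 0.95, 0.1   # learning rate, discount, exploration rate

for episode in range(2000):
    s, done = (0, 0), False
    for _ in range(200):             # cap episode length
        greedy = max(range(len(ACTIONS)), key=lambda i: Q[(s, i)])
        a = random.randrange(len(ACTIONS)) if random.random() < eps else greedy
        s2, r, done = step(s, ACTIONS[a])
        # Temporal-difference update: move Q(s, a) towards reward + discounted future value.
        best_next = max(Q[(s2, i)] for i in range(len(ACTIONS)))
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2
        if done:
            break
```

After training, acting greedily with respect to Q reproduces the behaviour that was rewarded; the same principle, with far richer state and action spaces, underlies the navigation training described above.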
The SPRING project has reached most of its original objectives: 30 of the 33 TRL advancement steps in the original plan have been achieved. This is a good result for a research-oriented project such as SPRING, with its inherent uncertainties. Most importantly, the project produced important advances from the research perspective in the fields of Computer Vision (source-free domain adaptation), Audio Processing (audio-visual concurrent speaker detection), Dialogue Modelling (multi-party conversational robots with large language models), Reinforcement Learning (extension of the successor feature framework to non-linear reward functions) and Social Robot Acceptance (measuring usability and acceptance by real patients and companions in gerontological healthcare), to cite a few. This is attested in part by the publication record (as of May 2024, 115 published papers and over 2350 citations).
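For context on the successor-feature result mentioned above: in the standard, linear formulation, rewards decompose as r(s, a) = φ(s, a)·w and action values as Q(s, a) = ψ(s, a)·w, where ψ are the expected discounted feature occupancies of a policy and w the task-specific reward weights; SPRING's contribution extends this framework to non-linear reward functions. The short NumPy sketch below illustrates only the classical linear case, with made-up feature and weight values, to show why re-evaluating a policy on a new task reduces to a dot product.

```python
import numpy as np

# Linear successor features: Q(s, a) = psi(s, a) . w, with r(s, a) = phi(s, a) . w.
# psi holds the expected discounted feature occupancies of some fixed policy.
# All numbers below are arbitrary illustrative values.
n_states, n_actions, n_features = 3, 2, 4
rng = np.random.default_rng(0)

psi = rng.normal(size=(n_states, n_actions, n_features))  # successor features of a policy
w_task_a = np.array([1.0, 0.0, 0.5, 0.0])                 # reward weights for task A
w_task_b = np.array([0.0, 1.0, 0.0, 0.2])                 # reward weights for task B

# Evaluating the same policy on either task is a single dot product per (s, a):
Q_task_a = psi @ w_task_a    # shape (n_states, n_actions)
Q_task_b = psi @ w_task_b
print(Q_task_a)
print(Q_task_b)
```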
Below is a brief overview of the progress achieved on each of SPRING's objectives.
Overall objective: to develop Socially Assistive Robots with the capacity of performing multi-person interactions and open-domain dialogue.
--> This objective has been achieved almost completely (>90%). We prioritised the evaluation of robot acceptance and usability in single-user interactions in the hospital setting, while evaluating multi-user interaction in controlled environments.
Scientific objective: to develop a novel concept of socially-aware robots through innovative methods and algorithms.
--> The SPRING project produced numerous scientific outputs, ranging from methods that process visual and auditory content for scene, speaker and dialogue understanding, to methods that allow the robot to converse with several people and to navigate environments with diverse social properties.
Technological objective: to create a new generation of robots that are flexible enough to adapt to the needs of the users.
--> Hardware: the robotic platform produced is a low-cost, highly flexible platform for a new generation of social robots.
--> Software: we fostered the development of the ROS4HRI standard, specifically designed for social robotic platforms, and enabled crucial capabilities for such platforms, with 54 dedicated software modules publicly available (see the sketch after this list).
Experimental objective: to validate the technology based on HRI experiments in a gerontology hospital, and to assess its acceptability.
--> Validation of the platform was achieved with both patients and medical staff. Usability and acceptability scores improved over the successive experimental waves.
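As referenced in the software item above, here is a minimal example of how a client node might consume ROS4HRI-style person tracking. The topic name and message type follow the REP-155 (ROS4HRI) convention as commonly documented (/humans/persons/tracked publishing an hri_msgs/IdsList); treat them as assumptions and check the published specification and the SPRING module repositories for the exact interfaces.

```python
# Sketch of a ROS 1 (rospy) node consuming ROS4HRI person tracking.
# Topic and message names are assumptions based on REP-155; verify before use.
import rospy
from hri_msgs.msg import IdsList   # list of identifiers of currently tracked persons

def on_tracked_persons(msg):
    # msg.ids is assumed to hold the IDs of persons currently tracked, which
    # downstream modules (dialogue, navigation) can use to address people.
    rospy.loginfo("Tracked persons: %s", ", ".join(msg.ids) or "none")

if __name__ == "__main__":
    rospy.init_node("person_monitor")   # hypothetical node name
    rospy.Subscriber("/humans/persons/tracked", IdsList, on_tracked_persons)
    rospy.spin()
```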
Progress beyond the State of the Art has been achieved:
- To perform self-localisation and tracking in cluttered and populated spaces
--> Fully achieved
- To build single- and multiple-person descriptions as well as representations of their interaction
--> Fully achieved
- To augment the 3D geometric maps with semantic information
--> Fully achieved
- To quantify the users’ levels of acceptance of social robots
--> Partially achieved: we prioritised the evaluation of acceptability and usability in the hospital setting, and deployed the automatic measurement of robot acceptance only in controlled environments.
- To endow robots with the necessary skills to engage/disengage and participate in conversations
--> Partially achieved: every item was validated in the hospital setting except the capacity of the robotic platform to join a conversation, which, due to ethical constraints, was validated only in controlled environments.
- To empower robots with the skills needed for situated interactions
--> Fully achieved
- To learn active perception strategies online
--> Partially achieved: we developed a strategy to effectively adapt an action policy to take user feedback into account, and showed that pre-learning can reduce the amount of feedback needed (see the sketch after this list).
- Demonstrate the pertinence of the project’s scientific and technological developments
--> Fully achieved
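To make the active-perception item above concrete, here is one hedged illustration, not the project's actual method, of adapting a pre-learned action preference from sparse user feedback: positive or negative feedback is treated as a reward that nudges the preferences, and starting from pre-learned values rather than from scratch is what reduces the amount of feedback required.

```python
import random

# Illustrative only: a pre-learned preference table over discrete actions is
# fine-tuned from sparse user feedback (+1 approve / -1 disapprove).
ACTIONS = ["look_at_speaker", "look_around", "keep_still"]               # hypothetical actions
prefs = {"look_at_speaker": 0.6, "look_around": 0.3, "keep_still": 0.1}  # from pre-learning
lr = 0.2

def choose_action():
    # Sample an action proportionally to its current (non-negative) preference.
    weights = [max(prefs[a], 1e-3) for a in ACTIONS]
    return random.choices(ACTIONS, weights=weights, k=1)[0]

def update_from_feedback(action, feedback):
    """Shift preference mass towards approved actions and away from disapproved ones."""
    prefs[action] = max(prefs[action] + lr * feedback, 0.0)

# Example loop with simulated feedback standing in for a real user's response:
for _ in range(5):
    a = choose_action()
    feedback = 1 if a == "look_at_speaker" else -1
    update_from_feedback(a, feedback)
print(prefs)
```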
SPRING logo
SPRING-ARI robot at Inria (C) Inria