Community Research and Development Information Service - CORDIS

H2020

AIDE Report Summary

Project ID: 645322
Funded under: H2020-EU.2.1.1.4.

Periodic Reporting for period 2 - AIDE (Adaptive Multimodal Interfaces to Assist Disabled People in Daily Activities)

Reporting period: 2016-02-01 to 2017-01-31

Summary of the context and overall objectives of the project

Around 80 million people in the EU, a sixth of its population, have a disability. They are often hindered from full social and economic participation by a variety of physical, psychological and social barriers. Recent trends in assistive technology for supporting activities of daily living (ADL), mobility, communication and so on are based on integrating the capabilities of the user with those of the assistive technologies.
The AIDE project aims to contribute strongly to improving the user-technology interface by developing and testing a revolutionary modular and adaptive multimodal interface, customizable to the individual needs of people with disabilities (Fig.1). It furthermore focuses on the development of a totally new shared-control paradigm for assistive devices that integrates, on the one hand, information from the identification of residual abilities, behaviors, emotional state and intentions of the user and, on the other hand, analysis of the environment and context factors. The initial concept has been redesigned to fulfill the needs, requirements and desires of the target end-users, in collaboration with occupational therapists, healthcare students and end-users.
A series of applications for the AIDE system have been identified across several domains in which disabled people could greatly benefit: Communication, Environmental control, Wearable robots and Entertainment.

Work performed from the beginning of the project to the end of the period covered by the report and main results achieved so far

During the first period, the ethical guidelines to be adopted within AIDE were defined. After that, the target scenarios were identified through end-users’ focus groups. Specifically, four scenarios emerged from gathering the user requirements of the different focus groups: 1) communication; 2) environmental control; 3) hygiene tasks; and 4) preparing and eating a meal.
Based on the end-users’ requirements in each scenario, the AIDE system was characterized and specified. To do so, it was divided into three main sub-systems: 1) hardware and software architecture of the multimodal interface; 2) multimodal sensory processing; and 3) shared human-machine control. All the components of each sub-system were technically specified, including their expected performance.
One of the main results of the first period was the prototype of a wearable and wireless system for biosignal recording and processing to be used as a key element in the AIDE multimodal sensory processing system (Fig.2 and 3).
The main outcomes of the second year are:
- Cedar Foundation obtained ethical approval for testing the AIDE system and is ready to receive it to begin validation with end users during the third year of the project. In addition, three experimental sessions were conducted to monitor the design and development of the AIDE hardware/software components (Fig.4).
- A modular standard architecture for the AIDE multimodal interface, based on the Yet Another Robot Platform (YARP) messaging system, has been developed, completed and tested.
- A novel shoulder-elbow robotic exoskeleton, attached to a wheelchair-fixed mount, that can be used for either the left or the right arm according to the user's needs and residual motion capabilities. To complete the AIDE exoskeleton, a new prono-supination and hand-assistance exoskeleton has been developed, integrated and tested.
- A universal system to control the movements of electric wheelchairs linked with AIDE interfaces has been developed and tested.
- Two intention detection methods were developed, based on: 1) a hybrid EEG/EOG system; and 2) EMG activity.
- Algorithms to detect and track the 3D position and orientation of texture-less objects. Gaze estimation algorithms have been developed and tested in different experimental sessions.
- Algorithms to recognize user activity in real ADL tasks, using Support Vector Machine (SVM), Artificial Neural Network (ANN) and Decision Tree (DT) classifiers.
- Development and optimization of the Low Level Controller of AIDE arm exoskeleton.
- A motion planning system grounded on a Learning by Demonstration approach and Dynamic Movement Primitives (DMP) was developed.
- The Finite-State Machine (FSM) was enriched with additional modules to give the user full control of task execution.
- Regarding the HLC and LLC for communication, control and entertainment, five main tasks were carried out: the development of the AIDE-BJ Service, the development of the AIDE-BJ Cmd, the creation of specific grids for Grid3, the development of the AIDE-BJ SHX software (a custom-built solution for environmental control and entertainment), and the development of the AIDE-BJ SHX hardware/firmware.
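To make the activity-recognition work above more concrete, the sketch below compares the three classifier families named in the report (SVM, ANN, DT) on hypothetical multimodal feature vectors; the features, labels and hyperparameters are illustrative placeholders, not project data.

```python
# Minimal sketch: comparing SVM, ANN and DT classifiers for ADL
# activity recognition. All data here is randomly generated as a
# stand-in for real multimodal features (e.g. EMG envelopes, gaze
# dwell times, EEG band powers).
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 6))        # 200 windows, 6 hypothetical features
y = rng.integers(0, 4, size=200)     # 4 illustrative ADL activity labels

results = {}
for name, clf in [
    ("SVM", SVC(kernel="rbf")),
    ("ANN", MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000)),
    ("DT", DecisionTreeClassifier(max_depth=5)),
]:
    # 5-fold cross-validated accuracy for each classifier family
    results[name] = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy {results[name]:.2f}")
```

With real, informative features the same loop would let the three families be compared fairly on identical folds before selecting one for the deployed system.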

Progress beyond the state of the art and expected potential impact (including the socio-economic impact and the wider societal implications of the project so far)

We expect a high impact from the redesign of the arm exoskeleton, for which radically new design approaches were used to meet the AIDE requirements and specifications. From a commercial point of view, a single device able to change arm configuration has greater impact, since it can reach a wider number of potential end users. The inclusion of passive degrees of freedom and size adjustments in the kinematic chain increases the safety of the physical human-robot interface (avoiding the transfer of undesired loads to the articulations by keeping each active joint aligned with the respective anatomical joint axis), as well as the comfort of the device, which can be adapted to users with different anthropometries. It is worth noting that the redesign of the exoskeleton led to a new invention that SSSA is going to protect by filing a patent application.
In the state of the art of multimodal architectures for the control of upper-limb robotic exoskeletons, few wearable robotic platforms have been developed and tested in activities of daily living scenarios with shared human-robot control systems. Unlike previous works, the AIDE system has been integrated with a hybrid control interface, based on gaze tracking, EEG and EOG, which allows the user to (i) trigger the execution of different sub-actions independently, and (ii) interrupt the task at any time. Since the general framework of the AIDE project is focused on multimodal interfaces for assistance in daily life scenarios, the shared control system was designed to provide the users with autonomous control over the selection and execution of different tasks. We hypothesized that users with disabilities would prefer having high autonomy and control over the single actions performed by the system, and that this would improve their independence and self-fulfilment in daily life scenarios.
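The "trigger each sub-action independently, interrupt at any time" behaviour described above can be sketched as a small finite-state machine; all state and event names below are hypothetical, not the project's actual FSM.

```python
# Minimal sketch of the shared-control idea: each user 'trigger'
# (e.g. an EEG command) starts the next sub-action, while a 'stop'
# (e.g. an EOG gesture) interrupts the task at any time. The
# sub-action names are illustrative placeholders.
class SharedControlFSM:
    SUB_ACTIONS = ("reach", "grasp", "bring_to_mouth")  # hypothetical drinking task

    def __init__(self):
        self.state = "idle"
        self._next = 0  # index of the next sub-action to execute

    def on_event(self, event):
        if event == "stop":
            # User interrupt: return to idle from any state.
            self.state, self._next = "idle", 0
        elif event == "trigger":
            if self._next < len(self.SUB_ACTIONS):
                self.state = self.SUB_ACTIONS[self._next]
                self._next += 1
            else:
                self.state, self._next = "idle", 0  # task finished
        return self.state

fsm = SharedControlFSM()
print(fsm.on_event("trigger"))  # reach
print(fsm.on_event("stop"))     # idle: user interrupts mid-task
```

Keeping the user's stop event valid in every state is what makes the control "shared": the robot sequences the motion, but the user retains veto power over each step.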
It should be noted that the hybrid BNCI, initially developed in the WAY project and redesigned and improved in the AIDE project, was tested with 6 SCI subjects outside the laboratory. The results were published in Science Robotics by S.R. Soekadar et al., with high impact in the scientific community and mass media.
The proposed hierarchical learning control system improves on existing motion planning strategies in assistive robotics, since it can manage modifications of the robot's initial and target poses as well as complex trajectories.
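The Dynamic Movement Primitives mentioned earlier are what make this re-targeting possible: a demonstrated trajectory is encoded as a spring-damper system pulled toward the goal plus a learned forcing term, so changing the start or goal pose reshapes the motion without re-learning. The sketch below shows one DMP dimension with illustrative gains and a zero forcing term; it is a toy example, not the project's actual controller.

```python
# Minimal sketch of a discrete Dynamic Movement Primitive (DMP):
# a critically damped spring-damper system pulled toward the goal,
# modulated by a forcing term learned from demonstration. Here the
# forcing term defaults to zero, so the output is a smooth straight
# convergence to the goal. Gains are illustrative.
import numpy as np

def rollout_dmp(y0, goal, forcing=None, alpha=25.0, beta=25.0 / 4,
                dt=0.01, steps=300):
    """Euler-integrate one DMP dimension and return the trajectory."""
    y, dy = y0, 0.0
    traj = []
    for t in range(steps):
        f = forcing(t) if forcing else 0.0
        # Transformation system: spring toward goal + damping + forcing
        ddy = alpha * (beta * (goal - y) - dy) + f
        dy += ddy * dt
        y += dy * dt
        traj.append(y)
    return np.array(traj)

traj = rollout_dmp(y0=0.0, goal=0.3)  # e.g. move the hand 0.3 m forward
```

Because the goal enters the equation explicitly, calling `rollout_dmp` with a different `goal` re-targets the motion directly, which is the property that lets a Learning-by-Demonstration planner generalize a taught trajectory to new object positions.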
We are analyzing the possibility of protecting the universal system for controlling the movements of electric wheelchairs linked with the interfaces proposed by the project. In this period, a wireless mechatronic device was developed and tested in UMH facilities. From a commercial point of view, BJ is interested in introducing the device into its product portfolio.
