Multifunctional, adaptive and interactive AI system for Acting in multiple contexts

Periodic Reporting for period 1 - MAIA (Multifunctional, adaptive and interactive AI system for Acting in multiple contexts)

Reporting period: 2021-01-01 to 2022-06-30

Our interactions with objects in the environment are essential for life. They are made possible by the coordination of eye, hand and body movements, which represent our main interfaces with the external world.
Despite the alarming data on disability, no short-term solution can provide complete recovery based on the regeneration of damaged nerve tissue. Therefore, in the non-acute phase of disabling diseases, enhancing functional abilities could represent a valuable solution. In recent years, neuroscientists and engineers have shown the promising possibility of using cortical recordings from the human brain to drive Human Centric Artificial Intelligence (AI) controllers integrated in robotic devices, allowing interactions with the environment. However, much work remains to be done to achieve human-acceptable levels of control and to mimic the adaptive nature of human motor behaviour. Recent research on the healthy brain has shown that a large part of this job is performed by brain areas classified as high-order cognitive areas in the posterior parietal cortex (PPC).
Although the PPC contains all the necessary information, it is extremely difficult to decode this large cortical structure all at once, while successful real-time neuroprosthetic control is demanding if it is to provide efficient assistance. Supplementing neural data with overt behavioural indicators of intention and attention, such as eye movements, can reduce the complexity of the problem and improve the precision of the AI controller's decoding procedures.
We will apply the predictive processing approach to further improve the efficiency and precision of the AI controller.
The MAIA project aims to develop a Human Centric AI that exploits neural signals in combination with behavioural signals and can be integrated in different types of assistive devices, such as robotic arms, wheelchairs and exoskeletons, through an approach guided by the real needs and expectations of end-users. MAIA also aims to establish a European innovation ecosystem that can potentially span from healthcare to industry and space exploration.
During the first 18 months of the project, we achieved the following main results:
- In WP1, different studies of gaze behaviour were conducted to establish how eye movements can be used in intention decoding, in efficient and automatic error communication and in trust building. These investigations highlighted that humans can flexibly choose which visual feature defines the target and drives oculomotor learning, depending on the current intentions and tasks. The results suggest that coloring target objects or target regions in augmented reality may be a promising candidate for the presentation of feedback from the system to the user.
- In WP2, transcranial magnetic stimulation (TMS) experiments in humans and neural recordings in non-human primates (NHPs) investigated the neural substrate to prepare the ground for the design of action intention decoding. In humans, we probed the parieto-frontal networks underlying dexterous arm movements in three experiments. TMS experiments performed on healthy participants demonstrated the causal role of the PPC in programming and reprogramming reaching movements, particularly when depth (distance) is crucial. In NHPs, we investigated the activity of several areas of the PPC during the preparation and execution of arm movements and found striking similarities with the activation patterns of motor cortices already reported in the scientific literature. In particular, the discharges of parietal neurons were organized in discrete phases, termed 'neural states', identifiable with Hidden Markov models as the task progressed (illustrated in the first sketch after this list). Moreover, the structure of PPC activity and its encoding of relevant features shared common characteristics with activity in the NHP motor cortex, which in turn was found to be functionally similar to the motor cortex of a human stroke patient implanted with a multielectrode array (see WP4). We then moved to a decoding approach to demonstrate the feasibility of developing neuroprostheses based on PPC signals, and we succeeded in decoding PPC activity. We also employed the innovative Active Inference framework, both as an underlying computational theory of the predictive goal-directed sensorimotor control implemented by the parietal-motor complex and as an efficient computational solution for decoding action intentions.
- In WP3, to create a trustworthy human-centric AI technology, we collected information from potential end-users, both patients and caregivers, on their attitudes towards assistive technologies and AI-driven systems, and evaluated their expectations and suggestions regarding the functionalities, appearance and services that these systems should embody in future development.
- In WP4, to develop and test the methods proposed for the new MAIA paradigm, the human dataset acquired from a chronic stroke patient within the ISMORE clinical trial was used during this first part of the project. An array for intracortical brain activity recordings had been implanted in the primary motor cortex (M1) of the ipsilesional hemisphere, in the perilesional area next to the cerebrovascular accident. A preliminary analysis of the intracortical activity in a subset of the data was performed to evaluate whether representative neural signatures of attempted motor tasks were replicated in the stroke condition. By applying ANOVA statistical tests and dimensionality reduction techniques (see the second sketch after this list), we found results similar to those obtained from the monkey in a comparable task. Additionally, a novel EMG decoding method that can be integrated within a hybrid BMI for stroke rehabilitation was developed and tested in human participants, demonstrating that it can elicit motor learning.
- The main purpose of WP5 is to create an ecosystem fully aware of the technological potential of the proposed approach, paving the way for the future extension of human-centric AI-based control of cooperative devices to fields beyond purely biomedical applications.
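As a purely illustrative sketch of the WP2 analysis described above, the following Python code segments binned population firing rates into discrete 'neural states' with a Gaussian Hidden Markov Model. The data here are synthetic placeholders, and the hmmlearn library, variable names and parameter choices are assumptions for illustration, not the project's actual pipeline.

    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(0)
    # Placeholder data: 200 time bins x 40 units of smoothed firing rates (Hz).
    rates = rng.poisson(lam=5.0, size=(200, 40)).astype(float)

    # Fit an HMM with a few hidden states; each state captures a distinct
    # population firing pattern (e.g. preparation, reach, hold).
    hmm = GaussianHMM(n_components=4, covariance_type="diag", n_iter=100, random_state=0)
    hmm.fit(rates)

    # The decoded state sequence marks when the population switches between
    # discrete phases as the task progresses.
    states = hmm.predict(rates)
    print("state transitions at bins:", np.flatnonzero(np.diff(states)) + 1)

The number of hidden states is a modelling choice; in practice it would be selected by comparing model likelihoods or by matching the known phases of the behavioural task.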
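The second sketch, referenced in the WP4 item, illustrates the general kind of analysis combining per-unit ANOVA tests with dimensionality reduction. It runs on synthetic spike counts, not the ISMORE dataset, and all names and parameters are hypothetical.

    import numpy as np
    from scipy.stats import f_oneway
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    n_trials, n_units, n_conditions = 120, 30, 3
    # Placeholder spike counts per trial and unit, with a condition label per trial.
    counts = rng.poisson(lam=4.0, size=(n_trials, n_units)).astype(float)
    labels = rng.integers(0, n_conditions, size=n_trials)

    # One-way ANOVA per unit: does mean firing differ across attempted movements?
    pvals = np.array([
        f_oneway(*[counts[labels == c, u] for c in range(n_conditions)]).pvalue
        for u in range(n_units)
    ])
    print(f"{(pvals < 0.05).sum()} / {n_units} units modulated by condition")

    # PCA on condition-averaged activity gives a low-dimensional summary of the
    # population structure that can be compared across datasets.
    cond_means = np.stack([counts[labels == c].mean(axis=0) for c in range(n_conditions)])
    pcs = PCA(n_components=2).fit_transform(cond_means)
    print("condition centroids in PC space:")
    print(pcs)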
In the conception of MAIA, the basis of a novel technological paradigm for neuroprosthetic control was established by exchanging state-of-the-art knowledge, concepts and approaches across fields such as AI, robotics, neuroscience, and the cognitive and social sciences. MAIA advances along two avenues: using gaze behaviour in addition to body-motion signals to decode human intentions, and targeting specific regions of the PPC recently found to be critical for guiding both reaching and grasping and to contain self-motion information.
Specifically, MAIA results may have several scientific and technological impacts: 1) strong engagement of companies working in the machine learning field; 2) the creation of an interactive human-centric AI system that can lead to higher acceptance of and trust in the technology; 3) awareness of the possibility of improving the accuracy of AI systems in providing motor responses, for example in medical environments; 4) shedding light on new scientific and industrial realities in the field of human-AI cooperation; 5) filling the social gaps that disabling pathologies generate.
MAIA CONCEPT