CORDIS - EU research results

AI-Powered Manipulation System for Advanced Robotic Service, Manufacturing and Prosthetics

Periodic Reporting for period 1 - IntelliMan (AI-Powered Manipulation System for Advanced Robotic Service, Manufacturing and Prosthetics)

Reporting period: 2022-09-01 to 2023-09-30

A key challenge in intelligent robotics is creating robots that can directly interact with the world around them to
achieve their goals. Robot manipulation is central to fulfilling the promise of robotics, since a robot is, by definition,
a machine with actuators it can use to change the world.

In recent decades, research on robot manipulation has grown substantially, aiming to exploit the increasing
availability of affordable robot arms and grippers to create machines capable of directly and autonomously interacting
with the world in useful applications. Learning will be central to such autonomous systems: the real world contains too
many variations for a robot to have an accurate prior model of human requests and behaviour, of the surrounding
environment and the objects in it, or of the skills required to manipulate them.

The main objective of the IntelliMan project is to address the question of how a robot can efficiently learn to manipulate in a
purposeful and highly performant way. IntelliMan ranges from learning individual manipulation skills from human demonstration,
to learning abstract descriptions of a manipulation task suitable for high-level planning, to discovering an object’s functionality by
interacting with it, while guaranteeing performance and safety. IntelliMan aims at developing a novel AI-Powered Manipulation System with
persistent learning capabilities, able to perceive the main characteristics and features of its surroundings by means of a heterogeneous
set of sensors, to decide how to execute a task autonomously, and to detect failures in task execution so that it can
request new knowledge through interaction with humans and the environment. IntelliMan further investigates how such AI-powered
manipulation systems are perceived by users and which factors enhance human acceptability.
- The application requirements and KPIs have been analysed and defined for all the use cases considered in the project;
- The experimental protocol to be used in UC1 has been defined by INAIL to involve patients in the project-related data collection;
- The fusion of control and sensing data as a Product of Experts has been defined as a basic tool for a unified approach across the project;
- The investigation on extracting low dimensional manifolds with tensor factorization and multiresolution shape encoding with Gaussian process implicit surfaces has been started;
- Hierarchical shared-autonomy models based on Hidden Markov Models have been employed to detect the various phases within the grasping process, enabling a more precise understanding of the task at hand;
- Advanced bidirectional human-robot interaction modalities are under investigation using an incremental learning approach, allowing updates of the model with few data samples and fostering the co-adaptation process between user and system;
- Novel algorithms and hardware for Motor Unit extraction from sEMG signals are under development, to be used in gesture classification;
- An on-demand incremental update mechanism has been developed that merges grasp-reliability confidence estimates from both Ridge Regression and Hidden Markov Models, with the aim of allowing a more seamless and stable HITL grasp control experience;
- Data fusion modalities considering the different types of sensors that will be used in the project have been defined and will be validated in the next period on the different robotic platforms;
- The design and implementation of novel AI-Oriented manipulation devices have been achieved and experimental validation is ongoing;
- The development of understanding and reasoning about manipulation task structures has been started;
- Methodologies for the evaluation of human acceptability during human-robot interaction have been defined, also making use of simulation and virtual-reality environments;
- The assessment of user acceptability of prostheses and of the general perception of AI-empowered HMIs is under way, to evaluate the effectiveness of the methodologies defined and implemented in the project;
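The Product of Experts fusion mentioned above has a particularly simple form when each expert is Gaussian: the product of Gaussian densities is (up to normalization) again Gaussian, with summed precisions and a precision-weighted mean. A minimal sketch of this idea (the function name and example values are illustrative, not the project's actual implementation):

```python
import numpy as np

def product_of_gaussian_experts(means, covs):
    """Fuse Gaussian experts: the fused precision is the sum of the
    experts' precisions, and the fused mean is the precision-weighted
    average of the experts' means."""
    precisions = [np.linalg.inv(c) for c in covs]
    fused_cov = np.linalg.inv(sum(precisions))
    fused_mean = fused_cov @ sum(p @ m for p, m in zip(precisions, means))
    return fused_mean, fused_cov

# Example: fuse a control prior with a sensing estimate of a 2-D state.
mu_control = np.array([0.0, 0.0])
mu_sensing = np.array([1.0, 2.0])
cov_control = np.eye(2) * 4.0   # uncertain control prior
cov_sensing = np.eye(2) * 1.0   # confident sensor measurement
mu, cov = product_of_gaussian_experts([mu_control, mu_sensing],
                                      [cov_control, cov_sensing])
# The fused estimate is pulled towards the more confident (sensing) expert.
```

The same precision-weighted form extends to any number of experts, which is what makes it attractive as a unified tool for combining heterogeneous control and sensing channels.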
UNIGE and UNIBO developed a monolithic 3D-printed solution for prosthetic arms showing high reliability and mobility. This result has been patented.

IDIAP showcased the capabilities of geometric algebra when applied to robot manipulation tasks. In particular, it has been shown how the modelling of cost functions for optimal control can be done uniformly across different geometric primitives, leading to low symbolic complexity of the resulting expressions and geometric intuitiveness. This approach can have a significant impact in terms of simplification and computational efficiency in many robotic tasks, and in particular in the project use cases.

DLR developed a novel grasp planning algorithm for hybrid grippers that allows for multiple grasping modalities as a promising strategy to improve the manipulation capabilities of a robotic system.

UNIBO developed advanced Human-In-The-Loop (HITL) control strategies for robot hands based on surface electromyography (sEMG), combining Non-Negative Matrix Factorization (NMF) with Deep Neural Networks (DNN) in order to both avoid explicit labeling procedures and retain powerful nonlinear fitting capabilities. This research can have a significant impact on the effectiveness of prosthesis control strategies.
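The appeal of NMF in this setting is that it decomposes nonnegative sEMG envelopes into latent "synergies" without any labels; the resulting activations can then be fed to a DNN regressor. A minimal unsupervised-factorization sketch using the classic Lee-Seung multiplicative updates (the data here are synthetic; this is not UNIBO's actual pipeline):

```python
import numpy as np

def nmf(V, k, iters=500, eps=1e-9):
    """Factor a nonnegative matrix V (channels x samples) into
    W (channels x k synergies) and H (k x samples activations),
    minimizing ||V - WH||_F with multiplicative updates, which
    preserve nonnegativity at every step."""
    rng = np.random.default_rng(0)
    n, m = V.shape
    W = rng.random((n, k)) + eps
    H = rng.random((k, m)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Hypothetical sEMG envelopes: 8 channels, 100 samples,
# generated by 2 latent synergies.
rng = np.random.default_rng(1)
V = rng.random((8, 2)) @ rng.random((2, 100))
W, H = nmf(V, k=2)
reconstruction_error = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
# The unlabeled activations H are what a downstream DNN would consume.
```

Because no labels enter the factorization, the labeling burden shifts entirely to the (smaller) supervised stage that maps activations to hand commands.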

UNIGE developed a neural architecture to optimize tactile data-processing systems, taking into account the computational cost of the whole pipeline, which consists of data preprocessing and a convolutional neural network (CNN) model to extract information. The experimental results show that systems based on standard CNNs outperform state-of-the-art techniques in terms of accuracy and computational cost, while the ones based on binary CNNs further reduce the computational cost with a small accuracy drop.

UNIGE developed a Recurrent Spiking Neural Network (RSNN) using surrogate gradient descent for naturalistic texture classification. The obtained results indicate that such neuromorphic devices achieve gains of several orders of magnitude in energy over von Neumann hardware. Moreover, the proposed RSNN model outperforms similar state-of-the-art solutions in terms of classification accuracy and hardware complexity, making it a promising candidate for embedded electronic-skin applications.
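The core trick in surrogate gradient training is that the spike nonlinearity (a Heaviside step) has zero derivative almost everywhere, so during backpropagation it is replaced by a smooth surrogate. A toy sketch of a leaky integrate-and-fire (LIF) forward pass together with a common fast-sigmoid surrogate derivative (parameter values and function names are illustrative, not UNIGE's model):

```python
import numpy as np

def surrogate_spike_grad(v, threshold=1.0, beta=10.0):
    """Surrogate for the derivative of the Heaviside spike function:
    the derivative of a fast sigmoid, peaked at the firing threshold.
    Used only in the backward pass during training."""
    return 1.0 / (beta * np.abs(v - threshold) + 1.0) ** 2

def lif_forward(inputs, tau=0.9, threshold=1.0):
    """Forward pass of a single LIF neuron over time: the membrane
    potential leaks by factor tau, integrates the input current, and
    is soft-reset by the threshold whenever a spike is emitted."""
    v, spikes = 0.0, []
    for i in inputs:
        v = tau * v + i
        s = 1.0 if v >= threshold else 0.0
        spikes.append(s)
        v -= s * threshold
    return np.array(spikes)

spikes = lif_forward(np.array([0.3, 0.4, 0.5, 0.1, 0.9]))
```

In a full RSNN the same neuron model is unrolled through time with recurrent weights, and the surrogate derivative makes the whole unrolled graph trainable with ordinary gradient descent.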

UPC developed an ontology-based manipulation framework in which reasoning is used to enhance perception with situation awareness, planning with domain awareness, and execution with awareness of the execution structures. This enables autonomous service robots to adapt the stages of the perceive-plan-execute cycle to perturbations ranging from small deviations in the poses of objects to large unexpected changes in the environment, as well as to recover from potential failures.
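The recovery behaviour described above hinges on closing the loop: after a failed or infeasible plan, perception refreshes the world model before replanning. A minimal control-loop sketch; all class and method names here are hypothetical placeholders, not the actual UPC framework API:

```python
def run_task(robot, goal, max_attempts=3):
    """Perceive-plan-execute cycle with failure recovery: on any
    planning or execution failure, re-perceive and replan, so both
    small pose deviations and large environment changes are absorbed."""
    for _ in range(max_attempts):
        world = robot.perceive()        # situation-aware perception
        plan = robot.plan(world, goal)  # domain-aware planning
        if plan is None:
            continue                    # infeasible: re-perceive and retry
        if robot.execute(plan):         # monitored execution
            return True
    return False

class FakeRobot:
    """Toy stand-in: the first execution fails (a perturbation),
    the second succeeds after replanning."""
    def __init__(self):
        self.attempts = 0
    def perceive(self):
        return {"objects": ["cup"]}
    def plan(self, world, goal):
        return ["approach", "grasp", goal]
    def execute(self, plan):
        self.attempts += 1
        return self.attempts >= 2
```

The ontology's role in the real framework is to inform each of these three calls with reasoning, which this skeleton deliberately leaves abstract.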

UCLV proposed a novel method to refine the 6D pose estimate inferred by an instance-level deep neural network, exploiting the depth measurement of a standard RGB-D camera to estimate the dimensions of the considered object, even though the network is trained on a single CAD model of that object with given dimensions. The improved accuracy of the pose estimate allows a robot to successfully grasp apples of various types and significantly different dimensions.
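The underlying geometric idea can be illustrated with the pinhole-camera relation: the depth reading converts the object's apparent pixel size into a metric size, whose ratio to the CAD-model size gives a scale correction. A hypothetical sketch (function name and numbers are illustrative, not UCLV's actual method):

```python
def estimate_scale(pixel_width, depth_m, focal_px, cad_width_m):
    """Recover the object's metric width from its pixel width and the
    depth reading (metric_width = pixel_width * depth / focal), then
    return the ratio to the CAD model's assumed width."""
    metric_width = pixel_width * depth_m / focal_px
    return metric_width / cad_width_m

# Example: an apple detected 80 px wide at 0.5 m depth with a 600 px
# focal length, while the CAD model assumes a 0.06 m wide apple.
scale = estimate_scale(pixel_width=80, depth_m=0.5,
                       focal_px=600, cad_width_m=0.06)
# scale > 1 means the real apple is larger than the CAD model, so the
# object dimensions (and grasp width) can be corrected before grasping.
```

Rescaling the CAD dimensions this way is what lets a network trained on a single fixed-size model generalize to apples of significantly different sizes.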