CORDIS - EU research results

Robotics and Artificial Intelligence for Critical Asset Monitoring

Periodic Reporting for period 1 - RAICAM (Robotics and Artificial Intelligence for Critical Asset Monitoring)

Reporting period: 2023-01-01 to 2024-12-31

The central scientific goal of RAICAM is to conduct research into the underlying technologies in Robotics and Artificial Intelligence (AI) that will unlock the capability for a fleet of robots to conduct coordinated sampling campaigns in industrial facilities with varying levels of autonomy.
The use of robots for the inspection, maintenance, and repair (IMR) of critical infrastructure assets will revolutionise many sectors such as energy generation, energy transmission, civil infrastructure and utilities. In fact, for some industries, the use of robotics is the only way to realise their full potential. For example, offshore wind energy generation is only feasible at the required scale if robotic platforms are used, for reasons of safety, economics and the skills shortage.
To realise their full potential, robots will need to interact with the environment to perform maintenance and repair activities. Environmental interaction is a highly complex activity requiring a multi-disciplinary approach combining cognition (perception and reasoning), control, planning and, depending on the level of autonomy, human interactions.
Whilst human technicians currently conduct sampling regimes, there is often significant uncertainty associated with the repeatability, accuracy and efficacy of the samples. A recent study has shown that automated systems have the potential to outperform humans when collecting samples. Humans are also limited as to where they can take samples, either by environmental hazards (radiation, chemicals, heat) or by location (confined spaces or heights).

The RAICAM Doctoral Network will train the next generation of robotic systems engineers who will develop creative and innovative multi-disciplinary skills, enhancing their inter-sectoral mobility.
FORTH: DC completed initial CAD of modular legged robots and integrated it into simulations (MuJoCo, IsaacSim). A RL-based locomotion control method was developed, and controllers are being deployed on the Unitree Go2 for real-world testing.
ENSTA-PARIS: Six studies showed how cognitive load and trust impact human-robot interaction, highlighting the need for balance in information, autonomy, and reliability for effective collaboration.
TUM: Contributions include a ROS2 state machine and valve pose estimation for autonomous manipulation. Research focused on optimizing robot motion speeds in human-shared environments, with dynamic speed adaptation algorithms.
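To make the dynamic speed adaptation idea concrete, here is a minimal sketch in the spirit of speed-and-separation monitoring: the robot's commanded speed is scaled by its measured distance to the nearest human. This is an illustrative rule, not TUM's algorithm, and the thresholds `d_stop` and `d_full` are placeholder values.

```python
def scaled_speed(v_max, d_human, d_stop=0.5, d_full=2.0):
    """Illustrative speed-and-separation rule (not the TUM algorithm):
    halt inside d_stop metres of a human, run at full speed beyond
    d_full metres, and interpolate linearly in between."""
    if d_human <= d_stop:
        return 0.0
    if d_human >= d_full:
        return v_max
    return v_max * (d_human - d_stop) / (d_full - d_stop)
```

A real system would feed this from a perception pipeline at control rate and apply rate limits on the resulting speed command.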
IIT: Developed a teleoperation interface integrating locomotion and manipulation, plus autonomous manipulation strategies using DDPG for object sliding and shared autonomy for surface sampling.
USE: Investigating soft manipulators for flapping-wing robots, developing control algorithms and sensor systems to improve interaction and perform cooperative tasks.
TAU: Developed an open drone platform for visual navigation, including a digital twin and learning-based model, with successful preliminary experiments.
UTU: Advanced a Birds Eye View architecture for efficiency and accuracy, and is preparing a review article on multi-robot systems in perception, planning, and control.
UNIMAN: Yifeng Tang, in his first year, developed a Dual Quaternion model for the MIRRAX robot, setting the foundation for future motion control development.
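For readers unfamiliar with the representation, a minimal dual quaternion sketch (illustrative only, not the MIRRAX model itself): a unit dual quaternion packs a rotation quaternion and a translation into one algebraic object, so rigid-body transforms compose by a single multiplication.

```python
import numpy as np

def qmul(a, b):
    # Hamilton product of quaternions stored as (w, x, y, z)
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2])

def dq_from_pose(q_rot, t):
    # Unit dual quaternion: real part = rotation, dual part = 0.5 * t * q
    t_quat = np.array([0.0, *t])
    return q_rot, 0.5 * qmul(t_quat, q_rot)

def dq_mul(dq1, dq2):
    # Rigid transforms compose: (r1, d1)(r2, d2) = (r1 r2, r1 d2 + d1 r2)
    r1, d1 = dq1
    r2, d2 = dq2
    return qmul(r1, r2), qmul(r1, d2) + qmul(d1, r2)

def dq_translation(dq):
    # Recover translation: t = 2 * d * conj(r)
    r, d = dq
    r_conj = r * np.array([1.0, -1.0, -1.0, -1.0])
    return qmul(2.0 * d, r_conj)[1:]
```

For example, composing two pure translations (1, 0, 0) and (0, 2, 0) yields (1, 2, 0); the same machinery handles coupled rotation and translation, which is what makes it attractive for whole-body motion modelling.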
UWE: Focus groups with operators from UKAEA and Sellafield Ltd. informed design requirements for teleoperation interfaces in nuclear facilities.
KIT: KIT's DC joined in December 2024 and developed a real-time point cloud streaming system for robot teleoperation using XR, with a GPU-accelerated pipeline for real-time scene reconstruction.
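The core geometric step in such a pipeline is back-projecting each depth pixel into a 3-D point with the pinhole camera model. Below is a minimal CPU sketch of that step (the KIT system itself is GPU-accelerated, and the intrinsics here are placeholder values, not a real calibration).

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (in metres) to camera-frame 3-D points
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    fx, fy, cx, cy are camera intrinsics (placeholders here)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]   # drop invalid (zero-depth) pixels
```

Streaming then reduces to compressing and transmitting these point buffers each frame, with multi-camera fusion merging clouds from several such projections into one scene.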
FORTH: The modular design of the legged robots studied is unique, enabling easy adaptation to different terrains and simplified controller use.
ENSTA: Our studies highlight the balance of information and robot autonomy to boost user trust and reduce mental effort, improving safety and efficiency in tasks like teleoperation. Bayesian models predict trust changes, and reinforcement learning adapts guidance based on the user's mental state. Expanding studies to industrial settings will help validate these models and promote adoption in real-world applications.
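As a minimal sketch of how a Bayesian trust model can work (a simple Beta-Bernoulli stand-in, not the models used in these studies): each observed robot success or failure updates a posterior over the probability that the user can rely on the robot.

```python
def update_trust(alpha, beta, success):
    """Illustrative Beta-Bernoulli trust update: the Beta(alpha, beta)
    posterior over reliability gains a pseudo-count for each observed
    robot success or failure."""
    return (alpha + 1, beta) if success else (alpha, beta + 1)

def trust_estimate(alpha, beta):
    # Posterior mean of the reliability probability
    return alpha / (alpha + beta)
```

Starting from a uniform Beta(1, 1) prior, three successes and one failure give Beta(4, 2), i.e. an estimated reliability of about 0.67; richer models condition these updates on task context and the user's mental state.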
IIT: Our DDPG RL framework for non-prehensile sliding, enhanced by an LSTM model for friction estimation, achieves robust zero-shot sim-to-real transfer. Our teleoperation interface offers low-cost, sensor-independent control and demonstrates strong potential for commercialization and international adoption.
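For reference, the standard DDPG updates that such a framework builds on are (generic notation, not the notation of the IIT work):

```latex
\[
y_t = r_t + \gamma\, Q_{\theta'}\!\big(s_{t+1}, \mu_{\phi'}(s_{t+1})\big),
\qquad
L(\theta) = \mathbb{E}\big[\big(Q_{\theta}(s_t, a_t) - y_t\big)^2\big],
\]
\[
\nabla_{\phi} J \approx
\mathbb{E}\big[\nabla_{a} Q_{\theta}(s, a)\big|_{a=\mu_{\phi}(s)}\,
\nabla_{\phi}\mu_{\phi}(s)\big],
\]
```

where $Q_\theta$ is the critic, $\mu_\phi$ the deterministic actor, and $\theta'$, $\phi'$ slowly updated target networks; the LSTM friction estimate enters as part of the state $s$.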
KIT: The integration of real-time point cloud visualization for teleoperation represents an advancement beyond the current state of the art for improving 3D representation and accuracy. Multi-camera fusion addresses depth perception issues and enhances real-time scene reconstruction.
TAU: The developed learning-based visual navigation model is the first drone navigation model that is purely data-driven and embodiment-agnostic.
TUM: The candidate's research has the potential to inform new industry standards for human safety, improving robot efficiency without jeopardizing the safety of human co-workers and thereby broadening the range of applications where robots are practical.
USE: A Port-Hamiltonian-based control method for cooperative manipulation tasks with two flapping-wing robots is introduced. The algorithm ensures even load distribution and demonstrates effective trajectory tracking for dual-arm tasks on a system comprising two flapping-wing birds.
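For context, the generic input-state-output port-Hamiltonian form that such controllers build on is (generic symbols, not the notation of the USE work):

```latex
\[
\dot{x} = \big(J(x) - R(x)\big)\,\frac{\partial H}{\partial x}(x) + g(x)\,u,
\qquad
y = g(x)^{\top}\,\frac{\partial H}{\partial x}(x),
\]
```

with $J(x) = -J(x)^{\top}$ capturing lossless energy exchange, $R(x) \succeq 0$ the dissipation, and $H$ the total stored energy; control then shapes $H$ and injects damping so that the interconnected aerial manipulators track the trajectory while sharing the load.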
UTU: We developed an innovative bird's eye view algorithm for ground robots, enhancing situational awareness and navigation, especially in complex environments. It improves object recognition, multi-robot coordination, and task allocation.
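The classical geometry underlying a bird's eye view is inverse perspective mapping: image points on the ground plane map to top-down coordinates through a homography. A minimal sketch follows (the UTU architecture is learning-based; the homography `H` here is assumed known from camera calibration).

```python
import numpy as np

def warp_to_bev(pts_img, H):
    """Illustrative inverse-perspective mapping: apply a 3x3 homography H
    (assumed obtained from calibration) to Nx2 image points, yielding
    ground-plane (bird's eye view) coordinates after the projective
    divide."""
    pts = np.hstack([pts_img, np.ones((len(pts_img), 1))])  # homogeneous
    bev = (H @ pts.T).T
    return bev[:, :2] / bev[:, 2:3]
```

A learned BEV head refines this fixed geometry by also reasoning about occlusion and objects above the ground plane, which is where the accuracy gains come from.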
UWE: We identified 22 design requirements for robot teleoperation in nuclear facilities, guiding the design of user interfaces to meet end-users' needs in high-risk environments.
The RAICAM Consortium on 18th Nov. 2024, during the second training school at ENSTA-PARIS.
2nd image related to the progress of IIT during the 1st reporting period
Highlights from the Hi Paris Robotics Symposium that took place on the 4th and 5th of June at ENSTA
The drone developed by TAU during the 1st reporting period
1st image related to the progress of IIT during the 1st reporting period
Workshops 1,2,3 and Training school 1 were held at IIT, Genova, Italy, during 15th - 18th of April
RAICAM DCs on 18th Nov. 2024, during the second training school at ENSTA-PARIS.
UNIMAN DC and supervisors at the ENSTA-PARIS premises during the second Training School
XR environments, related to the progress of KIT during the 1st reporting period
On 29 April, Alessandro Melone (TUM) was involved in the organization of a workshop
Juan José Garcia Cardenas (ENSTA-Paris) was awarded the Best Pitch Award as a 1st-year PhD student
Design of the drone developed by TAU during the 1st reporting period
DC of USE (Sahar). Research on advanced manipulation using flapping wing robots.
The Kick-off meeting of RAICAM was held in TUM’s facilities, in Munich