
SecondHands: A Robot Assistant For Industrial Maintenance Tasks

Periodic Reporting for period 4 - SecondHands (SecondHands: A Robot Assistant For Industrial Maintenance Tasks)

Reporting period: 2019-04-01 to 2020-04-30

SecondHands aims to design a collaborative robot that can proactively offer support to maintenance technicians working in routine and preventative maintenance for warehouses and factories.

The project was given its name because the robot provides a “second pair of hands” to workers. The robot learns through observation and, by providing that second pair of hands, augments the technicians’ capabilities, completing tasks with a level of precision and physical strength not available to human workers. The robot can offer useful assistance such as holding, lifting, reaching, or passing objects. People can concentrate on the ‘skilled’ part of a job whilst the robot takes the support role, enabling human and machine to actively enhance each other’s complementary strengths.

The SecondHands project combines the skills of world-class researchers focusing on a real-world industrial use case to deliver:
- the design of a new robotic assistant
- a knowledge base to facilitate proactive help
- a high degree of human-robot interaction
- advanced perception skills to function in a highly dynamic industrial environment

Robots are evolving from industrial machines that perform repetitive tasks in isolated areas of large factories into highly complex systems. Against this background, SecondHands set itself the ambitious goal of solving one of the greatest challenges facing the robotics field: developing collaborative robots that can safely and intelligently interact with their human counterparts in a real-world factory environment.

The underlying robot platform, the ARMAR-6, was developed at the Karlsruhe Institute of Technology to meet the project’s requirements for human-robot interaction. Thanks to its high level of hardware-software integration, it combines advanced sensorimotor skills with learning and reasoning abilities. It can infer when a human needs help and proactively offer the most appropriate assistance. The robot can recognise human activities and intentions, reason about situations, and interact with humans in a natural way. It can also grasp and manipulate objects bimanually to use maintenance tools accurately and safely.
In the final stage, the consortium successfully achieved the final milestone: “Robot can handle some exceptions (for instance the technician needs to go back to collect more tools) and generalize to new events.”

Key breakthroughs in the area of Grasping and Manipulation include:
• A new humanoid robot incorporating two compliant 8-DoF arms, under-actuated hands, a holonomic platform, a head with 5 cameras, and a functional control architecture for integrating sensorimotor skills, learning and reasoning abilities.
• Novel methods for grasping objects and maintenance tools by combining visual and haptic sensing with model-based and data-driven machine learning approaches (a sketch of this kind of fusion follows this paragraph).
• Methods for grasping small and challenging objects with under-actuated hands
• Methods for manipulating large objects in collaboration with humans where the robot decides how to grasp an object or tool depending on human actions.
• Learning techniques ranging from explorative learning to teaching or coaching by humans.
The ability to test this use case in the real-world environment of Ocado’s highly automated warehouse was fundamental to the project’s success.
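
As a hedged illustration of how the combination of visual/haptic sensing and model-based/data-driven scoring mentioned above might look in code, the minimal sketch below ranks grasp candidates and then verifies the grasp haptically. The data structures, scores, and force thresholds are illustrative assumptions, not the project's actual grasping pipeline.

```python
from dataclasses import dataclass


@dataclass
class GraspCandidate:
    """A candidate grasp pose with scores from different sources (illustrative)."""
    pose: tuple          # (x, y, z, roll, pitch, yaw) in the robot frame
    visual_score: float  # e.g. from a learned, data-driven grasp-quality model
    model_score: float   # e.g. from model-based analysis of the object's geometry


def rank_candidates(candidates, visual_weight=0.6, model_weight=0.4):
    """Combine data-driven and model-based scores into a single ranking."""
    return sorted(
        candidates,
        key=lambda c: visual_weight * c.visual_score + model_weight * c.model_score,
        reverse=True,
    )


def haptic_grasp_confirmed(finger_forces, min_force=0.5):
    """After closing the under-actuated hand, check per-finger contact forces (N)."""
    return all(f >= min_force for f in finger_forces)


# Usage sketch: pick the best-ranked grasp, then confirm it haptically.
candidates = [
    GraspCandidate(pose=(0.4, 0.1, 0.9, 0, 0, 1.57), visual_score=0.8, model_score=0.6),
    GraspCandidate(pose=(0.4, 0.1, 0.9, 0, 0, 0.00), visual_score=0.5, model_score=0.9),
]
best = rank_candidates(candidates)[0]
print("executing grasp at", best.pose)
print("grasp confirmed:", haptic_grasp_confirmed([0.7, 0.6, 0.9]))
```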

ARMAR-6’s versatile artificial intelligence capabilities allow it to act in situations that were not foreseen at programming time, which means it is capable of autonomously performing maintenance tasks in industrial facilities. It can recognise its collaboration partners’ need for help and offer assistance. The ability to teach machines to see in 3D without requiring fully supervised 3D training data has the potential to transform the field of machine learning.
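
To make the idea of learning 3D structure from 2D supervision alone concrete, here is a minimal sketch of a reprojection loss: predicted 3D keypoints are projected through an assumed pinhole camera model and compared against 2D labels, so no 3D ground truth is needed. The intrinsics and keypoint values below are illustrative assumptions, not values from the project.

```python
import numpy as np


def project(points_3d, fx, fy, cx, cy):
    """Pinhole projection of Nx3 camera-frame points to Nx2 pixel coordinates."""
    x, y, z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    u = fx * x / z + cx
    v = fy * y / z + cy
    return np.stack([u, v], axis=1)


def reprojection_loss(pred_points_3d, labels_2d, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Mean squared pixel error between projected 3D predictions and 2D labels.

    Minimising this loss supervises a 3D predictor using only 2D annotations.
    """
    reprojected = project(pred_points_3d, fx, fy, cx, cy)
    return float(np.mean((reprojected - labels_2d) ** 2))


# Illustrative example: three predicted 3D joints (metres) and their 2D labels (pixels).
pred = np.array([[0.1, 0.2, 2.0], [0.0, -0.1, 2.1], [-0.2, 0.3, 1.9]])
labels = np.array([[345.0, 290.0], [320.0, 216.0], [267.0, 319.0]])
print("reprojection loss:", reprojection_loss(pred, labels))
```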

Key breakthroughs in the area of Task Understanding for Proactive Collaboration include:

Learning:
- Algorithms for gathering knowledge, ranging from activity and action-sequence recognition to tool segmentation and context classification.
Reacting:
- Novel architectures for help recognition that enable fast interaction and assistance via anticipation and forecasting;
- Advances in guaranteeing safety, including a state-dependent dynamical system for compliant interaction with humans;
- Reactive motion planning, which enables the robot to react to human and environment interaction forces (a sketch of this kind of compliant, force-reactive behaviour follows this list).
Scene Recognition:
- New algorithms for dynamic 3D scene understanding that can build geometric and semantic 3D maps of the environment even when objects move independently from the camera.
- Pioneering supervised and unsupervised approaches for 3D reconstruction of human poses and semantic scene understanding that can learn from just 2D labels or even no labels at all.
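
The compliant, force-reactive behaviour described above can be illustrated with a toy attractor dynamical system plus an admittance term: the robot's nominal motion toward a target is deflected by measured interaction forces so it yields to the human. All gains, time steps, and forces here are illustrative assumptions, not the project's actual controller.

```python
import numpy as np


def compliant_step(x, x_target, f_ext, dt=0.01, k_attract=2.0, admittance=0.05):
    """One integration step of a toy compliant motion controller.

    A linear attractor dynamical system pulls the end effector toward `x_target`,
    while the admittance term lets external forces `f_ext` (from the human or the
    environment) deflect the commanded motion.
    """
    v_nominal = -k_attract * (x - x_target)      # attractor dynamics toward the goal
    v_compliant = v_nominal + admittance * f_ext  # yield in the direction of the push
    return x + dt * v_compliant


# Illustrative run: the human pushes sideways for a short while mid-motion.
x = np.array([0.0, 0.0, 1.0])
target = np.array([0.5, 0.0, 1.2])
for step in range(300):
    push = np.array([0.0, 8.0, 0.0]) if 100 <= step < 150 else np.zeros(3)
    x = compliant_step(x, target, push)
print("final end-effector position:", x)
```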

With collaborative robots, trust and adoption are key: what you build needs to fit into the natural ways people work, which includes being able to respond on timescales suited to humans and in ways that are meaningful to them. This means that developments in natural language processing are crucial to usability. Key breakthroughs made by the consortium members in this area include:

- Creation of a speech interface between humans and robots that is solely based on all-neural models: all-neural automatic speech recognition, all-neural dialog modelling and all-neural speech synthesis.
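
As a rough illustration of how such an all-neural pipeline is composed, the sketch below chains placeholder recognition, dialog, and synthesis components for one interaction turn; the class names and canned outputs are assumptions standing in for the project's neural models.

```python
class NeuralASR:
    """Placeholder for an all-neural speech recogniser (audio -> text)."""
    def transcribe(self, audio: bytes) -> str:
        return "please hand me the torque wrench"  # stand-in transcription


class NeuralDialog:
    """Placeholder for an all-neural dialog model (text -> text)."""
    def respond(self, utterance: str) -> str:
        if "hand me" in utterance:
            return "Sure, passing you the tool now."
        return "Could you repeat that?"


class NeuralTTS:
    """Placeholder for an all-neural speech synthesiser (text -> audio)."""
    def synthesise(self, text: str) -> bytes:
        return text.encode("utf-8")  # stand-in for a generated waveform


def speech_turn(audio_in: bytes, asr: NeuralASR, dialog: NeuralDialog, tts: NeuralTTS) -> bytes:
    """One human-robot interaction turn: recognise, decide, and speak back."""
    text_in = asr.transcribe(audio_in)
    text_out = dialog.respond(text_in)
    return tts.synthesise(text_out)


print(speech_turn(b"raw-audio-placeholder", NeuralASR(), NeuralDialog(), NeuralTTS()))
```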

As part of its dissemination activities, the project has produced a final project video: https://youtu.be/-KF5XSSTn_o.
This project has brought together industry and academia, engaging world-leading experts from a range of robotics and scientific disciplines. The consortium partners have developed best-of-breed technologies across robotic perception, robotic communication, and human-robot interaction. After 5 years of collaboration, the innovations that have come out of this project far exceed what each partner could have accomplished alone.

The project outcomes have the potential to address societal challenges beyond the warehouse. Eventually, collaborative robots will be key to solving many such challenges, both as assistive robots in industry and out in the wider world, bringing major benefits by enabling new levels of collaboration.

SecondHands developed a complete robot system that is able to proactively assist human workers in an industrial environment. Providing the right type of help at the right time is crucial, especially when working in hazardous environments. However, providing useful help requires training and prior knowledge of the tasks. Helping people in highly complex and dynamic tasks such as the Ocado maintenance tasks requires a variety of control, sensing and reasoning skills: manual dexterity, force balancing, handling heavy components at height, perceptual coordination to interact with the human worker, speech dialogue, understanding, planning and execution. Although it is always possible to constrain the problem in an industrial environment, the robot requirements defined for SecondHands are still very challenging.

SecondHands has completed the final year of research, producing significant algorithmic developments in robotics and computer vision for smooth and compliant object manipulation, fast and robust object tracking, and complex reasoning in the face of uncertainty. This work has led to more than 80 publications, and many of the components have been released as open source. Publications, press releases and engagement with academic and commercial communities have generated significant interest.
[Images: Trials at Ocado lab 1 and 2; SecondHands system providing help during maintenance; ARMAR-6 robot]