SecondHands: A Robot Assistant For Industrial Maintenance Tasks

Periodic Reporting for period 3 - SecondHands (SecondHands: A Robot Assistant For Industrial Maintenance Tasks)

Reporting period: 2017-10-01 to 2019-03-31

SecondHands is an EU-funded Horizon 2020 project that aims to design a collaborative robot (cobot) able to proactively support maintenance technicians carrying out routine and preventative maintenance in warehouses and factories. The robot will learn through observation and, by providing a second pair of hands, will augment the technicians' capabilities by completing tasks with a level of precision and physical strength not available to human workers.

The SecondHands project combines the skills of world class researchers focusing on a real-world industrial use case to deliver:
- the design of a new robotic assistant
- a knowledge base to facilitate proactive help
- a high degree of human-robot interaction
- advanced perception skills to function in a highly dynamic industrial environment

As robots evolve from industrial machines performing repetitive tasks in isolated areas of large-scale factories to highly complex systems, SecondHands has the ambitious goal to solve one of the greatest challenges facing the robotics field: developing collaborative robots that can safely and intelligently interact with their human counterparts in a real-world factory environment.
"The project has successfully achieved Milestone 3: Second proof of concept: ""Robot can anticipate that help is needed throughout the entire maintenance task and is able to help accordingly"":
Robot design, mechatronics, control and software interfaces:
KIT-H2T have developed ARMAR-6, a novel humanoid robot for advanced mobile bimanual manipulation, designed for the SecondHands domain. The robot has a high level of hardware-software integration: the real-time control framework ArmarX-RT enables high-performance robot control and integrates seamlessly with the robot operating system ArmarX. At this stage of the project, KIT has completed the hardware and software control architecture and a version update of the operating system.
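A real-time control framework such as ArmarX-RT runs the robot's control laws at a fixed rate with predictable timing. As a minimal, illustrative sketch (not the ArmarX-RT API), a fixed-rate loop with drift compensation looks like this; `controller` is a hypothetical callback standing in for the control law:

```python
import time

def run_control_loop(controller, period_s=0.001, steps=1000):
    """Call `controller(dt)` at a fixed rate, compensating for drift.

    Illustrative only: a real-time framework runs such loops in a
    real-time thread with far stronger timing guarantees.
    """
    next_deadline = time.monotonic()
    for _ in range(steps):
        controller(period_s)
        next_deadline += period_s
        # Sleep until the next absolute deadline rather than for a fixed
        # amount, so timing errors do not accumulate across cycles.
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)
```

Scheduling against absolute deadlines keeps the average rate stable even when individual cycles run long.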

Knowledge and planning:
The components from Sapienza Università di Roma interpret the scene and the human worker in order to plan and monitor the robot's actions. Action sequences are inferred with the Fast Downward planner. A visual execution monitoring framework verifies action pre- and post-conditions using a Mask R-CNN. A visual search policy trained with deep reinforcement learning controls the robot's head. Human action recognition is based on visually identifying relations between the human and the objects present in the scene. A database of human motion primitives is discovered automatically using a compositional approach, supporting both human-robot interaction and action recognition. The dynamics of the workspace are estimated, and the system performs consistency checks when updating its knowledge. The planner is integrated in the control architecture and continuously generates helping activities from recognized actions and learned knowledge.

Grasping and mobile manipulation:
At KIT-H2T, research produced the Skeleton Grasp Planner for precision and power grasps. Unknown objects are grasped using a deep convolutional neural network operating on depth images. The grasping pipeline for part-based grasping of familiar objects has been extended by transferring grasping knowledge to similar shapes. Distance-aware motion planning uses point clouds alone to control mobile manipulation in unknown environments. Manipulation actions of the upper body are coordinated with the mobility of the mobile base for cooperative execution of manipulation tasks, and the interface with skill learning and planning has been completed.
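One simple way to make motion distance-aware with point clouds alone is to scale the commanded speed by the clearance to the nearest obstacle point. The thresholds and linear scaling below are invented for illustration, not the project's actual planner:

```python
import math

def min_clearance(tcp, cloud):
    """Distance from the robot point `tcp` to the nearest obstacle point."""
    return min(math.dist(tcp, p) for p in cloud)

def scaled_speed(tcp, cloud, v_max=0.5, d_stop=0.05, d_full=0.30):
    """Scale Cartesian speed linearly with clearance: stop below
    `d_stop` metres, full speed above `d_full`. All values are
    hypothetical; a real planner would use the full clearance field."""
    d = min_clearance(tcp, cloud)
    if d <= d_stop:
        return 0.0
    if d >= d_full:
        return v_max
    return v_max * (d - d_stop) / (d_full - d_stop)
```

The appeal of this kind of rule is that it needs no object models, only raw depth data, which matches operation in unknown environments.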

Human-robot action and skill learning:
EPFL defined an algorithm for compliant adaptation based on the dynamics of the cooperation, using dynamical-system-based impedance control. For the use case, an implementation of a guard-support controller using a compliant-control approach has been tested on a real robotic system at EPFL and integrated onto the ARMAR-6 robot, where it is being tested in the demonstration environment. EPFL has also defined a method to infer human intention, so the robot knows when to provide assistance, and has worked on learning the dynamics of cooperation and the spatio-temporal coordination of the motion of two humans in compound bimanual manipulation actions.
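The core idea of dynamical-system-based impedance control can be sketched in a few lines: a dynamical system defines a desired velocity field toward a target, and the commanded force damps the measured velocity toward that field, so a human can deflect the arm and the motion stays compliant. The linear attractor and the gains below are illustrative placeholders, not EPFL's actual controller:

```python
def ds_velocity(x, target, gain=1.0):
    """Linear dynamical system: desired velocity points at the target."""
    return [gain * (t - xi) for xi, t in zip(x, target)]

def impedance_force(x, x_dot, target, damping=20.0, gain=1.0):
    """Damping-based tracking of the DS velocity: the commanded force
    pushes the measured velocity toward the desired velocity field.
    Gains are invented for illustration."""
    v_des = ds_velocity(x, target, gain)
    return [damping * (vd - v) for vd, v in zip(v_des, x_dot)]
```

Because the reference is a velocity field rather than a timed trajectory, perturbing the arm simply re-evaluates the field at the new position instead of fighting to return to a time-indexed path.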

Dynamic scene perception, recognition and attention:
UCL have implemented Co-Fusion, a dynamic real-time SLAM system that can recover the 3D geometry of complex scenes in which objects and the camera move independently. Co-Fusion segments, tracks and fuses each moving object independently, resulting in a temporally consistent semantic 3D map. A novel model-based 6-DoF 3D pose estimation method offers state-of-the-art performance, and a dense non-rigid shape-and-shading modelling method captures the deformations and details of generic shapes from video. Human 3D pose is estimated from a single 2D image by a method that outperforms previous solutions while using only 2D pose supervision for training. Further work covers online human pose tracking, the extension of recognition to further object categories, and integration with activity recognition.
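The per-object bookkeeping behind "segment, track and fuse each moving object independently" can be caricatured as follows. This toy sketch associates incoming point segments to existing object models by centroid distance and fuses them; a real system such as Co-Fusion associates by mask overlap and motion and fuses dense surfel models, and the 0.2 m gate is an invented threshold:

```python
import math

class ObjectMap:
    """Toy per-object map: each moving object keeps its own point model;
    new segments are fused into the nearest model or start a new one."""

    def __init__(self, gate=0.2):
        self.models = []   # one point list per tracked object
        self.gate = gate   # hypothetical association threshold (metres)

    @staticmethod
    def _centroid(points):
        n = len(points)
        return tuple(sum(c) / n for c in zip(*points))

    def fuse(self, segment):
        c = self._centroid(segment)
        best, best_d = None, self.gate
        for m in self.models:
            d = math.dist(c, self._centroid(m))
            if d < best_d:
                best, best_d = m, d
        if best is None:
            self.models.append(list(segment))   # new object track
        else:
            best.extend(segment)                # fuse into existing model
```

Keeping one model per object is what allows the map to stay consistent even while objects move relative to the background.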

Natural language and multimodal interfaces:
From KIT-ISL, a domain-adapted language model has been built for the automatic speech recognition component that extends
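A standard recipe for this kind of domain adaptation is to interpolate an in-domain language model with a general one. The sketch below assumes hypothetical probability functions `p_domain` and `p_general` and an illustrative interpolation weight; it is not KIT-ISL's actual implementation:

```python
def interpolated_prob(word, context, p_domain, p_general, lam=0.7):
    """Linear interpolation of a domain LM with a general LM.
    `p_domain` and `p_general` are hypothetical callables returning
    n-gram probabilities; `lam` weights the in-domain model."""
    return lam * p_domain(word, context) + (1 - lam) * p_general(word, context)
```

Raising `lam` favours domain vocabulary (part names, maintenance commands) while the general model keeps coverage of everyday speech.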
SecondHands intends to provide a complete robot system able to proactively assist human workers in an industrial environment. Providing the right type of help at the right time is crucial, especially when working in hazardous environments; however, providing good help requires training and prior knowledge of the tasks. Helping people in a highly complex and dynamic task such as the Ocado maintenance tasks requires a variety of control, sensing and reasoning skills: manual dexterity, force balancing, handling heavy components at height, perceptual coordination to interact with the human worker, speech dialogue, understanding, planning and execution. Although it is always possible to constrain the problem in an industrial environment, the robot requirements for SecondHands remain very challenging.

SecondHands is entering the fifth and final year of the research programme and has already produced significant algorithmic developments in robotics and computer vision for smooth and compliant object manipulation, fast and robust object tracking, and complex reasoning in the face of uncertainty. This work has led to 66 publications. Many of the components have been released as open source and have received excellent feedback from the community of users. Publications, press releases and engagement with academic and commercial communities have generated significant interest, in large part because of the ambitious nature of the project.
SecondHands vision