CONT-ACT Report Summary

Project ID: 637935
Funded under: H2020-EU.1.1.

Periodic Reporting for period 1 - CONT-ACT (Control of contact interactions for robots acting in the world)

Reporting period: 2015-06-01 to 2016-11-30

Summary of the context and overall objectives of the project

What are the algorithmic principles that would allow a robot to run across rocky terrain, lift a couch while reaching for an object that rolled under it, or manipulate a screwdriver while balancing on top of a ladder? By answering these questions in CONT-ACT, we aim to understand the fundamental principles of robot locomotion and manipulation and to endow robots with the robustness and adaptability necessary to act efficiently and autonomously in unknown and changing environments. This is a necessary step towards a new technological age: ubiquitous robots capable of helping humans in countless tasks.

Dynamic interaction of the robot with its environment through the creation of intermittent physical contacts is central to any locomotion or manipulation task. Indeed, in order to walk or to manipulate an object, a robot needs to constantly interact physically with the environment and surrounding objects. Our approach to motion generation and control in CONT-ACT therefore gives a central place to contact interactions. Our main hypothesis is that this will allow us to develop more adaptive and robust planning and control algorithms for locomotion and manipulation. The project is divided into three main objectives: 1) the development of a hierarchical receding horizon control architecture for multi-contact behaviors, 2) the development of algorithms that learn representations for motion generation from multi-modal sensing (e.g. force and touch sensing), and 3) the development of controllers based on multi-modal sensory information through optimal control and reinforcement learning.
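To make the receding horizon idea behind the first objective concrete, the following toy sketch (in Python) illustrates the principle on a one-dimensional point mass rather than the project's whole-body architecture: at every control cycle a short-horizon trajectory is optimized, only the first control is applied, and the problem is re-solved from the newly measured state. The dynamics, horizon length and cost weights are illustrative assumptions, not the project's formulation.

import numpy as np

# Receding horizon loop for a 1D double integrator (point mass).
dt, N = 0.05, 20                        # control period [s] and planning horizon length
A = np.array([[1.0, dt], [0.0, 1.0]])   # state x = (position, velocity), x+ = A x + B u
B = np.array([0.5 * dt**2, dt])

def plan(x0):
    """Solve a small unconstrained finite-horizon tracking problem by regularized least squares."""
    # Predicted positions over the horizon are affine in the control sequence u:
    # p = free + Phi u, where `free` is the zero-input evolution of x0.
    free = np.zeros(N)
    xk = x0.copy()
    for k in range(N):
        xk = A @ xk
        free[k] = xk[0]
    Phi = np.zeros((N, N))
    for k in range(N):
        effect = B.copy()               # effect of u_k on the states that follow it
        for j in range(k, N):
            Phi[j, k] = effect[0]
            effect = A @ effect
    # Minimize the squared position error (target at the origin) plus a small control penalty.
    H = Phi.T @ Phi + 1e-2 * np.eye(N)
    return np.linalg.solve(H, -Phi.T @ free)

x = np.array([1.0, 0.0])                # start 1 m away from the target, at rest
for step in range(100):
    u = plan(x)                         # plan over the whole horizon...
    x = A @ x + B * u[0]                # ...but execute only the first control, then re-plan
print("final position [m]:", round(float(x[0]), 3))

The same re-planning structure is what allows a controller to react to disturbances: any push or slip simply changes the measured state from which the next short-horizon problem is solved.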

Work performed from the beginning of the project to the end of the period covered by the report and main results achieved so far

In the first 18 months of the project, we have developed the main parts of the receding horizon control architecture (1st objective of the project). We have proposed new methods to plan whole-body multi-contact behaviors for legged robots in near real time. We are now able to plan complicated motions, for example a humanoid climbing stairs, walking over stepping stones or using its hands and legs to climb onto an obstacle. An important part of our work was to study the mathematical structure of the optimization problems related to multi-contact behaviors. This allowed us to simplify the problem and to propose algorithms that are significantly faster than the state of the art. Complementary to this work, we have also studied how the timing at which a robot takes a step changes the stability of walking. We have proposed a new algorithm that quickly adapts step location and timing, and showed that it significantly improves the stability of the robot when it is pushed or when it slips on the ground.

In parallel to these optimal control problems, we have also made progress on fusing multiple sensor modalities (force, inertial and position sensors) to obtain a good estimate of the state of the robot during contact tasks (2nd objective of the project).

Finally, we have studied how uncertainty in the knowledge of contact locations changes the optimal way of creating a contact on an object. We have used risk-sensitive optimal control techniques to propose a new algorithm able to handle contact uncertainty during contact interactions. As a result, the controller makes very gentle contact with an object when its position is uncertain, which increases the safety and robustness of the interaction (3rd objective of the project).
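The intuition behind the risk-sensitive formulation can be shown with a small numerical sketch. This is not the project's algorithm: the braking model, cost weights and risk parameter theta below are illustrative assumptions. The point is that the exponential-utility objective (1/theta) log E[exp(theta * cost)] penalizes the possibility of an early, hard impact when the surface position is uncertain, and therefore selects a gentler approach speed than the risk-neutral expected cost.

import numpy as np

rng = np.random.default_rng(0)
x_nominal = 0.10                                        # believed surface position [m]
contact_samples = rng.normal(x_nominal, 0.02, 10_000)   # sampled true surface positions [m]

def impact_speed(v, contact_pos, decel=1.0):
    """Touchdown speed if the end-effector approaches at v and brakes to rest exactly at x_nominal."""
    early = np.maximum(x_nominal - contact_pos, 0.0)    # surface hit this far before the planned stop
    return np.minimum(v, np.sqrt(2.0 * decel * early))

def cost(v):
    """Per-sample cost: time to reach the surface plus a quadratic impact penalty (toy weights)."""
    return x_nominal / v + 100.0 * impact_speed(v, contact_samples) ** 2

def risk_neutral(v):
    return np.mean(cost(v))

def risk_sensitive(v, theta=10.0):
    # Exponential utility: (1/theta) * log E[exp(theta * cost)].
    # For theta > 0 the rare "surface is closer than expected" outcomes dominate the objective.
    return np.log(np.mean(np.exp(theta * cost(v)))) / theta

speeds = np.linspace(0.02, 0.30, 141)
v_rn = speeds[np.argmin([risk_neutral(v) for v in speeds])]
v_rs = speeds[np.argmin([risk_sensitive(v) for v in speeds])]
print(f"risk-neutral approach speed:   {v_rn:.3f} m/s")
print(f"risk-sensitive approach speed: {v_rs:.3f} m/s")

In this toy setting the risk-sensitive criterion chooses a noticeably slower approach than the risk-neutral one, which mirrors the gentle-touch behavior described above.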

Progress beyond the state of the art and expected potential impact (including the socio-economic impact and the wider societal implications of the project so far)

Thus far, we have gained a better understanding of problems related to the motion of robots in contact with their environment, and we have proposed algorithms that compute complicated multi-contact locomotion patterns significantly faster than the state of the art. We are now able to plan complicated motions in near real time, for example a humanoid climbing stairs, walking over stepping stones or using its hands and legs to climb onto an obstacle. This opens many possibilities to create more reactive and robust behaviors. Moreover, these results significantly improve our understanding of the fundamental algorithmic principles of locomotion and manipulation. We hope that this will be useful for the development of autonomous legged robots that are able to move in unknown and challenging environments. Potential applications of such robots include disaster relief, construction and service robotics.