Periodic Reporting for period 4 - CogIMon (Cognitive Interaction in Motion)
Reporting period: 2018-02-01 to 2019-05-31
Compared to the richness and complexity of cognitive compliant interaction in humans, the knowledge of how to endow robots with such capabilities remains shallow. To enhance it, we need to understand which information is conveyed through physical interaction by humans and how it is used in mutual adaptation. This in turn requires understanding how models of impedance in interacting humans or robot partners can be formed, how to predict the partners' motion behaviours from their kinematics, and how to observe and predict interaction forces.
Interaction in human-robot teams has typically been limited to a narrow set of carefully engineered scenarios. Thus, the overarching objective of the CogIMon project is to advance key technologies that lead to a step-change in such cognitive compliant interaction. CogIMon aims to integrate physical human-robot interaction, visually guided manipulation and model-based control software design in a systematic way. To achieve this ambitious objective, cognitive modeling, compliance and impedance control, and learning need to be integrated into a coherent approach. To this aim, the project is centered around three main experimentation scenarios dedicated to the key abilities developed in the project: compliant catching and throwing with application in physiotherapy, compliant human-humanoid interaction for joint manipulation of larger objects, e.g. carrying a table, and multi human-robot interaction for joint manipulation in a shop-floor scenario.
A new scaled-up COMAN+ robot with advanced actuation and improved design has been constructed and has been presented to the scientific community at the ICRA 2019 exhibition. It has already been used for interactively guided walking and compliant handling in human-robot cooperation. Through model-driven systems engineering, open-source dynamics simulations for the target platforms COMAN, COMAN+, KUKA LWR-IV+, Stäubli TX60 and Panda have been established and are widely used in the consortium. These mirror the control interfaces of the real robots and allow for transparent switching from simulation to the real world. To this aim, the component-based modeling and simulation framework CoSiMA, built on OROCOS, was developed together with a novel scheme for combining and integrating the respective domain-specific software languages, with particular emphasis on real-time capability. A dedicated toolchain was devised and validated by modeling a tetra-robot setup handling large and rather heavy workpieces.
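The idea of simulations that mirror the real robots' control interfaces can be sketched as a shared abstraction against which controllers are written once. The interface and class names below are illustrative assumptions, not the actual CoSiMA/OROCOS component API:

```python
from abc import ABC, abstractmethod

class RobotInterface(ABC):
    """Common control interface, so the same controller runs unchanged
    against the simulation or the hardware (hypothetical sketch)."""

    @abstractmethod
    def read_joint_positions(self) -> list:
        ...

    @abstractmethod
    def send_joint_torques(self, torques: list) -> None:
        ...

class SimulatedRobot(RobotInterface):
    """Stands in for the dynamics simulation backend."""

    def __init__(self, n_joints: int = 7):
        self.q = [0.0] * n_joints

    def read_joint_positions(self) -> list:
        return list(self.q)

    def send_joint_torques(self, torques: list) -> None:
        # Toy forward step; a real backend would integrate full dynamics.
        self.q = [qi + 0.001 * t for qi, t in zip(self.q, torques)]

def control_step(robot: RobotInterface, q_des: list, kp: float = 50.0) -> None:
    """A proportional controller written purely against the interface;
    swapping SimulatedRobot for a hardware driver needs no changes here."""
    q = robot.read_joint_positions()
    robot.send_joint_torques([kp * (qd - qi) for qd, qi in zip(q_des, q)])
```

Transparent simulation-to-real switching then amounts to handing `control_step` a different `RobotInterface` implementation.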
A second focus was the investigation of human-human interaction in motion. Sensorimotor control strategies for catching have been investigated in experiments with humans exploiting motion capture. Experiments shed light on interpersonal coupling, perception of deceptive movements, and the influence of dynamic variables on the kinematics of juggling. Coupled human walking has been investigated, and models have been developed to describe compliant walking in contact. Respective evaluation measures that specifically target the physical, force-based interaction were derived from the experimental data.
New skills were developed. Dynamical-systems-based methods for bi-manual coordination, soft catching, and smooth contact establishment were developed and implemented on a pair of KUKA LWR-IV+ arms and the COMAN+. Human throwing data was modeled and a respective real-time, full-body synthesis of robotic behavior was realized. Walking under constraints, step-planning to maintain and optimize contact, and coupled walking algorithms are being developed. An approach derived from human data models impedance adaptation based on perceived interaction forces in cyclic tasks; it has been integrated into the CoSiMA simulation framework.
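The impedance-adaptation idea can be illustrated with a standard Cartesian impedance law plus a simple force-driven stiffness update. This is a generic one-dimensional sketch, not the project's actual controller; the gains, bounds, and adaptation rule are illustrative assumptions:

```python
def impedance_force(k: float, d: float,
                    x_des: float, x: float,
                    v_des: float, v: float) -> float:
    """Classic 1-D impedance law: F = K * (x_d - x) + D * (v_d - v)."""
    return k * (x_des - x) + d * (v_des - v)

def adapt_stiffness(k: float, f_measured: float, f_expected: float,
                    rate: float = 0.1,
                    k_min: float = 10.0, k_max: float = 1000.0) -> float:
    """Illustrative adaptation rule: stiffen when the measured interaction
    force exceeds the expected force, soften otherwise, within bounds."""
    k_new = k + rate * (abs(f_measured) - abs(f_expected))
    return max(k_min, min(k_max, k_new))
```

In a cyclic task, `adapt_stiffness` would be applied once per cycle using the forces perceived during the previous cycle, so the robot gradually becomes compliant where the partner pushes back and stiff where tracking matters.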
The project targets robotic experimentation to demonstrate the step-changes towards compliant interaction. Requirements have been assessed to meet the overall targeted technology readiness levels of the final experimentation scenarios. By means of vertical integration, the developed methods are embedded in the model-driven systems engineering approach. A fully modeled tetra-arm manipulation setup that allows for human interaction was realized. Furthermore, VR, real-time animation, and control through CoSiMA have been integrated to enable safe throwing and catching in VR while running the real-time robot control system of the COMAN. The latter has already entered first successful testing with physiotherapy patients for increasing training intensity. Work has further focused on learning of manipulator dynamics, hybrid models, and friction identification to enhance the underlying model-based control.
Partners have established innovation strategies to deploy novel methods in robotics, manufacturing, healthcare and general industry, e.g. through enterprise offices and dedicated transfer labs, which frequently run smaller projects with medium-sized companies. Work package three aims to derive measures for the quantitative evaluation of physical interaction, which will be an important factor for the acceptance of novel assistive and rehabilitation robotics applications. Follow-up work on this has been started. In this context, a patent for a new measurement device was granted. Protocols for the prospective application of throwing and catching in physiotherapy have been defined. The impact on the scientific community unfolds through more than 150 publications in high-ranked journals and scientific conferences. Also, the partners’ principal researchers are all heavily involved in academic teaching and give talks, courses and summer schools to educate young researchers. In summary, we expect that the CogIMon advancements in safe and efficient interaction with compliant and humanoid robots will raise the general level of acceptance for robotic technology, which is indispensable for applications in the private realm in the civil or health domain.