Integrating robotic control and planning with human activity prediction for efficient human robot collaboration

Periodic Reporting for period 1 - Predict-Plan-Control (Integrating robotic control and planning with human activity prediction for efficient human robot collaboration)

Reporting period: 2018-05-01 to 2020-04-30

Despite recent advances in robotics technologies, robots are still rarely thought of as physically engaging with humans, and efficient physical human-robot collaboration (pHRC) remains one of the key open challenges in robotics research. To interact autonomously with humans, robots need better decision-making capabilities, which are hindered by the lack of a reliable assessment and understanding of human physical capabilities, ergonomics, and what is considered comfortable and safe. This understanding is crucial to any intelligent system that aims either to predict/analyze human movement or to leverage the human's response to physical collaborative actions.

This project addressed exactly this problem. More specifically, it focused on building intelligent robots capable of helping humans perform complex, risky, or tedious tasks in a safe and comfortable manner. One can imagine a friendly robot that smoothly moves and regrasps a shared board during a DIY task to reduce your muscular load (and the chance of muscular strain) while improving postural ergonomics, safety perception, and the overall human-robot collaborative comfort experience.

The success of this project can lead to reduced muscular load and, consequently, a lower incidence of musculoskeletal strains (the largest cause of work-related injuries in many industrial countries), and can boost intelligent manufacturing in the UK and Europe. Finally, the interdisciplinary research undertaken throughout this project has enormous potential for producing biologically inspired robot applications (e.g. improving service robots, robot-based and teleoperation-based rehabilitation) and for improving the understanding of human musculoskeletal response in daily forceful activities.
To build this envisioned integrated robotic system, this project first designed a collaborative robot motion generation strategy that exploits grasp freedoms. We designed a technique that allows multi-arm robots to change their grasps "in-the-air", much as a human would, and to adjust the gripper position so as to minimize the number of regrasps (making the interaction more fluid). This was then extended to also account for possible environment contacts (including contact points on the gripper and the robot arm). In this way, surfaces available in the environment could be exploited to improve the stabilization of the shared object.

Secondly, we introduced freedoms in the workspace, allowing the robot to deviate from the desired pose/trajectory within the region where the human remains comfortable. To this end, we designed and evaluated different assessments of human comfort: peripersonal comfort, muscular comfort, and postural ergonomics.

Peripersonal comfort relates to the psychological concept of peripersonal space, which accounts for the workspace where the human feels comfortable performing a task. It is useful for improving human awareness and safety perception, and for minimizing the perceived risk of robot intervention.

Muscular comfort accounts for the biomechanical response during the exchange of forces. To reason from a human biomechanics perspective, we modelled the influence of external forces on different musculoskeletal models of the upper limb. With this information, we built a predictive model capable of analyzing the human muscular response and predicting the kinematic configuration (position of the joints) during physical interaction with the robot.
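A standard first step in this kind of biomechanics modelling is mapping an external force applied at the hand to the joint torques the arm must balance, via the Jacobian transpose (tau = J^T f). The sketch below uses a hypothetical planar two-link arm; the function names and link lengths are illustrative assumptions, not the project's musculoskeletal models:

```python
import numpy as np

def planar_2link_jacobian(q, l1=0.3, l2=0.25):
    """Position Jacobian of a planar two-link arm (shoulder, elbow).

    q: joint angles [shoulder, elbow] in radians.
    l1, l2: link lengths in metres (illustrative values).
    """
    s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
    c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def joint_torques_from_external_force(jacobian, force):
    """Joint torques needed to balance an external force at the hand:
    tau = J^T f. These torques are what the muscles must ultimately
    generate, so they are a proxy for muscular load."""
    return jacobian.T @ force
```

For example, with the arm fully extended along the x-axis, an upward force of 1 N loads the shoulder more than the elbow, because the shoulder carries the longer moment arm.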

Finally, we integrated established concepts from industrial ergonomics with muscular-informed comfort to introduce the comfortability concept. Comfortability allows comfort to be assessed in much the same way that robot manipulability provides a quality index for manipulation tasks. In addition, we designed a method to quickly assess the comfortability distribution over the human workspace. Our methodology, based on precomputing relevant information about ergonomics and muscular capability, considerably simplifies computing the comfortability distribution. This distribution allows a designer to identify regions in the human workspace (where the arm reaches) where the human is more likely to find a comfortable configuration for interaction.
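The precomputation idea can be sketched as follows. This is an illustrative toy, not the project's implementation: the inverse-kinematics solver, ergonomic score, and muscular-capacity functions are placeholders supplied by the caller, and the combination rule (a simple product) is an assumption.

```python
import numpy as np

def comfortability(q, ergonomic_score, muscular_capacity):
    """Hypothetical comfort quality index for an arm configuration q,
    combining a normalized ergonomic score and a muscular-capacity
    score (analogous to how manipulability scores a robot pose)."""
    return ergonomic_score(q) * muscular_capacity(q)

def precompute_comfort_map(workspace_points, ik_solver,
                           ergonomic_score, muscular_capacity):
    """Precompute the comfortability distribution over a discretized
    human workspace: for each reachable point, keep the best comfort
    value over the arm configurations returned by the IK solver."""
    comfort_map = {}
    for p in workspace_points:
        solutions = ik_solver(p)  # arm configurations reaching p
        if solutions:
            comfort_map[tuple(p)] = max(
                comfortability(q, ergonomic_score, muscular_capacity)
                for q in solutions
            )
    return comfort_map
```

Once built, the map can be queried at planning time in constant time per candidate pose, which is what makes the assessment "rapid" compared with evaluating the full musculoskeletal model online.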

The results obtained from combining the predictive power of the muscular-based model with the human-centred planning and control strategies demonstrate the success of this project and reveal its potential to improve human-robot interaction performance, reduce stress, and lower the incidence of musculoskeletal disorders (the largest cause of work-related injuries in many industrial countries). From human-robot experiments, we found that the comfort-based planner is able to reduce the muscular load by 69.5% on average compared to a user-based selection of poses. This strategy is key to achieving intuitive and fluid human-robot interaction. Experiments were performed in collaboration with the research group of Dr Chakrabarty, a sensory-motor neurophysiology specialist at the University of Leeds.

Overview of results and dissemination

From a scientific perspective, the results obtained in this project have been published in three of the most important robotics conferences (IROS-18, Humanoids-18, Humanoids-19) and one major journal (AURO) in the robotics community. Additionally, two further journal publications focused on human-robot applications are under review (ACM-THRI and RAL) and another is in preparation.

The results were also shared with the UK robotics community at the 3rd UK Robot Manipulation Workshop, the main robotics event in the UK. We organized the 2019 workshop, which involved approximately 110 delegates from roughly 32 UK universities and industries nationwide. During this workshop, we also organized demos to showcase our methods and their potential advantages to fellow researchers and research teams from industry laboratories, e.g. Ocado, Amazon Robotics, Google DeepMind, and Dyson.

For the general community, this project also produced a tool for collaborative AI that allows the computation of a human comfort quality index distribution: the Rapid Human-Robot Manipulability Assessment (RHuMAn). RHuMAn draws on the predictive models obtained in this project to produce an efficient comfort-quality assessment that can be quickly tailored to specific tasks and purposes. The tool is available via the AI4EU European Project.
This project explored basic concepts underlying human manipulation and leveraged these concepts to produce predictive models of human motion, forces, ergonomics, comfort, and safety perception, which ultimately guided robot actions during collaboration.

The main objectives were divided into three thrusts: improving robot actions based on human response; building predictive models for human activity, kinematics, force, and general comfort; and planning strategies for sequences of collaborative tasks. Individually, these led to progress beyond the state of the art in each field; combined, they resulted in a novel robotic framework that closes the gap between physical and fluid human-robot collaboration. The solutions produced clearly demonstrated the potential of using robots to reduce muscular load, and they can lead to new findings both in robotics (e.g. service and elderly-care robotics, robot-based rehabilitation) and, more generally, in the understanding of human musculoskeletal response in daily forceful activities.
Figure: Forceful human-robot collaborative experimental scenarios