"CON-HUMO focuses on novel concepts for automatic control based on data-driven human models and machine learning. This enables innovative control applications that are difficult, if not impossible, to realize with traditional control and identification methods, particularly in the challenging area of smart human-machine interaction. Anticipation is key to intuitive, efficient, goal-oriented interaction: selecting control actions based on prediction requires a dynamic model of human interaction behavior, which is difficult to obtain from first principles. To cope with the high complexity of human behavior, unknown inputs, and only sparsely available training data, we propose machine-learning techniques for statistical modeling of the dynamics. In this new field of data-driven, machine-learned human interaction modeling, control methods with guaranteed properties do not exist. CON-HUMO addresses this niche.
The key methodological innovation and breakthrough is the merger of probabilistic learning with model-based control concepts through model confidence and prediction uncertainty. For concreteness and evaluation, the focus is on one of the most challenging problem classes, physical human-machine interaction: because of the physical contact between the human and the machine, not only information but also energy is exchanged, posing fundamental challenges for real-time, human-adaptive, and safe decision making and control, and requiring provable stability and performance guarantees. The developed methods directly enable societally important applications such as machine-based physical rehabilitation, mobility and manipulation aids for the elderly, and collaborative human-machine production systems. With its fundamental results, CON-HUMO lays the groundwork for systematic control design for smart human-machine/infrastructure interaction."
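The abstract does not specify which probabilistic learning technique is used, so the following is only an illustrative sketch of the core idea, the merger of probabilistic learning with model-based control through prediction uncertainty. It assumes a Gaussian-process model of observed human behavior (a common choice for sparse training data) and scales a feedback gain down where the model is uncertain. All names (`GPModel`, `uncertainty_aware_gain`) and the specific gain-scheduling rule are hypothetical, not taken from the project.

```python
import numpy as np

def rbf(A, B, ell=0.5, sf=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d = A[:, None, :] - B[None, :, :]
    return sf**2 * np.exp(-0.5 * np.sum(d**2, axis=2) / ell**2)

class GPModel:
    """Gaussian-process regression of observed human interaction behavior."""
    def __init__(self, X, y, noise=1e-2):
        self.X = X
        K = rbf(X, X) + noise**2 * np.eye(len(X))
        self.L = np.linalg.cholesky(K)
        # Solve K alpha = y via the Cholesky factor.
        self.alpha = np.linalg.solve(self.L.T, np.linalg.solve(self.L, y))

    def predict(self, Xs):
        """Posterior mean and variance at query points Xs."""
        Ks = rbf(Xs, self.X)
        mean = Ks @ self.alpha
        v = np.linalg.solve(self.L, Ks.T)
        var = rbf(Xs, Xs).diagonal() - np.sum(v**2, axis=0)
        return mean, np.maximum(var, 0.0)

def uncertainty_aware_gain(var, k_min=0.2, k_max=1.0, beta=5.0):
    """Hypothetical rule: reduce the feedback gain where the model
    is uncertain, i.e. act cautiously outside the training data."""
    return k_min + (k_max - k_min) * np.exp(-beta * var)

# Synthetic training data standing in for logged human responses.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(20)
gp = GPModel(X, y)

# Query one point inside the data region and one far outside it.
Xs = np.array([[0.0], [3.0]])
mean, var = gp.predict(Xs)
gains = uncertainty_aware_gain(var)
```

The point of the sketch is the coupling: the same model that predicts the human's behavior also reports how much that prediction can be trusted, and the controller consumes both, which is the prerequisite for the stability and performance guarantees the project targets.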
Field of science
- /natural sciences/computer and information sciences/artificial intelligence/machine learning