Periodic Reporting for period 1 - Real-Move (A Robust, Real-time, and 3D Human Motion Capture System through Multi-Cameras and AI)
Reporting period: 2024-01-01 to 2025-06-30
Our goal is to deliver an AI-based, markerless solution capable of tracking human motion in dynamic environments with high accuracy, robustness, and efficiency. Real-Move combines multi-camera inputs with learning-based pose estimation algorithms to reconstruct 3D human motion in real time. This markerless design avoids the wearability and setup burdens of sensor- and marker-based systems, while a weighted multi-score reconstruction methodology ensures resilience to occlusions and accurate multi-person tracking. The modular use of standard RGB cameras makes the system both scalable and cost-effective, overcoming the barriers of current motion capture technologies.
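As an illustration of how per-view confidence scores can feed a weighted 3D reconstruction, the following minimal Python sketch shows a confidence-weighted multi-view triangulation of a single keypoint. The function name, inputs, and weighting scheme are illustrative assumptions, not the project's actual implementation.

```python
# Minimal sketch of confidence-weighted multi-view triangulation (illustrative only;
# names and weighting scheme are assumptions, not Real-Move's actual method).
import numpy as np

def triangulate_weighted(projections, points_2d, confidences):
    """Triangulate one 3D keypoint from several camera views.

    projections : list of 3x4 camera projection matrices (K [R|t]).
    points_2d   : list of (u, v) pixel observations, one per view.
    confidences : list of per-view detection scores in [0, 1]; occluded or
                  low-quality views contribute less to the solution.
    """
    rows = []
    for P, (u, v), w in zip(projections, points_2d, confidences):
        # Each view adds two linear constraints on the homogeneous 3D point;
        # scaling the rows by the confidence down-weights unreliable views.
        rows.append(w * (u * P[2] - P[0]))
        rows.append(w * (v * P[2] - P[1]))
    A = np.stack(rows)
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize to (x, y, z)
```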
Beyond raw skeletal tracking, Real-Move provides application-ready insights such as ergonomic indicators, gait and posture analysis, joint movement tracking, and activity recognition (e.g. walking, falling, gesturing). These outputs are designed to be directly usable by practitioners and organizations without requiring motion capture expertise, opening the way to adoption in fields as diverse as rehabilitation, sports performance, workplace safety, and human–robot interaction.
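To make the notion of an application-ready indicator concrete, the short sketch below computes one such quantity, an elbow flexion angle, from three 3D keypoints. The function name, keypoint layout, and example coordinates are hypothetical, chosen only to illustrate the idea.

```python
# Minimal sketch of one application-ready indicator: an elbow flexion angle computed
# from 3D joint positions. Names and coordinates are illustrative assumptions.
import numpy as np

def joint_angle(shoulder, elbow, wrist):
    """Return the elbow flexion angle in degrees from 3D joint positions."""
    u = np.asarray(shoulder) - np.asarray(elbow)
    v = np.asarray(wrist) - np.asarray(elbow)
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example: a nearly right-angled elbow (~86 degrees)
print(joint_angle([0.0, 0.3, 0.0], [0.0, 0.0, 0.0], [0.28, 0.02, 0.0]))
```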
Real-Move builds on the experience of the ERC StG Ergo-Lean project (GA. 850932), which highlighted the shortcomings of existing systems when applied to ergonomics in human–robot–environment interaction. While tolerable in research labs, the limitations of current solutions proved unacceptable in real-world contexts. Real-Move emerged as a response to this challenge, aiming to create a truly practical, markerless motion tracking platform. Through this POC, the project seeks to raise the technology readiness level (TRL), validate the system across domains, and prepare for market adoption, establishing Real-Move as the first robust, scalable, and real-time 3D motion capture solution designed for wide impact.
On the software side, development advanced in parallel with hardware prototyping, ensuring seamless integration across the stack. Key advances include:
• 6DoF object tracking, enabling the system not only to detect object positions but also their orientation and movement in space. (Figure 1)
• Rapid calibration workflows, drastically reducing setup time and ensuring consistent performance across different environments (Figure 2); a minimal calibration sketch is shown after this list.
• Action recognition capabilities, allowing the system to classify and interpret human activities beyond simple pose tracking. (Figure 3)
• Virtual and augmented reality ergonomics monitoring, designed to anticipate and prevent ergonomic risks in immersive environments by providing real-time feedback and corrective measures. (Figure 4)
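As a rough illustration of the calibration step referenced above, the sketch below estimates one camera's extrinsics from a checkerboard placed in the shared workspace, using standard OpenCV calls. The board dimensions, square size, and intrinsics are placeholder assumptions, not project parameters.

```python
# Minimal sketch of an extrinsic-calibration step with OpenCV, assuming a checkerboard
# visible in the shared workspace. Board size and square size are assumptions.
import cv2
import numpy as np

PATTERN = (9, 6)   # inner corners of the assumed checkerboard
SQUARE = 0.04      # assumed square size in metres

# 3D board-corner coordinates in the board (world) frame, on the z = 0 plane.
obj_pts = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
obj_pts[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

def camera_pose(image_bgr, K, dist):
    """Estimate one camera's extrinsics (R, t) relative to the board."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        return None
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners, K, dist)
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec

# Repeating this per camera places all cameras in the same board frame,
# which is sufficient to triangulate keypoints across views.
```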
Together, these advancements represent a holistic strengthening of the Real-Move platform, laying the groundwork for a versatile system that can evolve beyond motion tracking into a full context-aware human movement analysis tool.