CORDIS - EU research results

A Robust, Real-time, and 3D Human Motion Capture System through Multi-Cameras and AI

Periodic Reporting for period 1 - Real-Move (A Robust, Real-time, and 3D Human Motion Capture System through Multi-Cameras and AI)

Reporting period: 2024-01-01 to 2025-06-30

The Real-Move Proof of Concept (POC) addresses the limitations of traditional motion capture systems, which depend on markers, suits, or wearable sensors that add complexity, cost, and usability barriers, making them impractical for large-scale or everyday deployment. These technologies often require lengthy setup, compromise user comfort, and lack the flexibility needed for real-world environments such as clinics, sports arenas, workplaces, or industrial settings.
Our goal is to deliver an AI-based, markerless solution capable of tracking human motion in dynamic environments with high accuracy, robustness, and efficiency. Real-Move combines multi-camera inputs with learning-based pose estimation algorithms to reconstruct 3D human motion in real time. This design eliminates wearability and setup challenges, while a weighted multi-score reconstruction methodology ensures resilience to occlusions and accurate multi-person tracking. The modular use of standard RGB cameras makes the system both scalable and cost-effective, overcoming the barriers of current motion capture technologies.
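The core reconstruction step can be pictured as confidence-weighted triangulation across camera views: keypoints seen clearly by many cameras dominate the solution, while occluded views contribute little. The sketch below is a minimal illustration of that idea, assuming calibrated projection matrices and per-camera 2D keypoints with confidence scores; the function name and the simple linear weighting are our own simplification, not the project's actual weighted multi-score method.

```python
import numpy as np

def triangulate_weighted(projections, points_2d, confidences):
    """Confidence-weighted linear triangulation (DLT).

    projections : list of 3x4 camera projection matrices
    points_2d   : list of (u, v) pixel coordinates, one per camera
    confidences : per-camera keypoint confidence in [0, 1]; occluded
                  views get low weight and barely affect the result
    Returns the 3D point in world coordinates.
    """
    rows = []
    for P, (u, v), w in zip(projections, points_2d, confidences):
        # Each view adds two linear constraints on the homogeneous
        # 3D point X: u*(P[2]@X) - P[0]@X = 0 and the analogue for v.
        rows.append(w * (u * P[2] - P[0]))
        rows.append(w * (v * P[2] - P[1]))
    A = np.stack(rows)
    # The least-squares solution is the right singular vector
    # associated with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```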
Beyond raw skeletal tracking, Real-Move provides application-ready insights such as ergonomic indicators, gait and posture analysis, joint movement tracking, and activity recognition (e.g. walking, falling, gesturing). These outputs are designed to be directly usable by practitioners and organizations without requiring motion capture expertise, opening the way to adoption in fields as diverse as rehabilitation, sports performance, workplace safety, and human–robot interaction.
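As an illustration of how such application-ready indicators can be derived from the skeletal output, the following minimal sketch computes a single joint angle from three tracked 3D joint positions. The joint names, coordinate values, and the comfort band mentioned in the comment are hypothetical examples, not values from the project.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (degrees) between segments b->a and b->c."""
    u = np.asarray(a) - np.asarray(b)
    v = np.asarray(c) - np.asarray(b)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical example: elbow flexion from shoulder/elbow/wrist (metres)
shoulder, elbow, wrist = (0.0, 1.4, 0.0), (0.0, 1.1, 0.1), (0.2, 1.0, 0.3)
flexion = joint_angle(shoulder, elbow, wrist)
# An ergonomic rule could then flag sustained awkward postures,
# e.g. flexion held outside an assumed 60-160 degree comfort band.
```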
Real-Move builds on the experience of the ERC StG Ergo-Lean project (GA 850932), which highlighted the shortcomings of existing systems when applied to ergonomics in human–robot–environment interaction. While tolerable in research labs, these limitations proved unacceptable in real-world contexts. Real-Move emerged as a response to this challenge, aiming to create a truly practical, markerless motion tracking platform. Through this POC, the project seeks to raise the technology readiness level (TRL), validate the system across domains, and prepare for market adoption, establishing Real-Move as the first robust, scalable, and real-time 3D motion capture solution designed for wide impact.
During the first phase of the POC, the team achieved substantial progress on the technical front. Prototype 2.0 was developed, representing a major leap forward in performance, modularity, and usability compared to earlier iterations. This version provides the foundation for robust real-world testing and future scalability.
On the software side, development advanced in parallel with hardware prototyping, ensuring seamless integration across the stack. Key advances include:

• 6DoF object tracking, enabling the system to detect not only the position of objects but also their orientation and movement in space; see the pose-recovery sketch after this list. (Figure 1)

• Rapid calibration workflows, drastically reducing setup time and ensuring consistent performance across different environments; see the calibration sketch after this list. (Figure 2)

• Action recognition capabilities, allowing the system to classify and interpret human activities beyond simple pose tracking. (Figure 3)

• Virtual and augmented reality ergonomics monitoring, designed to anticipate and prevent ergonomic risks in immersive environments by providing real-time feedback and corrective measures. (Figure 4)
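For the 6DoF object tracking listed above, one common way to recover a full pose from a single calibrated view is Perspective-n-Point. The sketch below assumes a known object model and detected 2D correspondences and uses OpenCV's solvePnP; the marker geometry, pixel coordinates, and camera intrinsics are illustrative placeholders, and this is not necessarily the method used in Real-Move.

```python
import numpy as np
import cv2

# Corners of an assumed 10 cm square marker in the object frame (metres).
model_pts = np.array([[-0.05, -0.05, 0.0],
                      [ 0.05, -0.05, 0.0],
                      [ 0.05,  0.05, 0.0],
                      [-0.05,  0.05, 0.0]], dtype=np.float32)

# Detected 2D pixel locations of the same corners (illustrative values).
image_pts = np.array([[312.0, 248.0],
                      [401.0, 251.0],
                      [398.0, 339.0],
                      [309.0, 335.0]], dtype=np.float32)

# Pinhole intrinsics of a calibrated camera (fx, fy, cx, cy assumed).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)  # assume lens distortion already corrected

# solvePnP recovers the full 6DoF pose: rotation (as a Rodrigues
# vector) plus translation of the object in the camera frame.
ok, rvec, tvec = cv2.solvePnP(model_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation matrix
print("object position in camera frame (m):", tvec.ravel())
```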

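Similarly, a rapid calibration workflow can be grounded with a standard checkerboard routine. The hedged sketch below uses OpenCV's chessboard detection and calibrateCamera to recover one camera's intrinsics from a folder of images; the pattern size, square size, and the calib/*.png path are assumptions for illustration, not the project's actual workflow.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 6)   # inner corners of the checkerboard (assumed)
SQUARE = 0.025     # square size in metres (assumed)

# 3D corner grid in the board frame, reused for every view.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.png"):  # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Recover intrinsics (K, distortion) plus one extrinsic pose per view.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print("reprojection RMS (px):", rms)
```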
Together, these advancements represent a holistic strengthening of the Real-Move platform, laying the groundwork for a versatile system that can evolve beyond motion tracking into a full context-aware human movement analysis tool.
On the business side, Real-Move began with a market consultancy study conducted with Dawn Advisory – Bocconi, which identified sports as the most commercially promising vertical, while also highlighting opportunities in rehabilitation, ergonomics, and workplace safety. These insights guided our early market validation efforts, including participation in leading sector fairs such as FIBO, Racquet Trend, and RiminiWellness, where the project gained strong recognition and was awarded a dedicated exhibition space to present a live smart gym demo powered by Real-Move's technology. These events not only showcased the platform's capabilities to industry stakeholders but also enabled the team to establish strategic connections with HYROX, RealVT, Technogym, and AKUIS, paving the way for pilot projects and early adopters.

In parallel, we explored workplace ergonomics applications through collaborations with INAIL, LEF, and Studio Peroni, collecting real-world worker motion data to validate the system's utility in industrial settings.

From an IP perspective, we adopted a know-how and speed-to-execution strategy, focusing on in-house development and modular design, with the plan to reassess targeted patent filings or trade secret protections as hardware–software integration and calibration techniques mature.

In terms of visibility, we launched the Real-Move identity through a website, logo, LinkedIn page, and domain-based communication tools, and actively engaged in key events such as EDGE 2024, the AI Festival in Milan, the Richmond Italia IT Director Forum, and the RiminiWellness pitch competition, where we demonstrated Real-Move live on stage. Complementing these activities, we submitted a new ERC Proof of Concept proposal, participated in ECCV to remain aligned with cutting-edge research, and extended our ethical protocol to enable testing with human participants outside controlled laboratory settings, an essential step toward validating Real-Move in real-world scenarios.
Object and body skeletal tracking using Real-Move
Virtual interaction tracking using Real-Move
World alignment with AR