CORDIS - EU Research Results

Transforming European Industrial Ecosystems through eXtended Reality enhanced with human-centric AI and secure, 5G-enabled IoT

Periodic Reporting for period 1 - INDUX-R (Transforming European Industrial Ecosystems through eXtended Reality enhanced with human-centric AI and secure, 5G-enabled IoT)

Reporting period: 2024-01-01 to 2025-06-30

INDUX-R is an application-driven project leveraging eXtended Reality (XR) to address market demands and societal needs, offering new business models and empowering humans. It is human-centric, involving end-users in defining and evaluating technology.
The project tackles two main challenges: expanding commercial products and enhancing human capabilities. It evolves LiveMedia into LiveMediaXR (UC1), boosts sports immersion with NOMADE (UC5), enables X-Ray vision for safer, more productive Industry 4.0 workplaces (UC2), supports virtual medical training (UC3), and delivers 4D historical reconstructions for cultural tourism (UC4). Together, these applications advance knowledge access, healthcare, workplace safety, social interaction, and recreation.
INDUX-R’s goal is to deliver scientific, technological, societal, and economic impact, strengthening European competitiveness through an “XR made in Europe” ecosystem. It builds on local expertise, with hardware such as CREAL’s light-field HMDs and European-developed software for XR asset creation, egocentric perception, human-centric AI (HCAI), dynamic UIs, 5G, and secure IoT. These advances reduce reliance on foreign technology, foster industry growth, and train new XR developers. The project also pioneers virtual world platforms with methods for 3D reconstruction, lifelike avatars, and deformable VR objects. Performance and scalability are enhanced by 5G and computational offloading, while secure XR-IoT integration ensures privacy and data protection.
User-centred design ensures relevance and desirability, with co-design and explainable AI enhancing transparency. Applications target empowerment through knowledge access, productivity, safety, training, and inclusivity, while social sciences and humanities (SSH) perspectives guide ethics, privacy, and trust.
Finally, INDUX-R promotes an open XR ecosystem aligned with European values, offering an alternative to closed, non-European solutions. Its expected impact includes digital sovereignty, strengthened industries, improved quality of life, and a more ethical, inclusive, and sustainable digital future.
The INDUX-R work began with a structured analysis of the technological landscape, identifying XR adoption limitations, relevant standards, and open-source tools. A parallel co-creation process defined functional and non-functional requirements across use cases using personas, journey maps, and task analyses, while integrating legal, ethical, and social considerations. This produced high-fidelity prototypes and detailed scenarios as blueprints for development.
A core achievement is the modular, scalable, and future-proof system architecture, built on an “ethics-by-design” approach. It identified the required components across five layers (Sensor Network, Wearable Device, Edge Node, Network, and Cloud) and is documented through multiple architectural viewpoints.
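As an illustration only, the sketch below encodes this five-layer decomposition in Python; the layer names follow the report, while the listed responsibilities are assumed examples rather than the project's actual component allocation.

```python
from dataclasses import dataclass, field

@dataclass
class Layer:
    """One layer of the INDUX-R reference architecture (illustrative only)."""
    name: str
    responsibilities: list[str] = field(default_factory=list)

# Layer names follow the report; the responsibilities listed are hypothetical examples.
ARCHITECTURE = [
    Layer("Sensor Network",  ["collect IoT and environmental data", "stream telemetry"]),
    Layer("Wearable Device", ["render XR content on HMDs", "capture egocentric video"]),
    Layer("Edge Node",       ["low-latency perception and offloaded inference"]),
    Layer("Network",         ["5G connectivity", "dedicated network slices"]),
    Layer("Cloud",           ["IoT middleware", "scene orchestration", "long-term storage"]),
]

if __name__ == "__main__":
    for layer in ARCHITECTURE:
        print(f"{layer.name}: {', '.join(layer.responsibilities)}")
```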
Key advancements included the generation of realistic digital environments and avatars: multi-strategy 3D reconstruction of pilot and historical sites, a pipeline for room acoustics, and personalized avatars built via double-view optimization and photorealistic 4D head models. A robust SDK enabled animation and motion modelling with real-time soft-body interactions, automated rigging, and facial animation (landmark- and speech-driven). Speech processing improved denoising, automatic speech recognition (ASR), and emotion classification. Motion control combined reinforcement learning and inverse kinematics, while an AI surgical assistant integrated an LLM dialogue manager and gesture recognition.
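As a rough illustration of how such a speech chain can be composed, the following Python sketch wires placeholder denoising, ASR, and emotion-classification stages into one pipeline; the stage implementations are stand-ins, not the project's actual models.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SpeechResult:
    transcript: str
    emotion: str

def denoise(audio: np.ndarray) -> np.ndarray:
    """Placeholder denoiser: a learned speech-enhancement model would run here."""
    return audio - audio.mean()  # trivial DC-offset removal as a stand-in

def recognize(audio: np.ndarray) -> str:
    """Placeholder ASR stage; a real system would call a speech-recognition model."""
    return "<transcript>"

def classify_emotion(audio: np.ndarray) -> str:
    """Placeholder emotion classifier over the denoised signal."""
    return "neutral"

def speech_pipeline(raw_audio: np.ndarray) -> SpeechResult:
    """Chain the stages: denoise once, then transcribe and classify the clean signal."""
    clean = denoise(raw_audio)
    return SpeechResult(transcript=recognize(clean), emotion=classify_emotion(clean))

if __name__ == "__main__":
    print(speech_pipeline(np.random.randn(16000)))  # one second of synthetic 16 kHz audio
```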
Human-centric perception was strengthened through advanced tracking, real-time segmentation, 3D pose estimation from monocular RGB, and AR-based machinery inspection using SLAM techniques. Intelligent interfaces featured a dynamic UI toolkit with adaptive ontology-based components, a reinforcement learning decision maker, and cognitive load assessment from speech data. Early VR applications supported omni-conferencing and Industry 4.0 training, both of which received preliminary evaluation.
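For illustration, the sketch below replaces the reinforcement-learning decision maker with a simple threshold rule, just to show the kind of input (estimated cognitive load, task context) and output (visible UI components) such a module handles; the function, component names, and thresholds are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class UIContext:
    cognitive_load: float   # 0.0 (relaxed) .. 1.0 (overloaded), e.g. estimated from speech
    task: str

def select_ui_components(ctx: UIContext) -> list[str]:
    """Toy stand-in for the decision maker: hide optional panels when load is high.

    The real module described in the report uses reinforcement learning over
    ontology-based components; this rule-based version only illustrates the
    input/output shape, not the actual policy.
    """
    components = ["main_view"]
    if ctx.cognitive_load < 0.7:
        components += ["notifications", "secondary_metrics"]
    if ctx.task == "machinery_inspection":
        components.append("ar_overlay")
    return components

print(select_ui_components(UIContext(cognitive_load=0.85, task="machinery_inspection")))
```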
Infrastructure included cloud-based IoT middleware for secure data exchange, 5G connectivity with tailored network slices, and XR scene orchestration for multi-user applications. Preparations for validation involved requirement traceability and a collaborative development environment.
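The sketch below shows, in very simplified form, the publish/subscribe message flow that such middleware mediates between sensors and XR applications; the broker class, topic name, and payload fields are illustrative assumptions, not the actual INDUX-R middleware API.

```python
from collections import defaultdict
from typing import Any, Callable

class TinyBroker:
    """In-process stand-in for the publish/subscribe core of an IoT middleware."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict[str, Any]], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict[str, Any]], None]) -> None:
        """Register a handler to be called for every message on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: dict[str, Any]) -> None:
        """Deliver a payload to all handlers subscribed to the topic."""
        for handler in self._subscribers[topic]:
            handler(payload)

broker = TinyBroker()
# Hypothetical topic: an XR orchestrator consuming worker-localization updates.
broker.subscribe("plant/worker_position", lambda msg: print("update XR scene:", msg))
broker.publish("plant/worker_position", {"worker_id": 7, "x": 12.3, "y": 4.5})
```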
Early piloting began with “Live Tests”: collecting car sensor data during racing, verifying worker localization and feedback in industrial plants, and testing medical training scenarios with students. These trials benchmarked system performance and usability in real-world conditions.
INDUX-R has achieved major advances across its core technologies. For 3D reconstruction of historical buildings, images from different time periods produced accurate digital recreations that preserve cultural heritage, with object reconstructions reaching sub-millimeter precision. Personalized avatars are generated from monocular input using front and back views, producing high-quality textures and robust meshes. A new loss function improved head-avatar reconstruction scores, while room digitization enabled realistic sound propagation in VR.
The XR SDK now supports soft-body modeling, speech processing, emotional state recognition, and an AI virtual assistant, enhancing user interaction. In human mesh recovery, monocular image and video methods reduced computational cost while maintaining performance; temporal stability was improved by leveraging forward and reverse motion. Feature-matching algorithms enabled reliable localization of users and objects. Adaptive UIs based on emotional state and improvements to the decision-maker module reduced visual clutter and rendering time. The XR orchestrator supports real-time communication and scaling via 5G, handling many concurrent clients without bandwidth loss. INDUX-R’s middleware ensures seamless communication between system components and connected sensors.
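As a simplified illustration of exploiting both temporal directions, the sketch below smooths per-frame 3D joint estimates with a forward and a time-reversed filter and averages the two; this toy bidirectional filter only conveys the idea and is not the project's actual mesh-recovery method.

```python
import numpy as np

def ema(poses: np.ndarray, alpha: float = 0.3) -> np.ndarray:
    """Exponential moving average over time (axis 0 = frames)."""
    out = np.empty_like(poses)
    out[0] = poses[0]
    for t in range(1, len(poses)):
        out[t] = alpha * poses[t] + (1 - alpha) * out[t - 1]
    return out

def bidirectional_smooth(poses: np.ndarray) -> np.ndarray:
    """Average a forward pass and a time-reversed pass to reduce per-frame jitter."""
    forward = ema(poses)
    backward = ema(poses[::-1])[::-1]
    return 0.5 * (forward + backward)

# poses: (frames, joints, 3) noisy 3D joint positions from a per-frame estimator
noisy = np.cumsum(np.random.randn(100, 24, 3) * 0.01, axis=0) + np.random.randn(100, 24, 3) * 0.05
smooth = bidirectional_smooth(noisy)
print(noisy.shape, smooth.shape)
```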
The INDUX-R concept