Periodic Reporting for period 2 - MAX-R (Mixed Augmented and eXtended Reality media pipeline)
Reporting period: 2024-03-01 to 2025-02-28
The project has successfully completed all five milestones, with the two remaining milestones achieved in the second reporting period. All 19 deliverables due in the final 12 months (M19–M30) have been submitted on time. The following section describes the work carried out by each beneficiary, organized by work packages and tasks, and summarizes the overall progress of the project.
- The successful delivery of all deliverables also enabled the achievement of the two final milestones, and all second-period deliverables were submitted on time.
- With the project now completed, there is a strong sense of success from both the coordinator’s and the partners’ perspectives. The consortium worked effectively to accomplish all objectives.
- Collaboration among partners was key to achieving the project’s milestones, as reflected in productive meetings, high-quality deliverables, and thorough preparation for the final review.
By the end of the project, all 35 deliverables have been submitted within their respective deadlines. In the first reporting period (up to M18), 16 deliverables were submitted on time, and in the second period (M19–M30) the remaining 19 were also delivered as planned.
To conclude this section, we summarize MAX-R in numbers:
- New and enhanced tools, APIs and other results (D4.5): more than 20 new or improved tools and APIs, including Fission, The Vault, CopyCat, Porta, RSConnect, OCIO, Depth Reprojection, OSManager, Quasar, VPET 2.0, DataHub, AnimHost, InfinitySet, Global Tracking, adaptive XR asset streaming, capture of performers, Rooms, wgpuEngine, NeSt-VR, CAP2 & MCP, and MILE’s platform.
- MAX-R pipeline: interoperability demonstrated in 3 relevant use cases and 18 scenarios.
- Third-party tests (D4.3–D4.6): more than 50 external companies and entities (the MAX-R user group) have reviewed and tested MAX-R innovations, including Bavaria Studios, Production Park, TVE, TF1, Netflix, Samsung, Pixar, Nvidia, Daikin, and Cisco.
- Dissemination (D6.5): website (70 news items); social media (YouTube, LinkedIn); participation in more than 110 events, including CVMP 2023–24, NAB 2023–24, ISE 2025, FMX 2024, Stereopsia 2024, and the Belgian EU Presidency 2024.
- Exploitation (D6.6): exploitation plans for 22 MAX-R innovations.
- Scientific publications (D6.5): 18 open-access publications.
UPF-GTI developed "Rooms," an open-source platform for creating 3D content on XR hardware, aiming to democratize 3D modeling and animation. They also built wgpuEngine, an open-source engine for desktop, XR, and 3D web apps, supporting modern graphics techniques. This engine was integrated with other MAX-R partners’ projects.
UPF-WN focused on interactive VR streaming over Wi-Fi. They optimized streaming parameters and Wi-Fi settings for large-scale, multi-user XR using Wi-Fi 6. They also developed an adaptive video bitrate algorithm to handle Wi-Fi fluctuations and improve streaming quality.
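To illustrate the kind of throughput-driven adaptation described above, the following minimal Python sketch shows a simple bitrate controller. The bitrate ladder, headroom factor, and function names are hypothetical and are not taken from the UPF-WN implementation; they only illustrate the general approach of stepping down quickly when measured Wi-Fi throughput drops and recovering conservatively.

```python
# Illustrative sketch only: a simple throughput-driven bitrate controller.
# The ladder, thresholds, and names are hypothetical, not the MAX-R algorithm;
# a real controller would also consider latency and packet loss.

BITRATE_LADDER_MBPS = [10, 20, 35, 50, 80]  # hypothetical encoding levels

def select_bitrate(estimated_throughput_mbps: float,
                   current_index: int,
                   headroom: float = 0.8) -> int:
    """Pick the highest ladder step that fits within a safety margin of the
    measured Wi-Fi throughput, stepping down quickly and up conservatively."""
    budget = estimated_throughput_mbps * headroom
    # Highest level that still fits the budget.
    target = 0
    for i, rate in enumerate(BITRATE_LADDER_MBPS):
        if rate <= budget:
            target = i
    # Step up by at most one level per adjustment to avoid oscillation,
    # but drop immediately when the budget falls below the current level.
    if target > current_index:
        return current_index + 1
    return target

# Example: as measured throughput fluctuates, the controller backs off and recovers.
level = 3  # currently streaming at 50 Mbps
for measured in [60, 45, 30, 30, 55, 70]:
    level = select_bitrate(measured, level)
    print(measured, "Mbps measured ->", BITRATE_LADDER_MBPS[level], "Mbps target")
```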
Disguise streamlined XR production with tools like RSConnect (data/metadata transmission), OS Manager (render server updates), and Depth Reprojection (enhanced camera use in hybrid productions). They also implemented OCIO algorithms for color control and integrated an HTTP SockPuppet API with Porta for better XR broadcast control.
Foundry focused on virtual production, developing Fission, a real-time compositing engine for LED volumes. They augmented this with ML models and "The Vault" metadata tool to streamline post-production. They also developed OpenAssetIO, an open standard to improve interoperability between creation tools and asset management systems.
FilmLight created Quasar, a lightweight colour grading system for real-time applications such as Unreal Engine. Quasar features an intuitive interface, Truelight Colour Spaces for colour management, BaseGrade for linear grading, and XGrade for localized edits, making it easier to match virtual and live-action elements.
Brainstorm integrated MAX-R XR tools into its InfinitySet engine. This includes a Color Correction Plugin integrated with Quasar, VPET integration for real-time scene adjustments via tablet, and an upgraded virtual teleportation system with a cost-effective synchronized tracking solution. They also created a virtual asset library for testing.
ARRI improved production efficiency through reliable metadata and integrated workflows. They advanced all-IP production and developed IP control and metadata protocols (CAP and MCP) to collect rich on-set metadata with temporal consistency, validated within MAX-R workflows. This aids in reconstructing camera movements in virtual scenes, reducing costs and speeding up operations.
Improbable enabled over 10,000 live participants to interact in the same digital space. Their networking and rendering tech supports large-scale interactions and spatial audio grouping. They released a self-serve platform for Unreal Engine devs to build massive interactive live events (MILEs), used by partners like the BBC.
BBC developed methods for capturing live performances for immersive virtual exploration using conventional cameras and streaming. Their system switches views dynamically based on avatar position and synchronizes virtual lighting with real-world lighting. They streamed a live concert into an interactive virtual world with Improbable.
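As a rough illustration of position-based view switching, the Python sketch below picks the captured camera view nearest to the viewer's avatar, with a small hysteresis to avoid flickering at view boundaries. The camera layout, coordinates, and heuristic are hypothetical assumptions for illustration and do not reflect the BBC's actual switching logic.

```python
# Illustrative sketch only: choosing which captured camera view to show
# based on the avatar's position in the virtual venue. Camera positions and
# the nearest-view heuristic are hypothetical.

import math

# Hypothetical capture positions of conventional cameras, in venue coordinates.
CAMERA_VIEWS = {
    "stage_left":  (-4.0, 0.0, 2.0),
    "stage_right": ( 4.0, 0.0, 2.0),
    "centre_wide": ( 0.0, 1.5, 9.0),
}

def pick_view(avatar_pos, current_view=None, hysteresis=1.0):
    """Return the camera view closest to the avatar, keeping the current
    view unless another one is closer by more than `hysteresis` metres."""
    def dist(view):
        return math.dist(CAMERA_VIEWS[view], avatar_pos)

    best = min(CAMERA_VIEWS, key=dist)
    if current_view and dist(current_view) <= dist(best) + hysteresis:
        return current_view
    return best

# Example: as the avatar walks across the venue, the view switches only once
# it is clearly closer to another camera.
view = "centre_wide"
for pos in [(0.0, 0.0, 8.0), (-1.0, 0.0, 5.0), (-3.5, 0.0, 2.5)]:
    view = pick_view(pos, view)
    print(pos, "->", view)
```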
Filmakademie released open-source XR production tools based on their TRACER FOUNDATION framework: VPET (real-time scene editing), DataHub (communication/interoperability), and AnimHost (animation generator connection). They enhanced VPET for Brainstorm and Disguise and used it in their DIGITAL LOCATIONS tool. AnimHost explores ML-driven XR animation workflows. These tools are available on GitHub.
Hasselt University developed advanced XR tracking tools for seamless virtual-real synchronization in large spaces, supporting various XR headsets. A mobile version turns real-world objects into VR windows. This technology applies beyond entertainment, for example visualizing 3D BIM metadata in AR for construction. They also created a framework for progressively streaming XR content using glTF over HTTP/3, enabling faster scene access.
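The following Python sketch illustrates the general idea of progressive scene delivery: fetching the small glTF scene description first, then pulling binary buffers in a priority order so a coarse scene becomes usable before large payloads arrive. The URL, the size-based priority heuristic, and the use of plain HTTP(S) via urllib are assumptions for illustration only; the UHasselt framework streams over HTTP/3, which would require a QUIC-capable client instead.

```python
# Illustrative sketch only: progressive fetching of a glTF scene, coarse data
# first. URL, priority heuristic, and plain-HTTP(S) transport are assumptions;
# the actual framework uses HTTP/3.

import json
import urllib.request
from urllib.parse import urljoin

def fetch(url: str) -> bytes:
    with urllib.request.urlopen(url) as resp:
        return resp.read()

def stream_scene(gltf_url: str):
    """Fetch the scene description first, then pull external binary buffers
    in ascending-size order so small, structurally important data becomes
    usable before large payloads such as textures."""
    gltf = json.loads(fetch(gltf_url))          # small JSON: scene graph, materials
    buffers = gltf.get("buffers", [])
    # Prioritise smaller buffers first as a simple progressive-loading heuristic.
    order = sorted(range(len(buffers)), key=lambda i: buffers[i].get("byteLength", 0))
    for i in order:
        uri = buffers[i]["uri"]                 # external .bin payload (sketch assumes external URIs)
        data = fetch(urljoin(gltf_url, uri))
        yield i, data                           # hand partial data to the renderer as it arrives

# Example usage (hypothetical URL):
# for index, data in stream_scene("https://example.org/scene.gltf"):
#     print("buffer", index, "ready:", len(data), "bytes")
```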
CREW developed large-area, participative XR performances like Anxious Arrivals (site-specific XR) and Alert (evacuation simulation). These tested new tech and narrative forms with real audiences. They collaborated with UHasselt on a vendor-agnostic large area tracking solution and used wireless streaming for high-quality VR, testing and optimizing with UPF.