
Mixed Augmented and eXtended Reality media pipeline

Periodic Reporting for period 1 - MAX-R (Mixed Augmented and eXtended Reality media pipeline)

Reporting period: 2022-09-01 to 2024-02-29

MAX-R is a 30-month Innovation Action (IA) that will define, develop and demonstrate a complete pipeline of tools for making, processing and delivering maximum-quality XR content in real time. The pipeline will be based on open APIs and open file and data-transfer formats to encourage development and support the integration of new open-source and proprietary tools. MAX-R builds on recent research and advances in Virtual Production technologies to develop real-time processes that deliver better quality, greater efficiency, enhanced interactivity, and novel content based on XR media data.

The interdisciplinary Consortium of eleven partners from five countries covers the chain from technology development and product innovation to creative experiment and demonstration, and from XR media creation to delivery to the final consumer. It is built around Europe’s leading media technology developers, together with creative organisations operating in AR/VR/MR, Virtual Production, interactive games, new media, TV, video and news, multimedia and immersive performance.
Our project has made significant strides, with progress exceeding expectations across all work packages (WPs):

Achievement of Milestones:
Milestones 1, 2, and 3 have been fully achieved, marking critical progress in three areas:
- Scenarios and Architecture: Comprehensive development and structuring of scenarios and architectural frameworks.
- Open APIs: Successful implementation and utilization of open APIs.
- Professional and Open Source Tools: Integration and utilization of both professional and open-source tools for enhanced efficiency.

Milestones 4 and 5, focusing on integration, validation, and demonstration, are progressing well, showcasing advanced use cases:
- Involvement of Third Parties: Engagement of third-party professionals in significant projects, such as Depth Reprojection (DIS) with customers, Fission in a tent-pole production, and the Teleported Presenters technology (BRA) with Apunt TV.
- Deployment of Tools: Implementation of tools in production or planned for deployment, such as ML CopyCat in Nuke (FO) and TRACER (released on GitHub) for the Survivor 3D film (FABW).
- MAX-R Tech Demos: Successful demonstrations of MAX-R technology in various contexts, including "New Departures" at NUMIX Lab in December 2023 (CREW) and a "Listening Party" for the K-pop artist Twice in February 2024 (IMP).
From live events to a pipeline enabling real-time production: API tools and real-time workflows
Disguise is pioneering real-time XR workflows for live events, enhancing its XR system. Its work in MAX-R aims to develop APIs that let postproduction tools operate in live environments, empowering creators to make decisions on set. Disguise is also optimizing XR/VP workflows with simpler user interfaces and facilitating real-time rendering through virtualized cloud-based solutions to meet the growing demand for photorealistic content.
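
As a concrete illustration of the kind of live-control interface this implies, the hypothetical Python sketch below sends a single parameter update from a tool to an on-set render server over UDP. The endpoint, payload schema, and parameter names are invented for illustration and are not the Disguise API.

    import json
    import socket

    # Hypothetical live parameter-update message, sent over UDP for low latency.
    def send_live_update(host: str, port: int, layer: str, param: str, value: float) -> None:
        message = json.dumps({
            "layer": layer,            # e.g. a grade node on the LED-wall feed
            "parameter": param,        # e.g. "exposure"
            "value": value,
            "apply_at": "next_frame",  # a real system would timestamp against genlock
        }).encode("utf-8")
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(message, (host, port))

    # Example: nudge exposure on the wall content without leaving the set.
    send_live_update("10.0.0.42", 9000, layer="grade_main", param="exposure", value=0.25)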

Camera data synchronisation and real-time data communication
MAX-R focuses on synchronizing camera and lighting data in real-time XR environments to maintain quality and efficiency. Building on ARRI's work with Unreal Engine, the project aims to standardize communication protocols, design user-friendly interfaces, ensure interoperability, and provide guidelines. ARRI will continue hardware and tool development beyond MAX-R.
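
To make the synchronisation problem concrete, here is an illustrative sketch (not ARRI's actual protocol or data model) of per-frame lens metadata keyed to SMPTE timecode, so that a render engine can match the lens state to the exact video frame it belongs to:

    import json
    import math
    from dataclasses import dataclass, asdict

    @dataclass
    class CameraFrameMetadata:
        timecode: str           # SMPTE timecode of the video frame, e.g. "01:02:03:04"
        focal_length_mm: float  # current zoom position
        focus_distance_m: float
        aperture_t_stop: float
        sensor_width_mm: float  # needed to derive the horizontal field of view

        def field_of_view_deg(self) -> float:
            # Pinhole-camera horizontal FOV from focal length and sensor width.
            return math.degrees(2 * math.atan(self.sensor_width_mm / (2 * self.focal_length_mm)))

    sample = CameraFrameMetadata("01:02:03:04", 32.0, 2.5, 2.8, 27.99)
    packet = json.dumps(asdict(sample))  # one packet per frame, carried alongside the video
    print(f"FOV: {sample.field_of_view_deg():.2f} deg")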

Real-time compositing workflows for virtual production
Foundry plans to revolutionize virtual production with real-time compositing workflows. They aim to bridge the gap between postproduction and on-set collaboration, preserving crucial metadata for flexibility and consistency. Their suite of tools, including ML-based and GPU-enabled solutions, will facilitate on-set compositing and review, enhancing the director's vision.
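
The sketch below, assuming simple stand-in data structures, shows why metadata must travel with the image: an on-set "A over B" composite that keeps camera metadata attached, so a later postproduction pass can redo the comp with full context. It illustrates the principle, not Foundry's implementation.

    import numpy as np

    def comp_over(fg: np.ndarray, alpha: np.ndarray, bg: np.ndarray) -> np.ndarray:
        """A-over-B composite on float RGB images with a single-channel matte."""
        return fg * alpha[..., None] + bg * (1.0 - alpha[..., None])

    h, w = 4, 4  # tiny stand-in images
    fg, bg = np.ones((h, w, 3)) * 0.8, np.zeros((h, w, 3))
    alpha = np.full((h, w), 0.5)

    result = {
        "pixels": comp_over(fg, alpha, bg),
        # Metadata is copied forward rather than discarded at the comp stage:
        "metadata": {"camera": "A-cam", "timecode": "01:02:03:04", "lens_mm": 32.0},
    }
    print(result["metadata"], float(result["pixels"].mean()))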

A novel concept and toolset for colour manipulation
FilmLight is developing a novel tool for color manipulation in VP/XR environments, addressing challenges such as spectral mismatches. The tool allows intuitive and precise manipulation of selected colors without affecting others, with the aim of making it accessible to non-colorists in real-time workflows and integrating it with other production tools.
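
One way to picture selective manipulation is a hue-windowed adjustment with smooth falloff, as in the illustrative numpy sketch below (a generic approach shown for exposition, not FilmLight's algorithm):

    import colorsys
    import numpy as np

    def adjust_near_hue(rgb: np.ndarray, target_hue: float, width: float, sat_gain: float) -> np.ndarray:
        """Boost saturation only for pixels near target_hue; others are untouched."""
        out = rgb.copy()
        for idx in np.ndindex(rgb.shape[:2]):
            h, s, v = colorsys.rgb_to_hsv(*rgb[idx])
            d = min(abs(h - target_hue), 1.0 - abs(h - target_hue))  # circular hue distance
            weight = max(0.0, 1.0 - d / width)                       # smooth 0..1 falloff
            s = min(1.0, s * (1.0 + (sat_gain - 1.0) * weight))
            out[idx] = colorsys.hsv_to_rgb(h, s, v)
        return out

    img = np.array([[[0.9, 0.2, 0.2], [0.2, 0.9, 0.2]]])  # one red pixel, one green pixel
    boosted = adjust_near_hue(img, target_hue=0.0, width=0.1, sat_gain=1.5)  # reds only
    print(boosted)  # the green pixel is unchanged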

Virtual events production for sectors outside the traditional media industries
Brainstorm aims to expand virtual event production beyond traditional media industries by developing innovative tools for sectors like tourism, education, and corporate presentations. Their goals include integrating virtual studio technology with video-conference platforms and enabling remote presenters to be inserted into virtual environments, as well as teleporting them to real events using LED walls.
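
A minimal sketch of the insertion step, assuming a green-screen feed from the remote presenter (a deliberately simplified key, not Brainstorm's production keyer):

    import numpy as np

    def green_key_matte(frame: np.ndarray, threshold: float = 0.2) -> np.ndarray:
        """Alpha near 1 where the pixel is foreground (not green screen)."""
        r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
        greenness = g - np.maximum(r, b)  # how strongly green dominates
        return np.clip(1.0 - greenness / threshold, 0.0, 1.0)

    def insert_presenter(presenter: np.ndarray, virtual_set: np.ndarray) -> np.ndarray:
        alpha = green_key_matte(presenter)[..., None]
        return presenter * alpha + virtual_set * (1.0 - alpha)

    presenter = np.array([[[0.1, 0.9, 0.1], [0.8, 0.6, 0.5]]])  # green pixel, skin-tone pixel
    virtual_set = np.full((1, 2, 3), 0.3)                       # rendered background
    print(insert_presenter(presenter, virtual_set))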

Bridging the gap between live capture for broadcast and immersive content creation
BBC plans to bridge the gap between live broadcast and immersive content creation by developing technology for creating '2.5D' representations of foreground action. This involves using deep learning for image segmentation and inferring depth from camera feeds, enabling simultaneous production for live audiences, TV, and XR delivery. Integration with the MAX-R pipeline will facilitate cloud-based processing.
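
The sketch below illustrates the '2.5D' idea under simple assumptions: combine a foreground segmentation mask with an inferred depth map to back-project a camera feed into a coloured point layer that an XR client can render with parallax. The segmentation and depth values here are stand-ins for the deep-learning models described above.

    import numpy as np

    def to_point_layer(rgb, depth, mask, fx=1000.0, fy=1000.0):
        """Back-project masked pixels to camera-space 3D points (pinhole model)."""
        h, w = depth.shape
        cx, cy = w / 2, h / 2
        v, u = np.nonzero(mask)          # pixel coordinates of the foreground
        z = depth[v, u]
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        return np.stack([x, y, z], axis=1), rgb[v, u]

    h, w = 8, 8
    rgb = np.random.rand(h, w, 3)
    depth = np.full((h, w), 3.0)         # pretend the model inferred 3 m everywhere
    mask = np.zeros((h, w), dtype=bool)
    mask[2:6, 2:6] = True                # pretend segmentation found this region
    points, colours = to_point_layer(rgb, depth, mask)
    print(points.shape, colours.shape)   # (16, 3) points with matching colours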

Massively interactive live events: intimacy at scale
Improbable plans to create a framework for game engines that enables massively interactive XR experiences. They aim to provide intimate interaction at scale, allowing participants to see, hear, and interact with each other and performers in a virtual event space, unlike the restricted experiences of current videogame events. This involves rendering individualized customizations and animation states, and providing aggregated audio inputs for a realistic crowd atmosphere.
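
One plausible shape for the audio side of "intimacy at scale" is sketched below: the few nearest participants are mixed individually, while everyone else is folded into a single aggregated crowd bed, so per-listener cost stays bounded as attendance grows. This is an assumption-laden illustration, not Improbable's framework.

    import numpy as np

    def mix_for_listener(streams: np.ndarray, distances: np.ndarray, k_near: int = 8) -> np.ndarray:
        """streams: (n_participants, n_samples); distances: (n_participants,)."""
        order = np.argsort(distances)
        near, far = order[:k_near], order[k_near:]
        # Nearby voices: individually distance-attenuated for intimacy.
        near_mix = sum(streams[i] / (1.0 + distances[i]) for i in near)
        # Distant crowd: one cheap aggregate, however many participants attend.
        crowd_bed = streams[far].mean(axis=0) * 0.2 if len(far) else 0.0
        return near_mix + crowd_bed

    streams = np.random.randn(10_000, 480)         # 10,000 participants, 10 ms at 48 kHz
    distances = np.random.uniform(1, 100, 10_000)
    out = mix_for_listener(streams, distances)
    print(out.shape)                                # (480,) one mixed buffer per listener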

XR-enabled browser-based tools
UPF-GTI aims to transform WebGLStudio into WebXRstudio, enabling the creation of XR content directly in XR-enabled browsers. This involves updating backend and frontend elements to future-proof the tools, implementing XR-enabled interactions with a WYSIWYG-like paradigm, and focusing on experience development in line with game engines. The backend will also be updated to support emerging APIs, including those developed in the MAX-R pipeline.

Open-source scene data and animation exchange for collaborative real-time XR production
Filmakademie Baden-Württemberg plans to address data transfer and synchronization challenges in XR production by developing an XR Data Hub and an open-source AnimHost. The XR Data Hub will facilitate communication between open-source and proprietary tools, enabling real-time collaboration and scene synchronization. The open-source AnimHost will support animation exchange and direction in real-time environments, connecting animation generators to various applications and tools within the XR pipeline.
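
A minimal sketch of the hub idea (not the TRACER wire format): tools publish small parameter updates instead of whole scenes, and every subscribed client applies them to its local copy, keeping heterogeneous tools in sync.

    import json

    class SceneHub:
        """Toy in-process stand-in for a networked publish/subscribe data hub."""
        def __init__(self):
            self.subscribers = []

        def subscribe(self, apply_fn):
            self.subscribers.append(apply_fn)

        def publish(self, update: dict):
            payload = json.loads(json.dumps(update))  # stand-in for network transport
            for apply_fn in self.subscribers:
                apply_fn(payload)

    # Two clients (say, a DCC tool and a game engine) each hold a scene copy.
    scene_a, scene_b = {}, {}
    hub = SceneHub()
    hub.subscribe(lambda u: scene_a.update({u["object"]: u["transform"]}))
    hub.subscribe(lambda u: scene_b.update({u["object"]: u["transform"]}))

    hub.publish({"object": "camera_main", "transform": [0.0, 1.7, -5.0]})
    print(scene_a == scene_b)  # True: both copies now agree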

Post-Covid live performances (including scalable large-area tracking) to challenge and test XR pipelines
The MAX-R project aims to challenge and test XR pipelines through live performances involving partners CREW, BBC, and Improbable, together with third parties. They seek to embed empathy, intimacy, and interactivity into XR experiences, particularly for post-Covid live performances. CREW will design virtual and real worlds connected through XR for public, hybrid, and virtual events, while BBC will lead testing and evaluation, aiming to engage remote audiences as if they were physically present. The University of Hasselt will develop a large-area global tracking system to ensure accurate tracking across large-scale XR environments.
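
One ingredient of such a system can be sketched as follows, under the assumption that each local tracking cell is calibrated against a shared venue frame by a rigid 4x4 transform (an illustration of the coordinate handling, not the Hasselt system itself):

    import numpy as np

    def local_to_global(cell_calibration: np.ndarray, local_pose: np.ndarray) -> np.ndarray:
        """Compose two 4x4 homogeneous transforms to get the global pose."""
        return cell_calibration @ local_pose

    # Cell B is calibrated as sitting 10 m along x from the global origin.
    cell_b = np.eye(4)
    cell_b[0, 3] = 10.0
    # A headset tracked 2 m along x within cell B...
    pose_local = np.eye(4)
    pose_local[0, 3] = 2.0
    # ...is at 12 m along x in the shared global frame.
    print(local_to_global(cell_b, pose_local)[:3, 3])  # [12.  0.  0.]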