
Previz for On-set Production - Adaptive Realtime Tracking

Periodic Reporting for period 1 - POPART (Previz for On-set Production - Adaptive Realtime Tracking)

Reporting period: 2015-01-01 to 2016-06-30

European filmmakers are well-known for their creative innovation in selecting themes, their original way of storytelling and their art of cinematography. Like elsewhere in the world, they use computer-generated visual effects to achieve the intended visual experience and quality.

Good interaction between the creative team on set, the post-production team, the director and the producer requires a common understanding between the team filming real scenes and the team working on visual effects. Pre-visualization (previz) was developed for this purpose. Today, previz contributes to film making from the planning stage to filming with actors on set, where the final composition of mixed-reality scenes can be reviewed during and right after the shoot.

Large film studios that produce films with the highest budgets take the final step and integrate previz with the final post-production. For this, they develop in-house tools or cooperate with a company that is dedicated to the previz concept. This integration remains unavailable to most filmmakers and provides a strong competitive advantage. Without the integration, a gap remains between the team filming real scenes and the team working on visual effects in post-production.

POPART will introduce an affordable and highly customizable solution that will disrupt the market and overcome this lack of competition. It will democratize access to a complete previz solution, integrated into the pipeline from shooting preparation to post-production, which does not yet exist on the market. In contrast to the lock-in solutions on the market, all core elements required for real-time previz will be released as open source and based on open standards.

In POPART, raw data are collected once on set and used in all steps of the production process. The same sequence of processing steps, differing in algorithmic complexity, is used in previz and post-production. The former allows better creative decisions on set, while the latter is streamlined with the collected data.
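As a purely illustrative sketch (with hypothetical names, not the project's actual interface), the idea can be expressed as one pipeline entry point parameterised by a quality preset, so previz and post-production share the same code and the same collected data:

    #include <cstddef>
    #include <string>

    // Hypothetical quality preset: the same pipeline code runs on set and in
    // post-production, only the algorithmic effort differs.
    struct PipelineSettings {
        std::size_t maxFeaturesPerFrame;  // how many features to extract
        int         bundleIterations;     // refinement effort
        bool        subPixelRefinement;   // extra accuracy for the final composite
    };

    // Fast settings for live previz on set.
    constexpr PipelineSettings kPrevizPreset{2000, 5, false};
    // Exhaustive settings for the final post-production solve.
    constexpr PipelineSettings kPostProductionPreset{20000, 100, true};

    // The same entry point is used in both contexts; only the preset changes.
    void trackCamera(const std::string& footagePath, const PipelineSettings& settings);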

The open-source contributions of POPART will build on existing open-source libraries and constitute an important advance over the state of the art.

The product released by POPART will comprise hardware and software components. It will be available at an affordable price, directly help improve the processes of creative teams in Europe, and stimulate research by providing core libraries as open source.
During the course of the POPART project, we have developed an affordable live previz system that relies on open standards and new open source libraries.
The consortium has developed a large number of software and hardware components, some of which can be used for multiple purposes; many of them have already been released as open source, allowing the community at large to take advantage of the technologies we have developed. The development effort has been significant. While mostly in line with the expectations of Annex 1, some modifications were necessary, including extending components with features not envisioned in the proposal once they were deemed beneficial to the project outcome. One prominent modification was adding wireless access to the witness cameras through a new component called "HAL". The acquisition system was described as a simple cable in Annex 1, similar to what competing solutions have available on the market to this day. Developing a wireless acquisition system was significantly more complex than planned, but we were able to do so within the project's resources.

The technology developed in POPART enables an all-in-one solution for capturing a scene, placing virtual assets and set extensions, and providing live previz. Compared to competing systems we provide wireless operation, allowing a freedom of movement that is almost a requirement on sets these days, an integrated workflow from 3D reconstruction to post-production, and, not least, a high-quality camera track. A major goal for us is to provide localization data of high enough quality that it can be used for deferred rendering on set and even directly in the final composite during post-production rendering. We have developed a plugin for compositing software using the OpenFX standard (supported by almost all compositing packages) and have tested it in Nuke.
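For readers unfamiliar with OpenFX, a minimal image-effect plugin skeleton is sketched below; the identifier and the empty action handlers are placeholders, not the actual ofxMVG code. A host such as Nuke discovers the plugin through two exported symbols and drives it through a single entry point.

    // Minimal OpenFX plugin skeleton (illustrative only). Requires the OpenFX
    // headers from http://openeffects.org.
    #include "ofxCore.h"
    #include "ofxImageEffect.h"

    #include <cstring>

    static OfxHost* gHost = nullptr;

    // The host hands us its suites through this callback.
    static void setHost(OfxHost* host) { gHost = host; }

    // Every OpenFX action (describe, render, ...) arrives through one entry point.
    static OfxStatus pluginMain(const char* action, const void* /*handle*/,
                                OfxPropertySetHandle /*inArgs*/,
                                OfxPropertySetHandle /*outArgs*/)
    {
        if (std::strcmp(action, kOfxActionDescribe) == 0) {
            // Declare contexts, clips and parameters here.
            return kOfxStatOK;
        }
        if (std::strcmp(action, kOfxImageEffectActionRender) == 0) {
            // Fetch source images and write the processed result here.
            return kOfxStatOK;
        }
        return kOfxStatReplyDefault;  // let the host apply its defaults
    }

    static OfxPlugin examplePlugin = {
        kOfxImageEffectPluginApi, 1,              // plugin API and API version
        "eu.popart.example.cameraLocalization",   // hypothetical identifier
        1, 0,                                     // plugin version
        setHost, pluginMain
    };

    // The two symbols every OpenFX host looks for in the plugin bundle.
    OfxExport int OfxGetNumberOfPlugins(void) { return 1; }
    OfxExport OfxPlugin* OfxGetPlugin(int nth) { return nth == 0 ? &examplePlugin : nullptr; }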

The components developed have been continuously tested and evaluated on real film productions. Components such as OpenMVG, MayaMVG and Meshroom are continuously used by MIK in their productions, while HAL and related tools were successfully used in an external NRK production together with LAB in January 2016. We presented key parts of the system at NAB 2016 in Las Vegas. The full previz system was first demonstrated publicly at the annual short film festival in Grimstad, June 2016. Further external productions with POPART technology are already being planned.

We have organized communication activities through the website, conferences (paper [Calvet 2016] presented at CVPR 2016, paper [Loviska 2016] presented at MMSys 2016), professional shows (NAB, Kortfilmfestivalen Grimstad), and open source contributions, as anticipated in the project proposal.

The open source repositories are structured with sufficient API and build information:
OpenMVG: the 3D reconstruction software used in this project. The camera tracking library described in the DoA has been merged into this system, as well as all the improvements to the different reconstruction pipelines. (A minimal loading sketch follows this list.)
https://github.com/poparteu/openMVG

CCTag: a new open-source marker detection and identification library, developed and publicly released.
https://github.com/poparteu/CCTag

MayaMVG: a new 3D photo-modeling plugin for Maya.
https://github.com/poparteu/mayaMVG

ofxMVG: new open-source camera localization and camera calibration plugins for compositing software that supports the OpenFX plugin standard (http://openeffects.org), such as Nuke (The Foundry).
https://github.com/poparteu/openMVG.ofx

PopSift: a real-time SIFT implementation on GPU, developed and released.
https://github.com/poparteu/popsift

Camera tracking datasets: open-access virtual and mixed reality datasets.
https://zenodo.org/search?ln=en&cc=datasets&p=popart&action_search=
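As an entry point for developers, a minimal sketch of reading a reconstruction produced with the OpenMVG repository above might look as follows. It assumes openMVG's public SfM_Data I/O interface (sfm_data.hpp / sfm_data_io.hpp); exact header paths and API details may differ between versions, and this is not the POPART pipeline itself.

    #include <openMVG/sfm/sfm_data.hpp>
    #include <openMVG/sfm/sfm_data_io.hpp>

    #include <iostream>

    int main(int argc, char** argv)
    {
        if (argc < 2) {
            std::cerr << "usage: " << argv[0] << " sfm_data.json\n";
            return 1;
        }

        using namespace openMVG::sfm;

        SfM_Data sfm_data;
        // Load views, intrinsics, poses and 3D structure produced by the
        // openMVG reconstruction binaries.
        if (!Load(sfm_data, argv[1], ESfM_Data(ALL))) {
            std::cerr << "Cannot read " << argv[1] << "\n";
            return 1;
        }

        std::cout << sfm_data.GetPoses().size() << " localized cameras\n";
        for (const auto& pose_it : sfm_data.GetPoses()) {
            // Pose3 stores rotation and center; print the camera center in
            // the coordinate frame of the reconstructed set.
            std::cout << "view " << pose_it.first << " center: "
                      << pose_it.second.center().transpose() << "\n";
        }
        return 0;
    }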


[Loviska 2016] Milan Loviska, Otto Krause, Herman A. Engelbrecht, Jason B. Nel, Gregor Schiele, Alwyn Burger, Stephan Schmeißer, Christopher Cichiwskyj, Lilian Calvet, Carsten Griwodz, and Pål Halvorsen. 2016. "Immersed gaming in Minecraft". In Proceedings of the 7th International Conference on Multimedia Systems (MMSys '16). ACM, New York, NY, USA, Article 32, 4 pages.
DOI: http://dx.doi.org/10.1145/2910017.2910632

[Calvet 2016] Lilian Calvet, Pierre Gurdjos, Carsten Griwodz, and Simone Gasparini. "Detection and Accurate Localization of Circular Fiducials under Highly Challenging Conditions". In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 562-570.
Regarding the technical and creative impacts, we have achieved the objectives set out in the action: to provide a set of tools designed to aid productions in a meaningful way. Some of the components developed in the project have already been used in third-party productions at both MIK and LAB, with more planned over the next months.
The model-based camera tracking of POPART adopts a different approach from classical tracking (SLAM) techniques: the tracker localizes the camera within the "3D visual database" of the scene built during the pre-shooting step, and therefore does not suffer from the well-known problems of sequential camera tracking techniques. The 3D reconstruction of the set requires neither extensive laser surveying nor expensive sensors. The initial objective of using commodity hardware to keep our solution affordable for a wide spectrum of productions has been maintained.
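The principle can be illustrated with a short sketch. Here OpenCV's generic PnP solver is used purely as a stand-in for the project's own localization code: 2D features detected in a live frame are matched against 3D points of the pre-built visual database, and the camera pose is estimated from these 2D-3D correspondences, independently for each frame.

    // Illustrative only: pose from 2D-3D correspondences, the core idea behind
    // localization against a pre-built 3D database. OpenCV's solvePnPRansac is
    // a generic stand-in for the openMVG-based solver used in the project.
    #include <opencv2/calib3d.hpp>
    #include <opencv2/core.hpp>

    #include <vector>

    // points3d: database points matched to the current frame.
    // points2d: their detected positions in the live camera image.
    // K:        3x3 intrinsic matrix of the main camera (from calibration).
    bool localizeFrame(const std::vector<cv::Point3f>& points3d,
                       const std::vector<cv::Point2f>& points2d,
                       const cv::Mat& K,
                       cv::Mat& rvec, cv::Mat& tvec)
    {
        if (points3d.size() < 4 || points3d.size() != points2d.size())
            return false;

        // RANSAC rejects wrong 2D-3D matches; rvec/tvec give the camera pose
        // relative to the reconstructed set, ready for placing virtual assets.
        std::vector<int> inliers;
        return cv::solvePnPRansac(points3d, points2d, K, cv::noArray(),
                                  rvec, tvec,
                                  /*useExtrinsicGuess=*/false,
                                  /*iterationsCount=*/200,
                                  /*reprojectionError=*/4.0f,
                                  /*confidence=*/0.99,
                                  inliers);
    }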
Having a 3D model of the set available is useful even for basic set extensions or for visual effects tied to this 3D knowledge of the scene (such as props or character add-ons). The software toolset of POPART, especially the evolution of OpenMVG, allows different levels of precision for the 3D reconstruction as well as for the estimation of the camera pose needed for on-set or off-set camera tracking. This algorithmic consistency, with varying levels of precision and quality depending on the required rendering speed, also greatly simplifies the integration of these tools into the different steps of the production pipeline. Another pipeline integration between shooting and post-production provided by POPART is the synchronised on-set recording of H.264 movies from the main camera, which meaningfully supports the editing process.

Regarding the economic and strategic impacts, we do not know the exact economic impact of the results at the end of the project. We can, however, say that the economic impact as described in the proposal still appears reasonable at the conclusion of the project. Additionally, we have released a number of the components as open source software, with a strategic impact already observable, including one component that was not anticipated by the proposal: PopSift. In the proposal we had expected to use an existing GPU SIFT implementation, but we found none with adequate performance, so we wrote our own and later released it as open source software.