To bridge the gap between the virtual and real worlds, including interactive broadcast services and high-end linear production (movie production), through digital analysis of reality and advanced, real-time motion / position / [e-motion] feedback.
The project intends to define, develop, implement and integrate advanced tools and techniques for a synergetic, interactive, symmetric and seamless mixing of reality and virtuality in linear (script-based) and non-linear (impromptu) production.
This will open the way to a novel WYSIWYG approach to production, giving the director back full control over the filming of live (real and virtual) action.
ORIGAMI intends to define, develop, implement and integrate advanced tools and techniques for a synergetic, interactive, symmetric and seamless high-quality mixing of reality and virtuality in linear and non-linear production. The project proposes a synergetic integration of technologies for the virtualisation and authoring of 3D environments, suitable for a wide range of extreme modelling situations and set extensions. It also proposes solutions for mixing real and synthetic live action in a virtual studio with off-line real and synthetic content, while offering a real-time preview of the final result. ORIGAMI will produce an integrated software package that will enable the user to perform a variety of tasks, such as set planning, set extension and real-time direction, in a simple, intuitive and cost-effective fashion.
The project will last 30 months and will be organised into seven workpackages.
Project coordination, within WP0, will be carried out in synergy and concertation with other projects working in similar research areas within and outside the IST programme.
WP1 will be devoted to the definition of the system specification, to the integration of all software modules within a common, flexible environment, and to providing the user with a friendly and intuitive graphical interface.
Technical workpackages WP2 to WP5 will develop software modules for the off-line creation of a virtualised extended set, and for managing the interactions between real, virtual and off-line content and live action.
WP2 will focus on the implementation of advanced and robust camera motion tracking techniques.
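The document does not detail the tracking algorithms WP2 will use. As an illustrative sketch only, the rigid camera motion between two frames can be recovered from corresponding tracked 3D points with the SVD-based Kabsch algorithm; the function name below is hypothetical, not part of the ORIGAMI toolset:

```python
import numpy as np

def estimate_camera_motion(p_prev, p_curr):
    """Estimate rotation R and translation t such that p_curr ~ R @ p_prev + t,
    from two (N, 3) arrays of corresponding 3-D points (Kabsch algorithm)."""
    c_prev = p_prev.mean(axis=0)
    c_curr = p_curr.mean(axis=0)
    # Cross-covariance of the centred point clouds.
    H = (p_prev - c_prev).T @ (p_curr - c_curr)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against a reflection instead of a rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_curr - R @ c_prev
    return R, t
```

In a real tracker the correspondences would come from feature matching and be filtered for outliers (e.g. with RANSAC); this sketch assumes clean, already-matched points.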
WP3 will deal with object and scene reconstruction based on geometric surface estimation.
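The reconstruction methods of WP3 are not specified here. A minimal example of geometric surface estimation, assuming a point cloud already extracted from images, is a least-squares plane fit via SVD (the function name is hypothetical):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through an (N, 3) point cloud.
    Returns (centroid, unit normal); the normal is the right-singular
    vector for the smallest singular value of the centred cloud."""
    centroid = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - centroid)
    normal = Vt[-1]
    return centroid, normal
```

Full scene reconstruction would fit many such local surface patches (or higher-order surfaces) and stitch them into a mesh; the plane fit is only the simplest instance of the idea.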
Activities in WP4 will focus on light-field estimation for the plenoptic modelling of scenes, and on the estimation of the radiometric model associated with viewed surfaces. WP4 also includes methods for light-field calibration and illumination correction, to be employed in virtual studio applications.
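As a toy instance of radiometric model estimation, assuming a purely Lambertian surface with unit albedo and a single distant light source (a much simpler setting than the project's plenoptic modelling), the light direction can be recovered from surface normals and observed intensities by least squares; the function name is hypothetical:

```python
import numpy as np

def estimate_light_direction(normals, intensities):
    """Least-squares estimate of a distant light vector l under the
    Lambertian model I = n . l, given (N, 3) unit normals and (N,)
    observed intensities (no shadowing or clipping assumed)."""
    l, *_ = np.linalg.lstsq(normals, intensities, rcond=None)
    return l
```

Once the light is known, the same model can drive illumination correction, e.g. relighting virtual objects so they match the real studio lighting.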
WP5 will be devoted to managing the interactions between real and virtual actors in an extended (virtual) set.
All activities related to setting up the extended virtual studio will be included in this WP, as well as the management of positional, symbolic and emotional feedback between the real and the virtual, and vice versa.
Finally, WP6 will include all activities related to the organisation of intermediate and final demonstrations in the three application scenarios of interest.
As part of this activity, at the end of the project, the ORIGAMI consortium will organise a workshop for the public demonstration and dissemination of the achieved results. A short film will be submitted for public exhibition at international events such as the SIGGRAPH Electronic Theater.
1) Research and develop a set of tools to analyse, digitise and edit real environments and actors.
2) Research and develop software and hardware solutions for real-time actor feedback.
3) Provide an application environment with a consistent user interface across all the tools, allowing the end user to put the developed tools to productive use.
4) Field-test the results, producing both a simulation of a real on-line TV production and a set of short demo movie productions.
Funding Scheme: CSC - Cost-sharing contracts