Project description
Intelligent Content and Semantics
'Virtual Film Factory'
iMP will create architecture, workflow and applications for intelligent metadata-driven processing and distribution of digital movies and entertainment. These will enable a ‘Virtual Film Factory’ in which creative professionals can work together to create and customise programmes from Petabyte-scale digital repositories, using semantic technologies to organise data and drive its processing.
Challenge
iMP proposes a radical extension to the use of metadata, linking it to semantic technologies to support and enhance the creative processes, to unify the treatment of sound and image, and remove the barriers between postproduction, customisation, formatting and distribution.
Innovation
The outcomes will be five major innovations in the form of:
• Petabyte-scale data management, processing and storage systems for the digital film and content industries, which eliminate the unnecessary storage and rendering of transitory data
• Creative applications that allow changes made to any of the principal processes (including editing, graphics, compositing and grading) to be applied automatically across the workflow and viewed interactively, together as a whole rather than separately and sequentially
• Work processes that make it possible for audio and video to be processed together and interact with each other, so that changes in the image geometry are automatically reflected as changes in the audio spatialisation
• Distribution systems that can automatically optimise digital content to individual theatres or playout platforms, and customise versions for different audiences
• A Virtual Film Factory that integrates all the data storage and management, postproduction and distribution technologies, tools and processes in a distributed networked system.
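The coupling of image geometry and audio spatialisation described above can be illustrated with a minimal sketch. This is a hypothetical example, not part of the iMP specification: it assumes an object's horizontal screen position is tracked as metadata, and derives constant-power stereo pan gains from it, so that a reframe or regrade that moves the object automatically re-spatialises the sound.

```python
import math

def pan_gains(screen_x: float) -> tuple[float, float]:
    """Constant-power stereo pan from a normalised screen position.

    screen_x: horizontal object position, 0.0 (left edge) to 1.0 (right edge).
    Returns (left_gain, right_gain) with left**2 + right**2 == 1.
    """
    theta = screen_x * math.pi / 2  # map [0, 1] onto [0, pi/2]
    return math.cos(theta), math.sin(theta)

# If a metadata update moves the object from centre to the right edge,
# the same update drives the audio re-spatialisation:
for x in (0.0, 0.5, 1.0):
    left, right = pan_gains(x)
    print(f"x={x:.1f}  L={left:.3f}  R={right:.3f}")
```

A full system would map to a three-dimensional audio representation rather than a stereo pan, but the principle is the same: the spatialisation is derived from shared metadata, not baked into the soundtrack.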
Impact
iMP will produce a step change in the efficiency of postproduction and distribution, along with improved methods of producing finished versions of media for delivery to the correct destination. It will support production of higher-quality media by allowing professional craftspeople in postproduction to track different versions easily and accurately. It will also support the automatic production of correctly versioned and formatted material for playout in cinemas and public spaces, in terms of resolution, aspect ratio, screen size and throw, version, sound format, language, subtitling, and other parameters. iMP will reduce cost for distributors, and provide audiences with versions of media tailored to their needs.
iMP will allow the easy and cost-effective creation of more variations of material, increasing consumer choice, and ultimately audiences and box office take. iMP seeks to complement the highly developed skills that professional creative operatives possess with support from machine skills in the day-to-day housekeeping that operators find tedious. It will enable a closer coupling of all of the stages of production of compelling material and will make it possible to create media in which the sound and image respond to each other. As well as providing more compelling and realistic representations of the real world, the interaction of sound and image will create opportunities for artistic expression that do not currently exist.
iMP will apply intelligent technologies via metadatabases to the vast number of versions, edits, grades, languages, subtitles and other media objects required in the production of high-quality finished media. iMP will improve the quality and reduce the cost of media postproduction by allowing the professional operator to choose efficiently the ‘right’ edit with the ‘right’ grade, with the ‘right’ acoustics on the soundtrack, independent of whether they have actually assembled this version before.
By separating metadata from essence and controlling all image and sound processing operations from the metadata layer, iMP can keep the underlying data library unchanged while enabling a new generation of more flexible applications. This radically reduces the amount of data created: new versions, grades or language releases result only in additional metadata, not new data files. The system will support a more automated workflow for content distribution, from postproduction through the assembly, distribution and playout of multiple variations of programmes in different formats and locations. Outcomes will be:
• An infrastructure in which multi-Petabyte data stores are managed by persistent metadata in a distributed metadatabase.
• Real-time interaction with media sequences selected from the data store. When an application changes a sequence, the commands are stored as a new set of metadata; the data remains unchanged in the store, while viewing and listening sequences are rendered 'on the fly'.
• Semantic instruction sets to define associations between sequences of essence, the processes applied, and the uses to which they are put.
• Integration of currently separate processes (such as grading, CGI, audio effects and editing, version creation, previewing and viewing, customisation, mastering, distribution) in a 'Virtual Film Factory'.
• Changes to the video space will transfer to changes in a three-dimensional audio representation.
• Automated adjustment of video and audio to the physical characteristics, acoustics or screen size of the viewing environment.
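The non-destructive, metadata-driven versioning described above can be sketched in a few lines. This is an illustrative toy, with hypothetical names (`Version`, `grade`, `lift`), not the iMP architecture itself: each version is only a list of operations held as metadata, replayed over the unchanged essence at viewing time.

```python
from dataclasses import dataclass, field

@dataclass
class Version:
    """A version is metadata only: a named list of (operation, parameter) pairs."""
    name: str
    ops: list = field(default_factory=list)

    def derive(self, name, *new_ops):
        """Deriving a version adds metadata; no essence is copied or rewritten."""
        return Version(name, self.ops + list(new_ops))

    def render(self, frame):
        """Apply the stored operations to a frame 'on the fly'."""
        for op, arg in self.ops:
            frame = op(frame, arg)
        return frame

# Toy 'essence': a frame as a list of pixel luminance values.
essence = [10, 20, 30]

# Toy processing operations driven from the metadata layer.
def grade(frame, gain):
    return [round(v * gain) for v in frame]

def lift(frame, offset):
    return [v + offset for v in frame]

master = Version("master")
graded = master.derive("graded", (grade, 1.5))
bright = graded.derive("bright", (lift, 5))

print(bright.render(essence))  # rendered view of the 'bright' version
print(essence)                 # the stored essence is unchanged
```

The design point is that `derive` creates a new grade or language release at the cost of a few metadata records, while the Petabyte-scale store is touched only at render time.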
Call for proposal
FP7-ICT-2007-3
Funding Scheme
CP - Collaborative project (generic)
Coordinator
08002 Barcelona
Spain