
Smart Asset re-Use in Creative Environments

Deliverables

Final Dissemination Report

Final Dissemination Report (R, PU, M36). Report on dissemination activities for Years 1-3 and plans for further dissemination of results. It will include links to the academic publications produced in the course of the project.

Lightfield Assets for SAUCE

Lightfield Assets for SAUCE (R, PU, M18) is the set of captured assets in 4D or 5D lightfield format, stored in an asset inventory accessible to all SAUCE partners, together with the documentation of this inventory.

Report on Experimental Production Scenario Results

Report on Experimental Production Scenario Results (R, PU, M36). Documentation of the production from beginning to end (technical, creative and coordination details), with a post-mortem section specifying all the best practices followed in the experimental production. It includes a written report and visual material (a short film).

Project Handbook and Quality Plan

Project Handbook and Quality Plan (R, PU, M2) contains general operational information including contact details, reporting processes, communication protocols, document templates, numbering conventions, logos, etc. Quality plan elements include information about technical quality control, risk logs, and the procedures for risk monitoring, decision-making and conflict resolution.

Combined Evaluation Report

Combined Evaluation Report (R, PU, M36). The industry partners (DNeg, FO & IK) will provide the results of all production evaluation tasks for SAUCE.

Specification Report

Specification Report (R, PU, M36). Draft specification of SAUCE tools and example descriptors.

Report on environment adaptation B

Report on environment adaptation B (R, PU, M30) covers the set of image processing tools that adapt the color and contrast appearance of content to display characteristics, viewing conditions, and the individual characteristics of the observer.

Demo of Transcode Mechanism

Demo of Transcode Mechanism (DEM, PU, M27) complying with the requirements of WP5T6.

Animation graph traversal optimisation

Animation graph traversal optimisation (DEM, PU, M24). This will be a technical demo showing the dynamic traversal of a constructed animation graph depending on user commands and the in-world state at a given time. The demonstration will consist of two levels of control determining the path through an animation graph. The first level will be a response to an input voice command and the second will be the real time modification of the path through the animation graph depending on in-world stimulus and conditions.
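The two-level control described above can be sketched as a toy graph traversal. Everything below (the graph, the clip names, the `obstacle` flag, the `next_clip` function) is a hypothetical illustration under assumed names, not the SAUCE implementation:

```python
from collections import deque

# Toy animation graph: nodes are clips, edges are allowed transitions.
ANIM_GRAPH = {
    "idle": ["walk", "wave"],
    "walk": ["run", "idle"],
    "run":  ["walk", "jump"],
    "wave": ["idle"],
    "jump": ["walk"],
}

def next_clip(current, target, world_state, graph=ANIM_GRAPH):
    """Pick the next clip: level 1 heads toward the clip chosen from a
    voice command; level 2 lets in-world state override that path."""
    # Level 2: real-time override from in-world stimulus.
    if world_state.get("obstacle") and "jump" in graph[current]:
        return "jump"
    # Level 1: shortest-path step toward the commanded target (BFS).
    queue, seen = deque([(current, [])]), {current}
    while queue:
        node, path = queue.popleft()
        if node == target:
            return path[0] if path else current
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return current  # target unreachable: hold the current clip

# A "go" voice command mapped to the target clip "run":
state, steps = "idle", []
for _ in range(3):
    state = next_clip(state, "run", {"obstacle": False})
    steps.append(state)
print(steps)  # → ['walk', 'run', 'run']
```

The two levels stay decoupled: the in-world check runs before path planning, so a stimulus can always pre-empt the commanded route.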

Virtual Production prototype toolkit

Virtual Production prototype toolkit (DEM, PU, M24). Demo of an exemplary Semantic Descriptor based on an Interface Description (delivered internally at M12), in an interactive (virtual) production environment.

Motion Stylization

Motion Stylization Implementation (DEM, PU, M24). Algorithms for style transfer and motion stylization: stylization of basic upper-torso gestures and stylization of full-body motions, implemented within WebGLStudio.

Initial demo of tools to transform asset representation

Initial demo of tools to transform asset representation (DEM, PU, M15) will show asset types and their capabilities being recognized by the framework for linking together.

Final Showcase Demonstration

Final Showcase Demonstration (DEM, PU, M30). Final showcase demonstration at key industry event.

Crowd scene synthesis and metrics for quality evaluation

Crowd scene synthesis and metrics for quality evaluation (DEM, PU, M30). This will demonstrate an automatic crowd scene synthesis method and provide metrics for its quality evaluation.

Animation pipeline and demo of new runtime rig

Animation pipeline and demo of new runtime rig (DEM, PU, M6) will be a technical demo showcasing real-time Inverse Kinematics rig generation from an input of animation assets and an associated mesh and skeleton. The result is a virtual character whose animation can be procedurally modified at runtime based on the identification of key end-effectors and how the user wishes to drive them.

Working framework to handle relationship contexts between scene and people

Working framework to handle relationship contexts between scene and people (DEM, PU, M18). A framework for learning relationships and context between a scene and the people in it, and for synthesizing a scene with interacting people.

D5.9 Tools for synthesizing animation without a rig

• A tool to demonstrate an interpolation scheme, based on pose-space deformation, that produces more realistic results than current linear blending systems.
• A tool to allow for footstep cleanup of animation clips based on terrain (bumpy ground, stairs, etc.).

Tools for editing mo-cap data

Tools for editing mo-cap data (DEM, PU, M24).
• A tool that alters a character's path based on user input or an environment description.
• A tool that changes the style of an animation, or matches the styles of two animations so that they can be seamlessly combined.
• A tool that resolves collisions by applying minimal adjustments to animations already placed in a scene.

Basic framework to enable asset transform

Basic framework to enable asset transform (DEM, PU, M18).
• A framework for describing arbitrary sets of transformations.
• A set of tools that can perform the transformations and advertise their capabilities to a search engine and asset management system.
• Methods for indicating/specifying properties to drive the transformation and measure appropriateness.

Smart Search Prototype

Smart Search Prototype (DEM, PU, M30). The deliverable is a demonstration of a software application built on top of the search framework, which will allow a user to search for assets using three different modes of searching: by property, by tags and by descriptors. The search will also include the notion of related assets, allowing the user to find assets that indirectly match the search criteria.
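As a rough sketch of the three search modes and the related-asset expansion: the asset records, field names, and `search` signature below are invented for illustration and are not the SAUCE API:

```python
# Toy in-memory asset index with the three searchable facets plus
# explicit "related" links for indirect matches.
ASSETS = {
    "tree_01":    {"props": {"polycount": 1200},  "tags": {"vegetation"},
                   "descriptors": {"outdoor"},    "related": {"forest_set"}},
    "forest_set": {"props": {"polycount": 90000}, "tags": {"vegetation", "set"},
                   "descriptors": {"outdoor"},    "related": {"tree_01"}},
    "robot_rig":  {"props": {"polycount": 8000},  "tags": {"character"},
                   "descriptors": {"rigged"},     "related": set()},
}

def search(mode, value, include_related=False):
    """Search by 'property' (a (key, value) pair), 'tag', or 'descriptor';
    optionally expand the hits to directly related assets."""
    hits = set()
    for name, asset in ASSETS.items():
        if mode == "property" and value in asset["props"].items():
            hits.add(name)
        elif mode == "tag" and value in asset["tags"]:
            hits.add(name)
        elif mode == "descriptor" and value in asset["descriptors"]:
            hits.add(name)
    if include_related:
        for name in list(hits):
            hits |= ASSETS[name]["related"]  # indirect matches
    return sorted(hits)

print(search("tag", "set"))                        # → ['forest_set']
print(search("tag", "set", include_related=True))  # → ['forest_set', 'tree_01']
print(search("property", ("polycount", 8000)))     # → ['robot_rig']
```

The "related" hop is what lets a query surface assets that only indirectly match the criteria, as the deliverable describes.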

Tools to validate and upgrade assets

Tools to validate and upgrade assets (DEM, PU, M18). The set of tools will comply with the criteria of WP4T2.

Final Digital Asset Store Supporting Geolocation

Final Digital Asset Store Supporting Geolocation (DEM, PU, M33). Demonstrates the system, outlines the final destination of the stored assets, and shows its redundancy and high-availability capabilities.

Tools for splicing together animation clips

Tools for splicing together animation clips (DEM, PU, M24) covers various tools implementing methods from motion graph-related publications.

Prototype of Asset Store

Prototype of Asset Store (DEM, PU, M30). Demonstrates the storage and retrieval of assets using a prototype client application.

Smart Search Framework POC

Smart Search Framework POC (DEM, PU, M12). Proof-of-concept (POC) demonstration of how the framework and mock-up User Interface will work using a reduced set of descriptors.

Demo of transitional animation generation

This will be a technical demo in which animations will be generated on the basis of semantic constraints, with appropriate transitions so that they appear smooth and natural.

Semantic Labelling Toolbox

Semantic Labelling Toolbox (DEM, PU, M24). The deliverable is an application with a demonstration of the capabilities of WP4T3.

Project Website

Project Website (DEC, PU, M6). A high quality multimedia website where public deliverables and information will be housed.

Transcoders for SAUCE assets

Transcoders for SAUCE assets (OTHER, PU, M30) is the set of software tools to create and consume the SAUCE lightfield (LF) assets. Both the software and the assets will be made publicly available.

Accelerated Tools for Creating Smart Assets

Accelerated Tools for Creating Smart Assets (OTHER, PU, M24) is the set of accelerated versions of the tools introduced in D3.2, including the levels of acceleration achieved. The tools will be made available to the partners for integration into their respective asset management frameworks, and will eventually become publicly available.


Publications

Statistics of natural images as a function of dynamic range

Author(s): Antoine Grimaldi, David Kane, Marcelo Bertalmío
Published in: Journal of Vision, Issue 19/2, 2019, Page(s) 13, ISSN 1534-7362
DOI: 10.1167/19.2.13

A reevaluation of Whittle (1986, 1992) reveals the link between detection thresholds, discrimination thresholds, and brightness perception

Author(s): David Kane, Marcelo Bertalmío
Published in: Journal of Vision, Issue 19/1, 2019, Page(s) 16, ISSN 1534-7362
DOI: 10.1167/19.1.16

Light field compression by superpixel based filtering and pseudo-temporal reordering

Author(s): Harini Hariharan, Thorsten Herfet
Published in: International Conference on Consumer Electronics (ICCE), 2018

Color matching images with unknown non-linear encodings

Author(s): Raquel Gil Rodríguez, Javier Vazquez-Corral, Marcelo Bertalmío
Published in: IEEE Transactions on Image Processing (TIP), 2018

Photorealistic Style Transfer for Cinema Shoots

Author(s): Itziar Zabaleta, Marcelo Bertalmío
Published in: 2018 Colour and Visual Computing Symposium (CVCS), 2018, Page(s) 1-6
DOI: 10.1109/cvcs.2018.8496499

In-camera, Photorealistic Style Transfer for On-set Automatic Grading

Author(s): Itziar Zabaleta, Marcelo Bertalmío
Published in: SMPTE 2018, 2018, Page(s) 1-12
DOI: 10.5594/m001835

Influence of Ambient Chromaticity on Portable Display Color Appearance

Author(s): Trevor Canham, Michael J. Murdoch, David Long
Published in: SMPTE 2018, 2018, Page(s) 1-15
DOI: 10.5594/m001810

Convolutional Neural Networks Can Be Deceived by Visual Illusions

Author(s): Alexander Gomez-Villa, Adrian Martin, Javier Vazquez-Corral, Marcelo Bertalmio
Published in: The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2019