CORDIS provides links to public deliverables and publications of HORIZON projects.
Links to deliverables and publications from FP7 projects, as well as links to some specific result types such as datasets and software, are dynamically retrieved from OpenAIRE.
Deliverables
Final Dissemination Report (R, PU, M36). Report on dissemination activities for Years 1-3 and plans for further dissemination of results. It will include links to the academic publications produced in the course of the project.
Lightfield Assets for SAUCE (R, PU, M18) is the set of captured assets in 4D and 5D lightfield format, stored in an asset inventory accessible to all SAUCE partners, together with the documentation of this inventory.
Report on Experimental Production Scenario Results (R, PU, M36). Documentation of the production from beginning to end (technical, creative and coordination details), with a post-mortem section specifying all the best practices carried out in the experimental production. It includes a written report and visual material (a short film).
Project Handbook and Quality Plan (R, PU, M2) contains general operational information, including contact details, reporting processes, communication protocols, document templates and numbering methods, logos et cetera. Quality plan elements include information about technical quality control, risk logs and the procedures for risk monitoring, decision-making and conflict resolution.
Combined Evaluation Report (R, PU, M36). The industry partners (DNeg, FO and IK) will provide results of all production evaluation tasks for SAUCE.
Specification Report (R, PU, M36). Draft specification of SAUCE tools and example descriptors.
Report on environment adaptation B (R, PU, M30) covers a set of image processing tools that adapt the color and contrast appearance of content to display characteristics, viewing conditions and the individual characteristics of the observer.
Demo of Transcode Mechanism (DEM, PU, M27) complying with the requirements of WP5T6.
Animation graph traversal optimisation (DEM, PU, M24). This will be a technical demo showing the dynamic traversal of a constructed animation graph depending on user commands and the in-world state at a given time. The demonstration will consist of two levels of control determining the path through an animation graph: the first level will be a response to an input voice command, and the second will be the real-time modification of the path through the animation graph depending on in-world stimulus and conditions.
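The two-level traversal described above can be sketched roughly as follows. This is a hypothetical illustration: the node structure, condition names and `traverse` function are assumptions for the example, not the SAUCE implementation.

```python
# Sketch of two-level animation-graph traversal (illustrative only).
from dataclasses import dataclass, field

@dataclass
class AnimNode:
    clip: str
    # edges: condition or command name -> name of the next node
    edges: dict = field(default_factory=dict)

def traverse(graph, start, voice_command, world_state, max_steps=10):
    """Walk the graph: the voice command selects the first transition
    (level 1); in-world conditions drive the rest of the path (level 2)."""
    path = [start]
    node = graph[start]
    # Level 1: respond to the input voice command, if the graph allows it.
    if voice_command in node.edges:
        path.append(node.edges[voice_command])
        node = graph[path[-1]]
    # Level 2: in-world stimulus and conditions modify the path in turn.
    for _ in range(max_steps):
        next_clip = next((dst for cond, dst in node.edges.items()
                          if world_state.get(cond)), None)
        if next_clip is None:
            break
        path.append(next_clip)
        node = graph[next_clip]
    return path

graph = {
    "idle": AnimNode("idle", {"walk": "walk_start"}),
    "walk_start": AnimNode("walk_start", {"obstacle_ahead": "step_over"}),
    "step_over": AnimNode("step_over", {}),
}
print(traverse(graph, "idle", "walk", {"obstacle_ahead": True}))
```

Here the command "walk" picks the first branch, after which the in-world condition `obstacle_ahead` redirects the character through the step-over clip.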
Virtual Production prototype toolkit (DEM, PU, M24). Demo of an exemplary Semantic Descriptor based on an Interface Description, internally delivered in M12, in an interactive (virtual) production environment.
Motion Stylization Implementation (DEM, PU, M24). Algorithms for style transfer and motion stylization; stylization of basic upper-torso gestures; and stylization of full-body motions. Implementations within WebGLStudio.
Initial demo of tools to transform asset representation (DEM, PU, M15) will show asset types and their capabilities being recognized by the framework for linking together.
Final Showcase Demonstration (DEM, PU, M30). Final showcase demonstration at a key industry event.
Crowd scene synthesis and metrics for quality evaluation (DEM, PU, M30). This will demonstrate an automatic crowd-scene synthesis method and provide metrics for its quality evaluation.
Animation pipeline and demo of new runtime rig (DEM, PU, M6) will be a technical demo showcasing real-time Inverse Kinematics rig generation given an input of animation assets and the associated mesh and skeleton. This will result in a virtual character whose animation can be procedurally modified at runtime based on the identification of key end-effectors and how the user wishes to drive them.
Working framework to handle relationship contexts between scene and people (DEM, PU, M18). A framework for learning relationships and context between a scene and the people in it, and for synthesizing a scene with interacting people.
D5.9 Tools for synthesizing animation without a rig
• A tool to demonstrate an interpolation scheme, based on pose-space deformation, that produces more realistic results than current linear blending systems.
• A tool to allow footstep cleanup of animation clips based on terrain (bumpy ground, stairs etc.).
Tools for editing mo-cap data (DEM, PU, M24)
• A tool that alters a character's path based on user input or an environment description.
• A tool that changes the style of an animation, or matches the styles of two animations so that they can be seamlessly combined.
• A tool that resolves collisions by applying minimal adjustments to animations already placed in a scene.
Basic framework to enable asset transform (DEM, PU, M18)
• A framework for describing arbitrary sets of transformations.
• A set of tools that can perform the transformations and advertise their capabilities to a search engine and asset management system.
• Methods for indicating/specifying properties to drive the transformation and measure appropriateness.
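The capability-advertising pattern described above can be sketched as a small registry in which each transform tool declares the asset types it converts between, so a search or asset-management component can discover it. All names here (`register`, `find_transform`, the asset types) are illustrative assumptions, not the SAUCE framework's API.

```python
# Sketch of a transform registry where tools advertise their capabilities
# (hypothetical names; not the SAUCE implementation).
TRANSFORMS = []

def register(src_type, dst_type):
    """Decorator: record a tool's capability so it can be discovered."""
    def wrap(fn):
        TRANSFORMS.append({"src": src_type, "dst": dst_type, "fn": fn})
        return fn
    return wrap

@register("mesh", "pointcloud")
def mesh_to_pointcloud(asset):
    # Stand-in for a real conversion; returns a new asset description.
    return {"type": "pointcloud", "source": asset["name"]}

def find_transform(src_type, dst_type):
    """Look up an advertised tool able to perform the requested transform."""
    for t in TRANSFORMS:
        if t["src"] == src_type and t["dst"] == dst_type:
            return t["fn"]
    return None

tool = find_transform("mesh", "pointcloud")
print(tool({"name": "statue"}))
```

The point of the registry is that the set of transformations stays open-ended: a new tool only has to register its input and output types to become discoverable.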
Smart Search Prototype (DEM, PU, M30). The deliverable is a demonstration of a software application built on top of the search framework, which will allow a user to search for assets using three different modes of searching: by property, by tags and by descriptors. The search will also include the notion of related assets, allowing the user to find assets that indirectly match the search criteria.
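A minimal sketch of the three search modes and the related-asset notion described above. The data model (a flat dictionary of assets with `props`, `tags`, `descriptors` and `related` fields) is a hypothetical stand-in, not the SAUCE search framework.

```python
# Illustrative asset index (hypothetical data model).
assets = {
    "hero_walk": {"props": {"fps": 30}, "tags": {"character"},
                  "descriptors": {"locomotion"}, "related": {"hero_run"}},
    "hero_run":  {"props": {"fps": 60}, "tags": {"character"},
                  "descriptors": {"locomotion"}, "related": set()},
}

def search(by, value, include_related=False):
    """Search by one of three modes: 'property' (a key/value pair),
    'tag', or 'descriptor'; optionally expand to related assets."""
    hits = set()
    for name, a in assets.items():
        if (by == "property" and value in a["props"].items()) \
           or (by == "tag" and value in a["tags"]) \
           or (by == "descriptor" and value in a["descriptors"]):
            hits.add(name)
    if include_related:
        # Pull in assets that only indirectly match the criteria.
        for name in list(hits):
            hits |= assets[name]["related"]
    return sorted(hits)

print(search("property", ("fps", 30)))
print(search("property", ("fps", 30), include_related=True))
```

The second query returns the directly matching asset plus its related asset, illustrating how an indirect match can surface in the results.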
Tools to validate and upgrade assets (DEM, PU, M18). The set of tools will comply with the criteria of WP4T2.
Final Digital Asset Store Supporting Geolocation (DEM, PU, M33). Demonstrates the system, outlines the final destination location of the stored assets, and shows redundancy and high-availability capabilities.
Tools for splicing together animation clips (DEM, PU, M24) covers various tools implementing methods from motion graph-related publications.
Prototype of Asset Store (DEM, PU, M30). Demonstrates the storage and retrieval of assets using a prototype client application.
Smart Search Framework POC (DEM, PU, M12). Proof-of-concept (POC) demonstration of how the framework and mock-up user interface will work using a reduced set of descriptors.
Demo of transitional animation generation. This will be a technical demo in which animations will be generated on the basis of semantic constraints, with appropriate transitions so that they appear smooth and natural.
Semantic Labelling Toolbox (DEM, PU, M24). The deliverable is an application demonstrating the capabilities of WP4T3.
Project Website (DEC, PU, M6). A high quality multimedia website where public deliverables and information will be housed.
Transcoders for SAUCE assets (OTHER, PU, M30) is the set of software tools to create and consume the SAUCE LF assets. The software and the assets will be made publicly available.
Accelerated Tools for Creating Smart Assets (OTHER, PU, M24) is the set of accelerated versions of the tools introduced in D3.2, including the levels of acceleration achieved. The tools will be made available to the partners for integration into their respective asset management frameworks, and will eventually become publicly available.
Publications
Antoine Grimaldi, David Kane, Marcelo Bertalmío. Published in: Journal of Vision, Issue 19/2, 2019, Page(s) 13, ISSN 1534-7362. Publisher: Association for Research in Vision and Ophthalmology. DOI: 10.1167/19.2.13

David Kane, Marcelo Bertalmío. Published in: Journal of Vision, Issue 19/1, 2019, Page(s) 16, ISSN 1534-7362. Publisher: Association for Research in Vision and Ophthalmology. DOI: 10.1167/19.1.16

Harini Hariharan, Thorsten Herfet. Published in: International Conference on Consumer Electronics (ICCE), 2018. Publisher: IEEE

R. Gil Rodríguez, J. Vazquez-Corral, M. Bertalmío. Published in: Transactions on Image Processing (TIP), 2018. Publisher: IEEE

Itziar Zabaleta, Marcelo Bertalmío. Published in: 2018 Colour and Visual Computing Symposium (CVCS), 2018, Page(s) 1-6, ISBN 978-1-5386-5645-7. Publisher: IEEE. DOI: 10.1109/cvcs.2018.8496499

Itziar Zabaleta, Marcelo Bertalmío. Published in: SMPTE 2018, 2018, Page(s) 1-12, ISBN 978-1-61482-960-7. Publisher: IEEE. DOI: 10.5594/m001835

Trevor Canham, Michael J. Murdoch, David Long. Published in: SMPTE 2018, 2018, Page(s) 1-15, ISBN 978-1-61482-960-7. Publisher: IEEE. DOI: 10.5594/m001810

Alexander Gomez-Villa, Adrian Martin, Javier Vazquez-Corral, Marcelo Bertalmío. Published in: The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2019. Publisher: IEEE