Community Research and Development Information Service - CORDIS

FP5

ARIS Report Summary

Project ID: IST-2000-28707
Funded under: FP5-IST
Country: United Kingdom

Light reconstruction and light simulation in augmented reality image synthesis

The ability to merge synthetically generated objects into images of a real scene is becoming central to many applications of computer graphics, and in particular to mixed or augmented reality. In many situations the merging must be performed at rates of many frames per second if an illusion of interactivity is to be maintained.

Traditionally the competing requirements of real-time rendering and visual realism have meant that generating photo-realistic augmented images at interactive rates has been a distant goal.

Our contribution to this work package has been twofold:
- The development of a fully automatic and robust algorithm for capturing illumination data; and

- The development of a new lighting simulation algorithm that is able to composite synthetic objects into a background image at interactive rates, shading the object with illumination that is consistent with the real environment.

In contrast to previous work, we do not restrict the situations we consider to distant lighting (i.e. assuming that the lights are infinitely far from the object).
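To make the distinction concrete, the sketch below (illustrative Python, not project code) contrasts the two models: under the distant-lighting assumption the incident light depends only on direction, whereas in the near-field model it also depends on where in the scene the shaded point sits.

    import numpy as np

    def irradiance_distant(normal, light_dir, light_radiance):
        # Distant-light model: the light's direction and strength are the
        # same at every point on the object.
        return light_radiance * max(np.dot(normal, light_dir), 0.0)

    def irradiance_local(normal, point, light_pos, light_intensity):
        # Near-field model: direction and inverse-square falloff depend on
        # where the shaded point actually sits in the scene.
        to_light = light_pos - point
        dist = np.linalg.norm(to_light)
        return (light_intensity / dist**2) * max(np.dot(normal, to_light / dist), 0.0)

    # A lamp 0.5 m above the origin lights a nearby point far more strongly
    # than a point 3 m away; the distant-light model cannot express this.
    n = np.array([0.0, 0.0, 1.0])
    lamp = np.array([0.0, 0.0, 0.5])
    print(irradiance_local(n, np.array([0.0, 0.0, 0.0]), lamp, 10.0))  # ~40.0
    print(irradiance_local(n, np.array([3.0, 0.0, 0.0]), lamp, 10.0))  # ~0.18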

To reconstruct an illumination map we require as input a single digital image of the light-probe and calibration grid, and an approximate geometric model of the scene. The algorithm then automatically estimates the pose of the camera and the position of the light-probe; deals with the non-linearities in the input images; corrects for poor camera white balance; and maps the reconstructed lighting data onto the geometric scene model for use by the rendering techniques that have been developed.
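One concrete step of this process is mapping each light-probe pixel to the world direction of the light it records. The Python sketch below uses the standard mirrored-sphere reflection relation as an illustration of that step; it is an assumption for exposition, not the project's actual implementation.

    import numpy as np

    def probe_pixel_to_direction(u, v):
        # Map normalised light-probe image coordinates (u, v) in [-1, 1] to
        # the world direction of the light recorded at that pixel, assuming
        # an orthographic view of a mirrored sphere along -z.
        r2 = u * u + v * v
        if r2 > 1.0:
            return None                             # outside the probe silhouette
        n = np.array([u, v, np.sqrt(1.0 - r2)])     # sphere normal at this pixel
        view = np.array([0.0, 0.0, -1.0])           # camera looks down -z
        # Reflect the viewing ray about the normal to find where the light came from.
        return view - 2.0 * np.dot(view, n) * n

    print(probe_pixel_to_direction(0.0, 0.0))  # centre: light arriving from the camera
    print(probe_pixel_to_direction(0.7, 0.0))  # off-centre: light from the surroundings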

Although light-probes and fish-eye lenses have previously been used to reconstruct a representation of the incident light in a scene, the algorithms developed in this work package are, to our knowledge, the first that allow the process to be robustly automated, removing much of the effort in capturing and calibrating images and simplifying the entire process of illumination reconstruction.

The lighting simulation algorithm constructs an irradiance volume using the illumination data reconstructed from a light-probe, and uses this to apply diffuse shading to synthetic objects. Specular effects are added using dynamically generated environment maps, blurred to simulate the effect of glossy reflection. Graphics hardware is used to render shadows cast by synthetic objects, with soft shadows generated from direct and indirect illumination sources. A trade-off may be made at run-time between shadow accuracy and rendering cost by varying the number of shadow rendering passes that are performed.
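The diffuse component of such shading can be illustrated by interpolating precomputed irradiance samples stored on a regular grid. The sketch below simplifies the irradiance volume to a single RGB irradiance value per grid corner (an actual irradiance volume stores a directional distribution at each sample); the names and data layout are assumptions, not the project's code.

    import numpy as np

    def diffuse_irradiance(grid, origin, cell_size, point):
        # Trilinearly interpolate a regular grid of precomputed RGB irradiance
        # samples (shape nx x ny x nz x 3) at a 3D point inside the volume.
        g = (np.asarray(point, dtype=float) - origin) / cell_size
        i0 = np.clip(np.floor(g).astype(int), 0, np.array(grid.shape[:3]) - 2)
        f = g - i0
        result = np.zeros(3)
        for dx in (0, 1):
            for dy in (0, 1):
                for dz in (0, 1):
                    w = ((f[0] if dx else 1.0 - f[0]) *
                         (f[1] if dy else 1.0 - f[1]) *
                         (f[2] if dz else 1.0 - f[2]))
                    result += w * grid[i0[0] + dx, i0[1] + dy, i0[2] + dz]
        return result

    # Toy 2x2x2 volume with one bright corner: points nearer that corner
    # receive smoothly increasing diffuse irradiance.
    grid = np.zeros((2, 2, 2, 3))
    grid[1, 1, 1] = np.array([1.0, 0.9, 0.8])
    print(diffuse_irradiance(grid, np.zeros(3), 1.0, [0.5, 0.5, 0.5]))
    print(diffuse_irradiance(grid, np.zeros(3), 1.0, [0.9, 0.9, 0.9]))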

Both the illumination capture and rendering techniques consider the effects of direct sunlight, ensuring that shadows are only cast into areas that receive direct sunlight. Additionally, synthetic light sources can be introduced into the environment, modifying the illumination of the background image according to the pattern of light emitted by the object; they can also modify the shading and shadows cast by synthetic objects.
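One common way to realise this kind of compositing is differential rendering (Debevec, 1998), in which the photographed background is modified by the difference between the local scene rendered with and without the synthetic objects and lights. The sketch below follows that idea as an assumption and is not necessarily the project's exact formulation.

    import numpy as np

    def differential_composite(background, local_with, local_without, object_mask):
        # Differential rendering: add the change in the modelled local scene's
        # radiance (extra light, shadows) onto the photographed background;
        # where the synthetic object covers a pixel, use the rendered value.
        delta = local_with - local_without
        composite = np.where(object_mask[..., None] > 0, local_with, background + delta)
        return np.clip(composite, 0.0, 1.0)

    # Toy 1x3-pixel strip: a synthetic lamp brightens the first pixel, a
    # synthetic object shadows the second and covers the third.
    bg      = np.full((1, 3, 3), 0.5)
    with_o  = np.array([[[0.7, 0.7, 0.6], [0.3, 0.3, 0.3], [0.2, 0.2, 0.2]]])
    without = np.full((1, 3, 3), 0.5)
    mask    = np.array([[0, 0, 1]])
    print(differential_composite(bg, with_o, without, mask))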

Examples of the techniques in use for a variety of different lighting environments, compared to photographic references, are given here: http://aig.cs.man.ac.uk/gallery/ARIS

The work has been presented in a number of peer-reviewed publications. For more information see: http://aig.cs.man.ac.uk/publications/publications.php

Current status: a prototype of both the illumination reconstruction and lighting simulation tools, integrated into a web-based interface for ease of use, is available.

More information on the Aris project can be found at: http://aris-ist.intranet.gr/

Related information

Reported by

University of Manchester
Oxford Road
M13 9PL Manchester
United Kingdom