Community Research and Development Information Service - CORDIS

Tone mapping and perceptual evaluation for augmented reality image synthesis

Merging computer-generated objects into images or videos of real scenes is becoming extremely important in many applications of computer graphics, in particular mixed and augmented reality. High-fidelity renderings of real scenes should accurately reproduce the large dynamic range of luminances that may be present in reality. This is particularly important for applications in which an accurate representation of the illumination is critical for correct perception of the environment, for example visibility in driving simulators, the perception of archaeological sites, or indeed the application on which the ARIS (Augmented Reality Image Synthesis) project is based.

Although it is possible to generate high dynamic range images, most modern display devices are still only capable of displaying images with contrast ratios of less than two orders of magnitude. Tone mapping operators are therefore used to generate a low-contrast image that should be perceived in much the same way as if the viewer were standing in the real scene.
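To illustrate the kind of compression a tone mapping operator performs, the following is a minimal sketch of one well-known global operator (a Reinhard-style sigmoid); it is an illustrative example only, not the operator developed in ARIS, and the function name and `key` parameter are our own choices.

```python
import numpy as np

def tone_map_global(luminance, key=0.18):
    """Compress HDR luminance to [0, 1) with a Reinhard-style global operator.

    luminance: array of scene luminances (arbitrary positive units).
    key: target average brightness of the mapped image (0.18 is a common default).
    """
    eps = 1e-6
    # Log-average luminance, a robust estimate of overall scene brightness.
    log_avg = np.exp(np.mean(np.log(luminance + eps)))
    # Scale the scene so its log-average maps near `key`.
    scaled = key * luminance / log_avg
    # Sigmoid compression: bright values saturate smoothly toward 1.
    return scaled / (1.0 + scaled)

# Scene luminances spanning six orders of magnitude are mapped into [0, 1),
# a range a low-contrast display can reproduce.
hdr = np.array([0.01, 0.1, 1.0, 100.0, 10000.0])
ldr = tone_map_global(hdr)
```

A global operator like this applies the same curve to every pixel; local operators, such as the one developed in this workpackage, instead adapt the compression to each pixel's neighbourhood.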

Our contribution to this workpackage has been twofold:
- The development of a psychophysical framework to validate augmented reality images; and

- The development of a novel local, perceptual tone mapping operator that compresses the wide range of luminance levels in a real scene so that they can be displayed on typical monitors.

Various tone mapping operators have been published, but none combines good contrast reduction with a simulation of aspects of human vision. Our algorithm is based on widely accepted psychophysical data and takes into account the retinal response to light when processing luminance information in the scene. The algorithm aims to generate images visually similar to the real scene by carefully mapping scene luminances to a set of luminances that can be displayed on a low-contrast-ratio display or printed. It was therefore fundamental not only to validate the ARIS system from a rendering point of view but also to test the performance of the tone mapping operator against real scenes. This ensured that the results of the ARIS system are an accurate representation of reality, which is critical for our applications.
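The retinal response to light referred to above is commonly modelled in the psychophysical literature with a Naka-Rushton sigmoid. The sketch below shows that standard model only; it is an assumption for illustration, not the published ARIS operator, and the parameter values are typical literature choices rather than project results.

```python
import numpy as np

def retinal_response(luminance, sigma, n=0.73):
    """Naka-Rushton photoreceptor response: R/Rmax = L^n / (L^n + sigma^n).

    sigma: semi-saturation constant, i.e. the luminance at which the
           response reaches half its maximum (the adaptation level).
    n:     exponent controlling the steepness of the sigmoid.
    Returns values in [0, 1).
    """
    ln = np.power(luminance, n)
    return ln / (ln + np.power(sigma, n))
```

In a local operator, `sigma` would vary per pixel with the local adaptation luminance, so that dark and bright image regions are each compressed relative to their own surroundings.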

Validation of the system as a whole was also fundamental for ARIS. Various perceptual experiments were conducted, testing parameters such as the number of shadow passes, the importance of specular highlights, texture resolution and level of detail. These results allowed us to determine the minimum processing power required while maintaining the highest perceptual quality.

We believe that the results of this workpackage could benefit future work in this area, particularly applications that require perceptually accurate results. Our tone mapping operator works not only with static images but also with animations, and could therefore also be used in movies or computer games.

Publications of the University of Bristol related to the ARIS project:

- Ledda P., Santos L.P., Chalmers A. "A Local and Dynamic Model of Visual Adaptation for High Dynamic Range Images". AFRIGRAPH 2004, Cape Town, Nov 2004.
- Ledda P., Chalmers A., Seetzen H. "HDR Displays: a validation against reality". International Conference on Systems, Man and Cybernetics 2004, The Hague, The Netherlands, Oct 2004.
- Ledda P., Chalmers A., Seetzen H. "A Psychophysical Validation of Tone Mapping Operators Using a High Dynamic Range Display". Symposium on Applied Perception in Graphics and Visualization, Los Angeles.
- Ledda P., Ward G., Chalmers A. "A Wide Field, High Dynamic Range Stereographic Viewer". GRAPHITE 2003 Conference, Feb 2003.
- Longhurst P., Chalmers A. "User Validation of Image Quality Assessment Algorithms". EGUK 04, IEEE Computer Society, June 2004.

More information on the ARIS project can be found at:


Reported by

University of Bristol
Senate House Tyndall Avenue
BS8 1TH Bristol
United Kingdom