Shedding light on Augmented Reality
Augmented Reality (AR) uses computer-generated graphics to enhance images of the real world. The aim is to advance AR technology to the point where the viewer can no longer distinguish the virtual from the real components of images and films. One of the most difficult tasks is achieving realistic lighting of the virtual components. Computer scientists at the University of Manchester tackled this challenge in the IST project ARIS.

Two algorithms were created: one derives an illumination map from the image, and the other simulates the lighting of virtual objects placed within it. The innovation of the first algorithm is that it fully automates the extraction of illumination data, yielding significant time savings over current methods. The required input data comprise digital photographs of a light probe and a calibration grid, together with a geometric model of the scene. The algorithm corrects discrepancies in the digital images and adjusts for white-balance problems before generating the illumination map.

Once the illumination map has been created, the lighting simulation algorithm is invoked to shade the imported virtual objects. Both natural (e.g. the sun) and artificial light sources can be included. The algorithm also accounts for specular effects, the mirror-like reflection of light off the surfaces of the virtual objects. For film applications, a balance must be struck between the refresh rate and the level of detail of the shadows the algorithm creates. Simplified sketches of both steps are given at the end of this article.

Examples of images created with the new algorithms can be viewed online at: http://aig.cs.man.ac.uk/gallery/ARIS

The primary target audience for the new algorithms, which have been packaged into a single prototype with a Web-based interface, is the video/film post-production industry. Further information about the IST project ARIS can be found at: http://aris-ist.intranet.gr/
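To give a flavour of the first step, the minimal Python sketch below extracts a latitude-longitude illumination map from a photograph of a mirrored-sphere light probe, after a simple grey-world white-balance correction. The mirrored-sphere convention, the array layout and all function names are illustrative assumptions rather than the ARIS prototype's actual interface, and the calibration-grid and discrepancy-correction steps are omitted.

```python
import numpy as np

def white_balance(probe: np.ndarray) -> np.ndarray:
    """Grey-world white balance: scale each colour channel so its mean
    matches the overall mean of the probe image (shape (h, w, 3), float)."""
    means = probe.reshape(-1, 3).mean(axis=0)
    return probe * (means.mean() / means)

def probe_to_latlong(probe: np.ndarray, out_h: int = 128, out_w: int = 256) -> np.ndarray:
    """Resample a mirrored-sphere (angular map) light probe into a
    latitude-longitude illumination map by nearest-neighbour lookup."""
    h, w, _ = probe.shape
    # Build a unit direction for every output pixel of the lat-long map.
    theta = (np.arange(out_h) + 0.5) / out_h * np.pi          # polar angle
    phi = (np.arange(out_w) + 0.5) / out_w * 2.0 * np.pi      # azimuth
    phi, theta = np.meshgrid(phi, theta)                      # shape (out_h, out_w)
    d = np.stack([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta)], axis=-1)
    # Angular-map convention: radius in the probe image grows linearly with
    # the angle between the direction and the viewing axis (+z here).
    r = np.arccos(np.clip(d[..., 2], -1.0, 1.0)) / np.pi
    denom = np.sqrt(d[..., 0] ** 2 + d[..., 1] ** 2) + 1e-8
    u = 0.5 + 0.5 * r * d[..., 0] / denom
    v = 0.5 + 0.5 * r * d[..., 1] / denom
    xi = np.clip((u * (w - 1)).astype(int), 0, w - 1)
    yi = np.clip((v * (h - 1)).astype(int), 0, h - 1)
    return probe[yi, xi]

# Usage (probe_image is a float RGB photograph of the mirrored sphere):
# env = probe_to_latlong(white_balance(probe_image))
```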
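The second step can be sketched in the same spirit: an image-based-lighting shader that combines a cosine-weighted diffuse estimate over the illumination map with a mirror-specular lookup in the reflected view direction. The sampling strategy, the coefficients kd and ks and the function names are illustrative assumptions, not the project's algorithm, which additionally produces detailed shadows.

```python
import numpy as np

def sample_env(env: np.ndarray, d: np.ndarray) -> np.ndarray:
    """Look up the latitude-longitude illumination map in unit direction d."""
    h, w, _ = env.shape
    theta = np.arccos(np.clip(d[2], -1.0, 1.0))
    phi = np.arctan2(d[1], d[0]) % (2.0 * np.pi)
    y = min(int(theta / np.pi * h), h - 1)
    x = min(int(phi / (2.0 * np.pi) * w), w - 1)
    return env[y, x]

def shade(env, normal, view, kd=0.7, ks=0.3, n_samples=256, rng=None):
    """Estimate the colour of one surface point lit by the illumination map:
    a Monte-Carlo diffuse term plus a mirror-specular lookup
    (normalisation constants omitted for brevity)."""
    if rng is None:
        rng = np.random.default_rng(0)
    normal = normal / np.linalg.norm(normal)
    # Diffuse: average incoming radiance over the upper hemisphere,
    # weighted by the cosine of the angle to the surface normal.
    dirs = rng.normal(size=(n_samples, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    cos = dirs @ normal
    keep = cos > 0
    diffuse = np.mean([sample_env(env, d) * c
                       for d, c in zip(dirs[keep], cos[keep])], axis=0)
    # Specular: mirror reflection of the view direction about the normal.
    view = view / np.linalg.norm(view)
    refl = 2.0 * (view @ normal) * normal - view
    specular = sample_env(env, refl)
    return kd * diffuse + ks * specular
```

In a practical pipeline the diffuse term would typically be prefiltered once per illumination map rather than sampled per shaded point, which is one way to trade lighting detail against refresh rate, the balance noted above for film applications.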