
Augmented Reality Image Synthesis through illumination reconstruction and its integration in interactive and shared mobile AR-systems for E- (motion)-commerce applications

Deliverables

The ability to merge synthetically generated objects into images of a real scene is becoming central to many applications of computer graphics, and in particular to mixed or augmented reality. In many situations the merging must be performed at rates of many frames per second if an illusion of interactivity is to be maintained. Traditionally, the competing requirements of real-time rendering and visual realism have meant that generating photo-realistic augmented images at interactive rates has been a distant goal. Our contribution to this work package has been twofold:
- the development of a fully automatic and robust algorithm for capturing illumination data; and
- the development of a new lighting simulation algorithm that is able to composite synthetic objects into a background image at interactive rates, shading the objects with illumination that is consistent with the real environment.

In contrast to previous work, we do not restrict the situations we consider to distant lighting (i.e. we do not assume that the lights are infinitely far from the object). To reconstruct an illumination map we require as input a single digital image of the light-probe and calibration grid, together with an approximate geometric model of the scene. The algorithm then automatically estimates the pose of the camera and the position of the light-probe; deals with the non-linearities in the input images; corrects for poor camera white-balance settings; and maps the reconstructed lighting data onto the geometric scene model for use by the rendering techniques that we have developed. Although light-probes and fish-eye lenses have previously been used to reconstruct a representation of the incident light in a scene, the algorithms developed in this work package are, to our knowledge, the first that allow the process to be robustly automated, removing much of the effort in capturing and calibrating images and simplifying the entire process of illumination reconstruction.

The lighting simulation algorithm constructs an irradiance volume from the illumination data reconstructed from the light-probe, and uses this to apply diffuse shading to synthetic objects (a small sketch of such a lookup is given after this summary). Specular effects are added using dynamically generated environment maps, blurred to simulate the effect of glossy reflection. Graphics hardware is used to render shadows cast by synthetic objects, with soft shadows generated from direct and indirect illumination sources. A trade-off may be made at run-time between shadow accuracy and rendering cost by varying the number of shadow rendering passes that are performed. Both the illumination capture and the rendering techniques consider the effects of direct sunlight, ensuring that shadows are only cast into areas that receive direct sunlight. Additionally, synthetic light sources can be introduced into the environment, modifying the illumination of the background image according to the pattern of light emitted by the object; they can also modify the shading and the shadows cast by synthetic objects.

Examples of the techniques in use for a variety of different lighting environments, compared to photographic references, are given here: http://aig.cs.man.ac.uk/gallery/ARIS The work has been presented in a number of peer-reviewed publications. For more information see: http://aig.cs.man.ac.uk/publications/publications.php

Current status: a prototype of both the illumination reconstruction and lighting simulation tools, integrated into a web-based interface for ease of use, is available.
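As a rough illustration of how an irradiance volume can be queried at render time, here is a minimal sketch in Python/NumPy. In the actual system each sample would additionally encode the directional variation of irradiance (so it can be evaluated for a given surface normal); this sketch simplifies to a single RGB irradiance per grid point, and all names are hypothetical:

```python
import numpy as np

def trilinear_irradiance(volume, origin, spacing, p):
    """Look up diffuse irradiance at world-space point p by
    trilinearly interpolating a regular grid of precomputed
    RGB irradiance samples (shape: [nx, ny, nz, 3])."""
    # Continuous grid coordinates of p.
    g = (np.asarray(p) - origin) / spacing
    i0 = np.clip(np.floor(g).astype(int), 0, np.array(volume.shape[:3]) - 2)
    t = np.clip(g - i0, 0.0, 1.0)   # fractional offset within the cell
    ix, iy, iz = i0
    tx, ty, tz = t

    # Blend the eight surrounding samples.
    c = 0.0
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((tx if dx else 1 - tx) *
                     (ty if dy else 1 - ty) *
                     (tz if dz else 1 - tz))
                c = c + w * volume[ix + dx, iy + dy, iz + dz]
    return c

# Example: a 4x4x4 volume over a 2 m cube, shading one surface point.
vol = np.random.rand(4, 4, 4, 3)            # stand-in for reconstructed data
E = trilinear_irradiance(vol, origin=np.zeros(3),
                         spacing=np.full(3, 2.0 / 3), p=[0.7, 1.1, 0.4])
albedo = np.array([0.6, 0.5, 0.4])
radiance = albedo / np.pi * E               # Lambertian diffuse shading
```

Because the interpolation is a handful of array operations per shaded point, a lookup of this kind is cheap enough to run at interactive rates, which is what makes the approach suitable for compositing.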
More information on the ARIS project can be found at: http://aris-ist.intranet.gr/
One possibility for offering the ARIS (Augmented Reality Image Synthesis) service to the customers of a furniture house is through the website of the company, providing the possibility to furnish one's house virtually in a photo-realistic manner. The result includes the web services necessary to deploy the service, the locally executable components (applets running on the customer's PC), all connections to the e-commerce site of the company, and a communication component, which allows for direct communication between the customer and specialists (such as interior designers) who can create suggestions for the customer.

In the case of the interactive (web-based) application scenario for ARIS, the web client is the customer's personal computer, which is equipped with a digital photo-camera and connected through a broadband connection to the 3D e-commerce server. A possible deployment scenario involves an application server on the server side, connected to the furniture repository of the company and to its web server. The client at home runs an applet for viewing the content, while the remote expert uses an advanced version of the same tool with increased privileges.

Technical details: the necessary 3D geometry reconstruction and illumination reconstruction are usually performed on the customer's machine and are operated by the customer. We assume that the customer's machine is powerful enough (any state-of-the-art PC at the time of writing, i.e. the end of 2004) to perform these tasks, and that the user is familiar with the procedure of 3D geometry and illumination reconstruction, which is partly automated. If users face any difficulties in the procedure, or think that the results are not sufficiently satisfactory, an alternative is the provision of an expert teleconsultation. In this case, all the data the customer used for the 3D geometry and illumination reconstruction are transmitted to the expert's machine, and the expert then performs the whole procedure. A minimal sketch of this hand-over is given below.
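Purely as an illustration of the teleconsultation hand-over, the following sketch bundles the customer's reconstruction inputs and posts them to an expert-side service. The endpoint URL, field names, and reply format are hypothetical and not part of the ARIS specification:

```python
import requests

# Hypothetical expert-side endpoint; ARIS does not specify this URL.
EXPERT_ENDPOINT = "https://example.com/aris/teleconsultation/upload"

def send_reconstruction_data(session_id, photo_path, model_path):
    """Transmit the customer's light-probe photograph and approximate
    scene model so that the expert can redo the reconstruction."""
    with open(photo_path, "rb") as photo, open(model_path, "rb") as model:
        response = requests.post(
            EXPERT_ENDPOINT,
            data={"session": session_id},
            files={"probe_image": photo, "scene_model": model},
            timeout=30,
        )
    response.raise_for_status()
    return response.json()["ticket"]   # hypothetical reply format

# ticket = send_reconstruction_data("cust-42", "probe.jpg", "room.obj")
```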
Merging computer-generated objects into images or videos of a real scene is becoming extremely important to many applications of computer graphics, and in particular to mixed or augmented reality. High-fidelity renderings of real scenes should accurately reproduce any large dynamic range of luminances that may be present in reality. This is particularly important for applications in which an accurate representation of the illumination is critical for the correct perception of the environment, for example visibility in driving simulators, the perception of archaeological sites, or indeed the application on which the ARIS (Augmented Reality Image Synthesis) project is based. Although it is possible to generate high dynamic range images, most modern display devices are still only capable of displaying images with contrast ratios of less than two orders of magnitude. Tone mapping operators are thus typically used to generate a low-contrast image, which should be perceived in a similar way as if the viewer were standing in the real scene.

Our contribution to this work package has been twofold:
- the development of a psychophysical framework for validating augmented reality images; and
- the development of a novel local and perceptual tone mapping operator, which compresses the high range of luminance levels in a real scene so that they can be displayed on typical monitors.

Various tone mapping operators have been published, but none are capable of both good contrast reduction and simulation of aspects of human vision. Our algorithm is based on widely accepted psychophysical data and takes into account the retinal response to light when processing luminance information in the scene. The algorithm aims to generate images visually similar to a real scene by carefully mapping to a set of luminances that can be displayed on a low-contrast-ratio display or printed. It was therefore fundamental not only to validate the ARIS system from a rendering point of view but also to test the performance of the tone mapping operator against real scenes. This ensured that the results of the ARIS system are an accurate representation of reality, which is critical for our applications.

Validation of the system as a whole was also fundamental for ARIS. Various perceptual experiments were conducted, testing parameters such as the number of shadow passes, the importance of specular highlights, texture resolution, and level of detail. These results allowed us to determine the minimum processing power needed while maintaining the highest perceptual quality. We believe that the results of this work package could benefit future work in this area, specifically those applications which require perceptually accurate results. Our tone mapping operator works not only with static images but also with animations, and could therefore also be used in movies or computer games. A simplified sketch of the retinal-response idea follows.
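The ARIS operator is local and dynamic; the sketch below shows only the simpler global retinal-response (Naka-Rushton) compression that such operators build on, written in Python/NumPy, with an illustrative exponent and adaptation estimate rather than the project's actual parameters:

```python
import numpy as np

def naka_rushton_tonemap(luminance, n=0.73):
    """Globally compress HDR luminance with a photoreceptor-style
    sigmoid, R/Rmax = L^n / (L^n + sigma^n), where sigma is tied to
    the adaptation level. Returns values in [0, 1] for display."""
    L = np.maximum(luminance, 1e-9)
    # Adaptation luminance: log-average of the scene (a common choice).
    L_a = np.exp(np.mean(np.log(L)))
    sigma = L_a                      # semi-saturation set by adaptation
    return L**n / (L**n + sigma**n)

# Example: a synthetic HDR image spanning six orders of magnitude.
hdr = np.logspace(-2, 4, 256).reshape(16, 16)
ldr = naka_rushton_tonemap(hdr)     # displayable values in [0, 1]
```

A local operator refines this idea by computing the adaptation level per pixel neighbourhood instead of once per image, which preserves detail in both very bright and very dark regions.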
Publications of the University of Bristol related to the ARIS project:
- Ledda P., Santos L.P., Chalmers A. "A Local and Dynamic Model of Visual Adaptation for High Dynamic Range Images". AFRIGRAPH 2004, Cape Town, Nov 2004.
- Ledda P., Chalmers A., Seetzen H. "HDR Displays: A Validation Against Reality". International Conference on Systems, Man and Cybernetics 2004, The Hague, The Netherlands, Oct 2004.
- Ledda P., Chalmers A., Seetzen H. "A Psychophysical Validation of Tone Mapping Operators Using a High Dynamic Range Display". Symposium on Applied Perception in Graphics and Visualization, Los Angeles.
- Ledda P., Ward G., Chalmers A. "A Wide Field, High Dynamic Range Stereographic Viewer". GRAPHITE 2003 Conference, Feb 2003.
- Longhurst P., Chalmers A. "User Validation of Image Quality Assessment Algorithms". EGUK 04, IEEE Computer Society, June 2004.
The ARIS (Augmented Reality Image Synthesis) system infrastructure summarizes the technical know-how necessary to set up the complete software and hardware chain that will help a corporation in the furniture retail industry set up the added-value services developed within ARIS. The ARIS services concentrate on the selection of furniture out of a catalogue and the subsequent incorporation of the selected pieces of furniture into an image of the real environment of the user, or their projection into the actual environment through head-mounted displays. The technology deployed is called augmented reality. The main innovation of the project is an approach that allows for photorealism in the incorporation of furniture models into the actual environment of the user. Additionally, within the project it was possible to link the complete technology to the set of technologies necessary to make the service available in a short period of time and closely linked to possible e-commerce activities of the customers.

The infrastructure of the ARIS system is completely scalable and may be adjusted to fit the needs of various potential customers, such as:
- large furniture retail companies;
- smaller furniture retail houses;
- application service providers.

The components of the infrastructure comprise the following (a data-model sketch follows the list):
- A database management system structured according to the needs of the furniture retail industry, capable of integration into existing inventory systems, with additional capabilities to integrate multimedia information and advanced graphics models with the furniture inventory.
- A set of web services for setting up the application as an addition to existing websites.
- The complete design of a scalable hardware platform for hosting the various components of the system.
- Networking components and infrastructure for on-site collaborative augmented reality applications.
- The exact specification of all components needed to set up various versions of the system and deploy it remotely through conventional clients (web browsers) or advanced clients (mobile augmented reality units).
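To make the first component concrete, here is a minimal sketch of how a furniture inventory record might be linked to its multimedia and 3D-model assets. The field names and file formats are hypothetical illustrations, not the ARIS schema:

```python
from dataclasses import dataclass, field

@dataclass
class FurnitureItem:
    """One catalogue entry, linking standard inventory data to the
    multimedia and graphics assets the AR service needs."""
    sku: str                 # key shared with the existing inventory system
    name: str
    price_eur: float
    stock: int
    photos: list[str] = field(default_factory=list)   # catalogue images
    model_uri: str = ""      # 3D geometry for AR compositing (e.g. OBJ/VRML)
    material_uris: list[str] = field(default_factory=list)  # textures/BRDFs

sofa = FurnitureItem(sku="SOFA-0042", name="Two-seat sofa",
                     price_eur=499.0, stock=12,
                     photos=["sofa_front.jpg"],
                     model_uri="models/sofa_0042.obj")
```

Keeping the shared key (here the SKU) identical to the one used by the retailer's existing inventory system is what allows the AR assets to be bolted onto that system rather than replacing it.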
The mobile and collaborative system within the ARIS (Augmented Reality Image Synthesis) context characterizes the implementation of an augmented reality application for the photorealistic and collaborative virtual furnishing of interior spaces. The system features a portable local central unit (containing a database of furniture models and the server for the communication of the different participants) and an arbitrary number of client devices (each a lightweight portable computer equipped with a head-mounted display), communicating over a local wireless LAN. A connection to the central server of the furniture company providing the service is also possible, if a communication channel with an interior designer located there is desired.

A set of preparatory tasks is necessary for the system to operate smoothly in any new environment; these are usually carried out by a technician sent by the retailer to install the necessary equipment in the customer's environment. The preparatory phase requires less than half an hour, even for rather complex surroundings. The mobile AR unit (one client device for each participating customer) performs the tracking of the position of the user who wears it. The visualization of the furniture models on top of the real environment, captured through a camera and presented on a pair of head-mounted displays (HMDs), also takes place on the client device, thus implementing a video-see-through augmented reality solution.

Customers are in a position to select and position furniture out of a catalogue displayed in their HMD. The primary interaction device is a joystick attached to the notebook PC. Additionally, there is the option of activating the speech recognition component, which allows for completely free movement in the environment. The speech recognition used in the ARIS solution is an off-the-shelf system.

When in collaborative mode, the selection and any update of the position of furniture are instantly transferred to the displays of all participants in the collaborative session, presented with the correct geometry and lighting conditions (adjusted to the viewpoint of each participant). There is always one person responsible for the selection and manipulation, i.e. the system uses a token to grant manipulation rights to any participant requesting them; a minimal sketch of such a token scheme is given below. The communication between the different mobile clients, and eventually also the remotely located 3D e-commerce server (where an expert in interior design may be sitting to provide some additional services remotely), is handled by a communication module, which is also responsible for the communication between the clients themselves through the local area network. In the case where many mobile clients operate in the same room, only one of them can collaborate with a remote expert (decorator, designer) from the store. This client is denoted as the coordinator of the collaboration and includes an additional module, the collaboration module. However, the results of the collaboration and any suggestions or changes that the remote expert makes should be made visible to all other participants.
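As an illustration of the token-based floor control described above, here is a minimal sketch; the class and method names are hypothetical, and the real ARIS communication module is not specified here:

```python
class ManipulationToken:
    """Grants exclusive furniture-manipulation rights to one
    participant at a time; everyone else is view-only until the
    holder releases the token."""

    def __init__(self):
        self.holder = None          # participant id, or None if free

    def request(self, participant):
        """Grant the token if it is free; return True on success."""
        if self.holder is None:
            self.holder = participant
            return True
        return self.holder == participant   # already holds it

    def release(self, participant):
        """Only the current holder may release the token."""
        if self.holder == participant:
            self.holder = None

token = ManipulationToken()
assert token.request("client-A")        # A may now move furniture
assert not token.request("client-B")    # B must wait for the token
token.release("client-A")
assert token.request("client-B")
```

In a deployment of the kind described above, the token state would live on the portable local central unit, and token changes would be broadcast to all clients over the wireless LAN so that every HMD view stays consistent.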
