
Markerless Real-time Tracking for Augmented Reality Image Synthesis

Project description


Multimodal interfaces

Augmented reality (AR) is a growing field, with many diverse applications ranging from TV and film production to industrial maintenance, medicine, education, entertainment and games. The central idea is to add virtual objects into a real scene, either by displaying them in a see-through head-mounted display, or by superimposing them on an image of the scene captured by a camera. Depending on the application, the added objects might be virtual characters in a TV or film production, instructions for repairing a car engine, or a reconstruction of an archaeological site.

For the effect to be believable, the virtual objects must appear rigidly fixed to the real world, which requires accurate real-time measurement of the position of the camera or the user's head. Present technology cannot achieve this without resorting to systems that require a significant infrastructure in the operating environment, severely restricting the range of possible applications.

The objective of MATRIS is to develop and implement a system for determining the position, orientation and focal length of a camera in real time, by analysing the camera images and exploiting unobtrusive inertial motion sensors. As the system will not require markers or other special infrastructure in the environment, it will be suitable for applications where existing tracking systems cannot easily be used, including:

insertion of virtual objects into live broadcast images outside of a dedicated "virtual studio" environment, such as graphics for sport, or virtual objects in a conventional TV studio
augmented reality applications in the area of development, production, service and maintenance
augmented reality for architecture, design and product presentation
augmented reality for cultural heritage sites and tourism

There are many applications in which a computer-generated object must be overlaid onto a real scene in real time, which requires accurate measurement of the position of the camera or headset. Existing methods require bulky hardware, severely limiting their usability. The objective of this project is to develop and implement a system for determining the position, orientation and focal length of a camera in real time, by analysing the camera images and exploiting unobtrusive inertial motion sensors. This will enable the system as a whole to determine its location and orientation in a very natural way, mimicking the way a human orients himself: the vestibular organ in the ears is essentially an inertial measurement unit, and the eyes are essentially comparable to a camera.

The result of the project will be a marker-free tracking system that works at a high frame rate on a low-performance computing unit. It will allow the capture of camera motion for TV production and of head motion for mobile augmented reality applications. In particular, the system will work over a large area in indoor as well as outdoor environments. By providing this unique technology, the project will act as a strong enabling force for the wider deployment of augmented reality in application areas including content production, education, cultural heritage and industry.
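The combination of inertial sensing and image-based measurement described above can be illustrated with a small sketch. The Python snippet below is an illustrative assumption, not the project's actual algorithm: it uses a simple complementary-filter style blend in which a gyroscope propagates the orientation estimate at a high rate, while a lower-rate, drift-free vision-based estimate periodically corrects the accumulated drift. The function names, sensor rates and blending gain are all hypothetical.

import numpy as np

def integrate_gyro(orientation, gyro_rates, dt):
    # Propagate a yaw/pitch/roll estimate with gyroscope rates.
    # Small-angle approximation; a real system would use quaternions.
    return orientation + gyro_rates * dt

def fuse_with_vision(predicted, vision_estimate, gain=0.02):
    # Blend the high-rate inertial prediction with a drift-free but
    # lower-rate vision-based pose estimate (complementary filter).
    return (1.0 - gain) * predicted + gain * vision_estimate

# Simulated run: 100 Hz gyroscope, vision correction every 10th sample.
dt = 0.01
orientation = np.zeros(3)              # yaw, pitch, roll in radians
rng = np.random.default_rng(0)

for step in range(200):
    gyro = np.array([0.1, 0.0, 0.0]) + rng.normal(0, 0.01, 3)   # noisy rates
    orientation = integrate_gyro(orientation, gyro, dt)
    if step % 10 == 0:
        # Hypothetical vision measurement: true yaw grows at 0.1 rad/s.
        vision = np.array([0.1 * step * dt, 0.0, 0.0]) + rng.normal(0, 0.005, 3)
        orientation = fuse_with_vision(orientation, vision)

print("fused orientation estimate:", orientation)

In a real markerless tracker the vision measurement would come from matching natural image features against a model of the scene, and the fusion would typically be done with a Kalman-style filter over full position, orientation and focal length rather than the orientation-only blend shown here.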

Call for proposal

FP6-2002-IST-1

Coordinator

FRAUNHOFER GESELLSCHAFT ZUR FOERDERUNG DER ANGEWANDTEN FORSCHUNG E.V.
EU contribution
€ 829 176,22
Address
Fraunhoferstrasse 5
64283 Darmstadt
Germany


Total cost
No data

Participants (4)