
Services and Training through Augmented Reality

Exploitable results

Augmented Reality (AR) is a major step forward in the drive to create photo-realistic interactive synthetic worlds. IST project STAR (Service Training through Augmented Reality) is developing AR systems for training, online documentation and planning purposes. Current systems mainly use Virtual Reality (VR), but the synthetic worlds generated by this method are often unconvincing, however much effort their creators put in. AR has emerged as an attractive alternative because it yields greater realism at a lower computational cost.

"The main goal of STAR is to use AR techniques to support training, maintenance and installation on real existing machines or systems," says Dr Artur Raczynski, STAR project coordinator and an expert in industrial visualisation at Siemens AG. "The project's mission is to support technical personnel effectively so that complex service tasks are executed with increased efficiency and a reduced error rate.

"The installation or maintenance of all components of a complex system in a new environment requires expert knowledge. The STAR approach can capture this expert knowledge, and communicate and display it to workers who might be unfamiliar with the tasks they need to carry out. STAR will be useful for teaching the functionality of a product, and for running through installation or maintenance procedures, even before the actual physical set-up is available."

Added reality

AR works by mixing graphics or virtual objects with real images taken from existing environments on the same display, which is why AR is also sometimes referred to as Mixed Reality. In practical terms, this means that AR allows a video to be overlaid with additional information on the same display in real time. The information varies from simple instructions to complex 3D geometry. With the help of AR techniques, this information can be blended in step with the camera's movement.
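The core of that blending step is projecting a 3D anchor point into the video frame using the camera's current pose, so the overlay stays attached to the right part of the scene as the camera moves. The sketch below shows the standard pinhole projection this relies on; all names, poses and intrinsics are illustrative assumptions, not details from the STAR system.

```python
# Hedged sketch: project a 3D scene point into pixel coordinates with a
# pinhole camera model, so an annotation can be drawn at that pixel.
# Pose and intrinsics here are made-up example values.

def project_point(point_world, rotation, translation, focal, cx, cy):
    """Project a 3D world point to pixel coordinates.

    rotation is a 3x3 matrix (list of rows) and translation a 3-vector,
    together mapping world coordinates into the camera frame;
    focal, cx and cy are the camera intrinsics.
    """
    # World -> camera frame: p_cam = R @ p_world + t
    p_cam = [
        sum(rotation[i][j] * point_world[j] for j in range(3)) + translation[i]
        for i in range(3)
    ]
    if p_cam[2] <= 0:
        return None  # point is behind the camera; nothing to draw
    # Perspective division plus intrinsics give the pixel position
    u = focal * p_cam[0] / p_cam[2] + cx
    v = focal * p_cam[1] / p_cam[2] + cy
    return (u, v)

# Identity pose: camera at the origin looking down the +Z axis.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 0.0]

# A label anchored 2 m in front of the camera, slightly right and up.
pixel = project_point([0.2, -0.1, 2.0], R, t, focal=800.0, cx=320.0, cy=240.0)
print(pixel)  # -> (400.0, 200.0)
```

In a live system the pose (R, t) would be re-estimated for every frame from the tracked image features, and the overlay redrawn at the newly projected pixel.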
The result is an extension of reality. A typical example of the STAR system in practice is workers performing a maintenance task. As well as, or instead of, a manual, they have a laptop equipped with cameras and a wireless connection to the local computer network. The cameras record the workspace in which the workers are operating, and the video stream is sent over the network to an expert. The 3D position of the cameras with respect to the scene is determined automatically by the system from distinctive shapes found in the images. The expert can augment the video with a variety of information (text, 3D geometry and so on), and the augmented view is sent back over the network to the workers' laptop.

Comprising the STAR system

To achieve this highly realistic and flexible set-up, STAR has developed unique subsystems and interfaces, in particular:

- A 3D-CAD-reconstruction subsystem that includes new methods of producing CAD models. The key innovation here is the integration of point clouds acquired by laser ranging with digital photographs; combining these data sources yields far better automation of site modelling.

- A 3D rough reconstruction subsystem offering fast, accurate alternative techniques for 3D modelling using panoramic images. The software imports a set of images and produces panoramas from them, and it can also import 3D data to enhance a panorama with integrated 3D models.

- A camera hand-over subsystem that controls automated view selection using a camera hand-over algorithm. Given several camera inputs, the algorithm automatically chooses the one that gives the best possible view of the main point of interest.

- An AR browser that allows manipulation of mixed objects by virtual humans. The operator is able to add autonomous virtual humans into the augmented environment, and both the user and the autonomous virtual humans can move, animate and manipulate objects, whether real or virtual.
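A camera hand-over rule of the kind described above can be sketched as a scoring problem: rate each camera's view of the point of interest and switch to the highest scorer. The scoring criteria below (how centred the point is, and how close the camera is) are an assumption for illustration; the article does not describe STAR's actual algorithm.

```python
# Hedged sketch of a camera hand-over rule: pick the camera with the
# best view of the point of interest (POI). Scoring criteria are assumed.
import math

def view_score(cam_pos, cam_dir, poi):
    """Score a camera's view of the POI (higher is better).

    cam_pos is the camera position, cam_dir its unit optical axis.
    """
    to_poi = [p - c for p, c in zip(poi, cam_pos)]
    dist = math.sqrt(sum(d * d for d in to_poi))
    if dist == 0:
        return 0.0
    # Cosine of the angle between the optical axis and the POI direction:
    # 1.0 means the POI is dead centre, <= 0 means it is behind the camera.
    align = sum(a * b for a, b in zip(cam_dir, to_poi)) / dist
    if align <= 0:
        return 0.0
    return align / (1.0 + dist)  # favour centred, nearby views

def hand_over(cameras, poi):
    """Return the name of the camera with the best view of poi."""
    return max(cameras, key=lambda name: view_score(*cameras[name], poi))

cameras = {
    "overview": ([0.0, 0.0, 5.0], [0.0, 0.0, -1.0]),  # far away, centred
    "close-up": ([0.0, 0.0, 1.0], [0.0, 0.0, -1.0]),  # nearby, centred
}
poi = [0.0, 0.0, 0.0]
print(hand_over(cameras, poi))  # -> close-up
```

A production system would also damp the switching (e.g. require the new camera to beat the current one by a margin) to avoid rapid flicker between two similarly scored views.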
The software consists of feature-based tracking, used to find the 3D coordinates of a predefined object in a video, and rendering, which enables the import of Smart Objects containing the motion information for a virtual human. To handle scenes without virtual humans, STAR has developed two components that communicate in real time. The workflow editor is an AR-authoring tool that defines individual AR applications using video, marker tracking and 3D information. The second component, the workflow player, enables the video to be sent to a remote person, who can then enhance the video using the workflow editor and send it back in real time.

The commercial potential of STAR is huge. Used in its entirety, it will be an invaluable tool wherever complex installation or maintenance tasks are carried out. In addition, as Raczynski points out, the STAR subsystems are sophisticated enough that each could be sold as a stand-alone application.

Source: Based on information from STAR
Promoted by the IST Results Service