The goal of the ARIS project is to provide new technologies for a seamless integration of virtual objects in an augmented environment, and to develop new visualisation and interaction paradigms for collaborative AR-applications.
Two application scenarios will be developed:
1) an interactive AR-system, in which the end-user can easily integrate 3D product models (e.g. furniture) into a set of images of their real environment, taking consistent illumination of real and virtual objects into account;
2) a mobile AR-unit, with which 3D product models can be visualised directly on a real site and discussed with remote participants, including new collaborative and shared technologies.
Both approaches will be validated in end-user trials, addressing the new application area of e-(motion)-commerce. Going beyond existing e-commerce solutions, e-(motion)-commerce enables products to be presented in the context of their future environment.
ARIS intends to overcome the limitations of current AR-solutions and open up new business opportunities by providing:
1) easy-to-use methods for 3D reconstruction and camera registration;
2) precise camera tracking, using vision-based methods with and without markers;
3) tools to reconstruct illumination and material data from images (incl. daylight);
4) seamless integration of virtual and real scenes through consistent illumination of real and virtual objects at interactive update rates;
5) psychophysical evaluation and new tools to adjust the augmented visualisation to the brightness conditions of the real environment for see-through systems;
6) new collaboration components, including interaction and communication methods to enable discussion with external users in an augmented environment;
7) clear validation of the results through end-user trials.
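Objective 4, consistent illumination between real and virtual objects, is commonly achieved in AR compositing via differential rendering: the lighting change caused by the virtual object (shadows, colour bleeding) is computed as the difference between two renderings of the reconstructed local scene and added to the real photograph. The sketch below illustrates that standard compositing step only; the function name, array layout, and the use of NumPy are assumptions for illustration, not the ARIS implementation.

```python
import numpy as np

def differential_render(background, with_virtual, without_virtual, object_mask):
    """Composite a virtual object into a real photograph with consistent
    illumination using differential rendering (illustrative sketch; the
    actual ARIS pipeline is not specified in this summary).

    background      -- real photograph, H x W x 3 floats in [0, 1]
    with_virtual    -- rendering of the reconstructed local scene *with*
                       the virtual object (same shape)
    without_virtual -- rendering of the local scene *without* it
    object_mask     -- H x W, 1 where the virtual object is directly visible
    """
    # Lighting change caused by the virtual object (e.g. cast shadows)
    delta = with_virtual - without_virtual
    # Inside the object mask: use the rendered virtual object itself;
    # outside: real photo plus the lighting change it induces.
    composite = np.where(object_mask[..., None] > 0,
                         with_virtual,
                         background + delta)
    return np.clip(composite, 0.0, 1.0)
```

Because the real photograph is modified only where the virtual object changes the light transport, reconstruction errors in the local scene model largely cancel out in the difference image, which is why this family of techniques suits photographs of the user's own environment.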
The work plan is structured in 7 work packages plus one additional work package for project management. The first 4 work packages cover the research and development of the base technologies (geometry and illumination reconstruction, combined lighting simulation, and perceptual evaluation) needed for the interactive and the mobile ARIS systems. The application scenarios, the application systems and the validation trials are defined as WP 5, WP 6 and WP 7.

Combined lighting and semi-automatic reconstruction of geometric and photometric properties imply new approaches that are not obvious to the end-users from the beginning. Demonstrators and mock-ups of the new technologies and their application opportunities will therefore be developed first, before a final decision on the application systems is taken. Accordingly, the first 9 months of the project are dedicated to first prototypes of the new algorithms, which will be presented to all consortium members. In parallel, the user requirements, a survey of existing systems and the scenario descriptions will be prepared, yielding the key system decisions at the end of the first year. Based on the final system specifications, the developments will be combined and extended into the prototyped solutions, which will be tested and validated in the project's end-user trials.
Three milestones are planned:
- technology demonstrators (month 9: M1), used for requirement definitions;
- first prototype (month 24: M2), used for end-user trials;
- final system (month 36: M3).
Two systems are expected:
- interactive AR-system: the user can place 3D product models in the reconstructed image space and see the direct and indirect lighting effects;
- mobile AR-system: 3D product models can be presented on-line in their future environment, supported by new collaboration components.
Funding Scheme: CSC - Cost-sharing contracts