
Immersive interface technologies for life-cycle human-oriented activities in interactive aircraft-related virtual products

Periodic Report Summary - VISION (Immersive interface technologies for life-cycle human-oriented activities in interactive aircraft-related virtual products)

Virtual reality (VR) has demonstrated significant potential for interactive applications in product and process development. Nevertheless, the proven quality of the underlying technologies still falls far short of the real-life needs of aerospace industrial practice. The objective of VISION is to specify and develop key interface features in fundamental cornerstones of VR technology, namely immersive visualisation and interaction, so as to improve the flexibility, performance and cost efficiency of human-oriented life-cycle procedures related to critical aircraft-related virtual products (e.g. virtual cabin, virtual assembly, etc.).

VISION follows an upstream research approach aimed at improving the underlying VR technologies that are considered critical for the human-oriented life-cycle use of future aircraft-related virtual products. Human factors and their implications for human-machine interaction within aircraft-related products are considered in the definition of the technology specification framework. The approach of VISION involves:
a) specific human-oriented developments on visualisation and interaction simulation features, such as real-time rendering, global illumination, marker-less body tracking, smart-object interaction and interaction metaphors;
b) integration of these features into common IT platforms, enabling the launch of multi-disciplinary activities around a virtual prototype that ensures human immersion in a complete context;
c) validation based on test cases covering the simulation of different aspects of the aircraft life-cycle (e.g. virtual assembly operations, immersive task execution in the cabin by crew or passengers, etc.).

The project is expected to have a significant impact on working practices related to the creation of virtual aircraft products. The achievements of VISION will enhance the credibility of human-in-the-loop aircraft-related VR simulations. They will strengthen the engineering context of aircraft-related virtual products by enabling their increased use for activities such as design verification, ergonomics validation, specification of equipment displays, and operational and situational training. They are also expected to improve the human-oriented functionality and usage of these virtual products along their life-cycle.

VISION started in November 2008. At the end of project month 18, VISION has already delivered concrete outputs, fully aligned with its end users' interests. The requirements analysis and specification phase of the project has been completed. The project has delivered detailed visualisation/interaction technology specifications for virtual aircraft applications, human factors guidelines for the technology development and the implementation of the integration platform, as well as a human-centred validation framework.

The technology development phase is currently ongoing. The design framework of the real-time ray tracing platform has been delivered, while the assembly of the visualisation software stack is still in progress. An intermediate proof-of-concept application has been implemented, partially fulfilling some of the application scenarios. A dedicated data file format for the ray tracing engine has been specified. A new shading language for the specification of surface shaders has been defined and implemented. A neutral user interaction framework, called VITAL, has been defined and implemented for the authoring and modelling of interaction techniques. A platform-independent management framework for marker-less body tracking has been implemented. Dedicated computer vision and machine learning libraries have been developed and are currently being enhanced. A first set of generic interaction metaphors for object manipulation, together with intuitive 3D user interfaces for environment and application control, has also been defined and given a preliminary assessment through the development of 'proof of concept' applications.
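For readers unfamiliar with the underlying technique, the short C++ sketch below illustrates, in a generic way, the separation between a ray tracing core and pluggable surface shaders that a dedicated shading language of this kind builds on. It is a minimal illustration only, not part of the VISION platform or its shading language; the Vec3 type, the hitSphere routine and the SurfaceShader callback are hypothetical names introduced purely for this example.

    // Illustrative sketch only (not the VISION platform): a CPU ray caster that
    // intersects rays with a sphere and delegates colouring to a pluggable
    // "surface shader" callback, keeping appearance separate from the tracing core.
    #include <cmath>
    #include <cstdio>
    #include <functional>

    struct Vec3 {
        double x, y, z;
        Vec3 operator+(const Vec3& b) const { return {x + b.x, y + b.y, z + b.z}; }
        Vec3 operator-(const Vec3& b) const { return {x - b.x, y - b.y, z - b.z}; }
        Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
    };
    double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    Vec3 normalize(const Vec3& v) { double l = std::sqrt(dot(v, v)); return v * (1.0 / l); }

    // A "surface shader": given the hit point and surface normal, return a colour.
    using SurfaceShader = std::function<Vec3(const Vec3& point, const Vec3& normal)>;

    // Ray/sphere intersection: returns the nearest positive hit distance, or -1.
    double hitSphere(const Vec3& centre, double radius, const Vec3& origin, const Vec3& dir) {
        Vec3 oc = origin - centre;
        double b = 2.0 * dot(oc, dir);               // dir is unit length, so a == 1
        double c = dot(oc, oc) - radius * radius;
        double disc = b * b - 4.0 * c;
        if (disc < 0.0) return -1.0;
        double t = (-b - std::sqrt(disc)) / 2.0;
        return t > 0.0 ? t : -1.0;
    }

    int main() {
        const int width = 256, height = 256;
        const Vec3 centre{0.0, 0.0, -3.0};
        const double radius = 1.0;
        const Vec3 lightDir = normalize({1.0, 1.0, 1.0});

        // A simple Lambertian surface shader, supplied as data rather than hard-coded.
        SurfaceShader shader = [&](const Vec3&, const Vec3& n) {
            double diffuse = std::fmax(0.0, dot(n, lightDir));
            return Vec3{0.9, 0.3, 0.2} * diffuse;    // reddish matte surface
        };

        std::printf("P3\n%d %d\n255\n", width, height);   // ASCII PPM image
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                // Camera at the origin, one primary ray per pixel.
                Vec3 dir = normalize({(x - width / 2.0) / width,
                                      (height / 2.0 - y) / height, -1.0});
                double t = hitSphere(centre, radius, {0, 0, 0}, dir);
                Vec3 colour{0.1, 0.1, 0.15};              // background colour
                if (t > 0.0) {
                    Vec3 p = dir * t;                     // hit point (origin is 0,0,0)
                    Vec3 n = normalize(p - centre);
                    colour = shader(p, n);                // delegate to the surface shader
                }
                std::printf("%d %d %d\n", int(colour.x * 255), int(colour.y * 255), int(colour.z * 255));
            }
        }
        return 0;
    }

In a full platform, a callback of this kind would typically be generated from shader source written in the dedicated shading language rather than hand-written, so that surface appearance can be changed without touching the tracing core; the sketch above shows only the general idea of that separation.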