CORDIS - EU research results

Next Generation Cockpit HUD Integration

Periodic Reporting for period 2 - ANGI-HUD (Next Generation Cockpit HUD Integration)

Reporting period: 2018-03-01 to 2019-08-31

Aircraft cockpit complexity, high cost and flight safety are closely linked. The ANGI-HUD programme aims to provide aircraft manufacturers with innovative tools, in the form of hardware and procedures, that will allow them to reduce the amount of equipment in the cockpit while maintaining and improving flight safety. To that end, the ANGI-HUD programme strives to develop an 'eyes out' concept by developing hardware, software and procedure tools that make it possible for pilots to use the HUD for more flying tasks than are currently performed on the HUD. This is expected to have an impact on flight safety as well as on aircraft avionics costs. Using the HUD for additional flying tasks will allow head-down displays to be removed. In turn, removing a head-down display reduces how often the pilot looks inside the cockpit, making the flight safer, and removes the cost of that display from the avionics.

The importance to society:
Low-cost, safe flight is an essential part of the modern world. Increased safety and lower flight costs are expected to come from:
1) The new HUD architecture will reduce the number of head-down displays by making the HUD the sole-means Primary Flight Display (PFD). This reduces hardware use and weight on board, or frees display area for new applications available to the pilots.
2) A wearable HUD replaces a heavy, complex-to-install fixed HUD and saves costly downtime for retrofitted aircraft.
3) An interactive, intuitive display, including 3D synthetic entities and intuitive control over the avionics, will decrease pilot workload while increasing situational awareness.
4) Reduced deviations in all landing conditions, and thereby fewer hard landings.


Objective:
1) The main objective of the ANGI-HUD project is to analyse how the capabilities of the Head-Up Display (HUD) could be used to provide new functionalities, in combination with other visualization means, and to demonstrate them on a fixed-base simulator.
2) Contribute to the analysis of potential new functionalities and prototype the intended new Man-Machine Interface (MMI).
3) Provide the Airframer with two representative HUD systems, including rapid prototyping capabilities, and participate in and support bench tests at the Airframer's simulation facilities.
4) Assess and analyse how novel intelligent HUD functionalities can be fully integrated into this next-generation cockpit concept so that the efficiency of the newly researched technologies and concepts is maximized. Furthermore, optimizing cockpit operations in general is intended to contribute to the overall Clean Sky 2 objectives in the best possible way.
1) Analysis of potential new functionalities: This objective was addressed through several brainstorming sessions among the consortium members, aimed at analysing the 'eyes out' concept and its impact on the HUD hardware and, more importantly, on the flying processes and work share in the cockpit. In parallel, during the HUD integration in the cockpit, the consortium members and the topic manager discussed the interfaces needed for the HUD (through the HUD PC computer), with the aim of providing interfaces for all current and future avionics.
2) Prototype the intended new Man-Machine Interface: To achieve this objective, the consortium members developed tools and demonstrators to demonstrate and evaluate the 'eyes out' concept. A 'simple tool' was developed that presents the pilot with a set of avionics data on a screen emulating the HUD and uses an eye-tracking device to receive selections from the pilot. Several experiments were conducted to choose the hardware and input methods most suitable for that tool. The main aim of the 'simple tool' is to verify the benefits of eye tracking and the workload associated with it. A second demonstrator developed is the 'interactive SVS'.
3) Provide the Airframer with two representative HUD systems: Two HUD systems and the necessary hardware and software were delivered and installed at the Bizjet simulator.
4) Participate in and support bench tests at the Airframer's simulation facilities: The ANGI-HUD consortium is supporting the Airframer with all activities related to HUD installation in the Bizjet simulator and the development of rapid prototyping, and also enables the Airframer to perform rapid prototyping itself.
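The report does not describe how the eye-tracking device turns gaze data into a selection. As an illustration only, a common interaction pattern for this kind of MMI is dwell-based selection: the element the pilot's gaze rests on for a set number of consecutive samples is selected. The element names, coordinates and dwell threshold below are hypothetical, not taken from the ANGI-HUD tool.

```python
from dataclasses import dataclass

@dataclass
class Element:
    """A selectable on-screen element (hypothetical HUD symbol)."""
    name: str
    x: float
    y: float
    radius: float  # selection radius in screen units

def dwell_select(gaze_samples, elements, dwell_frames=30):
    """Return the first element the gaze rests on for `dwell_frames`
    consecutive samples, or None if no element is dwelled upon.
    Purely illustrative dwell-selection logic."""
    current, count = None, 0
    for gx, gy in gaze_samples:
        # Find which element (if any) the current gaze sample falls on.
        hit = None
        for e in elements:
            if (gx - e.x) ** 2 + (gy - e.y) ** 2 <= e.radius ** 2:
                hit = e
                break
        if hit is current and hit is not None:
            count += 1
            if count >= dwell_frames:
                return hit
        else:
            # Gaze moved to a different element (or off all elements).
            current, count = hit, (1 if hit else 0)
    return None

# Hypothetical usage: 30 samples resting on a "FLAPS" symbol select it.
elements = [Element("FLAPS", 0.0, 0.0, 1.0)]
selected = dwell_select([(0.1, 0.1)] * 30, elements)
```

A dwell threshold is the usual guard against the "Midas touch" problem (everything the pilot merely looks at being selected); the experiments described above would be one way to tune such a threshold against workload.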

Main results achieved:
1) Prototypes:
2 HUD systems in the Airframer simulator.
4 rapid-prototype versions of the HUD software, per Airframer specifications.
2) 'Eyes out' concept demonstrators (for testing activities).
3) Experiment results and conclusions for the 'eyes out', interactive SVS and interactive HUD concepts, covering all MMI-related topics and other future cockpit concepts.
4) Future HUD architecture, including a test bench to demonstrate the HUD as a sole-means PFD.

Results:
1. MMI interface experiment of 'Eyes out' concept
The experiments focused on testing the 'eyes out' concept based on the future HUD architecture. The final experiment used all the data gathered from the earlier experiments to test whether the 'eyes out' concept reduces pilot workload and increases their situational awareness, thereby meeting the project objectives.
The final 'eyes out' experiment was carried out with 10 European airline pilots. The pilots performed the tasks described in the scenarios below (1.1-1.3), both with and without the 'eyes out' concept.
1.1 Speed, Heading and Altitude changes
1.2 Late Runway Change
1.3 Go Around, flap and gear adjustments

Potential exploitation of the results for the consortium would be the use of the HUD as a sole-means PFD and the development of an MMI for increased operational benefits in the future cockpit. In the future, HUD and eye-gaze interaction may be an enabler for single-pilot operations and may allow the replacement of the displays and controls seen in current-day cockpits.

Dissemination:
The ANGI-HUD project was published on several websites:
1. Elbit web site - https://elbitsystems.com/products/comercial-aviation/innovation-rd/
2. NLR - https://youtu.be/HU7jluJyq-s
https://www.linkedin.com/feed/update/urn:li:activity:6573160782056439809
https://twitter.com/NLR_NL/status/1167396664296038401
Socio-economic impact:

The new HUD architecture will reduce the number of head-down displays by making the HUD the sole-means PFD.
This reduces hardware use and weight on board, or frees display area for new applications available to the pilots.
A wearable HUD replaces a heavy, complex-to-install fixed HUD and saves costly downtime for retrofitted aircraft.
An interactive, intuitive display, including 3D synthetic entities and intuitive control over the avionics, will decrease pilot workload while increasing situational awareness.
By achieving the above, the future HUD can support:
* Reducing two-man cockpits to single-pilot operation.
* Increasing situational awareness of traffic and navigation in a growing and congested aviation environment.
* Reducing deviations in all landing conditions, and thereby reducing hard landings.
Image captions:
Experiment of the eye tracking at Elbit
Simple demonstrator - eye tracking demonstrator
Example of interactive SVS element selection - pic 1 of 2
Example of interactive SVS element selection - pic 2 of 2