
Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions

Periodic Reporting for period 1 - PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions)

Reporting period: 2019-01-01 to 2019-12-31

European Clean Sky project PEGGASUS
Ensuring harmonious human–machine collaboration in the cockpit

Led by CSEM, the PEGGASUS consortium is set to enable new types of human–machine interface (HMI) across cockpit avionics, pushing the boundaries of augmentation in the cockpit. Using the latest artificial intelligence (AI) and computer vision technologies, this European-funded project will integrate, for the first time, remote eye-gaze tracking and gesture recognition for pilots in a single framework. Its purpose is to enhance human–machine interaction in the complex flight operations of today’s cockpits through pilot monitoring, for applications that improve crew efficiency and pilot training, towards the development of new-generation cockpits.
Since the autopilot was first demonstrated in 1914, pilots have found themselves increasingly “connected” to their aircraft through the numerous displays, knobs and instruments in the cockpit. This increasing degree of pilot assistance has been developed to reduce pilot workload, always with aviation safety as a driver.

Paradoxically, however, there is a risk that, when faced with an atypical event, pilots encounter difficulties coping with the vast amount of information generated by their instruments. It is therefore crucial to improve and optimize their situational awareness and the relationship between the flight crew and the aircraft controls.

PEGGASUS—optimizing human–machine interactions
Endorsed by the Clean Sky 2 initiative, the European project PEGGASUS aims to counter the “instruments paradox” by understanding crew members’ actions and behaviour and by moving towards a multimodal cockpit interactivity, thus allowing pilots greater levels of control. “We need to recognize when limitations such as confusion or drowsiness impact attention, mental workload, and decision-making on the flight deck,” explains Andrea Dunbar, Head of Embedded Vision Systems at CSEM. “Additionally, the new HMI we are developing will eventually enable a more intuitive and natural interaction so they can make quick, informed decisions across any situation, even when stressed.”
To design a powerful and tailored solution, CSEM and three partners will provide the essential complementary skills, ensuring the project’s success. “Our company is pleased to support the consortium partners with the expertise of a professional airline,” says Christoph Ammann, Vice-President, Head of Crew Training at Swiss International Air Lines, a member of the Lufthansa Group. “Mutual exchange with research and industry partners enables us to reflect on our training standards and on potential future applications.”
The airline has previously collaborated with consortium member ETH Zurich to develop novel gaze-based techniques to monitor pilots’ cognitive states and situational awareness. In particular, these new methods were designed to allow more efficient and effective interaction between the pilot and the aircraft, while also expanding pilot training techniques. The consortium will build upon this excellent experience base and previously collected data when developing PEGGASUS.
Integrating HMI systems into the cockpit
“The aeronautic context poses unique challenges for our team,” comments Andrea Dunbar. “The vision systems and machine learning algorithms CSEM will develop must take both pilots in the cockpit into account. The technology will be developed to remain accurate and robust during the course of a flight, considering aviation-specific environmental factors such as changing lighting conditions and vibrations.” Consortium partner SERMA Ingénierie will be responsible for integrating the PEGGASUS output into a cockpit prototype for testing.
The consortium’s work will also be supported by Thales, the Clean Sky 2 leader. Thales’ Thierry Maret is proud that PEGGASUS will mean another step forward in “providing pilots with new ways of interacting with the aircraft system so pilots can easily and efficiently adapt to the changing and complex needs of 21st-century avionics.”
Main achievements during the first reporting period:

Identifying global system specification
- Representative avionics use cases were defined for evaluation of the system
- Hardware and software specifications were identified for the three envisioned prototypes based on the requirements and constraints defined by the topic leader

Development of the first prototype (PT1) and data acquisition
- Off-the-shelf components for the multi-camera vision system were selected based on the identified system specifications
- HW setup was assembled in a structure representing the cockpit demonstrator at THALES-Cergy
- SW was developed for acquiring/recording synchronised frames from the vision system
- SW was developed for calibration of the vision system
- First dataset was acquired, including eye-gaze to certain points (representing important panels in the cockpit), and predefined gestures
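The synchronised-acquisition step above can be sketched in miniature. The helper below is a hypothetical illustration (not the project’s actual software, which would typically rely on hardware triggering): it pairs frames from two free-running cameras by nearest timestamp, dropping frames whose clocks disagree by more than a small skew.

```python
def pair_frames(ts_a, ts_b, max_skew=0.005):
    """Greedily match each timestamp in ts_a to the closest one in ts_b.

    Returns (i, j) index pairs whose timestamps differ by less than
    max_skew seconds; unmatched frames are dropped. Both lists are
    assumed to be sorted in ascending order.
    """
    pairs = []
    j = 0
    for i, t in enumerate(ts_a):
        # advance j while the next candidate in ts_b is at least as close to t
        while j + 1 < len(ts_b) and abs(ts_b[j + 1] - t) <= abs(ts_b[j] - t):
            j += 1
        if j < len(ts_b) and abs(ts_b[j] - t) < max_skew:
            pairs.append((i, j))
    return pairs

# Example: two 30 fps streams with a constant 1 ms clock offset
a = [i / 30 for i in range(5)]
b = [i / 30 + 0.001 for i in range(5)]
print(pair_frames(a, b))  # → [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
```

In a real multi-camera rig a shared trigger signal makes this alignment exact; the timestamp-pairing fallback is useful when recording from off-the-shelf cameras without synchronisation hardware.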

Development of crew-interface algorithms
- Face detection and facial landmark localisation were developed using a combination of public and private datasets
- Triangulation and head-pose estimation were developed
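To give a flavour of the triangulation step named above, here is a minimal two-view linear (DLT) triangulation sketch in NumPy. The toy projection matrices, point coordinates and `triangulate` helper are illustrative assumptions, not the consortium’s implementation:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: 2D image coordinates of the same point in each view.
    Solves A X = 0 for the homogeneous 3D point via SVD.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                     # null-space vector = homogeneous point
    return X[:3] / X[3]            # dehomogenise

# Toy stereo rig: identity intrinsics, 10 cm baseline along x (metres)
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
X_true = np.array([0.02, -0.01, 0.6])   # a facial landmark 60 cm away

# Project into each view, then recover the 3D point
h = np.append(X_true, 1.0)
x1 = (P1 @ h)[:2] / (P1 @ h)[2]
x2 = (P2 @ h)[:2] / (P2 @ h)[2]
print(triangulate(P1, P2, x1, x2))  # recovers X_true (noise-free case)
```

Head-pose estimation then typically fits a rigid 3D face model to such triangulated (or 2D) landmarks, e.g. via a perspective-n-point solve.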

Development of the second prototype (PT2)
- HW setup was installed in the cockpit demonstrator at THALES-Cergy

Management
- Ethics proposal was submitted to the relevant ethics committee in Zurich and was approved prior to PT1 data collection
- Data management plan was established
- Plan for communication, dissemination and exploitation of the project results was established
The PEGGASUS consortium has developed a multi-camera vision system for tracking the pilots’ eye gaze and gestures in real time in the cockpit environment. Once this multimodal system is complete, it will allow a leap towards a more comprehensive human–machine interface in the cockpit, reducing the pilots’ stress and cognitive load. Better awareness of the instruments can also help flight management optimize trajectories and better manage fuel use, in line with the overall objectives of the Clean Sky JU.
PEGGASUS will have a positive impact on the European aviation industry by generating knowledge and experience with the first TRL4 eye-gaze monitoring system coupled with avionics. The industry (avionics and aircraft manufacturers, airlines) will be well positioned to develop systems that make use of the data generated, thanks to the consortium’s proven record of maturing technologies developed in Clean Sky into high-TRL industrial developments.
Beyond the aeronautical domain, the developed system will have a major impact on eye-tracking technologies in general, once it has been tested and validated in the demanding environment of a cockpit. Automotive and railway applications, to name a few, can benefit strongly from these improvements.

Progress
- Developing a calibrated multi-camera system with a 3D reconstruction error of less than 2 mm
- Developing algorithms for face detection and 3D facial landmark localisation in real time (60 frames per second)
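A reconstruction-error target such as the one above is commonly verified by reconstructing known calibration-target points and computing the RMS 3D error against ground truth. The check below is a hypothetical sketch with synthetic data, not the project’s validation procedure:

```python
import numpy as np

def rms_error_mm(reconstructed, ground_truth):
    """RMS Euclidean distance between matched 3D point sets, in millimetres.

    Both arrays are (N, 3) and expressed in metres.
    """
    d = np.linalg.norm(reconstructed - ground_truth, axis=1)
    return float(np.sqrt(np.mean(d ** 2)) * 1000.0)  # metres → mm

# Synthetic example: 20 calibration-target points with ~0.5 mm per-axis noise
rng = np.random.default_rng(0)
truth = rng.uniform(-0.2, 0.2, size=(20, 3))
noisy = truth + rng.normal(0.0, 0.0005, size=truth.shape)
err = rms_error_mm(noisy, truth)
print(f"RMS 3D error: {err:.2f} mm (target < 2 mm)")
```

Reporting the error as an RMS over many target points, rather than a single worst case, matches how calibration accuracy is usually quoted.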

Next steps
- Data acquisition using the second prototype of the PEGGASUS system installed in the cockpit demonstrator
- Validation of the algorithms (eye-gaze detection and gesture recognition) using the acquired datasets
- Validation of the system based on typical aviation use cases with participation of pilots