
Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions

Periodic Reporting for period 2 - PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions)

Reporting period: 2020-01-01 to 2021-08-31

Since the autopilot was first demonstrated in 1914, pilots have become increasingly “connected” to their aircraft through the numerous instruments in the cockpit. These tools were introduced to assist pilots and thereby increase safety. Ironically, today there is a risk that pilots struggle to cope with the vast amounts of information the instruments generate. To keep pioneering safety, situational awareness therefore needs to be improved and optimized. Both the maturation of current technologies and new developments are required to meet the needs of next-generation aircraft, resulting in increased crew efficiency and enhanced flight safety. A smart cockpit interface, fully adapted to the pilots’ needs, is one way to meet these objectives.
Led by CSEM, the PEGGASUS consortium has created a human–machine interface system for cockpit avionics, paving the way towards new-generation cockpits. Using the latest artificial intelligence (AI) and computer vision technologies, the project combined remote eye-gaze tracking and gesture recognition for pilots in a single framework. Its purpose is to enhance human–machine interaction in today’s complex flight operations, improving safety, crew efficiency and pilot training. Such a system must remain accurate and robust throughout a flight, despite aviation-specific environmental factors such as extreme lighting conditions and vibrations. The use of eye tracking in combination with measures such as skin conductance was also investigated in the context of crew monitoring.
To design a powerful and tailored solution, the consortium partners contributed the essential complementary skills. CSEM developed the vision system and the machine learning algorithms for gaze tracking and gesture recognition, addressing the unique challenges and requirements of the aeronautic context. SERMA Ingénierie was responsible for integrating the PEGGASUS hardware into a cockpit demonstrator for testing. ETH Zurich, with its expertise in human–machine interaction and aviation safety, designed and executed a simulator study with ten pilots from SWISS International Airlines. Additionally, SWISS International Airlines supported the consortium partners with a professional airline’s expertise on the use cases and insights into future training possibilities.
As the first step in designing a non-intrusive system for tracking eye gaze and hand gestures, the global system specification was established. First, representative avionics use cases were defined for evaluating the system. Hardware and software specifications were then derived for the three envisioned prototypes from the use-case requirements and the constraints defined by the topic leader.
A non-intrusive multi-camera system was developed to meet the requirements established in the first phase. Two hardware prototypes were installed in the lab environment, and the final prototype was installed in the cockpit simulator for data acquisition and system evaluation. This iterative development made it possible to optimize the techniques towards an accurate and robust eye-tracking and hand-gesture recognition system.
The crew interface system included software for acquiring synchronized frames from the multi-camera system. State-of-the-art computer vision and machine learning techniques were implemented to provide real-time gaze tracking and gesture recognition from video sequences.
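To illustrate what such an acquisition layer involves, the following is a minimal sketch of software-synchronized frame grabbing from two cameras, pairing frames by timestamp. The actual PEGGASUS synchronization mechanism is not described in this report (hardware triggering is common in such systems); the camera indices, skew threshold and helper function are purely illustrative assumptions.

```python
import time
import cv2

# Hypothetical grab loop for a two-camera rig, pairing frames by software
# timestamps. This is a sketch only; the real system's synchronization
# mechanism is not described in the report.
caps = [cv2.VideoCapture(i) for i in (0, 1)]

def grab_synchronized(max_skew_s=0.008):
    """Return frames whose grab times differ by less than max_skew_s, else None."""
    frames, stamps = [], []
    for cap in caps:
        ok, frame = cap.read()
        if not ok:
            raise RuntimeError("camera read failed")
        frames.append(frame)
        stamps.append(time.monotonic())
    if max(stamps) - min(stamps) > max_skew_s:
        return None  # too much skew between cameras; caller should retry
    return frames

pair = grab_synchronized()  # feed into the gaze/gesture pipeline when not None
for cap in caps:
    cap.release()
```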
The algorithmic pipeline for eye-gaze tracking was developed and iteratively optimized to reach the speed and accuracy required by the aviation use cases. The pipeline, a combination of data-driven and analytical approaches, runs in real time at 60 fps with a latency of about 32 ms. The eye-gaze estimation error was evaluated as the point-of-regard distance error with respect to the 3D point location. Across 17 participants, an average error of less than 1.1 cm was achieved over 28 gaze points representing the cockpit instruments, placed about 80-110 cm from the participants’ eyes. The angular gaze deviation drops below 1° for the panels for which accurate eye gaze was required by the use cases.
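As a sanity check, the reported point-of-regard error and viewing distances are consistent with the sub-degree angular figure: a 1.1 cm error at 80-110 cm corresponds to roughly 0.6-0.8° via the relation θ = arctan(e/d). A minimal computation:

```python
import math

# Reported figures from the evaluation: 1.1 cm mean point-of-regard error,
# gaze targets placed roughly 80-110 cm from the participants' eyes.
error_cm = 1.1
for distance_cm in (80, 110):
    theta_deg = math.degrees(math.atan(error_cm / distance_cm))
    print(f"at {distance_cm} cm: {theta_deg:.2f} deg")
# at 80 cm: 0.79 deg
# at 110 cm: 0.57 deg
```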
To implement gesture recognition, a deep neural network model was developed to differentiate five pre-defined gestures. The model was improved using the dataset acquired with the PEGGASUS system and was incorporated into the PEGGASUS software, allowing the gesture recognition module to run in real time within the crew interface system of the final prototype.
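The report does not describe the network architecture, so the following is only a hedged sketch of what a five-class gesture classifier over single frames could look like in PyTorch; the GestureNet name, layer sizes and input resolution are assumptions for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical five-class gesture classifier; the actual PEGGASUS network
# architecture is not specified in the report.
class GestureNet(nn.Module):
    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

model = GestureNet()
logits = model(torch.randn(1, 1, 96, 96))  # one grayscale frame (assumed size)
gesture_id = logits.argmax(dim=1).item()   # index of the predicted gesture
```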
In the context of the crew monitoring framework, the use of eye-tracking and electrodermal activity measures for pilot state detection was investigated. The machine learning based framework was designed and evaluated on a dataset comprising recordings from a total of 40 participants across the data acquisition campaigns. Results show that the framework can classify the pilot’s state in three flight phases.
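The exact features and classifier are not specified in the report; as a hedged sketch, a pilot-state classifier over fused eye-tracking and electrodermal features might be prototyped as follows, where the feature table and labels are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical fused feature table: per-window eye-tracking features
# (e.g. fixation duration, saccade rate) concatenated with electrodermal
# activity features (e.g. tonic level, phasic peak count).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))     # 120 analysis windows, 6 fused features
y = rng.integers(0, 3, size=120)  # three flight-phase state labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # ~0.33 chance level on random data
```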
The PEGGASUS crew interface and monitoring systems were evaluated in a flight simulator study with 10 professional airline pilots in a realistic environment based on regular training.
An ethics proposal was approved by the ethics committee at ETH Zurich prior to the data collection campaigns. While the acquired data made it possible to tune the system’s performance to the specified requirements, it also required the strict implementation of data management strategies in compliance with the GDPR.
The work performed in the project was presented internally by all consortium partners within their organizations, and the results were presented at scientific conferences. A press release issued at the beginning of the project achieved good media coverage, including several interviews and articles. The project results have also been presented to potential industrial partners who have shown interest in the technology, leading to discussions on possible future partnerships and to the successful acquisition of new projects.
The PEGGASUS consortium has developed a multi-camera vision system for tracking pilots’ eye gaze and gestures in real time in the cockpit environment. The system is a leap towards a more comprehensive human–machine interface in the cockpit that reduces pilots’ stress and cognitive load while advancing future pilot training techniques. Better awareness of the instruments will help flight management optimize trajectories and better manage fuel use, in line with the overall objectives of the Clean Sky JU.
The PEGGASUS project will have a positive impact on the European aviation industry by generating knowledge of and experience with the first TRL4-5 eye-gaze monitoring system coupled with avionics. The industry (avionics and aircraft manufacturers, airlines) will be well positioned to develop systems that make use of the data generated, thanks to the consortium’s proven success in maturing technologies developed in Clean Sky into high-TRL industrial developments.
Beyond the aeronautical domain, the developed system will have a major impact on eye-tracking technologies in general, having been tested and validated in the demanding environment of a cockpit. Automotive and railway applications, to name a few, can benefit strongly from these improvements.
Two images showing the results of the algorithms, including pupil detection and eye-gaze estimation.
UPT hardware setup installed in the cockpit simulator.