
Earth-like Planet Imaging with Cognitive computing

Periodic Reporting for period 4 - EPIC (Earth-like Planet Imaging with Cognitive computing)

Reporting period: 2023-11-01 to 2025-04-30

Among the various exoplanet detection methods, direct imaging has an important role to play: by spatially separating the faint glow of exoplanets from the blinding light of their host star, it enables the characterisation of exoplanet structure and composition. However, directly imaging exoplanets is a formidable challenge, due to the huge contrast between bright stars and faint planets, and to the minute separation between them as seen from Earth. Dedicated instruments have been developed and built to tackle this challenge, but despite the efforts put into this endeavour in the last 10+ years, only a handful of exoplanets have been directly imaged. With this research project, we aim to use the latest advances in the field of machine learning to unleash the full power of direct imaging instruments. For that, we focus on two main bottlenecks where machine learning can make a significant difference: the optimal operation and calibration of the instruments, and image processing. In the first case, our goal is to optimise the quality of the recorded data by correcting wavefront errors due to atmospheric turbulence and optical aberrations in the instrument. In the second case, the goal is to leverage the power of supervised machine learning methods to identify fainter planets in existing and future data sets. These advances are specifically geared towards the METIS instrument for the Extremely Large Telescope, with the overarching goal to enable the first direct detection of temperate rocky planets around nearby stars in the early 2030s. Throughout this project, we have led the optimisation of the METIS final design using a custom end-to-end simulation framework, as well as the manufacturing, testing, and integration of high-contrast imaging components. In parallel, we have laid the foundations for deploying machine learning tools in the operations and scientific exploitation of METIS.
Two main types of activities have been performed over the course of the project: (i) fundamental development of techniques and methods to improve high-contrast imaging, and (ii) application of these techniques and methods to instrumental developments and data analysis.

In the first category, we have developed a new algorithm, referred to as the "regime-switching model detection map" (RSM), to process high-contrast imaging data sets. RSM uses advanced statistical tools originally developed in the field of economics to analyse time series. In parallel, we proposed new metrics to compare the performance of various image processing algorithms, and organised a community-wide exoplanet imaging data challenge in an attempt to rank the merits of the various algorithms proposed over the last 10+ years in the high-contrast imaging community. Our RSM map provided some of the best results among about thirty entries. By contrast, our first machine-learning algorithm based on a convolutional neural network (CNN) showed significant performance limitations, which we addressed with follow-up developments. Besides image processing, the work also focused on the development of new focal-plane wavefront sensing (FP-WFS) techniques for high-contrast imaging instruments. We reformulated the problem of FP-WFS as a supervised machine learning problem, and trained a series of state-of-the-art CNNs to perform this task. This analysis shows that CNNs provide robust FP-WFS capabilities, with performance compatible with fundamental limits. Finally, we started to explore the design of new types of vortex coronagraphs based on optical metasurfaces, in an attempt to enable better starlight cancellation.
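To make the supervised-learning reformulation of FP-WFS concrete, the sketch below trains a small convolutional network to regress a few aberration coefficients from simulated focal-plane intensity images. It is a minimal illustration only, not the actual METIS pipeline: the toy optical model, the three aberration modes, the defocus diversity value, and the network architecture are all assumptions chosen for brevity.

# Illustrative sketch: framing focal-plane wavefront sensing (FP-WFS)
# as supervised learning with a small CNN. Toy optical model, not METIS.
import numpy as np
import torch
import torch.nn as nn

N = 64  # focal-plane image size in pixels (illustrative)

def toy_psf(coeffs, diversity_defocus=0.0, n=N):
    # Toy PSF for a circular pupil with three "Zernike-like" modes:
    # tip, tilt, defocus (arbitrary normalisation).
    y, x = np.mgrid[-1:1:128j, -1:1:128j]
    r2 = x**2 + y**2
    pupil = (r2 <= 1.0).astype(float)
    phase = coeffs[0] * x + coeffs[1] * y + (coeffs[2] + diversity_defocus) * (2 * r2 - 1)
    field = pupil * np.exp(2j * np.pi * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    c = psf.shape[0] // 2
    psf = psf[c - n // 2:c + n // 2, c - n // 2:c + n // 2]
    return (psf / psf.max()).astype(np.float32)

def make_batch(batch_size=32):
    # Each sample: one in-focus and one defocused image (phase diversity).
    coeffs = np.random.uniform(-0.3, 0.3, size=(batch_size, 3)).astype(np.float32)
    imgs = np.stack([np.stack([toy_psf(c), toy_psf(c, diversity_defocus=0.5)])
                     for c in coeffs])
    return torch.from_numpy(imgs), torch.from_numpy(coeffs)

# Small CNN: two focal-plane images in, three aberration coefficients out.
model = nn.Sequential(
    nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * (N // 4) ** 2, 64), nn.ReLU(),
    nn.Linear(64, 3),
)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):  # short illustrative training loop
    images, targets = make_batch()
    loss = loss_fn(model(images), targets)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
    if step % 50 == 0:
        print(f"step {step:3d}  mse {loss.item():.4f}")

The second, defocused image channel reflects a general constraint of FP-WFS: without a known phase diversity (or some other asymmetry in the system), the sign of even aberration modes such as defocus cannot be recovered from a single intensity image.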

In the second category, our efforts focused on the detailed design ("Phase C") of the METIS instrument. Our main goal was to finalise the detailed design of the METIS coronagraphs and of its high-contrast imaging modes, and to define the detailed strategy that will be used to optimise the instrument performance. We put significant effort into the adaptation and practical implementation of FP-WFS techniques to the case of METIS, using our machine learning approach. These activities are largely based on the end-to-end simulation software that we specifically developed for METIS, which also provides a means to predict the yield of METIS in terms of planet detections. Besides METIS, we also made a strong contribution to an early demonstration of ground-based high-contrast imaging in the mid-infrared through the NEAR project at the Very Large Telescope. Our contribution consisted of the delivery of a new coronagraph and of a dedicated pointing control algorithm, which we helped install and test at the telescope. We contributed to the 100-h observing campaign on alpha Centauri with VLT/NEAR, and to the subsequent data analysis. This project led to the first candidate detection of a planet less massive than Jupiter, possibly down to the mass of Neptune. Besides our contribution to the VLT/NEAR observing campaign, we participated in a series of other observing programs in collaboration with several other teams, including the early scientific exploitation of the vortex coronagraph that we built and commissioned for the ERIS camera at the VLT.
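As a toy illustration of how an end-to-end simulator can feed a planet-detection yield estimate, the Monte Carlo sketch below draws a synthetic planet population and counts the fraction lying above an assumed 5-sigma contrast curve. The contrast values, separation range, and population model are arbitrary placeholders, not METIS performance figures.

# Toy Monte Carlo yield estimate: fraction of a synthetic planet population
# lying above an assumed 5-sigma contrast curve. Illustrative numbers only.
import numpy as np

rng = np.random.default_rng(0)

# Assumed 5-sigma detection limit: contrast vs angular separation.
sep_grid = np.array([0.05, 0.1, 0.2, 0.4, 0.8])            # arcsec
contrast_limit = np.array([1e-4, 1e-5, 3e-6, 1e-6, 5e-7])   # planet/star flux ratio

def detectable_fraction(n_draws=100_000):
    # Toy planet population: log-uniform in separation and in contrast.
    sep = 10 ** rng.uniform(np.log10(0.05), np.log10(0.8), n_draws)
    contrast = 10 ** rng.uniform(-7, -4, n_draws)
    limit_at_sep = np.interp(sep, sep_grid, contrast_limit)
    return np.mean(contrast > limit_at_sep)

print(f"Detectable fraction of the toy population: {detectable_fraction():.1%}")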
The deployment of deep learning techniques in the field of high-contrast imaging goes well beyond the state of the art (as it stood in 2019, when this project started). Beyond the results described in the previous sections, this project has a series of short-, mid-, and long-term perspectives:

- The image processing algorithms developed during this project can be used to reassess the results of major high-contrast imaging surveys. Such a reassessment has already started on one of the most prominent ground-based high-contrast imaging surveys, but can be extended to others. We expect to improve the detection limits of these surveys, which could lead to the discovery of new candidate planets and will tighten the current constraints on giant planet populations at large separations.

- Focal-plane wavefront sensing using supervised deep learning has been demonstrated in simulations and in the lab during this project. In the near future, we will carry out the first on-sky demonstrations. We will also further explore the use of reinforcement learning to optimise the real-time control of wavefront errors. These techniques will be deployed on METIS in the mid-term (early 2030s).

- Among the unexpected discoveries enabled by this project, the harmful effect of water vapour fluctuations in the atmosphere on mid-infrared high-contrast imaging performance is one of the most notable. Correcting for this effect in real time has been a strong focus of our work, and will remain at the centre of our attention in the next few years leading up to the commissioning of the METIS instrument.

- During the course of this project, we have developed a simulation framework to evaluate the performance of focal-plane phase masks for starlight suppression, and investigated metasurface designs in an attempt to improve the performance of such masks. This work could have direct mid- to long-term consequences on the design of future ambitious instruments and observatories, including future instruments on the Extremely Large Telescope or NASA's Habitable Worlds Observatory (an idealised vortex phase mask is sketched below for illustration).
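As a minimal illustration of the focal-plane phase masks mentioned above, the snippet below constructs the ideal transmission of a vortex mask: a pure phase ramp wrapping an integer number of times (the topological charge) around the optical axis. The grid size and charge are arbitrary choices, and no metasurface implementation detail is modelled.

# Ideal vortex focal-plane phase mask: transmission exp(i * charge * theta).
# Grid size and topological charge are arbitrary illustrative choices.
import numpy as np

n = 256      # grid size in pixels
charge = 2   # topological charge of the vortex

y, x = np.mgrid[-1:1:complex(0, n), -1:1:complex(0, n)]
theta = np.arctan2(y, x)              # azimuthal angle around the optical axis
mask = np.exp(1j * charge * theta)    # ideal vortex transmission (unit amplitude)

# The phase ramps by 2*pi*charge around the centre and wraps into [-pi, pi].
print("phase range (rad):", np.angle(mask).min(), np.angle(mask).max())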
Figure: Simulated observations of alpha Centauri with the METIS N-band imager.