
Bias and Clustering Calculations Optimised: Maximising discovery with galaxy surveys

Periodic Reporting for period 4 - BACCO (Bias and Clustering Calculations Optimised: Maximising discovery with galaxy surveys)

Reporting period: 2021-06-01 to 2023-08-31

Over the last 20 years, our understanding of the Universe has grown tremendously. There is a relatively simple model, referred to as LCDM, which can explain the main properties of the Universe that surrounds us – from its infancy to its present-day structure. There are, however, big unanswered questions in this model. For instance, the nature of dark energy, the origin of the seeds from which structure grows, and the properties of dark matter are all still unknown.

In the coming years, a new generation of telescopes – located in different places around the world and in outer space – will map the distribution of dark matter, gas, and of millions of galaxies. This will offer the possibility of profound discovery. We will be able to carry out tests where Einstein's General Relativity might break down, where a new fundamental particle might appear, or where we might find a clue to explain the accelerated cosmic expansion.

The goal of the BACCO project was to increase the chances for such discoveries by developing a new generation of supercomputer simulations mimicking large regions of the Universe. These simulations were used to generate millions of virtual universes which we compared to the real one. In each of these computer-generated universes, we varied the amount of cosmic ingredients, the strength of the gravity law, and the processes that determine the formation of galaxies, trying to find the combination that best coincides with the observed Universe.
Since the beginning of this project, we have carried out some of the largest cosmological simulations in the world to study the large-scale distribution of matter, gas, and galaxies in the Universe. These calculations were able to resolve the smallest clumps of dark matter where we expect large galaxies to form, while covering regions billions of light-years across.

Even though a single simulation originally takes weeks of processing on a large supercomputer, we have designed a method that uses one simulation to create another one with different assumptions about the Universe. This transformation takes only 5 minutes on a normal desktop computer, so we have been able to create thousands of new pseudo-simulations as a function of the amount of dark matter, neutrinos, and the properties of dark energy.
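The core idea behind this kind of rescaling can be illustrated with a toy example: given the matter power spectrum of a simulated cosmology, search for the length rescaling that best reproduces the power spectrum of a target cosmology. Everything below (the power-law spectra, function names, and the brute-force scan) is an illustrative sketch, not the actual BACCO algorithm.

```python
import numpy as np

def toy_power_spectrum(k, amplitude, slope):
    """Toy linear matter power spectrum: P(k) = amplitude * k**slope."""
    return amplitude * k ** slope

def best_length_rescaling(k, p_source, p_target, scalings):
    """Scan candidate length rescalings s. Rescaling lengths by s maps
    wavenumbers as k -> k*s and power amplitudes as P -> s**3 * P; return
    the s that minimises the mismatch to the target spectrum."""
    best_s, best_cost = None, np.inf
    for s in scalings:
        valid = k * s <= k[-1]          # stay inside the tabulated range
        p_rescaled = s ** 3 * np.interp(k[valid] * s, k, p_source)
        cost = np.mean((np.log(p_rescaled) - np.log(p_target[valid])) ** 2)
        if cost < best_cost:
            best_s, best_cost = s, cost
    return best_s

k = np.logspace(-2, 0, 50)                       # wavenumbers (arbitrary units)
p_src = toy_power_spectrum(k, 1.0, -1.5)         # "simulated" cosmology
p_tgt = toy_power_spectrum(k, 1.2, -1.5)         # "target" cosmology
s = best_length_rescaling(k, p_src, p_tgt, np.linspace(0.8, 1.2, 401))
```

Because evaluating and comparing spectra is essentially instantaneous, a scan like this runs in well under a second, in contrast to the weeks a full simulation would take.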

Complementing these results, we developed physical models that predict the distribution of gas and stars inside our simulations, and how they are related to dark matter. These models include the impact of the formation of stars and the energy released by supermassive black holes, among other effects. In parallel, we developed new techniques to predict the spatial distribution of galaxies and their shapes. All these models have been extensively tested against hydro-dynamical simulations and other galaxy formation models.

Employing these data, we have trained machine learning algorithms so that the predictions could be obtained in a fraction of a second. We made these results publicly available so that they can benefit researchers from all over the world. These tools have also been adopted by some of the most advanced cosmological surveys to date, such as ESA's Euclid mission.
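The emulation idea can be sketched in a few lines: expensive simulation outputs are computed at a limited set of parameter values, a flexible interpolator is trained on them, and new predictions then cost almost nothing. The radial-basis-function interpolator and the one-parameter toy data below are illustrative assumptions, not the machine learning models actually used in the project.

```python
import numpy as np

def rbf_fit(x_train, y_train, length_scale=0.02):
    """Solve for radial-basis-function weights (a toy emulator)."""
    d = x_train[:, None] - x_train[None, :]
    kernel = np.exp(-0.5 * (d / length_scale) ** 2)
    return np.linalg.lstsq(kernel, y_train, rcond=None)[0]

def rbf_predict(x_new, x_train, weights, length_scale=0.02):
    """Evaluate the trained emulator at new parameter values."""
    d = np.atleast_1d(x_new)[:, None] - x_train[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2) @ weights

# Toy "simulation" outputs: a clustering statistic as a function of one
# cosmological parameter (standing in for, e.g., the matter density).
params = np.linspace(0.2, 0.4, 20)        # parameter values of the runs
outputs = np.sin(10 * params) + params    # stand-in for expensive runs

weights = rbf_fit(params, outputs)
prediction = rbf_predict(0.3, params, weights)[0]   # near-instant evaluation
```

The training step is done once; afterwards each prediction is a small matrix-vector product, which is why emulators deliver in a fraction of a second what would otherwise require a new supercomputer run.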

These novel methods have enabled us to interpret cosmological data over a range of scales never used before, and to employ them for inferring the parameters that determine the evolution and properties of our Universe. Specifically, we analysed the latest data release from the Dark Energy Survey, modelling its gravitational lensing measurements to infer the amplitude of fluctuations as well as the amount of gas in massive galaxy clusters.

All these results have been described in detail in more than 50 peer-reviewed scientific publications, as well as in multiple scientific conferences and events designed for the general public.
To carry out our simulations, we have developed several new algorithms that speed up our simulation software and manage the huge amounts of data generated. In this way we were able to use the power of some of the largest supercomputers in Europe efficiently. We have also designed new methods that allow us to rapidly scan combinations of many different cosmological parameters and find the set that produces the best agreement with a given set of observations.
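The parameter scan can be illustrated with a brute-force chi-square search over a toy two-parameter model. The "observed" data, the power-law model, and the grid search are all hypothetical stand-ins; the real pipeline evaluates emulated predictions inside a statistical inference framework.

```python
import numpy as np

def model(k, amplitude, slope):
    """Toy clustering model: a power law in wavenumber k."""
    return amplitude * k ** slope

# Hypothetical "observed" data, generated from a known truth
# (amplitude = 2.0, slope = -1.2) with 5% measurement noise.
rng = np.random.default_rng(0)
k = np.logspace(-2, 0, 30)
sigma = 0.05
data = model(k, 2.0, -1.2) * (1 + sigma * rng.standard_normal(k.size))

# Scan a grid of parameter combinations and keep the best chi-square.
amps = np.linspace(1.5, 2.5, 101)
slopes = np.linspace(-1.5, -0.9, 61)
best = (np.inf, None, None)
for a in amps:
    for n in slopes:
        chi2 = np.sum(((data - model(k, a, n)) / (sigma * data)) ** 2)
        if chi2 < best[0]:
            best = (chi2, a, n)
chi2_min, a_fit, n_fit = best
```

Because each model evaluation here is cheap, thousands of combinations can be tested in moments; with emulators replacing full simulations, the same logic scales to many cosmological parameters at once.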

In the second half of this project, we increased the sophistication of our models to include, for instance, details of how galaxies of various types form. We also employed new machine learning algorithms to augment our simulations. All these aspects together provided a theoretical counterpart to ongoing and future observational surveys. Together, models and data could in the near future allow us to disentangle the effects of standard physics from those caused by potentially new ingredients.