CORDIS - EU research results

Illuminating the darkness with precision maps of neutral hydrogen across cosmic time

Periodic Reporting for period 2 - MapItAll (Illuminating the darkness with precision maps of neutral hydrogen across cosmic time)

Reporting period: 2022-07-01 to 2023-12-31

Some of the early stages of our Universe's history are shrouded in mystery. We have long been able to study the warm glow of radio waves emitted very soon after the Big Bang -- the so-called Cosmic Microwave Background -- but the period immediately after that, and for the next billion years, is difficult to observe with telescopes. This is largely because stars and galaxies had not yet begun to form, and when they eventually did, they remained rare and, being so far from us today, hard to see. Fortunately, there was an abundance of the most common element -- hydrogen -- at these times. Left alone, hydrogen atoms will occasionally emit a tiny amount of radio waves, with a wavelength of around 21cm. These waves travel through the expanding Universe to reach us here on Earth today, and by studying how much they have been stretched by cosmic expansion, and which direction they came from, we can reconstruct a 3D map, of sorts, of our Universe's distant past.

The problem is that there are many other sources of radio waves at similar frequencies, and they are all much brighter than the 21cm emission. Disentangling the waves that arrive at our radio telescopes, even the most sensitive ones, is extremely challenging. However, we must do an almost perfect job of distinguishing between them if we are to make the aforementioned 3D map and learn about the time before, during, and immediately after the formation of the first stars and galaxies. This project seeks to develop the data analysis tools required to perform this difficult, painstaking task. While many groups are using a hodgepodge of methods to progressively improve measurements of the 21cm emission, we are working on a single framework that tries to do everything that is required in a statistically rigorous way. This is important, as small imperfections can creep in at each stage of the analysis and, if they are not tracked very carefully, can add up and combine to spoil the entire analysis. The downside of our approach, and the reason no one has done it before, is that it is very demanding in terms of computing power, requiring hundreds of thousands, or even millions, of little parcels of information to be tracked to describe what is going on in the data. We have found a novel way of making this statistical exploration of the data tractable with reasonably sized computers. Using a statistical approach called Gibbs sampling, we are able to build up statistical information about the whole dataset by combining a few simpler chunks in a clever way.
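The core idea of Gibbs sampling can be shown in a few lines. The following is a minimal, hypothetical sketch -- not the project's actual code -- using a toy two-variable Gaussian: instead of sampling both variables at once, we alternately draw each one conditioned on the current value of the other, and the chain of draws builds up the full joint distribution.

```python
import numpy as np

# Toy Gibbs sampler (illustrative only): draw from a 2D Gaussian with
# correlation rho by alternating the two one-dimensional conditionals.
rho = 0.8
rng = np.random.default_rng(0)
n_samples = 20000
x, y = 0.0, 0.0
samples = np.empty((n_samples, 2))
cond_std = np.sqrt(1.0 - rho**2)  # std. dev. of each conditional

for i in range(n_samples):
    # Conditional of x given y is N(rho*y, 1 - rho^2), and vice versa:
    # each draw is a cheap 1D problem, yet the chain explores the 2D joint.
    x = rng.normal(rho * y, cond_std)
    y = rng.normal(rho * x, cond_std)
    samples[i] = (x, y)

# The accumulated samples recover the joint structure (the correlation)
est_rho = np.corrcoef(samples[:, 0], samples[:, 1])[0, 1]
print(f"target correlation: {rho}, estimated: {est_rho:.2f}")
```

The same divide-and-conquer logic is what makes the full problem tractable: each "chunk" of parameters has a simple conditional distribution even though the joint distribution over all of them is far too large to handle directly.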

Our overall objectives are to build the software tools and models needed to explain the data from our radio telescopes, and to build a 3D map of the distribution of hydrogen atoms from the early phases of cosmic history onwards. This requires building models and tools for each individual component of the observing process -- the properties of the telescopes, the properties of the other sources of radio emission from the sky, and so on. These must then be combined in a self-consistent way to produce an overall picture of our data. At a minimum, we wish to detect the presence of fluctuations in the distribution of hydrogen in the early Universe; to date, only upper bounds on the maximum possible fluctuation size have been achieved, so measuring this property directly for the first time would be an important result.

In terms of the importance for society, this is very much about knowing where we came from. The questions we can answer with our 3D map are broadly existential in nature, ranging from how long it took for the first stars to form and start exploding (creating the heavier elements in the Universe, like the ones that the Earth formed from), to how likely (or unlikely) it is that we find ourselves in our present place in the Universe. As a welcome byproduct, we are also developing novel tools and techniques that can tackle an extremely difficult statistical problem, which may find applications in other fields and commercial ventures.

The largest part of our work so far has involved building models of the various parts of the observing process, such as the different types of radio emission from the sky (broadly, the 21cm emission and its statistical properties, individual radio sources, and 'diffuse' emission from our own Galaxy), and the way that the telescope responds to radio emission, described by properties such as the 'antenna beam pattern' and the 'antenna gains'. From these we have then been able to develop computationally tractable methods for drawing samples (plausible sets of parameters) from the relevant statistical distributions. If we repeat these statistical draws enough times, informed by the observed data, we can build up a statistically rigorous picture of the actual behaviour of the instrument and the different sources of emission on the sky.

With each of these 'modules' in hand, we can then move on to connecting them together into one large computational scheme. There is an interplay between the parameters from each module; if the antenna beam pattern is slightly mis-estimated, the brightness of the individual radio sources may be mis-estimated in turn, and so on. By drawing many samples across all of the modules, with each module informed by the information from all the others, we can build up a picture of the mutual effects of mis-estimates on all the other estimates. With this picture in hand, we can then use statistical tools to select the 'most likely' values for all of the parameters of the whole system.
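The interplay between modules can be illustrated with a deliberately simple toy model (again hypothetical, not the project's code). Suppose the data are just gain × flux plus noise, standing in for the beam/source-brightness trade-off described above: the two "modules" are alternately updated, each conditioned on the other's current draw, and the chain reveals which combination of parameters the data actually constrain.

```python
import numpy as np

# Toy two-module Gibbs scheme: data d_i = gain * flux + noise, so an error
# in the gain estimate trades off directly against an error in the flux
# estimate. Alternating conditional draws map out this degeneracy.
rng = np.random.default_rng(1)
true_gain, true_flux, sigma = 1.2, 5.0, 0.5
d = true_gain * true_flux + sigma * rng.normal(size=200)

gain, flux = 1.0, 1.0
draws = []
for _ in range(5000):
    # "Module" 1: gain conditional on the current flux draw
    # (Gaussian with a flat prior, since the model is linear in gain)
    var_g = sigma**2 / (len(d) * flux**2)
    gain = rng.normal(d.sum() * flux / (len(d) * flux**2), np.sqrt(var_g))
    # "Module" 2: flux conditional on the current gain draw
    var_f = sigma**2 / (len(d) * gain**2)
    flux = rng.normal(d.sum() * gain / (len(d) * gain**2), np.sqrt(var_f))
    draws.append((gain, flux))

draws = np.array(draws[500:])  # discard burn-in
# Neither factor is individually constrained by these data, but their
# product is pinned down tightly -- the chain makes that explicit.
prod = np.mean(draws[:, 0] * draws[:, 1])
print(f"mean gain*flux: {prod:.2f} (true: {true_gain * true_flux})")
```

The joint samples show directly that only the product gain × flux is well determined here, which is exactly the kind of mutual mis-estimation effect the full scheme is designed to track across many modules at once.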

So far, we have used early versions of these tools to help improve measurements from two radio telescopes -- the Hydrogen Epoch of Reionization Array (HERA) and MeerKAT, both based in the Karoo desert in South Africa. To date, these measurements set the most stringent limits yet on the size of the 21cm fluctuations in the early Universe (HERA), and demonstrate the feasibility of large surveys of the 21cm emission at later times in cosmic history too (MeerKAT).

The tools we have developed all represent advances beyond the state of the art. We can measure more parameters than other statistical methods, and have novel and information-rich approaches to handling each part of the observation process. However, some of these approaches are still at an early stage, and need to be rigorously tested using simulations to make sure we understand how they behave, and to verify that they produce correct results. Once this has been done, we can then run them on the most up-to-date data from each of our telescopes, in order to produce the most sensitive and robust measurements of the 21cm signal.