Content archived on 2024-06-18

Understanding the Dark Universe with 3D Weak Gravitational Lensing

Final Report Summary - DARKMATTERDARKENERGY (Understanding the Dark Universe with 3D Weak Gravitational Lensing)


The standard cosmological model successfully describes an enormous range of observations using only a small number of parameters. However, profound questions about its interpretation remain unanswered. It appears that particles from the standard model of particle physics represent less than 5% of the total mass-energy content of the Universe; the rest is invisible dark matter (27%) and dark energy (68%). The origin and nature of this dark sector are still unknown, but it has shaped the evolution of the Universe. Massive structures of dark matter started growing soon after the Big Bang, and provided the scaffolding inside which galaxies, planets, and life were built.

Galaxy clusters are key to understanding the dark sector. They are the densest regions of the Universe, in which gravity pulls together the different species of particles: ordinary particles from the periodic table and exotic material from the dark sector. To investigate the large-scale relationship between dark and ordinary matter, I have published several analyses of large samples of clusters, comparing the amount and distribution of dark matter seen in gravitational lensing with the distribution of gas seen in X-ray emission or via the Sunyaev-Zel’dovich scattering of photons from the Cosmic Microwave Background. By exploring new regimes of low-mass clusters and clusters early in the evolution of the Universe, I have uncovered some unexpected relationships (for example, that the “luminosity-temperature relation” continues remarkably uniformly as a power law over two decades in mass, and has remained constant for the past ~5 billion years), whose origin is tantalizing but remains unresolved.
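For readers unfamiliar with the notation: cluster scaling relations such as this one are conventionally parameterised as power laws. A generic form is sketched below; the normalisation, pivot and slope are placeholders, not values measured in this project.

```latex
% Generic power-law parameterisation of the cluster
% luminosity-temperature relation. The normalisation L_0, pivot
% temperature T_0 and slope \alpha are placeholders, not values
% measured in this project.
\begin{equation}
  L_X = L_0 \left( \frac{T}{T_0} \right)^{\alpha}
\end{equation}
% "Uniform over two decades in mass" means that a single (L_0, \alpha)
% pair describes systems from group scales (~10^{13} solar masses)
% up to the most massive clusters (~10^{15} solar masses).
```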

More unambiguous constraints upon the particle-physics properties of dark matter have arisen from analyses of collisions between massive galaxy clusters, such as the ‘Bullet Cluster’ 1E0657-56 (Clowe et al. 2006). At the point of collision, ordinary particles slow down or stop, like a giant car crash. However, dark matter passes through the impact seemingly unaffected, indicating that it experiences very small interaction forces with either ordinary matter or other dark matter. Using the ESA/NASA Hubble Space Telescope during this project, I found two more collisions with a similar configuration (MACSJ0025.4-1222 and Abell 520), which have tightened the overall constraints on the interaction cross-sections of dark matter. The deliberate discovery of these additional systems has critically improved the scientific credibility of the field, since the first was observed serendipitously within a larger survey. However, I also discovered a third collision (Abell 2744) whose configuration is intriguingly different, and I have recently been awarded time on the Hubble Space Telescope, the NASA Chandra X-ray Observatory, and the ESO Very Large Telescope to study a fourth (Abell 3827). The latter in particular has a rare dynamical history, having recently undergone a simultaneous merger of four giant elliptical galaxies and their accompanying dark matter. The coincidence of four merging systems enables an even more rigorous statistical analysis, and early analyses suggest that the dark matter has also slowed down slightly during the collision. If confirmed by my ongoing observations, this will be the first empirical evidence that dark matter has a small but non-zero interaction cross-section.
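The logic behind these cross-section constraints can be summarised with a simple optical-depth argument. The sketch below is a standard back-of-envelope form, not the detailed likelihood analysis performed in this project, and the numbers in the comments are illustrative orders of magnitude only.

```latex
% Optical-depth sketch of the constraint. A dark matter particle crossing
% the collision scatters with a probability set by the self-interaction
% cross-section per unit mass, \sigma/m, and the dark matter surface
% density \Sigma_{DM} it traverses:
\begin{equation}
  \tau = \frac{\sigma}{m}\,\Sigma_{\rm DM}, \qquad
  f_{\rm scat} \simeq 1 - e^{-\tau}
\end{equation}
% Observing that the dark matter passes through essentially unaffected
% requires f_scat << 1. For typical cluster surface densities
% (\Sigma_{DM} of order 0.1-0.3 g/cm^2, an illustrative range), this
% translates into bounds of order \sigma/m \lesssim 1 cm^2/g.
```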

While these collisions between massive galaxy clusters provide the only constraints on the interaction cross-section of dark matter that come from an actual detection (rather than from the absence of a detection, as in terrestrial particle accelerators), their rarity has begun to limit their exploitation. I have therefore developed a new method that extends this approach to the gradual growth of all structures in the Universe, which assemble through a process of small-scale mergers. I realised that each of these mergers acts as a small bullet, with the infalling dark matter and baryonic material entering large structures along slightly different trajectories. I first developed an approximate, analytic model of how the dynamics depend on the particle properties of dark matter, and used this to verify that the technique is viable; I have now used the grant to kick-start a full effort to incorporate particle physics into cosmological simulations and obtain more accurate predictions, so that observations can be reliably interpreted. With my PhD student, I have also just completed the first observational detection of this signal, by combining the full archives of the Hubble Space Telescope and the Chandra X-ray Observatory. This analysis achieves a statistical precision similar to the sum of current analyses of the few known large-scale collisions. However, the systematic errors are no longer limited by uncertainty in the impact velocity, impact parameter and initial velocity of individual systems, and the technique can grow to almost unlimited statistical precision as the archives (and the overlap between them) continue to grow.
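To illustrate the statistical idea, the following Python sketch stacks the gas-versus-dark-matter offsets of many infalling substructures. All inputs and the injected signal are synthetic, and the offset definition is a simplified caricature of the published measurement, not its actual implementation.

```python
import numpy as np

def stacked_offset(gas_xy, dm_xy, galaxy_xy):
    """Stack the dark matter offset over many infalling substructures.

    Each row of the inputs is the 2D position (e.g. in kpc) of one
    substructure's gas peak, dark matter (lensing) peak, and stellar
    (galaxy) peak. In practice these would come from X-ray and lensing
    maps; here they are just arrays.
    """
    gas_xy, dm_xy, galaxy_xy = (np.asarray(a, float)
                                for a in (gas_xy, dm_xy, galaxy_xy))

    # Local "infall" axis for each system: the gas lags the galaxies
    # because it feels ram pressure, so (gas - galaxies) points back
    # along the trajectory. Normalise to a unit vector per system.
    infall = gas_xy - galaxy_xy
    unit = infall / np.linalg.norm(infall, axis=1, keepdims=True)

    # Project the dark matter-galaxy separation onto that axis.
    # Perfectly collisionless dark matter predicts a mean of ~0; a
    # positive mean (dark matter trailing toward the gas) hints at drag.
    proj = np.einsum('ij,ij->i', dm_xy - galaxy_xy, unit)
    return proj.mean(), proj.std(ddof=1) / np.sqrt(len(proj))

# Toy usage: 1000 synthetic substructures with a ~2 kpc drag signal injected.
rng = np.random.default_rng(0)
n = 1000
galaxies = rng.normal(0.0, 50.0, (n, 2))
gas = galaxies + np.column_stack([rng.normal(30.0, 10.0, n),
                                  np.zeros(n)]) + rng.normal(0.0, 20.0, (n, 2))
dm = galaxies + [2.0, 0.0] + rng.normal(0.0, 20.0, (n, 2))

mean, err = stacked_offset(gas, dm, galaxies)
print(f"stacked offset = {mean:.2f} +/- {err:.2f} kpc")  # recovers ~2 kpc
```

The point of the stack is that each individual "bullet" is far too noisy to measure alone, but the error on the mean shrinks as 1/sqrt(N), while systematics tied to any single system's geometry average out.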

With an eye to the long-term future of the field, I have also led efforts to develop two new telescopes: the ESA Euclid telescope, to be launched into space in 2019, and the HALO balloon-borne telescope, scheduled for launch to the top of the atmosphere next year. For ESA's Euclid, I manage the effort to minimise the impact on science observables of the radiation damage that the satellite will accumulate in the harsh environment of space. This involves coordinating scientists and engineers to develop radiation-tolerant hardware, then to understand its limitations and develop post-processing software that can remove the residual effects from data. This effort has been very successful so far on both fronts, and will continue over the next five years to drive the mission's hardware and ground-segment performance budgets and to optimise the survey strategy. One immediately tangible outcome has been the application of these mitigation strategies to the Hubble Space Telescope's Advanced Camera for Surveys. My software has now been incorporated into the standard data analysis pipeline at the Space Telescope Science Institute, and is being used to enable full science exploitation of HST's widest-field camera (which has also been in orbit the longest) for all science purposes.
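In outline, such correction software works by forward-modelling how radiation-damage traps trail charge during CCD readout, then iteratively inverting that model. The sketch below is a heavily simplified caricature assuming a single trap species with made-up parameters; it is not the production pipeline code.

```python
import numpy as np

def add_cti_trails(image, trap_density=0.05, release=0.3):
    """Toy forward model of charge-transfer-inefficiency (CTI) trailing.

    A single species of radiation-damage trap captures a fixed fraction
    of each pixel's charge during readout and re-releases it exponentially
    into the following (trailing) pixels. Both parameters are made up for
    illustration; real detectors need several calibrated trap species.
    """
    out = np.asarray(image, float).copy()
    nrow, ncol = out.shape
    for col in range(ncol):          # each column is read out independently
        trapped = 0.0
        for row in range(nrow):      # rows pass the traps in readout order
            released = trapped * release          # traps leak into this pixel
            captured = out[row, col] * trap_density
            out[row, col] += released - captured  # net charge change
            trapped += captured - released        # update trap occupancy
    return out

def correct_cti(observed, n_iter=5, **model_kwargs):
    """Invert the forward model by fixed-point iteration: trail the current
    estimate of the true image and nudge it by the residual."""
    observed = np.asarray(observed, float)
    estimate = observed.copy()
    for _ in range(n_iter):
        estimate += observed - add_cti_trails(estimate, **model_kwargs)
    return estimate

# Toy usage: trail a delta-function "star", then recover it.
truth = np.zeros((20, 1))
truth[5, 0] = 100.0
damaged = add_cti_trails(truth)         # the star loses flux into a trail
recovered = correct_cti(damaged)
print(np.abs(recovered - truth).max())  # small residual after 5 iterations
```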

Most excitingly, I also initiated a project at NASA to construct the balloon-borne telescope HALO (High Altitude Lensing Observatory), and assembled a pan-European team to join the consortium. This exploits a new technical capability for long-duration balloons which, flying above 99% of the Earth's atmosphere, provide a platform for space-quality observations at about 1% of the cost of a dedicated space mission. The main technical challenge is to stabilise the camera, which swings on a ~100 m rope beneath the balloon. This has been achieved before for imaging in the microwave region of the spectrum, which naturally has coarse resolution, so camera stability within about 10 seconds of arc is sufficient. For much higher resolution optical imaging, our target is an rms stability of 0.1 seconds of arc. The UK consortium has now delivered the mission's guide control camera to JPL for integration, completing our hardware contribution; the first flights are scheduled for 2015, and we will also lead the data analysis. I have thus ensured European partnership in this new balloon-platform technology, which has the potential to revolutionise both astronomy and Earth-imaging applications.
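The hundred-fold jump in the pointing requirement follows from the diffraction limit. As an illustration only (assuming a roughly 1 m aperture, which may differ from the actual HALO design):

```latex
% Diffraction-limited angular resolution: \theta \approx 1.22 \lambda / D.
% The 1 m aperture below is an illustrative assumption, not the HALO spec.
\begin{align}
  \theta_{\rm optical}   &\approx 1.22 \times \frac{0.5\,\mu\mathrm{m}}{1\,\mathrm{m}}
                          \approx 0.13'' \\
  \theta_{\rm microwave} &\approx 1.22 \times \frac{1\,\mathrm{mm}}{1\,\mathrm{m}}
                          \approx 4'
\end{align}
% Pointing jitter must stay well below a resolution element, so ~10''
% suffices for microwave imaging, but optical imaging needs ~0.1''.
```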

Finally, I have been awarded a permanent contract at a European university, thus achieving the final goal of the grant.