Content archived on 2024-06-18

Precision Parton Distributions for New Physics Discoveries at the Large Hadron Collider

Final Report Summary - DISCOVERY@LHC (Precision Parton Distributions for New Physics Discoveries at the Large Hadron Collider)


The Large Hadron Collider (LHC) is the most powerful particle accelerator ever built: in its 27 km long tunnel below Geneva, Switzerland, protons are accelerated to almost the speed of light and then brought into collision. These high-energy collisions recreate conditions similar, in energy and temperature, to those shortly after the Big Bang. Under these extreme conditions, physicists study the laws of nature at the smallest distances ever probed, and try to answer some of the ultimate questions that humanity has always wondered about: the origin of the universe, the nature of forces, and the quest for the ultimate constituents of matter. With the recent discovery of a Higgs-like boson at the LHC, recognized by the 2013 Nobel Prize in Physics, particle physics has entered a new era. The main challenge now is to understand in detail the properties of this new particle and, in particular, to assess whether it is the boson that completes the Standard Model of particle physics or whether it has a different nature, for example whether it is a fundamental or a composite scalar particle. In addition, the LHC will continue the search for even more exotic heavy particles at the highest energies ever probed, looking for signs of new forces or extra dimensions.

It is therefore of the utmost importance to provide extremely accurate theoretical predictions for the most relevant processes at the LHC, including Higgs production, as well as for a variety of new physics scenarios. Crucial ingredients of these predictions are Parton Distribution Functions (PDFs), which encode the dynamics determining how the proton’s energy is split among its constituents, quarks and gluons, in each LHC collision. PDFs are intrinsically non-perturbative and thus need to be extracted from data. In recent years, the researcher developed a novel approach to PDF determination based on artificial neural networks, machine-learning techniques and genetic algorithms, so that PDFs are “learnt” from a wide variety of experimental data without the need to impose a theoretical prior, just as we learn how to score a goal without having to solve Newton’s equations of motion, and they “adapt” towards the physically most meaningful solution.
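Schematically, PDFs enter LHC predictions through the standard QCD collinear factorization formula, which writes a hadronic cross section as a convolution of two PDFs with a perturbatively calculable partonic cross section:

\sigma_{pp \to X} = \sum_{a,b} \int_0^1 dx_1\, dx_2\; f_a(x_1, \mu_F^2)\, f_b(x_2, \mu_F^2)\; \hat{\sigma}_{ab \to X}(x_1, x_2, \mu_F^2),

where f_a(x, \mu_F^2) is the density of partons of species a carrying a fraction x of the proton’s momentum at factorization scale \mu_F.

To make the “learning” strategy more concrete, the following is a minimal toy sketch in Python of the general idea: a small neural network parameterizes a PDF-like function of x, and a genetic algorithm evolves its weights to minimize the chi-squared against pseudo-data. It is emphatically not the NNPDF code: the pseudo-data shape, the 1-5-1 network architecture, the positivity trick, and all genetic-algorithm settings are illustrative assumptions, and a real fit involves DGLAP evolution, many datasets, and Monte Carlo replicas for uncertainty estimation.

```python
# Toy sketch (NOT the actual NNPDF code): fit a one-hidden-layer neural
# network to pseudo-data for a PDF-like function f(x) on x in (0, 1),
# using a simple genetic algorithm (mutation + truncation selection).
# The "true" shape, network size, and GA settings are all illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Pseudo-data: a valence-like shape x^a (1-x)^b with Gaussian noise.
x = np.linspace(0.05, 0.95, 30)
truth = x**0.5 * (1.0 - x)**3
sigma = 0.05 * truth + 1e-3
data = truth + rng.normal(0.0, sigma)

N_HIDDEN = 5                   # 1-5-1 architecture, purely illustrative
N_PARAMS = 3 * N_HIDDEN + 1    # hidden weights/biases, output weights/bias

def net(params, x):
    """Feed-forward net: tanh hidden layer, linear output, squared so f >= 0."""
    w1 = params[:N_HIDDEN]
    b1 = params[N_HIDDEN:2 * N_HIDDEN]
    w2 = params[2 * N_HIDDEN:3 * N_HIDDEN]
    b2 = params[-1]
    h = np.tanh(np.outer(x, w1) + b1)   # shape (n_points, N_HIDDEN)
    return (h @ w2 + b2)**2             # squaring enforces positivity

def chi2(params):
    return np.sum(((net(params, x) - data) / sigma)**2)

# Genetic algorithm: keep the best quarter, refill by mutating survivors.
POP, GENERATIONS, MUT = 80, 300, 0.1
pop = rng.normal(0.0, 1.0, size=(POP, N_PARAMS))
for gen in range(GENERATIONS):
    fitness = np.array([chi2(p) for p in pop])
    survivors = pop[np.argsort(fitness)[:POP // 4]]
    children = survivors[rng.integers(0, len(survivors), POP - len(survivors))]
    children = children + rng.normal(0.0, MUT, size=children.shape)
    pop = np.vstack([survivors, children])

best = pop[np.argmin([chi2(p) for p in pop])]
print(f"best chi2 per point = {chi2(best) / len(x):.2f}")
```

The key design point the sketch illustrates is that the network never sees a prescribed functional form for f(x): the genetic algorithm simply rewards whatever weights describe the data best, which is the sense in which the distributions are “learnt” rather than assumed.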

The goals of this project were to use the most up-to-date experimental data from the LHC, together with the best available theoretical information and a modern, robust statistical methodology, to impose stringent constraints and achieve extremely precise PDF determinations, which could then be used to better characterize the Higgs boson properties and to search for new heavy particles. The ultimate goal was to contribute to the full exploitation of the LHC’s potential, achieving the best possible experimental and theoretical precision in the determination of parton distributions and making essential contributions to our understanding of the laws of nature at the TeV scale.

During this project (Sep 2011 - Aug 2013), several milestones towards a new generation of PDFs for LHC phenomenology were achieved. To begin with, in Summer 2012 the researcher published the NNPDF2.3 set of Neural Network Parton Distribution Functions, the first (and, to date, only) public PDF set to include direct constraints from LHC data. The use of the NNPDF sets by the LHC community increased substantially during this period; in particular, the NNPDF2.1 set was used for the theory predictions of the Higgs discovery papers by ATLAS and CMS. In August 2013, he presented the first NNPDF set with QED effects, including a first-ever determination of the photon PDF from experimental data, a key ingredient for any LHC calculation that includes electroweak effects. In various studies, he showed for the first time the type of quantitative constraints on individual PDFs that can be obtained from LHC data: isolated-photon and top-quark production for the gluon PDF, W production in association with charm quarks for the strange PDFs, and cross-section ratios between different center-of-mass energies for large-x parton distributions. The relevance of his research was also recognized by his appointment, in January 2012, as Affiliate Scientist of the CMS experiment, where he is one of the coordinators of the PDF4CMS forum and an author of five CMS publications to date, all of them providing key information for the determination of parton distributions.

As an integral part of the fellowship, the researcher took an active part in several activities aimed at increasing public awareness of the crucial importance of science for society as a whole. For instance, he was one of the CERN moderators of the International Particle Physics Masterclasses, where high-school students learned how to analyze LHC data and interpret the results. He also gave public lectures to high-school science students in Barcelona, describing the research done at CERN and motivating their interest in basic research.

In summary, the research project PIEF-GA-2010-272515, funded by a Marie Curie Intra-European Fellowship, has successfully achieved its main goal of providing a new generation of parton distributions for LHC phenomenology. As the LHC now upgrades its collision energy to 13 TeV, these new PDF sets, and later upgrades and improvements, will play a key role in the full exploitation of the LHC data at the new energy frontier.