In the next few years, very significant progress is expected in our understanding of fundamental interactions, driven by data from experiments at the LHC at CERN. The LHC poses an unprecedented challenge: to discover new physical phenomena using methods which have so far been applied only to precision studies of known phenomena. Meeting this challenge requires a radical rethinking of the way predictions for collider processes are obtained. Recent attempts to collect all available information on important LHC processes (such as Higgs, W, or top production) have invariably found that theoretical and phenomenological uncertainties had previously been underestimated, that the impact of QCD corrections is very significant, and that the dominant uncertainty comes from our ignorance of the structure of the nucleon.

The goal of this project is to provide such a rethinking from the bottom up, i.e. starting from a new determination of the quark and gluon substructure of the nucleon, as encoded in parton distribution functions (PDFs). These new PDFs, the NNPDF sets, are obtained using a completely new approach based on neural networks combined with a Monte Carlo technique. This approach provides not only a theoretically and phenomenologically reliable and consistent set of PDFs, but also a flexible tool for assessing the impact of various theoretical ingredients (such as higher-order perturbative corrections and their resummation) and of the new experimental information obtained at the LHC.

The corresponding studies will be the main result of this project: they will determine the combined impact of the wealth of theoretical and experimental information accumulated over the last several years, and put it to use in the discovery of New Physics at the LHC.
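The core idea of the Monte Carlo technique mentioned above can be sketched in a few lines. The toy below is purely illustrative and is not the NNPDF implementation: the data values are invented, and a low-order polynomial stands in for the neural network. The method itself is generic: resample the data within its experimental errors to build an ensemble of pseudo-data replicas, fit each replica independently, and read the central value and uncertainty off the ensemble of fits.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "experimental" data: y measured at 20 points x, each with
# uncertainty sigma. All numbers here are invented for illustration.
x = np.linspace(0.1, 0.9, 20)
true = 1.5 * x * (1.0 - x)            # hypothetical underlying curve
sigma = np.full_like(x, 0.02)
y = true + rng.normal(0.0, sigma)

# Monte Carlo replica method: fluctuate the data within its errors,
# fit each pseudo-data replica, and keep the whole ensemble of fits.
n_rep = 100
fits = []
for _ in range(n_rep):
    y_rep = y + rng.normal(0.0, sigma)     # one pseudo-data replica
    coef = np.polyfit(x, y_rep, deg=3)     # stand-in for the NN fit
    fits.append(np.polyval(coef, x))
fits = np.array(fits)

# The ensemble encodes the result: mean = central value,
# standard deviation = propagated 1-sigma uncertainty band.
central = fits.mean(axis=0)
unc = fits.std(axis=0)
```

The attraction of this approach, as opposed to propagating errors through a single fit, is that the replica ensemble carries the full probability distribution of the result, so uncertainties on any derived quantity follow by simply evaluating it on each replica.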