Content archived on 2024-05-28

Resolving short-distance physics mechanisms in hadron collisions at TeV-scale energies

Final Report Summary - PROBE4TEVSCALE (Resolving short-distance physics mechanisms in hadron collisions at TeV-scale energies)

The Large Hadron Collider (LHC) at CERN offers a promising opportunity to discover and characterise a putative new physics sector at the TeV energy scale. Owing to the complexity of the experimental signatures, the direct search for new particles at this huge experimental facility is a real challenge that requires a very clear understanding of the structure of hadron collisions. Two directions of research are particularly important in this context: (1) the development of new Monte-Carlo-based simulation tools reflecting as accurately as possible the complexity of hadron-hadron collisions, and (2) the development of sophisticated multivariate techniques to exploit theory information maximally when analysing the data. My research programme, supported by the Marie Curie Actions, was aimed at significant achievements in these two directions of research and has been completed successfully, as described below.

Objective 1: Development of a generic decay algorithm and implementation in the MadGraph framework

The first objective of my proposal was to develop a generic decay algorithm designed specifically to retain all spin correlation effects beyond the lowest order in perturbation theory, in any decay chain in the Standard Model and beyond. This objective has been successfully achieved. I have proposed a new generic algorithm to generate the decays of heavy resonances in simulated events, and I am the main author of its implementation (dubbed MadSpin) in the MadGraph5_aMC@NLO framework. The procedure captures essentially all spin correlation effects as predicted by a full next-to-leading-order calculation. Such a tool, in combination with a next-to-leading-order Monte Carlo generator, is therefore expected to give practical access to event generation with improved accuracy in many instances. This was demonstrated for the specific case of the production of a top quark pair in association with a Higgs boson in a dedicated publication. The tool is highly flexible and has already been used both by experimentalists (see e.g. CMS-PAS-TOP-14-001) and by theorists (see e.g. arXiv:1405.5859).
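
To give a flavour of how a decay step can preserve spin correlations, the sketch below illustrates the generic accept-reject (hit-or-miss) idea on which such decay algorithms are typically built. It is a minimal Python illustration only: the function names (propose_decay, me2_full, me2_bound) are hypothetical placeholders, not the actual MadSpin interfaces.

```python
import random

def decay_event(prod_kinematics, propose_decay, me2_full, me2_bound):
    """Schematic accept-reject decay of one production event.

    propose_decay(prod) -> candidate decay kinematics drawn from flat
        phase space (no spin correlations at this stage).
    me2_full(prod, dec) -> squared matrix element of the full
        production-times-decay process, which carries the correlations.
    me2_bound(prod)     -> an upper bound on me2_full over all decay
        configurations of this production event.
    """
    while True:
        decay = propose_decay(prod_kinematics)
        # Accept with probability me2_full / me2_bound, so that the
        # accepted decays follow the spin-correlated distribution.
        if random.random() * me2_bound(prod_kinematics) <= me2_full(prod_kinematics, decay):
            return decay
```

The efficiency of such an accept-reject step depends on how tight the upper bound is, which is one reason a dedicated algorithm is needed to handle arbitrary decay chains in practice.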

Objective 2: Development of a matrix-element-based likelihood method

This part of the proposal was aimed at putting the multivariate analysis known as the Matrix Element Method on solid ground by improving its formulation, in particular by deriving a new definition of the probability density function of scattering events that is flexible enough to include QCD effects beyond the lowest order in perturbation theory (while preserving all spin-correlation effects). I have been working on this objective in the context of the measurement of the mass of the top quark in dilepton final-state events. I have established a procedure to calculate the matrix element weights in such a way that these weights inherit the accuracy of state-of-the-art Monte Carlo generators (events at next-to-leading-order accuracy, interfaced with parton showers, and with the inclusion of spin correlation effects). Besides its ability to account for extra radiation in scattering events, such a reformulation of the matrix element method leads to a faster evaluation of the weights and can be used for high-statistics samples. I demonstrated the capabilities of the method in the context of the measurement of the mass of the top quark. While the associated publication is still in preparation, the method and the preliminary results have already been presented at the CERN laboratory (Top LHC WG meeting 2014).
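
For context, the textbook leading-order definition of the matrix element weight of an observed event x under a hypothesis alpha, i.e. the starting point that the reformulation generalises (the notation below is the standard one from the literature, not the new definition itself), reads

```latex
P(\mathbf{x}\,|\,\alpha) \;=\; \frac{1}{\sigma_\alpha}
  \int \mathrm{d}x_1\, \mathrm{d}x_2\, \mathrm{d}\Phi(\mathbf{y})\,
  f(x_1)\, f(x_2)\, \bigl|\mathcal{M}_\alpha(\mathbf{y})\bigr|^2\, W(\mathbf{x}\,|\,\mathbf{y})
```

where f(x_1) and f(x_2) are the parton distribution functions, dPhi(y) is the parton-level phase-space measure, M_alpha is the tree-level matrix element under hypothesis alpha, W(x|y) is the transfer function relating the parton-level configuration y to the reconstructed event x, and sigma_alpha is a normalisation factor ensuring that P integrates to unity.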

Objective 3: Revisiting the maximum significance of specific experimental analyses in the search for new physics at the LHC

This part of the proposal was dedicated to a specific scattering process, namely the production of a scalar Higgs boson in association with a top quark pair in the leptonic channel, t→b(W→lν), with the Higgs boson decaying into b-quark jets. Due to the small production rate, the search for this channel is notoriously difficult. When the Higgs boson is not boosted, the large backgrounds and the complexity of the final state make this channel a very interesting and challenging test case for the advanced analysis techniques developed by the high energy physics community. I demonstrated that the sensitivity to this Higgs production mechanism can be significantly enhanced by means of the so-called matrix element reweighting method. In particular, this method is able to efficiently reduce the combinatorial problem arising from the jet multiplicity in the final state. Assuming Standard Model production rates for the signal and background processes, this matrix-element-based analysis leads to an expected 3-sigma (resp. 5-sigma) observation in the dilepton channel with a luminosity of 120 inverse femtobarns (resp. 420 inverse femtobarns) at 14 TeV. This important work, which has been published in Physical Review Letters, has triggered new studies in the CMS collaboration (as exemplified by the presentations at the Zurich Phenomenology Workshop, 2014). In a separate publication, I also explored the use of matrix-element-reweighting techniques for the characterisation of the newly discovered scalar resonance.
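
The way such matrix element weights are typically turned into an analysis variable is through a likelihood-ratio discriminant; the construction below is the standard one, given here for illustration and not necessarily the exact definition used in the publication:

```latex
D(\mathbf{x}) \;=\;
  \frac{P(\mathbf{x}\,|\,t\bar{t}H)}
       {P(\mathbf{x}\,|\,t\bar{t}H) \;+\; P(\mathbf{x}\,|\,t\bar{t}b\bar{b})}
```

In this standard construction each weight P is evaluated by summing over the possible jet-parton assignments, which is how the combinatorial ambiguity from the high jet multiplicity is absorbed into the weight itself rather than being resolved event by event.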

The achievements associated with the three objectives of the proposal have had a very positive impact in the high-energy physics community. Although very recent, the associated publications have already gathered more than 50 citations, demonstrating that the research conducted in the framework of this Marie Curie project is relevant to a large number of scientists.

As is apparent in this report, hypothesis tests calibrated on Monte Carlo simulations play a central role in high energy physics. During my Marie Curie fellowship, I have investigated, as a side project, the possibility of developing similar statistical procedures in the context of the genetic evolution of traditionally cultivated plant populations. Such traditional farming systems have attracted a lot of attention over the past decades, as they are recognised as an important component in the maintenance of genetic diversity worldwide. A broad spectrum of traditionally managed crops has been studied to investigate how reproductive properties, in combination with husbandry characteristics, shape the genetic structure of the crops over time. However, traditional farms typically involve populations of small size, whose genetic evolution is dominated by the statistical fluctuations inherent to the stochastic nature of the crossings. Hence there is generally no one-to-one mapping between crop properties and measured genotype data, and claims about crop properties made on the basis of the observed genetic structure must be stated with a confidence level estimated by means of a dedicated statistical analysis. In this context I have proposed a comprehensive framework to carry out such statistical analyses based on Monte Carlo simulations. Using this approach, I demonstrated that specific crop properties can be characterised much more accurately than with the current methods in the literature.
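
As a purely illustrative example of the kind of Monte-Carlo-calibrated test involved, the sketch below simulates neutral genetic drift in a small population (a Wright-Fisher null model) and estimates how often drift alone produces an allele-frequency shift as large as an observed one. The model, the parameter values and the test statistic are simplified placeholders and do not reproduce the actual framework developed in the project.

```python
import numpy as np

def simulate_drift(n_plants, n_generations, p0, rng):
    """Wright-Fisher drift of one biallelic locus in a small,
    randomly mating population of constant size (null model)."""
    p = p0
    for _ in range(n_generations):
        # Each generation, 2*n_plants gametes are resampled binomially.
        p = rng.binomial(2 * n_plants, p) / (2.0 * n_plants)
    return p

def monte_carlo_pvalue(observed_shift, n_plants, n_generations, p0,
                       n_replicates=10_000, seed=0):
    """One-sided Monte Carlo p-value: how often does drift alone yield
    an allele-frequency shift at least as large as the observed one?"""
    rng = np.random.default_rng(seed)
    shifts = np.array([abs(simulate_drift(n_plants, n_generations, p0, rng) - p0)
                       for _ in range(n_replicates)])
    # The +1 correction avoids reporting a p-value of exactly zero.
    return (1 + np.sum(shifts >= observed_shift)) / (1 + n_replicates)

# Example: a 60-plant population followed over 10 generations.
print(monte_carlo_pvalue(observed_shift=0.25, n_plants=60,
                         n_generations=10, p0=0.5))
```

A small p-value would indicate that the observed change is unlikely to be explained by drift alone, so that a claim about the underlying crop properties can be quoted with a quantified confidence level.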