
Innovative Tools for Event Selection in high energy physics

Final Report Summary - ITES (Innovative tools for event selection in high energy physics)

Understanding the basic laws of nature has always been one of the major challenges facing humankind. In particular, much effort has gone into determining whether there are fundamental components of matter and what their properties are, or whether the quest for the smallest constituent will be endless.

Last year two experiments at the Large Hadron Collider (LHC) at CERN (Geneva) reported the discovery of a new particle, believed to be the Higgs boson. The theory developed by physicists to explain the behaviour of nature, the standard model, predicts a peculiar role for the Higgs boson: it generates the masses of all the other fundamental particles, which would otherwise be massless in the theory. The discovery of the new particle is the beginning of a new investigation devoted to establishing its identity: at the moment only its mass is known with good precision, and in order to establish whether this really is the Higgs boson several other properties have to be measured.

On the other hand, only this new particle has been found, while the theory predicts the existence of many others necessary to explain nature as we know it. This tells us that we have to continue our work in order to understand the Cosmos. The research activities dedicated to this goal rely on tools characterised by extremely advanced technology: colliders, where particles are accelerated and made to collide head-on, and the complementary detectors, used to study the products of the collisions, which contain a wealth of particles with, hopefully, a few new ones. The push towards higher energies and higher intensity (a higher number of collisions per unit time) has also driven the development of new technologies in computing, needed to handle the huge amount of data collected.

The ITES project concerned the development of innovative tools and strategies that can be used to select particularly interesting events among the wealth produced in the extreme conditions of colliders like the LHC. The idea was first to study new strategies with the data collected by the Collider Detector at Fermilab (CDF) experiment at the Tevatron collider before the LHC started, and then to transfer them to an LHC experiment, exploiting the experience gained. One of the most crucial ingredients of this kind of experiment is the selection of events that have to be written to tape for long-term preservation, the so-called real-time event selection or trigger. In particular, the implementation of the most powerful selections in real time is essential when the most interesting processes are very rare and hidden under extremely large amounts of other, uninteresting events, called background, as in the case of the Higgs boson.

The first phase of the project consisted in the upgrade of a part of the Silicon Vertex Tracker (SVT), used in the CDF trigger system to reconstruct in real time the trajectories of the particles (tracks). SVT was originally built to select a particle called 'beauty' by requiring that the tracks produced by this particle be displaced with respect to the primary vertex. This trigger had an extremely significant impact on the overall CDF physics programme. SVT reconstructs tracks in two phases: pattern recognition and then track fitting. The pattern recognition consists of finding possible tracks at very coarse resolution by comparing the information collected by the detector to pre-computed low-resolution tracks (patterns) stored in associative memory chips. The resolution of the track parameters is then improved to the best achievable level, as happens later in the offline reconstruction, by fitting these tracks with the fitting boards.
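The two-phase scheme described above can be sketched in a few lines of code. This is an illustrative toy, not the SVT firmware: the one-dimensional geometry, the superstrip width, the pattern bank contents and the straight-line fit are all hypothetical stand-ins for the real coarse patterns and the linearised track fit.

```python
def coarse(hit, superstrip_width=10):
    """Reduce a full-resolution hit position to a coarse 'superstrip' id."""
    return hit // superstrip_width

# Pattern bank: each pattern is a tuple of superstrip ids, one per detector
# layer, pre-computed from simulated tracks (here simply hard-coded).
pattern_bank = {
    (1, 2, 3, 4): "candidate A",
    (5, 5, 6, 6): "candidate B",
}

def find_candidates(event_hits):
    """Phase 1: match the coarse event signature against the bank."""
    signature = tuple(coarse(h) for h in event_hits)
    return pattern_bank.get(signature)

def fit_track(event_hits):
    """Phase 2: refine a matched candidate with a least-squares line fit
    (hit position vs layer index), standing in for the precise track fit."""
    n = len(event_hits)
    xs = range(n)
    mx, my = sum(xs) / n, sum(event_hits) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, event_hits)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

hits = [12, 25, 38, 49]           # one hit per layer, full resolution
if find_candidates(hits):         # coarse match found -> fit precisely
    slope, intercept = fit_track(hits)
```

The point of the split is speed: the coarse lookup discards almost everything cheaply (in hardware, all patterns are compared in parallel in the associative memory), so the expensive fit runs only on the few surviving candidates.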

This upgrade, called the Gigafitter, aimed to increase the SVT track reconstruction efficiency in the last period of data taking, when the running conditions were much more complicated than in previous years and similar to those of the first period of the LHC. The new track-fitting processor was successfully implemented, installed and tested at CDF. Based on a powerful field programmable gate array (FPGA) board, much more compact and easier to maintain than the old system (a single board instead of 12), the Gigafitter was able to reconstruct more than a billion tracks per second (hence its name); it improved the SVT track reconstruction capability, making it possible to recover tracks in regions of the detector that were not accessible before.

The experience gained during the Gigafitter upgrade was then exploited to study new solutions for online tracking at future experiments, where the tracks generated by the sought particle have to be found among the thousands created by the noise. Several new technologies have been studied. One possibility is to build a three-dimensional (3D) associative memory, using 3D integrated-circuit technology to implement associative memory structures for fast pattern-recognition applications. A generic research and development (R&D) proposal (VIPRAM) on this subject has been approved by the Department of Energy of the United States.

Another technology is based on a commercial device, the graphics processing unit (GPU). GPUs have recently been replacing standard central processing units (CPUs) in certain scientific applications: thanks to their architecture, made of hundreds of parallelised processors, and to their flexible memory rearrangement, GPUs are the ideal tool for intensive parallel computation. The implementation of trigger algorithms by means of commercial processors has recently become particularly interesting: sophisticated selection algorithms can easily be implemented on these devices with a very convenient performance-to-price ratio. A study was carried out on the CDF trigger test-stand to assess the strengths and weaknesses of GPU applications to online tracking. First results showed that a GPU can fit a billion tracks per second, comparable to the Gigafitter. The data transfer to and from this new device was an issue until last year, when it was solved by new techniques developed by several companies.
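What makes track fitting map well onto a GPU is its data-parallel structure: every candidate is fitted independently by the same arithmetic, so each can go to its own GPU thread. The sketch below illustrates only that structure, with a plain map over candidates standing in for a kernel launch; the linearised fit (each track parameter as a dot product of the hit coordinates with pre-computed constants, the scheme used by SVT and the Gigafitter) uses made-up constants and candidates. A real implementation would run this as a CUDA or OpenCL kernel.

```python
# Hypothetical per-layer fit constants and offset for one track parameter.
FIT_CONSTANTS = [0.1, 0.2, 0.3, 0.4]
OFFSET = -1.0

def fit_one(hits):
    """The 'kernel body': fit a single candidate from its four hits
    as a dot product with pre-computed constants plus an offset."""
    return sum(c * h for c, h in zip(FIT_CONSTANTS, hits)) + OFFSET

def fit_all(candidates):
    """The 'kernel launch': apply the same fit to every candidate.
    On a GPU, each candidate would be handled by its own thread."""
    return [fit_one(hits) for hits in candidates]

candidates = [[1.0, 2.0, 3.0, 4.0],
              [2.0, 2.0, 2.0, 2.0]]
params = fit_all(candidates)
```

Because there are no data dependencies between candidates, the throughput scales with the number of cores, which is why a commodity GPU can approach the billion-fits-per-second rate quoted above.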

During the last year the use of GPUs for trigger applications has spread to the point that almost all the experiments at the LHC are now considering farms of GPUs for online event selection. The former CDF group of Padova University recently joined the LHCb collaboration, which runs an experiment devoted to the study of beauty hadrons, and is proposing a trigger similar to that used at CDF but realised with GPUs. The experience gained in the past with the SVT trigger and the results obtained with online tracking on GPUs convinced the collaboration to include this project in the upgrade programme of the experiment.

Once the data are collected, the sought particle has to be identified. This is achieved using very powerful analysis tools based on mathematical methods applied to physics. In the ITES project we proposed multivariate techniques to distinguish between what we call signal and background. The goal was to search for the Higgs boson, exploiting its decay into jets containing b hadrons (b-jets). The challenge here was to identify the b-jets coming from the Higgs among the huge number of b-jets produced in the collisions. Now that we know that the Higgs exists, this analysis technique can be used to select samples of Higgs boson events in order to study its properties.
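A minimal sketch of such a multivariate signal/background separation is a Fisher linear discriminant: it combines several event variables into a single score and a cut on that score selects signal-like events. Everything here is hypothetical, two made-up jet variables (say, an invariant mass and a b-tag score) and tiny hand-written training samples; a real analysis would train on simulated signal and background events with many more variables, typically with more sophisticated classifiers.

```python
def mean(sample, i):
    return sum(ev[i] for ev in sample) / len(sample)

def fisher_weights(signal, background):
    """Fisher discriminant for 2 variables: w = W^-1 (mu_s - mu_b),
    where W is the pooled within-class scatter matrix."""
    mu_s = [mean(signal, 0), mean(signal, 1)]
    mu_b = [mean(background, 0), mean(background, 1)]
    # Accumulate the pooled within-class scatter matrix (2x2).
    w = [[0.0, 0.0], [0.0, 0.0]]
    for sample, mu in ((signal, mu_s), (background, mu_b)):
        for ev in sample:
            d = [ev[0] - mu[0], ev[1] - mu[1]]
            for i in range(2):
                for j in range(2):
                    w[i][j] += d[i] * d[j]
    det = w[0][0] * w[1][1] - w[0][1] * w[1][0]
    diff = [mu_s[0] - mu_b[0], mu_s[1] - mu_b[1]]
    # Invert the 2x2 matrix and apply it to the mean difference.
    return [( w[1][1] * diff[0] - w[0][1] * diff[1]) / det,
            (-w[1][0] * diff[0] + w[0][0] * diff[1]) / det]

def score(weights, event):
    """Higher score -> more signal-like; cut on it to select events."""
    return weights[0] * event[0] + weights[1] * event[1]

signal     = [(120.0, 0.90), (125.0, 0.80), (118.0, 0.95)]
background = [(80.0, 0.20), (95.0, 0.30), (70.0, 0.10)]
w = fisher_weights(signal, background)
```

The discriminant direction is chosen to maximise the separation between the two class means relative to the spread within each class, which is why a single cut on the combined score outperforms independent cuts on the individual variables.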

The developments in both online and offline selection obtained by the ITES project will also have an impact outside the physics field. An SVT-like processor, which essentially performs data-correlation searches preceded or followed by a processing stage, can be used, for example, to implement a vision mechanism based on pattern recognition. Such a mechanism has been proposed by researchers of the Psychology Department in Florence. We are collaborating with them to study the feasibility of implementing their model on GPUs. Such a device will not only allow validation of the model itself, but will also be a first step towards very fast automatic image analysis. This can have wide applications, for example in the analysis of medical images or in security applications.