Content archived on 2024-05-29

Study and validation of the computing model for the ATLAS experiment

Final Activity Report Summary - ATLASCM (Study and validation of the computing model for the ATLAS experiment)

The Large Hadron Collider (LHC) at the European Organisation for Nuclear Research (CERN) was planned to start data acquisition in 2008. The 'A Toroidal LHC Apparatus' (ATLAS) experiment prepared for data handling and analysis through a series of data challenges and production exercises, intended to validate its computing model and to provide useful data samples for detector and physics studies. The ATLAS production system was successfully used to run simulation production at an unprecedented scale: up to 50 000 jobs were processed by the system on more than 150 sites in a single day.
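The scale quoted above can be put in perspective with a back-of-the-envelope calculation (the figures come from the summary itself; the breakdown per site and per second is illustrative only):

```python
# Scale of the ATLAS production exercises described above:
# up to 50 000 jobs per day, distributed over more than 150 grid sites.
jobs_per_day = 50_000
sites = 150

# Average load a single site had to absorb on such a day.
jobs_per_site_per_day = jobs_per_day / sites
print(f"~{jobs_per_site_per_day:.0f} jobs per site per day")

# Sustained system-wide submission rate implied by that daily total.
jobs_per_second = jobs_per_day / (24 * 3600)
print(f"~{jobs_per_second:.2f} jobs per second overall")
```

On average this corresponds to roughly 333 jobs per site per day, or a little over one job submitted somewhere in the system every two seconds.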

In this work, we demonstrated the capability of such a system for user analysis. This experience allowed us to compare the execution of analysis tasks through the production system with direct submission to the grid infrastructure. Compared to direct submission, we observed more robust execution and also benefited from the system's advanced monitoring capabilities. A drawback was that the system constituted an additional infrastructure element that had to be operated by the experiment. For ATLAS, as for other experiments, the question was therefore to weigh the advantages of using such a system for user analysis against the additional effort needed to operate it as a service for its users.
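The trade-off described above can be illustrated with a minimal sketch. This is not ATLAS production-system code; the function names, the failure model, and the 30% transient-failure rate are all hypothetical. It shows the kind of bookkeeping such a layer adds over direct submission: automatic resubmission of failed jobs and a per-job state record that monitoring can be built on.

```python
import random
from collections import Counter

def submit_to_grid(job_id: int) -> bool:
    """Hypothetical stand-in for direct grid submission; real grid jobs
    fail for many transient reasons (site outages, data staging errors).
    Here a job succeeds with an assumed 70% probability per attempt."""
    return random.random() > 0.3

def run_with_retries(job_ids, max_attempts=3):
    """Sketch of what a production-system layer adds over direct
    submission: resubmission on failure plus per-job state tracking."""
    states = {}
    for job in job_ids:
        for _attempt in range(max_attempts):
            if submit_to_grid(job):
                states[job] = "done"
                break
        else:  # all attempts exhausted
            states[job] = "failed"
    return states

random.seed(1)  # fixed seed so the toy run is reproducible
states = run_with_retries(range(1000))
print(Counter(states.values()))  # monitoring-style view: done vs failed
```

With three attempts, a job only ends up "failed" if every attempt fails, so the retry layer turns a sizeable per-attempt failure rate into a small overall one. That is the "more robust execution" observed with the production system; the cost is that this layer itself must be run and maintained as a service.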

Distributed analysis in ATLAS was still work in progress at the time of project completion. With the start-up of the LHC, we expected a dramatic increase in data volume, which would require the typical ATLAS user to employ grid resources to perform their analysis.