Content archived on 2024-06-16

DEVELOPMENTS IN INFORMATICS

Final Activity Report Summary - DEVINFO (Developments in informatics)

DEVINFO is linked to CERN's flagship project, the LHC, which has triggered the development of new concepts, new applications and new tools in a number of domains. The LHC's unprecedented size, with its global equipment supply chain and huge experimental collaborations, created new challenges in project life-cycle management that have been successfully met.

The accelerator, a 27 km ring comprising more than 10,000 superconducting magnets, sits in an underground tunnel. The advanced manufacturing processes and complicated assembly procedures, carried out in many different countries around the world and finally in the LHC tunnel itself, were monitored. Quality assurance plans for collaborating institutes and subcontractors also had to be developed when preparing the supply chain. Tools for configuration management and for the traceability of both equipment and the interventions made on it during transport, installation and final assembly have been made available over the Web, which was invented at CERN. The resulting as-built database, describing the machine layout and the manufacturing information, will become the information backbone of the LHC operation and maintenance phases.
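
To illustrate what equipment traceability of this kind involves, the following is a minimal Python sketch of a traceability record that follows a component through its interventions. The entity and field names (Magnet, Intervention, record) are illustrative assumptions, not CERN's actual data model.

    # Minimal sketch of an equipment traceability record (standard library only).
    # Names and fields are illustrative, not the actual CERN schema.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Intervention:
        when: date
        site: str          # e.g. manufacturer, test bench, LHC tunnel
        action: str        # e.g. "cold test", "transport", "installation"

    @dataclass
    class Magnet:
        serial_no: str
        magnet_type: str   # e.g. "main dipole"
        slot: str          # as-built position in the ring layout
        history: list = field(default_factory=list)

        def record(self, when: date, site: str, action: str) -> None:
            """Append one intervention so the full life cycle stays traceable."""
            self.history.append(Intervention(when, site, action))

    # Example: trace a dipole from factory test to tunnel installation.
    dipole = Magnet("MB-1234", "main dipole", "sector 3-4, slot 101")
    dipole.record(date(2005, 3, 10), "manufacturer", "cold test")
    dipole.record(date(2006, 7, 2), "LHC tunnel", "installation")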

The LHC experiments will store experimental data in the petabyte range each year. Analysing such volumes of data and searching for interesting physical phenomena requires both distributed and parallel computing approaches. The high-energy particle physics community invented the grid computing concept to solve this problem. During the construction phase of the machine and the experimental detectors, much effort was concentrated on developing models and verification tools: models simulating the behaviour of the LHC grid as well as real-time monitoring tools for the entire system. The LHC Computing Grid concept, with its three-tiered processing layers, is at present the only valid approach for handling the enormous volumes of data generated in the LHC detectors.
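
The core idea behind this distributed analysis is that independent partitions of the data can be processed in parallel and the partial results merged. The toy Python sketch below shows that map-and-merge pattern on a single machine; it stands in for the concept only and is not the LHC grid middleware itself.

    # Toy illustration of the map-style parallelism that grid analysis relies on:
    # each "file" (here just a list of event energies) is processed independently
    # and the partial results are merged afterwards.
    from multiprocessing import Pool

    def analyse(partition):
        """Count events above a threshold in one data partition."""
        return sum(1 for energy in partition if energy > 100.0)

    if __name__ == "__main__":
        # Stand-in for event data spread over many files at many sites.
        partitions = [[50.0, 120.0, 300.0], [90.0, 101.0], [250.0, 80.0, 130.0]]
        with Pool() as pool:
            partial_counts = pool.map(analyse, partitions)   # fan out
        print("selected events:", sum(partial_counts))       # merge results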

Models and monitoring tools for estimating global network traffic patterns and available bandwidth and for analysing the flow have also been developed; they are in use at a number of sites to monitor the network traffic generated during simulation data challenges. Tools to manage and analyse the large volumes of data generated by each individual experiment have also been developed in the DEVINFO context. In ATLAS, extensive development went into the Distributed Data Management (DDM) project, where a design for the DDM system was established as part of the ATLAS Computing Technical Design Report. The final validation of the design concepts and parameters was carried out during the LHC Computing Grid Service Challenges 3 and 4. A physics analyst's workbench, ROOT, has been developed for the LHC era. This workbench, with its highly optimised remote access and parallel data processing, has become the basic input/output framework for all the LHC experiments.
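
As a small illustration of the kind of remote data access ROOT provides, the PyROOT sketch below chains several files served over the xrootd protocol and fills a histogram across all of them. The server, paths, tree name and branch name are placeholders; only TChain and its Add/Draw calls are standard ROOT API.

    # Minimal PyROOT sketch of chained, remote data access with ROOT.
    # Server, file paths, tree name and branch name are placeholders.
    import ROOT

    chain = ROOT.TChain("events")                       # logical tree spanning many files
    chain.Add("root://some.server//data/run1.root")     # remote file via the xrootd protocol
    chain.Add("root://some.server//data/run2.root")
    chain.Draw("pt >> h_pt(100, 0, 500)")               # fill a histogram across all files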

The rapid evolution of information technology makes strategic decision-making extremely difficult in a project with a 30-40 year life cycle. One DEVINFO subproject, working in the Open Lab at CERN, was dedicated to tracking the evolution of modern computer technology, hardware interconnects and networking technology. Tools to analyse the performance of physics analysis programs in a grid computing network were designed and implemented to benchmark the different systems proposed.
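
In essence, such benchmarking means running a representative workload repeatedly on each candidate system and comparing timings. The Python sketch below shows that pattern in its simplest form; the workload is a stand-in, not one of the actual Open Lab benchmarks.

    # Simple sketch of timing an analysis step for comparison across systems.
    import time

    def workload():
        """Stand-in compute kernel; replace with a representative analysis step."""
        return sum(x * x for x in range(1_000_000))

    def benchmark(fn, repeats=5):
        """Return the best wall-clock time over several runs."""
        timings = []
        for _ in range(repeats):
            start = time.perf_counter()
            fn()
            timings.append(time.perf_counter() - start)
        return min(timings)

    print(f"best of {5}: {benchmark(workload):.3f} s")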

A project the size of the LHC required additional development of new features in CERN's project and administrative management systems. The concept of Earned Value Management was introduced and integrated into CERN's Administrative Information System. The workflow engine of EDH, the principal tool for electronic document handling, was upgraded with a Business Process Execution Language (BPEL) workflow infrastructure.
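
Earned Value Management compares the budgeted cost of work scheduled, the budgeted cost of work actually performed and the actual cost incurred. The sketch below computes the standard variances and indices with made-up numbers; it illustrates the general technique only and says nothing about how CERN's Administrative Information System implements it.

    # Standard Earned Value Management quantities, shown with made-up numbers.
    planned_value = 120.0   # PV: budgeted cost of work scheduled
    earned_value  = 100.0   # EV: budgeted cost of work actually performed
    actual_cost   = 110.0   # AC: actual cost of work performed

    cost_variance     = earned_value - actual_cost     # CV  < 0 means over budget
    schedule_variance = earned_value - planned_value   # SV  < 0 means behind schedule
    cpi = earned_value / actual_cost                   # CPI < 1 means over budget
    spi = earned_value / planned_value                 # SPI < 1 means behind schedule
    print(f"CV={cost_variance}  SV={schedule_variance}  CPI={cpi:.2f}  SPI={spi:.2f}")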

Work on the CERN Document Server (CDS) has profited from activities in automated classification and metadata harvesting.
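
Metadata harvesting of this kind is typically done over OAI-PMH, the standard protocol for exchanging bibliographic records between repositories. The Python sketch below fetches Dublin Core records from an OAI-PMH endpoint and prints their titles; the endpoint URL is a placeholder assumption, not necessarily the actual CDS interface.

    # Minimal sketch of harvesting bibliographic metadata over OAI-PMH.
    # The endpoint URL is a placeholder, not necessarily the CDS interface.
    from urllib.request import urlopen
    from urllib.parse import urlencode
    import xml.etree.ElementTree as ET

    BASE = "https://repository.example.org/oai2d"   # placeholder OAI-PMH endpoint
    params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}

    with urlopen(f"{BASE}?{urlencode(params)}") as resp:
        tree = ET.parse(resp)

    # Print the Dublin Core titles of the harvested records.
    DC = "http://purl.org/dc/elements/1.1/"
    for title in tree.iter(f"{{{DC}}}title"):
        print(title.text)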

Please note that the apparently low number of publications reflects a new reality in which work on technical issues is often published on the web and no longer submitted to peer-reviewed journals.