ATLAS computing

Data processing plays a crucial role at all levels of the experiment: computing systems capture the data produced directly in the detector. In addition, the ability to prepare simulated events and to turn the data into results for scientific publications is critical for the researchers.

The design of the ATLAS detector, with all its constituent subdetectors, is therefore geared toward making the data digitally usable: all signals that particles leave behind, for example in the tracking detectors or muon chambers, are recorded electronically and digitized. Digitization means that a measurement, such as the size of a signal in one calorimeter cell, is converted into a binary number that is as compact as possible. In this way, all measurements can be transported quickly out of the detector into nearby computer systems and processed further there.
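
As a rough illustration of that idea, the following sketch maps an analog pulse height onto a compact binary number, the way a simple analog-to-digital converter would. The 12-bit resolution and 2-volt full scale are made-up example values, not ATLAS readout parameters.

```python
def digitize(pulse_height_mv, full_scale_mv=2000.0, n_bits=12):
    """Convert an analog pulse height (in millivolts) into a compact ADC count.

    The 12-bit resolution and 2000 mV full scale are illustrative assumptions.
    """
    clipped = min(max(pulse_height_mv, 0.0), full_scale_mv)  # stay inside the measurable range
    max_code = (1 << n_bits) - 1                              # 4095 codes for 12 bits
    return round(clipped / full_scale_mv * max_code)

print(digitize(137.5))  # -> 282: the measurement now fits into 12 bits
```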

Data analysis at the LHC – a herculean task

In the ATLAS detector, proton bunches collide with each other 40 million times per second. The resulting amount of data is unimaginably large – bigger than the total of all telecommunications traffic on Earth. Only about one hundred-thousandth of it is of interest for the investigation of matter. But even this leaves huge data volumes of more than a gigabyte per second. For the discovery of the Higgs particle, the physicists of the ATLAS experiment analyzed 10 million gigabytes of data – a herculean effort that could only be accomplished with a unique IT infrastructure.
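
To get a feeling for these numbers, here is a quick back-of-the-envelope calculation; the figures are simply the rounded ones quoted above, not official ATLAS bookkeeping.

```python
collisions_per_s = 40e6       # bunch crossings per second
interesting_fraction = 1e-5   # roughly one hundred-thousandth is kept for physics
higgs_dataset_gb = 10e6       # 10 million gigabytes analyzed for the Higgs discovery
recording_rate_gb_s = 1.0     # "more than a gigabyte per second"

kept_per_s = collisions_per_s * interesting_fraction   # ~400 interesting events per second
days_needed = higgs_dataset_gb / recording_rate_gb_s / 86400

print(f"~{kept_per_s:.0f} interesting events per second, "
      f"~{days_needed:.0f} days of pure data taking for the Higgs dataset")
# ~400 interesting events per second, ~116 days of pure data taking for the Higgs dataset
```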

How physics is processed in the computing network

When the ATLAS experiment is taking data, several megabytes of data from selected events are collected and stored up to 1,000 times per second. This means a data stream of around one gigabyte per second must be processed, which corresponds to streaming 50 high-definition videos simultaneously.
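
The throughput follows directly from the event rate and the event size; the simple arithmetic looks like this, assuming an average stored event size of about one megabyte.

```python
events_per_s = 1000    # selected events stored per second (upper value from the text)
event_size_mb = 1.0    # assumed average size of one stored event

data_rate_gb_per_s = events_per_s * event_size_mb / 1000  # megabytes -> gigabytes
print(f"~{data_rate_gb_per_s:.1f} GB/s written while ATLAS is taking data")
# ~1.0 GB/s written while ATLAS is taking data
```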

The data of the ATLAS experiment are stored at CERN and distributed worldwide to 11 large computing centers (Tier 1). One of these large centers is located at the Karlsruhe Institute of Technology and, besides ATLAS, also serves the other big experiments at the Large Hadron Collider. Data processing and the production of simulated data take place in more than 100 smaller computing centers (Tier 2) in collaboration with the 11 large centers.
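
A compressed view of this tiered model, with the roles as described above, might look as follows; the structure is purely illustrative (CERN's role is commonly labeled Tier 0), and the real Worldwide LHC Computing Grid configuration is far more detailed.

```python
# Illustrative sketch of the tiered grid model described in the text.
grid_tiers = {
    "Tier 0": {"sites": 1,   "example": "CERN",
               "role": "stores the data recorded by the experiment"},
    "Tier 1": {"sites": 11,  "example": "Karlsruhe Institute of Technology",
               "role": "worldwide distribution and long-term storage"},
    "Tier 2": {"sites": 100, "example": "MPP / MPCDF",
               "role": "data processing and production of simulated data"},
}

for tier, info in grid_tiers.items():
    print(f"{tier}: ~{info['sites']} site(s), e.g. {info['example']}: {info['role']}")
```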

MPP operates a Tier 2 computing center

The ATLAS group at the MPP operates one such Tier 2 computing center at the Max Planck Computing and Data Facility (MPCDF). The group provides the funding and the staff who look after the so-called grid middleware and the storage systems and who handle technical support.

The MPP computing facility at the MPCDF, which is used mostly for the ATLAS group's Tier 2 computing center, currently has more than two petabytes of storage capacity (one petabyte corresponds to a million gigabytes) and more than 100 high-performance servers. A typical server has two CPUs with 12 cores each, 128 gigabytes of RAM, a fast hard drive, and a 10-gigabit-per-second network connection.
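
Multiplying out the figures above gives a rough idea of the total capacity of the facility; the "more than" values are treated as exact for the sake of the estimate.

```python
servers = 100            # "more than 100" high-performance servers
cpus_per_server = 2
cores_per_cpu = 12
ram_gb_per_server = 128
storage_pb = 2           # "more than two petabytes"

total_cores = servers * cpus_per_server * cores_per_cpu   # 2400 CPU cores
total_ram_tb = servers * ram_gb_per_server / 1000         # ~12.8 TB of RAM in total
storage_gb = storage_pb * 1_000_000                       # 2,000,000 GB of disk

print(f"{total_cores} cores, {total_ram_tb} TB RAM, {storage_gb:,} GB storage")
# 2400 cores, 12.8 TB RAM, 2,000,000 GB storage
```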

More information on the ATLAS computing group

E-mail addresses: <e-mail>@mpp.mpg.de
Phone numbers: +49 89 32354-<extension>
Name | Function | E-mail | Extension | Office
Bethke, Siegfried, Prof. Dr. | Emeritus | bethke | 381 | A.2.05
Britzger, Daniel, Dr. | Postdoc | britzger | 453 | A.2.28
Buchin, Daniel | PhD Student | dbuchin | 376 | A.2.21
Delle Fratte, Cesare | Senior Scientist | cdf | 518 | A.1.77
Hessler, Johannes | PhD Student | jhessler | 453 | A.2.28
Kado, Marumi, Prof. Dr. | Director | kado | 382 | A.2.45
Kluth, Stefan, PD Dr. | Senior Scientist | skluth | 468 | A.2.05
Schielke, Anja | Secretary | schielke | 299 | A.2.47
Stonjek, Stefan, Dr. | Senior Scientist | stonjek | 296 | A.2.26
Tabriz, Meisam, Dr. | IT | tabriz | 603 | A.1.77
Verbytskyi, Andrii, Dr. | Senior Scientist | andriish | 454 | A.2.31