
ATLAS computing and the GRID

Bibliographic Details
Published in: Nuclear Instruments & Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 2003-04, Vol. 502 (2), p. 372-375
Main Author: Jones, R.W.L.
Format: Article
Language:English
Description
Summary: At the most conservative estimates, ATLAS will produce over 1 PB of data per year, requiring 1-2M SpecInt95 of CPU to process and analyse it and to generate large Monte Carlo datasets. The collaboration is worldwide, and only Grids will allow all collaborators to have access to the full datasets. ATLAS must develop an intercontinental distributed computing and data Grid with a user interface that shields the user from the Grid middleware and the distributed nature of the processing; it must develop automated production systems using the Grid tools; and it must provide tools that automatically distribute, install and verify the required experimental software and run-time environment at remote sites, to avoid the problems of chaotic multi-site management. Bookkeeping, replication and monitoring are also required. All of these topics are being addressed within the collaboration, with Grid tools being used for large-scale Data Challenges.
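The "distribute, install and verify" requirement in the abstract can be illustrated with a minimal sketch: before jobs run at a remote site, the software release delivered there is checked against a checksum published by central bookkeeping. The site names, release name and catalogue here are hypothetical illustrations, not real ATLAS infrastructure or tooling.

```python
import hashlib

def checksum(data: bytes) -> str:
    """Checksum of a software release archive (MD5 was typical of 2003-era tools)."""
    return hashlib.md5(data).hexdigest()

# Hypothetical central bookkeeping: release name -> expected archive checksum.
release_archive = b"atlas-release contents (illustrative payload)"
CATALOGUE = {"atlas-release": checksum(release_archive)}

def verify_site(site: str, release: str, delivered: bytes) -> bool:
    """Return True if the archive delivered to `site` matches the catalogue."""
    ok = CATALOGUE.get(release) == checksum(delivered)
    status = "verified" if ok else "MISMATCH - reinstall required"
    print(f"{site}: {release} {status}")
    return ok

# One illustrative site received the release intact, another a corrupted copy.
verify_site("site-a", "atlas-release", release_archive)
verify_site("site-b", "atlas-release", release_archive + b"corruption")
```

A real production system would layer replication, bookkeeping and monitoring on top of this kind of check, as the abstract notes; the sketch only shows the verification step itself.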
ISSN:0168-9002
1872-9576
DOI:10.1016/S0168-9002(03)00446-7