ABSTRACT: Complex Event Processing (CEP) is a methodology that combines data from many sources in order to identify events or patterns that require particular attention. It has gained considerable momentum in the computing world in recent years and is used in ATLAS to continuously monitor the behaviour of the data acquisition system, to trigger corrective actions and to guide the experiment's operators. The technology is only as powerful as the knowledge about the system's behaviour that experts regularly insert and update in the CEP engine. Nevertheless, writing or modifying CEP rules is not trivial, since the programming paradigm differs considerably from what developers are normally familiar with. To help experts verify that their rules work as expected, we have therefore developed a complete testing and validation environment. The system consists of three main parts: a reader that extracts all relevant data streams produced during data taking from existing storage; a playback tool that re-injects data from specific past data-taking sessions into the CEP engine; and a reporting tool that shows the output the rules loaded into the engine would have produced in the live system. In this paper we describe the design and implementation of this validation system, highlight its strengths and shortcomings, and indicate how such a system could be reused in similar projects.
Article · Dec 2015 · Journal of Physics Conference Series
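The playback-and-report cycle described in the abstract can be illustrated with a toy harness. All names here (`ReplayHarness`, `ToyEngine`, the event fields) are hypothetical stand-ins, not the ATLAS implementation; the sketch only shows the idea of re-injecting archived monitoring events into a rule engine and collecting what the rules would have reported live:

```python
from collections import defaultdict

class ReplayHarness:
    """Feed archived monitoring events into a rule engine and collect
    the alerts the loaded rules would have produced in the live system."""

    def __init__(self, engine):
        self.engine = engine          # any object with process(event) -> list of alerts
        self.report = defaultdict(list)

    def replay(self, events):
        # Events are replayed in timestamp order, as the live system saw them.
        for event in sorted(events, key=lambda e: e["timestamp"]):
            for alert in self.engine.process(event):
                self.report[alert["rule"]].append(alert)
        return dict(self.report)

# A trivial stand-in engine: one rule that fires when a rate drops to zero.
class ToyEngine:
    def process(self, event):
        if event["name"] == "event_rate" and event["value"] == 0:
            return [{"rule": "RateIsZero", "at": event["timestamp"]}]
        return []

events = [
    {"timestamp": 1, "name": "event_rate", "value": 1200},
    {"timestamp": 2, "name": "event_rate", "value": 0},
]
report = ReplayHarness(ToyEngine()).replay(events)
```

In a real CEP deployment the engine would be something like Esper with rules written in its event-processing language; the value of the harness is that the same rules run unchanged against archived and live data.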
ABSTRACT: The ATLAS experiment at the LHC in Geneva uses a complex and highly distributed Trigger and Data Acquisition system, involving a very large number of computing nodes and custom modules. The configuration of the system is specified by schema and data in more than 1000 XML files, with various experts responsible for updating the files associated with their components. Maintaining an error-free and consistent set of XML files proved a major challenge. A special service was therefore implemented: it validates any modifications, checks the authorization of anyone trying to modify a file, records who made changes as well as when and why, and provides tools to compare different versions of files and to revert to earlier versions if required. This paper provides details of the implementation and exploitation experience, which may be of interest for other applications using many human-readable files maintained by different people, where consistency of the files and traceability of modifications are key requirements.
Article · Dec 2012 · Journal of Physics Conference Series
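The validation step of such a service can be sketched in a few lines. This is a minimal illustration, assuming the simplest check (well-formedness of each XML file); the real service also validates against the schema, checks authorization and records change history, none of which is shown here. The file names and contents are invented:

```python
import xml.etree.ElementTree as ET

def check_files(files):
    """Return a dict mapping filename -> error string (None if well-formed).
    A production service would also validate against the schema and check
    cross-file consistency; this sketch only catches malformed XML."""
    errors = {}
    for name, text in files.items():
        try:
            ET.fromstring(text)
            errors[name] = None
        except ET.ParseError as exc:
            errors[name] = str(exc)
    return errors

files = {
    "segment.xml": "<segment id='TDAQ'><app name='controller'/></segment>",
    "broken.xml": "<segment id='TDAQ'>",   # missing closing tag
}
result = check_files(files)
```

Rejecting a commit when any entry of `result` is non-None is exactly the kind of gate that keeps a 1000-file configuration consistent.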
ABSTRACT: This paper describes P-BEAST, a highly scalable, highly available and durable system for archiving monitoring information of the trigger and data acquisition (TDAQ) system of the ATLAS experiment at CERN. The TDAQ system currently consists of 20,000 applications running on 2,400 interconnected computers and is foreseen to grow further in the near future. P-BEAST stores considerable amounts of monitoring information that would otherwise be lost; making this data accessible facilitates long-term analysis and faster debugging. The novelty of this work lies in using a modern key-value storage technology (Cassandra) to satisfy the massive time-series data rates and the flexibility and scalability requirements of the project. The loose schema allows the stored data to evolve seamlessly with the information flowing within the Information Service. An architectural overview of P-BEAST is presented, alongside a discussion of the technologies considered as candidates for storing the data and the arguments that ultimately led to choosing Cassandra. Measurements taken during operation in the production environment illustrate the data volume absorbed by the system and the techniques used to reduce the Cassandra storage space overhead.
Article · Jun 2012 · Journal of Physics Conference Series
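The key-value layout typical of time-series storage on Cassandra can be mimicked in memory to show the idea: the partition key combines the series name with a coarse time bucket, so each partition row stays bounded and a range query touches only the relevant buckets. The class, the series name and the one-hour bucket size are illustrative choices, not details taken from P-BEAST:

```python
from collections import defaultdict

BUCKET = 3600  # one partition row per series per hour (illustrative choice)

class TimeSeriesStore:
    """Toy in-memory stand-in for a wide-row key-value layout:
    partition key = (series, hour bucket), clustering key = timestamp."""

    def __init__(self):
        self.rows = defaultdict(dict)

    def insert(self, series, ts, value):
        self.rows[(series, ts // BUCKET)][ts] = value

    def query(self, series, t0, t1):
        """Return (timestamp, value) pairs with t0 <= timestamp < t1."""
        out = []
        for bucket in range(t0 // BUCKET, t1 // BUCKET + 1):
            row = self.rows.get((series, bucket), {})
            out.extend((ts, v) for ts, v in sorted(row.items()) if t0 <= ts < t1)
        return out

store = TimeSeriesStore()
store.insert("dcm.rate", 10, 950)
store.insert("dcm.rate", 3700, 900)
points = store.query("dcm.rate", 0, 4000)
```

Bucketing by time is what lets a loose-schema store absorb massive insert rates while keeping range reads cheap, which is the property the abstract attributes to the Cassandra-based design.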
ABSTRACT: The reconstruction of photons in the ATLAS detector is studied with data taken during the 2004 Combined Test Beam, where a full slice of the ATLAS detector was exposed to beams of particles of known energy at the CERN SPS. The results presented show significant differences in the longitudinal development of the electromagnetic shower between converted and unconverted photons as well as in the total measured energy. The potential to use the reconstructed converted photons as a means to precisely map the material of the tracker in front of the electromagnetic calorimeter is also considered. All results obtained are compared with a detailed Monte-Carlo simulation of the test-beam setup which is based on the same simulation and reconstruction tools as those used for the ATLAS detector itself.
Article · Mar 2011 · Journal of Instrumentation
ABSTRACT: This paper presents a software environment to automatically configure and run online triggering and dataflow farms for the ATLAS experiment at the Large Hadron Collider (LHC). It supports a broad set of users with differing levels of knowledge of the online triggering system, ranging from casual testers to final system deployers. This level of automation improves the overall ATLAS TDAQ workflow for software and hardware tests and speeds up system modifications and deployment.
Article · Mar 2011 · Computer Physics Communications
ABSTRACT: A new method for calibrating the hadron response of a segmented calorimeter is developed and successfully applied to beam test data. It is based on a principal component analysis of energy deposits in the calorimeter layers, exploiting longitudinal shower development information to improve the measured energy resolution. Corrections for invisible hadronic energy and energy lost in dead material in front of and between the calorimeters of the ATLAS experiment were calculated with simulated Geant4 Monte Carlo events and used to reconstruct the energy of pions impinging on the calorimeters during the 2004 Barrel Combined Beam Test at the CERN H8 area. For pion beams with energies between 20 GeV and 180 GeV, the particle energy is reconstructed within 3% and the energy resolution is improved by between 11% and 25% compared to the resolution at the electromagnetic scale.
Article · Dec 2010 · Journal of Instrumentation
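The core of the method above is extracting the dominant direction of variation from per-layer energy deposits. A pure-Python sketch of that step, using power iteration on the sample covariance matrix, is shown below; the synthetic two-layer "deposits" and the iteration count are illustrative, and the paper's actual corrections come from Geant4 simulation rather than anything shown here:

```python
def leading_component(samples):
    """First principal component of a list of vectors (e.g. layer energy
    deposits), found by power iteration on the covariance matrix."""
    n, d = len(samples), len(samples[0])
    mean = [sum(s[j] for s in samples) / n for j in range(d)]
    centred = [[s[j] - mean[j] for j in range(d)] for s in samples]
    cov = [[sum(x[i] * x[j] for x in centred) / n for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(200):                     # power iteration
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Synthetic two-layer deposits whose variance lies along (1, 1)/sqrt(2):
samples = [(e, e + 0.01 * (-1) ** k) for k, e in enumerate(range(10))]
axis = leading_component(samples)
```

Projecting each event onto such axes is what lets the calibration separate showers by their longitudinal development and apply depth-dependent corrections.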
ABSTRACT: In 2004 at the ATLAS (A Toroidal LHC ApparatuS) combined test beam, one slice of the ATLAS barrel detector (including an Inner Detector set-up and the Liquid Argon calorimeter) was exposed to particles from the H8 SPS beam line at CERN. It was the first opportunity to test the combined electron performance of ATLAS. This paper presents results obtained for the momentum measurement p with the Inner Detector and for the performance of the electron measurement with the LAr calorimeter (energy E linearity and resolution) in the presence of a magnetic field in the Inner Detector, for momenta ranging from 20 GeV/c to 100 GeV/c. Furthermore, the particle identification capabilities of the Transition Radiation Tracker, bremsstrahlung-recovery algorithms relying on the LAr calorimeter, results obtained for the E/p ratio and a method to extract scale parameters are discussed.
Article · Nov 2010 · Journal of Instrumentation
ABSTRACT: ATLAS is the biggest of the experiments aimed at studying high-energy particle interactions at the Large Hadron Collider (LHC). This paper describes the evolution of the Controls and Configuration system of the ATLAS Trigger and Data Acquisition (TDAQ) from the Technical Design Report (TDR) in 2003 to the first events taken at CERN with circulating beams in autumn 2008. The present functionality and performance are outlined, together with the lessons learned during development. We also highlight some of the challenges that still have to be met by 2010, when the full-scale trigger farm will be deployed.
Article · Nov 2010 · Nuclear Instruments and Methods in Physics Research Section A Accelerators Spectrometers Detectors and Associated Equipment
ABSTRACT: A fully instrumented slice of the ATLAS detector was exposed to test beams from the SPS (Super Proton Synchrotron) at CERN in 2004. In this paper, the results of the measurements of the response of the barrel calorimeter to hadrons with energies in the range 20-350 GeV, and beam impact points and angles corresponding to pseudo-rapidity values in the range 0.2-0.65, are reported. The results are compared to the predictions of a simulation program using the Geant 4 toolkit.
Article · Sep 2010 · Nuclear Instruments and Methods in Physics Research Section A Accelerators Spectrometers Detectors and Associated Equipment
ABSTRACT: The response of pions and protons in the energy range of 20–180 GeV, produced at CERN's SPS H8 test-beam line, in the ATLAS iron–scintillator Tile hadron calorimeter has been measured. The test-beam configuration allowed the measurement of the longitudinal shower development for pions and protons up to 20 nuclear interaction lengths. It was found that pions penetrate deeper into the calorimeter than protons, whereas protons induce showers that are laterally wider with respect to the direction of the impinging particle. Together with the measured total energy response, the pion-to-proton energy ratio and the resolution, all observations are consistent with a higher electromagnetic energy fraction in pion-induced showers. The data are compared with GEANT4 simulations using several hadronic physics lists. The measured longitudinal shower profiles are described by an analytical shower parametrization to an accuracy of 5–10%. The amount of energy leaking out behind the calorimeter is determined and parametrized as a function of the beam energy and the calorimeter depth, allowing a leakage correction of test-beam results in the standard projective geometry.
Article · Apr 2010 · Nuclear Instruments and Methods in Physics Research Section A Accelerators Spectrometers Detectors and Associated Equipment
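Longitudinal shower profiles are commonly described by a Gamma-function parametrization; the sketch below uses that standard form to estimate the contained and leaking fractions of a shower at a given depth. The shape parameters `a` and `b` here are arbitrary illustrative values, not the fitted values from the paper, and the paper's own parametrization may differ in detail:

```python
import math

def longitudinal_profile(t, a, b):
    """Standard Gamma parametrization of longitudinal shower development:
    fraction of the shower energy deposited per unit depth t."""
    return b * (b * t) ** (a - 1) * math.exp(-b * t) / math.gamma(a)

def containment(depth, a, b, steps=10000):
    """Fraction of the shower energy deposited up to `depth`, by midpoint
    integration; 1 - containment is the longitudinal leakage estimate."""
    dt = depth / steps
    return sum(longitudinal_profile((i + 0.5) * dt, a, b) * dt
               for i in range(steps))

# Illustrative shape parameters (not the paper's fitted values):
frac = containment(9.0, a=2.0, b=0.5)
leak = 1.0 - frac
```

Parametrizing `leak` as a function of beam energy and calorimeter depth is what allows the leakage correction of test-beam results mentioned in the abstract.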
ABSTRACT: ATLAS is a general-purpose experiment in high-energy physics at the Large Hadron Collider at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system is a distributed computing system responsible for transferring and filtering the physics data from the experiment to mass storage. The TDAQ software has been developed since 1998 by a team of a few dozen developers. It is used to integrate all ATLAS subsystems participating in data taking, providing the framework and APIs for building the software components of the TDAQ system. It is currently composed of more than 200 software packages, which are made available to ATLAS users in the form of regular software releases. The software is available for development on a shared filesystem and on test beds, and is deployed to the ATLAS pit, where it is used for data taking. The paper describes the working model, the policies and the tools used by software developers and librarians to develop, release, deploy and maintain the TDAQ software over the long period of development, commissioning and running of the TDAQ system. In particular, the patching and distribution model based on RPM packaging is discussed, which is important for software maintained for a long period on a running production system.
ABSTRACT: A fully instrumented slice of the ATLAS central detector was exposed to test beams from the SPS (Super Proton Synchrotron) at CERN in 2004. In this paper, the response of the central calorimeters to pions with energies in the range between 3 and 9 GeV is presented. The linearity and the resolution of the combined calorimetry (electromagnetic and hadronic calorimeters) were measured and compared to the prediction of a detector simulation program using the Geant 4 toolkit.
Article · Aug 2009 · Nuclear Instruments and Methods in Physics Research Section A Accelerators Spectrometers Detectors and Associated Equipment
ABSTRACT: We report test-beam studies of 11% of the production ATLAS Tile Calorimeter modules. The modules were equipped with production front-end electronics and all the calibration systems planned for the final detector. The studies used muon, electron and hadron beams ranging in energy from 3 to 350 GeV. Two independent studies showed that the light yield of the calorimeter was ∼70 pe/GeV, exceeding the design goal by 40%. Electron beams provided a calibration of the modules at the electromagnetic energy scale; over 200 calorimeter cells the variation of the response was 2.4%. The linearity with energy was also measured. Muon beams provided an intercalibration of the response of all calorimeter cells. The response to muons entering in the ATLAS projective geometry showed an RMS variation of 2.5% for 91 measurements over a range of rapidities and modules. The mean response to hadrons of fixed energy had an RMS variation of 1.4% for the modules and projective angles studied. The response to hadrons normalized to incident beam energy showed an 8% increase between 10 and 350 GeV, fully consistent with expectations for a non-compensating calorimeter. The measured energy resolution for hadrons, σ/E = 52.9%/√E ⊕ 5.7%, was also consistent with expectations. Further auxiliary studies were made of saturation recovery of the readout system, the time resolution of the calorimeter and the performance of the trigger signals from the calorimeter.
Article · Apr 2009 · Nuclear Instruments and Methods in Physics Research Section A Accelerators Spectrometers Detectors and Associated Equipment
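The quoted resolution combines a stochastic term falling as 1/√E with a constant term, added in quadrature (the ⊕ symbol). The small helper below (hypothetical, not from the paper) evaluates this form using the coefficients quoted in the abstract:

```python
def resolution(E, stochastic=0.529, constant=0.057):
    """Fractional energy resolution sigma/E for energy E in GeV:
    stochastic term (0.529/sqrt(E)) added in quadrature to the
    constant term (0.057), as quoted in the abstract."""
    return ((stochastic / E ** 0.5) ** 2 + constant ** 2) ** 0.5

r20 = resolution(20.0)   # fractional resolution for a 20 GeV hadron
```

At low energy the stochastic term dominates; at high energy the resolution flattens toward the 5.7% constant term, the expected behaviour for a non-compensating calorimeter.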
ABSTRACT: The ATLAS detector as installed in its experimental cavern at point 1 at CERN is described in this paper. A brief overview of the expected performance of the detector when the Large Hadron Collider begins operation is also presented.
Article · Aug 2008 · Journal of Instrumentation
ABSTRACT: The ATLAS conditions databases will be used to manage information of quite diverse nature and levels of complexity. The use of a relational database manager like Oracle, together with the object managers POOL and OKS developed in-house, poses special difficulties in browsing the available data while understanding its structure in a general way. This is particularly relevant for database browser projects, where it is difficult to link with the class-defining libraries generated by general frameworks such as Athena. A modular approach to tackling these problems is presented here.
The database infrastructure is under development using the LCG COOL infrastructure and provides a powerful information-sharing gateway across many different systems. The nature of the stored information ranges from temporal series of simple values up to very complex objects describing the configuration of systems like the ATLAS TDAQ infrastructure, including associations to large objects managed outside the database infrastructure.
An important example of this architecture is the Online Objects Extended Database BrowsEr (NODE), which is designed to access and display all data available in the ATLAS Monitoring Data Archive (MDA), including histograms and data tables. To deal with the special nature of the monitoring objects, a plugin from the MDA framework to the Time managed science Instrument Databases (TIDB2) is used. The database browser is extended, in particular, to include operations on histograms such as display, overlay and comparison, as well as commenting and local storage.
Article · Jul 2008 · Journal of Physics Conference Series