Modeling and Validating Time, Buffering, and Utilization of a Large-Scale, Real-Time Data Acquisition System

2017 International Conference on High Performance Computing & Simulation (HPCS)

Abstract
Data acquisition systems for large-scale high-energy physics experiments have to handle hundreds of gigabytes of data per second, and are typically implemented as specialized data centers that connect a very large number of front-end electronics devices to an event detection and storage system. The design of such systems is often based on many assumptions, small-scale experiments, and a substantial amount of over-provisioning. In this paper, we introduce a discrete event-based simulation tool that models the dataflow of the current ATLAS data acquisition system, with the main goal of accurately reproducing its main operational characteristics. We measure buffer occupancy by counting the number of elements in buffers, resource utilization by measuring output bandwidth and counting the number of active processing units, and the time evolution of both by comparing data over many consecutive short periods of time. We study the simulation error by comparing the results to a large amount of real-world operational data. We show which efforts are required to minimize the error for such a configuration, and explain possible reasons for the most important outliers we observe. Furthermore, we use this tool to derive an operational envelope of the system, which describes the minimal amount of resources required to fulfill certain real-time guarantees.
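The discrete event-based approach described above can be illustrated with a minimal sketch; this is not the paper's actual OMNeT++ model, but a toy single-stage analogue with one processing unit and a FIFO buffer, tracking the two quantities the abstract names: buffer occupancy and utilization. All names, arrival times, and the fixed service time are illustrative assumptions.

```python
import heapq

def simulate(arrivals, service_time, horizon):
    """Toy discrete-event simulation of one buffered processing unit.

    arrivals: sorted list of arrival times of data elements
    service_time: fixed time the unit needs per element (assumption)
    horizon: total simulated time span
    Returns (maximum buffer occupancy, unit utilization).
    """
    # Event list as a min-heap of (time, kind); ties resolve
    # "arrival" before "departure" by string order.
    events = [(t, "arrival") for t in arrivals]
    heapq.heapify(events)
    queue_len = 0        # elements currently waiting in the buffer
    busy_until = 0.0     # time at which the processing unit frees up
    busy_time = 0.0      # accumulated service time, for utilization
    max_occupancy = 0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            if t >= busy_until and queue_len == 0:
                # Unit idle and buffer empty: serve immediately.
                busy_until = t + service_time
                busy_time += service_time
                heapq.heappush(events, (busy_until, "departure"))
            else:
                # Unit busy: park the element in the buffer.
                queue_len += 1
                max_occupancy = max(max_occupancy, queue_len)
        else:
            # Departure: pull the next buffered element, if any.
            if queue_len > 0:
                queue_len -= 1
                busy_until = t + service_time
                busy_time += service_time
                heapq.heappush(events, (busy_until, "departure"))
    return max_occupancy, busy_time / horizon

# Example: four elements arriving faster than they can be served.
max_occ, util = simulate([0, 1, 2, 3], service_time=2.0, horizon=10.0)
print(max_occ, util)  # → 2 0.8
```

In the same spirit as the paper, comparing such simulated occupancy and utilization traces against recorded operational data over many short time windows is what allows the simulation error to be quantified.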
Key words
CERN, ATLAS, Data Acquisition System, Simulation, Modeling, OMNeT++