
Real Time Global Tests of the ALICE High Level Trigger Data Transport Framework

Published by: Jean Cleymans
Publication date: 2008
Research field: Physics
Language: English





The High Level Trigger (HLT) system of the ALICE experiment is an online event filter and trigger system designed for input bandwidths of up to 25 GB/s at event rates of up to 1 kHz. The system is designed as a scalable PC cluster comprising several hundred nodes. Data transport within the system is handled by an object-oriented data flow framework operating on the basis of the publisher-subscriber principle; it is fully pipelined, with minimal processing overhead and communication latency in the cluster. In this paper, we report the latest measurements in which this framework was operated on five different sites over a global north-south link extending more than 10,000 km, processing a "real-time data flow".
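The abstract does not show the framework's interface, but the publisher-subscriber principle it names can be illustrated with a minimal C++ sketch. All names below (Publisher, Subscriber, EventDescriptor, ProcessingStage) are hypothetical stand-ins, not the actual ALICE HLT API; the point is that events travel as lightweight descriptors through chained components, which is what allows a fully pipelined, low-overhead data flow.

```cpp
// Minimal publisher-subscriber sketch (hypothetical names, not the real
// ALICE HLT API). Events are passed as lightweight descriptors so that
// components can be chained into a pipeline without copying payloads.
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

struct EventDescriptor {        // a reference to data, not the data itself
    uint64_t eventId;
    const std::byte* payload;   // e.g. a block in shared memory
    std::size_t size;
};

class Subscriber {
public:
    virtual ~Subscriber() = default;
    virtual void NewEvent(const EventDescriptor& ev) = 0;
};

class Publisher {
public:
    void Subscribe(Subscriber* s) { fSubscribers.push_back(s); }
    // Announce an event to all subscribers; each may process it and
    // republish a derived descriptor, forming a processing pipeline.
    void Publish(const EventDescriptor& ev) {
        for (Subscriber* s : fSubscribers) s->NewEvent(ev);
    }
private:
    std::vector<Subscriber*> fSubscribers;
};

// A processing stage is both a subscriber (input) and a publisher (output).
class ProcessingStage : public Subscriber, public Publisher {
public:
    void NewEvent(const EventDescriptor& ev) override {
        // ... process ev.payload here (filter, reconstruct, compress) ...
        Publish(ev);  // forward the event (or a derived one) downstream
    }
};

int main() {
    class Sink : public Subscriber {
        void NewEvent(const EventDescriptor& ev) override {
            std::cout << "event " << ev.eventId << " reached the sink\n";
        }
    } sink;

    Publisher source;
    ProcessingStage stage;
    source.Subscribe(&stage);   // source -> stage -> sink
    stage.Subscribe(&sink);

    source.Publish({42, nullptr, 0});  // dummy event descriptor
}
```

In this shape, adding nodes to the cluster amounts to subscribing more stages to a publisher, which is one way a design like this can scale to several hundred nodes.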




Read also

The ALICE High-Level Trigger processes data online, to either select interesting (sub-)events, or to compress data efficiently by modeling techniques. Focusing on the main data source, the Time Projection Chamber, the architecture of the system and the current state of the tracking and compression methods are outlined.
The ALICE High Level Trigger has to process data online, in order to select interesting (sub)events or to compress data efficiently by modeling techniques. Focusing on the main data source, the Time Projection Chamber (TPC), we present two pattern recognition methods under investigation: a sequential approach (cluster finder and track follower) and an iterative approach (track candidate finder and cluster deconvoluter). We show that the former is suited for pp and low-multiplicity PbPb collisions, whereas the latter might be applicable for high-multiplicity PbPb collisions, if it turns out that more than 8000 charged particles would have to be reconstructed inside the TPC. Based on the developed tracking schemes, we show that using modeling techniques a compression factor of around 10 might be achievable.
The High Level Trigger (HLT) of the ALICE experiment requires massive parallel computing. One of the main tasks of the HLT system is two-dimensional cluster finding on raw data of the Time Projection Chamber (TPC), which is the main data source of ALICE. To reduce the number of computing nodes needed in the HLT farm, FPGAs, which are an intrinsic part of the system, will be utilized for this task. VHDL code implementing the Fast Cluster Finder algorithm has been written, a testbed for functional verification of the code has been developed, and the code has been synthesized.
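As a rough software illustration of what two-dimensional cluster finding on TPC raw data involves (the implementation the abstract describes is VHDL on FPGAs, and this simple flood-fill sketch makes no claim to match it), one can group neighbouring pad/time samples above threshold and report each group's charge-weighted centroid:

```cpp
// Conceptual 2D cluster finder sketch (illustrative only; the names and
// the flood-fill strategy are assumptions, not the paper's algorithm).
#include <cstddef>
#include <iostream>
#include <utility>
#include <vector>

struct Cluster { double pad, time, charge; };  // charge-weighted centroid

// Group neighbouring samples above threshold into clusters and return
// the charge-weighted centre of gravity of each cluster.
std::vector<Cluster> FindClusters(std::vector<std::vector<int>>& adc,
                                  int threshold) {
    std::vector<Cluster> clusters;
    const std::size_t nPads = adc.size(), nTime = adc[0].size();
    for (std::size_t p = 0; p < nPads; ++p)
        for (std::size_t t = 0; t < nTime; ++t) {
            if (adc[p][t] < threshold) continue;
            // Flood-fill the connected region (4-neighbourhood).
            double sumQ = 0, sumP = 0, sumT = 0;
            std::vector<std::pair<std::size_t, std::size_t>> stack{{p, t}};
            while (!stack.empty()) {
                auto [i, j] = stack.back(); stack.pop_back();
                if (i >= nPads || j >= nTime || adc[i][j] < threshold) continue;
                sumQ += adc[i][j]; sumP += i * adc[i][j]; sumT += j * adc[i][j];
                adc[i][j] = 0;  // mark the sample as consumed
                stack.push_back({i + 1, j}); stack.push_back({i - 1, j});
                stack.push_back({i, j + 1}); stack.push_back({i, j - 1});
            }
            clusters.push_back({sumP / sumQ, sumT / sumQ, sumQ});
        }
    return clusters;
}

int main() {
    std::vector<std::vector<int>> adc = {
        {0, 0, 0, 0}, {0, 5, 9, 0}, {0, 3, 0, 0}, {0, 0, 0, 7}};
    for (const auto& c : FindClusters(adc, 1))
        std::cout << "cluster at pad " << c.pad << ", time " << c.time
                  << ", charge " << c.charge << '\n';
}
```

The per-sample, local nature of this operation is what makes it a good fit for FPGA offloading: each pad row can be scanned in a streaming fashion as the raw data arrives.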
The ALICE High Level Trigger has to process data online, in order to select interesting (sub)events or to compress data efficiently by modeling techniques. Focusing on the main data source, the Time Projection Chamber (TPC), we present two pattern recognition methods under investigation: a sequential approach (cluster finder and track follower) and an iterative approach (track candidate finder and cluster deconvoluter). We show that the former is suited for pp and low-multiplicity PbPb collisions, whereas the latter might be applicable for high-multiplicity PbPb collisions with dN/dy > 3000. Based on the developed tracking schemes, we show that using modeling techniques a compression factor of around 10 might be achievable.
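The abstracts above do not spell out the modeling-based compression, but a common idea behind such schemes, sketched here under invented bit widths and field names, is to replace the many raw ADC samples belonging to a cluster with a few quantized model parameters:

```cpp
// Sketch of cluster-model compression (illustrative assumptions only):
// instead of raw ADC samples, store each cluster as a few quantized
// parameters; unpacking recovers them within quantization precision.
#include <cstdint>
#include <iostream>

struct Cluster { double pad, time, charge; };

// Pack a cluster into 32 bits: 10-bit pad and 12-bit time in fixed point
// (1/4 units) plus a 10-bit charge. Widths are invented for illustration.
uint32_t Pack(const Cluster& c) {
    uint32_t pad    = static_cast<uint32_t>(c.pad  * 4.0) & 0x3FF;
    uint32_t time   = static_cast<uint32_t>(c.time * 4.0) & 0xFFF;
    uint32_t charge = static_cast<uint32_t>(c.charge)     & 0x3FF;
    return (pad << 22) | (time << 10) | charge;
}

Cluster Unpack(uint32_t w) {
    return { ((w >> 22) & 0x3FF) / 4.0,
             ((w >> 10) & 0xFFF) / 4.0,
             static_cast<double>(w & 0x3FF) };
}

int main() {
    Cluster c{37.25, 512.5, 430.0};
    Cluster r = Unpack(Pack(c));
    std::cout << r.pad << ' ' << r.time << ' ' << r.charge << '\n';
    // A cluster spans many raw ADC words; replacing them with one packed
    // word per cluster is the kind of saving a factor ~10 can come from.
}
```

Tying the stored parameters to reconstructed tracks, as the abstracts suggest, allows even coarser quantization of residuals, which is where model-based schemes gain over plain entropy coding.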
The CMS experiment will collect data from the proton-proton collisions delivered by the Large Hadron Collider (LHC) at a centre-of-mass energy of up to 14 TeV. The CMS trigger system is designed to cope with unprecedented luminosities and LHC bunch-crossing rates of up to 40 MHz. The unique CMS trigger architecture employs only two trigger levels. The Level-1 trigger is implemented using custom electronics, while the High Level Trigger (HLT) is based on software algorithms running on a large cluster of commercial processors, the Event Filter Farm. We present the major functionalities of the CMS High Level Trigger system as of the start of LHC beam operations in September 2008. The validation of the HLT system in the online environment with Monte Carlo simulated data and its commissioning during cosmic ray data-taking campaigns are discussed in detail. We conclude with a description of the HLT operations with the first circulating LHC beams before the incident that occurred on 19 September 2008.