
Many-core applications to online track reconstruction in HEP experiments

Submitted by Peter Wittich
Publication date: 2013
Paper language: English





Interest in parallel architectures applied to real-time selections is growing in High Energy Physics (HEP) experiments. In this paper we describe performance measurements of Graphics Processing Units (GPUs) and the Intel Many Integrated Core (MIC) architecture when applied to a typical HEP online task: the selection of events based on the trajectories of charged particles. As a benchmark we use a scaled-up version of the algorithm used at the CDF experiment at the Tevatron for online track reconstruction - the SVT algorithm - as a realistic test case for low-latency trigger systems using new computing architectures for LHC experiments. We examine the complexity/performance trade-off in porting existing serial algorithms to many-core devices. Measurements of both data processing and data transfer latency are shown, considering different I/O strategies to/from the parallel devices.
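As a rough illustration of the kind of latency measurement discussed in the abstract (this is a generic sketch, not the code used in the paper), the CUDA example below times the host-to-device copy, a placeholder per-track kernel, and the device-to-host copy with CUDA events. The kernel, data layout and sizes are hypothetical stand-ins for one step of an SVT-like track fit.

```cuda
// Generic latency-measurement sketch (not the SVT implementation):
// time H2D copy, kernel execution and D2H copy separately with CUDA events.
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical per-track kernel: each thread reduces the hits of one track
// to a single fitted parameter (a toy stand-in for a linearized track fit).
__global__ void fitTracks(const float* hits, float* params,
                          int nTracks, int hitsPerTrack) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= nTracks) return;
    float sum = 0.0f;
    for (int h = 0; h < hitsPerTrack; ++h)
        sum += hits[i * hitsPerTrack + h];
    params[i] = sum / hitsPerTrack;
}

int main() {
    const int nTracks = 1 << 16, hitsPerTrack = 6;
    size_t inBytes  = (size_t)nTracks * hitsPerTrack * sizeof(float);
    size_t outBytes = (size_t)nTracks * sizeof(float);

    float *hIn, *hOut, *dIn, *dOut;
    cudaMallocHost(&hIn, inBytes);   // pinned host memory (one possible I/O choice)
    cudaMallocHost(&hOut, outBytes);
    cudaMalloc(&dIn, inBytes);
    cudaMalloc(&dOut, outBytes);
    for (size_t i = 0; i < (size_t)nTracks * hitsPerTrack; ++i) hIn[i] = 1.0f;

    cudaEvent_t t0, t1, t2, t3;
    cudaEventCreate(&t0); cudaEventCreate(&t1);
    cudaEventCreate(&t2); cudaEventCreate(&t3);

    cudaEventRecord(t0);
    cudaMemcpy(dIn, hIn, inBytes, cudaMemcpyHostToDevice);
    cudaEventRecord(t1);
    fitTracks<<<(nTracks + 255) / 256, 256>>>(dIn, dOut, nTracks, hitsPerTrack);
    cudaEventRecord(t2);
    cudaMemcpy(hOut, dOut, outBytes, cudaMemcpyDeviceToHost);
    cudaEventRecord(t3);
    cudaEventSynchronize(t3);

    float msIn, msKernel, msOut;
    cudaEventElapsedTime(&msIn, t0, t1);     // H2D transfer latency
    cudaEventElapsedTime(&msKernel, t1, t2); // processing latency
    cudaEventElapsedTime(&msOut, t2, t3);    // D2H transfer latency
    printf("H2D %.3f ms, kernel %.3f ms, D2H %.3f ms\n", msIn, msKernel, msOut);

    cudaFree(dIn); cudaFree(dOut);
    cudaFreeHost(hIn); cudaFreeHost(hOut);
    return 0;
}
```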




Read also

Interest in many-core architectures applied to real-time selections is growing in High Energy Physics (HEP) experiments. In this paper we describe performance measurements of many-core devices when applied to a typical HEP online task: the selection of events based on the trajectories of charged particles. As a benchmark we use a scaled-up version of the algorithm used at the CDF experiment at the Tevatron for online track reconstruction - the SVT algorithm - as a realistic test case for low-latency trigger systems using new computing architectures for LHC experiments. We examine the complexity/performance trade-off in porting existing serial algorithms to many-core devices. We measure the performance of different architectures (Intel Xeon Phi and AMD GPUs, in addition to NVidia GPUs) and different software environments (OpenCL, in addition to NVidia CUDA). Measurements of both data processing and data transfer latency are shown, considering different I/O strategies to/from the many-core devices.
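To illustrate what "different I/O strategies" can mean in practice, the sketch below (a generic CUDA example, not the paper's implementation) splits the input into chunks and uses pinned host memory with several streams, so that transfers to and from the device overlap with kernel execution; all names and sizes are illustrative.

```cuda
// Generic sketch of one alternative I/O strategy: chunked, asynchronous
// transfers on multiple CUDA streams overlapping with kernel execution.
// The kernel is a placeholder, not the SVT algorithm.
#include <cuda_runtime.h>

__global__ void processChunk(const float* in, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = 2.0f * in[i];   // placeholder per-element work
}

int main() {
    const int nStreams = 4, chunk = 1 << 18, total = nStreams * chunk;

    float *hIn, *hOut, *dIn, *dOut;
    cudaMallocHost(&hIn,  total * sizeof(float));  // pinned memory is needed for async copies
    cudaMallocHost(&hOut, total * sizeof(float));
    cudaMalloc(&dIn,  total * sizeof(float));
    cudaMalloc(&dOut, total * sizeof(float));
    for (int i = 0; i < total; ++i) hIn[i] = (float)i;

    cudaStream_t s[nStreams];
    for (int k = 0; k < nStreams; ++k) cudaStreamCreate(&s[k]);

    // Each stream copies its chunk in, processes it and copies it back;
    // copies in one stream can overlap with kernels in another.
    for (int k = 0; k < nStreams; ++k) {
        size_t off = (size_t)k * chunk;
        cudaMemcpyAsync(dIn + off, hIn + off, chunk * sizeof(float),
                        cudaMemcpyHostToDevice, s[k]);
        processChunk<<<(chunk + 255) / 256, 256, 0, s[k]>>>(dIn + off, dOut + off, chunk);
        cudaMemcpyAsync(hOut + off, dOut + off, chunk * sizeof(float),
                        cudaMemcpyDeviceToHost, s[k]);
    }
    cudaDeviceSynchronize();   // wait for all streams to finish

    for (int k = 0; k < nStreams; ++k) cudaStreamDestroy(s[k]);
    cudaFree(dIn); cudaFree(dOut);
    cudaFreeHost(hIn); cudaFreeHost(hOut);
    return 0;
}
```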
Petabytes of data have to be processed and stored in high energy physics (HEP) event simulation, requiring millions of CPU-years. This enormous demand is handled in worldwide distributed computing centers as part of the LHC Computing Grid. These significant resources require high-quality and efficient production and the early detection of potential errors. In this article we present novel monitoring techniques in a Grid environment to collect quality measures during job execution. This allows online assessment of data quality information to avoid configuration errors or inappropriate settings of simulation parameters, and is therefore able to save time and resources.
Due to concerns about the reliability, bandwidth and mass of future optical links in LHC experiments, we are investigating CW lasers and light modulators as an alternative to VCSELs. These links will be particularly useful if they utilize light modulators which are very small, low power, high bandwidth, and very radiation hard. We have constructed a test system with 3 such links, each operating at 10 Gb/s. We present the quality of these links (jitter, rise and fall time, BER) and eye mask margins (10GbE) for 3 different types of modulators: LiNbO3-based, InP-based, and Si-based. We present the results of radiation hardness measurements with up to ~10^12 protons/cm^2 and ~65 krad total ionizing dose (TID), confirming no single event effects (SEE) at 10 Gb/s with any of the 3 types of modulators. These optical links will be an integral part of intelligent tracking systems at various scales, from coupled sensors through intra-module and off-detector communication. We have used a Si-based photonic transceiver to build a complete 40 Gb/s bi-directional link (10 Gb/s in each of four fibers) over a 100 m run and have characterized it for comparison with standard VCSEL-based optical links. Some future developments of optical modulator-based high-bandwidth optical readout systems, and applications based on both fiber and free-space data links, such as local triggering, data readout and trigger-clock distribution, are also discussed.
Bjorn S. Wonsak, 2018
Unsegmented, large-volume liquid scintillator (LS) neutrino detectors have proven to be a key technology for low-energy neutrino physics. The efficient rejection of radionuclide background induced by cosmic muon interactions is of paramount importance for their success in high-precision MeV neutrino measurements. We present a novel technique to reconstruct GeV particle tracks in LS, whose main property, the resolution of topological features and changes in the differential energy loss $\mathrm{d}E/\mathrm{d}x$, allows for improved rejection strategies. In contrast to common track reconstruction approaches, our method does not rely on concrete track/topology hypotheses. Instead, based on a reference point in space and time, the observed distribution of photon arrival times at the photosensors, and the detector's characteristics in terms of photon production, propagation and detection (optical model), it reconstructs the voxelized distribution of optical photon emissions. Techniques from three-dimensional data analysis can then be applied to extract parameters describing the topology, e.g., the direction of a track. We performed a first performance evaluation of our method using single muon events with energies up to $10\,\mathrm{GeV}$ from a Geant4 simulation of the LENA detector. The current results indicate that our approach is competitive with existing reconstruction methods -- although its full potential has not yet been exploited. We also remark on other detector technologies in astroparticle physics as well as applications in medical imaging that could benefit from the fundamental ideas of our method.
Miriam Giorgini, 2008
The nuclear track detector CR39 was calibrated with different ions at different energies. Thanks to its low detection threshold (Z/beta ~ 6e) and good charge resolution (sigma_Z ~ 0.2e for 6e < Z/beta < 83e with 2 measurements), the detector was used for different purposes: (i) fragmentation of high- and medium-energy ions; (ii) searches for magnetic monopoles, nuclearites, strangelets and Q-balls in the cosmic radiation.