
Automated Reconstruction of Particle Cascades in High Energy Physics Experiments

Published by: Jan Steggemann
Publication date: 2008
Research field: Physics
Paper language: English





We present a procedure for reconstructing particle cascades from event data measured in a high energy physics experiment. For evaluating the hypothesis of a specific physics process causing the observed data, all possible reconstructi…
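As an illustration of hypothesis-based reconstruction in this spirit, the sketch below enumerates every assignment of measured objects to an assumed decay chain and scores each assignment with a simple chi-square against expected intermediate masses. The decay chain (t -> W b), mass values, resolutions, and helper names are illustrative assumptions, not the procedure of the paper.

# Hypothetical example: enumerate jet assignments for t -> W(->qq') b and keep
# the assignment with the lowest chi-square. Four-vectors are (E, px, py, pz).
import itertools

EXPECTED_MASSES = {"W": 80.4, "top": 172.5}   # assumed masses in GeV
SIGMA = {"W": 10.0, "top": 15.0}              # assumed mass resolutions in GeV

def invariant_mass(objects):
    """Invariant mass of a list of (E, px, py, pz) four-vectors."""
    e, px, py, pz = (sum(o[i] for o in objects) for i in range(4))
    return max(e * e - px * px - py * py - pz * pz, 0.0) ** 0.5

def score_hypothesis(assignment):
    """Chi-square of one (q, q', b) assignment against the W and top masses."""
    q1, q2, b = assignment
    m_w = invariant_mass([q1, q2])
    m_t = invariant_mass([q1, q2, b])
    return (((m_w - EXPECTED_MASSES["W"]) / SIGMA["W"]) ** 2 +
            ((m_t - EXPECTED_MASSES["top"]) / SIGMA["top"]) ** 2)

def best_reconstruction(jets):
    """Try every ordered choice of three jets and keep the best hypothesis."""
    return min(itertools.permutations(jets, 3), key=score_hypothesis)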




Read also

77 - Stefan Schmitt 2016
A selection of unfolding methods commonly used in High Energy Physics is compared. The methods discussed here are: bin-by-bin correction factors, matrix inversion, template fit, Tikhonov regularisation and two examples of iterative methods. Two procedures to choose the strength of the regularisation are tested, namely the L-curve scan and a scan of global correlation coefficients. The advantages and disadvantages of the unfolding methods and choices of the regularisation strength are discussed using a toy example.
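As a rough illustration of two of the methods compared above, the toy sketch below unfolds a three-bin spectrum by plain matrix inversion and by Tikhonov regularisation; the response matrix, measured counts, and regularisation strength are invented for the example and do not come from the paper.

import numpy as np

# Toy response matrix R[i, j] = P(reconstructed in bin i | true in bin j)
R = np.array([[0.8, 0.15, 0.0],
              [0.2, 0.70, 0.2],
              [0.0, 0.15, 0.8]])
measured = np.array([120.0, 200.0, 150.0])   # observed bin contents

# 1) Plain matrix inversion: unbiased, but amplifies statistical fluctuations.
unfolded_inv = np.linalg.solve(R, measured)

# 2) Tikhonov regularisation: minimise |R x - y|^2 + tau * |L x|^2,
#    here with L = identity and a hand-picked regularisation strength tau
#    (the L-curve or global-correlation scans mentioned above would choose tau).
tau = 0.1
L = np.eye(R.shape[1])
unfolded_tik = np.linalg.solve(R.T @ R + tau * (L.T @ L), R.T @ measured)

print(unfolded_inv, unfolded_tik)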
Rapidly applying the effects of detector response to physics objects (e.g. electrons, muons, showers of particles) is essential in high energy physics. Currently available tools for the transformation from truth-level physics objects to reconstructed detector-level physics objects involve manually defining resolution functions. These resolution functions are typically derived in bins of variables that are correlated with the resolution (e.g. pseudorapidity and transverse momentum). This process is time consuming, requires manual updates when detector conditions change, and can miss important correlations. Machine learning offers a way to automate the process of building these truth-to-reconstructed object transformations and can capture complex correlations for any given set of input variables. Such machine learning algorithms, with sufficient optimization, could have a wide range of applications: improving phenomenological studies by using a better detector representation, allowing for more efficient production of Geant4 simulation by only simulating events within an interesting part of phase space, and studies on future experimental sensitivity to new physics.
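A minimal sketch of the manually defined resolution functions described above, assuming a hypothetical two-by-two binning in |eta| and pT: the relative resolution is looked up from a table and used to smear a truth-level pT into a reconstructed-level pT. A learned model would replace the lookup table with a function of all inputs, capturing correlations the binned table cannot.

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical relative pT resolution, binned in |eta| (rows) and pT (columns)
RESOLUTION = np.array([[0.05, 0.03],   # |eta| < 1.5:  pT < 50 GeV, pT >= 50 GeV
                       [0.10, 0.06]])  # |eta| >= 1.5

def smear_pt(pt_truth, eta):
    """Smear a truth-level pT with a Gaussian whose width is looked up per bin."""
    eta_bin = 0 if abs(eta) < 1.5 else 1
    pt_bin = 0 if pt_truth < 50.0 else 1
    sigma_rel = RESOLUTION[eta_bin, pt_bin]
    return rng.normal(pt_truth, sigma_rel * pt_truth)

print(smear_pt(40.0, 0.3), smear_pt(120.0, 2.1))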
98 - Dimitri Bourilkov 2019
The many ways in which machine and deep learning are transforming the analysis and simulation of data in particle physics are reviewed. The main methods based on boosted decision trees and various types of neural networks are introduced, and cutting-edge applications in the experimental and theoretical/phenomenological domains are highlighted. After describing the challenges in the application of these novel analysis techniques, the review concludes by discussing the interactions between physics and machine learning as a two-way street enriching both disciplines and helping to meet the present and future challenges of data-intensive science at the energy and intensity frontiers.
Muons are the most abundant charged particles arriving at sea level, originating from the decay of secondary charged pions and kaons. These secondary particles are created when high-energy cosmic rays hit the atmosphere and interact with air nuclei, initiating cascades of secondary particles that lead to the formation of extensive air showers (EAS). They carry essential information about the extra-terrestrial events and are characterized by a large flux and a varying angular distribution. To address open questions about the origin of cosmic rays, one needs to study the various components of cosmic rays as a function of energy and arrival direction. Because of the close relation between muon and neutrino production, the muon is the most important particle to keep track of. We propose a novel tracking algorithm based on the geometric deep learning approach, using a graph structure to incorporate domain knowledge, to track cosmic-ray muons in our 3-D scintillator detector. The detector is modeled using the GEANT4 simulation package and EAS are simulated using CORSIKA (COsmic Ray SImulations for KAscade), with a focus on muons originating from EAS. We shed some light on the performance, robustness towards noise and double hits, limitations, and application of the proposed algorithm in tracking applications, with the possibility to generalize to other detectors for astrophysical and collider experiments.
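As an illustration of the graph-building step that such a geometric-deep-learning tracker typically starts from, the sketch below turns detector hits into graph nodes and connects hits on consecutive layers that lie close together; the hit format, layer logic, and distance cut are assumptions for illustration, not the geometry of the detector in the paper.

import numpy as np

def build_hit_graph(hits, max_dr=5.0):
    """hits: list of (layer, x, y) tuples. Returns node features and edge list."""
    nodes = np.array([[x, y] for _, x, y in hits], dtype=float)
    edges = []
    for i, (li, xi, yi) in enumerate(hits):
        for j, (lj, xj, yj) in enumerate(hits):
            # connect hits on consecutive layers that are spatially compatible
            if lj == li + 1 and np.hypot(xj - xi, yj - yi) < max_dr:
                edges.append((i, j))
    return nodes, edges

# Toy event: three hits from one muon track plus one noise hit
hits = [(0, 0.0, 0.0), (1, 1.0, 0.5), (2, 2.1, 1.1), (1, 8.0, -4.0)]
nodes, edges = build_hit_graph(hits)
print(edges)   # a graph neural network would then classify these edges or nodes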
86 - Andrew Fowlie 2021
I would like to thank Junk and Lyons (arXiv:2009.06864) for beginning a discussion about replication in high-energy physics (HEP). Junk and Lyons ultimately argue that HEP learned its lessons the hard way through past failures and that other fields could learn from our procedures. They emphasize that experimental collaborations would risk their legacies were they to make a type-1 error in a search for new physics and outline the vigilance taken to avoid one, such as data blinding and a strict $5\sigma$ threshold. The discussion, however, ignores an elephant in the room: there are regularly anomalies in searches for new physics that result in substantial scientific activity but don't replicate with more data.