
NNDrone: a toolkit for the mass application of machine learning in High Energy Physics

Posted by Sean Benson
Publication date: 2017
Research field: Physics
Paper language: English





Machine learning has proven to be an indispensable tool in the selection of interesting events in high energy physics. Such technologies will become increasingly important as detector upgrades are introduced and data rates increase by orders of magnitude. We propose a toolkit to enable the creation of a drone classifier from any machine learning classifier, such that different classifiers may be standardised into a single form and executed in parallel. We demonstrate the capability of the drone neural network to learn the required properties of the input neural network without the use of any labels from the training data, only using appropriate questioning of the input neural network.
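The drone construction described above is, in essence, label-free distillation: a compact network is regressed onto the response of the original classifier, using an unlabelled sample only to question it. The minimal sketch below illustrates that idea with generic scikit-learn stand-ins; the choice of a gradient-boosted classifier as the "original", the toy dataset, and all hyperparameters are illustrative assumptions rather than the NNDrone implementation.

```python
# Minimal sketch of the drone idea: a small network learns to mimic an
# existing classifier purely by querying its response on unlabelled data.
# The classifiers, toy dataset and hyperparameters are illustrative
# stand-ins, not the NNDrone implementation itself.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.neural_network import MLPRegressor

X, y = make_classification(n_samples=10000, n_features=8, random_state=0)
X_train, y_train = X[:5000], y[:5000]   # labelled sample trains the "original"
X_unlabelled = X[5000:]                 # truth labels deliberately unused below

# Hypothetical original classifier (could be any trained model).
original = GradientBoostingClassifier().fit(X_train, y_train)

# Question the original classifier on the unlabelled sample.
teacher_response = original.predict_proba(X_unlabelled)[:, 1]

# The drone: a small network regressed onto the original's response.
drone = MLPRegressor(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)
drone.fit(X_unlabelled, teacher_response)

# The drone now approximates the original and can run in its place.
agreement = np.corrcoef(drone.predict(X_unlabelled), teacher_response)[0, 1]
print(f"drone vs. original response correlation: {agreement:.3f}")
```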


Read also

We investigate a new structure for machine learning classifiers applied to problems in high-energy physics by expanding the inputs to include not only measured features but also physics parameters. The physics parameters represent a smoothly varying learning task, and the resulting parameterized classifier can smoothly interpolate between them and replace sets of classifiers trained at individual values. This simplifies the training process and gives improved performance at intermediate values, even for complex problems requiring deep learning. Applications include tools parameterized in terms of theoretical model parameters, such as the mass of a particle, which allow for a single network to provide improved discrimination across a range of masses. This concept is simple to implement and allows for optimized interpolatable results.
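A minimal sketch of the parameterised-classifier idea follows: the physics parameter (here a toy signal "mass") is appended to the feature vector while training at a few parameter points, and the same network is then evaluated at an intermediate value it never saw. The toy event generator, network architecture, and mass points are assumptions for illustration only.

```python
# Sketch of a parameterised classifier: the theory parameter (a toy signal
# "mass") is appended to the features, so one network interpolates between
# mass points instead of training one classifier per mass.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def make_events(mass, n, signal):
    """Toy events: one observable peaks at `mass` for signal, is flat for background."""
    obs = rng.normal(mass, 5.0, n) if signal else rng.uniform(50.0, 150.0, n)
    extra = rng.normal(0.0, 1.0, n)
    return np.column_stack([obs, extra])

train_masses = [75.0, 100.0, 125.0]           # parameter values used in training
X_parts, y_parts = [], []
for m in train_masses:
    sig = make_events(m, 2000, signal=True)
    bkg = make_events(m, 2000, signal=False)
    feats = np.vstack([sig, bkg])
    # Append the physics parameter as an extra input column.
    X_parts.append(np.column_stack([feats, np.full(len(feats), m)]))
    y_parts.append(np.r_[np.ones(2000), np.zeros(2000)])

clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
clf.fit(np.vstack(X_parts), np.concatenate(y_parts))

# Evaluate at an intermediate mass never seen in training.
test = np.column_stack([make_events(110.0, 5, signal=True), np.full(5, 110.0)])
print(clf.predict_proba(test)[:, 1])          # signal probabilities at m = 110
```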
Xin Xia, Teng Li, Xing-Tao Huang (2017)
A fast physics analysis framework has been developed based on SNiPER to process the increasingly large data sample collected by BESIII. In this framework, a reconstructed event data model with SmartRef is designed to improve the speed of Input/Output operations, and necessary physics analysis tools are migrated from BOSS to SNiPER. A real physics analysis, $e^{+}e^{-} \rightarrow \pi^{+}\pi^{-}J/\psi$, is used to test the new framework, and achieves a factor of 10.3 improvement in Input/Output speed compared to BOSS. Further tests show that the improvement is mainly attributed to the new reconstructed event data model and the lazy-loading functionality provided by SmartRef.
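The quoted speed-up rests on lazy loading: the event holds a lightweight reference, and the referenced object is only deserialised when an analysis actually follows it. The sketch below illustrates that pattern generically; LazyRef and read_track_from_file are hypothetical names and do not reflect the actual SNiPER/SmartRef interface.

```python
# Conceptual sketch of the lazy-loading idea behind a SmartRef-like handle:
# the referenced object is read from storage only when it is first
# dereferenced, so events whose reconstructed objects are never touched
# cost no I/O. Names are illustrative, not the SNiPER/SmartRef API.
class LazyRef:
    def __init__(self, loader, key):
        self._loader = loader      # callable that performs the real read
        self._key = key            # where to find the object on disk
        self._cached = None

    def get(self):
        if self._cached is None:                 # first dereference triggers I/O
            self._cached = self._loader(self._key)
        return self._cached

def read_track_from_file(key):
    print(f"reading {key} from disk")            # stands in for real file I/O
    return {"pt": 1.2, "charge": -1}

event = {"track_ref": LazyRef(read_track_from_file, "evt42/track0")}

# No I/O has happened yet; it occurs only when the analysis needs the track.
track = event["track_ref"].get()
print(track["pt"])
```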
This paper reports the results of an experiment in high energy physics: using the power of the crowd to solve difficult experimental problems linked to tracking accurately the trajectory of particles in the Large Hadron Collider (LHC). This experiment took the form of a machine learning challenge organized in 2018: the Tracking Machine Learning Challenge (TrackML). Its results were discussed at the competition session at the Neural Information Processing Systems conference (NeurIPS 2018). Given 100,000 points, the participants had to connect them into about 10,000 arcs of circles, following the trajectory of particles issued from very high energy proton collisions. The competition was difficult, with a dozen front-runners well ahead of the pack. The single competition score is shown to be accurate and effective in selecting the best algorithms from the domain point of view. The competition has exposed a diversity of approaches, with various roles for Machine Learning, a number of which are discussed in the document.
B. Messerly, R. Fine, A. Olivier (2021)
We introduce the MINERvA Analysis Toolkit (MAT), a utility for centralizing the handling of systematic uncertainties in HEP analyses. The fundamental utilities of the toolkit are the MnvHnD, a powerful histogram container class, and the systematic Universe classes, which provide a modular implementation of the many-universe error analysis approach. These products can be used stand-alone or as part of a complete error analysis prescription. They support the propagation of systematic uncertainty through all stages of analysis, and provide flexibility for an arbitrary level of user customization. This extensible solution to error analysis enables the standardization of systematic uncertainty definitions across an experiment and a transparent user interface to lower the barrier to entry for new analyzers.
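The many-universe approach can be pictured as re-running the same histogram-filling code once per systematically shifted "universe" and taking the spread across universes as the error band. The sketch below shows that pattern with a toy energy-scale systematic; the histogram, shift size, and function names are invented for illustration and are not the MAT (MnvHnD/Universe) interface.

```python
# Sketch of the many-universe approach to systematic uncertainties:
# every universe applies its own shifted parameter (a toy energy scale),
# the same histogram is filled in each, and the spread across universes
# gives the error band. All numbers here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
energies = rng.exponential(2.0, 10000)           # toy reconstructed energies (GeV)
bins = np.linspace(0.0, 10.0, 21)

def fill_histogram(scale):
    """Re-run the 'analysis' with the energy scale shifted for this universe."""
    counts, _ = np.histogram(energies * scale, bins=bins)
    return counts

central = fill_histogram(1.0)

# A band of universes for a +/-2% energy-scale systematic.
universe_scales = rng.normal(1.0, 0.02, 100)
universe_counts = np.array([fill_histogram(s) for s in universe_scales])

# Systematic error band per bin: spread of the universes around the central value.
sys_band = universe_counts.std(axis=0)
for lo, hi, c, e in zip(bins[:-1], bins[1:], central, sys_band):
    print(f"[{lo:4.1f},{hi:4.1f}) GeV: {c:5d} +/- {e:4.1f} (sys)")
```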
Gang Li, Libo Liao, Xinchou Lou (2021)
Several high energy $e^{+}e^{-}$ colliders are proposed as Higgs factories by the international high energy physics community. One of the most important goals of these projects is to study the Higgs properties, such as its couplings, mass, width, and production rate, with unprecedented precision. Precision studies of the Higgs boson should be the priority and drive the design and optimization of detectors. A global analysis approach based on the multinomial distribution and Machine Learning techniques is proposed to realize an "end-to-end" analysis and to enhance the precision of all accessible decay branching fractions of the Higgs significantly. A proof-of-principle Monte Carlo simulation study is performed to show the feasibility. This approach shows that the statistical uncertainties of all branching fractions are proportional to a single parameter, which can be used as a metric to optimize the detector design, reconstruction algorithms, and data analyses. In the Higgs factories, the global analysis approach is valuable both to the Higgs measurements and to detector R&D, because it has the potential for superior precision and makes detector optimization single-purpose.
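One way to picture the multinomial global analysis is as a simultaneous unfolding: every tagged Higgs candidate falls into exactly one reconstructed category, the category counts are multinomially distributed with probabilities that mix the branching fractions through a migration matrix, and inverting that matrix returns all branching fractions at once. The toy below illustrates only this counting structure; the channels, migration matrix, and yields are invented, and the paper's actual analysis is considerably more detailed.

```python
# Toy sketch of the multinomial counting structure: category counts follow
# a multinomial whose probabilities mix the branching fractions through a
# migration matrix; unfolding returns all fractions at once. All inputs are
# invented for illustration and are not the paper's analysis.
import numpy as np

rng = np.random.default_rng(0)

branching = np.array([0.58, 0.22, 0.20])     # toy "true" fractions (bb, WW, other)
# Migration matrix: P(reconstructed category | true decay channel).
migration = np.array([[0.90, 0.05, 0.10],
                      [0.06, 0.88, 0.15],
                      [0.04, 0.07, 0.75]])

n_higgs = 1_000_000                          # toy number of tagged Higgs events
category_probs = migration @ branching

# Repeat the pseudo-experiment to read off the statistical spread.
estimates = []
for _ in range(200):
    counts = rng.multinomial(n_higgs, category_probs)
    estimates.append(np.linalg.solve(migration, counts / n_higgs))
estimates = np.array(estimates)

for name, mean, err in zip(["bb", "WW", "other"],
                           estimates.mean(axis=0), estimates.std(axis=0)):
    print(f"BR({name}) = {mean:.4f} +/- {err:.4f}")
```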