
An Error Analysis Toolkit for Binned Counting Experiments

Added by Rob Fine
Publication date: 2021
Field: Physics
Language: English





We introduce the MINERvA Analysis Toolkit (MAT), a utility for centralizing the handling of systematic uncertainties in HEP analyses. The fundamental utilities of the toolkit are the MnvHnD, a powerful histogram container class, and the systematic Universe classes, which provide a modular implementation of the many-universe error analysis approach. These products can be used stand-alone or as part of a complete error analysis prescription. They support the propagation of systematic uncertainty through all stages of analysis and provide flexibility for an arbitrary level of user customization. This extensible solution to error analysis enables the standardization of systematic uncertainty definitions across an experiment and provides a transparent user interface that lowers the barrier to entry for new analyzers.
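For illustration, a minimal sketch of the many-universe approach in Python/NumPy follows; it is not the MAT interface itself (MnvHnD and the Universe classes are C++/ROOT), and the binning, number of universes, and per-universe weights are invented for the example. Each event is filled into a central-value histogram and into one histogram per systematic universe with a shifted weight; the spread of the universes about the central value gives the systematic covariance.

    import numpy as np

    rng = np.random.default_rng(42)
    bins = np.linspace(0.0, 10.0, 11)             # histogram binning (assumed)
    n_universes = 100                             # e.g. flux universes (assumed)

    values = rng.exponential(3.0, size=10_000)    # stand-in reconstructed variable
    cv_weights = np.ones_like(values)             # central-value event weights
    # hypothetical per-universe reweights, drawn event by event
    universe_weights = 1.0 + 0.05 * rng.standard_normal((n_universes, values.size))

    cv_hist, _ = np.histogram(values, bins=bins, weights=cv_weights)
    universe_hists = np.stack([
        np.histogram(values, bins=bins, weights=cv_weights * w)[0]
        for w in universe_weights
    ])

    # systematic covariance and per-bin error from the universe spread
    diffs = universe_hists - cv_hist
    covariance = diffs.T @ diffs / n_universes
    systematic_error = np.sqrt(np.diag(covariance))

As the abstract describes, a container in the spirit of MnvHnD bundles the central value with its universe copies so that later analysis stages can act on all of them consistently.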



Related research

Machine learning has proven to be an indispensable tool in the selection of interesting events in high energy physics. Such technologies will become increasingly important as detector upgrades are introduced and data rates increase by orders of magnitude. We propose a toolkit to enable the creation of a drone classifier from any machine learning classifier, such that different classifiers may be standardised into a single form and executed in parallel. We demonstrate the capability of the drone neural network to learn the required properties of the input neural network without the use of any labels from the training data, only using appropriate questioning of the input neural network.
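As a hedged sketch of the drone idea in Python/PyTorch: the drone network is fit to reproduce the responses of the original classifier on unlabeled data, so no truth labels enter the training. Here teacher_predict, the network shape, optimizer, and loss are illustrative assumptions, not the toolkit's actual implementation.

    import torch
    import torch.nn as nn

    class Drone(nn.Module):
        """Small stand-in drone network (architecture is an assumption)."""
        def __init__(self, n_features: int):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_features, 32), nn.ReLU(),
                nn.Linear(32, 1), nn.Sigmoid(),
            )

        def forward(self, x):
            return self.net(x)

    def train_drone(teacher_predict, unlabeled_x: torch.Tensor, epochs: int = 50) -> Drone:
        """Fit the drone to the teacher's outputs; no truth labels are used."""
        drone = Drone(unlabeled_x.shape[1])
        optimizer = torch.optim.Adam(drone.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()
        with torch.no_grad():
            # "question" the input classifier; assumed to return a (N,) tensor of scores
            targets = teacher_predict(unlabeled_x)
        for _ in range(epochs):
            optimizer.zero_grad()
            loss = loss_fn(drone(unlabeled_x).squeeze(-1), targets)
            loss.backward()
            optimizer.step()
        return drone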
Xin Xia, Teng Li, Xing-Tao Huang (2017)
A fast physics analysis framework has been developed based on SNiPER to process the increasingly large data sample collected by BESIII. In this framework, a reconstructed event data model with SmartRef is designed to improve the speed of Input/Output operations, and necessary physics analysis tools are migrated from BOSS to SNiPER. A real physics analysis, $e^{+}e^{-} \rightarrow \pi^{+}\pi^{-}J/\psi$, is used to test the new framework, and achieves a factor of 10.3 improvement in Input/Output speed compared to BOSS. Further tests show that the improvement is mainly attributed to the new reconstructed event data model and the lazy-loading functionality provided by SmartRef.
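The lazy-loading behaviour attributed to SmartRef can be sketched generically; the Python class below is an illustrative stand-in (the name LazyRef and its interface are hypothetical, not the SNiPER/BESIII implementation). The referenced object is read from storage only on first dereference, which keeps input costs low for events that are never fully inspected.

    class LazyRef:
        """Defer an expensive read until the reference is first used."""
        def __init__(self, loader):
            self._loader = loader            # callable performing the actual I/O
            self._obj = None
            self._loaded = False

        def get(self):
            if not self._loaded:
                self._obj = self._loader()   # deferred read happens once, here
                self._loaded = True
            return self._obj

    # usage sketch (hypothetical helper): LazyRef(lambda: read_track(event_id)).get()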
P. Rahkila (2007)
Grain is a data analysis framework developed to be used with the novel Total Data Readout data acquisition system. In Total Data Readout, all the electronics channels are read out asynchronously in singles mode and each data item is timestamped. Event building and analysis have to be done entirely in software, post-processing the data stream. A flexible and efficient event parser and the accompanying software framework have been written entirely in Java. The design and implementation of the software are discussed along with experiences gained in running real-life experiments.
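A rough illustration of event building from such an asynchronous, timestamped singles stream is given below in Python (the actual Grain framework is written in Java and its event parser is more elaborate; the coincidence-window grouping and all names here are assumptions):

    def build_events(items, window_ns=1_000):
        """Group (timestamp_ns, payload) items into events: a new event starts
        whenever the gap to the previous item exceeds the coincidence window."""
        events, current, last_t = [], [], None
        for t, payload in sorted(items, key=lambda x: x[0]):
            if last_t is not None and t - last_t > window_ns:
                events.append(current)
                current = []
            current.append((t, payload))
            last_t = t
        if current:
            events.append(current)
        return events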
We present the development and validation of a new multivariate $b$ jet identification algorithm ($b$ tagger) used at the CDF experiment at the Fermilab Tevatron. At collider experiments, $b$ taggers allow one to distinguish particle jets containing $B$ hadrons from other jets. Employing feed-forward neural network architectures, this tagger is unique in its emphasis on using information from individual tracks. This tagger not only retains the usual advantages of a multivariate technique, such as maximal use of the information in a jet and tunable purity/efficiency operating points, but is also capable of evaluating jets with only a single track. To demonstrate the effectiveness of the tagger, we employ a novel method wherein we calculate the false tag rate and tag efficiency as a function of the placement of a lower threshold on a jet's neural network output value in $Z+1$ jet and $t\bar{t}$ candidate samples, rich in light flavor and $b$ jets, respectively.
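The threshold scan mentioned at the end can be sketched as follows; this Python snippet is illustrative only, and defining tag efficiency and false tag rate as simple pass fractions in a $b$-enriched and a light-flavor-enriched sample is a simplification of the CDF procedure.

    import numpy as np

    def scan_thresholds(nn_output_b, nn_output_light, thresholds):
        """Tag efficiency and false tag rate versus the lower cut on the
        jet's neural-network output (pass-fraction approximation)."""
        nn_output_b = np.asarray(nn_output_b)
        nn_output_light = np.asarray(nn_output_light)
        eff = np.array([(nn_output_b >= t).mean() for t in thresholds])
        fake = np.array([(nn_output_light >= t).mean() for t in thresholds])
        return eff, fake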
Gang Li, Libo Liao, Xinchou Lou (2021)
Several high energy $e^{+}e^{-}$ colliders are proposed as Higgs factories by the international high energy physics community. One of the most important goals of these projects is to study the Higgs properties, such as its couplings, mass, width, and production rate, with unprecedented precision. Precision studies of the Higgs boson should be the priority and drive the design and optimization of detectors. A global analysis approach based on the multinomial distribution and machine learning techniques is proposed to realize an "end-to-end" analysis and to significantly enhance the precision of all accessible decay branching fractions of the Higgs. A proof-of-principle Monte Carlo simulation study is performed to show the feasibility. This approach shows that the statistical uncertainties of all branching fractions are proportional to a single parameter, which can be used as a metric to optimize the detector design, reconstruction algorithms, and data analyses. At the Higgs factories, the global analysis approach is valuable both for the Higgs measurements and for detector R&D, because it has the potential for superior precision and reduces detector optimization to a single objective.
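For orientation only, and as an assumption rather than the paper's own definition: if the $N$ selected Higgs events are classified into decay channels as a multinomial sample, the estimators $\hat{B}_i = n_i/N$ have statistical uncertainties $\sigma(\hat{B}_i) = \sqrt{B_i(1-B_i)/N}$, so every branching-fraction uncertainty carries the common factor $1/\sqrt{N}$; this is one way such a single scaling parameter can arise.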
