
Averaging Results with Theoretical Uncertainties

Posted by: Frank Porter
Publication date: 2011
Research field: Physics
Paper language: English
Author: F. C. Porter





Combining measurements that have theoretical uncertainties is a delicate matter, because such uncertainties lack a clear statistical basis. We present an algorithm based on the notion that a theoretical uncertainty represents an estimate of bias.
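As a concrete illustration of the bias interpretation, here is a minimal sketch of one conventional treatment (not necessarily the algorithm of the paper): statistical uncertainties determine the weights of the average, while theoretical uncertainties, being bias estimates rather than random errors, are propagated linearly through the same weights and quoted separately. The measurements and uncertainties below are invented for illustration.

    import numpy as np

    def average_with_theory(x, stat, theo):
        """Weighted average of measurements x with statistical errors stat.

        Theoretical uncertainties theo are treated as estimates of bias:
        they do not enter the weights, but are propagated linearly through
        the same weights (a conservative, fully correlated assumption).
        """
        x, stat, theo = map(np.asarray, (x, stat, theo))
        w = 1.0 / stat**2
        w /= w.sum()
        mean = np.dot(w, x)
        stat_err = np.sqrt(np.dot(w**2, stat**2))  # = 1/sqrt(sum 1/sigma^2)
        theo_err = np.dot(w, theo)                 # linear (correlated) propagation
        return mean, stat_err, theo_err

    # Two hypothetical measurements of the same quantity:
    print(average_with_theory([10.1, 9.7], stat=[0.3, 0.4], theo=[0.2, 0.5]))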




Read also

Recent statistical evaluations for High-Energy Physics measurements, in particular those at the Large Hadron Collider, require careful evaluation of many sources of systematic uncertainty at the same time. While the fundamental aspects of the statistical treatment are now consolidated, in both the frequentist and the Bayesian approach, managing many sources of uncertainty and their corresponding nuisance parameters in analyses that combine multiple control regions and decay channels may, in practice, pose challenging implementation issues that make the analysis infrastructure complex and hard to maintain, eventually resulting in simplifications in the treatment of systematics and in limitations on the interpretation of results. Typical cases will be discussed with the most popular implementation tool, RooStats, in mind, together with possible ideas for improving the management of such cases in future software implementations.
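The statistical structure at issue can be illustrated independently of any particular tool. Below is a minimal sketch, written in plain Python rather than against the RooStats API, of a likelihood that combines a signal region and a control region sharing a single background-normalization nuisance parameter with a Gaussian constraint; all yields and the constraint width are invented for illustration.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm, poisson

    n_sr, n_cr = 25, 100      # observed counts (illustrative)
    b_sr, b_cr = 15.0, 100.0  # nominal background expectations
    s_nom = 8.0               # nominal signal yield in the signal region
    tau = 0.10                # 10% Gaussian constraint on the bkg scale

    def nll(params):
        """Negative log-likelihood in (mu, theta): signal strength and
        a shared background-normalization nuisance parameter."""
        mu, theta = params
        lam_sr = max(s_nom * mu + b_sr * theta, 1e-9)
        lam_cr = max(b_cr * theta, 1e-9)
        return -(poisson.logpmf(n_sr, lam_sr)
                 + poisson.logpmf(n_cr, lam_cr)
                 + norm.logpdf(theta, loc=1.0, scale=tau))

    fit = minimize(nll, x0=[1.0, 1.0], method="Nelder-Mead")
    print(fit.x)  # fitted (mu, theta)

In a realistic combination, every additional channel and systematic adds its own nuisance parameter and constraint term, which is exactly the bookkeeping burden the abstract describes.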
Evaluated nuclear data uncertainties are often perceived as unrealistic, most often because they are thought to be too small. The impact of this issue in applied nuclear science has been discussed widely in recent years. Commonly suggested causes are: poor estimates of specific error components, neglect of uncertainty correlations, and overlooked known error sources. However, instances have been reported where very careful, objective assessments of all known error sources have been made, with realistic error magnitudes and correlations provided, yet the resulting evaluated uncertainties still appear to be inconsistent with the observed scatter of predicted mean values. These discrepancies might be attributed to significant unrecognized sources of uncertainty (USU) that limit the accuracy to which these physical quantities can be determined. The objective of our work has been to develop procedures for revealing and including USU estimates in nuclear data evaluations involving experimental input data. We conclude that the presence of USU may be revealed, and estimates of its magnitude made, through quantitative analyses. This paper identifies several specific clues that evaluators can explore to identify the existence of USU. It then describes numerical procedures to generate quantitative estimates of USU magnitudes. Key requirements for these procedures to be viable are that sufficient numbers of data points be available, for statistical reasons, and that additional supporting information about the measurements be provided by the experimenters. Realistic examples are described to illustrate these procedures and demonstrate their outcomes as well as their limitations. Our work strongly supports the view that USU is an important issue in nuclear data evaluation, with significant consequences for applications, and that this topic warrants further investigation by the nuclear science community.
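One simple quantitative procedure of this general kind is sketched below (an illustration under stated assumptions, not necessarily the procedure developed in the paper): introduce a common extra variance term and increase it until the weighted scatter of the data about their mean matches statistical expectation, i.e. until the chi-squared per degree of freedom equals one.

    import numpy as np
    from scipy.optimize import brentq

    def estimate_usu(x, sigma):
        """Estimate an extra, common uncertainty u such that the weighted
        scatter of x about its mean is statistically consistent with the
        enlarged errors sqrt(sigma^2 + u^2). Returns 0 if there is no
        excess scatter."""
        x, sigma = map(np.asarray, (x, sigma))
        n = len(x)

        def chi2_excess(u2):
            w = 1.0 / (sigma**2 + u2)
            mean = np.dot(w, x) / w.sum()
            return np.dot(w, (x - mean)**2) - (n - 1)

        if chi2_excess(0.0) <= 0:
            return 0.0  # scatter already consistent with reported errors
        hi = 10.0 * ((x.max() - x.min())**2 + sigma.max()**2) + 1.0
        return np.sqrt(brentq(chi2_excess, 0.0, hi))

    # Hypothetical data whose scatter exceeds the reported errors:
    print(estimate_usu([1.0, 1.4, 0.7, 1.6], [0.1, 0.1, 0.1, 0.1]))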
G. Vianello (2017)
Several experiments in high-energy physics and astrophysics can be treated as on/off measurements, where an observation potentially containing a new source or effect (on measurement) is contrasted with a background-only observation free of the effect (off measurement). In counting experiments, the significance of the new source or effect can be estimated with a widely used formula from [LiMa], which assumes that both measurements are Poisson random variables. In this paper we study three other cases: i) the ideal case where the background measurement has no uncertainty, which can be used to study the maximum sensitivity that an instrument can achieve, ii) the case where the background estimate $b$ in the off measurement has an additional systematic uncertainty, and iii) the case where $b$ is a Gaussian random variable instead of a Poisson random variable. The latter case applies when $b$ comes from a model fitted on archival or ancillary data, or from the interpolation of a function fitted on data surrounding the candidate new source/effect. Practitioners typically use in this case a formula which is only valid when $b$ is large and its uncertainty very small, while we derive a general formula that can be applied in all regimes. We also develop simple methods that can be used to assess how much an estimate of significance is sensitive to systematic uncertainties on the efficiency or on the background. Examples of applications include the detection of short Gamma-Ray Bursts and of new X-ray or $\gamma$-ray sources.
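For reference, the widely used on/off significance from [LiMa] (their Eq. 17) is short to implement. The sketch below covers only this baseline Poisson case, not the Gaussian-background or systematic-uncertainty generalizations derived in the paper.

    import numpy as np

    def li_ma_significance(n_on, n_off, alpha):
        """Li & Ma (1983, Eq. 17) significance for an on/off counting
        measurement, where both n_on and n_off are Poisson distributed
        and alpha is the ratio of on-source to off-source exposure.
        Intended for an excess, i.e. n_on > alpha * n_off."""
        n_tot = n_on + n_off
        term_on = n_on * np.log((1.0 + alpha) / alpha * (n_on / n_tot))
        term_off = n_off * np.log((1.0 + alpha) * (n_off / n_tot))
        return np.sqrt(2.0 * (term_on + term_off))

    # Example: 30 counts on-source, 100 off-source with 10x the exposure.
    print(li_ma_significance(n_on=30, n_off=100, alpha=0.1))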
J. Calcutt, C. Thorpe, K. Mahn (2021)
Geant4Reweight is an open-source C++ framework that allows users to 1) weight tracks produced by the GEANT4 particle transport Monte Carlo simulation according to hadron interaction cross section variations and 2) estimate uncertainties in GEANT4 interaction models by comparing the simulation's hadron interaction cross section predictions to data. The ability to weight hadron transport as simulated by GEANT4 is crucial to the propagation of systematic uncertainties related to secondary hadronic interactions in current and upcoming neutrino oscillation experiments, including MicroBooNE, NOvA, and DUNE, as well as hadron test beam experiments such as ProtoDUNE. We provide motivation for weighting hadron tracks in GEANT4 in the context of systematic uncertainty propagation, a description of GEANT4's transport simulation technique, and a description of our weighting technique and fitting framework in the momentum range 0--10 GeV/c, which is typical for the hadrons produced by neutrino interactions in these experiments.
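The weighting idea can be sketched schematically; this illustrates the general technique, not the Geant4Reweight API, and all names below are invented. Each step of a simulated track contributes the ratio of survival probabilities under the varied and nominal cross sections, and a track that ends in a hadronic interaction also picks up the ratio of interaction cross sections at its final step.

    import numpy as np

    def hadron_track_weight(step_lengths, sigma_nom, sigma_var,
                            n_density, interacted):
        """Schematic per-track weight for a hadron interaction cross
        section variation.

        step_lengths : step lengths along the track [cm]
        sigma_nom    : nominal cross section at each step [cm^2]
        sigma_var    : varied cross section at each step [cm^2]
        n_density    : target number density [cm^-3]
        interacted   : True if the track ended in a hadronic interaction
        """
        l = np.asarray(step_lengths)
        s0 = np.asarray(sigma_nom)
        s1 = np.asarray(sigma_var)
        # Ratio of survival probabilities exp(-n * l * sigma) per step:
        w = np.exp(-n_density * np.sum(l * (s1 - s0)))
        if interacted:
            w *= s1[-1] / s0[-1]  # ratio of interaction probabilities
        return w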
Milan Krbalek (2012)
This article presents a derivation of analytical predictions for steady-state distributions of net time gaps among clusters of vehicles moving inside a traffic stream. Using the thermodynamic socio-physical traffic model with short-ranged repulsion between particles (originally introduced in [Physica A 333 (2004) 370]), we first derive the time-clearance distribution in the model. Subsequently, the statistical distributions for the so-called time multi-clearances are calculated by means of the theory of functional convolutions. Moreover, all the theoretical assumptions used in these calculations are verified by statistical analysis of traffic data. The mathematical predictions obtained in this paper are thoroughly compared with the relevant empirical quantities and discussed in the context of three-phase traffic theory.
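Numerically, a time multi-clearance density is just a k-fold convolution of the single-clearance density. Below is a minimal sketch assuming a single-clearance density of the short-ranged-repulsion form p(t) proportional to exp(-beta/t - D t); the parameter values are illustrative and the exact form and normalization in the paper may differ.

    import numpy as np

    beta, D = 1.0, 2.0
    t = np.linspace(1e-3, 20.0, 4000)
    dt = t[1] - t[0]
    p = np.exp(-beta / t - D * t)
    p /= np.trapz(p, t)  # normalize the single-clearance density

    def multi_clearance(p, k, dt):
        """k-fold convolution of p with itself on a uniform grid."""
        q = p.copy()
        for _ in range(k - 1):
            q = np.convolve(q, p)[: len(p)] * dt  # truncate to the grid
        return q

    p3 = multi_clearance(p, k=3, dt=dt)
    print(np.trapz(p3, t))  # ~1, up to grid-truncation error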