
Direct comparison between Bayesian and frequentist uncertainty quantification for nuclear reactions

Posted by Filomena Nunes
Publication date: 2019
Paper language: English





Until recently, uncertainty quantification in low-energy nuclear theory was typically performed using frequentist approaches. However, in the last few years the field has shifted toward Bayesian statistics for evaluating confidence intervals. Although there are statistical arguments to prefer the Bayesian approach, no direct comparison is available. In this work, we compare, directly and systematically, the frequentist and Bayesian approaches to quantifying uncertainties in direct nuclear reactions. Starting from identical initial assumptions, we determine confidence intervals associated with the elastic and transfer processes for both methods and evaluate them against data by comparing their empirical coverage probabilities. As expected, the frequentist approach is not as flexible as the Bayesian approach in exploring parameter space and often ends up in a different minimum. We also show that the two methods produce significantly different correlations. In the end, the frequentist approach yields significantly narrower uncertainties on the considered observables than the Bayesian one. Our study demonstrates that the uncertainties on the reaction observables obtained within the Bayesian approach represent reality more accurately than the much narrower uncertainties obtained with the standard frequentist approach.
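As a concrete illustration of the comparison criterion, the sketch below (a toy Python example, not the paper's reaction calculations) estimates empirical coverage probabilities: over many synthetic datasets with a known true parameter, it records how often a frequentist 68% confidence interval and a Bayesian 68% equal-tailed credible interval contain the truth. The linear model, noise level, and flat prior are illustrative assumptions.

```python
# Toy sketch (not the paper's optical-model setup): compare the empirical coverage of
# frequentist confidence intervals and Bayesian credible intervals for one parameter
# of a linear model y = a*x + noise, using synthetic data with a known true value.
import numpy as np

rng = np.random.default_rng(0)
a_true, sigma = 2.0, 0.5          # assumed "true" parameter and noise level
x = np.linspace(0.0, 1.0, 20)

def frequentist_interval(y):
    # Least-squares estimate of a with its 1-sigma (68%) error from the covariance.
    a_hat = np.sum(x * y) / np.sum(x * x)
    a_err = sigma / np.sqrt(np.sum(x * x))
    return a_hat - a_err, a_hat + a_err

def bayesian_interval(y, grid=np.linspace(0.0, 4.0, 2001)):
    # Posterior on a dense grid with a flat prior; 68% equal-tailed credible interval.
    chi2 = np.array([np.sum((y - a * x) ** 2) / sigma**2 for a in grid])
    post = np.exp(-0.5 * (chi2 - chi2.min()))
    cdf = np.cumsum(post) / np.sum(post)
    return grid[np.searchsorted(cdf, 0.16)], grid[np.searchsorted(cdf, 0.84)]

def coverage(interval_fn, n_trials=2000):
    # Fraction of synthetic datasets whose interval contains the true parameter.
    hits = 0
    for _ in range(n_trials):
        y = a_true * x + rng.normal(0.0, sigma, size=x.size)
        lo, hi = interval_fn(y)
        hits += lo <= a_true <= hi
    return hits / n_trials

print("frequentist 68% coverage:", coverage(frequentist_interval))
print("Bayesian    68% coverage:", coverage(bayesian_interval))
```

For this linear-Gaussian toy problem both constructions recover roughly 68% coverage; the differences reported in the abstract arise for the nonlinear reaction models, where the frequentist and Bayesian intervals need not agree.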


Read also

Jun Xu, Zhen Zhang, 2021
Within a Bayesian statistical framework using the standard Skyrme-Hartree-Fock model, the maximum a posteriori (MAP) values and uncertainties of nuclear matter incompressibility and isovector interaction parameters are inferred from the experimental data of giant resonances and neutron-skin thicknesses of typical heavy nuclei. With the uncertainties of the isovector interaction parameters constrained by the data of the isovector giant dipole resonance and the neutron-skin thickness, we have obtained $K_0 = 223_{-8}^{+7}$ MeV at the 68% confidence level using the data of the isoscalar giant monopole resonance in $^{208}$Pb measured at the Research Center for Nuclear Physics (RCNP), Japan, and at Texas A&M University (TAMU), USA. Although the corresponding $^{120}$Sn data give a MAP value for $K_0$ about 5 MeV smaller than the $^{208}$Pb data, there are significant overlaps in their posterior probability distribution functions.
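As a minimal sketch of how such an asymmetric 68% interval is typically read off a posterior (illustrative only; the samples below are synthetic stand-ins, not the Skyrme-Hartree-Fock posterior), one can take the MAP value from the histogram mode and an equal-tailed credible interval from the sample percentiles:

```python
# Minimal sketch: extract a MAP value and an asymmetric 68% credible interval
# from posterior samples of one parameter (here a hypothetical chain for K_0 in MeV).
import numpy as np

def map_and_68(samples, bins=200):
    hist, edges = np.histogram(samples, bins=bins, density=True)
    i = np.argmax(hist)
    map_value = 0.5 * (edges[i] + edges[i + 1])          # mode of the histogram
    lo, hi = np.percentile(samples, [16.0, 84.0])        # equal-tailed 68% interval
    return map_value, map_value - lo, hi - map_value

rng = np.random.default_rng(1)
samples = rng.normal(223.0, 7.5, size=50_000)            # stand-in for an MCMC chain
map_k0, err_minus, err_plus = map_and_68(samples)
print(f"K_0 = {map_k0:.0f} -{err_minus:.0f} +{err_plus:.0f} MeV (68% credible)")
```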
Uncertainty quantification of theoretical results is of great importance for making meaningful comparisons of those results with experimental data and for making predictions in experimentally unknown regions. By quantifying uncertainties, one can make more solid statements about, e.g., the origins of discrepancies in some quantities between theory and experiment. We propose a novel method for uncertainty quantification, taking the effective interactions of nuclear shell-model calculations as an example. The effective interaction is specified by a set of parameters, and its probability distribution in the multi-dimensional parameter space is considered. This enables us to quantify the agreement with experimental data in a statistical manner, and the resulting confidence intervals show unexpectedly large variations. Moreover, we point out that a large deviation of the confidence interval for the energy in shell-model calculations from the corresponding experimental data can be used as an indicator of an exotic property, e.g. alpha clustering. Other possible applications and impacts are also discussed.
Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While on-going efforts seek to better root nuclear DFT in the theory of nuclear forces [see Duguet et al., this issue], energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this paper, we review recent efforts to quantify the related uncertainties, and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties and Bayesian inference methods. Illustrative examples are taken from the literature.
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models; to estimate model errors and thereby improve predictive capability; to extrapolate beyond the regions reached by experiment; and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, the two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
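The emulator-based propagation step can be sketched as follows (an illustrative Python example using scikit-learn's Gaussian process regressor; the toy model, design points, and posterior samples are assumptions, not the Skyrme energy density functional calculation):

```python
# Illustrative sketch: train a Gaussian process emulator on a handful of expensive
# "model" evaluations, then push posterior parameter samples through the emulator
# to obtain a predictive distribution for an observable (a made-up scalar stands in
# for, e.g., a nuclear mass).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_model(theta):
    # Placeholder for a costly DFT calculation of one observable.
    return np.sin(3.0 * theta[..., 0]) + 0.5 * theta[..., 1] ** 2

rng = np.random.default_rng(2)
theta_train = rng.uniform(-1.0, 1.0, size=(40, 2))   # design points in parameter space
y_train = expensive_model(theta_train)

gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[0.3, 0.3]),
    normalize_y=True,
).fit(theta_train, y_train)

# Hypothetical posterior samples of the two model parameters (e.g. from MCMC).
theta_post = rng.normal([0.1, -0.2], [0.15, 0.10], size=(2000, 2))
y_mean, y_std = gp.predict(theta_post, return_std=True)
y_pred = rng.normal(y_mean, y_std)   # fold in the emulator's own (marginal) uncertainty

lo, med, hi = np.percentile(y_pred, [16, 50, 84])
print(f"propagated observable: {med:.3f} (68% band {lo:.3f} to {hi:.3f})")
```

The point of the emulator is that only a modest number of full model evaluations is needed up front, which makes propagating thousands of posterior samples affordable.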
This work affords new insights into Bayesian CART in the context of structured wavelet shrinkage. The main thrust is to develop a formal inferential framework for Bayesian tree-based regression. We reframe Bayesian CART as a g-type prior which departs from the typical wavelet product priors by harnessing correlation induced by the tree topology. The practically used Bayesian CART priors are shown to attain adaptive near rate-minimax posterior concentration in the supremum norm in regression models. For the fundamental goal of uncertainty quantification, we construct adaptive confidence bands for the regression function with uniform coverage under self-similarity. In addition, we show that tree-posteriors enable optimal inference in the form of efficient confidence sets for smooth functionals of the regression function.