
Towards Improving the Predictive Capability of Computer Simulations by Integrating Inverse Uncertainty Quantification and Quantitative Validation with Bayesian Hypothesis Testing

Published by: Xu Wu
Publication date: 2021
Research field: Mathematical Statistics
Paper language: English





The Best Estimate plus Uncertainty (BEPU) approach for nuclear systems modeling and simulation requires that the prediction uncertainty be quantified in order to prove that the investigated design stays within acceptance criteria. A rigorous Uncertainty Quantification (UQ) process should simultaneously consider multiple sources of quantifiable uncertainty: (1) parameter uncertainty due to randomness or lack of knowledge; (2) experimental uncertainty due to measurement noise; (3) model uncertainty caused by missing/incomplete physics and numerical approximation errors; and (4) code uncertainty when surrogate models are used. In this paper, we propose a comprehensive framework that integrates results from inverse UQ and quantitative validation to provide robust predictions in which all of these uncertainty sources are taken into consideration. Inverse UQ quantifies the parameter uncertainties based on experimental data while accounting for uncertainties from model, code, and measurement. In the validation step, we use a quantitative validation metric based on Bayesian hypothesis testing. The resulting metric, called the Bayes factor, is then used to form weighting factors that combine the prior and posterior knowledge of the parameter uncertainties in a Bayesian model averaging process. In this way, model predictions integrate the results from inverse UQ and validation to account for all available sources of uncertainty. This framework is a step towards addressing the ANS Nuclear Grand Challenge on Simulation/Experimentation by bridging the gap between models and data.
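The chain described in the abstract, from Bayes factor to weighting factors to Bayesian model averaging of prior and posterior parameter knowledge, can be pictured with a minimal sketch. The toy model, the Gaussian prior and posterior for the parameter, the numeric Bayes factor, and the specific weighting rule w = B/(1+B) below are all illustrative assumptions, not the paper's actual formulation.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: prediction as a function of an uncertain parameter theta.
def model(theta):
    return 2.0 * theta + 0.5

# Prior and (inverse-UQ) posterior knowledge of theta, assumed Gaussian here.
prior_mean, prior_sd = 1.0, 0.5
post_mean, post_sd = 1.2, 0.2

# Bayes factor from the validation step (assumed already computed from
# Bayesian hypothesis testing against experimental data).
bayes_factor = 4.0

# One simple choice of weighting factors derived from the Bayes factor.
w_post = bayes_factor / (1.0 + bayes_factor)
w_prior = 1.0 - w_post

# Bayesian-model-averaged predictive samples: a mixture of predictions
# obtained under prior and posterior parameter uncertainty.
n = 10_000
use_post = rng.random(n) < w_post
theta = np.where(use_post,
                 rng.normal(post_mean, post_sd, n),
                 rng.normal(prior_mean, prior_sd, n))
pred = model(theta)

print(f"weights: prior={w_prior:.2f}, posterior={w_post:.2f}")
print(f"prediction mean={pred.mean():.3f}, std={pred.std():.3f}")

A larger Bayes factor (stronger evidence that the posterior-informed model agrees with the validation data) pushes the mixture toward the posterior; a Bayes factor near zero falls back on the prior knowledge.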




Read also

Linyu Lin, Nam Dinh (2020)
In nuclear engineering, modeling and simulation (M&S) is widely applied to support risk-informed safety analysis. Since nuclear safety analysis has important implications, a convincing validation process is needed to assess simulation adequacy, i.e., the degree to which M&S tools can adequately represent the system quantities of interest. However, due to data gaps, validation becomes a decision-making process under uncertainty. Expert knowledge and judgments are required to collect, choose, characterize, and integrate evidence toward the final adequacy decision. Yet in validation frameworks such as CSAU (Code Scaling, Applicability, and Uncertainty; NUREG/CR-5249) and EMDAP (Evaluation Model Development and Assessment Process; RG 1.203), this decision-making process is largely implicit and obscure. When scenarios are complex, knowledge biases and unreliable judgments can be overlooked, which could increase uncertainty in the simulation adequacy result and the corresponding risks. Therefore, a framework is required to formalize the decision-making process for simulation adequacy in a practical, transparent, and consistent manner. This paper proposes Predictive Capability Maturity Quantification using Bayesian networks (PCMQBN), a quantified framework for assessing simulation adequacy based on information collected from validation activities. A case study evaluates the adequacy of a Smoothed Particle Hydrodynamics simulation in predicting the hydrodynamic forces on static structures during an external flooding scenario. Compared with the qualitative and implicit adequacy assessment, PCMQBN is able to improve confidence in the simulation adequacy result and to reduce expected loss in the risk-informed safety analysis.
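As a rough illustration of how a Bayesian network can turn validation evidence into a quantified adequacy statement, the sketch below performs a plain Bayes update over a single adequacy node with two conditionally independent evidence nodes. The node names, probabilities, and naive-Bayes structure are invented for illustration and are not the actual PCMQBN network.

# Hypothetical adequacy node with a prior before seeing validation evidence.
p_adequate = 0.5

# Evidence nodes: outcomes of two validation activities, each assumed
# conditionally independent given adequacy (naive-Bayes structure).
#                        P(pass | adequate), P(pass | inadequate)
likelihoods = {
    "benchmark_agreement": (0.90, 0.30),
    "expert_review":       (0.80, 0.40),
}

observed = {"benchmark_agreement": True, "expert_review": True}

# Bayes update over the adequacy node given the observed evidence.
num = p_adequate
den = 1.0 - p_adequate
for name, seen in observed.items():
    p_pass_a, p_pass_i = likelihoods[name]
    num *= p_pass_a if seen else (1.0 - p_pass_a)
    den *= p_pass_i if seen else (1.0 - p_pass_i)

posterior = num / (num + den)
print(f"P(adequate | evidence) = {posterior:.3f}")

The point of such a network is that each piece of validation evidence enters through an explicit conditional probability rather than an implicit expert judgment, which is the transparency the abstract argues CSAU and EMDAP lack.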
We present the VECMA toolkit (VECMAtk), a flexible software environment for single- and multiscale simulations that introduces directly applicable and reusable procedures for verification and validation (V&V), sensitivity analysis (SA), and uncertainty quantification (UQ). It enables users to verify key aspects of their applications, systematically compare and validate the simulation outputs against observational or benchmark data, and run simulations conveniently on any platform from the desktop to current multi-petascale computers. In this sequel to our paper on VECMAtk presented last year, we focus on a range of functional and performance improvements that we have introduced, cover newly introduced components, and present application examples from seven different domains, such as conflict modelling and environmental sciences. We also present several implemented patterns for UQ/SA and V&V, and guide the reader through one example concerning COVID-19 modelling in detail.
Chris J. Oates (2021)
The lectures were prepared for the École Thématique sur les Incertitudes en Calcul Scientifique (ETICS) in September 2021.
The problem of estimating certain distributions over ${0,1}^d$ is considered here. The distribution represents a quantum system of $d$ qubits, where there are non-trivial dependencies between the qubits. A maximum entropy approach is adopted to reconstruct the distribution from exact moments or observed empirical moments. The Robbins-Monro algorithm is used to solve the intractable maximum entropy problem by constructing an unbiased estimator of the un-normalized target with a sequential Monte Carlo sampler at each iteration. In the case of empirical moments, this coincides with a maximum likelihood estimator. A Bayesian formulation is also considered in order to quantify posterior uncertainty. Several approaches are proposed in order to tackle this challenging problem, based on recently developed methodologies. In particular, unbiased estimators of the gradient of the log posterior are constructed and used within a provably convergent Langevin-based Markov chain Monte Carlo method. The methods are illustrated on classically simulated output from quantum simulators.
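A stripped-down sketch of the Robbins-Monro idea for this maximum entropy problem is given below: the natural parameters of an exponential-family model on {0,1}^d are updated along a noisy estimate of the moment mismatch. The small dimension, first-moment-only sufficient statistics, made-up target moments, and plain Monte Carlo moment estimator (standing in for the paper's sequential Monte Carlo construction) are simplifying assumptions.

import numpy as np
from itertools import product

rng = np.random.default_rng(1)

d = 3
states = np.array(list(product([0, 1], repeat=d)), dtype=float)  # all 2^d states

# Sufficient statistics: first moments E[x_j] only (a simplified choice; the
# paper also encodes dependencies between qubits via higher-order moments).
def suff_stats(x):
    return x

target_moments = np.array([0.7, 0.4, 0.55])  # made-up target moments

theta = np.zeros(d)
for k in range(1, 2001):
    # Maximum-entropy model p_theta(x) proportional to exp(theta . x).
    logw = states @ theta
    probs = np.exp(logw - logw.max())
    probs /= probs.sum()

    # Noisy moment estimate from a small Monte Carlo batch (stand-in for the
    # SMC-based unbiased estimator used in the paper).
    idx = rng.choice(len(states), size=64, p=probs)
    est_moments = suff_stats(states[idx]).mean(axis=0)

    # Robbins-Monro step: move theta along the moment mismatch with decaying gain.
    gain = 1.0 / k**0.7
    theta += gain * (target_moments - est_moments)

final = np.exp(states @ theta) / np.exp(states @ theta).sum()
print("fitted moments:", (states * final[:, None]).sum(axis=0).round(3))
print("target moments:", target_moments)

With exact target moments this stochastic approximation drives the model moments toward the constraints; with empirical moments the same update coincides with noisy maximum likelihood ascent, as the abstract notes.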
This work affords new insights into Bayesian CART in the context of structured wavelet shrinkage. The main thrust is to develop a formal inferential framework for Bayesian tree-based regression. We reframe Bayesian CART as a g-type prior which departs from the typical wavelet product priors by harnessing correlation induced by the tree topology. The practically used Bayesian CART priors are shown to attain adaptive near rate-minimax posterior concentration in the supremum norm in regression models. For the fundamental goal of uncertainty quantification, we construct adaptive confidence bands for the regression function with uniform coverage under self-similarity. In addition, we show that tree posteriors enable optimal inference in the form of efficient confidence sets for smooth functionals of the regression function.