
Prior-predictive value from fast-growth simulations: Error analysis and bias estimation

Posted by Alberto Favaro
Publication date: 2014
Research field: Physics
Paper language: English





Variants of fluctuation theorems recently discovered in the statistical mechanics of non-equilibrium processes may be used for the efficient determination of high-dimensional integrals as typically occurring in Bayesian data analysis. In particular for multimodal distributions, Monte-Carlo procedures not relying on perfect equilibration are advantageous. We provide a comprehensive statistical error analysis for the determination of the prior-predictive value in a Bayes problem building on a variant of the Jarzynski equation. Special care is devoted to the characterization of the bias intrinsic to the method. We also discuss the determination of averages over multimodal posterior distributions with the help of a variant of the Crooks theorem. All our findings are verified by extensive numerical simulations of two model systems with bimodal likelihoods.
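
To make the fast-growth idea concrete, the sketch below estimates the prior-predictive value of a toy one-dimensional Gaussian model with a Jarzynski-type (annealed) estimator: the inverse "temperature" beta is switched from 0 (prior) to 1 (posterior) while dimensionless work is accumulated, and exp(-W) is averaged over independent trajectories. The model, schedule, and Metropolis kernel are illustrative assumptions, not the paper's setup; note that averaging exp(-W) over finitely many trajectories gives exactly the kind of biased estimate whose characterization is the subject of the paper (the Crooks-based posterior averages are not shown).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: prior N(0, 1); likelihood L(x) = N(d=1 | x, sigma=0.5).
def log_likelihood(x):
    return -0.5 * ((x - 1.0) / 0.5) ** 2 - np.log(0.5 * np.sqrt(2.0 * np.pi))

def exact_evidence():
    # Closed form for this toy model: N(1 | 0, sqrt(1 + 0.5^2)).
    s2 = 1.0 + 0.5 ** 2
    return np.exp(-0.5 / s2) / np.sqrt(2.0 * np.pi * s2)

def fast_growth_evidence(n_traj=5000, n_steps=100, step=0.5):
    """Jarzynski-type fast-growth estimate: switch beta from 0 (prior)
    to 1 (posterior), average exp(-W) over independent trajectories."""
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = rng.normal(size=n_traj)        # initial draws from the prior
    work = np.zeros(n_traj)            # accumulated dimensionless work
    for b_old, b_new in zip(betas[:-1], betas[1:]):
        # Work increment for the switch b_old -> b_new at fixed x.
        work -= (b_new - b_old) * log_likelihood(x)
        # One Metropolis move targeting prior(x) * L(x)^b_new.
        prop = x + step * rng.normal(size=n_traj)
        log_acc = (-0.5 * prop ** 2 + b_new * log_likelihood(prop)) \
                  - (-0.5 * x ** 2 + b_new * log_likelihood(x))
        accept = np.log(rng.uniform(size=n_traj)) < log_acc
        x = np.where(accept, prop, x)
    # Jarzynski identity: <exp(-W)> = Z(beta=1) / Z(beta=0).
    return np.exp(-work).mean()

print("fast-growth estimate:", fast_growth_evidence())
print("exact value:         ", exact_evidence())
```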




Read also

We examine the problem of construction of confidence intervals within the basic single-parameter, single-iteration variation of the method of quasi-optimal weights. Two kinds of distortions of such intervals due to insufficiently large samples are examined, both allowing an analytical investigation. First, a criterion is developed for validity of the assumption of asymptotic normality together with a recipe for the corresponding corrections. Second, a method is derived to take into account the systematic shift of the confidence interval due to the non-linearity of the theoretical mean of the weight as a function of the parameter to be estimated. A numerical example illustrates the two corrections.
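
As a hypothetical single-parameter illustration of weight-based estimation (not the paper's setup): take X ~ Exp(rate theta) with weight phi(x) = exp(-x), whose theoretical mean m(theta) = theta/(theta + 1) is non-linear in the parameter, and check by Monte Carlo how a naive normal-approximation interval covers the truth at finite n. The model, weight, and sample size are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting: X ~ Exp(rate=theta), weight phi(x) = exp(-x),
# theoretical mean m(theta) = E[phi] = theta / (theta + 1).
theta_true, n = 2.0, 50

def estimate(sample):
    phi_bar = np.exp(-sample).mean()
    # Invert m(theta) = theta / (theta + 1)  =>  theta = m / (1 - m).
    return phi_bar / (1.0 - phi_bar)

def coverage(n_trials=20000, z=1.96):
    hits = 0
    for _ in range(n_trials):
        x = rng.exponential(1.0 / theta_true, size=n)
        phi = np.exp(-x)
        m, s = phi.mean(), phi.std(ddof=1) / np.sqrt(n)
        # Naive interval: propagate the normal interval for m through the
        # non-linear inverse; the finite-n corrections above would shift it.
        lo, hi = m - z * s, m + z * s
        hits += (lo / (1 - lo)) <= theta_true <= (hi / (1 - hi))
    return hits / n_trials

print("point estimate:", estimate(rng.exponential(1 / theta_true, size=n)))
print("empirical coverage of naive 95% interval:", coverage())
```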
We propose a novel method for computing $p$-values based on nested sampling (NS) applied to the sampling space rather than the parameter space of the problem, in contrast to its usage in Bayesian computation. The computational cost of NS scales as $\log^2{1/p}$, which compares favorably to the $1/p$ scaling for Monte Carlo (MC) simulations. For significances greater than about $4\sigma$ in both a toy problem and a simplified resonance search, we show that NS requires orders of magnitude fewer simulations than ordinary MC estimates. This is particularly relevant for high-energy physics, which adopts a $5\sigma$ gold standard for discovery. We conclude with remarks on new connections between Bayesian and frequentist computation and possibilities for tuning NS implementations for still better performance in this setting.
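
The quoted scalings can be made concrete with a back-of-the-envelope comparison; the proportionality constants are method-dependent, so the numbers below only illustrate the size of the gap, not either method's actual cost.

```python
import numpy as np
from scipy.stats import norm

# MC needs roughly 1/(p * eps^2) simulations for relative error eps on p,
# while nested sampling scales as log^2(1/p) (up to method constants).
for n_sigma in (3, 4, 5):
    p = norm.sf(n_sigma)              # one-sided tail probability
    mc_cost = 1.0 / (p * 0.1 ** 2)    # 10% relative error target
    ns_cost = np.log(1.0 / p) ** 2
    print(f"{n_sigma} sigma: p = {p:.2e},  MC ~ {mc_cost:.1e},  "
          f"NS ~ {ns_cost:.1f} (times a method constant)")
```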
Using the latest numerical simulations of rotating stellar core collapse, we present a Bayesian framework to extract the physical information encoded in noisy gravitational wave signals. We fit Bayesian principal component regression models with known and unknown signal arrival times to reconstruct gravitational wave signals, and subsequently fit known astrophysical parameters on the posterior means of the principal component coefficients using a linear model. We predict the ratio of rotational kinetic energy to gravitational energy of the inner core at bounce by sampling from the posterior predictive distribution, and find that these predictions are generally very close to the true parameter values, with $90\%$ credible intervals $\sim 0.04$ and $\sim 0.06$ wide for the known and unknown arrival time models respectively. Two supervised machine learning methods are implemented to classify precollapse differential rotation, and we find that these methods discriminate rapidly rotating progenitors particularly well. We also introduce a constrained optimization approach to model selection to find an optimal number of principal components in the signal reconstruction step. Using this approach, we select 14 principal components as the most parsimonious model.
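
A schematic stand-in for the reconstruction-plus-regression pipeline (a principal component basis for the waveforms, then a map from PC coefficients to a physical parameter) might look as follows. The damped-sinusoid waveform family, noise level, and point-estimate linear regression are assumptions for illustration; the paper instead uses Bayesian PC regression and samples from posterior predictive distributions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)

# Stand-in catalogue: damped sinusoids whose frequency plays the role of
# the astrophysical parameter (the real catalogue is numerical waveforms).
t = np.linspace(0.0, 1.0, 256)
params = rng.uniform(2.0, 8.0, size=200)
catalog = np.array([np.exp(-3 * t) * np.sin(2 * np.pi * f * t) for f in params])

# Step 1: principal component basis for the waveforms (14 PCs, as above).
pca = PCA(n_components=14)
coeffs = pca.fit_transform(catalog)

# Step 2: linear model from PC coefficients to the physical parameter.
reg = LinearRegression().fit(coeffs, params)

# Reconstruct and predict for a noisy "observed" signal.
f_true = 5.0
observed = (np.exp(-3 * t) * np.sin(2 * np.pi * f_true * t)
            + 0.1 * rng.normal(size=t.size))
c = pca.transform(observed[None, :])
print("true parameter:", f_true, " predicted:", reg.predict(c)[0])
```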
LISA is the upcoming space-based gravitational wave telescope. LISA Pathfinder, to be launched in the coming years, will prove and verify the detection principle of the fundamental Doppler link of LISA on flight hardware identical in design to that of LISA. LISA Pathfinder will collect a picture of all noise disturbances possibly affecting LISA, achieving the unprecedented purity of geodesic motion necessary for the detection of gravitational waves. The first steps of both missions will crucially depend on a very precise calibration of the key system parameters. Moreover, robust parameter estimation is of fundamental importance in the correct assessment of the residual force noise, an essential part of the data processing for LISA. In this paper we present a maximum likelihood parameter estimation technique in the time domain devised for this calibration and show its proficiency on simulated data, validated through Monte Carlo realizations of independent noise runs. We discuss its robustness to non-standard scenarios possibly arising during the real-life mission, as well as its independence of the initial guess and its robustness to non-Gaussianities. Furthermore, we apply the same technique to data produced in mission-like fashion during operational exercises with a realistic simulator provided by ESA.
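
As a toy analogue of time-domain maximum likelihood calibration validated over independent noise runs, one could fit a hypothetical linear stand-in model (the real analysis fits dynamical parameters of the spacecraft control loops, not this model):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Stand-in system: observed = k * drive + b + white noise.
t = np.linspace(0.0, 10.0, 500)
drive = np.sin(t)
k_true, b_true, noise = 1.3, 0.2, 0.05

def neg_log_like(theta, y):
    k, b = theta
    r = y - (k * drive + b)
    return 0.5 * np.sum(r ** 2) / noise ** 2

# Monte Carlo validation over independent noise realizations.
fits = []
for _ in range(200):
    y = k_true * drive + b_true + noise * rng.normal(size=t.size)
    fits.append(minimize(neg_log_like, x0=[1.0, 0.0], args=(y,)).x)
fits = np.array(fits)
print("mean estimates:", fits.mean(axis=0), " spread:", fits.std(axis=0))
```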
Linyu Lin, Nam Dinh, 2020
In nuclear engineering, modeling and simulation (M&S) is widely applied to support risk-informed safety analysis. Since nuclear safety analysis has important implications, a convincing validation process is needed to assess simulation adequacy, i.e., the degree to which M&S tools can adequately represent the system quantities of interest. However, due to data gaps, validation becomes a decision-making process under uncertainties. Expert knowledge and judgments are required to collect, choose, characterize, and integrate evidence toward the final adequacy decision. However, in validation frameworks such as CSAU (Code Scaling, Applicability, and Uncertainty; NUREG/CR-5249) and EMDAP (Evaluation Model Development and Assessment Process; RG 1.203), this decision-making process is largely implicit and obscure. When scenarios are complex, knowledge biases and unreliable judgments can be overlooked, which could increase uncertainty in the simulation adequacy result and the corresponding risks. Therefore, a framework is required to formalize the decision-making process for simulation adequacy in a practical, transparent, and consistent manner. This paper proposes Predictive Capability Maturity Quantification using Bayesian networks (PCMQBN) as a quantified framework for assessing simulation adequacy based on information collected from validation activities. A case study evaluates the adequacy of a Smoothed Particle Hydrodynamics simulation in predicting the hydrodynamic forces on static structures during an external flooding scenario. Compared to a qualitative and implicit adequacy assessment, PCMQBN is able to improve confidence in the simulation adequacy result and to reduce expected loss in risk-informed safety analysis.
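
A minimal sketch of Bayesian-network-based adequacy assessment, with a hypothetical three-node structure (evidence quality E, validation outcome V, adequacy A; not the PCMQBN structure from the paper) and all conditional probabilities invented for illustration, evaluated by direct enumeration:

```python
# All nodes binary: 0 = low, 1 = high. E -> V, and (E, V) -> A.
p_E = {1: 0.7, 0: 0.3}
p_V_given_E = {1: {1: 0.8, 0: 0.2}, 0: {1: 0.3, 0: 0.7}}   # P(V=v | E=e)
p_A_given_EV = {(1, 1): 0.9, (1, 0): 0.5,
                (0, 1): 0.6, (0, 0): 0.1}                  # P(A=1 | e, v)

def posterior_adequacy(v_obs):
    """P(A=1 | V=v_obs) by direct enumeration over the hidden node E."""
    num = den = 0.0
    for e in (0, 1):
        w = p_E[e] * p_V_given_E[e][v_obs]
        num += w * p_A_given_EV[(e, v_obs)]
        den += w
    return num / den

print("P(adequate | validation passed):", posterior_adequacy(1))
print("P(adequate | validation failed):", posterior_adequacy(0))
```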