
Calibration of Computational Models with Categorical Parameters and Correlated Outputs via Bayesian Smoothing Spline ANOVA

Posted by Curtis Storlie
Publication date: 2014
Research field: Mathematical Statistics
Paper language: English





It has become commonplace to use complex computer models to predict outcomes in regions where data do not exist. Typically, these models need to be calibrated and validated using some experimental data, which often consist of multiple correlated outcomes. In addition, some of the model parameters may be categorical in nature, such as a pointer variable to alternate models (or submodels) for some of the physics of the system. Here we present a general approach for calibration in such situations, where an emulator of the computationally demanding models and a discrepancy term from the model to reality are represented within a Bayesian Smoothing Spline (BSS) ANOVA framework. The BSS-ANOVA framework has several advantages over the traditional Gaussian Process, including ease of handling categorical inputs and correlated outputs, and improved computational efficiency. Finally, this framework is applied to the problem that motivated its design: the calibration of a computational fluid dynamics model of a bubbling fluidized bed, which is used as an absorber in a CO2 capture system.
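To make the construction concrete, the sketch below builds an emulator covariance in the spirit of a functional-ANOVA decomposition: one main-effect kernel per continuous input (a first-order smoothing-spline kernel on [0, 1]) plus an exchangeable "indicator minus 1/L" kernel for a categorical input. This is a minimal illustration, not the authors' implementation; the kernel forms, the function names, the variance components, and the toy data are all assumptions.

# A minimal sketch of a BSS-ANOVA-style emulator covariance with a categorical input.
import numpy as np

def ss_main_effect_kernel(u, v):
    """First-order smoothing-spline ANOVA main-effect kernel on [0, 1]."""
    k1 = lambda t: t - 0.5
    k2 = lambda t: 0.5 * (k1(t) ** 2 - 1.0 / 12.0)
    k4 = lambda t: (k1(t) ** 4 - 0.5 * k1(t) ** 2 + 7.0 / 240.0) / 24.0
    U, V = np.meshgrid(u, v, indexing="ij")
    return k1(U) * k1(V) + k2(U) * k2(V) - k4(np.abs(U - V))

def categorical_kernel(z, w, n_levels):
    """Exchangeable kernel for a categorical input: I(z == w) - 1/L."""
    Z, W = np.meshgrid(z, w, indexing="ij")
    return (Z == W).astype(float) - 1.0 / n_levels

def anova_covariance(X_cont, z_cat, n_levels, variances):
    """Sum of main-effect covariances, one variance component per input."""
    K = np.zeros((X_cont.shape[0], X_cont.shape[0]))
    for j in range(X_cont.shape[1]):
        K += variances[j] * ss_main_effect_kernel(X_cont[:, j], X_cont[:, j])
    K += variances[-1] * categorical_kernel(z_cat, z_cat, n_levels)
    return K

rng = np.random.default_rng(0)
X = rng.uniform(size=(40, 2))            # two continuous inputs rescaled to [0, 1]
z = rng.integers(0, 3, size=40)          # one categorical input with 3 levels
y = np.sin(2 * np.pi * X[:, 0]) + 0.3 * z + 0.05 * rng.normal(size=40)

K = anova_covariance(X, z, n_levels=3, variances=[1.0, 1.0, 1.0])
alpha = np.linalg.solve(K + 0.05 ** 2 * np.eye(40), y)   # GP-style fit of the emulator
print("in-sample predictions:", (K @ alpha)[:5])

In a full calibration, the same additive-kernel construction would also carry the discrepancy term, and the categorical kernel is what lets a pointer variable to alternate submodels enter the emulator directly.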




Read also

Faicel Chamroukhi, 2015
This work concerns model-based clustering for spatial functional data in which the data are surfaces. We first introduce a Bayesian spatial spline regression model with mixed effects (BSSR) for modeling spatial functional data. The BSSR model is based on Nodal basis functions for spatial regression and accommodates both a common mean behavior for the data, through a fixed-effects part, and inter-individual variability, through a random-effects part. Then, in order to model populations of spatial functional data arising from heterogeneous groups, we integrate the BSSR model into a mixture framework. The resulting model is a Bayesian mixture of spatial spline regressions with mixed effects (BMSSR) used for density estimation and model-based surface clustering. The models, through their Bayesian formulation, allow prior knowledge on the data structure to be incorporated and constitute a good alternative to the recent mixture of spatial spline regressions model estimated in a maximum-likelihood framework via the expectation-maximization (EM) algorithm. Bayesian model inference is performed by Markov chain Monte Carlo (MCMC) sampling. We derive two Gibbs samplers to infer the BSSR and BMSSR models and apply them to simulated surfaces and to a real problem of handwritten digit recognition using the MNIST data set. The obtained results highlight the potential benefit of the proposed Bayesian approaches for modeling surfaces, in particular surfaces dispersed into clusters.
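The mixed-effects structure underlying the BSSR model can be illustrated generatively as below: a shared fixed-effects mean plus per-surface random effects, both expanded in a spatial basis. A tensor-product polynomial basis stands in for the Nodal basis used in the paper, and the dimensions and variances are illustrative assumptions.

# A minimal generative sketch of a spatial spline regression with mixed effects.
import numpy as np

def spatial_basis(s, degree=3):
    """Tensor-product polynomial basis evaluated at 2-D locations s (n x 2)."""
    cols = [s[:, 0] ** i * s[:, 1] ** j
            for i in range(degree + 1) for j in range(degree + 1)]
    return np.column_stack(cols)

rng = np.random.default_rng(1)
grid = np.stack(np.meshgrid(np.linspace(0, 1, 15),
                            np.linspace(0, 1, 15)), axis=-1).reshape(-1, 2)
Z = spatial_basis(grid)                      # n_locations x n_basis
n_surfaces, n_basis = 5, Z.shape[1]

beta = rng.normal(scale=1.0, size=n_basis)                    # fixed effects (common mean)
b = rng.normal(scale=0.3, size=(n_surfaces, n_basis))         # random effects per surface
noise = rng.normal(scale=0.05, size=(n_surfaces, Z.shape[0]))

# y_i(s) = Z(s) beta + Z(s) b_i + eps_i(s): common behavior plus individual variability
Y = Z @ beta + b @ Z.T + noise
print("simulated surfaces:", Y.shape)        # (5 surfaces, 225 grid points)

The BMSSR extension would place a mixture over the fixed-effects coefficients, with cluster labels and parameters sampled jointly in the Gibbs sampler.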
We propose a methodology for filtering, smoothing, and assessing parameter and filtering uncertainty in score-driven models. Our technique is based on a general representation of the Kalman filter and smoother recursions for linear Gaussian models in terms of the score of the conditional log-likelihood. We prove that, when data are generated by a nonlinear non-Gaussian state-space model, the proposed methodology results from a local expansion of the true filtering density. A formal characterization of the approximation error is provided. As shown in extensive Monte Carlo analyses, our methodology performs very similarly to exact simulation-based methods while remaining computationally extremely simple. We empirically illustrate the advantages of employing score-driven models as approximate filters rather than as purely predictive processes.
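The score-driven update itself is simple to state in code. The sketch below filters a time-varying Gaussian mean by moving the filtered state in the direction of the scaled score of the conditional log-likelihood, which mirrors the score-based representation of the Kalman recursions discussed above; the recursion coefficients, the toy data, and the scaling choice are illustrative assumptions, not the paper's specification.

# A minimal sketch of a score-driven (GAS-type) filter for a time-varying Gaussian mean.
import numpy as np

def score_driven_filter(y, omega=0.0, beta=0.95, alpha=0.1, sigma2=1.0):
    """Filter a time-varying mean mu_t for y_t ~ N(mu_t, sigma2)."""
    mu = np.empty(len(y) + 1)
    mu[0] = y[0]
    for t, yt in enumerate(y):
        score = (yt - mu[t]) / sigma2          # derivative of the Gaussian log-likelihood in mu
        scaled_score = sigma2 * score          # scale by the inverse Fisher information
        mu[t + 1] = omega + beta * mu[t] + alpha * scaled_score
    return mu[1:]

rng = np.random.default_rng(2)
true_mu = np.cumsum(0.1 * rng.normal(size=200))      # slowly drifting mean
y = true_mu + rng.normal(size=200)
mu_hat = score_driven_filter(y)
print("RMSE of filtered mean:", np.sqrt(np.mean((mu_hat - true_mu) ** 2)))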
This paper develops a Bayesian network-based method for the calibration of multi-physics models, integrating various sources of uncertainty with information from computational models and experimental data. We adopt the Kennedy and O'Hagan (KOH) framework for model calibration under uncertainty, and develop extensions to multi-physics models and various scenarios of available data. Both aleatoric uncertainty (due to natural variability) and epistemic uncertainty (due to lack of information, including data uncertainty and model uncertainty) are accounted for in the calibration process. Challenging aspects of Bayesian calibration for multi-physics models are investigated, including: (1) calibration with different forms of experimental data (e.g., interval data and time series data), (2) determination of the identifiability of model parameters when the analytical expression of the model is known or unknown, and (3) calibration of multiple physics models sharing common parameters, which enables efficient use of data, especially when experimental resources are limited. A first-order Taylor series expansion-based method is proposed to determine which model parameters are identifiable. Following the KOH framework, a probabilistic discrepancy function is estimated and added to the prediction of the calibrated model, attempting to account for model uncertainty. This discrepancy function is modeled as a Gaussian process when sufficient data are available for multiple model input combinations, and is modeled as a random variable when the available data are limited. The overall approach is illustrated using two application examples related to microelectromechanical system (MEMS) devices: (1) calibration of a dielectric charging model with time-series data, and (2) calibration of two physics models (pull-in voltage and creep) using measurements of different physical quantities in different devices.
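A first-order identifiability check of the kind alluded to above can be sketched as follows: linearize the model output around nominal parameter values and inspect the singular values of the resulting Jacobian, since parameter directions with near-zero singular values cannot be distinguished from the observed outputs. This is a hedged illustration of the general idea, not the paper's exact procedure; the toy model and nominal values are assumptions.

# A minimal sketch of a sensitivity-based (first-order) identifiability check.
import numpy as np

def model(theta, x):
    """Toy computational model: theta[0] and theta[1] enter only through their sum."""
    return (theta[0] + theta[1]) * x + theta[2] * np.sin(x)

def jacobian(theta, x, eps=1e-6):
    """Finite-difference Jacobian of the model outputs w.r.t. the parameters."""
    base = model(theta, x)
    J = np.empty((len(x), len(theta)))
    for k in range(len(theta)):
        pert = theta.copy()
        pert[k] += eps
        J[:, k] = (model(pert, x) - base) / eps
    return J

x = np.linspace(0, 1, 50)
theta0 = np.array([1.0, 2.0, 0.5])
J = jacobian(theta0, x)
_, s, _ = np.linalg.svd(J, full_matrices=False)
print("singular values:", s)   # one near-zero value: only theta[0] + theta[1] is identifiable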
K. Kopotun, D. Leviatan, 2014
Several results on constrained spline smoothing are obtained. In particular, we establish a general result, showing how one can constructively smooth any monotone or convex piecewise polynomial function (ppf) (or any $q$-monotone ppf, $q \geq 3$, with one additional degree of smoothness) to be of minimal defect while keeping it close to the original function in the ${\mathbb L}_p$-(quasi)norm. It is well known that approximating a function by ppfs of minimal defect (splines) avoids the introduction of artifacts which may be unrelated to the original function, and is thus always preferable. On the other hand, it is usually easier to construct constrained ppfs with as few smoothness requirements as possible. Our results allow us to obtain shape-preserving splines of minimal defect with equidistant or Chebyshev knots. The validity of the corresponding Jackson-type estimates for shape-preserving spline approximation is summarized; in particular, we show that the ${\mathbb L}_p$-estimates, $p \ge 1$, can be immediately derived from the ${\mathbb L}_\infty$-estimates.
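For orientation, Jackson-type estimates of the kind referred to here typically take the generic form below; the constant, the admissible orders $k$ of the modulus of smoothness, and the precise knot sequences depend on the shape constraint and are specified in the paper, not here.

% Generic form of a Jackson-type shape-preserving spline estimate (a hedged sketch):
% for f with the required shape on a finite interval, there is a shape-preserving
% spline s_n of minimal defect with n equidistant or Chebyshev knots such that
\[
  \| f - s_n \|_{\mathbb{L}_p} \;\le\; c(k)\, \omega_k\!\left(f, \tfrac{1}{n}\right)_p ,
\]
% where \omega_k(f, \cdot)_p is the k-th modulus of smoothness in \mathbb{L}_p.
% The abstract's final claim is that the estimates for 1 \le p < \infty follow
% immediately from the p = \infty case.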
Functional data, with basic observational units being functions (e.g., curves, surfaces) varying over a continuum, are frequently encountered in various applications. While many statistical tools have been developed for functional data analysis, the issue of smoothing all functional observations simultaneously is less studied. Existing methods often focus on smoothing each individual function separately, at the risk of removing important systematic patterns common across functions. We propose a nonparametric Bayesian approach to smooth all functional observations simultaneously and nonparametrically. In the proposed approach, we assume that the functional observations are independent Gaussian processes subject to a common level of measurement errors, enabling the borrowing of strength across all observations. Unlike most Gaussian process regression models that rely on pre-specified structures for the covariance kernel, we adopt a hierarchical framework by assuming a Gaussian process prior for the mean function and an Inverse-Wishart process prior for the covariance function. These prior assumptions induce an automatic mean-covariance estimation in the posterior inference in addition to the simultaneous smoothing of all observations. Such a hierarchical framework is flexible enough to incorporate functional data with different characteristics, including data measured on either common or uncommon grids, and data with either stationary or nonstationary covariance structures. Simulations and real data analysis demonstrate that, in comparison with alternative methods, the proposed Bayesian approach achieves better smoothing accuracy and comparable mean-covariance estimation results. Furthermore, it can successfully retain the systematic patterns in the functional observations that are usually neglected by the existing functional data analyses based on individual-curve smoothing.
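The hierarchical prior described above has a transparent finite-grid form, sketched below: a Gaussian-process prior on the mean function and an Inverse-Wishart prior on the covariance matrix over the observation grid, with all curves sharing a common noise level. The kernel choice, hyperparameters, and grid size are illustrative assumptions rather than the paper's settings.

# A minimal finite-grid sketch of the hierarchical GP / Inverse-Wishart model.
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 30)                                     # common observation grid
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.1 ** 2)  # squared-exponential prior kernel

mu = rng.multivariate_normal(np.zeros(len(t)), K + 1e-8 * np.eye(len(t)))   # mean function ~ GP
Sigma = invwishart.rvs(df=len(t) + 5, scale=K + 1e-6 * np.eye(len(t)),
                       random_state=3)                                       # covariance ~ Inverse-Wishart
sigma_noise = 0.05                                                           # common measurement-error level

# Functional observations: independent draws sharing the mean, covariance, and noise level.
n_curves = 8
Y = rng.multivariate_normal(mu, Sigma + sigma_noise ** 2 * np.eye(len(t)), size=n_curves)
print("simulated functional data:", Y.shape)    # (8 curves, 30 grid points)

Posterior inference then updates the mean and covariance jointly from all curves, which is what allows borrowing of strength across observations instead of smoothing each curve separately.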