Reliable models of the thermodynamic properties of materials are critical for industrially relevant applications that require a good understanding of equilibrium phase diagrams, thermal and chemical transport, and microstructure evolution. The goal of thermodynamic models is to capture data from both experimental and computational studies and then make reliable predictions when extrapolating to new regions of parameter space. These predictions are impacted by artifacts present in real data sets, such as outliers, systematic errors, and unreliable or missing uncertainty bounds. Such issues increase the probability of the thermodynamic model producing erroneous predictions. We present a Bayesian framework for the selection, calibration, and uncertainty quantification of thermodynamic property models. The modular framework addresses numerous concerns regarding thermodynamic models, including thermodynamic consistency and robustness to outliers and systematic errors, through hyperparameter weightings and robust choices of likelihood and prior distributions. Furthermore, the framework's inherent transparency (e.g., our choice of probability functions and associated parameters) enables insights into the complex process of thermodynamic assessment. We introduce these concepts through examples where the true property model is known. In addition, we demonstrate the utility of the framework by constructing a property model from a large set of experimental specific heat and enthalpy measurements of hafnium metal from 0 to 4900 K.
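The robustness idea described above can be illustrated with a minimal sketch. This is a hypothetical example, not the paper's actual model or data: we calibrate a simple linear specific-heat model C_p(T) = a + b·T against synthetic measurements containing injected outliers, comparing a maximum a posteriori (MAP) fit under a Gaussian likelihood with one under a heavy-tailed Student-t likelihood, which automatically downweights the outliers.

```python
# Sketch of robust Bayesian MAP calibration (hypothetical model and data,
# assuming a Student-t likelihood as one possible robust choice).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm, t as student_t

rng = np.random.default_rng(0)
T = np.linspace(300.0, 1500.0, 40)          # temperatures, K
a_true, b_true = 22.0, 0.01                 # "true" model parameters
Cp = a_true + b_true * T + rng.normal(0.0, 0.3, T.size)
Cp[::10] += 8.0                             # inject gross outliers

def neg_log_post(theta, resid_logpdf):
    """Negative log-posterior: weak Gaussian priors on a, b; flat on log-scale."""
    a, b, log_s = theta
    resid = Cp - (a + b * T)
    log_prior = norm.logpdf(a, 20.0, 10.0) + norm.logpdf(b, 0.0, 1.0)
    return -(resid_logpdf(resid, np.exp(log_s)).sum() + log_prior)

gauss = lambda r, s: norm.logpdf(r, scale=s)              # outlier-sensitive
robust = lambda r, s: student_t.logpdf(r, df=3, scale=s)  # heavy-tailed

x0 = np.array([20.0, 0.0, 0.0])
map_gauss = minimize(neg_log_post, x0, args=(gauss,)).x
map_robust = minimize(neg_log_post, x0, args=(robust,)).x
# The Student-t MAP estimate of (a, b) sits closer to the true values,
# because the heavy tails assign little weight to the outlying points.
```

Swapping the likelihood is the only change between the two fits, which is the sense in which the framework's robustness choices are modular and transparent.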
This work affords new insights into Bayesian CART in the context of structured wavelet shrinkage. The main thrust is to develop a formal inferential framework for Bayesian tree-based regression. We reframe Bayesian CART as a g-type prior which depart
Bayesian optimization is a class of global optimization techniques. It regards the underlying objective function as a realization of a Gaussian process. Although the outputs of Bayesian optimization are random according to the Gaussian process assump
Within a Bayesian statistical framework using the standard Skyrme-Hartree-Fock model, the maximum a posteriori (MAP) values and uncertainties of nuclear matter incompressibility and isovector interaction parameters are inferred from the experimental
Bayesian Neural Networks (BNNs) place priors over the parameters in a neural network. Inference in BNNs, however, is difficult; all inference methods for BNNs are approximate. In this work, we empirically compare the quality of predictive uncertainty
Obtaining accurate estimates of machine learning model uncertainties on newly predicted data is essential for understanding the accuracy of the model and whether its predictions can be trusted. A common approach to such uncertainty quantification is