Weakly stationary Gaussian processes (GPs) are the principal tool in statistical approaches to the design and analysis of computer experiments (or Uncertainty Quantification). Such processes are fitted to computer model output using a set of training runs to learn the parameters of the process covariance kernel. The stationarity assumption is often adequate, yet it can lead to poor predictive performance when the model response exhibits nonstationarity, for example when its smoothness varies across the input space. In this paper, we introduce a diagnostic-led approach to fitting nonstationary GP emulators by specifying finite mixtures of region-specific covariance kernels. Our method first fits a stationary GP; if traditional diagnostics indicate nonstationarity, those diagnostics are used to fit appropriate mixing functions for a covariance kernel mixture designed to capture the nonstationarity, yielding an emulator that is continuous in parameter space and readily interpretable. We compare our approach to the principal nonstationary GP models in the literature and illustrate its performance on a number of idealised test cases and in an application to modelling the cloud parameterization of the French climate model.
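As a rough illustration of the kernel-mixture idea (a minimal sketch, not the authors' implementation; the logistic mixing functions, two-region split, and lengthscales are assumptions), a nonstationary covariance can be built as k(x, x') = sum_r w_r(x) w_r(x') k_r(x, x') from region-specific stationary kernels k_r and smooth weights w_r:

```python
import numpy as np

def rbf(x1, x2, lengthscale, variance):
    """Stationary squared-exponential kernel on 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def mixing_weights(x, centre=0.5, sharpness=10.0):
    """Two smooth, complementary logistic weights over the input space (assumed form)."""
    w1 = 1.0 / (1.0 + np.exp(sharpness * (x - centre)))
    return np.stack([w1, 1.0 - w1])  # shape (2, n)

def mixture_kernel(x1, x2, region_params):
    """Nonstationary covariance: sum over regions of w_r(x) w_r(x') k_r(x, x')."""
    w1, w2 = mixing_weights(x1), mixing_weights(x2)
    K = np.zeros((len(x1), len(x2)))
    for r, (lengthscale, variance) in enumerate(region_params):
        K += np.outer(w1[r], w2[r]) * rbf(x1, x2, lengthscale, variance)
    return K

# A smooth region (long lengthscale) blending continuously into a rough one.
x = np.linspace(0.0, 1.0, 200)
K = mixture_kernel(x, x, region_params=[(0.3, 1.0), (0.03, 1.0)])
rng = np.random.default_rng(0)
draw = rng.multivariate_normal(np.zeros(len(x)), K + 1e-8 * np.eye(len(x)))
```

Because the weights vary smoothly with x, the resulting covariance (and hence the emulator) stays continuous across regions while its effective lengthscale changes.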
The use of a finite mixture of normal distributions in model-based clustering makes it possible to capture non-Gaussian data clusters. However, identifying the clusters from the normal components is challenging and is in general achieved either by imposing constra
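For context, a minimal sketch of the component-to-cluster identification problem (the data, component count, and merging rule below are assumptions for illustration): a non-Gaussian cluster is fitted by several normal components, which must then be grouped back into a single cluster.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two clusters: one Gaussian blob and one banana-shaped (non-Gaussian) group.
blob = rng.normal(loc=[4.0, 0.0], scale=0.4, size=(200, 2))
t = rng.uniform(0, np.pi, 300)
banana = 2.0 * np.column_stack([np.cos(t), np.sin(t)]) + 0.15 * rng.standard_normal((300, 2))
X = np.vstack([blob, banana])

# Fit more normal components than there are clusters ...
gmm = GaussianMixture(n_components=4, random_state=0).fit(X)
labels = gmm.predict(X)

# ... then map components to clusters, here by a hand-specified merge:
# components whose means lie near the banana arc are grouped together.
on_banana = np.linalg.norm(gmm.means_, axis=1) < 3.0
cluster = np.where(on_banana[labels], 0, 1)
```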
Bilateral filtering (BF) is one of the most classical denoising filters; however, its manually initialized filtering kernel hampers its adaptivity across images with various characteristics. To deal with image variation (i.e., non-stationary noise),
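For reference, a minimal sketch of the classical fixed-kernel bilateral filter this abstract starts from; the spatial and range sigmas are hand-picked here, which is exactly the non-adaptive behaviour being criticised.

```python
import numpy as np

def bilateral_filter(img, radius=3, sigma_spatial=2.0, sigma_range=0.1):
    """Denoise a 2-D float image by averaging neighbours weighted by both
    spatial distance and intensity difference (fixed, manually chosen sigmas)."""
    pad = np.pad(img, radius, mode="reflect")
    out = np.zeros_like(img)
    # Precompute the spatial (domain) Gaussian weights once.
    ax = np.arange(-radius, radius + 1)
    dx, dy = np.meshgrid(ax, ax)
    spatial_w = np.exp(-(dx**2 + dy**2) / (2 * sigma_spatial**2))
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range (photometric) weights depend on intensity differences.
            range_w = np.exp(-((patch - img[i, j]) ** 2) / (2 * sigma_range**2))
            w = spatial_w * range_w
            out[i, j] = np.sum(w * patch) / np.sum(w)
    return out

# Example on a synthetic noisy image.
noisy = np.clip(np.random.default_rng(0).normal(0.5, 0.1, (64, 64)), 0.0, 1.0)
denoised = bilateral_filter(noisy)
```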
An emulator is a fast-to-evaluate statistical approximation of a detailed mathematical model (simulator). When used in lieu of simulators, emulators can expedite tasks that require many repeated evaluations, such as sensitivity analyses, policy optim
We determine the expected error by smoothing the data locally. Then we optimize the shape of the kernel smoother to minimize the error. Because the optimal estimator depends on the unknown function, our scheme automatically adjusts to the unknown fun
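As a loose illustration of error-driven smoother tuning (a sketch under assumptions: a Nadaraya-Watson smoother with a Gaussian kernel, and a single bandwidth chosen by leave-one-out error, standing in for the full kernel-shape optimisation described above):

```python
import numpy as np

def nw_smooth(x_train, y_train, x_eval, bandwidth):
    """Nadaraya-Watson estimate with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w @ y_train) / w.sum(axis=1)

def loo_error(x, y, bandwidth):
    """Leave-one-out squared-error estimate of the smoother's expected error."""
    err = 0.0
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        pred = nw_smooth(x[mask], y[mask], x[i:i + 1], bandwidth)[0]
        err += (y[i] - pred) ** 2
    return err / len(x)

# Pick the bandwidth that minimises the estimated error, then smooth.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 100))
y = np.sin(6 * x) + 0.2 * rng.standard_normal(100)
grid = np.linspace(0.01, 0.3, 30)
best_h = grid[np.argmin([loo_error(x, y, h) for h in grid])]
fitted = nw_smooth(x, y, x, best_h)
```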
A framework is presented to model instances and degrees of local item dependence within the context of diagnostic classification models (DCMs). The study considers an undirected graphical model to describe the dependence structure of test items and draws