We investigate the frequentist coverage properties of credible sets resulting from Gaussian process priors with squared exponential covariance kernel. First we show that, when the scaling hyper-parameter of the (slightly modified) squared exponential covariance kernel is selected by the maximum marginal likelihood estimator, the corresponding credible sets provide overconfident, misleading uncertainty statements for a large, representative subclass of the functional parameters in the context of the Gaussian white noise model. Then we show that by either blowing up the credible sets with a logarithmic factor or modifying the maximum marginal likelihood estimator with a logarithmic term, one can obtain reliable uncertainty statements and adaptive size of the credible sets under some additional restrictions. Finally, we demonstrate in a numerical study that the derived negative and positive results extend beyond the Gaussian white noise model to nonparametric regression and classification models, also for small sample sizes.
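As an illustration of the procedure the abstract describes, the following is a minimal sketch of empirical-Bayes Gaussian process regression with a squared exponential kernel: the length-scale (scaling hyper-parameter) is chosen by maximizing the log marginal likelihood, pointwise credible bands are formed, and an alternative band inflated by a logarithmic factor is compared. The test function, grid of candidate length-scales, and noise level are illustrative assumptions, not the authors' exact white-noise construction.

```python
# Hedged sketch (not the authors' code): empirical-Bayes GP regression with a
# squared exponential kernel and marginal-likelihood selection of the scaling
# hyper-parameter, plus naive and log-inflated pointwise credible bands.
import numpy as np

def sq_exp_kernel(x1, x2, length_scale):
    """Squared exponential covariance k(s, t) = exp(-(s - t)^2 / (2 l^2))."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def log_marginal_likelihood(y, x, length_scale, noise_var):
    """Gaussian log marginal likelihood of y under the GP prior plus noise."""
    n = len(y)
    K = sq_exp_kernel(x, x, length_scale) + noise_var * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.sum(np.log(np.diag(L))) - 0.5 * n * np.log(2 * np.pi)

rng = np.random.default_rng(0)
n, noise_var = 200, 0.1 ** 2
x = np.linspace(0.0, 1.0, n)
f_true = np.sin(8 * np.pi * x) * np.exp(-2 * x)   # illustrative true function
y = f_true + np.sqrt(noise_var) * rng.standard_normal(n)

# Empirical-Bayes choice of the length-scale over a grid of candidates.
grid = np.geomspace(1e-3, 1.0, 60)
l_hat = grid[np.argmax([log_marginal_likelihood(y, x, l, noise_var) for l in grid])]

# Posterior mean and pointwise posterior standard deviation at the design points.
K = sq_exp_kernel(x, x, l_hat) + noise_var * np.eye(n)
Ks = sq_exp_kernel(x, x, l_hat)
post_mean = Ks @ np.linalg.solve(K, y)
post_var = np.clip(np.diag(Ks - Ks @ np.linalg.solve(K, Ks)), 0.0, None)
post_sd = np.sqrt(post_var)

# Naive 95% credible band versus a band whose radius is blown up by log(n),
# one of the remedies the abstract discusses.
naive_band = 1.96 * post_sd
inflated_band = np.log(n) * naive_band
print(f"estimated length-scale: {l_hat:.4f}")
print(f"pointwise coverage, naive band:    {np.mean(np.abs(post_mean - f_true) <= naive_band):.2f}")
print(f"pointwise coverage, inflated band: {np.mean(np.abs(post_mean - f_true) <= inflated_band):.2f}")
```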
This work affords new insights into Bayesian CART in the context of structured wavelet shrinkage. The main thrust is to develop a formal inferential framework for Bayesian tree-based regression. We reframe Bayesian CART as a g-type prior which depart
In this work, we investigate Gaussian process regression used to recover a function based on noisy observations. We derive upper and lower error bounds for Gaussian process regression with possibly misspecified correlation functions. The optimal conv
Bayesian optimization is a class of global optimization techniques. It regards the underlying objective function as a realization of a Gaussian process. Although the outputs of Bayesian optimization are random according to the Gaussian process assump
We develop singular value shrinkage priors for the mean matrix parameters in the matrix-variate normal model with known covariance matrices. Our priors are superharmonic and put more weight on matrices with smaller singular values. They are a natural
The Riemannian geometry of covariance matrices has been essential to several successful applications, in computer vision, biomedical signal and image processing, and radar data processing. For these applications, an important ongoing challenge is to