We introduce the constrained Gaussian process (CGP), a Gaussian process model for random functions that allows easy placement of mathematical constraints (e.g., non-negativity, monotonicity) on its sample functions. The CGP comes with a closed-form probability density function (PDF) and has the attractive feature that its posterior distributions for regression and classification are again CGPs with closed-form expressions. Furthermore, we show that the CGP inherits the optimal theoretical properties of the Gaussian process, e.g., rates of posterior contraction, because a CGP is a Gaussian process with a more efficient model space.
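The abstract does not spell out the CGP construction itself, so the sketch below only illustrates, by brute force, what constraining a GP means: draw from an ordinary GP posterior and keep the draws that satisfy non-negativity on a test grid. The kernel, data, and rejection step are all assumptions of this sketch, not the paper's closed-form machinery.

```python
import numpy as np

def rbf_kernel(x1, x2, ls=0.5):
    # Squared-exponential kernel; the length scale is an arbitrary choice.
    return np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / ls**2)

rng = np.random.default_rng(0)

# Toy non-negative target observed with noise.
x_train = np.linspace(0, 1, 8)
y_train = np.abs(np.sin(3 * x_train)) + 0.1 + 0.05 * rng.standard_normal(8)
x_test = np.linspace(0, 1, 50)
noise = 0.05**2

# Standard GP posterior (mean and covariance) on the test grid.
K = rbf_kernel(x_train, x_train) + noise * np.eye(8)
Ks = rbf_kernel(x_train, x_test)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
mean = Ks.T @ alpha
v = np.linalg.solve(L, Ks)
cov = rbf_kernel(x_test, x_test) - v.T @ v + 1e-8 * np.eye(50)

# Rejection step: keep only posterior draws that are non-negative on the
# grid -- a brute-force stand-in for the CGP's closed-form posterior.
draws = rng.multivariate_normal(mean, cov, size=2000)
accepted = draws[(draws >= 0).all(axis=1)]
print(f"accepted {len(accepted)} of 2000 draws")
```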
In this paper we introduce a novel model for Gaussian process (GP) regression in the fully Bayesian setting. Motivated by the ideas of sparsification, localization and Bayesian additive modeling, our model is built around a recursive partitioning (RP
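The entry is cut off before the recursive partitioning scheme is described; as a rough stand-in, the sketch below recursively bisects a 1-D domain and fits an independent local GP in each leaf. Everything here (kernel, split rule, leaf size) is assumed for illustration, not taken from the paper.

```python
import numpy as np

def gp_predict(x, y, xs, ls=0.2, noise=1e-2):
    # Plain GP posterior mean with an RBF kernel (illustrative choices).
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ls**2)
    K = k(x, x) + noise * np.eye(len(x))
    return k(xs, x) @ np.linalg.solve(K, y)

def rp_predict(x, y, xs, max_leaf=16):
    # Recursively split the domain at its midpoint and fit an independent
    # local GP in each leaf -- a toy version of recursive partitioning.
    if len(x) <= max_leaf:
        return gp_predict(x, y, xs)
    mid = (x.min() + x.max()) / 2
    left, lefts = x <= mid, xs <= mid
    out = np.empty_like(xs)
    out[lefts] = rp_predict(x[left], y[left], xs[lefts], max_leaf)
    out[~lefts] = rp_predict(x[~left], y[~left], xs[~lefts], max_leaf)
    return out

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(12 * x) + 0.1 * rng.standard_normal(200)
print(rp_predict(x, y, np.linspace(0, 1, 100))[:5])
```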
In this work, we investigate Gaussian process regression for recovering a function from noisy observations. We derive upper and lower error bounds for Gaussian process regression with possibly misspecified correlation functions. The optimal conv
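As a small empirical companion to the misspecification question, the sketch below draws a rough function from an exponential (Matern-1/2) kernel and compares GP regression under the correct kernel against an overly smooth RBF kernel. All settings are illustrative assumptions, not the paper's analysis or bounds.

```python
import numpy as np

rng = np.random.default_rng(2)

def k_exp(a, b, ls=0.2):
    # Exponential (Matern-1/2) kernel: rough sample paths.
    return np.exp(-np.abs(a[:, None] - b[None, :]) / ls)

def k_rbf(a, b, ls=0.2):
    # Squared-exponential kernel: very smooth sample paths.
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ls**2)

def gp_mean(kern, x, y, xs, noise=1e-2):
    K = kern(x, x) + noise * np.eye(len(x))
    return kern(xs, x) @ np.linalg.solve(K, y)

# Draw the truth from the rough (exponential) kernel ...
x = np.sort(rng.uniform(0, 1, 60))
xs = np.linspace(0, 1, 200)
xa = np.concatenate([x, xs])
C = k_exp(xa, xa) + 1e-10 * np.eye(len(xa))
f = rng.multivariate_normal(np.zeros(len(xa)), C)
y = f[:60] + 0.1 * rng.standard_normal(60)

# ... then regress with the correct kernel and a misspecified one.
for name, kern in [("well-specified (exp)", k_exp), ("misspecified (rbf)", k_rbf)]:
    err = np.sqrt(np.mean((gp_mean(kern, x, y, xs) - f[60:])**2))
    print(f"{name}: RMSE = {err:.3f}")
```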
We apply Gaussian process (GP) regression, which provides a powerful non-parametric probabilistic method of relating inputs to outputs, to survival data consisting of time-to-event and covariate measurements. In this context, the covariates are regar
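The entry is truncated before the model is specified, so the following is only a loose accelerated-failure-time-style sketch: regress log event times on a covariate with a GP, crudely dropping censored points. A real treatment would put censoring into the likelihood; every choice here is an assumption of this sketch.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic survival data: the event time depends on one covariate, and
# some observations are right-censored. (Purely illustrative.)
z = rng.uniform(0, 1, 100)                               # covariate
t = np.exp(np.sin(4 * z) + 0.3 * rng.standard_normal(100))  # event times
c = rng.exponential(2.0, 100)                            # censoring times
obs, event = np.minimum(t, c), t <= c

def rbf(a, b, ls=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ls**2)

# Toy AFT-style GP: regress log event time on the covariate using only
# uncensored points; a proper model would handle censoring explicitly.
x, y = z[event], np.log(obs[event])
K = rbf(x, x) + 0.1 * np.eye(len(x))
zs = np.linspace(0, 1, 50)
mean_log_t = rbf(zs, x) @ np.linalg.solve(K, y)
print(mean_log_t[:5])
```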
Subspace-valued functions arise in a wide range of problems, including parametric reduced order modeling (PROM). In PROM, each parameter point can be associated with a subspace, which is used for Petrov-Galerkin projections of large system matrices.
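The Petrov-Galerkin projection itself is standard and easy to show concretely: given trial and test bases V and W spanning the subspaces attached to a parameter point, the large system A x = b reduces to (W^T A V) x_r = W^T b. The matrices below are random stand-ins, not any particular PROM.

```python
import numpy as np

rng = np.random.default_rng(4)

# Full-order system matrix and right-hand side.
n, r = 500, 10
A = np.diag(np.linspace(1, 10, n)) + 0.01 * rng.standard_normal((n, n))
b = rng.standard_normal(n)

# Orthonormal bases for the trial (V) and test (W) subspaces; in PROM,
# each parameter point would supply such a subspace.
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
W, _ = np.linalg.qr(rng.standard_normal((n, r)))

# Petrov-Galerkin projection: reduce A x = b to (W^T A V) xr = W^T b.
Ar, br = W.T @ A @ V, W.T @ b
xr = np.linalg.solve(Ar, br)
x_approx = V @ xr  # lift back to the full space
print("full-order residual norm:", np.linalg.norm(A @ x_approx - b))
```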
We introduce Latent Gaussian Process Regression, a latent variable extension that allows modelling of non-stationary, multi-modal processes using GPs. The approach is built on extending the input space of a regression problem with a latent variab
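As a hedged sketch of the input-augmentation idea: append a latent coordinate to each input and fit a GP on the augmented space, so that different latent values pick out different modes. Here the latent coordinate is set to the true (normally unobserved) mode purely for illustration; the model described in the abstract would infer it from the data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Multi-modal data: each x can map to one of two branches.
x = rng.uniform(0, 1, 80)
mode = rng.integers(0, 2, 80)
y = np.where(mode == 1, np.sin(6 * x), -np.sin(6 * x))
y = y + 0.05 * rng.standard_normal(80)

# Augment the input with a latent coordinate. We set it to the hidden
# mode for illustration; inferring it is the point of the actual model.
X = np.column_stack([x, mode.astype(float)])

def rbf(A, B, ls=np.array([0.3, 0.5])):
    # RBF kernel with a separate length scale per (augmented) dimension.
    d = (A[:, None, :] - B[None, :, :]) / ls
    return np.exp(-0.5 * (d**2).sum(-1))

K = rbf(X, X) + 0.05 * np.eye(80)
alpha = np.linalg.solve(K, y)

# Predict each branch by fixing the latent coordinate.
xs = np.linspace(0, 1, 50)
for zval in (0.0, 1.0):
    Xs = np.column_stack([xs, np.full(50, zval)])
    print(f"branch z={zval}:", (rbf(Xs, X) @ alpha)[:3])
```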