Let $X \mid \mu \sim N_p(\mu, v_x I)$ and $Y \mid \mu \sim N_p(\mu, v_y I)$ be independent $p$-dimensional multivariate normal vectors with common unknown mean $\mu$. Based on observing $X = x$, we consider the problem of estimating the true predictive density $p(y \mid \mu)$ of $Y$ under expected Kullback--Leibler loss. Our focus here is the characterization of admissible procedures for this problem. We show that the class of all generalized Bayes rules is a complete class, and that the easily interpretable conditions of Brown and Hwang [Statistical Decision Theory and Related Topics (1982) III 205--230] are sufficient for a formal Bayes rule to be admissible.
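For context, the decision-theoretic setup this abstract refers to can be written out explicitly (a standard formulation with a generic prior $\pi$; the paper's specific admissibility conditions are not reproduced here). The loss of a predictive density estimate $\hat p(\cdot \mid x)$ is

\[ L(\mu, \hat p) = \int p(y \mid \mu) \log \frac{p(y \mid \mu)}{\hat p(y \mid x)} \, dy, \]

and the (generalized) Bayes rule under $\pi$ is the posterior-predictive mixture

\[ \hat p_\pi(y \mid x) = \int p(y \mid \mu) \, \pi(\mu \mid x) \, d\mu. \]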
We investigate predictive density estimation under the $L^2$ Wasserstein loss for location families and location-scale families. We show that plug-in densities form a complete class and that the Bayesian predictive density is given by the plug-in density […]
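As a concrete instance of why plug-in rules are natural here (an illustration, not a result quoted from the paper): for a univariate Gaussian location-scale family the $L^2$ Wasserstein distance has the closed form

\[ W_2^2\big(N(\mu_1, \sigma_1^2),\, N(\mu_2, \sigma_2^2)\big) = (\mu_1 - \mu_2)^2 + (\sigma_1 - \sigma_2)^2, \]

so the Wasserstein risk of a plug-in density is exactly the combined estimation error of its location and scale parameters.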
We consider the problem of estimating the predictive density of future observations from a non-parametric regression model. The density estimators are evaluated under Kullback--Leibler divergence and our focus is on establishing the exact asymptotics […]
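One standard identity that makes such exact asymptotics tractable (a generic fact, not the paper's derivation): for Gaussian regression noise with common variance, the Kullback--Leibler divergence between two candidate predictive densities reduces to squared error in the mean,

\[ \mathrm{KL}\big(N(\mu_1, \sigma^2) \,\|\, N(\mu_2, \sigma^2)\big) = \frac{(\mu_1 - \mu_2)^2}{2\sigma^2}. \]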
We present a new adaptive kernel density estimator based on linear diffusion processes. The proposed estimator builds on existing ideas for adaptive smoothing by incorporating information from a pilot density estimate. In addition, we propose a new plug-in bandwidth selection method […]
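A minimal sketch of the generic pilot-based adaptive smoothing idea the abstract builds on, using Abramson's square-root bandwidth law rather than the authors' diffusion construction (the function name and defaults below are illustrative, not from the paper):

import numpy as np

def adaptive_kde(data, grid, h_pilot=None, alpha=0.5):
    """Pilot-based adaptive Gaussian KDE (Abramson-style square-root law).

    Illustrates the generic 'pilot estimate -> local bandwidths' idea;
    this is NOT the diffusion estimator proposed in the paper.
    """
    data = np.asarray(data, dtype=float)
    n = data.size
    if h_pilot is None:
        # Silverman's rule of thumb for the fixed-bandwidth pilot
        h_pilot = 1.06 * data.std(ddof=1) * n ** (-1 / 5)

    # Step 1: fixed-bandwidth pilot density evaluated at the data points
    diffs = (data[:, None] - data[None, :]) / h_pilot
    pilot = np.exp(-0.5 * diffs**2).sum(axis=1) / (n * h_pilot * np.sqrt(2 * np.pi))

    # Step 2: local bandwidths shrink where the pilot density is high
    g = np.exp(np.mean(np.log(pilot)))           # geometric-mean normalizer
    h_local = h_pilot * (pilot / g) ** (-alpha)  # alpha = 0.5 is Abramson's choice

    # Step 3: adaptive estimate on the evaluation grid
    z = (grid[:, None] - data[None, :]) / h_local[None, :]
    return (np.exp(-0.5 * z**2) / (h_local * np.sqrt(2 * np.pi))).sum(axis=1) / n

# Example: bimodal sample where adaptive smoothing helps
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(5, 0.3, 500)])
grid = np.linspace(-4, 7, 400)
f_hat = adaptive_kde(x, grid)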
This paper presents minimax rates for density estimation when the data dimension $d$ is allowed to grow with the number of observations $n$ rather than remaining fixed as in previous analyses. We prove a non-asymptotic lower bound which gives the worst-case […]
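For reference, the classical fixed-dimension benchmark that such growing-dimension results generalize (a standard fact, not a statement from the paper): over a Hölder ball $\Sigma(\beta, L)$ of smoothness $\beta$, the minimax $L^2$ risk scales as

\[ \inf_{\hat f} \sup_{f \in \Sigma(\beta, L)} \mathbb{E} \, \| \hat f - f \|_2^2 \asymp n^{-2\beta/(2\beta + d)}, \]

a rate that already deteriorates rapidly as $d$ grows; letting $d = d_n$ grow with $n$ quantifies this degradation non-asymptotically.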
We study the estimation, in $L^p$-norm, of density functions defined on $[0,1]^d$. We construct a new family of kernel density estimators that do not suffer from the so-called boundary bias problem and we propose a data-driven procedure based on the Goldenshluger--Lepski […]
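To make the boundary-bias problem concrete, here is a minimal reflection-corrected Gaussian kernel estimator on $[0,1]$ (a textbook correction, not the estimator family or the data-driven selection rule proposed in the paper; names below are illustrative):

import numpy as np

def reflected_kde(data, grid, h):
    """Gaussian KDE on [0, 1] with boundary correction by reflection.

    A standard kernel estimator leaks mass outside [0, 1] near the edges
    and is biased there; reflecting the sample at 0 and 1 restores the
    lost mass. This is a textbook fix, not the paper's construction.
    """
    data = np.asarray(data, dtype=float)
    augmented = np.concatenate([data, -data, 2.0 - data])  # reflect at 0 and 1
    z = (grid[:, None] - augmented[None, :]) / h
    # Divide by the ORIGINAL sample size: the reflected copies only
    # redistribute mass back into [0, 1].
    return np.exp(-0.5 * z**2).sum(axis=1) / (data.size * h * np.sqrt(2 * np.pi))

# Example: Beta(2, 1) sample, whose density 2x is nonzero at the boundary x = 1
rng = np.random.default_rng(1)
x = rng.beta(2, 1, size=2000)
grid = np.linspace(0, 1, 201)
f_hat = reflected_kde(x, grid, h=0.05)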