
Admissible predictive density estimation

Published by: Xinyi Xu
Publication date: 2008
Research field: Mathematical statistics
Paper language: English





Let $X \mid \mu \sim N_p(\mu, v_x I)$ and $Y \mid \mu \sim N_p(\mu, v_y I)$ be independent $p$-dimensional multivariate normal vectors with common unknown mean $\mu$. Based on observing $X = x$, we consider the problem of estimating the true predictive density $p(y \mid \mu)$ of $Y$ under expected Kullback--Leibler loss. Our focus here is the characterization of admissible procedures for this problem. We show that the class of all generalized Bayes rules is a complete class, and that the easily interpretable conditions of Brown and Hwang [Statistical Decision Theory and Related Topics (1982) III 205--230] are sufficient for a formal Bayes rule to be admissible.
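For context, two standard objects frame this problem (textbook definitions in our notation, not quoted from the paper): the Kullback--Leibler risk of a density estimate $\hat p$, and the Bayes predictive density under a prior $\pi$, which averages the true density over the posterior:

$$R_{\mathrm{KL}}(\mu, \hat p) = \mathbb{E}_{X \mid \mu} \int_{\mathbb{R}^p} p(y \mid \mu) \log \frac{p(y \mid \mu)}{\hat p(y \mid X)} \, dy, \qquad \hat p_\pi(y \mid x) = \int_{\mathbb{R}^p} p(y \mid \mu) \, \pi(\mu \mid x) \, d\mu.$$

A generalized Bayes rule is $\hat p_\pi$ for a possibly improper prior $\pi$; the complete-class result says that any procedure outside this family is dominated by one inside it.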


Read also

We investigate predictive density estimation under the $L^2$ Wasserstein loss for location families and location-scale families. We show that plug-in densities form a complete class and that the Bayesian predictive density is given by the plug-in density with the posterior mean of the location and scale parameters. We provide Bayesian predictive densities that dominate the best equivariant one in normal models.
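In symbols (our notation, paraphrasing the stated result rather than quoting it): for a location-scale family with parameters $(\mu, \sigma)$, the Bayes predictive density under $L^2$ Wasserstein loss is the plug-in density evaluated at the posterior means,

$$\hat p_\pi(y \mid x) = p\left(y \mid \hat\mu_\pi(x), \hat\sigma_\pi(x)\right), \qquad \hat\mu_\pi(x) = \mathbb{E}[\mu \mid x], \quad \hat\sigma_\pi(x) = \mathbb{E}[\sigma \mid x].$$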
Xinyi Xu, Feng Liang (2010)
We consider the problem of estimating the predictive density of future observations from a non-parametric regression model. The density estimators are evaluated under Kullback--Leibler divergence and our focus is on establishing the exact asymptotics of minimax risk in the case of Gaussian errors. We derive the convergence rate and constant for minimax risk among Bayesian predictive densities under Gaussian priors and we show that this minimax risk is asymptotically equivalent to that among all density estimators.
We present a new adaptive kernel density estimator based on linear diffusion processes. The proposed estimator builds on existing ideas for adaptive smoothing by incorporating information from a pilot density estimate. In addition, we propose a new plug-in bandwidth selection method that is free from the arbitrary normal reference rules used by existing methods. We present simulation examples in which the proposed approach outperforms existing methods in terms of accuracy and reliability.
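The pilot-based idea can be illustrated by the classical Abramson square-root rule: estimate the density once with a fixed bandwidth, then shrink the bandwidth where that pilot estimate is high. The sketch below is our illustration of that general scheme in plain NumPy, not the diffusion estimator proposed in the paper; all names and constants are ours.

import numpy as np

def gaussian_kde(x_eval, data, bandwidths):
    # Gaussian KDE allowing one bandwidth per data point.
    # Rows index evaluation points, columns index data points.
    u = (x_eval[:, None] - data[None, :]) / bandwidths[None, :]
    k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    return np.mean(k / bandwidths[None, :], axis=1)

def adaptive_kde(x_eval, data):
    n = data.size
    # Pilot stage: fixed bandwidth via Silverman's normal-reference rule.
    h0 = 1.06 * data.std(ddof=1) * n ** (-0.2)
    pilot = gaussian_kde(data, data, np.full(n, h0))
    # Adaptive stage: Abramson's square-root law narrows the bandwidth
    # where the pilot density is high and widens it in sparse regions.
    g = np.exp(np.mean(np.log(pilot)))   # geometric-mean normalizer
    local_h = h0 * np.sqrt(g / pilot)
    return gaussian_kde(x_eval, data, local_h)

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 0.3, 300), rng.normal(1.0, 1.0, 700)])
grid = np.linspace(-4.0, 5.0, 200)
fhat = adaptive_kde(grid, data)

The geometric-mean normalizer keeps the average local bandwidth comparable to the pilot bandwidth, so the adaptive stage redistributes smoothing rather than changing its overall amount.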
Daniel J. McDonald (2017)
This paper presents minimax rates for density estimation when the data dimension $d$ is allowed to grow with the number of observations $n$ rather than remaining fixed as in previous analyses. We prove a non-asymptotic lower bound which gives the worst-case rate over standard classes of smooth densities, and we show that kernel density estimators achieve this rate. We also give oracle choices for the bandwidth and derive the fastest rate $d$ can grow with $n$ to maintain estimation consistency.
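For reference, the classical fixed-dimension benchmark (a textbook fact, not a result of this paper) is the minimax rate over $\beta$-smooth densities,

$$\inf_{\hat f} \sup_{f \in \Sigma(\beta, L)} \mathbb{E} \|\hat f - f\|_2^2 \asymp n^{-2\beta/(2\beta + d)},$$

attained by a kernel estimator with bandwidth $h \asymp n^{-1/(2\beta + d)}$; the paper tracks how this picture changes when $d$ is allowed to grow with $n$.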
We study the estimation, in Lp-norm, of density functions defined on [0,1]^d. We construct a new family of kernel density estimators that do not suffer from the so-called boundary bias problem and we propose a data-driven procedure based on the Goldenshluger and Lepski approach that jointly selects a kernel and a bandwidth. We derive two estimators that satisfy oracle-type inequalities. They are also proved to be adaptive over a scale of anisotropic or isotropic Sobolev-Slobodetskii classes (which are particular cases of Besov or Sobolev classical classes). The main interest of the isotropic procedure is to obtain adaptive results without any restriction on the smoothness parameter.
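For intuition about the boundary-bias problem on [0,1], the classical reflection device is a useful baseline (a standard trick, not the kernel family constructed in the paper; all names are ours):

import numpy as np

def reflected_kde(x_eval, data, h):
    # Gaussian KDE on [0, 1] with boundary correction by reflection:
    # mirroring each datum about 0 and about 1 restores the kernel mass
    # that a plain KDE leaks outside the interval near the edges.
    augmented = np.concatenate([data, -data, 2.0 - data])
    u = (x_eval[:, None] - augmented[None, :]) / h
    k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    # Divide by len(data), not len(augmented): the reflections are copies.
    return k.sum(axis=1) / (len(data) * h)

rng = np.random.default_rng(1)
data = rng.beta(0.5, 2.0, size=1000)   # heavy mass near the boundary at 0
grid = np.linspace(0.0, 1.0, 101)
fhat = reflected_kde(grid, data, h=0.05)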