We consider the problem of estimating the predictive density of future observations from a non-parametric regression model. The density estimators are evaluated under Kullback--Leibler divergence and our focus is on establishing the exact asymptotics of minimax risk in the case of Gaussian errors. We derive the convergence rate and constant for minimax risk among Bayesian predictive densities under Gaussian priors and we show that this minimax risk is asymptotically equivalent to that among all density estimators.
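For reference, writing $f$ for the unknown regression function (our notation, not necessarily the paper's), the Kullback--Leibler risk of a predictive density estimator $\hat p(\cdot \mid X)$ takes the standard form
\[
R(f, \hat p) \;=\; E_f \int p(y \mid f)\, \log \frac{p(y \mid f)}{\hat p(y \mid X)}\, dy,
\]
where $p(\cdot \mid f)$ is the true density of the future observations and the expectation is taken over the observed data $X$.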
We study asymptotic minimax problems for estimating a $d$-dimensional regression parameter over spheres of growing dimension ($d \to \infty$). Assuming that the data follow a linear model with Gaussian predictors and errors, we show that ridge regression is asymptotically minimax and derive new closed-form expressions for its asymptotic risk under squared-error loss. The asymptotic risk of ridge regression is closely related to the Stieltjes transform of the Mar\v{c}enko--Pastur distribution and to the spectral distribution of the predictors from the linear model. We also propose adaptive ridge estimators, which adapt to the unknown radius of the sphere, and highlight connections with equivariant estimation. Our results are most relevant in asymptotic settings where the number of observations $n$ is proportional to the number of predictors, that is, $d/n \to \rho \in (0, \infty)$.
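A minimal simulation sketch of the setting above. The penalty choice $\lambda = d\sigma^2/(n r^2)$ below is an oracle-style assumption made here for illustration (it uses the sphere radius $r$ directly), not necessarily the paper's tuning; the adaptive estimators mentioned above dispense with knowledge of $r$.

import numpy as np

rng = np.random.default_rng(0)
n, d, sigma = 500, 250, 1.0          # d/n = rho = 0.5
r = 1.0                              # radius of the parameter sphere (assumed known here)

beta = rng.standard_normal(d)
beta *= r / np.linalg.norm(beta)     # place beta on the sphere of radius r
X = rng.standard_normal((n, d))      # Gaussian predictors
y = X @ beta + sigma * rng.standard_normal(n)

lam = d * sigma**2 / (n * r**2)      # oracle-style ridge penalty (illustrative assumption)
beta_hat = np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)
print("squared-error loss:", np.sum((beta_hat - beta) ** 2))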
This paper presents minimax rates for density estimation when the data dimension $d$ is allowed to grow with the number of observations $n$, rather than remaining fixed as in previous analyses. We prove a non-asymptotic lower bound that gives the worst-case rate over standard classes of smooth densities, and we show that kernel density estimators achieve this rate. We also give oracle choices for the bandwidth and derive the fastest rate at which $d$ can grow with $n$ while maintaining estimation consistency.
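For orientation, in the classical fixed-$d$ theory the minimax rate over densities of smoothness $\beta$ (our notation) is
\[
n^{-\beta/(2\beta + d)},
\]
attained by a kernel estimator with bandwidth $h \asymp n^{-1/(2\beta + d)}$; the growing-$d$ analysis makes the dependence of such bounds on $d$ explicit and non-asymptotic.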
In this paper, we consider a macro approximation of the flow of a risk reserve, where the process is observed at discrete time points. Because we cannot directly observe each jump time and size, we make use of a technique for identifying the times at which jumps larger than a suitably defined threshold occur. We then estimate the jump sizes and the survival probability of the risk process from the discrete observations.
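A minimal sketch of the thresholding idea, under illustrative assumptions not stated in the abstract: the reserve is premium income at rate $c$ plus a small Brownian perturbation minus compound-Poisson claims, observed on a grid of step $\Delta$, and an increment is declared to contain a jump when it deviates from the drift by more than $c_0 \Delta^{\varpi}$ with $\varpi \in (0, 1/2)$, in the spirit of standard threshold estimators for jump processes.

import numpy as np

rng = np.random.default_rng(1)
T, delta = 100.0, 0.01
m = int(T / delta)
t = np.arange(1, m + 1) * delta

# Illustrative reserve: premiums at rate c_prem, Brownian perturbation sigma_w*W,
# minus compound-Poisson claims; observed only on the grid t_k = k * delta.
c_prem, sigma_w, lam_claims, mean_claim = 1.5, 0.3, 1.0, 1.0
claim_times = np.sort(rng.uniform(0, T, rng.poisson(lam_claims * T)))
claim_sizes = rng.exponential(mean_claim, claim_times.size)
W = np.cumsum(rng.standard_normal(m)) * np.sqrt(delta)
claims_cum = np.array([claim_sizes[claim_times <= s].sum() for s in t])
R = c_prem * t + sigma_w * W - claims_cum

# Threshold rule: an increment deviating from the drift by more than
# c0 * delta**varpi is declared to contain a jump (a claim).
incr = np.diff(np.concatenate([[0.0], R])) - c_prem * delta
varpi, c0 = 0.49, 1.0
is_jump = np.abs(incr) > c0 * delta**varpi
jump_sizes_hat = -incr[is_jump]
print(is_jump.sum(), "jumps flagged out of", claim_times.size, "claims;",
      "mean estimated size:", jump_sizes_hat.mean().round(3))

Small claims below the threshold are missed, which is the usual bias of such detectors; the estimated sizes also absorb the Brownian increment over the flagged interval.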
Let $X \mid \mu \sim N_p(\mu, v_x I)$ and $Y \mid \mu \sim N_p(\mu, v_y I)$ be independent $p$-dimensional multivariate normal vectors with common unknown mean $\mu$. Based on observing $X = x$, we consider the problem of estimating the true predictive density $p(y \mid \mu)$ of $Y$ under expected Kullback--Leibler loss. Our focus here is the characterization of admissible procedures for this problem. We show that the class of all generalized Bayes rules is a complete class, and that the easily interpretable conditions of Brown and Hwang [Statistical Decision Theory and Related Topics (1982) III 205--230] are sufficient for a formal Bayes rule to be admissible.
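In standard notation, the (generalized) Bayes rule associated with a prior $\pi$ on $\mu$ is the posterior-averaged density
\[
\hat p_{\pi}(y \mid x) \;=\; \frac{\int p(y \mid \mu)\, p(x \mid \mu)\, \pi(\mu)\, d\mu}{\int p(x \mid \mu)\, \pi(\mu)\, d\mu},
\]
so the complete-class result says that the search for admissible procedures can be confined to rules of this form.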
We address the problem of adaptive minimax density estimation on $\mathbb{R}^d$ with $\mathbb{L}_p$-loss on the anisotropic Nikolskii classes. We fully characterize the behavior of the minimax risk for different relationships between the regularity parameters and the norm indexes in the definitions of the functional class and of the risk. In particular, we show that there are four different regimes with respect to the behavior of the minimax risk. We develop a single estimator which is (nearly) optimal in order over the complete scale of the anisotropic Nikolskii classes. Our estimation procedure is based on a data-driven selection of an estimator from a fixed family of kernel estimators.
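In standard (not necessarily the paper's) notation, such a family consists of product-kernel estimators indexed by a bandwidth vector $h = (h_1, \dots, h_d)$,
\[
\hat f_h(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} \prod_{j=1}^{d} \frac{1}{h_j}\, K\!\left(\frac{x_j - X_{i,j}}{h_j}\right),
\]
with one bandwidth per coordinate to match the anisotropy of the class; the data-driven step then selects $h$ by comparing estimators across the bandwidth grid, in the spirit of Lepski-type selection rules.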