
Shrinkage estimation with a matrix loss function

Added by John Kent
Publication date: 2011
Language: English





Consider estimating the n by p matrix of means of an n by p matrix of independent normally distributed observations with constant variance, where the performance of an estimator is judged using a p by p matrix quadratic error loss function. A matrix version of the James-Stein estimator is proposed, depending on a tuning constant. It is shown to dominate the usual maximum likelihood estimator for some choices of the tuning constant when n is greater than or equal to 3. This result also extends to other shrinkage estimators and settings.
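The abstract does not reproduce the estimator itself, so the following is a minimal simulation sketch of a James-Stein-type matrix shrinkage rule: it shrinks the observation matrix toward the zero matrix using the classical vector James-Stein constant applied to the vectorized matrix. The functional form, the tuning constant, and the use of the scalar Frobenius loss (rather than the paper's p by p matrix quadratic loss) are illustrative assumptions, not the paper's construction.

    import numpy as np

    rng = np.random.default_rng(0)

    def matrix_js(X, c, sigma2=1.0):
        # James-Stein-type shrinkage of an n x p matrix toward zero;
        # the multiplicative form and the squared Frobenius norm are
        # illustrative assumptions, not the estimator from the paper.
        shrink = 1.0 - c * sigma2 / np.sum(X ** 2)
        return max(shrink, 0.0) * X          # positive-part variant

    # Simulated risk comparison with the MLE (the observation X itself)
    # at one particular matrix of means Theta.
    n, p, reps = 10, 4, 2000
    Theta = rng.normal(scale=0.5, size=(n, p))   # true matrix of means
    c = n * p - 2                                # classical vector JS constant (assumed)
    mle_loss = js_loss = 0.0
    for _ in range(reps):
        X = Theta + rng.normal(size=(n, p))      # unit-variance noise
        mle_loss += np.sum((X - Theta) ** 2)
        js_loss += np.sum((matrix_js(X, c) - Theta) ** 2)
    print("avg MLE loss:", mle_loss / reps, " avg shrinkage loss:", js_loss / reps)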



Related research

We consider the problem of estimating the mean vector $\theta$ of a $d$-dimensional spherically symmetric distributed $X$, based on balanced loss functions of the forms: (i) $\omega \rho(\|\delta-\delta_{0}\|^{2}) + (1-\omega)\rho(\|\delta - \theta\|^{2})$ and (ii) $\ell\left(\omega \|\delta - \delta_{0}\|^{2} + (1-\omega)\|\delta - \theta\|^{2}\right)$, where $\delta_0$ is a target estimator, and where $\rho$ and $\ell$ are increasing and concave functions. For $d \geq 4$ and the target estimator $\delta_0(X)=X$, we provide Baranchik-type estimators that dominate $\delta_0(X)=X$ and are minimax. The findings extend those of Marchand & Strawderman (2020) in two directions: (a) from scale mixtures of normals to the spherical class of distributions with Lebesgue densities, and (b) from completely monotone to concave $\rho$ and $\ell$.
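For reference, Baranchik-type estimators of the kind referred to above are generally of the form
$$\delta_{a,r}(X) = \left(1 - \frac{a\, r(\|X\|^{2})}{\|X\|^{2}}\right) X,$$
with $r$ nondecreasing and suitably bounded; the precise conditions on $a$ and $r$ required for dominance under the balanced losses (i) and (ii) are those established in the paper and are not restated here.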
Let $\pi_1$ and $\pi_2$ be two independent populations, where population $\pi_i$ follows a bivariate normal distribution with unknown mean vector $\boldsymbol{\theta}^{(i)}$ and common known variance-covariance matrix $\Sigma$, $i=1,2$. The present paper focuses on estimating a characteristic $\theta_{\text{y}}^S$ of the selected bivariate normal population, using a LINEX loss function. A natural selection rule is used to select the best bivariate normal population. Some natural-type estimators and a Bayes estimator (using a conjugate prior) of $\theta_{\text{y}}^S$ are presented. An admissible subclass of equivariant estimators, under the LINEX loss function, is obtained. Further, a sufficient condition for improving upon the competing estimators of $\theta_{\text{y}}^S$ is derived. Using this sufficient condition, several estimators improving upon the proposed natural estimators are obtained. A real data example is provided for illustration. Finally, a comparative simulation study of the competing estimators of $\theta_{\text{y}}^S$ is carried out.
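For reference, the LINEX (linear-exponential) loss used above is commonly written, for estimation error $\Delta = \delta - \theta$, as
$$L(\Delta) = b\left(e^{a\Delta} - a\Delta - 1\right), \qquad a \neq 0,\; b > 0,$$
which penalizes overestimation and underestimation asymmetrically according to the sign of $a$; the specific parametrization adopted in the paper may differ.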
Data in non-Euclidean spaces are commonly encountered in many fields of science and engineering. For instance, in robotics, attitude sensors capture orientation, which is an element of a Lie group. In the recent past, several researchers have reported methods that take into account the geometry of Lie groups when designing parameter estimation algorithms in nonlinear spaces. Maximum likelihood estimators (MLE) are quite commonly used for such tasks, and it is well known in statistics that Stein's shrinkage estimators dominate the MLE in a mean-squared sense when the observations come from a normal population. In this paper, we present a novel shrinkage estimator for data residing in Lie groups, specifically abelian or compact Lie groups. The key theoretical results presented in this paper are: (i) Stein's lemma and its proof for Lie groups, and (ii) a proof of dominance of the proposed shrinkage estimator over the MLE for abelian and compact Lie groups. We present simulation studies of the dominance of the proposed shrinkage estimator and an application of shrinkage estimation to multiple-robot localization.
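As a point of reference, the classical Euclidean Stein's lemma that the paper generalizes states that if $X \sim N(\theta, \sigma^{2})$ and $g$ is absolutely continuous with $E|g'(X)| < \infty$, then
$$E\left[g(X)\,(X - \theta)\right] = \sigma^{2}\, E\left[g'(X)\right].$$
The Lie-group version proved in the paper replaces this Euclidean identity with one adapted to the group structure and is not reproduced here.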
Bayesian methods are developed for the multivariate nonparametric regression problem where the domain is taken to be a compact Riemannian manifold. The underlying geometry of the manifold induces certain symmetries on the multivariate nonparametric regression function. The Bayesian approach then allows one to incorporate hierarchical Bayesian methods directly into the spectral structure, thus providing a symmetry-adaptive multivariate Bayesian function estimator. One can also diffuse away some of the prior information, the limiting case being a smoothing spline on the manifold. This, together with the result that the smoothing spline solution attains the minimax rate of convergence in the multivariate nonparametric regression problem, provides good frequentist properties for the Bayes estimators. An application to astronomy is included.
Alain Celisse (2014)
We analyze the performance of cross-validation (CV) in the density estimation framework with two purposes: (i) risk estimation and (ii) model selection. The main focus is on the so-called leave-$p$-out CV procedure (Lpo), where $p$ denotes the cardinality of the test set. Closed-form expressions are derived for the Lpo estimator of the risk of projection estimators. These expressions provide a great improvement upon $V$-fold cross-validation in terms of variability and computational complexity. From a theoretical point of view, the closed-form expressions also make it possible to study the Lpo performance in terms of risk estimation. Leave-one-out (Loo), that is, Lpo with $p=1$, is proved to be optimal among CV procedures used for risk estimation. Two model selection frameworks are also considered: estimation, as opposed to identification. For estimation with finite sample size $n$, optimality is achieved for $p$ large enough [with $p/n=o(1)$] to balance the overfitting resulting from the structure of the model collection. For identification, model selection consistency is established for Lpo as long as $p/n$ is suitably related to the rate of convergence of the best estimator in the collection: (i) $p/n \to 1$ as $n \to +\infty$ with a parametric rate, and (ii) $p/n = o(1)$ with some nonparametric estimators. These theoretical results are validated by simulation experiments.
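To make the computational point concrete, the sketch below estimates the quadratic risk of a histogram (projection) density estimator by brute-force leave-$p$-out enumeration; the histogram choice, the tiny sample size, and all names are illustrative, and enumerating all $\binom{n}{p}$ test sets is exactly the cost that the closed-form Lpo expressions in the paper avoid.

    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(1)

    def hist_density(train, edges):
        # Histogram density estimate (a simple projection estimator).
        counts, _ = np.histogram(train, bins=edges)
        return counts / (len(train) * np.diff(edges))

    def lpo_risk(x, p, edges):
        # Brute-force leave-p-out estimate of the quadratic risk
        # integral(f_hat^2) - 2 E[f_hat(X)], averaged over all C(n, p)
        # test sets -- illustration only; the paper's closed-form
        # expressions make this enumeration unnecessary.
        n, widths = len(x), np.diff(edges)
        total, count = 0.0, 0
        for test_idx in combinations(range(n), p):
            test = x[list(test_idx)]
            train = np.delete(x, list(test_idx))
            f_hat = hist_density(train, edges)
            quad = np.sum(f_hat ** 2 * widths)        # integral of f_hat^2
            bins = np.clip(np.searchsorted(edges, test, side="right") - 1,
                           0, len(edges) - 2)
            total += quad - 2.0 * f_hat[bins].mean()
            count += 1
        return total / count

    x = rng.normal(size=12)             # enumeration cost grows as C(n, p)
    edges = np.linspace(-4.0, 4.0, 9)
    print(lpo_risk(x, p=2, edges=edges))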