
On the estimation of the mean of a random vector

 Added by Emilien Joly
 Publication date 2016
 Language English
 Authors Emilien Joly





We study the problem of estimating the mean of a multivariate distribution based on independent samples. The main result is the proof of existence of an estimator with a non-asymptotic sub-Gaussian performance for all distributions satisfying some mild moment assumptions.
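A standard route to non-asymptotic sub-Gaussian mean estimation under weak moment assumptions is a median-of-means construction: split the sample into blocks, average within each block, and combine the block means robustly. The Python sketch below uses a geometric median of the block means computed by Weiszfeld iterations; the function name and this particular combination step are illustrative assumptions, not necessarily the estimator analysed in the paper.

import numpy as np

def median_of_means(X, k, n_iter=100, tol=1e-8):
    # Median-of-means style estimator of a d-dimensional mean.
    # X: (n, d) array of samples; k: number of blocks (k <= n).
    X = np.asarray(X, dtype=float)
    n, _ = X.shape
    blocks = np.array_split(np.random.permutation(n), k)
    means = np.stack([X[b].mean(axis=0) for b in blocks])  # (k, d) block means

    # Geometric median of the block means via Weiszfeld iterations.
    mu = means.mean(axis=0)
    for _ in range(n_iter):
        dist = np.maximum(np.linalg.norm(means - mu, axis=1), 1e-12)
        w = 1.0 / dist
        new_mu = (w[:, None] * means).sum(axis=0) / w.sum()
        if np.linalg.norm(new_mu - mu) < tol:
            return new_mu
        mu = new_mu
    return mu

In constructions of this kind the number of blocks k is typically tied to the target confidence level, on the order of log(1/delta) blocks for a failure probability delta.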



Related research


144 - Mohammad Arashi 2012
In this paper, we discuss a class of Baranchik-type shrinkage estimators of the vector parameter in a location model, with errors belonging to a sub-class of elliptically contoured distributions. We derive conditions, in the Schwartz space setting, under which this class of shrinkage estimators outperforms the sample mean. Sufficient conditions for the dominant class to outperform the usual James-Stein estimator are also established. It is shown that the dominance properties of the class of estimators are robust with respect to departures from normality.
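For reference, the sketch below gives a positive-part James-Stein estimator of a location vector in Python; Baranchik-type estimators allow a more general data-dependent shrinkage factor r(||xbar||^2) in place of the fixed James-Stein constant. The function name, the known error variance sigma2, and the spherical-normal setting are illustrative assumptions; the elliptically contoured error model of the paper is not reproduced.

import numpy as np

def james_stein_mean(X, sigma2=1.0):
    # Positive-part James-Stein shrinkage of the sample mean toward the origin.
    # X: (n, d) array with d >= 3; sigma2: assumed known error variance.
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    xbar = X.mean(axis=0)
    norm2 = float(np.dot(xbar, xbar))
    shrink = max(1.0 - (d - 2) * sigma2 / (n * norm2), 0.0)
    return shrink * xbar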
122 - W. J. Hall, Jon A. Wellner 2017
Yang (1978) considered an empirical estimate of the mean residual life function on a fixed finite interval. She proved it to be strongly uniformly consistent and (when appropriately standardized) weakly convergent to a Gaussian process. These results are extended to the whole half line, and the variance of the limiting process is studied. Also, nonparametric simultaneous confidence bands for the mean residual life function are obtained by transforming the limiting process to Brownian motion.
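The empirical mean residual life estimate referred to here averages the remaining lifetimes X_i - t over the observations exceeding t. A minimal Python sketch for uncensored data follows; the function name is illustrative, and the Gaussian-process limit theory and simultaneous confidence bands studied in the paper are not reproduced.

import numpy as np

def empirical_mrl(x, t_grid):
    # Empirical mean residual life m(t) = E[X - t | X > t] on a grid of t values.
    x = np.asarray(x, dtype=float)
    out = np.full(len(t_grid), np.nan)   # NaN where no observation exceeds t
    for j, t in enumerate(t_grid):
        tail = x[x > t]
        if tail.size:
            out[j] = (tail - t).mean()
    return out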
Data in non-Euclidean spaces are commonly encountered in many fields of Science and Engineering. For instance, in Robotics, attitude sensors capture orientation, which is an element of a Lie group. In the recent past, several researchers have reported methods that take into account the geometry of Lie groups in designing parameter estimation algorithms in nonlinear spaces. Maximum likelihood estimators (MLE) are quite commonly used for such tasks, and it is well known in the field of statistics that Stein's shrinkage estimators dominate the MLE in a mean-squared sense assuming the observations are from a normal population. In this paper, we present a novel shrinkage estimator for data residing in Lie groups, specifically, abelian or compact Lie groups. The key theoretical results presented in this paper are: (i) Stein's Lemma and its proof for Lie groups and, (ii) proof of dominance of the proposed shrinkage estimator over the MLE for abelian and compact Lie groups. We present examples of simulation studies of the dominance of the proposed shrinkage estimator and an application of shrinkage estimation to multiple-robot localization.
We consider high-dimensional multivariate linear regression models, where the joint distribution of covariates and response variables is a multivariate normal distribution with a bandable covariance matrix. The main goal of this paper is to estimate the regression coefficient matrix, which is a function of the bandable covariance matrix. Although the tapering estimator of covariance has the minimax optimal convergence rate for the class of bandable covariances, we show that it has a sub-optimal convergence rate for the regression coefficient; that is, a minimax estimator for the class of bandable covariances may not be a minimax estimator for its functionals. We propose the blockwise tapering estimator of the regression coefficient, which has the minimax optimal convergence rate for the regression coefficient under the bandable covariance assumption. We also propose a Bayesian procedure called the blockwise tapering post-processed posterior of the regression coefficient and show that the proposed Bayesian procedure has the minimax optimal convergence rate for the regression coefficient under the bandable covariance assumption. We show that the proposed methods outperform the existing methods via numerical studies.
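For context, the classical tapering estimator of a bandable covariance multiplies the sample covariance elementwise by weights equal to 1 within half the bandwidth and decaying linearly to 0 at the bandwidth. The Python sketch below implements only that covariance step; the blockwise tapering estimator of the regression coefficient proposed in the paper is not reproduced, and the function name is an assumption.

import numpy as np

def tapering_covariance(X, k):
    # Tapering estimator of a p x p bandable covariance matrix.
    # X: (n, p) data matrix; k: tapering bandwidth (ideally even).
    X = np.asarray(X, dtype=float)
    p = X.shape[1]
    S = np.cov(X, rowvar=False)                      # sample covariance
    idx = np.arange(p)
    dist = np.abs(idx[:, None] - idx[None, :])       # |i - j| for each entry
    W = np.clip((k - dist) / (k / 2.0), 0.0, 1.0)    # 1 if |i-j| <= k/2, 0 if >= k
    return W * S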
We study the least squares estimator in the residual variance estimation context. We show that the mean squared differences of paired observations are asymptotically normally distributed. We further establish that, by regressing the mean squared differences of these paired observations on the squared distances between paired covariates via a simple least squares procedure, the resulting variance estimator is not only asymptotically normal and root-$n$ consistent, but also reaches the optimal bound in terms of estimation variance. We also demonstrate the advantage of the least squares estimator in comparison with existing methods in terms of the second order asymptotic properties.
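The paired-difference idea can be sketched as follows: E[(y_i - y_j)^2]/2 equals the residual variance plus a term governed by how far apart the covariates are, so regressing half squared response differences on squared covariate distances and reading off the intercept gives a variance estimate. The Python code below is a minimal sketch under that reading of the abstract; the pairing scheme, weighting, and efficiency arguments of the paper are not reproduced, and the function name is an assumption.

import numpy as np

def ls_residual_variance(x, y, pairs=None):
    # Estimate the residual variance by regressing (y_i - y_j)^2 / 2
    # on ||x_i - x_j||^2 over pairs (i, j) and returning the intercept.
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    y = np.asarray(y, dtype=float)
    n = len(y)
    ii, jj = pairs if pairs is not None else np.triu_indices(n, k=1)
    d2 = np.sum((x[ii] - x[jj]) ** 2, axis=1)        # squared covariate distances
    s = 0.5 * (y[ii] - y[jj]) ** 2                   # half squared response differences
    A = np.column_stack([np.ones_like(d2), d2])      # design: intercept + distance
    coef, *_ = np.linalg.lstsq(A, s, rcond=None)
    return coef[0]                                   # intercept ~ residual variance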
