Analytic Bias Reduction for $k$-Sample Functionals


Abstract

We give analytic methods for nonparametric bias reduction that remove the need for computationally intensive methods like the bootstrap and the jackknife. We call an estimate {\it $p$th order} if its bias has magnitude $n_0^{-p}$ as $n_0 \to \infty$, where $n_0$ is the sample size (or the minimum sample size if the estimate is a function of more than one sample). Most estimates are only first order and require $O(N)$ calculations, where $N$ is the total sample size. The usual bootstrap and jackknife estimates are second order, but they are computationally intensive, requiring $O(N^2)$ calculations for one sample. By contrast, Jaeckel's infinitesimal jackknife is an analytic second order one-sample estimate requiring only $O(N)$ calculations. When $p$th order bootstrap and jackknife estimates are available, they require $O(N^p)$ calculations, and so become even more computationally intensive if one chooses $p > 2$. For general $p$ we provide analytic $p$th order nonparametric estimates that require only $O(N)$ calculations. Our estimates are given in terms of the von Mises derivatives of the functional being estimated, evaluated at the empirical distribution. For products of moments an unbiased estimate exists: our form for this polykay is much simpler than the usual form in terms of power sums.
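As a minimal sketch (not the paper's construction), the following Python compares the three kinds of estimate for the classical variance functional $T(F) = \mathrm{Var}_F(X)$: the first order plug-in estimate, the $O(N^2)$ jackknife correction, and an $O(N)$ analytic second order correction built from the standard second von Mises derivative $T_2(x, y; F) = -2(x - \mu)(y - \mu)$ evaluated at the empirical distribution. The function names and the simulation setup are ours, for illustration only.

```python
# Sketch comparing bias orders for the variance functional T(F) = Var_F(X).
# The plug-in estimate has bias -sigma^2/n (first order); the jackknife needs
# O(n^2) work per sample; the analytic correction uses the second von Mises
# derivative T_2(x, y; F) = -2(x - mu)(y - mu) at F_hat, in O(n) work.
import numpy as np

def plugin_var(x):
    """First order plug-in estimate T(F_hat): bias of order n^{-1}."""
    return np.mean((x - np.mean(x)) ** 2)

def jackknife_var(x):
    """Second order jackknife estimate: n*T - (n-1)*(mean leave-one-out T)."""
    n = len(x)
    loo = [plugin_var(np.delete(x, i)) for i in range(n)]   # O(n^2) work
    return n * plugin_var(x) - (n - 1) * np.mean(loo)

def analytic_var(x):
    """Second order analytic estimate: subtract the estimated leading bias
    term (1/(2n)) * mean_i T_2(x_i, x_i; F_hat), using only O(n) work."""
    n = len(x)
    t = plugin_var(x)
    t2_diag = -2.0 * (x - np.mean(x)) ** 2    # T_2(x_i, x_i) at F_hat
    return t - np.mean(t2_diag) / (2 * n)     # simplifies to (1 + 1/n) * t

# Monte Carlo check of the bias orders with true sigma^2 = 1.
rng = np.random.default_rng(0)
n, reps = 20, 50_000
sums = np.zeros(3)
for _ in range(reps):
    x = rng.normal(size=n)
    sums += [plugin_var(x), jackknife_var(x), analytic_var(x)]
print("estimated bias (plug-in, jackknife, analytic):", sums / reps - 1.0)
# Expected roughly: -1/n = -0.05, ~0 (exactly unbiased here), -1/n^2 = -0.0025.
```

For this particular functional the analytic correction simplifies to $(1 + 1/n)$ times the plug-in estimate, whose bias is $-\sigma^2/n^2$, i.e. second order, while the jackknife happens to reproduce the exactly unbiased sample variance; both corrections remove the $O(n^{-1})$ bias term, but the analytic one does so with a single pass over the data.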
