Needles and straw in haystacks: Empirical Bayes estimates of possibly sparse sequences


Abstract

An empirical Bayes approach to the estimation of possibly sparse sequences observed in Gaussian white noise is set out and investigated. The prior considered is a mixture of an atom of probability at zero and a heavy-tailed density $\gamma$, with the mixing weight chosen by marginal maximum likelihood, in the hope of adapting between sparse and dense sequences. If estimation is then carried out using the posterior median, this is a random thresholding procedure. Other thresholding rules employing the same threshold can also be used. Probability bounds on the threshold chosen by the marginal maximum likelihood approach lead to overall risk bounds over classes of signal sequences of length $n$, allowing for sparsity of various kinds and degrees. The signal classes considered are "nearly black" sequences, where only a proportion $\eta$ is allowed to be nonzero, and sequences with normalized $\ell_p$ norm bounded by $\eta$, for $\eta > 0$ and $0 < p \le 2$. Estimation error is measured by mean $q$th power loss, for $0 < q \le 2$. For all the classes considered, and for all $q \in (0, 2]$, the method achieves the optimal estimation rate as $n \to \infty$ and $\eta \to 0$ at various rates, and in this sense adapts automatically to the sparseness or otherwise of the underlying signal. In addition, the risk is uniformly bounded over all signals. If the posterior mean is used as the estimator, the results still hold for $q > 1$. Simulations show excellent performance.
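
The following is a minimal Python sketch of the idea described above, not the authors' implementation: it uses a spike-and-slab prior whose heavy-tailed slab is taken to be a Laplace (double-exponential) density standing in for $\gamma$, estimates the mixing weight $w$ by marginal maximum likelihood, and then zeroes out coordinates whose posterior probability of being nonzero falls below one half, a crude surrogate for the posterior-median thresholding rule analysed in the paper. The scale $a = 0.5$, the optimizer settings, and the toy signal are illustrative assumptions, not taken from the paper.

```python
# Sketch of marginal-MLE spike-and-slab thresholding (assumptions noted above).
import numpy as np
from scipy.special import log_ndtr          # log Phi, stable far in the tails
from scipy.optimize import minimize_scalar
from scipy.stats import norm

A = 0.5  # assumed Laplace scale for the heavy-tailed slab

def log_conv_laplace_normal(x, a=A):
    """log g(x), where g = Laplace(a) convolved with N(0, 1): the marginal
    density of an observation whose mean is drawn from the slab."""
    t1 = 0.5 * a**2 - a * x + log_ndtr(x - a)      # contribution from mu > 0
    t2 = 0.5 * a**2 + a * x + log_ndtr(-(x + a))   # contribution from mu < 0
    return np.log(a / 2.0) + np.logaddexp(t1, t2)

def neg_marginal_loglik(w, x):
    """Negative log marginal likelihood of the mixing weight w given data x."""
    log_spike = norm.logpdf(x)                     # atom at zero: pure-noise density
    log_slab = log_conv_laplace_normal(x)
    return -np.sum(np.logaddexp(np.log1p(-w) + log_spike, np.log(w) + log_slab))

def ebayes_threshold_estimate(x):
    """Estimate w by marginal MLE, then keep x_i only when the posterior
    probability that mu_i is nonzero exceeds 1/2 (a simple stand-in for the
    posterior-median rule)."""
    res = minimize_scalar(neg_marginal_loglik, bounds=(1e-6, 1 - 1e-6),
                          args=(x,), method="bounded")
    w_hat = res.x
    log_spike = norm.logpdf(x)
    log_slab = log_conv_laplace_normal(x)
    post_nonzero = 1.0 / (1.0 + np.exp(np.log1p(-w_hat) - np.log(w_hat)
                                       + log_spike - log_slab))
    return np.where(post_nonzero > 0.5, x, 0.0), w_hat

# Toy example: a sparse signal of length n = 1000 with 20 "needles" of height 5.
rng = np.random.default_rng(0)
n, k = 1000, 20
mu = np.zeros(n)
mu[:k] = 5.0
x = mu + rng.standard_normal(n)
mu_hat, w_hat = ebayes_threshold_estimate(x)
print(f"estimated mixing weight: {w_hat:.4f}, nonzero estimates: {(mu_hat != 0).sum()}")
```

On sparse input the estimated weight $\hat w$ is small and the implied threshold is high, while on dense input $\hat w$ grows and the threshold falls, which is the adaptation between sparse and dense sequences that the abstract describes.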
