Non-Asymptotic Bounds for the $\ell_{\infty}$ Estimator in Linear Regression with Uniform Noise


Abstract

The Chebyshev or $\ell_{\infty}$ estimator is an unconventional alternative to ordinary least squares for fitting linear regression models. It is defined as the minimizer of the $\ell_{\infty}$ objective function
\begin{align*}
\hat{\boldsymbol{\beta}} := \operatorname*{arg\,min}_{\boldsymbol{\beta}} \|\boldsymbol{Y} - \mathbf{X}\boldsymbol{\beta}\|_{\infty}.
\end{align*}
The asymptotic distribution of the Chebyshev estimator with a fixed number of covariates was recently studied (Knight, 2020), yet finite-sample guarantees and generalizations to high-dimensional settings remain open. In this paper, we develop non-asymptotic upper bounds on the estimation error $\|\hat{\boldsymbol{\beta}}-\boldsymbol{\beta}^*\|_2$ of a Chebyshev estimator $\hat{\boldsymbol{\beta}}$ in a regression setting with uniformly distributed noise $\varepsilon_i \sim U([-a,a])$, where $a$ is either known or unknown. Under relatively mild assumptions on the (random) design matrix $\mathbf{X}$, we bound the error rate by $\frac{C_p}{n}$ with high probability, for a constant $C_p$ depending on the dimension $p$ and the law of the design. Furthermore, we illustrate that there exist designs for which the Chebyshev estimator is (nearly) minimax optimal. In addition, we show that Chebyshev's LASSO has advantages over the regular LASSO in high-dimensional settings, provided that the noise is uniform. Specifically, we argue that it achieves a much faster rate of estimation under certain assumptions on the growth rates of the sparsity level and the ambient dimension with respect to the sample size.
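The defining $\ell_{\infty}$ minimization above can be recast as a linear program: minimize a scalar $t$ subject to $-t \le Y_i - \mathbf{x}_i^\top\boldsymbol{\beta} \le t$ for all $i$. The following is a minimal illustrative sketch (not code from the paper) of that reformulation using SciPy's `linprog`; the function name `chebyshev_estimator` and the simulated design are assumptions chosen for the example.

```python
import numpy as np
from scipy.optimize import linprog

def chebyshev_estimator(X, Y):
    """Compute the l_inf (Chebyshev) regression estimator via a linear program.

    Solves  min_{beta, t} t  subject to  -t <= Y - X beta <= t  (elementwise),
    which is equivalent to  min_beta ||Y - X beta||_inf.
    """
    n, p = X.shape
    # Decision vector z = (beta_1, ..., beta_p, t); the objective is just t.
    c = np.zeros(p + 1)
    c[-1] = 1.0
    ones = np.ones((n, 1))
    # Constraints:  X beta - t <= Y   and   -X beta - t <= -Y
    A_ub = np.vstack([np.hstack([X, -ones]),
                      np.hstack([-X, -ones])])
    b_ub = np.concatenate([Y, -Y])
    bounds = [(None, None)] * p + [(0, None)]  # beta free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]

# Toy illustration with uniform noise U([-a, a]) (hypothetical parameters):
rng = np.random.default_rng(0)
n, p, a = 500, 3, 1.0
X = rng.standard_normal((n, p))
beta_star = np.array([2.0, -1.0, 0.5])
Y = X @ beta_star + rng.uniform(-a, a, size=n)
print(chebyshev_estimator(X, Y))  # should be close to beta_star
```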
