Linear and Sublinear Time Spectral Density Estimation


Abstract

We analyze the popular kernel polynomial method (KPM) for approximating the spectral density (eigenvalue distribution) of an $n\times n$ Hermitian matrix $A$. We prove that a simple and practical variant of the KPM algorithm can approximate the spectral density to $\epsilon$ accuracy in the Wasserstein-1 distance with roughly $O(1/\epsilon)$ matrix-vector multiplications with $A$. This yields a provable linear time result for the problem with better $\epsilon$ dependence than prior work. The KPM variant we study is based on damped Chebyshev polynomial expansions. We show that it is stable, meaning that it can be combined with any approximate matrix-vector multiplication algorithm for $A$. As an application, we develop an $O(n\cdot \text{poly}(1/\epsilon))$ time algorithm for computing the spectral density of any $n\times n$ normalized graph adjacency or Laplacian matrix. This runtime is sublinear in the size of the matrix, and assumes sample access to the graph. Our approach leverages several tools from approximation theory, including Jackson's seminal work on approximation with positive kernels [Jackson, 1912], and stability properties of three-term recurrence relations for orthogonal polynomials.
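To make the ingredients concrete, the following is a minimal sketch of a KPM-style estimator with Jackson damping, not the paper's exact variant. It assumes a Hermitian matrix $A$ whose eigenvalues lie in $[-1, 1]$ (e.g. a normalized graph adjacency matrix), estimates Chebyshev moments with Hutchinson-style random sign vectors using only matrix-vector products, applies Jackson damping coefficients, and evaluates the damped expansion on a grid. All function and parameter names are illustrative.

```python
import numpy as np

def kpm_spectral_density(A, num_moments=50, num_vectors=10, grid_size=200, seed=None):
    """Approximate the spectral density of a Hermitian A with eigenvalues in [-1, 1]."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    moments = np.zeros(num_moments)

    for _ in range(num_vectors):
        # Random sign vector for Hutchinson-style stochastic trace estimation.
        v = rng.choice([-1.0, 1.0], size=n)
        # Three-term Chebyshev recurrence: T_0(A)v = v, T_1(A)v = Av.
        t_prev, t_curr = v, A @ v
        moments[0] += v @ t_prev
        if num_moments > 1:
            moments[1] += v @ t_curr
        for k in range(2, num_moments):
            t_next = 2.0 * (A @ t_curr) - t_prev
            moments[k] += v @ t_next
            t_prev, t_curr = t_curr, t_next
    # Normalize so moments[k] approximates (1/n) * tr(T_k(A)).
    moments /= (num_vectors * n)

    # Jackson damping coefficients suppress Gibbs oscillations in the truncated expansion.
    N = num_moments
    k = np.arange(N)
    jackson = ((N - k + 1) * np.cos(np.pi * k / (N + 1))
               + np.sin(np.pi * k / (N + 1)) / np.tan(np.pi / (N + 1))) / (N + 1)
    moments *= jackson

    # Evaluate the damped Chebyshev expansion of the density on Chebyshev nodes in (-1, 1).
    x = np.cos(np.pi * (np.arange(grid_size) + 0.5) / grid_size)
    T = np.cos(np.outer(np.arange(N), np.arccos(x)))   # T_k(x) for each node
    coeffs = moments.copy()
    coeffs[1:] *= 2.0
    density = (coeffs @ T) / (np.pi * np.sqrt(1.0 - x**2))
    return x, density
```

The number of matrix-vector products is `num_vectors * (num_moments - 1)`, so the accuracy/cost trade-off is governed by the expansion degree, consistent with the roughly $O(1/\epsilon)$ matrix-vector multiplication bound stated in the abstract.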
