On Dimension-free Tail Inequalities for Sums of Random Matrices and Applications


Abstract

In this paper, we present a new framework for obtaining tail inequalities for sums of random matrices. Compared with existing works, our tail inequalities have two characteristics: 1) high feasibility: they can be used to study the tail behavior of various matrix functions, e.g., arbitrary matrix norms and the absolute value of the sum of the $j$ largest singular values (resp. eigenvalues) of a complex matrix (resp. a Hermitian matrix); and 2) independence of matrix dimension: they contain no matrix-dimension term as a multiplicative factor, and thus are suitable for high-dimensional or infinite-dimensional random matrices. The price we pay for these advantages is that the convergence rate of the resulting inequalities becomes slow when the number of summand random matrices is large. We also develop tail inequalities for matrix random series and matrix martingale difference sequences, and we demonstrate the usefulness of our tail bounds in several fields. In compressed sensing, we employ the resulting tail inequalities to prove the restricted isometry property when the measurement matrix is a sum of random matrices, without any assumption on the distributions of the matrix entries. In probability theory, we derive a new upper bound on the supremum of stochastic processes. In machine learning, we prove new expectation bounds for sums of random matrices and obtain matrix approximation schemes via random sampling. In quantum information, we give a new analysis of the fractional cover number of quantum hypergraphs. In theoretical computer science, we obtain randomness-efficient samplers based on matrix expander graphs that can be implemented in time independent of the matrix dimensions.
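As a concrete illustration of the quantities such bounds control, the following Python sketch (not taken from the paper; the parameters $n$, $d$, $j$, the thresholds, and the Gaussian entry distribution are all illustrative assumptions) uses Monte Carlo simulation to estimate the tail probability $P(f(S) \geq t)$ for a sum $S = X_1 + \cdots + X_n$ of independent random matrices, where $f$ is either the spectral norm or the Ky Fan $j$-norm, i.e., the sum of the $j$ largest singular values mentioned above.

import numpy as np

rng = np.random.default_rng(0)

def ky_fan_norm(A, j):
    """Sum of the j largest singular values of A (the Ky Fan j-norm)."""
    s = np.linalg.svd(A, compute_uv=False)  # singular values in descending order
    return float(np.sum(s[:j]))

def tail_estimates(n=30, d=50, j=5, t_spec=11.0, t_kf=50.0, trials=500):
    """Estimate P(||S||_2 >= t_spec) and P(||S||_(j) >= t_kf) for
    S = X_1 + ... + X_n with i.i.d. Gaussian X_i (entry variance 1/d).
    All parameter values are illustrative, not from the paper."""
    hits_spec = hits_kf = 0
    for _ in range(trials):
        # Draw n independent d x d matrices with N(0, 1/d) entries and sum them.
        S = sum(rng.normal(scale=d ** -0.5, size=(d, d)) for _ in range(n))
        hits_spec += np.linalg.norm(S, 2) >= t_spec   # spectral-norm tail event
        hits_kf += ky_fan_norm(S, j) >= t_kf          # Ky Fan j-norm tail event
    return hits_spec / trials, hits_kf / trials

if __name__ == "__main__":
    p_spec, p_kf = tail_estimates()
    print(f"P(spectral norm >= t)  ~= {p_spec:.3f}")
    print(f"P(Ky Fan 5-norm >= t)  ~= {p_kf:.3f}")

A dimension-free tail inequality of the kind described in the abstract would bound such probabilities by an expression with no multiplicative dependence on $d$, which is what makes these empirical estimates a useful sanity check in the high-dimensional regime.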
