
On the Largest Singular Value/Eigenvalue of a Random Tensor

Added by Yuning Yang
Publication date: 2021
Language: English
Authors: Yuning Yang





This short note presents upper bounds of the expectations of the largest singular values/eigenvalues of various types of random tensors in the non-asymptotic sense. For a standard Gaussian tensor of size $n_1\times\cdots\times n_d$, it is shown that the expectation of its largest singular value is upper bounded by $\sqrt{n_1}+\cdots+\sqrt{n_d}$. For the expectation of the largest $\ell^d$-singular value, it is upper bounded by $2^{\frac{d-1}{2}}\prod_{j=1}^{d}n_j^{\frac{d-2}{2d}}\sum^d_{j=1}n_j^{\frac{1}{2}}$. We also derive upper bounds of the expectations of the largest Z-/H-($\ell^d$)/M-/C-eigenvalues of symmetric, partially symmetric, and piezoelectric-type Gaussian tensors, which are respectively upper bounded by $d\sqrt{n}$, $d\cdot 2^{\frac{d-1}{2}}n^{\frac{d-1}{2}}$, $2\sqrt{m}+2\sqrt{n}$, and $3\sqrt{n}$.
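The matrix case $d = 2$ of the first bound is the classical Gaussian result $\mathbb{E}\,\|G\| \le \sqrt{n_1} + \sqrt{n_2}$, which is easy to check numerically. The sketch below (a Monte Carlo illustration, not from the paper; the sizes and trial count are arbitrary choices) compares the empirical mean of the largest singular value of a standard Gaussian matrix against $\sqrt{m}+\sqrt{n}$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, trials = 40, 60, 200

# Empirical mean of the largest singular value of an m x n standard Gaussian matrix
vals = [np.linalg.svd(rng.standard_normal((m, n)), compute_uv=False)[0]
        for _ in range(trials)]
emp = float(np.mean(vals))
bound = np.sqrt(m) + np.sqrt(n)
print(emp, bound)  # empirical mean sits below sqrt(m) + sqrt(n)
```

For higher-order tensors the spectral norm is NP-hard to compute exactly, which is why the paper's bounds are stated for expectations rather than verified per instance.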



Related Research

Oleg Evnin, 2020
We consider a Gaussian rotationally invariant ensemble of random real totally symmetric tensors with independent normally distributed entries, and estimate the largest eigenvalue of a typical tensor in this ensemble by examining the rate of growth of a random initial vector under successive applications of a nonlinear map defined by the random tensor. In the limit of a large number of dimensions, we observe that a simple form of melonic dominance holds, and the quantity we study is effectively determined by a single Feynman diagram arising from the Gaussian average over the tensor components. This computation suggests that the largest tensor eigenvalue in our ensemble in the limit of a large number of dimensions is proportional to the square root of the number of dimensions, as it is for random real symmetric matrices.
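The nonlinear map in question sends a vector $x$ to the contraction $T(\cdot, x, \ldots, x)$, and iterating it (with normalization) is the tensor analogue of power iteration. A minimal sketch for a symmetric order-3 Gaussian tensor follows; note that, unlike the matrix case, plain tensor power iteration may only locate a local eigenpair, so the value below is an estimate rather than the guaranteed largest eigenvalue, and the size $n = 30$ is an arbitrary illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
# Symmetrize an i.i.d. Gaussian order-3 tensor over all 6 index permutations
T = rng.standard_normal((n, n, n))
T = (T + T.transpose(0, 2, 1) + T.transpose(1, 0, 2)
       + T.transpose(1, 2, 0) + T.transpose(2, 0, 1) + T.transpose(2, 1, 0)) / 6

def tensor_map(x):
    # x -> T(., x, x): contract the tensor with x along the last two modes
    return np.einsum('ijk,j,k->i', T, x, x)

x = rng.standard_normal(n)
x /= np.linalg.norm(x)
for _ in range(200):
    y = tensor_map(x)
    lam = x @ y              # Rayleigh-type quotient T(x, x, x)
    x = y / np.linalg.norm(y)
print(lam)  # eigenvalue estimate at the current iterate
```

The abstract's growth-rate argument corresponds to tracking how fast `np.linalg.norm(y)` grows per step before normalization.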
The hierarchical SVD provides a quasi-best low-rank approximation of high-dimensional data in the hierarchical Tucker framework. Similar to the SVD for matrices, it provides a fundamental but expensive tool for tensor computations. In the present work we examine generalizations of randomized matrix decomposition methods to higher-order tensors in the framework of the hierarchical tensor representation. In particular we present and analyze a randomized algorithm for the calculation of the hierarchical SVD (HSVD) for the tensor train (TT) format.
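The matrix-level building block that such methods lift to the hierarchical/TT setting is the randomized range finder: sketch the range of the unfolding with a Gaussian test matrix, then compute a small exact SVD in that subspace. A self-contained sketch on a plain matrix (illustrative sizes and oversampling only; not the paper's HSVD algorithm itself):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 200, 150, 10
# An exactly rank-r matrix, standing in for one unfolding of a tensor
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

def randomized_svd(A, k, oversample=5):
    # Gaussian sketch of the range, then SVD of the small projected matrix
    G = rng.standard_normal((A.shape[1], k + oversample))
    Q, _ = np.linalg.qr(A @ G)                # orthonormal basis for range(A @ G)
    U_small, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ U_small)[:, :k], s[:k], Vt[:k]

U, s, Vt = randomized_svd(A, r)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A)
print(err)  # near machine precision: A has exact rank r
```

In the TT setting, the same sketch-then-project step is applied to the successive unfoldings instead of a single matrix.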
We show that for an $n\times n$ random symmetric matrix $A_n$, whose entries on and above the diagonal are independent copies of a sub-Gaussian random variable $\xi$ with mean $0$ and variance $1$, \[\mathbb{P}[s_n(A_n) \le \epsilon/\sqrt{n}] \le O_{\xi}(\epsilon^{1/8} + \exp(-\Omega_{\xi}(n^{1/2}))) \quad \text{for all } \epsilon \ge 0.\] This improves a result of Vershynin, who obtained such a bound with $n^{1/2}$ replaced by $n^{c}$ for a small constant $c$, and $1/8$ replaced by $(1/8) + \eta$ (with implicit constants also depending on $\eta > 0$). Furthermore, when $\xi$ is a Rademacher random variable, we prove that \[\mathbb{P}[s_n(A_n) \le \epsilon/\sqrt{n}] \le O(\epsilon^{1/8} + \exp(-\Omega((\log{n})^{1/4}n^{1/2}))) \quad \text{for all } \epsilon \ge 0.\] The special case $\epsilon = 0$ improves a recent result of Campos, Mattos, Morris, and Morrison, which showed that $\mathbb{P}[s_n(A_n) = 0] \le O(\exp(-\Omega(n^{1/2})))$. The main innovations in our work are new notions of arithmetic structure -- the Median Regularized Least Common Denominator and the Median Threshold -- which we believe should be more generally useful in contexts where one needs to combine anticoncentration information of different parts of a vector.
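The quantity $s_n(A_n)$ is the smallest singular value, and the bound says the event $s_n \le \epsilon/\sqrt{n}$ is rare. A quick empirical illustration for the Rademacher case (sampling parameters chosen arbitrarily; this only estimates the left-hand probability, it does not verify the exponent $1/8$):

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials, eps = 60, 300, 0.1
hits = 0
for _ in range(trials):
    # Symmetric Rademacher matrix: independent +/-1 on and above the diagonal
    M = np.triu(rng.choice([-1.0, 1.0], size=(n, n)))
    A = M + np.triu(M, 1).T
    s_min = np.linalg.svd(A, compute_uv=False)[-1]
    hits += s_min <= eps / np.sqrt(n)
rate = hits / trials
print(rate)  # empirical estimate of P[s_n(A_n) <= eps/sqrt(n)]
```

The $1/\sqrt{n}$ scaling matches the typical eigenvalue spacing of a Wigner-type matrix near zero, which is why the threshold is stated at $\epsilon/\sqrt{n}$ rather than $\epsilon$.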
This paper introduces the functional tensor singular value decomposition (FTSVD), a novel dimension reduction framework for tensors with one functional mode and several tabular modes. The problem is motivated by high-order longitudinal data analysis. Our model assumes the observed data to be a random realization of an approximate CP low-rank functional tensor measured on a discrete time grid. Incorporating tensor algebra and the theory of Reproducing Kernel Hilbert Space (RKHS), we propose a novel RKHS-based constrained power iteration with spectral initialization. Our method can successfully estimate both singular vectors and functions of the low-rank structure in the observed data. With mild assumptions, we establish the non-asymptotic contractive error bounds for the proposed algorithm. The superiority of the proposed framework is demonstrated via extensive experiments on both simulated and real data.
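Stripped of the functional mode and the RKHS constraint, the core computational step is an alternating power iteration for the leading rank-1 CP component. The discrete-grid sketch below (a simplified analogue with made-up sizes and noise level, not the paper's constrained algorithm) recovers the planted singular vectors of a noisy rank-1 tensor:

```python
import numpy as np

rng = np.random.default_rng(4)
p, q, r = 20, 25, 30
# Planted rank-1 signal plus small Gaussian noise
a, b, c = (rng.standard_normal(k) for k in (p, q, r))
a, b, c = (v / np.linalg.norm(v) for v in (a, b, c))
T = 5.0 * np.einsum('i,j,k->ijk', a, b, c) + 0.01 * rng.standard_normal((p, q, r))

u, v, w = (rng.standard_normal(k) for k in (p, q, r))
for _ in range(50):
    # Update each factor by contracting T against the other two, then normalize
    u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
    v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
    w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
sigma = np.einsum('ijk,i,j,k->', T, u, v, w)
print(abs(u @ a), sigma)  # |<u, a>| near 1: leading singular vector recovered
```

The paper's contribution is to replace one of these discrete factors with a smooth function estimated in an RKHS, with a spectral warm start and contraction guarantees.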
We precisely determine the SDP value (equivalently, quantum value) of large random instances of certain kinds of constraint satisfaction problems, ``two-eigenvalue 2CSPs''. We show this SDP value coincides with the spectral relaxation value, possibly indicating a computational threshold. Our analysis extends the previously resolved cases of random regular $\mathsf{2XOR}$ and $\textsf{NAE-3SAT}$, and includes new cases such as random $\mathsf{Sort}_4$ (equivalently, $\mathsf{CHSH}$) and $\mathsf{Forrelation}$ CSPs. Our techniques include new generalizations of the nonbacktracking operator, the Ihara--Bass Formula, and the Friedman/Bordenave proof of Alon's Conjecture.
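The nonbacktracking operator acts on directed edges: the walk may continue from $(u,v)$ to $(v,w)$ only if $w \neq u$. A minimal construction for the complete graph $K_4$ (chosen here only as a small 3-regular example; for a $d$-regular graph the Perron eigenvalue of this operator is $d-1$):

```python
import numpy as np
from itertools import product

n = 4  # complete graph K4, which is 3-regular
edges = [(u, v) for u, v in product(range(n), repeat=2) if u != v]
idx = {e: i for i, e in enumerate(edges)}

B = np.zeros((len(edges), len(edges)))
for (u, v) in edges:
    for w in range(n):
        if w != v and w != u:   # step (u,v) -> (v,w) allowed only if it does not backtrack
            B[idx[(u, v)], idx[(v, w)]] = 1.0

radius = max(abs(np.linalg.eigvals(B)))
print(radius)  # Perron eigenvalue d - 1 = 2 for this 3-regular graph
```

The Ihara--Bass formula relates the characteristic polynomial of this edge operator to that of the ordinary adjacency matrix, which is what makes it useful for pinning down spectral relaxation values of random regular instances.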