
Mode-wise Tensor Decompositions: Multi-dimensional Generalizations of CUR Decompositions

Added by Longxiu Huang
Publication date: 2021
Language: English





Low-rank tensor approximation is a fundamental tool in modern machine learning and data science. In this paper, we study the characterization, perturbation analysis, and an efficient sampling strategy for two primary tensor CUR approximations, namely Chidori and Fiber CUR. We characterize exact tensor CUR decompositions for tensors of low multilinear rank. We also present theoretical error bounds for the tensor CUR approximations when (adversarial or Gaussian) noise appears. Moreover, we show that low-cost uniform sampling is sufficient for tensor CUR approximations if the tensor has an incoherent structure. Empirical performance evaluations, on both synthetic and real-world datasets, establish the speed advantage of the tensor CUR approximations over other state-of-the-art low multilinear rank tensor approximations.
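As an illustration of the construction, here is a minimal numpy sketch of a Chidori-style tensor CUR with uniform sampling for a 3-way tensor: sample an index set per mode, keep the core subtensor R, and reconstruct through factors of the form C_i pinv(U_i). The names `chidori_cur`, `unfold`, `mode_prod` and the `oversample` parameter are our own illustration, not the paper's code, and the index-set sizes are chosen generously rather than optimally.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_prod(T, M, mode):
    """Mode-n product T x_n M, where M has shape (new_dim, T.shape[mode])."""
    out = np.tensordot(M, np.moveaxis(T, mode, 0), axes=([1], [0]))
    return np.moveaxis(out, 0, mode)

def chidori_cur(T, ranks, oversample=2, rng=None):
    """Illustrative Chidori-style CUR: uniformly sample an index set per mode,
    keep the core subtensor R, and reconstruct via the factors C_i pinv(U_i)."""
    rng = np.random.default_rng(rng)
    k = T.ndim
    I = [np.sort(rng.choice(T.shape[i],
                            size=min(oversample * ranks[i], T.shape[i]),
                            replace=False)) for i in range(k)]
    R = T[np.ix_(*I)]
    T_hat = R
    for i in range(k):
        # C_i: restrict every mode except i to its sampled set, then unfold.
        idx = [np.arange(T.shape[j]) if j == i else I[j] for j in range(k)]
        Ci = unfold(T[np.ix_(*idx)], i)
        Ui = unfold(R, i)  # the same unfolding restricted to rows I_i
        T_hat = mode_prod(T_hat, Ci @ np.linalg.pinv(Ui), i)
    return T_hat

# Quick check on a synthetic tensor of multilinear rank (3, 3, 3).
rng = np.random.default_rng(0)
G = rng.standard_normal((3, 3, 3))
A, B, C = (rng.standard_normal((30, 3)) for _ in range(3))
T = mode_prod(mode_prod(mode_prod(G, A, 0), B, 1), C, 2)
T_hat = chidori_cur(T, ranks=(3, 3, 3), rng=1)
print(np.linalg.norm(T - T_hat) / np.linalg.norm(T))  # tiny: exact up to roundoff
```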



Related research

Jiawang Nie, Zi Yang (2019)
Hermitian tensors are generalizations of Hermitian matrices, but they have very different properties. Every complex Hermitian tensor is a sum of complex Hermitian rank-1 tensors; however, this is not true in the real case. We study basic properties of Hermitian tensors, such as Hermitian decompositions and Hermitian ranks. For canonical basis tensors, we determine their Hermitian ranks and decompositions. For real Hermitian tensors, we give a full characterization of when they admit Hermitian decompositions over the real field. In addition to the traditional flattening, Hermitian tensors also admit Hermitian and Kronecker flattenings, which may give different lower bounds for the Hermitian rank. We also study other topics such as eigenvalues, positive semidefiniteness, sum-of-squares representations, and separability.
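To make the objects concrete, here is a small numpy sketch (our own illustration, fixing order 4, i.e. $m = 2$) that builds a Hermitian tensor as a real-coefficient sum of Hermitian rank-1 terms and checks conjugate symmetry together with the Hermitian (matrix) flattening.

```python
import numpy as np

rng = np.random.default_rng(0)
n, terms = 2, 3

# Hermitian tensor of order 4 as a real-coefficient sum of Hermitian
# rank-1 terms  lam * u (x) v (x) conj(u) (x) conj(v).
H = np.zeros((n, n, n, n), dtype=complex)
for _ in range(terms):
    lam = rng.standard_normal()  # real coefficient
    u = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    H += lam * np.einsum('i,j,k,l->ijkl', u, v, u.conj(), v.conj())

# Hermitian symmetry: H[i,j,k,l] == conj(H[k,l,i,j]).
print(np.allclose(H, H.transpose(2, 3, 0, 1).conj()))  # True

# Hermitian flattening: an n^2 x n^2 Hermitian matrix whose rank
# lower-bounds the Hermitian rank of H.
M = H.reshape(n * n, n * n)
print(np.allclose(M, M.conj().T), np.linalg.matrix_rank(M))  # True, <= 3
```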
Jiawang Nie, Ke Ye, Lihong Zhi (2020)
This paper discusses the problem of symmetric tensor decomposition on a given variety $X$: decomposing a symmetric tensor into a sum of tensor powers of vectors contained in $X$. We first study geometric and algebraic properties of such decomposable tensors, which are crucial for the practical computation of such decompositions. For a given tensor, we also develop a criterion for the existence of a symmetric decomposition on $X$. Second, and most importantly, we propose a method for computing symmetric tensor decompositions on an arbitrary $X$. As a specific application, Vandermonde decompositions of nonsymmetric tensors can be computed by the proposed algorithm.
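As a toy instance, the sketch below (our own illustration) takes $X$ to be the moment curve $v(t) = (1, t, \dots, t^{n-1})$, the setting of Vandermonde decompositions, and forms a symmetric order-3 tensor as a sum of tensor powers of points on $X$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 4, 3

# Points on the variety X: Vandermonde (moment) vectors v(t) = (1, t, ..., t^{n-1}).
ts = rng.standard_normal(r)
lams = rng.standard_normal(r)
V = np.vander(ts, n, increasing=True)  # row i is v(t_i), a point of X

# Symmetric order-3 tensor  T = sum_i lam_i * v_i^{(x)3}.
T = np.einsum('i,ia,ib,ic->abc', lams, V, V, V)

# Symmetry check: T is invariant under permuting its three modes.
print(np.allclose(T, T.transpose(1, 0, 2)), np.allclose(T, T.transpose(2, 1, 0)))
```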
Low-rank Tucker and CP tensor decompositions are powerful tools in data analytics. The widely used alternating least squares (ALS) method, which solves a sequence of over-determined least squares subproblems, is costly for large and sparse tensors. We propose a fast and accurate sketched ALS algorithm for Tucker decomposition, which solves a sequence of sketched rank-constrained linear least squares subproblems. Theoretical sketch-size upper bounds are provided to achieve $O(\epsilon)$ relative error for each subproblem with two sketching techniques, TensorSketch and leverage score sampling. Experimental results show that this new ALS algorithm, combined with a new initialization scheme based on a randomized range finder, yields up to $22.0\%$ relative decomposition residual improvement compared to the state-of-the-art sketched randomized algorithm for Tucker decomposition on various synthetic and real datasets. This Tucker-ALS algorithm is further used to accelerate CP decomposition, by applying randomized Tucker compression followed by CP decomposition of the Tucker core tensor. Experimental results show that this algorithm not only converges faster but also yields more accurate CP decompositions.
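The leverage score sampling primitive behind such sketched least squares subproblems can be shown on a single over-determined system. The sketch below is our own minimal version, not the paper's Tucker-ALS; it computes exact leverage scores via a thin QR, which is only illustrative, since practical methods approximate them.

```python
import numpy as np

def leverage_scores(A):
    """Row leverage scores via a thin QR: l_i = ||Q[i, :]||^2."""
    Q, _ = np.linalg.qr(A)
    return np.sum(Q**2, axis=1)

def sketched_lstsq(A, b, s, rng=None):
    """Approximate argmin ||A x - b|| by sampling s rows of (A, b)
    proportionally to leverage scores, with importance-sampling rescaling."""
    rng = np.random.default_rng(rng)
    p = leverage_scores(A)
    p /= p.sum()
    idx = rng.choice(A.shape[0], size=s, replace=True, p=p)
    w = 1.0 / np.sqrt(s * p[idx])
    x, *_ = np.linalg.lstsq(w[:, None] * A[idx], w * b[idx], rcond=None)
    return x

# A tall system of the kind that appears as one ALS subproblem.
rng = np.random.default_rng(0)
A = rng.standard_normal((20000, 50))
b = A @ rng.standard_normal(50) + 0.01 * rng.standard_normal(20000)
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
x_sk = sketched_lstsq(A, b, s=500, rng=1)
print(np.linalg.norm(x_sk - x_full) / np.linalg.norm(x_full))  # small
```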
Chao Zeng (2021)
An orthogonal decomposition factorizes a tensor into a sum of rank-one tensors that form an orthogonal list. We present several properties of the orthogonal rank. We find that a subtensor may have a larger orthogonal rank than the whole tensor, and we prove the lower semicontinuity of the orthogonal rank. The lower semicontinuity guarantees the existence of low orthogonal rank approximations. To fit the orthogonal decomposition, we propose an algorithm based on the augmented Lagrangian method that guarantees orthogonality through a novel orthogonalization procedure. Numerical experiments show that the proposed method has a clear advantage over existing methods for strongly orthogonal decompositions in terms of approximation error.
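Pairwise orthogonality of the rank-one terms is easy to verify numerically, since the inner product of two rank-one tensors factors mode by mode. The sketch below (our own illustration, not the paper's algorithm) builds an orthogonal list by taking orthonormal factors in one mode and checks that the Gram matrix of the terms is diagonal.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 5, 3

# An orthogonal list of rank-one tensors: it suffices that each pair of
# terms has orthogonal factors in at least one mode; here the mode-1
# factors are an orthonormal set.
U, _ = np.linalg.qr(rng.standard_normal((n, r)))  # orthonormal columns
V = rng.standard_normal((n, r))
W = rng.standard_normal((n, r))
terms = [np.einsum('a,b,c->abc', U[:, i], V[:, i], W[:, i]) for i in range(r)]
T = sum(terms)  # the tensor being decomposed

# <u(x)v(x)w, u'(x)v'(x)w'> = <u,u'><v,v'><w,w'>, so the Gram matrix
# of the list is diagonal.
G = np.array([[np.vdot(s, t) for t in terms] for s in terms])
print(np.allclose(G, np.diag(np.diag(G))))  # True: pairwise orthogonal
```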
This paper studies how to learn the parameters of diagonal Gaussian mixture models. The problem can be formulated as computing incomplete symmetric tensor decompositions. We use generating polynomials to compute incomplete symmetric tensor decompositions and approximations, and then apply the tensor approximation method to learn diagonal Gaussian mixture models. We also provide a stability analysis: when the first- and third-order moments are sufficiently accurate, the recovered parameters of the Gaussian mixture models are also highly accurate. Numerical experiments are provided as well.
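A minimal sketch of the moment inputs (our own illustration, not the paper's generating-polynomial algorithm): form the empirical first- and third-order moments of a sample from a diagonal Gaussian mixture; for small covariances, the third moment is close to the symmetric decomposition generated by the means.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, N = 3, 2, 100000

# Sample from a diagonal Gaussian mixture with equal weights and small
# diagonal covariance sigma^2 * I.
mus = rng.standard_normal((k, d))
labels = rng.integers(k, size=N)
X = mus[labels] + 0.1 * rng.standard_normal((N, d))

# Empirical first- and third-order moments used by the moment method.
M1 = X.mean(axis=0)                           # E[x]
M3 = np.einsum('ni,nj,nk->ijk', X, X, X) / N  # E[x (x) x (x) x]

# For small sigma, M3 is close to sum_j w_j * mu_j^{(x)3}; the remainder
# consists of the diagonal-covariance correction terms.
M3_means = np.einsum('j,ja,jb,jc->abc', np.full(k, 1.0 / k), mus, mus, mus)
print(np.linalg.norm(M3 - M3_means) / np.linalg.norm(M3))  # small
```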
