
ST-SVD Factorization and s-Diagonal Tensors

Added by Liqun Qi
Publication date: 2021
Language: English





A third order real tensor is mapped to a special f-diagonal tensor by going through the Discrete Fourier Transform (DFT), standard matrix SVD and inverse DFT. We call such an f-diagonal tensor an s-diagonal tensor. An f-diagonal tensor is an s-diagonal tensor if and only if it is mapped to itself in the above process. The third order tensor space is partitioned into orthogonal equivalence classes. Each orthogonal equivalence class has a unique s-diagonal tensor. Two s-diagonal tensors are equal if they are orthogonally equivalent. Third order tensors in an orthogonal equivalence class have the same tensor tubal rank and T-singular values. Four meaningful necessary conditions for s-diagonal tensors are presented. Then we present a set of necessary and sufficient conditions for s-diagonal tensors. Such conditions involve a special complex number. In the cases where the dimension of the third mode of the considered tensor is $2$, $3$, or $4$, we present direct necessary and sufficient conditions which do not involve such a complex number.
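The DFT → slice-wise SVD → inverse DFT map described in the abstract is easy to trace numerically. The sketch below (in NumPy; the helper name s_diagonal_of and the variable names are illustrative, not the paper's notation) applies the map to a random real third order tensor and checks the fixed-point characterization quoted above: the resulting f-diagonal tensor is mapped to itself.

```python
import numpy as np

def s_diagonal_of(A):
    """DFT along the third mode, standard matrix SVD of each Fourier-domain
    frontal slice, then inverse DFT; the output is f-diagonal (every frontal
    slice is diagonal).  Helper name and layout are illustrative only."""
    m, n, p = A.shape
    A_hat = np.fft.fft(A, axis=2)                  # DFT along the third mode
    S_hat = np.zeros((m, n, p))
    for k in range(p):                             # standard matrix SVD per slice
        s = np.linalg.svd(A_hat[:, :, k], compute_uv=False)
        S_hat[:len(s), :len(s), k] = np.diag(s)    # singular values are real
    # inverse DFT; the imaginary part vanishes because conjugate slices of a
    # real tensor share the same singular values
    return np.real(np.fft.ifft(S_hat, axis=2))

A = np.random.rand(4, 5, 6)
S = s_diagonal_of(A)                      # f-diagonal representative of A's class
assert np.allclose(S, s_diagonal_of(S))   # an s-diagonal tensor is mapped to itself
```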



Related Research

148 - Lek-Heng Lim 2021
The notion of a tensor captures three great ideas: equivariance, multilinearity, separability. But trying to be three things at once makes the notion difficult to understand. We will explain tensors in an accessible and elementary way through the lens of linear algebra and numerical linear algebra, elucidated with examples from computational and applied mathematics.
125 - Liqun Qi, Changqing Xu, Yi Xu 2013
Nonnegative tensor factorization has applications in statistics, computer vision, exploratory multiway data analysis and blind source separation. A symmetric nonnegative tensor, which has a symmetric nonnegative factorization, is called a completely positive (CP) tensor. The H-eigenvalues of a CP tensor are always nonnegative. When the order is even, the Z-eigenvalues of a CP tensor are all nonnegative. When the order is odd, a Z-eigenvector associated with a positive (negative) Z-eigenvalue of a CP tensor is always nonnegative (nonpositive). The entries of a CP tensor obey some dominance properties. The CP tensor cone and the copositive tensor cone of the same order are dual to each other. We introduce strongly symmetric tensors and show that a symmetric tensor has a symmetric binary decomposition if and only if it is strongly symmetric. Then we show that a strongly symmetric, hierarchically dominated nonnegative tensor is a CP tensor, and present a hierarchical elimination algorithm for checking this. Numerical examples are also given.
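As a concrete illustration of the definition used above (a minimal NumPy sketch with made-up factor vectors, not one of the paper's examples), a symmetric tensor built from nonnegative rank-one terms $u_k \otimes u_k \otimes u_k$ is completely positive by construction:

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(0)
U = rng.random((3, 4))                  # four nonnegative factor vectors in R^3

# T = sum_k u_k (x) u_k (x) u_k with u_k >= 0: a symmetric nonnegative
# factorization, so T is a completely positive (CP) tensor by definition
T = sum(np.einsum('i,j,k->ijk', U[:, r], U[:, r], U[:, r]) for r in range(U.shape[1]))

for perm in permutations(range(3)):     # T is symmetric under all index permutations
    assert np.allclose(T, np.transpose(T, perm))
```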
This paper studies how to learn the parameters of diagonal Gaussian mixture models. The problem can be formulated as computing incomplete symmetric tensor decompositions. We use generating polynomials to compute incomplete symmetric tensor decompositions and approximations, and the resulting tensor approximation method is then used to learn diagonal Gaussian mixture models. We also carry out a stability analysis: when the first and third order moments are sufficiently accurate, the estimated parameters of the Gaussian mixture model are also highly accurate. Numerical experiments are also provided.
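For orientation, the quantities such moment-based methods start from are the empirical first and third order moments of the sample. The snippet below (illustrative names and toy mixture parameters; the generating-polynomial decomposition itself is not reproduced here) forms these moments for samples drawn from a two-component diagonal Gaussian mixture:

```python
import numpy as np

def empirical_moments(X):
    """First- and third-order sample moments of X with shape (num_samples, d)."""
    M1 = X.mean(axis=0)
    M3 = np.einsum('ni,nj,nk->ijk', X, X, X) / X.shape[0]
    return M1, M3

# toy two-component diagonal Gaussian mixture (parameters are made up)
rng = np.random.default_rng(1)
means = np.array([[0.0, 0.0], [3.0, -1.0]])
stds = np.array([[1.0, 0.5], [0.7, 1.2]])
labels = rng.choice(2, size=5000, p=[0.4, 0.6])
X = rng.normal(means[labels], stds[labels])      # (5000, 2) samples
M1, M3 = empirical_moments(X)                    # inputs to a moment-based fit
```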
This paper focuses on the fast evaluation of the matvec $g=Kf$ for $K \in \mathbb{C}^{N\times N}$, which is the discretization of a multidimensional oscillatory integral transform $g(x) = \int K(x,\xi) f(\xi)\,d\xi$ with a kernel function $K(x,\xi)=e^{2\pi i \Phi(x,\xi)}$, where $\Phi(x,\xi)$ is a piecewise smooth phase function with $x$ and $\xi$ in $\mathbb{R}^d$ for $d=2$ or $3$. A new framework is introduced to compute $Kf$ with $O(N\log N)$ time and memory complexity in the case that only indirect access to the phase function $\Phi$ is available. This framework consists of two main steps: 1) an $O(N\log N)$ algorithm for recovering the multidimensional phase function $\Phi$ from indirect access is proposed; 2) a multidimensional interpolative decomposition butterfly factorization (MIDBF) is designed to evaluate the matvec $Kf$ with an $O(N\log N)$ complexity once $\Phi$ is available. Numerical results are provided to demonstrate the effectiveness of the proposed framework.
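For reference, the brute-force $O(N^2)$ evaluation that the MIDBF framework is designed to avoid looks as follows; the phase used here ($\Phi(x,\xi) = x\cdot\xi$, a plain Fourier phase) is only a stand-in, not one of the piecewise smooth phases treated in the paper:

```python
import numpy as np

def oscillatory_matvec(x, xi, phase, f):
    """Dense O(N^2) evaluation of g = K f with K(x, xi) = exp(2*pi*i*phase(x, xi))."""
    K = np.exp(2j * np.pi * phase(x[:, None, :], xi[None, :, :]))
    return K @ f

N = 256
x = np.random.rand(N, 2)                         # target points in [0, 1]^2
xi = np.random.rand(N, 2) * N                    # source points (frequencies)
f = np.random.rand(N)
g = oscillatory_matvec(x, xi, lambda a, b: np.sum(a * b, axis=-1), f)
```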
The problem of partitioning a large and sparse tensor is considered, where the tensor consists of a sequence of adjacency matrices. Theory is developed that is a generalization of spectral graph partitioning. A best rank-$(2,2,\lambda)$ approximation is computed for $\lambda=1,2,3$, and the partitioning is computed from the orthogonal matrices and the core tensor of the approximation. It is shown that if the tensor has a certain reducibility structure, then the solution of the best approximation problem exhibits the reducibility structure of the tensor. Further, if the tensor is close to being reducible, then the solution of the best approximation problem still exhibits the structure of the tensor. Numerical examples with synthetic data corroborate the theoretical results. Experiments with tensors from applications show that the method can be used to extract relevant information from large, sparse, and noisy data.
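A rough sketch of the overall flow (ours, with a truncated HOSVD standing in for the best rank-$(2,2,\lambda)$ approximation, which in practice would be refined further, e.g. by an alternating iteration): stack the adjacency matrices into a tensor, compute low-rank factors, and read a two-way vertex partition off the sign pattern of the second column of the mode-1 factor, in analogy with spectral partitioning.

```python
import numpy as np

def hosvd_factors(T, ranks):
    """Leading singular vectors of each mode unfolding (truncated HOSVD)."""
    factors = []
    for mode, r in enumerate(ranks):
        unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])
    return factors

A = np.random.rand(50, 50, 3)                    # three synthetic adjacency matrices
A = (A + A.transpose(1, 0, 2)) / 2               # make each frontal slice symmetric
U1, U2, U3 = hosvd_factors(A, (2, 2, 1))         # rank-(2, 2, 1) factors
partition = U1[:, 1] >= 0                        # two-way split of the 50 vertices
```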