
Hermitian Tensor Decompositions

Posted by: Jiawang Nie
Publication date: 2019
Research field: Informatics engineering
Paper language: English





Hermitian tensors are generalizations of Hermitian matrices, but they have very different properties. Every complex Hermitian tensor is a sum of complex Hermitian rank-1 tensors. However, this is not true for the real case. We study basic properties for Hermitian tensors such as Hermitian decompositions and Hermitian ranks. For canonical basis tensors, we determine their Hermitian ranks and decompositions. For real Hermitian tensors, we give a full characterization for them to have Hermitian decompositions over the real field. In addition to traditional flattening, Hermitian tensors specially have Hermitian and Kronecker flattenings, which may give different lower bounds for Hermitian ranks. We also study other topics such as eigenvalues, positive semidefiniteness, sum of squares representations, and separability.
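The Kronecker flattening mentioned above is easy to illustrate numerically. The sketch below (hypothetical example data, not code from the paper) builds an order-4 Hermitian tensor as a sum of r Hermitian rank-1 terms, checks the Hermitian symmetry, and verifies that the rank of the Kronecker flattening lower-bounds the Hermitian rank:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 3, 2

# An order-4 Hermitian tensor built as a sum of r Hermitian rank-1 terms:
# H[i1,i2,j1,j2] = sum_s a_s[i1] b_s[i2] conj(a_s[j1]) conj(b_s[j2])
H = np.zeros((n, n, n, n), dtype=complex)
for _ in range(r):
    a = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    b = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    H += np.einsum('i,j,k,l->ijkl', a, b, a.conj(), b.conj())

# Hermitian symmetry: H[i1,i2,j1,j2] = conj(H[j1,j2,i1,i2])
assert np.allclose(H, np.conj(np.transpose(H, (2, 3, 0, 1))))

# Kronecker flattening: an n^2 x n^2 Hermitian matrix; its matrix rank
# is a lower bound for the Hermitian rank of H
M = H.reshape(n * n, n * n)
assert np.allclose(M, M.conj().T)
rank_bound = np.linalg.matrix_rank(M)   # at most r; equals r generically
```

For generic vectors the flattening rank equals r, so the bound is tight in this example; in general it is only a lower bound.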



Read also

Jiawang Nie, Ke Ye, Lihong Zhi (2020)
This paper discusses the problem of symmetric tensor decomposition on a given variety $X$: decomposing a symmetric tensor into the sum of tensor powers of vectors contained in $X$. In this paper, we first study geometric and algebraic properties of such decomposable tensors, which are crucial to the practical computations of such decompositions. For a given tensor, we also develop a criterion for the existence of a symmetric decomposition on $X$. Secondly and most importantly, we propose a method for computing symmetric tensor decompositions on an arbitrary $X$. As a specific application, Vandermonde decompositions for nonsymmetric tensors can be computed by the proposed algorithm.
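The object of study above is easy to construct directly. The following sketch (hypothetical example data; the paper's actual algorithm is not shown) builds a symmetric order-3 tensor as a sum of tensor powers of vectors on the unit sphere, a simple choice of variety $X$, and checks its symmetry and a rank bound:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 4, 2

# T = sum_s v_s^{(x)3} with each v_s on the unit sphere (a simple
# choice of the variety X; example data only)
T = np.zeros((n, n, n))
for _ in range(r):
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)          # v lies on the sphere |v| = 1
    T += np.einsum('i,j,k->ijk', v, v, v)

# Symmetry: T is invariant under any permutation of its three indices
assert np.allclose(T, np.transpose(T, (1, 0, 2)))
assert np.allclose(T, np.transpose(T, (2, 1, 0)))

# The mode-1 flattening has rank at most r (equals r generically)
flat_rank = np.linalg.matrix_rank(T.reshape(n, n * n))
```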
Chao Zeng (2021)
The orthogonal decomposition factorizes a tensor into a sum of an orthogonal list of rank-one tensors. We present several properties of orthogonal rank. We find that a subtensor may have a larger orthogonal rank than the whole tensor and prove the lower semicontinuity of orthogonal rank. The lower semicontinuity guarantees the existence of low orthogonal rank approximations. To fit the orthogonal decomposition, we propose an algorithm based on the augmented Lagrangian method and guarantee the orthogonality by a novel orthogonalization procedure. Numerical experiments show that the proposed method has a great advantage over the existing methods for strongly orthogonal decompositions in terms of the approximation error.
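The orthogonality condition in such a decomposition can be made concrete in a few lines. The sketch below (example data only; this is not the paper's augmented Lagrangian algorithm) builds a tensor from rank-one terms with orthonormal factors and checks that the terms are pairwise orthogonal under the tensor inner product, so the norm splits across terms:

```python
import numpy as np

rng = np.random.default_rng(2)
n, r = 4, 3

# Orthonormal factor vectors in each mode make the rank-one terms
# pairwise orthogonal under the inner product <A,B> = sum(A * B)
U, _ = np.linalg.qr(rng.standard_normal((n, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
W, _ = np.linalg.qr(rng.standard_normal((n, r)))
sigma = np.array([3.0, 2.0, 1.0])

terms = [sigma[s] * np.einsum('i,j,k->ijk', U[:, s], V[:, s], W[:, s])
         for s in range(r)]
T = sum(terms)

# Pairwise orthogonality of the rank-one terms
for s in range(r):
    for t in range(s + 1, r):
        assert abs(np.sum(terms[s] * terms[t])) < 1e-12

# Orthogonality splits the norm: ||T||^2 = sum_s sigma_s^2,
# so the orthogonal rank of T is at most r
assert np.isclose(np.sum(T**2), np.sum(sigma**2))
```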
This paper studies how to learn parameters in diagonal Gaussian mixture models. The problem can be formulated as computing incomplete symmetric tensor decompositions. We use generating polynomials to compute incomplete symmetric tensor decompositions and approximations, and then use the tensor approximation method to learn diagonal Gaussian mixture models. We also carry out a stability analysis: when the first- and third-order moments are sufficiently accurate, we show that the obtained parameters for the Gaussian mixture models are also highly accurate. Numerical experiments are also provided.
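The link between moments and tensors used above rests on a standard identity: for a diagonal Gaussian mixture, the third-order moment tensor is a weighted sum of $\mu^{\otimes 3}$ terms plus symmetrized mean-covariance terms. The sketch below (hypothetical mixture parameters; the generating-polynomial machinery itself is not shown) verifies this identity against an empirical moment estimate:

```python
import numpy as np

rng = np.random.default_rng(5)
d, N = 2, 400_000

# A hypothetical 2-component diagonal Gaussian mixture (example data)
w = np.array([0.4, 0.6])
mus = np.array([[1.0, -1.0], [-0.5, 2.0]])
sig2 = np.array([[0.25, 0.25], [0.5, 0.25]])   # diagonal covariances

comp = rng.choice(2, size=N, p=w)
x = mus[comp] + np.sqrt(sig2[comp]) * rng.standard_normal((N, d))

# Empirical third-order moment tensor M3 = E[x (x) x (x) x]
M3 = np.einsum('ni,nj,nk->ijk', x, x, x) / N

# Population formula: each component contributes
#   w_i * ( mu^{(x)3} + sym(mu (x) Sigma) )  with Sigma diagonal
M3_true = np.zeros((d, d, d))
for wi, mu, s2 in zip(w, mus, sig2):
    Sigma = np.diag(s2)
    M3_true += wi * (np.einsum('i,j,k->ijk', mu, mu, mu)
                     + np.einsum('i,jk->ijk', mu, Sigma)
                     + np.einsum('j,ik->ijk', mu, Sigma)
                     + np.einsum('k,ij->ijk', mu, Sigma))

# Monte Carlo estimate matches the formula up to sampling error
assert np.allclose(M3, M3_true, atol=0.1)
```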
Low rank tensor approximation is a fundamental tool in modern machine learning and data science. In this paper, we study the characterization, perturbation analysis, and an efficient sampling strategy for two primary tensor CUR approximations, namely Chidori and Fiber CUR. We characterize exact tensor CUR decompositions for low multilinear rank tensors. We also present theoretical error bounds of the tensor CUR approximations when (adversarial or Gaussian) noise appears. Moreover, we show that low cost uniform sampling is sufficient for tensor CUR approximations if the tensor has an incoherent structure. Empirical performance evaluations, with both synthetic and real-world datasets, establish the speed advantage of the tensor CUR approximations over other state-of-the-art low multilinear rank tensor approximations.
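The exactness of tensor CUR at low multilinear rank can be demonstrated directly. The sketch below (example data only; `unfold` and `mode_mult` are helper names introduced here, not the paper's API) performs a Chidori-style reconstruction by sampling an index set in every mode, and assumes the sampled core has full multilinear rank, which holds generically:

```python
import numpy as np

def unfold(T, k):
    # mode-k unfolding: mode k indexes the rows
    return np.moveaxis(T, k, 0).reshape(T.shape[k], -1)

def mode_mult(T, M, k):
    # mode-k product: multiply the matrix M into mode k of T
    Tk = np.moveaxis(T, k, 0)
    out = (M @ Tk.reshape(Tk.shape[0], -1)).reshape((M.shape[0],) + Tk.shape[1:])
    return np.moveaxis(out, 0, k)

rng = np.random.default_rng(3)
n, r = 6, 2

# A random tensor with multilinear rank (r, r, r)
G = rng.standard_normal((r, r, r))
A, B, C = (rng.standard_normal((n, r)) for _ in range(3))
T = mode_mult(mode_mult(mode_mult(G, A, 0), B, 1), C, 2)

# Chidori-style CUR: sample r indices in every mode
I = [rng.choice(n, size=r, replace=False) for _ in range(3)]
core = T[np.ix_(*I)]
full = np.arange(n)

Trec = core
for k in range(3):
    sel = [full if j == k else I[j] for j in range(3)]
    Ck = unfold(T[np.ix_(*sel)], k)       # full mode-k fibers
    Uk = unfold(core, k)
    Trec = mode_mult(Trec, Ck @ np.linalg.pinv(Uk), k)

assert np.allclose(Trec, T)   # exact recovery at multilinear rank (r, r, r)
```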
Changxin Mo, Weiyang Ding (2021)
Perturbation analysis is a central topic in many fields, and considerable progress has been made, especially for matrices. In this paper, we focus on perturbation analysis for tensor eigenvalues under the tensor-tensor multiplication, as well as ε-pseudospectra theory for third-order tensors. The definition of the generalized T-eigenvalue of third-order tensors is given. Several classical results, such as the Bauer-Fike theorem and its general case, the Gershgorin circle theorem, and the Kahan theorem, are extended from the matrix to the tensor case. A study of ε-pseudospectra of tensors is presented, together with various pseudospectra properties and numerical examples that show the boundaries of the ε-pseudospectra of certain tensors at different levels.
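Under the tensor-tensor (t-)product, a third-order tensor block-diagonalizes after a DFT along its third mode, so its T-eigenvalues can be read off the frontal slices in the Fourier domain. The sketch below (example data only) computes these slices and checks a slice-wise Gershgorin containment, the matrix theorem that the extended result above generalizes:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 4, 3
A = rng.standard_normal((n, n, p))

# DFT along the third mode block-diagonalizes A under the t-product;
# the T-eigenvalues are the eigenvalues of the Fourier-domain slices
Ahat = np.fft.fft(A, axis=2)

for k in range(p):
    S = Ahat[:, :, k]
    # Gershgorin: every eigenvalue of S lies in some disk centered at a
    # diagonal entry with radius the off-diagonal absolute row sum
    radii = np.sum(np.abs(S), axis=1) - np.abs(np.diag(S))
    for lam in np.linalg.eigvals(S):
        assert any(abs(lam - S[i, i]) <= radii[i] + 1e-9 for i in range(n))
```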