
Border rank non-additivity for higher order tensors

Published by: Fulvio Gesmundo
Publication date: 2020
Research field: Informatics Engineering
Paper language: English




Whereas matrix rank is additive under direct sum, in 1981 Schönhage showed that one of its generalizations to the tensor setting, tensor border rank, can be strictly subadditive for tensors of order three. Whether border rank is additive for higher order tensors has remained open. In this work, we settle this problem by providing analogs of Schönhage's construction for tensors of order four and higher. Schönhage's work was motivated by the study of the computational complexity of matrix multiplication; we discuss implications of our results for the asymptotic rank of higher order generalizations of the matrix multiplication tensor.
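For readers less familiar with the baseline fact quoted above, matrix rank additivity under direct sum is easy to check numerically: the direct sum A ⊕ B is the block-diagonal matrix with blocks A and B, and its rank is rank(A) + rank(B). The short Python/NumPy sketch below is an illustrative addition, not part of the paper, and the matrix sizes are arbitrary choices; it verifies the matrix fact on random low-rank matrices. The point of the paper is that the analogous statement already fails for border rank of order-three tensors and, by the constructions given here, for tensors of every higher order as well.

import numpy as np

rng = np.random.default_rng(0)

# Random matrices of rank 2 and rank 3 (generically), built as products of thin factors.
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 7))
B = rng.standard_normal((6, 3)) @ rng.standard_normal((3, 4))

# Direct sum A ⊕ B: block-diagonal matrix with A and B on the diagonal.
C = np.zeros((A.shape[0] + B.shape[0], A.shape[1] + B.shape[1]))
C[:A.shape[0], :A.shape[1]] = A
C[A.shape[0]:, A.shape[1]:] = B

# rank(A ⊕ B) equals rank(A) + rank(B); expected output: 2 3 5
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B), np.linalg.matrix_rank(C))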




Read also

We investigate an extension of a lower bound on the Waring (cactus) rank of homogeneous forms due to Ranestad and Schreyer. We show that for particular classes of homogeneous forms, for which a generalization of this method applies, the lower bound extends to the level of border (cactus) rank. The approach is based on recent results on tensor asymptotic rank.
The $X$-rank of a point $p$ in projective space is the minimal number of points of an algebraic variety $X$ whose linear span contains $p$. This notion is naturally submultiplicative under tensor product. We study geometric conditions that guarantee strict submultiplicativity. We prove that in the case of points of rank two, strict submultiplicativity is entirely characterized in terms of the trisecant lines to the variety. Moreover, we focus on the case of curves: we prove that for curves embedded in an even-dimensional projective space, there are always points for which strict submultiplicativity occurs, with the only exception of rational normal curves.
Sparse tensor best rank-1 approximation (BR1Approx), which is a sparsity generalization of the dense tensor BR1Approx and a higher-order extension of the sparse matrix BR1Approx, is one of the most important problems in sparse tensor decomposition and related problems arising from statistics and machine learning. By exploiting the multilinearity as well as the sparsity structure of the problem, four approximation algorithms are proposed, which are easily implemented, of low computational complexity, and can serve as initial procedures for iterative algorithms. In addition, theoretically guaranteed worst-case approximation lower bounds are proved for all the algorithms. We provide numerical experiments on synthetic and real data to illustrate the effectiveness of the proposed algorithms.
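To make the problem described above concrete, the sketch below gives a generic alternating power iteration with hard thresholding for a sparse rank-1 approximation of a third-order tensor. It is an illustrative stand-in written under my own assumptions, not one of the four algorithms proposed in that paper; the function name, the parameter k (number of nonzeros kept per factor), and the iteration count are hypothetical choices.

import numpy as np

def sparse_rank1_approx(T, k, iters=50, seed=0):
    # Generic illustrative routine (not from the cited paper): approximate a
    # 3rd-order tensor T by lam * (u ⊗ v ⊗ w) with at most k nonzeros per factor.
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(T.shape[0])
    v = rng.standard_normal(T.shape[1])
    w = rng.standard_normal(T.shape[2])

    def truncate(x):
        # Keep the k largest-magnitude entries, zero the rest, renormalize.
        out = x.copy()
        out[np.argsort(np.abs(out))[:-k]] = 0.0
        n = np.linalg.norm(out)
        return out / n if n > 0 else out

    u, v, w = truncate(u), truncate(v), truncate(w)
    for _ in range(iters):
        # Alternating updates: contract T against two factors, then sparsify.
        u = truncate(np.einsum('ijk,j,k->i', T, v, w))
        v = truncate(np.einsum('ijk,i,k->j', T, u, w))
        w = truncate(np.einsum('ijk,i,j->k', T, u, v))
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)  # optimal scale for unit-norm factors
    return lam, u, v, w

Applied to a numpy array T of shape (n1, n2, n3), the routine returns a scalar lam and unit-norm factors u, v, w, each with at most k nonzero entries.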
It has recently been shown that the tensor rank can be strictly submultiplicative under the tensor product, where the tensor product of two tensors is a tensor whose order is the sum of the orders of the two factors. The necessary upper bounds were obtained with help of border rank. It was left open whether border rank itself can be strictly submultiplicative. We answer this question in the affirmative. In order to do so, we construct lines in projective space along which the border rank drops multiple times and use this result in conjunction with a previous construction for a tensor rank drop. Our results also imply strict submultiplicativity for cactus rank and border cactus rank.
We present three families of minimal border rank tensors: they come from highest weight vectors, smoothable algebras, or monomial algebras. We analyse them using Strassen's laser method and obtain an upper bound $2.431$ on $\omega$. We also explain how in certain monomial cases using the laser method directly is less profitable than first degenerating. Our results form possible paths in the search for valuable tensors for the laser method away from Coppersmith-Winograd tensors.