
Inheritance Properties and Sum-of-Squares Decomposition of Hankel Tensors: Theory and Algorithms

Posted by: Liqun Qi
Publication date: 2015
Language: English





In this paper, we show that if a lower-order Hankel tensor is positive semi-definite (or positive definite, or negative semi-definite, or negative definite, or SOS), then its associated higher-order Hankel tensor with the same generating vector, where the higher order is a multiple of the lower order, is also positive semi-definite (or positive definite, or negative semi-definite, or negative definite, or SOS, respectively). Furthermore, in this case the extremal H-eigenvalues of the higher-order tensor are bounded by the extremal H-eigenvalues of the lower-order tensor, multiplied by certain constants. Based on this inheritance property, we give a concrete sum-of-squares decomposition for each strong Hankel tensor. We then prove the second inheritance property of Hankel tensors: a Hankel tensor has no negative (or non-positive, or positive, or nonnegative) H-eigenvalues if the associated Hankel matrix of that Hankel tensor has no negative (or non-positive, or positive, or nonnegative, respectively) eigenvalues. In this case, the extremal H-eigenvalues of the Hankel tensor are likewise bounded by the extremal eigenvalues of the associated Hankel matrix, multiplied by certain constants. The third inheritance property of Hankel tensors is raised as a conjecture.
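As a hedged illustration (a sketch of mine, not code from the paper), the snippet below builds a small Hankel tensor from a generating vector and inspects the eigenvalues of its associated Hankel matrix, the object that drives the second inheritance property; the helper name `hankel_tensor` and the sample vector `v` are assumptions for the example.

```python
import numpy as np
from scipy.linalg import hankel

def hankel_tensor(v, order, dim):
    """Hankel tensor of the given order and dimension: A[i1,...,im] = v[i1+...+im]."""
    assert len(v) == order * (dim - 1) + 1
    A = np.zeros((dim,) * order)
    for idx in np.ndindex(*A.shape):
        A[idx] = v[sum(idx)]
    return A

# Generating vector of length 4*(3-1)+1 = 9 for a 4th-order, dimension-3 Hankel tensor.
v = np.array([2.0, 1.0, 1.0, 0.5, 1.0, 0.5, 1.0, 1.0, 2.0])
A = hankel_tensor(v, order=4, dim=3)

# Associated Hankel matrix generated by the same vector: H[i, j] = v[i + j].
H = hankel(v[:5], v[4:])
print("eigenvalues of the associated Hankel matrix:", np.linalg.eigvalsh(H))
# Per the second inheritance property, if these eigenvalues are all
# nonnegative, then A has no negative H-eigenvalues.
```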




Read also

We consider two problems that arise in machine learning applications: the problem of recovering a planted sparse vector in a random linear subspace and the problem of decomposing a random low-rank overcomplete 3-tensor. For both problems, the best known guarantees are based on the sum-of-squares method. We develop new algorithms inspired by analyses of the sum-of-squares method. Our algorithms achieve the same or similar guarantees as sum-of-squares for these problems but the running time is significantly faster. For the planted sparse vector problem, we give an algorithm with running time nearly linear in the input size that approximately recovers a planted sparse vector with up to constant relative sparsity in a random subspace of $\mathbb{R}^n$ of dimension up to $\tilde{\Omega}(\sqrt{n})$. These recovery guarantees match the best known ones of Barak, Kelner, and Steurer (STOC 2014) up to logarithmic factors. For tensor decomposition, we give an algorithm with running time close to linear in the input size (with exponent $\approx 1.086$) that approximately recovers a component of a random 3-tensor over $\mathbb{R}^n$ of rank up to $\tilde{\Omega}(n^{4/3})$. The best previous algorithm for this problem due to Ge and Ma (RANDOM 2015) works up to rank $\tilde{\Omega}(n^{3/2})$ but requires quasipolynomial time.
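The sketch below is not the paper's near-linear-time algorithm; it only illustrates the tensor decomposition task itself with classical tensor power iteration on a random low-rank symmetric 3-tensor (all sizes and names here are toy choices of mine).

```python
import numpy as np

def power_iteration(T, iters=200, seed=0):
    """Tensor power iteration: repeat x <- T(., x, x) / ||T(., x, x)||."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(T.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        y = np.einsum('ijk,j,k->i', T, x, x)  # contract T with x in two modes
        x = y / np.linalg.norm(y)
    return x

# Random rank-3 symmetric 3-tensor T = sum_i a_i (x) a_i (x) a_i over R^20.
n, r = 20, 3
rng = np.random.default_rng(1)
comps = rng.standard_normal((r, n))
T = sum(np.einsum('i,j,k->ijk', a, a, a) for a in comps)

x = power_iteration(T)
print("best correlation with a planted component:",
      max(abs(x @ a) / np.linalg.norm(a) for a in comps))
```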
Boaz Barak, David Steurer (2014)
In order to obtain the best-known guarantees, algorithms are traditionally tailored to the particular problem we want to solve. Two recent developments, the Unique Games Conjecture (UGC) and the Sum-of-Squares (SOS) method, surprisingly suggest that this tailoring is not necessary and that a single efficient algorithm could achieve best possible guarantees for a wide range of different problems. The UGC is a tantalizing conjecture in computational complexity which, if true, would shed light on the complexity of a great many problems. In particular, this conjecture predicts that a single concrete algorithm provides optimal guarantees among all efficient algorithms for a large class of computational problems. The SOS method is a general approach for solving systems of polynomial constraints. This approach is studied in several scientific disciplines, including real algebraic geometry, proof complexity, control theory, and mathematical programming, and has found applications in fields as diverse as quantum information theory, formal verification, and game theory, among many others. We survey some connections that were recently uncovered between the Unique Games Conjecture and the Sum-of-Squares method. In particular, we discuss new tools to rigorously bound the running time of the SOS method for obtaining approximate solutions to hard optimization problems, and how these tools open up the possibility that the sum-of-squares method provides new guarantees for many problems of interest, and possibly even refutes the UGC.
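For a concrete taste of what an SOS certificate looks like (my example, not one from the survey), note that exhibiting a polynomial as a sum of squares is already a proof of its global nonnegativity:

```latex
% A degree-2 sum-of-squares certificate for p(x, y) = x^2 - 2xy + 2y^2:
\[
  x^2 - 2xy + 2y^2 \;=\; (x - y)^2 + y^2 \;\ge\; 0 .
\]
```

The SOS method searches for such decompositions systematically via semidefinite programming.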
We introduce a new framework for unifying and systematizing the performance analysis of first-order black-box optimization algorithms for unconstrained convex minimization. The low-cost iteration complexity enjoyed by first-order algorithms renders them particularly relevant for applications in machine learning and large-scale data analysis. Relying on sum-of-squares (SOS) optimization, we introduce a hierarchy of semidefinite programs that give increasingly better convergence bounds for higher levels of the hierarchy. Alluding to the power of the SOS hierarchy, we show that the (dual of the) first level corresponds to the Performance Estimation Problem (PEP) introduced by Drori and Teboulle [Math. Program., 145(1):451--482, 2014], a powerful framework for determining convergence rates of first-order optimization algorithms. Consequently, many results obtained within the PEP framework can be reinterpreted as degree-1 SOS proofs, and thus the SOS framework provides a promising new approach for certifying improved rates of convergence by means of higher-order SOS certificates. To determine analytical rate bounds, in this work we use the first level of the SOS hierarchy and derive new results for noisy gradient descent with inexact line search methods (Armijo, Wolfe, and Goldstein).
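As one concrete data point, quoted for illustration from Drori and Teboulle's PEP paper rather than from the new bounds derived here, the first level certifies the tight worst-case rate of plain gradient descent $x_{k+1} = x_k - \frac{1}{L}\nabla f(x_k)$ on an $L$-smooth convex $f$ with minimizer $x_*$:

```latex
% Tight PEP bound after N gradient steps with step size 1/L:
\[
  f(x_N) - f(x_*) \;\le\; \frac{L \, \lVert x_0 - x_* \rVert^2}{4N + 2}.
\]
```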
We give a new approach to the dictionary learning (also known as sparse coding) problem of recovering an unknown $n \times m$ matrix $A$ (for $m \geq n$) from examples of the form $y = Ax + e$, where $x$ is a random vector in $\mathbb{R}^m$ with at most $\tau m$ nonzero coordinates, and $e$ is a random noise vector in $\mathbb{R}^n$ with bounded magnitude. For the case $m = O(n)$, our algorithm recovers every column of $A$ within arbitrarily good constant accuracy in time $m^{O(\log m / \log(\tau^{-1}))}$, in particular achieving polynomial time if $\tau = m^{-\delta}$ for any $\delta > 0$, and time $m^{O(\log m)}$ if $\tau$ is (a sufficiently small) constant. Prior algorithms with comparable assumptions on the distribution required the vector $x$ to be much sparser, with at most $\sqrt{n}$ nonzero coordinates, and there were intrinsic barriers preventing these algorithms from applying to denser $x$. We achieve this by designing an algorithm for noisy tensor decomposition that can recover, under quite general conditions, an approximate rank-one decomposition of a tensor $T$, given access to a tensor $T'$ that is $\tau$-close to $T$ in the spectral norm (when considered as a matrix). To our knowledge, this is the first algorithm for tensor decomposition that works in the constant spectral-norm noise regime, where there is no guarantee that the local optima of $T$ and $T'$ have similar structures. Our algorithm is based on a novel approach to using and analyzing the Sum of Squares semidefinite programming hierarchy (Parrilo 2000, Lasserre 2001), and it can be viewed as an indication of the utility of this very general and powerful tool for unsupervised learning problems.
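A minimal sketch of the data model only (toy parameters of my choosing; this is not the recovery algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, tau, noise = 64, 128, 0.05, 0.01        # toy sizes; the paper allows m = O(n)

A = rng.standard_normal((n, m)) / np.sqrt(n)  # the unknown dictionary to be recovered

def sample(batch):
    """Draw examples y = A x + e with (tau * m)-sparse x and small noise e."""
    k = max(1, int(tau * m))
    Y = np.empty((batch, n))
    for t in range(batch):
        x = np.zeros(m)
        support = rng.choice(m, size=k, replace=False)
        x[support] = rng.standard_normal(k)
        e = noise * rng.standard_normal(n)    # stand-in for bounded-magnitude noise
        Y[t] = A @ x + e
    return Y

print(sample(1000).shape)  # (1000, 64)
```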
Yuning Yang, Qingzhi Yang (2011)
In this paper, we mainly focus on how to generalize some conclusions from nonnegative irreducible tensors to nonnegative weakly irreducible tensors. To do so, a basic and important lemma is proven using new tools. First, we give the definition of stochastic tensors. Then we show that every nonnegative weakly irreducible tensor with spectral radius one is diagonally similar to a unique weakly irreducible stochastic tensor. Based on this, we prove some important lemmas that help us generalize the related results. Some counterexamples are provided to show that certain conclusions for nonnegative irreducible tensors do not hold for nonnegative weakly irreducible tensors.
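As background for the spectral-radius statements above, here is a hedged sketch of the classical Ng-Qi-Zhou (NQZ) power-type iteration for the spectral radius of a nonnegative tensor, which converges in the weakly irreducible setting; the toy tensor and tolerance are my choices, and this is standard machinery rather than this paper's contribution.

```python
import numpy as np

def nqz_spectral_radius(T, iters=500, tol=1e-10):
    """Ng-Qi-Zhou power-type iteration for the spectral radius of a nonnegative tensor."""
    m, n = T.ndim, T.shape[0]
    x = np.ones(n)
    lo = hi = 0.0
    for _ in range(iters):
        y = T
        for _ in range(m - 1):        # y = T x^{m-1}: contract all modes but the first
            y = y @ x
        ratios = y / x ** (m - 1)     # Collatz-Wielandt bounds: min <= rho(T) <= max
        lo, hi = ratios.min(), ratios.max()
        if hi - lo < tol:
            break
        x = y ** (1.0 / (m - 1))      # next iterate, entrywise (m-1)-th root
        x /= x.max()                  # renormalize to keep iterates bounded
    return 0.5 * (lo + hi)

# Toy strictly positive (hence weakly irreducible) 3rd-order tensor.
rng = np.random.default_rng(0)
T = rng.random((3, 3, 3))
print("estimated spectral radius:", nqz_spectral_radius(T))
```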