We consider the problem of decomposing higher-order moment tensors, i.e., the sum of symmetric outer products of data vectors. Such a decomposition can be used to estimate the means in a Gaussian mixture model and for other applications in machine learning. The $d$th-order empirical moment tensor of a set of $p$ observations of $n$ variables is a symmetric $d$-way tensor. Our goal is to find a low-rank tensor approximation comprising $r \ll p$ symmetric outer products. The challenge is that forming the empirical moment tensor costs $O(pn^d)$ operations and $O(n^d)$ storage, which may be prohibitively expensive; additionally, the standard algorithm for computing the low-rank approximation costs $O(n^d)$ per iteration. Our contribution is to avoid forming the moment tensor altogether, computing the low-rank tensor approximation implicitly using $O(pnr)$ operations per iteration and no extra memory. This advance opens the door to more applications of higher-order moments since they can now be computed efficiently. We present numerical evidence of the computational savings and show an example of estimating means via higher-order moments.
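To make the key kernel concrete, the sketch below (a minimal illustration, not the paper's implementation) shows how the contraction of the empirical moment tensor with a vector, the workhorse of gradient-based fitting of the rank-$r$ model, can be evaluated directly from the data matrix; the function names and the $p \times n$ data layout are assumptions made for illustration.

```python
import numpy as np

def implicit_moment_contraction(X, v, d):
    """Contract the d-th order empirical moment tensor
    M = (1/p) * sum_i x_i^{outer d} with the vector v, (d-1) times,
    without ever forming M.

    X : (p, n) data matrix, one observation per row (layout assumed here).
    v : (n,) vector.
    Returns the n-vector (1/p) * sum_i (x_i^T v)^{d-1} x_i in O(pn) work,
    versus O(n^d) if M were formed explicitly.
    """
    p = X.shape[0]
    s = X @ v                        # inner products x_i^T v, shape (p,)
    return X.T @ (s ** (d - 1)) / p

def implicit_moment_inner(X, v, d):
    """<M, v^{outer d}> = (1/p) * sum_i (x_i^T v)^d, also in O(pn) work."""
    return np.mean((X @ v) ** d)

# Small usage example with random data.
rng = np.random.default_rng(0)
p, n, d = 1000, 20, 4
X = rng.standard_normal((p, n))
v = rng.standard_normal(n)
print(implicit_moment_contraction(X, v, d).shape)   # (20,)
print(implicit_moment_inner(X, v, d))
```

Applying this kernel once per factor vector gives the $O(pnr)$ per-iteration cost quoted above.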
We consider the problem of decomposing a real-valued symmetric tensor as the sum of outer products of real-valued, pairwise orthogonal vectors. Such decompositions do not generally exist, but we show that some symmetric tensor decomposition problems …
We introduce the Subspace Power Method (SPM) for calculating the CP decomposition of low-rank, even-order, real symmetric tensors. This algorithm applies the tensor power method of Kolda-Mayo to a certain modified tensor, constructed from a matrix flattening …
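For context, the sketch below gives a minimal version of the shifted symmetric higher-order power iteration of Kolda and Mayo referenced above, written here for a dense fourth-order symmetric tensor; the fixed shift `alpha`, the iteration count, and the dense `einsum` contraction are illustrative choices, not SPM itself.

```python
import numpy as np

def shifted_symmetric_power_iteration(T, x0, alpha=1.0, iters=200):
    """Shifted symmetric higher-order power iteration (sketch) for a
    dense symmetric 4th-order tensor T of shape (n, n, n, n).

    Iterates x <- normalize(T x^3 + alpha * x) and returns the
    eigenvalue estimate lambda = T x^4 together with the eigenvector x.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        g = np.einsum('ijkl,j,k,l->i', T, x, x, x) + alpha * x
        x = g / np.linalg.norm(g)
    lam = np.einsum('ijkl,i,j,k,l->', T, x, x, x, x)
    return lam, x

# Example: a rank-1 symmetric tensor a⊗a⊗a⊗a recovers a (up to sign).
rng = np.random.default_rng(0)
a = rng.standard_normal(5)
T = np.einsum('i,j,k,l->ijkl', a, a, a, a)
lam, x = shifted_symmetric_power_iteration(T, rng.standard_normal(5))
print(lam, np.linalg.norm(a) ** 4)   # the two values should be close
```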
In this paper we study the problem of recovering a tensor network decomposition of a given tensor $\mathcal{T}$ in which the tensors at the vertices of the network are orthogonally decomposable. Specifically, we consider tensor networks in the form of …
This paper discusses the problem of symmetric tensor decomposition on a given variety $X$: decomposing a symmetric tensor into the sum of tensor powers of vectors contained in $X$. We first study geometric and algebraic properties of …
We propose a new algorithm for computing the tensor rank decomposition or canonical polyadic decomposition of higher-order tensors subject to a rank and genericity constraint. Reformulating this as a system of polynomial equations allows us to leverage …