Block-Randomized Gradient Descent Methods with Importance Sampling for CP Tensor Decomposition


Abstract

This work considers the problem of computing the CANDECOMP/PARAFAC (CP) decomposition of large tensors. A popular approach is to reformulate the problem as a sequence of overdetermined least squares subproblems with Khatri-Rao product (KRP) structure. In this work, for tensors whose fibers have different levels of importance, we combine stochastic optimization with randomized sampling and present a mini-batch stochastic gradient descent algorithm with importance sampling for these special least squares subproblems. Four different sampling strategies are provided. They avoid forming the full KRP or the corresponding probabilities and instead sample the desired fibers directly from the original tensor. Moreover, a more practical variant with adaptive step size is also given. For the proposed algorithms, we establish convergence properties and report numerical performance. Results on synthetic data show that our algorithms outperform existing algorithms in terms of accuracy or the number of iterations.
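To make the setting concrete, the sketch below illustrates the kind of update the abstract describes: one mini-batch stochastic gradient step with importance sampling for the mode-1 least squares subproblem of a third-order CP decomposition. It is a minimal illustration, not the paper's algorithm; the function name `sgd_step_mode1`, the choice of sampling probabilities (proportional to the product of squared factor row norms), and the fixed step size are all assumptions made for the example, and the paper's four sampling strategies and adaptive step size may differ.

```python
import numpy as np

def sgd_step_mode1(X, A, B, C, batch_size=64, step_size=1e-2, rng=None):
    """One mini-batch SGD step with importance sampling for the mode-1
    CP subproblem  min_A || X_(1) - A (C kr B)^T ||_F^2.

    Hypothetical sketch: sampling probabilities are taken proportional to
    the product of squared row norms of B and C, which avoids forming the
    full Khatri-Rao product (KRP) or its row probabilities explicitly.
    """
    rng = np.random.default_rng() if rng is None else rng
    I, J, K = X.shape

    # Row-wise sampling probabilities for the factor matrices B and C.
    pB = np.sum(B**2, axis=1); pB /= pB.sum()
    pC = np.sum(C**2, axis=1); pC /= pC.sum()

    # Sample fiber indices (j, k) independently; joint probability pB[j]*pC[k].
    js = rng.choice(J, size=batch_size, p=pB)
    ks = rng.choice(K, size=batch_size, p=pC)
    p = pB[js] * pC[ks]                      # (S,)

    # KRP rows for the sampled fibers: z_s = B[j_s] * C[k_s] (Hadamard product),
    # so the full KRP is never formed.
    Z = B[js] * C[ks]                        # (S, R)

    # Corresponding mode-1 fibers, sampled directly from the original tensor.
    F = X[:, js, ks]                         # (I, S)

    # Unbiased importance-weighted stochastic gradient with respect to A.
    resid = A @ Z.T - F                      # (I, S)
    grad = (resid / (batch_size * p)) @ Z    # (I, R)

    return A - step_size * grad
```

Under this setup, cycling such steps over the three modes gives a block-randomized gradient descent scheme; the importance weights 1/(S p_s) keep the mini-batch gradient an unbiased estimate of the full subproblem gradient.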
