Tensor completion estimates the missing values of high-order data from its partially observed entries. Recent works show that low-rank tensor ring approximation is one of the most powerful tools for solving the tensor completion problem. However, existing algorithms require a predefined tensor ring rank, which may be hard to determine in practice. To address this issue, we propose a hierarchical tensor ring decomposition for a more compact representation. We use the standard tensor ring to decompose a tensor into several 3-order sub-tensors in the first layer, and each sub-tensor is further factorized by the tensor singular value decomposition (t-SVD) in the second layer. In low-rank tensor completion based on the proposed decomposition, the zero elements of the 3-order core tensors are pruned in the second layer, which helps to determine the tensor ring rank automatically. To further enhance the recovery performance, we use total variation to exploit the locally piecewise-smooth structure of the data. The alternating direction method of multipliers divides the optimization model into several subproblems, each of which can be solved efficiently. Numerical experiments on color images and hyperspectral images demonstrate that the proposed algorithm outperforms state-of-the-art ones in terms of recovery accuracy.
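To make the first-layer structure concrete, the following is a minimal NumPy sketch of rebuilding a full tensor from tensor ring (TR) cores; the function name tr_reconstruct, the core shapes (r_k, n_k, r_{k+1}), and the toy sizes are illustrative assumptions, not the authors' implementation.

import numpy as np

def tr_reconstruct(cores):
    """Rebuild a full tensor from tensor-ring (TR) cores.

    Each 3-order core G_k has shape (r_k, n_k, r_{k+1}) and the ranks wrap
    around (r_{N+1} = r_1), so the matrix chain closes into a ring:
        X[i_1, ..., i_N] = trace( G_1[:, i_1, :] @ ... @ G_N[:, i_N, :] )
    """
    dims = [G.shape[1] for G in cores]
    X = np.zeros(dims)
    for idx in np.ndindex(*dims):
        M = np.eye(cores[0].shape[0])
        for k, i in enumerate(idx):
            M = M @ cores[k][:, i, :]
        X[idx] = np.trace(M)
    return X

# Toy usage: random TR cores for a 4 x 5 x 6 tensor with ring ranks (2, 3, 2)
ranks = [2, 3, 2, 2]  # r_1, r_2, r_3, and r_4 = r_1 to close the ring
dims = [4, 5, 6]
cores = [np.random.randn(ranks[k], dims[k], ranks[k + 1]) for k in range(3)]
X = tr_reconstruct(cores)
print(X.shape)  # (4, 5, 6)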
Tensor decomposition is a popular technique for tensor completion. However, most existing methods are based on linear or shallow models; when the data tensor becomes large and the observed data are very scarce, they are prone to overfitting and t
In this paper, we consider the tensor completion problem, which has attracted particular attention from researchers in machine learning. Our fast and precise method is built on extending the $L_{2,1}$-norm minimization and QR decomposition (LNM-
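For reference, one common convention defines the $L_{2,1}$ norm of a matrix as the sum of the Euclidean norms of its rows (some papers sum over columns instead); the short sketch below, with the illustrative function name l21_norm, computes it under the row convention.

import numpy as np

def l21_norm(A):
    # Sum of the Euclidean (L2) norms of the rows of A.
    return float(np.sum(np.linalg.norm(A, axis=1)))

A = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [1.0, 2.0]])
print(l21_norm(A))  # 5.0 + 0.0 + sqrt(5) ~= 7.236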
This paper considers the completion problem for a tensor (also referred to as a multidimensional array) from limited sampling. Our greedy method is based on extending the low-rank approximation pursuit (LRAP) method for matrix completion to tensor c
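As a point of reference for the matrix case, the sketch below shows a generic greedy rank-one pursuit for matrix completion, where each step adds the leading singular pair of the residual on the observed entries; the names greedy_rank_one_completion and num_steps are assumptions, and this simplified stand-in should not be read as the LRAP algorithm itself.

import numpy as np

def greedy_rank_one_completion(M_obs, mask, num_steps=20):
    """Greedy rank-one pursuit for matrix completion (generic sketch)."""
    X = np.zeros_like(M_obs)
    for _ in range(num_steps):
        R = mask * (M_obs - X)              # residual on observed entries
        U, s, Vt = np.linalg.svd(R, full_matrices=False)
        u, v = U[:, 0:1], Vt[0:1, :]        # leading singular pair
        D = u @ v                           # rank-one search direction
        Dm = mask * D
        # Step size from a least-squares fit on the observed entries
        alpha = np.sum(R * Dm) / max(np.sum(Dm * Dm), 1e-12)
        X = X + alpha * D
    return X

# Toy usage: recover a rank-2 matrix from ~50% observed entries
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = (rng.random(M.shape) < 0.5).astype(float)
X_hat = greedy_rank_one_completion(mask * M, mask)
print(np.linalg.norm(mask * (M - X_hat)) / np.linalg.norm(mask * M))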
Tensor decompositions such as the canonical format and the tensor train format have been widely utilized to reduce storage costs and operational complexities for high-dimensional data, achieving linear scaling with the input dimension instead of expo
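As a rough illustration of that linear scaling, the sketch below rebuilds a tensor from tensor-train (TT) cores and compares parameter counts; the function name tt_reconstruct and the toy sizes are assumptions for the example, not taken from the paper.

import numpy as np

def tt_reconstruct(cores):
    """Rebuild a full tensor from tensor-train (TT) cores.

    Core G_k has shape (r_{k-1}, n_k, r_k) with boundary ranks
    r_0 = r_N = 1, so the matrix chain collapses to a scalar:
        X[i_1, ..., i_N] = G_1[:, i_1, :] @ ... @ G_N[:, i_N, :]
    """
    dims = [G.shape[1] for G in cores]
    X = np.zeros(dims)
    for idx in np.ndindex(*dims):
        M = cores[0][:, idx[0], :]
        for k in range(1, len(cores)):
            M = M @ cores[k][:, idx[k], :]
        X[idx] = M[0, 0]
    return X

# Storage for an n x ... x n tensor of order N:
# full tensor: n**N entries; TT cores: roughly N * n * r**2 entries.
n, N, r = 10, 5, 3
ranks = [1] + [r] * (N - 1) + [1]
cores = [np.random.randn(ranks[k], n, ranks[k + 1]) for k in range(N)]
tt_params = sum(G.size for G in cores)
print(tt_params, "TT parameters vs", n ** N, "full entries")
X = tt_reconstruct(cores)
print(X.shape)  # (10, 10, 10, 10, 10)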
Low-rank tensor completion recovers missing entries based on different tensor decompositions. Due to its outstanding performance in exploiting higher-order data structure, the low-rank tensor ring decomposition has been applied to tensor completion. To further de