In recent years, low-rank tensor completion, a higher-order extension of matrix completion, has received considerable attention. However, the low-rank assumption alone is not sufficient for recovering visual data, such as color and 3D images, when the ratio of missing data is extremely high. In this paper, we consider smoothness constraints as well as low-rank approximations, and propose an efficient tensor completion algorithm that is particularly powerful for visual data. The proposed method offers significant advantages owing to its integration of smooth PARAFAC decomposition for incomplete tensors with efficient model selection for minimizing the tensor rank; we therefore term it smooth PARAFAC tensor completion (SPC). To impose the smoothness constraints, we employ two strategies, total variation (SPC-TV) and quadratic variation (SPC-QV), and derive the corresponding algorithms for model learning. Extensive experimental evaluations on both synthetic and real-world visual data demonstrate significant improvements of our method, in terms of both prediction performance and efficiency, over many state-of-the-art tensor completion methods.
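For intuition only, here is a minimal sketch of the kind of model this abstract describes: a rank-R PARAFAC fit to the observed entries of a 3-way tensor with a quadratic-variation (QV) smoothness penalty on the factor columns, optimized by plain gradient descent. This is not the SPC algorithm itself; all function names and parameter values are illustrative assumptions.

```python
# Illustrative sketch only: PARAFAC completion with a quadratic-variation penalty,
# optimized by gradient descent (not the SPC update rules from the paper).
import numpy as np

def qv_grad(U):
    """Gradient of the quadratic variation sum_i ||U[i+1] - U[i]||^2."""
    D = np.diff(U, axis=0)
    g = np.zeros_like(U)
    g[:-1] -= 2 * D
    g[1:] += 2 * D
    return g

def parafac_qv_completion(T, mask, rank=3, lam=0.1, lr=0.05, iters=3000, seed=0):
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank)) * 0.1
    B = rng.standard_normal((J, rank)) * 0.1
    C = rng.standard_normal((K, rank)) * 0.1
    for _ in range(iters):
        X = np.einsum('ir,jr,kr->ijk', A, B, C)   # current PARAFAC reconstruction
        E = mask * (X - T)                        # residual on observed entries only
        gA = np.einsum('ijk,jr,kr->ir', E, B, C)
        gB = np.einsum('ijk,ir,kr->jr', E, A, C)
        gC = np.einsum('ijk,ir,jr->kr', E, A, B)
        for U, g in ((A, gA), (B, gB), (C, gC)):
            U -= lr * (g + lam * qv_grad(U))      # data fit + smoothness penalty
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Toy usage: complete a synthetic rank-2 tensor with roughly 70% missing entries.
rng = np.random.default_rng(1)
a, b, c = rng.random((10, 2)), rng.random((12, 2)), rng.random((8, 2))
T_true = np.einsum('ir,jr,kr->ijk', a, b, c)
mask = (rng.random(T_true.shape) > 0.7).astype(float)
T_hat = parafac_qv_completion(T_true * mask, mask, rank=2)
print('relative error:', np.linalg.norm(T_hat - T_true) / np.linalg.norm(T_true))
```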
We consider the problem of factorizing a structured 3-way tensor into its constituent Canonical Polyadic (CP) factors. This decomposition, which can be viewed as a generalization of singular value decomposition (SVD) for tensors, reveals how the tens
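As background for the decomposition being discussed, the following is a minimal sketch of plain, unconstrained CP factorization of a 3-way tensor via alternating least squares; the structured variant in the abstract is not reproduced here, and all names are illustrative.

```python
# Minimal sketch of unconstrained CP (PARAFAC) factorization of a 3-way tensor
# via alternating least squares (ALS).
import numpy as np

def unfold(T, mode):
    """Mode-n matricization of a 3-way tensor (rows indexed by the given mode)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product of two factor matrices."""
    R = U.shape[1]
    return (U[:, None, :] * V[None, :, :]).reshape(-1, R)

def cp_als(T, rank, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((dim, rank)) for dim in T.shape]
    for _ in range(iters):
        for mode in range(3):
            # Khatri-Rao of the other two factors, ordered to match unfold().
            others = [factors[m] for m in range(3) if m != mode]
            KR = khatri_rao(others[0], others[1])
            # Least-squares update: unfold(T, mode) ~= factors[mode] @ KR.T
            factors[mode] = np.linalg.lstsq(KR, unfold(T, mode).T, rcond=None)[0].T
    return factors

# Toy usage: factorize a synthetic rank-2 tensor and report the reconstruction fit.
rng = np.random.default_rng(1)
A, B, C = (rng.random((n, 2)) for n in (6, 5, 4))
T = np.einsum('ir,jr,kr->ijk', A, B, C)
Ah, Bh, Ch = cp_als(T, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', Ah, Bh, Ch)
print('relative fit error:', np.linalg.norm(T_hat - T) / np.linalg.norm(T))
```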
This work studies the problem of completing high-dimensional data (referred to as tensors) from partially observed samples. We consider a tensor to be a superposition of multiple low-rank components. In particular, each component can be represented
Tensor completion refers to the task of estimating missing data from an incomplete measurement or observation, a core problem that frequently arises in the areas of big data analysis, computer vision, and network engineering. Due to the mu
In this paper, we consider the tensor completion problem, which has attracted particular attention from researchers in machine learning. Our fast and precise method is built on extending $L_{2,1}$-norm minimization and QR decomposition (LNM-
Higher-order low-rank tensors arise in many data processing applications and have attracted great interest. Inspired by low-rank approximation theory, researchers have proposed a series of effective tensor completion methods. However, most of these m
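As a generic illustration of the low-rank family this abstract refers to (my assumption, not this paper's method), one simple baseline applies soft singular-value thresholding to a mode unfolding while keeping the observed entries fixed:

```python
# Generic low-rank baseline (not this paper's method): soft-impute-style completion
# on the mode-1 unfolding, alternating singular-value shrinkage with re-imposing
# the observed entries.
import numpy as np

def soft_impute_unfold(T_obs, mask, tau=1.0, iters=200):
    I = T_obs.shape[0]
    M = T_obs.reshape(I, -1)                 # mode-1 unfolding of the observed data
    W = mask.reshape(I, -1).astype(bool)     # observation pattern
    X = M.copy()
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt   # soft singular-value thresholding
        X[W] = M[W]                               # keep observed entries fixed
    return X.reshape(T_obs.shape)
```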