
Enhanced nonconvex low-rank approximation of tensor multi-modes for tensor completion

Posted by: Xiaozhen Xie
Publication date: 2020
Research field: Information Engineering
Paper language: English





Higher-order low-rank tensors arise in many data processing applications and have attracted great interest. Inspired by low-rank approximation theory, researchers have proposed a series of effective tensor completion methods. However, most of these methods directly consider only the global low-rankness of the underlying tensor, which is insufficient at low sampling rates; in addition, a single nuclear norm or its relaxation is usually adopted to approximate the rank function, which leads to suboptimal solutions that deviate from the original one. To alleviate these problems, in this paper we propose a novel low-rank approximation of tensor multi-modes (LRATM), in which a double nonconvex $L_{\gamma}$ norm is designed to represent the underlying joint manifold drawn from the mode factorization factors of the underlying tensor. An algorithm based on the block successive upper-bound minimization (BSUM) method is designed to solve the proposed model efficiently, and we demonstrate that our numerical scheme converges to coordinate-wise minimizers. Numerical results on three types of public multi-dimensional datasets show that our algorithm can recover a variety of low-rank tensors with significantly fewer samples than the compared methods.
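The abstract does not spell out the exact form of the nonconvex $L_{\gamma}$ norm or of the BSUM subproblems, so the following Python sketch is only an illustration of the general mode-wise scheme: one BSUM-style sweep that applies a reweighted singular-value shrinkage (a common nonconvex surrogate, assumed here) to every mode unfolding while keeping observed entries fixed. All function names, the penalty form, and the step size are assumptions, not the authors' implementation.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: arranges the mode-`mode` fibers as columns."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of `unfold` for a tensor of the given shape."""
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def nonconvex_shrink(M, gamma, tau):
    """Reweighted singular-value shrinkage -- a stand-in surrogate
    f(s) = s / (s + gamma); the weights are its derivative at s."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    weights = gamma / (s + gamma) ** 2
    s = np.maximum(s - tau * weights, 0.0)
    return (U * s) @ Vt

def mode_sweep(T_obs, mask, X, gamma=1.0, tau=1.0):
    """One sweep over all modes of a BSUM-style update (illustrative)."""
    for mode in range(T_obs.ndim):
        X = fold(nonconvex_shrink(unfold(X, mode), gamma, tau),
                 mode, T_obs.shape)
        X[mask] = T_obs[mask]  # re-impose the observed entries
    return X
```

Iterating `mode_sweep` until the change in `X` is small mimics the block-coordinate flavor of BSUM, where each mode update decreases an upper bound of the objective.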


Read also

In this paper, we propose a novel model to recover a low-rank tensor by simultaneously performing double nuclear norm regularized low-rank matrix factorizations on all mode matricizations of the underlying tensor. A block successive upper-bound minimization algorithm is applied to solve the model. Subsequence convergence of our algorithm can be established, and our algorithm converges to coordinate-wise minimizers under mild conditions. Several experiments on three types of public data sets show that our algorithm can recover a variety of low-rank tensors from significantly fewer samples than the other tensor completion methods tested.
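The factorization approach above is related to the classical variational identity $\|M\|_* = \min_{M=AB} \tfrac{1}{2}(\|A\|_F^2 + \|B\|_F^2)$, which lets a nuclear norm be handled through two small factors. As a hedged sketch (the paper's exact regularizer and updates are not given in the abstract; `factor_update` and `lam` are hypothetical), one alternating ridge-regularized update for a mode matricization `Mn ≈ A @ B` could look like:

```python
import numpy as np

def factor_update(Mn, A, B, lam=0.1):
    """One alternating least-squares pass for Mn ~ A @ B with Frobenius
    penalties on both factors, which implicitly penalizes the rank of
    A @ B via the nuclear-norm variational identity."""
    r = A.shape[1]
    # A-step: argmin_A ||Mn - A @ B||_F^2 + lam * ||A||_F^2
    A = np.linalg.solve(B @ B.T + lam * np.eye(r), B @ Mn.T).T
    # B-step: argmin_B ||Mn - A @ B||_F^2 + lam * ||B||_F^2
    B = np.linalg.solve(A.T @ A + lam * np.eye(r), A.T @ Mn)
    return A, B
```

Running such an update on every mode matricization, then refolding and re-imposing the observed entries, is one plausible way to realize the simultaneous all-mode treatment the abstract describes.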
292 - An-Bao Xu 2020
This paper considers the completion problem for a tensor (also referred to as a multidimensional array) from limited sampling. Our greedy method is based on extending the low-rank approximation pursuit (LRAP) method for matrix completion to tensor completion. The method performs a tensor factorization using the tensor singular value decomposition (t-SVD), which extends the standard matrix SVD to tensors. The t-SVD leads to a notion of rank, called tubal rank here. Our aim is to reconstruct the tensor data from low-resolution samples as accurately as possible. To complete a low-resolution tensor successfully, we assume that the given tensor data has low tubal rank. For tensors of low tubal rank, we establish convergence results for our method based on the tensor restricted isometry property (TRIP). Our result under the TRIP condition for tensors is analogous to low-rank matrix completion under the RIP condition. The TRIP condition uses the t-SVD for low-tubal-rank tensors, while RIP uses the SVD for matrices. We show that a subgaussian measurement map satisfies the TRIP condition with high probability and gives an almost optimal bound on the number of required measurements. We compare the numerical performance of the proposed algorithm with state-of-the-art approaches on video recovery and color image recovery.
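The tubal rank used above comes from the t-SVD, which is computed slice-by-slice in the Fourier domain along the third mode. The sketch below shows only this underlying tubal-rank-$r$ truncation, not the greedy LRAP pursuit itself (which builds the approximation up rank-one term by term); it is a minimal illustration, not the paper's algorithm.

```python
import numpy as np

def tsvd_truncate(T, r):
    """Tubal-rank-r approximation via the t-SVD: FFT along the third
    mode, truncated SVD of each frontal slice, then inverse FFT."""
    Tf = np.fft.fft(T, axis=2)   # frontal slices in the Fourier domain
    out = np.empty_like(Tf)      # complex array, same shape
    for k in range(T.shape[2]):
        U, s, Vt = np.linalg.svd(Tf[:, :, k], full_matrices=False)
        out[:, :, k] = (U[:, :r] * s[:r]) @ Vt[:r, :]
    return np.real(np.fft.ifft(out, axis=2))
```

For a real input tensor the result is real up to floating-point error, which is why the final `np.real` is safe.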
Tensor completion refers to the task of estimating missing data from an incomplete measurement or observation, a core problem frequently arising in big data analysis, computer vision, and network engineering. Due to the multidimensional nature of high-order tensors, matrix approaches, e.g., matrix factorization and direct matricization of tensors, are often not ideal for tensor completion and recovery. In this paper, we introduce a unified low-rank and sparse enhanced Tucker decomposition model for tensor completion. Our model possesses a sparse regularization term to promote a sparse core tensor of the Tucker decomposition, which is beneficial for tensor data compression. Moreover, we enforce low-rank regularization terms on the factor matrices of the Tucker decomposition to induce the low-rankness of the tensor at a cheap computational cost. Numerically, we propose a customized ADMM whose subproblems all admit easy solutions for the underlying model. Remarkably, our model is able to deal with different types of real-world data sets, since it exploits the potential periodicity and inherent correlation properties appearing in tensors. A series of computational experiments on real-world data sets, including internet traffic data, color images, and face recognition, demonstrate that our model achieves higher recovery accuracy than many existing state-of-the-art matricization and tensorization approaches.
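To make the two regularizers above concrete, the sketch below shows the Tucker reconstruction via mode products and the proximal step of an l1 penalty on the core (entrywise soft thresholding), which is what promotes a sparse core. The threshold `lam` and the function names are placeholders; the paper's full ADMM also carries low-rank terms on the factor matrices and an observation constraint that are omitted here.

```python
import numpy as np

def mode_product(T, U, mode):
    """Mode-`mode` product T x_mode U (contracts the columns of U
    with the chosen mode of T)."""
    Tm = np.moveaxis(T, mode, 0)
    return np.moveaxis(np.tensordot(U, Tm, axes=(1, 0)), 0, mode)

def tucker_reconstruct(core, factors):
    """Rebuild the full tensor from a Tucker core and factor matrices."""
    X = core
    for mode, U in enumerate(factors):
        X = mode_product(X, U, mode)
    return X

def sparse_core_step(core, lam):
    """Proximal operator of lam * ||core||_1: entrywise soft threshold."""
    return np.sign(core) * np.maximum(np.abs(core) - lam, 0.0)
```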
162 - Huyan Huang, Yipeng Liu, Ce Zhu 2019
Tensor completion estimates missing components by exploiting the low-rank structure of multi-way data. The recently proposed methods based on the tensor train (TT) and tensor ring (TR) show better performance in image recovery than classical ones. Compared with TT and TR, the projected entangled pair state (PEPS), also called tensor grid (TG), allows more interactions between different dimensions and may lead to a more compact representation. In this paper, we propose to perform image completion based on a low-rank tensor grid. A two-stage density matrix renormalization group algorithm is used to initialize the TG decomposition, which consists of multiple TT decompositions. The latent TG factors can then be obtained by alternately solving least squares problems. To further improve computational efficiency, a multi-linear matrix factorization for low-rank TG completion is developed using parallel matrix factorization. Experimental results on synthetic data and real-world images show the proposed methods outperform existing ones in terms of recovery accuracy.
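The initialization described above builds the TG (PEPS) factors out of several tensor-train decompositions obtained by a two-stage DMRG scheme. As a rough, assumed illustration of the TT building block only, here is the classical sequential-SVD TT decomposition with a uniform rank cap (the DMRG stage and the TG wiring are not shown):

```python
import numpy as np

def tt_svd(T, rank):
    """Classical TT-SVD: peel off one mode at a time with a truncated
    SVD, producing cores of shape (r_prev, dim, r_next)."""
    dims, cores, r_prev = T.shape, [], 1
    C = T.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        C = C.reshape(r_prev * dims[k], -1)
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = min(rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        C = s[:r, None] * Vt[:r]   # carry the remainder forward
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores
```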
210 - Huyan Huang, Yipeng Liu, Ce Zhu 2019
Low-rank tensor completion recovers missing entries based on different tensor decompositions. Due to its outstanding performance in exploiting certain higher-order data structures, the low-rank tensor ring has been applied in tensor completion. To further address its sensitivity to sparse components, as in tensor principal component analysis, we propose robust tensor ring completion (RTRC), which separates a latent low-rank tensor component from a sparse component given a limited number of measurements. The low-rank tensor component is constrained by the weighted sum of nuclear norms of its balanced unfoldings, while the sparse component is regularized by its l1 norm. We analyze the RTRC model and give an exact recovery guarantee. The alternating direction method of multipliers is used to divide the problem into several subproblems with fast solutions. In numerical experiments, we verify the recovery condition of the proposed method on synthetic data and show that the proposed method outperforms the state-of-the-art ones in terms of both accuracy and computational complexity in a number of real-world tasks, i.e., light-field image recovery, shadow removal in face images, and background extraction in color video.
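The low-rank/sparse split in RTRC is, at its core, an alternating proximal scheme: singular value thresholding for the nuclear-norm part and entrywise soft thresholding for the l1 part. The sketch below shows one such step on a single matrix for brevity; in RTRC the nuclear-norm step is actually applied to the balanced unfoldings of the tensor-ring variable, and the step sizes here are placeholders.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    """Entrywise soft thresholding: proximal operator of the l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def split_step(Y, L, S, tau_l=1.0, tau_s=0.1):
    """One alternating pass separating observations Y into a low-rank
    part L and a sparse part S (illustrative, single-matrix version)."""
    L = svt(Y - S, tau_l)
    S = soft(Y - L, tau_s)
    return L, S
```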