
Randomized Quaternion Singular Value Decomposition for Low-Rank Approximation

Added by Zhigang Jia
Publication date: 2020
Language: English





Quaternion matrix approximation problems construct the approximated matrix via the quaternion singular value decomposition (SVD) by selecting some SVD triplets of quaternion matrices. In applications such as color image processing and recognition, only a small number of dominant SVD triplets are selected, while in other applications such as the quaternion total least squares problem, small SVD triplets (small singular values and the associated singular vectors) and the numerical rank with respect to a small threshold are required. In this paper, we propose a randomized quaternion SVD (randsvdQ) method to compute a small number of SVD triplets of a large-scale quaternion matrix. Theoretical results on the approximation errors are given, and the corresponding algorithm is well suited to the low-rank matrix approximation problem. As the restricted rank increases, however, the method may lose information about small SVD triplets. A blocked quaternion randomized SVD algorithm is therefore developed for cases where the numerical rank and information about small singular values are required. For color face recognition problems, numerical results show good performance of the developed quaternion randomized SVD method for low-rank approximation of a large-scale quaternion matrix. The blocked randomized SVD algorithm is also shown to be more robust than the unblocked method in several experiments, and the approximation errors from the blocked scheme are very close to the optimal errors obtained by truncating a full SVD.
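The randsvdQ method itself is formulated in quaternion arithmetic. As a rough illustration of the randomized-SVD machinery it builds on, the sketch below (our own assumptions, not the paper's implementation) represents a quaternion matrix by its complex adjoint chi(A) and applies a standard Gaussian range finder; the helper names quat_to_complex_adjoint and randomized_svd are ours, and the duplicated singular values of chi(A) are read off in pairs.

```python
# A minimal sketch, assuming the complex adjoint representation of a quaternion
# matrix and a standard randomized range finder; not the paper's randsvdQ code.
import numpy as np

def quat_to_complex_adjoint(A0, A1, A2, A3):
    """Map A = A0 + A1 i + A2 j + A3 k (real m x n arrays) to the 2m x 2n complex
    adjoint [[C1, C2], [-conj(C2), conj(C1)]] with C1 = A0 + i*A1, C2 = A2 + i*A3."""
    C1, C2 = A0 + 1j * A1, A2 + 1j * A3
    return np.block([[C1, C2], [-np.conj(C2), np.conj(C1)]])

def randomized_svd(M, k, oversample=10, power_iters=2, seed=None):
    """Plain randomized SVD: Gaussian sketch, QR range finder, small dense SVD."""
    rng = np.random.default_rng(seed)
    p = k + oversample
    Omega = rng.standard_normal((M.shape[1], p)) + 1j * rng.standard_normal((M.shape[1], p))
    Y = M @ Omega
    for _ in range(power_iters):          # power iterations sharpen the captured range
        Y = M @ (M.conj().T @ Y)
    Q, _ = np.linalg.qr(Y)                # orthonormal basis for the sampled range
    Ub, s, Vh = np.linalg.svd(Q.conj().T @ M, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vh[:k, :]

# Usage: singular values of the quaternion matrix appear (in exact arithmetic)
# as duplicated singular values of chi(A), so we keep every other one.
rng = np.random.default_rng(0)
A0, A1, A2, A3 = (rng.standard_normal((200, 150)) for _ in range(4))
U, s, Vh = randomized_svd(quat_to_complex_adjoint(A0, A1, A2, A3), k=20)
approx_quaternion_singular_values = s[::2]
```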



Related research

Quaternion matrices are employed successfully in many color image processing applications. In particular, a pure quaternion matrix can be used to represent the red, green and blue channels of a color image. A low-rank approximation of a pure quaternion matrix can be obtained by using the quaternion singular value decomposition. However, this approximation is not optimal in the sense that the resulting low-rank matrix may not be pure quaternion, i.e., it may contain a real component, which is not useful for the representation of a color image. The main contribution of this paper is to find an optimal rank-$r$ pure quaternion matrix approximation for a pure quaternion matrix (a color image). Our idea is to use a projection onto the low-rank quaternion matrix manifold and a projection onto the set of quaternion matrices with zero real component, and to develop an alternating projections algorithm to find such an optimal low-rank pure quaternion matrix approximation; a sketch of this idea follows below. The convergence of the projection algorithm can be established by showing that the low-rank quaternion matrix manifold and the zero-real-component quaternion matrix manifold have a non-trivial intersection point. Numerical examples on synthetic pure quaternion matrices and color images are presented to illustrate that the projection algorithm can find optimal low-rank pure quaternion approximations for pure quaternion matrices or color images.
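A minimal sketch of the alternating-projections idea described above, under our own assumptions: the rank-$r$ projection is computed through the complex adjoint representation and the pure-quaternion projection simply zeros the real part. All helper names are hypothetical, and this is not the authors' code.

```python
# Sketch only: alternate between a rank-r projection (via a truncated SVD of the
# complex adjoint) and the projection onto pure quaternion matrices (zero real part).
import numpy as np

def to_adjoint(A0, A1, A2, A3):
    C1, C2 = A0 + 1j * A1, A2 + 1j * A3
    return np.block([[C1, C2], [-np.conj(C2), np.conj(C1)]])

def from_adjoint(X, m, n):
    # Average the two copies of C1 and C2 to restore the quaternion structure.
    C1 = 0.5 * (X[:m, :n] + np.conj(X[m:, n:]))
    C2 = 0.5 * (X[:m, n:] - np.conj(X[m:, :n]))
    return C1.real, C1.imag, C2.real, C2.imag

def project_rank_r(parts, r):
    m, n = parts[0].shape
    U, s, Vh = np.linalg.svd(to_adjoint(*parts), full_matrices=False)
    Xr = (U[:, :2 * r] * s[:2 * r]) @ Vh[:2 * r, :]   # rank 2r in chi(A) <=> quaternion rank r
    return from_adjoint(Xr, m, n)

def project_pure(parts):
    A0, A1, A2, A3 = parts
    return np.zeros_like(A0), A1, A2, A3              # drop the real component

def alternating_projections(parts, r, iters=50):
    for _ in range(iters):
        parts = project_pure(project_rank_r(parts, r))
    return parts                                      # pure quaternion, approximately rank r
```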
The hierarchical SVD provides a quasi-best low-rank approximation of high-dimensional data in the hierarchical Tucker framework. Similar to the SVD for matrices, it is a fundamental but expensive tool for tensor computations. In the present work we examine generalizations of randomized matrix decomposition methods to higher-order tensors in the framework of the hierarchical tensor representation. In particular, we present and analyze a randomized algorithm for the calculation of the hierarchical SVD (HSVD) in the tensor train (TT) format.
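As a rough, self-contained illustration of how a randomized range finder can replace the dense SVD in a left-to-right TT sweep (a sketch under our own assumptions; it is not the paper's algorithm and carries none of its error analysis):

```python
# Sketch: each sequential unfolding is compressed with a Gaussian range finder
# instead of a full SVD; randomized_range and tt_randomized are our own names.
import numpy as np

def randomized_range(M, k, oversample=5, seed=None):
    """Orthonormal basis for the (approximate) range of M via Gaussian sketching."""
    rng = np.random.default_rng(seed)
    Y = M @ rng.standard_normal((M.shape[1], k + oversample))
    Q, _ = np.linalg.qr(Y)
    return Q[:, :min(k, Q.shape[1])]

def tt_randomized(tensor, ranks):
    """Left-to-right TT sweep; ranks[i] is the TT rank cut after mode i."""
    dims = tensor.shape
    cores, r_prev, C = [], 1, tensor
    for i, d in enumerate(dims[:-1]):
        C = C.reshape(r_prev * d, -1)
        Q = randomized_range(C, ranks[i])       # (r_prev * d) x r_i basis
        cores.append(Q.reshape(r_prev, d, -1))
        C = Q.T @ C                             # remainder carried to the next mode
        r_prev = Q.shape[1]
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores

# Usage: a 4-way tensor compressed to TT ranks (3, 3, 3).
T = np.random.default_rng(0).standard_normal((8, 9, 10, 11))
cores = tt_randomized(T, ranks=[3, 3, 3])
print([c.shape for c in cores])   # [(1, 8, 3), (3, 9, 3), (3, 10, 3), (3, 11, 1)]
```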
In this paper we propose an approach to approximate a truncated singular value decomposition of a large structured matrix. By first decomposing the matrix into a sum of Kronecker products, our approach can be used to approximate a large number of singular values and vectors more efficiently than other well-known schemes, such as randomized matrix algorithms or iterative algorithms based on Golub-Kahan bidiagonalization. We provide theoretical results and numerical experiments to demonstrate the accuracy of our approximation and show how the approximation can be used to solve large-scale ill-posed inverse problems, either as an approximate filtering method or as a preconditioner to accelerate iterative algorithms.
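A small numerical check of why Kronecker structure helps: for a single term $B \otimes C$ the SVD factors as $(U_B \otimes U_C)(\Sigma_B \otimes \Sigma_C)(V_B \otimes V_C)^T$, so the singular values of the large matrix are products of the small factors' singular values. This is a textbook fact used for illustration only, not the paper's full method.

```python
# Verify: singular values of kron(B, C) are the pairwise products of the
# singular values of B and C, computed here without forming any large SVD factors.
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((30, 20))
C = rng.standard_normal((40, 25))

sB = np.linalg.svd(B, compute_uv=False)
sC = np.linalg.svd(C, compute_uv=False)
approx = np.sort(np.outer(sB, sC).ravel())[::-1]          # products, sorted descending

exact = np.linalg.svd(np.kron(B, C), compute_uv=False)    # direct SVD of the 1200 x 500 matrix
print(np.allclose(approx, exact))                          # True (up to round-off)
```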
Quaternion singular value decomposition (QSVD) is a robust technique for digital watermarking which can extract high-quality watermarks from watermarked images with low distortion. In this paper, the QSVD technique is further investigated and an efficient robust watermarking scheme is proposed. An improved algebraic structure-preserving method is proposed to handle the explosion of complexity that occurs in the conventional QSVD design. Secret information is transmitted blindly by incorporating two new strategies into QSVD, namely coefficient pair selection and adaptive embedding. Unlike conventional QSVD, which embeds watermarks in a single imaginary unit, we propose to adaptively embed the watermark into the optimal hiding position using the normalized cross-correlation (NC) method. This avoids selecting a coefficient pair with low correlation and thus reduces the embedding impact by decreasing the maximum modification of the coefficient values. In this way, compared with conventional QSVD, the proposed watermarking strategy avoids more modifications to a single color image layer, and a better visual quality of the watermarked image is observed. Meanwhile, adaptive QSVD resists some common geometric attacks and improves the robustness of conventional QSVD. With these improvements, our method outperforms conventional QSVD. Its superiority over other state-of-the-art methods is also demonstrated experimentally.
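A toy sketch of the channel-selection step only: the normalized cross-correlation measure below is standard, but the way it is used to score channels here is our own hypothetical wiring, not the paper's embedding scheme.

```python
# Hypothetical illustration: score candidate hiding channels with normalized
# cross-correlation (NC) and embed in the highest-scoring one.
import numpy as np

def normalized_cross_correlation(x, y):
    """NC of two equally sized 2D arrays, in [-1, 1]."""
    x = x - x.mean()
    y = y - y.mean()
    return float(np.sum(x * y) / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12))

def pick_channel(channels, watermark):
    """channels: dict mapping an imaginary-unit name ('i', 'j', 'k') to a 2D array."""
    scores = {name: normalized_cross_correlation(c, watermark) for name, c in channels.items()}
    return max(scores, key=scores.get), scores
```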
Huamin Li, Yuval Kluger, 2016
Randomized algorithms provide solutions to two ubiquitous problems: (1) the distributed calculation of a principal component analysis or singular value decomposition of a highly rectangular matrix, and (2) the distributed calculation of a low-rank approximation (in the form of a singular value decomposition) to an arbitrary matrix. Carefully honed algorithms yield results that are uniformly superior to those of the stock, deterministic implementations in Spark (the popular platform for distributed computation); in particular, whereas the stock software will without warning return left singular vectors that are far from numerically orthonormal, a significantly burnished randomized implementation generates left singular vectors that are numerically orthonormal to nearly the machine precision.
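A plain-NumPy illustration of the orthonormality point (not the distributed Spark implementation): measure how far a computed factor U is from orthonormal, and fold one extra QR pass back into the factorization to restore orthonormality without changing the product. The helper names are ours.

```python
# Sketch: quantify loss of orthonormality in U and repair it with one QR pass,
# keeping the product U @ diag(s) @ Vh unchanged.
import numpy as np

def orthonormality_error(U):
    """Spectral-norm distance of U^H U from the identity."""
    return np.linalg.norm(U.conj().T @ U - np.eye(U.shape[1]), 2)

def reorthonormalize(U, s, Vh):
    """Return an equivalent factorization whose left factor is numerically orthonormal."""
    Q, R = np.linalg.qr(U)
    Ur, sr, Vhr = np.linalg.svd(R * s, full_matrices=False)   # R @ diag(s)
    return Q @ Ur, sr, Vhr @ Vh

# Usage: a slightly drifted U (mimicking accumulated round-off) is repaired.
rng = np.random.default_rng(3)
U0, _ = np.linalg.qr(rng.standard_normal((1000, 20)))
U = U0 + 1e-6 * rng.standard_normal(U0.shape)
s, Vh = np.linspace(10.0, 1.0, 20), np.eye(20)
print(orthonormality_error(U))                      # noticeably above machine precision
U2, s2, Vh2 = reorthonormalize(U, s, Vh)
print(orthonormality_error(U2))                     # back near machine precision
```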