
Simultaneous Orthogonal Matching Pursuit With Noise Stabilization: Theoretical Analysis

Publication date: 2015
Language: English





This paper studies the joint support recovery of similar sparse vectors from a limited number of noisy linear measurements, i.e., in a multiple measurement vector (MMV) model. The additive noise on each measurement vector is assumed to be Gaussian, with a variance that may differ from one measurement vector to another. The simultaneous orthogonal matching pursuit (SOMP) algorithm is generalized so that the influence of each measurement vector on the choice of the atoms to be picked is weighted according to its noise level. The new algorithm is referred to as SOMP-NS, where NS stands for noise stabilization. A theoretical framework to analyze the performance of the proposed algorithm is first developed. This framework is then used to derive conservative lower bounds on the probability of partial or full joint support recovery. Numerical simulations show that the proposed algorithm outperforms SOMP and that the theoretical lower bound gives considerable insight into how SOMP-NS behaves when the weighting strategy is modified.
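To make the weighted selection step concrete, here is a minimal NumPy sketch of a noise-weighted SOMP loop. The exact weighting and aggregation rule of SOMP-NS is derived in the paper; the inverse-variance weights and the squared-correlation aggregation used below are assumptions made purely for illustration.

```python
import numpy as np

def somp_ns_sketch(A, Y, sigmas, n_iter):
    """Illustrative noise-weighted SOMP loop (not the paper's exact rule).

    A      : (m, n) dictionary shared by all measurement vectors.
    Y      : (m, K) matrix whose columns are the measurement vectors.
    sigmas : length-K noise standard deviations, one per column of Y.
    n_iter : number of atoms to select.
    """
    weights = 1.0 / np.asarray(sigmas) ** 2   # assumed inverse-variance weighting
    support = []
    R = Y.copy()                              # residual matrix
    for _ in range(n_iter):
        # Correlate every atom with every residual column, then combine the
        # per-column scores using the noise-dependent weights.
        corr = np.abs(A.T @ R)                # shape (n, K)
        scores = corr ** 2 @ weights          # weighted aggregation over channels
        support.append(int(np.argmax(scores)))
        # Orthogonal projection step: refit all columns on the current support.
        A_S = A[:, support]
        X_S, *_ = np.linalg.lstsq(A_S, Y, rcond=None)
        R = Y - A_S @ X_S
    return sorted(support)
```

With equal noise levels the weights are identical and the loop reduces to plain SOMP, which matches the intent of the generalization described above.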




Related research

Mingrui Yang, Frank de Hoog (2014)
In this paper we define a new coherence index, named the global 2-coherence, of a given dictionary and study its relationship with the traditional mutual coherence and the restricted isometry constant. By exploring this relationship, we obtain more general results on sparse signal reconstruction using greedy algorithms in the compressive sensing (CS) framework. In particular, we obtain an improved bound over the best known results on the restricted isometry constant for successful recovery of sparse signals using orthogonal matching pursuit (OMP).
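For reference, the mutual coherence mentioned above has the standard definition (largest absolute inner product between distinct unit-norm atoms); the global 2-coherence itself is defined in the paper, so the short NumPy sketch below computes only the mutual coherence of a dictionary.

```python
import numpy as np

def mutual_coherence(D):
    """Largest absolute inner product between distinct, unit-normalized columns of D."""
    Dn = D / np.linalg.norm(D, axis=0, keepdims=True)  # normalize the atoms
    G = np.abs(Dn.T @ Dn)                              # Gram matrix of correlations
    np.fill_diagonal(G, 0.0)                           # ignore self-correlations
    return G.max()

# Example: coherence of a random Gaussian dictionary.
rng = np.random.default_rng(0)
print(mutual_coherence(rng.standard_normal((64, 256))))
```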
Jinming Wen, Rui Zhang (2020)
Exact recovery of $K$-sparse signals $x \in \mathbb{R}^{n}$ from linear measurements $y = Ax$, where $A \in \mathbb{R}^{m \times n}$ is a sensing matrix, arises in many applications. The orthogonal matching pursuit (OMP) algorithm is widely used for reconstructing $x$. A fundamental question in the performance analysis of OMP is the characterization of the probability of exact recovery of $x$ for a random matrix $A$, and of the minimal $m$ that guarantees a target recovery performance. In many practical applications, in addition to sparsity, $x$ also has some additional properties. This paper shows that these properties can be used to refine the answer to the above question. We first show that prior information on the nonzero entries of $x$ can be used to provide an upper bound on $\|x\|_1^2/\|x\|_2^2$. We then use this upper bound to develop a lower bound on the probability of exact recovery of $x$ using OMP in $K$ iterations. Furthermore, we develop a lower bound on the number of measurements $m$ that guarantees that the exact recovery probability using $K$ iterations of OMP is no smaller than a given target probability. Finally, we show that when $K = O(\sqrt{\ln n})$, as both $n$ and $K$ go to infinity, for any $0 < \zeta \leq 1/\sqrt{\pi}$, $m = 2K\ln(n/\zeta)$ measurements are sufficient to ensure that the probability of exactly recovering any $K$-sparse $x$ is no lower than $1-\zeta$ with $K$ iterations of OMP. For $K$-sparse $\alpha$-strongly decaying signals and for $K$-sparse $x$ whose nonzero entries independently and identically follow the Gaussian distribution, the number of measurements sufficient for exact recovery with probability no lower than $1-\zeta$ reduces further to $m = \big(\sqrt{K} + 4\sqrt{\tfrac{\alpha+1}{\alpha-1}\ln(n/\zeta)}\big)^2$ and asymptotically to $m \approx 1.9K\ln(n/\zeta)$, respectively.
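As a point of reference for the recovery results above, here is a minimal sketch of $K$ iterations of plain OMP in NumPy; it is the textbook greedy loop, not an implementation of the probabilistic bounds themselves.

```python
import numpy as np

def omp(A, y, K):
    """Run K iterations of orthogonal matching pursuit on y = A x."""
    support = []
    r = y.copy()
    for _ in range(K):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ r)))
        support.append(j)
        # Re-estimate the coefficients on the enlarged support (orthogonal step).
        x_S, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ x_S
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = x_S
    return x_hat, sorted(support)
```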
Greed is good. However, the tighter you squeeze, the less you have. In this paper, a less greedy algorithm for sparse signal reconstruction in compressive sensing, named orthogonal matching pursuit with thresholding, is studied. Using the global 2-coherence, which provides a bridge between the well-known mutual coherence and the restricted isometry constant, the performance of orthogonal matching pursuit with thresholding is analyzed and more general results for sparse signal reconstruction are obtained. It is also shown that, under the same assumptions on the coherence index and the restricted isometry constant as required for orthogonal matching pursuit, the thresholding variant gives exactly the same reconstruction performance with significantly less complexity.
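The precise thresholding rule is specified in the paper; one plausible reading, assumed here only for illustration, is that each iteration accepts the first atom whose correlation with the residual clears a threshold instead of scanning for the maximum, which is where the complexity saving would come from. A hedged sketch:

```python
import numpy as np

def omp_thresholding_sketch(A, y, K, tau):
    """Illustrative thresholded selection: accept the first atom whose correlation
    with the residual exceeds tau instead of doing a full argmax scan.
    (Assumed reading of the thresholding rule; the paper gives the exact one.)"""
    support, r = [], y.copy()
    for _ in range(K):
        corr = np.abs(A.T @ r)
        # In a real implementation the scan would stop as soon as the threshold
        # is met; the full correlation vector is computed here only for brevity.
        above = np.flatnonzero(corr >= tau)
        j = int(above[0]) if above.size else int(np.argmax(corr))
        support.append(j)
        x_S, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ x_S
    return sorted(support)
```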
Jinming Wen, Wei Yu (2019)
The orthogonal matching pursuit (OMP) algorithm is commonly used for recovering $K$-sparse signals $x \in \mathbb{R}^{n}$ from the linear model $y = Ax$, where $A \in \mathbb{R}^{m \times n}$ is a sensing matrix. A fundamental question in the performance analysis of OMP is the characterization of the probability that it can exactly recover $x$ for a random matrix $A$. Although in many practical applications $x$ usually has some additional property besides sparsity (for example, the nonzero entries of $x$ independently and identically follow the Gaussian distribution), none of the existing analyses use these properties to answer the above question. In this paper, we first show that prior distribution information on $x$ can be used to provide an upper bound on $\|x\|_1^2/\|x\|_2^2$, and we then exploit this bound to develop a sharper lower bound on the probability of exact recovery with OMP in $K$ iterations. Simulation tests are presented to illustrate the superiority of the new bound.
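The quantity $\|x\|_1^2/\|x\|_2^2$ that drives the bound is easy to probe empirically. The snippet below (an illustration, not the paper's bound) compares it with its worst-case value $K$ for Gaussian-distributed nonzero entries, which suggests why such prior information can tighten the analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
K, trials = 20, 10_000
# For any K-sparse x, ||x||_1^2 / ||x||_2^2 <= K by Cauchy-Schwarz; Gaussian
# nonzero entries concentrate well below that worst case (around 2K/pi).
ratios = []
for _ in range(trials):
    x = rng.standard_normal(K)               # the K nonzero entries
    ratios.append(np.sum(np.abs(x)) ** 2 / np.sum(x ** 2))
print(f"mean ratio {np.mean(ratios):.2f} vs worst case {K}")
```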
Rong Fan, Qun Wan, Yipeng Liu (2012)
In this paper, we present new results on using orthogonal matching pursuit (OMP) to solve the sparse approximation problem over redundant dictionaries in the complex case (i.e., complex measurement vector, complex dictionary, and complex additive white Gaussian noise (CAWGN)). A sufficient condition under which OMP can recover the optimal representation of an exactly sparse signal in the complex case is proposed for both the noiseless and the bounded Gaussian noise settings. Analogous to the exact recovery condition (ERC) results in the real case, we extend them to the complex case and derive the corresponding ERC. This theory is leveraged to show that OMP succeeds for $k$-sparse signals from a class of complex dictionaries. In addition, an application to the geometrical theory of diffraction (GTD) model is presented for the complex case. Finally, simulation experiments illustrate the validity of the theoretical analysis.
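In the complex case, the only change to the greedy selection step is that correlations use the conjugate (Hermitian) transpose; below is a minimal sketch assuming the standard complex OMP step, independent of the GTD application.

```python
import numpy as np

def complex_omp(A, y, K):
    """OMP for complex A and y: correlations use the conjugate transpose A^H."""
    support = []
    r = y.astype(complex)
    for _ in range(K):
        j = int(np.argmax(np.abs(A.conj().T @ r)))   # Hermitian inner products
        support.append(j)
        x_S, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ x_S
    return sorted(support)
```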
