In this work, we consider the problem of recovering analysis-sparse signals from under-sampled measurements when some prior information about the support is available. We incorporate this information in the recovery stage by suitably tuning the weights in a weighted $\ell_1$ analysis optimization problem. Specifically, we set the weights so that the method succeeds with the minimum number of measurements. For this purpose, we exploit an upper bound on the statistical dimension of a certain cone to determine the weights. Our numerical simulations confirm that the proposed method with tuned weights outperforms the standard $\ell_1$ analysis technique.
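For concreteness, a minimal cvxpy sketch of a weighted $\ell_1$ analysis program of this form; the analysis operator Omega, the weight vector w, and the problem sizes are placeholders, and the uniform weights below merely stand in for the statistical-dimension-based tuning described above:

import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, p = 60, 30, 80                     # signal, measurement, and analysis dimensions (illustrative)
A = rng.standard_normal((m, n))          # Gaussian sensing matrix
Omega = rng.standard_normal((p, n))      # placeholder analysis operator
y = A @ rng.standard_normal(n)           # measurements of some signal

w = np.ones(p)                           # weights; prior support knowledge would lower some of them

x = cp.Variable(n)
objective = cp.Minimize(cp.norm(cp.multiply(w, Omega @ x), 1))   # weighted l1 analysis objective
cp.Problem(objective, [A @ x == y]).solve()
x_hat = x.value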
The orthogonal matching pursuit (OMP) algorithm is a commonly used algorithm for recovering $K$-sparse signals $x \in \mathbb{R}^{n}$ from the linear model $y = Ax$, where $A \in \mathbb{R}^{m \times n}$ is a sensing matrix. A fundamental question in the performance
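For reference, a compact NumPy sketch of the OMP iteration (greedy column selection followed by a least-squares re-fit on the current support); the function name omp, the fixed budget of $K$ iterations, and the random test instance are our illustrative choices:

import numpy as np

def omp(A, y, K):
    """Greedy OMP: pick the column most correlated with the residual,
    re-fit by least squares on the selected support, repeat K times."""
    m, n = A.shape
    support = []
    residual = y.copy()
    for _ in range(K):
        j = int(np.argmax(np.abs(A.T @ residual)))        # best-matching column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat, support

# Run on a random instance with unit-norm columns (the usual convention).
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
A /= np.linalg.norm(A, axis=0)
x = np.zeros(100)
x[rng.choice(100, 5, replace=False)] = rng.standard_normal(5)
x_hat, supp = omp(A, A @ x, 5)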
We study the problem of recovering a block-sparse signal from under-sampled observations. The non-zero values of such signals appear in a few blocks, and their recovery is often accomplished using an $\ell_{1,2}$ optimization problem. In applications such
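A hedged cvxpy sketch of the $\ell_{1,2}$ program for equal-size, non-overlapping blocks; the block partition, which blocks are active, and the problem sizes below are assumptions made only for illustration:

import numpy as np
import cvxpy as cp

n, m, block_size = 60, 30, 5
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[0:block_size] = rng.standard_normal(block_size)            # first active block
x_true[20:20 + block_size] = rng.standard_normal(block_size)      # second active block
y = A @ x_true

x = cp.Variable(n)
# l_{1,2} objective: sum of the l2 norms of the blocks
mixed_norm = sum(cp.norm(x[i:i + block_size], 2) for i in range(0, n, block_size))
cp.Problem(cp.Minimize(mixed_norm), [A @ x == y]).solve()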
This paper considers the problem of recovering a structured signal from a relatively small number of noisy measurements with the aid of a similar signal which is known beforehand. We propose a new approach to integrate prior information into the standard
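One common way to exploit a known similar signal, shown purely as an illustration and not necessarily the approach proposed in this paper, is to add an $\ell_1$ penalty on the deviation from the prior; a small cvxpy sketch with an arbitrary trade-off parameter lam and an assumed noise tolerance:

import numpy as np
import cvxpy as cp

n, m, sigma = 100, 40, 0.01
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:8] = rng.standard_normal(8)
x_prior = x_true + 0.1 * rng.standard_normal(n) * (np.abs(x_true) > 0)   # similar known signal
y = A @ x_true + sigma * rng.standard_normal(m)

lam = 0.5                                                # weight on the prior term (illustrative)
x = cp.Variable(n)
objective = cp.Minimize(cp.norm(x, 1) + lam * cp.norm(x - x_prior, 1))
constraints = [cp.norm(A @ x - y, 2) <= 1.5 * sigma * np.sqrt(m)]        # assumed noise level
cp.Problem(objective, constraints).solve()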
Matrix sensing is the problem of reconstructing a low-rank matrix from a few linear measurements. In many applications, such as collaborative filtering, the famous Netflix prize problem, and seismic data interpolation, there exists some prior information
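As one concrete instance of matrix sensing, a minimal nuclear-norm matrix completion sketch in cvxpy; the observed-entry measurement model, sampling rate, and matrix sizes are illustrative assumptions rather than details taken from the paper:

import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n1, n2, r = 20, 20, 2
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))   # rank-r ground truth
mask = (rng.random((n1, n2)) < 0.5).astype(float)                  # observed-entry pattern

X = cp.Variable((n1, n2))
constraints = [cp.multiply(mask, X) == mask * M]                   # agree with M on observed entries
cp.Problem(cp.Minimize(cp.normNuc(X)), constraints).solve()        # nuclear-norm minimization
X_hat = X.value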
We address the exact recovery of a $k$-sparse vector in the noiseless setting when some partial information on the support is available. This partial information takes the form of either a subset of the true support or an approximate subset including w
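A hedged cvxpy sketch of one standard way to use partial support information: entries believed to lie in the support receive a reduced $\ell_1$ weight (here zero); the weight values and the construction of the known index set are illustrative choices, not necessarily those analyzed in the paper:

import numpy as np
import cvxpy as cp

n, m, k = 100, 40, 10
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))
support = rng.choice(n, k, replace=False)
x_true = np.zeros(n)
x_true[support] = rng.standard_normal(k)
y = A @ x_true

known = support[: k // 2]                 # partial (here: correct) support information
w = np.ones(n)
w[known] = 0.0                            # no l1 penalty on indices believed to be in the support

x = cp.Variable(n)
cp.Problem(cp.Minimize(cp.norm(cp.multiply(w, x), 1)), [A @ x == y]).solve()
x_hat = x.value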