In this paper, we investigate in a unified way the structural properties of solutions to inverse problems regularized by the generic class of semi-norms defined as a decomposable norm composed with a linear operator, the so-called analysis-type decomposable prior. This encompasses several well-known analysis-type regularizations such as the discrete total variation (in any dimension), the analysis group-Lasso, and the nuclear norm. Our main results establish sufficient conditions under which uniqueness of the regularized solution and its stability to bounded noise are guaranteed. Along the way, we also provide a strong sufficient condition for uniqueness that is of independent interest and goes beyond the case of decomposable norms.
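A minimal sketch of the kind of regularized problem studied here, $\min_x \tfrac12\|y - \Phi x\|_2^2 + \lambda \|D x\|_1$, instantiated with a one-dimensional finite-difference operator (discrete total variation) and handed to the off-the-shelf CVXPY modelling package; the names `Phi`, `D`, `lam` and all problem sizes are illustrative assumptions, not the paper's setup or proof machinery.

```python
# Illustrative sketch: analysis-type regularization
#   min_x  0.5*||y - Phi @ x||^2 + lam * ||D @ x||_1
# with D the 1D finite-difference operator, i.e. discrete total variation.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m = 100, 60
x0 = np.repeat([0.0, 2.0, -1.0, 1.0], 25)       # piecewise-constant ground truth
Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random measurement operator
y = Phi @ x0 + 0.01 * rng.standard_normal(m)    # observations with bounded noise

D = np.diff(np.eye(n), axis=0)                  # (n-1) x n finite differences
lam = 0.05

x = cp.Variable(n)
objective = 0.5 * cp.sum_squares(Phi @ x - y) + lam * cp.norm1(D @ x)
cp.Problem(cp.Minimize(objective)).solve()

print("relative error:", np.linalg.norm(x.value - x0) / np.linalg.norm(x0))
```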
We present a detailed analysis of the unconstrained $\ell_1$-regularized (Lasso) method for sparse recovery from noisy data. The data are recovered by sensing their compressed output produced by a randomly generated class of observation matrices satisfying a Restricted …
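As a concrete illustration of this setting (not the paper's analysis), the sketch below solves the unconstrained Lasso $\min_x \tfrac12\|Ax - y\|_2^2 + \lambda\|x\|_1$ with a random Gaussian sensing matrix using plain iterative soft-thresholding (ISTA); `A`, `lam` and the iteration count are assumed, illustrative values.

```python
# Illustrative sketch: ISTA for the unconstrained Lasso
#   min_x 0.5*||A @ x - y||^2 + lam*||x||_1
# with a random Gaussian sensing matrix A.
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 80, 200, 8
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)  # k-sparse signal
A = rng.standard_normal((m, n)) / np.sqrt(m)                   # sensing matrix
y = A @ x0 + 0.01 * rng.standard_normal(m)                     # noisy data

lam = 0.02
L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth part's gradient
x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - y)
    z = x - grad / L
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding step

print("relative error:", np.linalg.norm(x - x0) / np.linalg.norm(x0))
```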
Approximate Message Passing (AMP) has been shown to be an excellent statistical approach to signal inference and compressed sensing problems. The AMP framework provides modularity in the choice of signal prior; here we propose a hierarchical form of …
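The sketch below shows the generic AMP loop with a simple soft-thresholding denoiser, i.e. the modular prior slot filled with an $\ell_1$-type prior rather than the hierarchical prior proposed above; the threshold rule `theta = 1.5 * tau` and all dimensions are illustrative assumptions.

```python
# Illustrative sketch: basic AMP with a soft-thresholding denoiser for y = A @ x0 + noise.
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 250, 500, 25
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)    # columns approximately unit norm
y = A @ x0 + 0.01 * rng.standard_normal(m)

delta = m / n
x = np.zeros(n)
z = y.copy()
for _ in range(30):
    r = x + A.T @ z                              # effective (pseudo-data) estimate
    tau = np.sqrt(np.mean(z ** 2))               # current effective noise level
    theta = 1.5 * tau                            # threshold (tuning constant assumed)
    x_new = np.sign(r) * np.maximum(np.abs(r) - theta, 0.0)   # denoising step
    onsager = (z / delta) * np.mean(np.abs(x_new) > 0)        # Onsager correction
    z = y - A @ x_new + onsager
    x = x_new

print("relative error:", np.linalg.norm(x - x0) / np.linalg.norm(x0))
```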
Generative neural networks have been found empirically to be very promising in providing effective structural priors for compressed sensing, since they can be trained to span low-dimensional data manifolds in high-dimensional signal spaces. Despite the non…
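A minimal sketch of recovery with a generative prior: gradient descent on the latent code $z$ to minimize $\|A\,G(z) - y\|_2^2$, where $G$ is here a small random (untrained) ReLU network standing in for a trained generator; the architecture, step size `lr`, iteration count and dimensions are assumptions made only for illustration.

```python
# Illustrative sketch: compressed sensing with a generative prior,
#   min_z 0.5*||A @ G(z) - y||^2,   G(z) = W2 @ relu(W1 @ z).
# The "generator" is a random ReLU network used only to show the latent-space descent.
import numpy as np

rng = np.random.default_rng(0)
k, h, n, m = 10, 50, 200, 60                     # latent, hidden, signal, measurement dims
W1 = rng.standard_normal((h, k)) / np.sqrt(k)
W2 = rng.standard_normal((n, h)) / np.sqrt(h)
G = lambda z: W2 @ np.maximum(W1 @ z, 0.0)

z_true = rng.standard_normal(k)
x0 = G(z_true)                                   # signal lying in the generator's range
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x0 + 0.01 * rng.standard_normal(m)

z = rng.standard_normal(k)                       # random init (zero init has zero gradient)
lr = 2e-3
for _ in range(5000):
    pre = W1 @ z
    hvec = np.maximum(pre, 0.0)
    resid = A @ (W2 @ hvec) - y
    grad_z = W1.T @ ((pre > 0) * (W2.T @ (A.T @ resid)))   # backprop by hand
    z -= lr * grad_z

print("relative error:", np.linalg.norm(G(z) - x0) / np.linalg.norm(x0))
```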
This paper studies the problem of accurately recovering a structured signal from a small number of corrupted sub-Gaussian measurements. We consider three different procedures to reconstruct the signal and the corruption when different kinds of prior knowledge …
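One simple procedure for this setting is an extended Lasso that penalizes both the signal and the corruption; the sketch below (an illustration, not necessarily one of the three procedures analyzed in the paper) recovers a sparse `x` and a sparse corruption `s` from `y = A @ x0 + s0 + noise` with joint proximal-gradient steps, where `lam_x`, `lam_s` and the problem sizes are assumed values.

```python
# Illustrative sketch: extended Lasso for corrupted sensing,
#   min_{x,s} 0.5*||y - A @ x - s||^2 + lam_x*||x||_1 + lam_s*||s||_1,
# solved with joint proximal-gradient (ISTA-style) steps.
import numpy as np

rng = np.random.default_rng(0)
m, n, k, c = 150, 300, 10, 15
x0 = np.zeros(n); x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
s0 = np.zeros(m); s0[rng.choice(m, c, replace=False)] = 5 * rng.standard_normal(c)  # gross corruptions
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x0 + s0 + 0.01 * rng.standard_normal(m)

lam_x, lam_s = 0.02, 0.02
L = np.linalg.norm(A, 2) ** 2 + 1.0              # Lipschitz constant of the joint gradient
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x, s = np.zeros(n), np.zeros(m)
for _ in range(1000):
    r = A @ x + s - y                            # shared residual for both blocks
    x = soft(x - (A.T @ r) / L, lam_x / L)
    s = soft(s - r / L, lam_s / L)

print("signal error:", np.linalg.norm(x - x0) / np.linalg.norm(x0))
print("corruption error:", np.linalg.norm(s - s0) / np.linalg.norm(s0))
```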
Because of its self-regularizing nature and its uncertainty estimation capability, the Bayesian approach has achieved excellent recovery performance across a wide range of sparse signal recovery applications. However, most methods are based on the real-valued signal …
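As one representative Bayesian recovery scheme that does handle complex-valued measurements, the sketch below runs complex sparse Bayesian learning (SBL) with EM updates of the per-coefficient variances; this is a generic illustration, not the specific method of the paper, and `sigma2`, `gamma` and the dimensions are assumed.

```python
# Illustrative sketch: complex-valued sparse Bayesian learning (SBL) with EM updates
# for y = A @ x0 + noise, where x0 and A are complex.
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 60, 150, 8
x0 = np.zeros(n, dtype=complex)
idx = rng.choice(n, k, replace=False)
x0[idx] = rng.standard_normal(k) + 1j * rng.standard_normal(k)
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2 * m)
sigma2 = 1e-4
y = A @ x0 + np.sqrt(sigma2 / 2) * (rng.standard_normal(m) + 1j * rng.standard_normal(m))

gamma = np.ones(n)                               # per-coefficient prior variances
for _ in range(100):
    # posterior of x given the current hyperparameters (complex Gaussian)
    Sigma = np.linalg.inv(A.conj().T @ A / sigma2 + np.diag(1.0 / gamma))
    mu = Sigma @ A.conj().T @ y / sigma2
    # EM update of the variances; a small floor keeps 1/gamma well defined
    gamma = np.maximum(np.abs(mu) ** 2 + np.real(np.diag(Sigma)), 1e-12)

print("relative error:", np.linalg.norm(mu - x0) / np.linalg.norm(x0))
```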