
Statistical mechanical assessment of a reconstruction limit of compressed sensing: Toward theoretical analysis of correlated signals

Added by Koujin Takeda
Publication date: 2010
Language: English





We provide a scheme for exploring the reconstruction limit of compressed sensing by minimizing a general cost function under random measurement constraints for generic correlated signal sources. Our scheme is based on the statistical-mechanical replica method for dealing with random systems. As a simple but non-trivial example, we apply the scheme to a sparse autoregressive model, in which the first differences of the correlated input time series are sparse, and evaluate the critical compression rate for perfect reconstruction. The results are in good agreement with numerical experiments on signal reconstruction.
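The signal model above can be sketched in a few lines of numpy: a time series whose first differences are sparse (piecewise-constant), observed through random Gaussian measurements. The dimensions `n`, `m`, `k` and the Gaussian measurement ensemble are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 128, 64, 5  # signal length, number of measurements, sparse increments

# Signal whose first differences are sparse: a piecewise-constant time series.
u = np.zeros(n)                                  # first-difference vector (sparse)
support = rng.choice(n, size=k, replace=False)
u[support] = rng.normal(size=k)
x = np.cumsum(u)                                 # x_t = x_{t-1} + u_t, so diff(x) is sparse

# Random measurement constraints: y = A x with a Gaussian matrix.
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x

# Compression rate alpha = m / n; the question studied in the paper is how
# small alpha can be while still allowing perfect reconstruction.
alpha = m / n
print(alpha, np.count_nonzero(np.diff(x, prepend=0.0)))
```

The replica analysis itself operates on this ensemble analytically; the snippet only instantiates the source and measurement model.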



Related research

We investigate the reconstruction limit of compressed sensing for a reconstruction scheme based on L1-norm minimization with a correlated compression matrix, using a statistical mechanics method. We focus on a compression matrix modeled as the Kronecker-type random matrix studied in research on multiple-input multiple-output (MIMO) wireless communication systems. We find that strong one-dimensional correlations between the expansion bases of the original information slightly degrade reconstruction performance.
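A minimal sketch of the L1-norm-minimization reconstruction, assuming a plain i.i.d. Gaussian matrix for simplicity (the Kronecker-structured, correlated ensemble of the abstract could be substituted via `np.kron`). It solves the LASSO form of the problem by iterative soft thresholding (ISTA) and then debiases by least squares on the detected support; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 128, 64, 5

# Sparse ground truth and Gaussian sensing matrix.
x = np.zeros(n)
idx = rng.choice(n, size=k, replace=False)
x[idx] = rng.choice([-1.0, 1.0], size=k) * (1.0 + rng.random(k))
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x

# ISTA for min_x ||y - A x||^2 / 2 + lam * ||x||_1.
lam = 0.01
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
xh = np.zeros(n)
for _ in range(3000):
    g = xh + A.T @ (y - A @ xh) / L    # gradient step
    xh = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold

# Debias: least squares restricted to the detected support.
S = np.abs(xh) > 0.1
x_hat = np.zeros(n)
x_hat[S] = np.linalg.lstsq(A[:, S], y, rcond=None)[0]
err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
print(err)
```

Replacing `A` with a correlated Kronecker matrix and sweeping `m/n` is the numerical counterpart of the reconstruction-limit analysis.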
Xampling generalizes compressed sensing (CS) to reduced-rate sampling of analog signals. A unified framework is introduced for low-rate sampling and processing of signals lying in a union of subspaces. Xampling consists of two main blocks: analog compression that narrows down the input bandwidth prior to sampling with commercial devices, followed by a nonlinear algorithm that detects the input subspace prior to conventional signal processing. A variety of analog CS applications are reviewed within the unified Xampling framework, including a general filter-bank scheme for sparse shift-invariant spaces, periodic nonuniform sampling and modulated wideband conversion for multiband communications with unknown carrier frequencies, acquisition techniques for finite-rate-of-innovation signals with applications to medical and radar imaging, and random demodulation of sparse harmonic tones. A hardware-oriented viewpoint is advocated throughout, addressing practical constraints and exemplifying hardware realizations where relevant. This work will appear as a chapter in the book Compressed Sensing: Theory and Applications, edited by Yonina Eldar and Gitta Kutyniok.
We consider the problem of recovering a set of correlated signals (e.g., images from different viewpoints) from a few linear measurements per signal. We assume that each sensor in a network acquires a compressed signal in the form of linear measurements and sends it to a joint decoder for reconstruction. We propose a novel joint reconstruction algorithm that exploits correlation among underlying signals. Our correlation model considers geometrical transformations between the supports of the different signals. The proposed joint decoder estimates the correlation and reconstructs the signals using a simple thresholding algorithm. We give both theoretical and experimental evidence to show that our method largely outperforms independent decoding in terms of support recovery and reconstruction quality.
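A toy numpy sketch of the joint-decoding idea, under the simplifying assumption that the second signal's support is a circular shift of the first's (one instance of a geometrical transformation between supports). The decoder here estimates the shift by cross-correlating backprojections and then thresholds the combined estimate; it is an illustration of the principle, not the authors' actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, k, true_shift = 64, 56, 8, 11    # illustrative parameters

# Two correlated sparse signals: x2's support is a circular shift of x1's.
x1 = np.zeros(n)
x1[rng.choice(n, size=k, replace=False)] = 2.0
x2 = np.roll(x1, true_shift)

# Each sensor takes its own random linear measurements.
A1 = rng.normal(size=(m, n)) / np.sqrt(m)
A2 = rng.normal(size=(m, n)) / np.sqrt(m)
y1, y2 = A1 @ x1, A2 @ x2

# Joint decoder (toy): backproject, then estimate the geometric
# transformation (here a circular shift) by cross-correlation.
z1, z2 = A1.T @ y1, A2.T @ y2          # rough proxies for x1, x2
scores = [z1 @ np.roll(z2, -s) for s in range(n)]
s_hat = int(np.argmax(scores))

# Simple thresholding on the combined backprojection gives a joint
# support estimate that uses both sets of measurements.
combined = z1 + np.roll(z2, -s_hat)
support = np.argsort(np.abs(combined))[-k:]
```

Because the two backprojections reinforce each other once aligned, the joint support estimate is more reliable than thresholding either `z1` or `z2` alone, which is the intuition behind the reported gain over independent decoding.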
Many interesting problems in fields ranging from telecommunications to computational biology can be formalized in terms of large underdetermined systems of linear equations with additional constraints or regularizers. One of the most studied, the compressed sensing (CS) problem, consists of finding the solution with the smallest number of non-zero components of a given system of linear equations $\boldsymbol{y} = \mathbf{F} \boldsymbol{w}$ for a known measurement vector $\boldsymbol{y}$ and sensing matrix $\mathbf{F}$. Here, we address the compressed sensing problem within a Bayesian inference framework where the sparsity constraint is remapped into a singular prior distribution (called spike-and-slab or Bernoulli-Gauss). Solution of the problem is attempted through the computation of marginal distributions via expectation propagation (EP), an iterative computational scheme originally developed in statistical physics. We show that this strategy is comparatively more accurate than the alternatives in solving instances of CS generated from statistically correlated measurement matrices. For computational strategies based on the Bayesian framework, such as variants of belief propagation, this is to be expected, as they implicitly rely on the hypothesis of statistical independence among the entries of the sensing matrix. Perhaps surprisingly, the method also uniformly outperforms all the other state-of-the-art methods in our tests.
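The two ingredients of this setting, a spike-and-slab (Bernoulli-Gauss) prior and a statistically correlated sensing matrix, can be instantiated in a few lines. The Toeplitz correlation profile and all dimensions below are assumptions for illustration; the EP solver itself is beyond a short sketch.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, rho, sparsity = 100, 60, 0.6, 0.1   # illustrative parameters

# Spike-and-slab (Bernoulli-Gauss) prior: each entry is exactly zero with
# probability 1 - sparsity, and Gaussian otherwise.
mask = rng.random(n) < sparsity
w = np.where(mask, rng.normal(size=n), 0.0)

# Statistically correlated sensing matrix: each row is drawn with a
# Toeplitz covariance Sigma_ij = rho**|i-j| across the columns, so the
# entries of F are NOT independent (the regime where EP is reported to win).
Sigma = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
Lchol = np.linalg.cholesky(Sigma)
F = (rng.normal(size=(m, n)) @ Lchol.T) / np.sqrt(m)
y = F @ w
```

Belief-propagation-style solvers implicitly treat the entries of `F` as independent; instances drawn as above violate that assumption, which is why marginals computed by EP can remain accurate where BP variants degrade.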
Motivated by applications in unsourced random access, this paper develops a novel scheme for the problem of compressed sensing of binary signals. In this problem, the goal is to design a sensing matrix $A$ and a recovery algorithm such that the sparse binary vector $\mathbf{x}$ can be recovered reliably from the measurements $\mathbf{y} = A\mathbf{x} + \sigma\mathbf{z}$, where $\mathbf{z}$ is additive white Gaussian noise. We propose to design $A$ as the parity-check matrix of a low-density parity-check (LDPC) code, and to recover $\mathbf{x}$ from the measurements $\mathbf{y}$ using a Markov chain Monte Carlo algorithm, which runs relatively fast due to the sparse structure of $A$. The performance of our scheme is comparable to state-of-the-art schemes that use dense sensing matrices, while enjoying the advantages of a sparse sensing matrix.
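A minimal sketch of this setup, with two simplifications: a random sparse binary matrix with a fixed number of ones per column stands in for a real LDPC parity-check matrix, and the decoder is a zero-temperature greedy bit-flip search (the paper's MCMC sampler would instead accept uphill moves with a Boltzmann probability). All parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, k, d, sigma = 60, 40, 5, 4, 0.05   # illustrative parameters

# Sparse binary sensing matrix: d ones per column (LDPC stand-in).
A = np.zeros((m, n))
for j in range(n):
    A[rng.choice(m, size=d, replace=False), j] = 1.0

# Sparse binary signal and noisy measurements y = A x + sigma * z.
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = 1.0
y = A @ x + sigma * rng.normal(size=m)

def energy(v):
    """Squared residual; the quantity an MCMC decoder would sample under."""
    r = y - A @ v
    return r @ r

# Zero-temperature decoding: flip any single bit that lowers the energy,
# until no improving flip remains.  Each candidate flip only touches the
# d rows where column j is nonzero, which is why sparsity makes this fast.
xh = np.zeros(n)
improved = True
while improved:
    improved = False
    for j in range(n):
        cand = xh.copy()
        cand[j] = 1.0 - cand[j]
        if energy(cand) < energy(xh):
            xh = cand
            improved = True
```

Greedy flips can stall in a local minimum; the Metropolis acceptance rule in the actual MCMC decoder exists precisely to escape such minima.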
