Exploiting Prior Information in Block Sparse Signals


English Abstract

We study the problem of recovering a block-sparse signal from under-sampled observations. The non-zero values of such signals appear in a few blocks, and their recovery is often accomplished by solving an $\ell_{1,2}$ optimization problem. In applications such as DNA microarrays, some prior information about the block support, i.e., the blocks containing non-zero elements, is available. A typical way to incorporate this extra information into the recovery procedure is to solve a weighted $\ell_{1,2}$ problem. In this paper, we consider a block-sparse model in which the block support intersects certain given subsets of blocks with known probabilities. Our goal is to minimize the number of linear Gaussian measurements required for perfect recovery of the signal by tuning the weights of a weighted $\ell_{1,2}$ problem. To this end, we apply tools from conic integral geometry and derive closed-form expressions for the optimal weights. We show through precise analysis and simulations that the weighted $\ell_{1,2}$ problem with optimal weights significantly outperforms the regular $\ell_{1,2}$ problem. We further examine the sensitivity of the optimal weights to mismatch in the block probabilities and conclude that they are stable under small probability deviations.
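For concreteness, a generic form of the weighted $\ell_{1,2}$ program referred to above is sketched here; the measurement matrix $A$, observations $y$, block partition $\{b_i\}$, and weights $w_i$ are generic notation, not taken from the paper:

$$
\hat{x} = \arg\min_{x} \; \sum_{i} w_i \, \| x_{b_i} \|_2 \quad \text{subject to} \quad A x = y,
$$

where $x_{b_i}$ denotes the subvector of $x$ restricted to block $b_i$. Setting all $w_i = 1$ recovers the regular $\ell_{1,2}$ problem, while assigning smaller weights to blocks that are more likely to be active is the usual way such prior information is encoded.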
