
Precise Phase Transition of Total Variation Minimization

Posted by: Weiyu Xu
Publication date: 2015
Research field: Information Engineering
Paper language: English





Characterizing the phase transitions of convex optimization methods for recovering structured signals or data is of central importance in compressed sensing, machine learning and statistics. The phase transitions of many convex signal recovery methods, such as $\ell_1$ minimization and nuclear norm minimization, are well understood thanks to recent years' research. However, rigorously characterizing the phase transition of total variation (TV) minimization in recovering sparse-gradient signals is still open. In this paper, we fully characterize the phase transition curve of TV minimization. Our proof builds on Donoho, Johnstone and Montanari's conjectured phase transition curve for the TV approximate message passing (AMP) algorithm, together with the linkage between the minimax mean square error of a denoising problem and the high-dimensional convex geometry of TV minimization.
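
For concreteness, the convex program whose phase transition is characterized here can be written in the standard form below; the notation ($A$, $y$, $x_0$ and the one-dimensional differences defining the TV seminorm) follows common convention and is not taken verbatim from the paper.

$$\hat{x} = \arg\min_{x \in \mathbb{R}^n} \|x\|_{\mathrm{TV}} \quad \text{subject to} \quad Ax = y, \qquad \|x\|_{\mathrm{TV}} = \sum_{i=1}^{n-1} |x_{i+1} - x_i|,$$

where $y = A x_0$ for a sparse-gradient signal $x_0$ and a Gaussian measurement matrix $A \in \mathbb{R}^{m \times n}$. The phase transition curve describes, in terms of the undersampling ratio $m/n$ and the gradient sparsity of $x_0$, where exact recovery $\hat{x} = x_0$ holds with high probability as the dimension grows.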




Read also

This work considers the use of total variation (TV) minimization for recovering a gradient-sparse vector from Gaussian linear measurements. Recent studies have shown that TV minimization exhibits a sharp phase transition behavior in asymptotic regimes. The phase transition curve specifies the boundary between success and failure of TV minimization as the number of measurements grows large. Obtaining a theoretical bound that reflects this curve is a challenging task. In this work, we present a novel upper bound that suitably approximates the curve and is asymptotically sharp. Numerical results show that our bound is closer to the empirical TV phase transition curve than the previously known bound obtained by Kabanava.
This work concerns total variation (TV) minimization, which is used for recovering gradient-sparse signals from compressed measurements. Recent studies indicate that TV minimization exhibits a phase transition from failure to success as the number of measurements increases: in large dimensions, TV minimization succeeds in recovering the gradient-sparse signal with high probability when the number of measurements exceeds a certain threshold, and otherwise it fails almost certainly. Obtaining a closed-form expression that approximates this threshold is a major challenge in this field and has not yet been appropriately addressed. In this work, we derive a tight lower bound on this threshold for any random measurement matrix whose null space is distributed uniformly with respect to the Haar measure. In contrast to conventional TV phase transition results, which depend on the simple gradient-sparsity level, our bound is governed by generalized notions of gradient sparsity. Simulation results confirm that the proposed bound is very close to the true phase transition of TV minimization.
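
As a rough illustration of what the "empirical phase transition curve" in these two works refers to, the sketch below estimates the success rate of 1-D TV minimization as the number of Gaussian measurements grows. It is not the authors' code; cvxpy, the problem sizes, the number of trials, and the success tolerance are all illustrative assumptions.

import numpy as np
import cvxpy as cp

def tv_success_rate(n=100, k=5, m_values=range(20, 100, 10), trials=20, tol=1e-3):
    """Fraction of exact recoveries of a k-gradient-sparse signal, versus m."""
    rng = np.random.default_rng(0)
    rates = []
    for m in m_values:
        hits = 0
        for _ in range(trials):
            # Piecewise-constant ground truth: its gradient has k nonzeros.
            g = np.zeros(n)
            g[rng.choice(n - 1, size=k, replace=False) + 1] = rng.standard_normal(k)
            x_true = np.cumsum(g)
            A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurements
            x = cp.Variable(n)
            cp.Problem(cp.Minimize(cp.tv(x)), [A @ x == A @ x_true]).solve()
            hits += np.linalg.norm(x.value - x_true) <= tol * np.linalg.norm(x_true)
        rates.append(hits / trials)
    return list(m_values), rates

Plotting the returned success rates against m shows the sharp jump from failure to success that these abstracts describe; the bounds discussed above aim to predict where that jump occurs without running such simulations.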
Xin Yuan (2015)
We consider the total variation (TV) minimization problem used for compressive sensing and solve it using the generalized alternating projection (GAP) algorithm. Extensive results demonstrate the high performance of the proposed algorithm on compressive sensing tasks, including two-dimensional images, hyperspectral images and videos. We further derive the alternating direction method of multipliers (ADMM) framework with TV minimization for video and hyperspectral image compressive sensing under the CACTI and CASSI frameworks, respectively. Connections between GAP and ADMM are also provided.
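
The GAP and ADMM derivations in this work are specific to the CACTI and CASSI sensing models; as a much simpler point of reference, the sketch below applies textbook ADMM to 1-D anisotropic TV-regularized recovery, minimizing 0.5*||A x - y||^2 + lam*||D x||_1 with the splitting z = D x. The sizes, lam, rho, and iteration count are illustrative assumptions, not the paper's algorithm.

import numpy as np

def tv_admm(A, y, lam=0.1, rho=1.0, iters=200):
    """ADMM for 0.5*||A x - y||^2 + lam*||D x||_1 with the splitting z = D x."""
    _, n = A.shape
    # First-difference operator D of shape (n-1, n): (D x)_i = x_{i+1} - x_i.
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]
    z = np.zeros(n - 1)
    u = np.zeros(n - 1)
    # The x-update solves a fixed linear system (invertible for generic A).
    M = A.T @ A + rho * (D.T @ D)
    Aty = A.T @ y
    for _ in range(iters):
        x = np.linalg.solve(M, Aty + rho * D.T @ (z - u))
        Dx = D @ x
        v = Dx + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)   # soft threshold
        u = u + Dx - z
    return x

Each iteration is one linear solve with a prefactored matrix, an elementwise soft-threshold, and a dual update; the paper tailors splittings of this kind to its video and hyperspectral sensing operators.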
Linear regression without correspondences concerns the recovery of a signal in the linear regression setting, where the correspondences between the observations and the linear functionals are unknown. The associated maximum likelihood function is NP-hard to compute when the signal has dimension larger than one. To optimize this objective function, we reformulate it as a concave minimization problem, which we solve via branch-and-bound. This is supported by a computable search space to branch, an effective lower bounding scheme via convex envelope minimization and a refined upper bound, all naturally arising from the concave minimization reformulation. The resulting algorithm outperforms state-of-the-art methods for fully shuffled data and remains tractable for up to $8$-dimensional signals, an untouched regime in prior work.
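
The branch-and-bound machinery itself is beyond a short snippet, but the objective it optimizes is easy to state: jointly choose a permutation of the observations and a regression vector that minimize the residual. The brute-force baseline below makes that objective concrete on tiny instances; it is emphatically not the paper's method, and the factorial search is exactly what the concave-minimization reformulation avoids.

import itertools
import numpy as np

def shuffled_regression_bruteforce(A, y):
    """Exhaustive ML estimate: min over permutations pi and beta of ||y[pi] - A beta||^2."""
    m, _ = A.shape
    best_cost, best_beta, best_perm = np.inf, None, None
    for perm in itertools.permutations(range(m)):
        yp = y[list(perm)]                        # candidate correspondence
        beta, *_ = np.linalg.lstsq(A, yp, rcond=None)
        cost = float(np.sum((A @ beta - yp) ** 2))
        if cost < best_cost:
            best_cost, best_beta, best_perm = cost, beta, perm
    return best_beta, best_perm, best_cost

With m observations the loop visits m! permutations, so it is only usable for toy sizes; the paper's reformulation and branch-and-bound search are what make larger signal dimensions tractable.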
Total variation (TV) regularization has proven effective for a range of computer vision tasks through its preferential weighting of sharp image edges. Existing TV-based methods, however, often suffer from over-smoothing and solution bias caused by the homogeneous penalization. In this paper, we consider addressing these issues by applying inhomogeneous regularization to different image components. We formulate the inhomogeneous TV minimization problem as a convex quadratic constrained linear programming problem. Relying on this new model, we propose a matching-pursuit-based total variation minimization method (MPTV), specifically for image deconvolution. The proposed MPTV method is essentially a cutting-plane method, which iteratively activates a subset of nonzero image gradients and then solves a subproblem focusing on those activated gradients only. Compared to existing methods, MPTV is less sensitive to the choice of the trade-off parameter between data fitting and regularization. Moreover, the inhomogeneity of MPTV alleviates the over-smoothing and ringing artifacts, and improves robustness to errors in the blur kernel. Extensive experiments on different tasks demonstrate the superiority of the proposed method over the current state-of-the-art.
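
To make the idea of inhomogeneous penalization concrete, the sketch below solves a 1-D deconvolution with a per-gradient weight w_i on each difference |x_{i+1} - x_i|; uniform weights give ordinary homogeneous TV, while shrinking the weights at suspected edges reduces over-smoothing. This only illustrates the weighting idea, not the MPTV cutting-plane algorithm, and K, y, w, and lam are assumed inputs.

import numpy as np
import cvxpy as cp

def weighted_tv_deconvolve(K, y, w, lam=1.0):
    """Solve 0.5*||K x - y||^2 + lam * sum_i w_i * |x_{i+1} - x_i| (anisotropic, 1-D)."""
    n = K.shape[1]
    x = cp.Variable(n)
    penalty = lam * cp.sum(cp.multiply(w, cp.abs(cp.diff(x))))
    cp.Problem(cp.Minimize(0.5 * cp.sum_squares(K @ x - y) + penalty)).solve()
    return x.value

MPTV goes further by iteratively choosing which gradients to activate rather than fixing weights in advance, but the weighted objective conveys why inhomogeneous penalties can keep edges sharp.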
