A Preconditioned Difference of Convex Algorithm for Truncated Quadratic Regularization with Application to Imaging


Abstract

We consider the minimization problem with truncated quadratic regularization involving the gradient operator, which is nonsmooth and nonconvex. We incorporate classical preconditioned iterations for linear equations into the nonlinear difference-of-convex-functions algorithm with extrapolation. In particular, our preconditioned framework can efficiently handle the large linear systems whose exact solution is usually computationally expensive. Global convergence is guaranteed, and a local linear convergence rate is derived from an analysis of the Kurdyka-Łojasiewicz exponent of the minimization functional. The proposed algorithm with preconditioners turns out to be very efficient for image restoration and is also appealing for image segmentation.
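
To make the setting concrete, the following is a minimal Python sketch (not the authors' implementation) of a preconditioned difference-of-convex algorithm with extrapolation, assuming the denoising model min_u 0.5*||u - f||^2 + lam * sum_i min(|(grad u)_i|^2, tau) and the DC split min(t, tau) = t - max(t - tau, 0). Each convex subproblem then reduces to the linear system (I + 2*lam*grad^T grad) u = f + w, which is solved inexactly by a few Jacobi-type preconditioned sweeps; all parameter values and helper names below are illustrative assumptions.

# Sketch of a preconditioned DCA with extrapolation for the assumed
# truncated-quadratic denoising model
#   min_u 0.5*||u - f||^2 + lam * sum_i min(|(grad u)_i|^2, tau),
# using the DC split min(t, tau) = t - max(t - tau, 0).
import numpy as np

def grad(u):
    # Forward differences with Neumann boundary; output shape (2, H, W).
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return np.stack([gx, gy])

def div(p):
    # Discrete divergence chosen so that div = -grad^T.
    px, py = p
    dx = np.zeros_like(px); dy = np.zeros_like(py)
    dx[0, :] = px[0, :]; dx[1:-1, :] = px[1:-1, :] - px[:-2, :]; dx[-1, :] = -px[-2, :]
    dy[:, 0] = py[:, 0]; dy[:, 1:-1] = py[:, 1:-1] - py[:, :-2]; dy[:, -1] = -py[:, -2]
    return dx + dy

def pdca_truncated_quadratic(f, lam=0.5, tau=0.01, beta=0.3,
                             outer_iters=100, inner_sweeps=5):
    # Preconditioned DCA with extrapolation (illustrative parameters).
    u = f.copy(); u_old = f.copy()
    for _ in range(outer_iters):
        # Extrapolated point: u_bar = u^k + beta * (u^k - u^{k-1}).
        u_bar = u + beta * (u - u_old)
        # Subgradient of the concave part g(u) = lam * sum max(|grad u|^2 - tau, 0):
        # w = 2*lam*grad^T(m .* grad u_bar), with m the indicator of |grad u_bar|^2 > tau.
        g_bar = grad(u_bar)
        mask = (g_bar[0]**2 + g_bar[1]**2) > tau
        w = -2.0 * lam * div(mask * g_bar)          # grad^T = -div
        # Convex subproblem: (I + 2*lam*grad^T grad) u = f + w.  Solve it
        # inexactly by a few Jacobi-type sweeps with the constant diagonal
        # estimate d = 1 + 8*lam, warm-started at u^k.
        b = f + w
        d = 1.0 + 8.0 * lam
        u_old = u
        u_new = u.copy()
        for _ in range(inner_sweeps):
            residual = b - (u_new - 2.0 * lam * div(grad(u_new)))
            u_new = u_new + residual / d
        u = u_new
    return u
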
