We consider the minimization problem with truncated quadratic regularization of the gradient, which is a nonsmooth and nonconvex problem. We incorporate classical preconditioned iterations for linear equations into the nonlinear difference-of-convex-functions algorithms with extrapolation. In particular, our preconditioned framework can efficiently handle the large linear systems whose solution is usually computationally expensive. Global convergence is guaranteed, and a local linear convergence rate is derived from an analysis of the Kurdyka-Łojasiewicz exponent of the minimization functional. The proposed algorithm with preconditioners turns out to be very efficient for image restoration and is also appealing for image segmentation.
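To fix notation, a standard instance of this model (the data operator A, weight α, and truncation level τ are illustrative, not taken from the paper) is

\[
\min_{u}\ \tfrac12\|Au-f\|_2^2+\alpha\sum_i \min\bigl(|(\nabla u)_i|^2,\,\tau\bigr),
\qquad
\min(s,\tau)=s-\max(s-\tau,\,0),
\]

where the identity on the right exhibits the truncated quadratic as a difference of convex functions, so the whole objective splits into convex minus convex; the convex subproblems then reduce to large linear systems, which is where the preconditioned iterations enter.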
A constraint-reduced Mehrotra predictor-corrector algorithm for convex quadratic programming is proposed. (At each iteration, such algorithms use only a subset of the inequality constraints in constructing the search direction, resulting in CPU savings.) …
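As a rough illustration of the constraint-reduction step (the selection rule and names below are hypothetical, not the paper's), one can keep only the most nearly active inequalities when assembling the search direction:

import numpy as np

def select_working_set(A, b, x, q):
    """Pick the q most nearly active constraints of A x <= b at the point x."""
    slack = b - A @ x              # per-constraint slack (nonnegative if x is feasible)
    return np.argsort(slack)[:q]   # smallest slacks = most nearly active

The Newton system is then assembled from the selected rows only, so its dominant cost scales with q rather than with the full number of constraints.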
A framework is proposed for solving general convex quadratic programs (CQPs) from an infeasible starting point by invoking an existing feasible-start algorithm tailored for inequality-constrained CQPs. The central tool is an exact penalty function scheme …
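One classical exact-penalty device of this kind, stated here as a plausible instance rather than the paper's exact scheme, is the elastic (ℓ1) reformulation

\[
\min_x\ \tfrac12 x^\top Hx+c^\top x\ \text{ s.t. } Ax\ge b
\quad\longrightarrow\quad
\min_{x,\,t}\ \tfrac12 x^\top Hx+c^\top x+\rho\,\mathbf 1^\top t\ \text{ s.t. } Ax+t\ge b,\ t\ge 0,
\]

where any x together with a sufficiently large t is feasible, so a feasible-start inequality-constrained solver applies directly; for ρ large enough the reformulation is exact, i.e., it shares minimizers with the original CQP.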
We propose an extended primal-dual algorithm framework for solving a general nonconvex optimization model. This work is motivated by image reconstruction problems in a class of nonlinear imaging, where the forward operator can be formulated as a nonlinear …
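A representative iteration in this family is a Chambolle-Pock-type scheme in which the nonlinear operator K is linearized at the current iterate (a sketch under standard assumptions, with step sizes σ, τ and extrapolation parameter θ):

\[
p^{k+1}=\operatorname{prox}_{\sigma F^*}\bigl(p^k+\sigma K(\bar u^k)\bigr),\qquad
u^{k+1}=\operatorname{prox}_{\tau G}\bigl(u^k-\tau\,[\nabla K(u^k)]^{*}p^{k+1}\bigr),\qquad
\bar u^{k+1}=u^{k+1}+\theta\,(u^{k+1}-u^k),
\]

applied to a saddle-point formulation \(\min_u\max_p\ \langle K(u),p\rangle-F^*(p)+G(u)\).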
For some typical and widely used nonconvex half-quadratic regularization models and the Ambrosio-Tortorelli approximation of the Mumford-Shah model, based on Kurdyka-Łojasiewicz analysis and recent nonconvex proximal algorithms, we develop an efficient …
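For reference, the Ambrosio-Tortorelli functional couples the image u with an edge-indicator field v ∈ [0,1] (f is the data; α, β, ε are the usual parameters):

\[
AT_\varepsilon(u,v)=\int_\Omega(u-f)^2\,dx+\alpha\int_\Omega v^2|\nabla u|^2\,dx+\beta\int_\Omega\Bigl(\varepsilon|\nabla v|^2+\frac{(1-v)^2}{4\varepsilon}\Bigr)\,dx,
\]

which Γ-converges to the Mumford-Shah functional as ε → 0.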
This work presents a new algorithm for empirical risk minimization. The algorithm bridges the gap between first- and second-order methods by computing a search direction that uses a second-order-type update in one subspace, coupled with a scaled steepest-descent …
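A minimal sketch of such a hybrid direction (assuming access to the gradient g and Hessian H; the subspace choice and the scaling rho below are illustrative, not the paper's rule):

import numpy as np

def hybrid_direction(g, H, k, rho=0.1, floor=1e-8):
    """Newton-type step in the top-k curvature subspace of H,
    scaled steepest descent in the orthogonal complement."""
    eigvals, eigvecs = np.linalg.eigh(H)     # ascending eigenvalues
    V = eigvecs[:, -k:]                      # dominant-curvature subspace
    lam = np.maximum(eigvals[-k:], floor)    # guard against tiny/negative curvature
    g_sub = V.T @ g
    d_newton = -V @ (g_sub / lam)            # second-order step inside the subspace
    d_grad = -rho * (g - V @ g_sub)          # scaled gradient step in the complement
    return d_newton + d_grad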