
An Investigation on Semismooth Newton based Augmented Lagrangian Method for Image Restoration

Posted by: Dr. Hongpeng Sun
Publication date: 2019
Paper language: English
Author: Hongpeng Sun





The augmented Lagrangian method (also known as the method of multipliers) is an important and powerful optimization method for many smooth and nonsmooth variational problems in modern signal processing, imaging, optimal control, and so on. However, one usually needs to solve a coupled nonlinear system simultaneously, which is very challenging. In this paper, we propose several semismooth Newton methods to solve the nonlinear subproblems arising in image restoration, which leads to several highly efficient and competitive algorithms for image processing. By analyzing the metric subregularity of the corresponding functions, we establish both global convergence and a local linear convergence rate for the proposed augmented Lagrangian methods with semismooth Newton solvers.
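To illustrate the kind of nonsmooth optimality system a semismooth Newton method targets, the sketch below is a generic, minimal example (not the authors' image-restoration algorithm): it solves a small lasso-type problem min 0.5||Ax − b||² + μ||x||₁ by applying Newton steps to the fixed-point equation F(x) = x − prox_{μ‖·‖₁}(x − ∇f(x)) = 0, using an element of the Clarke generalized Jacobian. The function and variable names are illustrative choices, not from the paper.

```python
import numpy as np

def soft(v, mu):
    # Proximal operator of mu * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)

def semismooth_newton_lasso(A, b, mu, max_iter=50, tol=1e-10):
    """Semismooth Newton on F(x) = x - soft(x - A^T(Ax - b), mu) = 0."""
    n = A.shape[1]
    x = np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    for _ in range(max_iter):
        g = AtA @ x - Atb          # gradient of the smooth data term
        z = x - g
        F = x - soft(z, mu)
        if np.linalg.norm(F) < tol:
            break
        # One element of the generalized Jacobian of F: the soft-thresholding
        # map differentiates to a 0/1 diagonal selecting the "inactive" set.
        d = (np.abs(z) > mu).astype(float)
        J = np.eye(n) - np.diag(d) @ (np.eye(n) - AtA)
        x = x - np.linalg.solve(J, F)
    return x
```

On a diagonal toy problem the iteration identifies the correct active set and converges in a couple of steps, which is the local fast convergence that metric subregularity arguments are used to guarantee.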




Related research

Hongpeng Sun (2020)
Total generalized variation (TGV) is a very powerful and important regularization for various inverse problems and computer vision tasks. In this paper, we propose a semismooth Newton based augmented Lagrangian method to solve this problem. The augmented Lagrangian method (also known as the method of multipliers) is widely used for many smooth and nonsmooth variational problems. However, its efficiency usually depends heavily on solving the coupled nonlinear system simultaneously, which is very complicated and highly coupled for total generalized variation. With efficient primal-dual semismooth Newton methods for the complicated subproblems involving total generalized variation, we obtain a highly efficient algorithm that is competitive with efficient first-order methods. By analyzing the metric subregularity of the corresponding functions, we establish both global convergence and a local linear convergence rate for the proposed augmented Lagrangian methods.
Support vector machines (SVMs) are successful modeling and prediction tools with a variety of applications. Previous work has demonstrated the superiority of SVMs in dealing with high-dimensional, low-sample-size problems. However, the numerical difficulties of the SVMs become severe as the sample size increases. Although there exist many solvers for the SVMs, only a few of them are designed to exploit the special structures of the SVMs. In this paper, we propose a highly efficient sparse semismooth Newton based augmented Lagrangian method for solving a large-scale convex quadratic programming problem with a linear equality constraint and a simple box constraint, which is generated from the dual problems of the SVMs. By leveraging the primal-dual error bound result, the fast local convergence rate of the augmented Lagrangian method can be guaranteed. Furthermore, by exploiting the second-order sparsity of the problem in the semismooth Newton method, the algorithm can efficiently solve the aforementioned difficult problems. Finally, numerical comparisons demonstrate that the proposed algorithm outperforms the current state-of-the-art solvers for large-scale SVMs.
The octagonal shrinkage and clustering algorithm for regression (OSCAR), equipped with the $\ell_1$-norm and a pair-wise $\ell_\infty$-norm regularizer, is a useful tool for feature selection and grouping in high-dimensional data analysis. The computational challenge posed by OSCAR, for high-dimensional and/or large-sample-size data, has not yet been well resolved due to the nonsmoothness and inseparability of the regularizer involved. In this paper, we successfully resolve this numerical challenge by proposing a sparse semismooth Newton-based augmented Lagrangian method to solve the more general SLOPE (sorted L-one penalized estimation) model. By appropriately exploiting the inherent sparse and low-rank property of the generalized Jacobian of the semismooth Newton system in the augmented Lagrangian subproblem, we show how the computational complexity can be substantially reduced. Our algorithm offers a notable advantage in high-dimensional statistical regression settings. Numerical experiments conducted on real data sets demonstrate that our algorithm is far superior, in both speed and robustness, to the existing state-of-the-art algorithms based on first-order iterative schemes, including the widely used accelerated proximal gradient (APG) method and the alternating direction method of multipliers (ADMM).
This paper is devoted to studying an inexact augmented Lagrangian method for solving a class of manifold optimization problems with nonsmooth objective functions and nonnegative constraints. Under the constant positive linear dependence condition on the manifold, we show that the proposed method converges to a stationary point of the nonsmooth manifold optimization problem. Moreover, we propose a globalized semismooth Newton method to solve the augmented Lagrangian subproblem on manifolds efficiently. The local superlinear convergence of the manifold semismooth Newton method is also established under suitable conditions. Finally, numerical experiments on compressed modes and (constrained) sparse PCA illustrate the advantages of the proposed method in terms of accuracy and computational efficiency.
Yinqiao Yan, Qingna Li (2019)
Support vector machine (SVM) has proved to be a successful approach for machine learning. Two typical SVM models are the L1-loss model for support vector classification (SVC) and the $\epsilon$-L1-loss model for support vector regression (SVR). Due to the nonsmoothness of the L1-loss function in the two models, most traditional approaches focus on solving the dual problem. In this paper, we propose an augmented Lagrangian method for the L1-loss model, which is designed to solve the primal problem. By tackling the nonsmooth term in the model with Moreau-Yosida regularization and the proximal operator, the subproblem in the augmented Lagrangian method reduces to a nonsmooth linear system, which can be solved via the quadratically convergent semismooth Newton method. Moreover, the high computational cost of the semismooth Newton method can be significantly reduced by exploiting the sparse structure of the generalized Jacobian. Numerical results on various datasets from LIBLINEAR show that the proposed method is competitive with the most popular solvers in both speed and accuracy.
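The Moreau-Yosida regularization mentioned above replaces a nonsmooth function f with its smooth envelope M_λf(x) = min_y f(y) + (1/2λ)‖y − x‖², whose gradient is (x − prox_{λf}(x))/λ. A minimal sketch for the one-dimensional absolute-value loss (an illustrative stand-in, not the SVM loss or the authors' code) makes this concrete:

```python
import numpy as np

def prox_abs(x, lam):
    # Proximal operator of f(y) = |y|: soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_envelope_abs(x, lam):
    # Moreau-Yosida envelope M_lam f(x) = min_y |y| + (y - x)^2 / (2 lam),
    # evaluated at the minimizer y = prox_abs(x, lam).
    p = prox_abs(x, lam)
    return np.abs(p) + (p - x) ** 2 / (2 * lam)

# The envelope is continuously differentiable even though |.| is not;
# its gradient (x - prox)/lam matches a central finite difference.
x, lam, h = 1.7, 0.5, 1e-6
grad = (x - prox_abs(x, lam)) / lam
fd = (moreau_envelope_abs(x + h, lam) - moreau_envelope_abs(x - h, lam)) / (2 * h)
```

This gradient formula is what turns the nonsmooth primal subproblem into a system amenable to a semismooth Newton solver: the prox map is piecewise affine, so its generalized Jacobian is cheap to form.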