The paper starts with a concise description of the recently developed semismooth* Newton method for the solution of general inclusions. This method is then applied to a class of variational inequalities of the second kind. As a result, one obtains an implementable algorithm exhibiting local superlinear convergence. Thereafter, we suggest several globally convergent hybrid algorithms that combine the semismooth* Newton method with selected splitting algorithms for the solution of monotone variational inequalities. Their efficiency is documented by extensive numerical experiments.
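The semismooth* Newton method itself works with generalized derivatives of a set-valued map and does not reduce to a short snippet, but the Newton machinery it builds on can be illustrated. Below is a minimal sketch of a classical semismooth Newton iteration for a nonsmooth equation F(x) = 0 (not the semismooth* variant; the residual F and the generalized-Jacobian oracle are illustrative assumptions, not taken from the paper):

    import numpy as np

    def semismooth_newton(F, jac_element, x0, tol=1e-10, max_iter=50):
        # At each iterate, evaluate the residual, pick one element of a
        # generalized Jacobian of F, and take a Newton step.
        x = x0.astype(float).copy()
        for _ in range(max_iter):
            r = F(x)
            if np.linalg.norm(r) < tol:
                break
            G = jac_element(x)  # one generalized-Jacobian element at x
            x = x - np.linalg.solve(G, r)
        return x

    # Toy usage: F(x) = x + max(x, 0) - 1 componentwise, root at x = 0.5.
    F = lambda x: x + np.maximum(x, 0.0) - 1.0
    jac = lambda x: np.diag(1.0 + (x > 0.0).astype(float))
    print(semismooth_newton(F, jac, np.array([-2.0, 3.0])))

On semismooth problems, such iterations converge locally superlinearly, which is the behavior the hybrid globalization schemes above are designed to preserve.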
An equilibrium of a linear elastic body subject to loading and satisfying the friction and contact conditions can be described by a variational inequality of the second kind, and the respective discrete model attains the form of a generalized equation.
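For orientation, a variational inequality of the second kind has the standard abstract form (generic notation, not necessarily the paper's: $a$ an elasticity bilinear form, $L$ a load functional, $j$ a convex nonsmooth friction functional):

$u \in V: \quad a(u, v - u) + j(v) - j(u) \ge L(v - u) \quad \forall v \in V.$

The nondifferentiable term $j$ is what distinguishes it from a variational inequality of the first kind and what motivates the nonsmooth Newton treatment.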
We introduce Newton-ADMM, a method for fast conic optimization. The basic idea is to view the residuals of consecutive iterates generated by the alternating direction method of multipliers (ADMM) as a set of fixed-point equations, and then use a nonsmooth Newton method to find a solution.
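A hedged sketch of this fixed-point view follows: one ADMM sweep is treated as a map T, the residual F(z) = z - T(z) is driven to zero by Newton steps, and a forward finite-difference Jacobian stands in for the generalized Jacobian a true nonsmooth Newton method would use (the map T and the toy usage are assumptions, not the paper's implementation):

    import numpy as np

    def newton_on_fixed_point(T, z0, tol=1e-8, max_iter=30, eps=1e-6):
        # Treat the residual F(z) = z - T(z) of one ADMM sweep T as a
        # root-finding problem and take Newton steps on it.
        z = np.asarray(z0, dtype=float).copy()
        n = z.size
        for _ in range(max_iter):
            r = z - T(z)
            if np.linalg.norm(r) < tol:
                break
            J = np.empty((n, n))
            for i in range(n):  # finite-difference Jacobian, column by column
                e = np.zeros(n); e[i] = eps
                J[:, i] = ((z + e - T(z + e)) - r) / eps
            z = z - np.linalg.solve(J, r)
        return z

    # Toy usage with a contraction standing in for an ADMM sweep;
    # the fixed point of T(z) = 0.5 z + 1 is z = 2.
    T = lambda z: 0.5 * z + 1.0
    print(newton_on_fixed_point(T, np.zeros(3)))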
In this work, we present a globalized stochastic semismooth Newton method for solving stochastic optimization problems involving smooth nonconvex and nonsmooth convex terms in the objective function. We assume that only noisy gradient and Hessian information is available.
The augmented Lagrangian method (also called the method of multipliers) is an important and powerful optimization method for many smooth and nonsmooth variational problems in modern signal processing, imaging, optimal control, and related areas.
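As a reference point, here is a minimal sketch of the classical method-of-multipliers loop for an equality-constrained problem min f(x) s.t. c(x) = 0 (f, c, and the inner solver choice are illustrative assumptions; practical implementations adapt rho and solve the subproblems inexactly):

    import numpy as np
    from scipy.optimize import minimize

    def augmented_lagrangian(f, c, x0, rho=10.0, outer_iters=20, tol=1e-8):
        # Repeatedly minimize the augmented Lagrangian
        # L(x) = f(x) + lam.c(x) + (rho/2)|c(x)|^2, then update lam.
        x = np.asarray(x0, dtype=float)
        lam = np.zeros_like(c(x))
        for _ in range(outer_iters):
            L = lambda y: f(y) + lam @ c(y) + 0.5 * rho * (c(y) @ c(y))
            x = minimize(L, x).x          # inner smooth solve (sketch only)
            if np.linalg.norm(c(x)) < tol:
                break
            lam = lam + rho * c(x)        # first-order multiplier update
        return x, lam

    # Toy usage: min ||x||^2 s.t. x1 + x2 = 1, solution (0.5, 0.5).
    f = lambda x: x @ x
    c = lambda x: np.array([x[0] + x[1] - 1.0])
    print(augmented_lagrangian(f, c, np.zeros(2))[0])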
The octagonal shrinkage and clustering algorithm for regression (OSCAR), equipped with the $\ell_1$-norm and a pairwise $\ell_{\infty}$-norm regularizer, is a useful tool for feature selection and grouping in high-dimensional data analysis.
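For reference, the OSCAR penalty combines the two norms mentioned above; up to the exact parameterization used in a given paper, it reads

$\Omega(\beta) = \lambda_1 \sum_{j} |\beta_j| + \lambda_2 \sum_{j < k} \max\{|\beta_j|, |\beta_k|\}, \qquad \lambda_1, \lambda_2 \ge 0.$

The pairwise $\ell_{\infty}$ terms encourage coefficients of comparable importance to take equal magnitudes, which is what produces the grouping effect.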