We revisit the standard saddle-point method based on conjugate duality for solving convex minimization problems. Our aim is to reduce or remove unnecessary topological restrictions on the constraint set. Dual equalities and characterizations of the minimizers are obtained under weak constraint qualifications, or none at all. The main idea is to work with intrinsic topologies that reflect the geometry of the objective function. The abstract results of this article are applied in other papers to the Monge-Kantorovich optimal transport problem and to the minimization of entropy functionals.
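For orientation only (this is the classical Fenchel-Rockafellar form of the dual equality, not the article's own statement), conjugate duality pairs a primal convex problem with a dual one via the convex conjugates f*, g* and the adjoint A* of a linear map A:

```latex
\inf_{x}\,\bigl\{ f(x) + g(Ax) \bigr\}
  \;=\; \sup_{y^{*}}\,\bigl\{ -f^{*}(A^{*}y^{*}) - g^{*}(-y^{*}) \bigr\}
```

Classically this equality is guaranteed under a topological constraint qualification, e.g. continuity of g at some point of A(dom f); weakening such hypotheses by choosing topologies adapted to the objective is precisely the direction the abstract describes.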
The shape-constrained convex regression problem concerns fitting a convex function to observed data under additional constraints such as component-wise monotonicity and uniform Lipschitz continuity. This paper provides a unified framework…
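As a concrete point of reference (a minimal sketch of the standard finite-dimensional least-squares formulation, not this paper's unified framework; the function name, data, and cvxpy modeling are illustrative assumptions), convex regression estimates fitted values theta_i and subgradients g_i at each data point, with convexity imposed pairwise, monotonicity via g >= 0, and a uniform Lipschitz bound via an entrywise cap on the subgradients:

```python
import numpy as np
import cvxpy as cp

def convex_lse_fit(X, y, L=1.0):
    """Least-squares fit of a convex, coordinate-wise nondecreasing,
    Lipschitz function to data (X, y): estimate values theta_i and
    subgradients g_i at each of the n data points."""
    n, d = X.shape
    theta = cp.Variable(n)       # fitted function values
    g = cp.Variable((n, d))      # subgradients at the data points
    cons = []
    for i in range(n):
        # Convexity: theta_j >= theta_i + g_i^T (x_j - x_i) for all j.
        cons.append(theta >= theta[i] + (X - X[i]) @ g[i])
    cons += [g >= 0,             # component-wise monotonicity
             g <= L]             # entrywise bound => uniform Lipschitz
    prob = cp.Problem(cp.Minimize(cp.sum_squares(theta - y)), cons)
    prob.solve()
    return theta.value, g.value
```

Note that this direct formulation has O(n^2) convexity constraints, which is one reason more scalable unified treatments are of interest.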
The divergence minimization problem plays an important role in various fields. In this note, we focus on differentiable and strictly convex divergences. For some minimization problems, we show conditions characterizing the minimizer and the uniqueness of the minimizer…
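To illustrate the kind of condition involved (a standard first-order optimality fact, not the note's specific result): if a differentiable, strictly convex divergence D(·, y) is minimized over a convex set C, then x* is a minimizer exactly when

```latex
\langle \nabla_{x} D(x^{\ast}, y),\; x - x^{\ast} \rangle \;\ge\; 0
\qquad \text{for all } x \in C,
```

and strict convexity forces uniqueness: if two distinct minimizers existed, their midpoint would achieve a strictly smaller divergence, a contradiction.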
We propose faster methods for unconstrained optimization of \emph{structured convex quartics}, which are convex functions of the form \begin{equation*} f(x) = c^\top x + x^\top \mathbf{G} x + \mathbf{T}[x,x,x] + \frac{1}{24} \mathopen\| \mathbf{A} x \mathclose\|_4^4 \end{equation*}…
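A minimal numpy sketch of evaluating a function of this displayed form (the dimensions and data below are assumed purely for demonstration, and this is not the paper's algorithm): T[x,x,x] contracts a 3-tensor against x three times, and the last term is the fourth power of the 4-norm of Ax. Convexity of f depends on how the pieces combine, so the example takes T = 0 and G positive semidefinite to guarantee a convex instance.

```python
import numpy as np

def structured_quartic(x, c, G, T, A):
    """Evaluate f(x) = c^T x + x^T G x + T[x,x,x] + ||A x||_4^4 / 24."""
    cubic = np.einsum("ijk,i,j,k->", T, x, x, x)   # T[x,x,x]
    quartic = np.sum((A @ x) ** 4)                 # ||Ax||_4^4
    return c @ x + x @ G @ x + cubic + quartic / 24.0

# Illustrative instance (assumed data, chosen so that f is convex).
rng = np.random.default_rng(0)
d, n = 5, 8
c = rng.standard_normal(d)
M = rng.standard_normal((d, d))
G = M @ M.T                        # PSD quadratic part
T = np.zeros((d, d, d))            # zero cubic term keeps f convex
A = rng.standard_normal((n, d))
x = rng.standard_normal(d)
print(structured_quartic(x, c, G, T, A))
```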
We propose a new framework for deriving screening rules for convex optimization problems. Our approach covers a large class of constrained and penalized optimization formulations, and works in two steps. First, given any approximate point, the structure…
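Screening rules identify variables that are provably zero at the optimum so they can be discarded early. As one concrete, classical instance (the gap-safe rule for the Lasso (1/2)||Xb - y||^2 + lam*||b||_1 in the style of Ndiaye et al., shown only for context and not the framework proposed here), feature j is safely eliminated whenever |x_j^T theta| + r*||x_j|| < 1 for a dual-feasible theta and gap radius r:

```python
import numpy as np

def gap_safe_screen(X, y, beta, lam):
    """Gap-safe screening for the Lasso: returns a boolean mask of
    features whose optimal coefficient is provably zero."""
    resid = y - X @ beta
    # Dual-feasible point obtained by rescaling the residual.
    theta = resid / max(lam, np.max(np.abs(X.T @ resid)))
    primal = 0.5 * resid @ resid + lam * np.sum(np.abs(beta))
    dual = 0.5 * (y @ y) - 0.5 * lam**2 * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)
    r = np.sqrt(2.0 * gap) / lam          # radius of the safe sphere
    scores = np.abs(X.T @ theta) + r * np.linalg.norm(X, axis=0)
    return scores < 1.0                    # True => beta_j* = 0
```

The two-step pattern visible here, a dual-feasible point built from any approximate primal point, then a test derived from the problem structure, is the shape that a general screening framework abstracts.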
We consider the problem of minimizing a block-separable convex function (possibly nondifferentiable, and including constraints) plus Laplacian regularization, a problem that arises in applications including model fitting, regularizing stratified models…
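For intuition, here is a toy quadratic instance (chosen smooth for simplicity; this closed-form solve is not the paper's general method): with block losses (1/2)||x_i - a_i||^2 stacked as rows of X and A, and graph Laplacian L coupling neighboring blocks, the objective (1/2)||X - A||_F^2 + (lam/2) tr(X^T L X) is minimized exactly by solving (I + lam*L) X = A.

```python
import numpy as np

# Toy instance: 4 blocks on a path graph, each block a vector in R^3.
edges = [(0, 1), (1, 2), (2, 3)]
n, d, lam = 4, 3, 5.0

L = np.zeros((n, n))               # build the graph Laplacian
for i, j in edges:
    L[i, i] += 1; L[j, j] += 1
    L[i, j] -= 1; L[j, i] -= 1

A = np.random.default_rng(1).standard_normal((n, d))
# First-order condition: (X - A) + lam * L @ X = 0.
X = np.linalg.solve(np.eye(n) + lam * L, A)
print(X)   # neighboring rows are pulled toward each other as lam grows
```

With nondifferentiable blocks or constraints, the same Laplacian coupling is typically handled by splitting methods (e.g., proximal or ADMM-style updates) rather than a single linear solve.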