This paper considers the problem of designing accelerated gradient-based algorithms for optimization and saddle-point problems. The class of objective functions is defined by a generalized sector condition. This class contains strongly convex functions with Lipschitz gradients, but also non-convex functions, which makes it possible to address not only optimization problems but also saddle-point problems. The proposed design procedure relies on a suitable class of Lyapunov functions and on convex semidefinite programming. The resulting synthesis yields algorithms that match, and in some cases exceed, the performance of state-of-the-art accelerated gradient methods.
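The following is not the paper's synthesis procedure, only a minimal sketch of the kind of Lyapunov/SDP certificate such a design builds on: for plain gradient descent on m-strongly convex, L-smooth functions (a special case of the sector condition), a small semidefinite program checks whether a quadratic Lyapunov function certifies a linear rate rho. The values of m, L, alpha, and rho below are illustrative.

```python
# Sketch only: an LMI-based rate certificate for gradient descent
# x_{k+1} = x_k - alpha * grad f(x_k) on m-strongly convex, L-smooth f.
# The paper's synthesis is more general (accelerated methods, saddle points);
# this just illustrates the Lyapunov + sector condition + SDP pattern.
import cvxpy as cp

m, L = 1.0, 10.0          # sector bounds (strong convexity / Lipschitz constants)
alpha = 2.0 / (m + L)     # step size under test
rho = 0.85                # candidate linear rate (optimal here is (L-m)/(L+m) = 9/11)

P = cp.Variable()               # quadratic Lyapunov function V(y) = P * ||y||^2
lam = cp.Variable(nonneg=True)  # S-procedure multiplier for the sector condition

# With y = x - x*, u = grad f(x), the sector (interpolation) condition is
# [y; u]^T [[-2mL, m+L], [m+L, -2]] [y; u] >= 0.
# Require V(y - alpha*u) - rho^2 * V(y) + lam * sector <= 0 for all (y, u).
M = cp.bmat([
    [P - rho**2 * P - 2 * m * L * lam, -alpha * P + (m + L) * lam],
    [-alpha * P + (m + L) * lam,        alpha**2 * P - 2 * lam],
])
prob = cp.Problem(cp.Minimize(0), [P >= 1, M << 0])
prob.solve()
print("rate", rho, "certified" if prob.status == cp.OPTIMAL else "not certified")
```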
We study the robustness of accelerated first-order algorithms to stochastic uncertainties in gradient evaluation. Specifically, for unconstrained, smooth, strongly convex optimization problems, we examine the mean-squared error in the optimization variable.
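As a rough illustration of the quantity this abstract refers to (not the paper's analysis), the sketch below estimates the steady-state mean-squared error E||x_k - x*||^2 of gradient descent and Nesterov's accelerated method on a strongly convex quadratic when the gradient is corrupted by additive white noise. The problem data, step sizes, and noise level are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, L = 20, 1.0, 10.0
H = np.diag(np.linspace(m, L, n))   # f(x) = 0.5 x^T H x, minimizer x* = 0
sigma = 0.1                         # std of additive gradient noise
kappa = L / m

def steady_state_mse(method, iters=6000, burn_in=3000):
    x = rng.standard_normal(n)
    y = x.copy()
    alpha = 1.0 / L
    beta = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)
    errs = []
    for k in range(iters):
        if method == "gd":
            g = H @ x + sigma * rng.standard_normal(n)   # noisy gradient at x_k
            x = x - alpha * g
        else:                                            # Nesterov's method
            g = H @ y + sigma * rng.standard_normal(n)   # noisy gradient at y_k
            x_next = y - alpha * g
            y = x_next + beta * (x_next - x)
            x = x_next
        if k >= burn_in:
            errs.append(float(x @ x))                    # ||x_k - x*||^2
    return np.mean(errs)

print("steady-state MSE, gradient descent:", steady_state_mse("gd"))
print("steady-state MSE, Nesterov        :", steady_state_mse("nesterov"))
```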
We propose a framework to use Nesterov's accelerated method for constrained convex optimization problems. Our approach consists of first reformulating the original problem as an unconstrained optimization problem using a continuously differentiable ex
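The abstract is cut off mid-sentence, so the exact penalty construction it uses is not reproduced here. The sketch below substitutes a plain smooth quadratic penalty, which is not the paper's reformulation, simply to show the overall pattern: penalize the constraint, then run Nesterov's accelerated gradient on the resulting unconstrained problem. The problem data and penalty weight mu are illustrative.

```python
import numpy as np

# Constrained problem: minimize 0.5*||x - c||^2  subject to  a^T x <= b.
c = np.array([2.0, 1.0])
a = np.array([1.0, 1.0])
b = 1.0
mu = 50.0   # penalty weight (a smooth quadratic penalty stands in for the
            # continuously differentiable penalty used in the paper)

def grad_penalized(x):
    # gradient of 0.5*||x - c||^2 + 0.5*mu*max(0, a^T x - b)^2
    viol = max(0.0, a @ x - b)
    return (x - c) + mu * viol * a

# Nesterov's accelerated gradient on the penalized (unconstrained) problem.
L_est = 1.0 + mu * (a @ a)          # Lipschitz constant of the penalized gradient
alpha = 1.0 / L_est
x = np.zeros(2)
y = x.copy()
t = 1.0
for _ in range(2000):
    x_next = y - alpha * grad_penalized(y)
    t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    y = x_next + ((t - 1.0) / t_next) * (x_next - x)
    x, t = x_next, t_next

print("approximate solution:", x)   # close to the projection of c onto {a^T x <= b}, i.e. (1, 0)
print("constraint residual a^T x - b:", a @ x - b)   # small positive value left by the penalty
```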
In this study, we present a general framework of outer approximation algorithms to solve convex vector optimization problems, in which the Pascoletti-Serafini (PS) scalarization is solved iteratively. This scalarization finds the minimum distance fro
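The description of the scalarization is cut off above; for orientation only, the sketch below sets up one Pascoletti-Serafini subproblem for a small biobjective convex problem, taking the nonnegative orthant as the ordering cone: given a reference point r and direction d, it finds the smallest t such that r + t*d componentwise dominates an attainable objective vector. The objectives, r, and d are illustrative choices, not taken from the paper.

```python
import cvxpy as cp
import numpy as np

# Biobjective convex problem: f1(x) = ||x - p1||^2, f2(x) = ||x - p2||^2, ||x||_inf <= 2.
p1 = np.array([1.0, 0.0])
p2 = np.array([0.0, 1.0])

def pascoletti_serafini(r, d):
    """One PS scalarization: min t s.t. f(x) <= r + t*d componentwise (cone = R^2_+)."""
    x = cp.Variable(2)
    t = cp.Variable()
    constraints = [
        cp.sum_squares(x - p1) <= r[0] + t * d[0],
        cp.sum_squares(x - p2) <= r[1] + t * d[1],
        cp.norm(x, "inf") <= 2,
    ]
    prob = cp.Problem(cp.Minimize(t), constraints)
    prob.solve()
    return t.value, x.value

# Reference point below the Pareto front, searched along direction d = (1, 1).
t_star, x_star = pascoletti_serafini(r=np.array([0.0, 0.0]), d=np.array([1.0, 1.0]))
print("optimal t:", t_star)                 # about 0.5 for this data
print("weakly efficient point:", x_star)    # about (0.5, 0.5)
```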
This paper presents a novel scalable framework for optimizing a nonlinear system subject to differential algebraic equation (DAE) constraints that enforce the asymptotic stability of the underlying dynamic model with respect to certain disturbances.
This article derives lower bounds on the convergence rate of continuous-time gradient-based optimization algorithms. The algorithms are subjected to a time-normalization constraint that avoids a reparametrization of time in order to make the discussi
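Since this abstract is also cut short, the following is only a sketch of the kind of continuous-time algorithm such lower bounds concern: a plain gradient flow x'(t) = -grad f(x(t)) on an m-strongly convex quadratic, integrated with forward Euler, whose distance to the minimizer decays at least as fast as e^{-m t}. The function and constants are illustrative; the paper's time-normalization constraint and its bounds are not reproduced here.

```python
import numpy as np

# Gradient flow x'(t) = -grad f(x(t)) for f(x) = 0.5 x^T H x, minimizer x* = 0.
m, L = 1.0, 10.0
H = np.diag([m, L])
x = np.array([1.0, 1.0])

dt, T = 1e-3, 5.0
for _ in range(int(T / dt)):
    x = x - dt * (H @ x)            # forward Euler step of the ODE

print("||x(T) - x*||          :", np.linalg.norm(x))
print("e^{-m T} * ||x(0) - x*||:", np.exp(-m * T) * np.sqrt(2.0))  # reference exponential decay
```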