We study novel robust zero-order algorithms with acceleration for the solution of real-time optimization problems. In particular, we propose a family of extremum seeking dynamics that can be universally modeled as singularly perturbed hybrid dynamical systems with restarting mechanisms. From this family of dynamics, we synthesize four fast algorithms for the solution of convex, strongly convex, constrained, and unconstrained optimization problems. In each case, we establish robust semi-global practical asymptotic or exponential stability results, and we show how to obtain well-posed discretized algorithms that retain the main properties of the original dynamics. Given that existing averaging theorems for singularly perturbed hybrid systems are not directly applicable to our setting, we derive a new averaging theorem that relaxes some of the assumptions made in the literature, allowing us to make a clear link between the KL bounds that characterize the rates of convergence of the hybrid dynamics and their average dynamics. We also show that our results are applicable to non-hybrid algorithms, thus providing a general framework for accelerated dynamics based on averaging theory. We present different numerical examples to illustrate our results.
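As a rough illustration of the kind of dynamics involved, the following is a minimal sketch, not the paper's algorithm: a discretized zeroth-order extremum seeking loop in which a sinusoidal dither turns single function evaluations into a gradient estimate, driving a momentum state with vanishing damping and a periodic restarting mechanism. All parameter values, the dither design, and the restart rule are illustrative assumptions.

```python
import numpy as np

def es_accelerated(f, x0, steps=100_000, dt=1e-4, a=0.1, omega=500.0,
                   restart_period=2.0):
    """Hedged sketch of accelerated zeroth-order extremum seeking with
    restarts; parameters and restart rule are illustrative, not the paper's."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)                          # momentum state
    omegas = omega * np.arange(1, x.size + 1)     # distinct dither frequency per coordinate
    tau = 0.1                                     # restart clock (offset avoids stiff damping)
    for k in range(steps):
        t = k * dt
        s = np.sin(omegas * t)                    # sinusoidal dither
        y = f(x + a * s)                          # single zeroth-order evaluation
        grad_est = (2.0 / a) * y * s              # demodulated gradient estimate
        v += dt * (-(3.0 / tau) * v - grad_est)   # vanishing damping ~ 3/tau
        x += dt * v
        tau += dt
        if tau >= restart_period:                 # restarting mechanism
            v[:] = 0.0
            tau = 0.1
    return x

# usage: seek the minimizer of a simple strongly convex quadratic
x_opt = es_accelerated(lambda z: float(np.sum((z - 1.0) ** 2)), x0=[4.0, -2.0])
```

The averaging intuition is that, for small dither amplitude $a$ and high dither frequency, the demodulated signal behaves on average like the true gradient, so the loop approximately tracks an accelerated gradient flow between restarts.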
A collection of optimization problems central to power system operation requires distributed solution architectures to avoid aggregating all information at a central location. In this paper, we study distributed dual subgradient methods.
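For context, a textbook (non-distributed) dual subgradient iteration for $\min_x \{ f(x) : g(x) \le 0 \}$ is shown below; distributed variants split $f$ and $g$ across agents, but the exact scheme is not specified in this excerpt.
\[
x^{k} \in \arg\min_{x}\ \big\{ f(x) + (\lambda^{k})^{\top} g(x) \big\},
\qquad
\lambda^{k+1} = \big[\lambda^{k} + \alpha_{k}\, g(x^{k})\big]_{+},
\]
where $\lambda^k$ is the dual variable, $\alpha_k$ is a step size, and $[\cdot]_+$ denotes projection onto the nonnegative orthant.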
In this paper, we consider a stochastic distributed nonconvex optimization problem in which the cost function is distributed over $n$ agents that have access only to zeroth-order (ZO) information about the cost. This problem has various machine learning applications.
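A common building block in this setting is the two-point zeroth-order gradient estimator; the sketch below is a generic single-agent version with an assumed smoothing radius `mu`, not the paper's distributed algorithm.

```python
import numpy as np

def zo_two_point_grad(f, x, mu=1e-4, rng=None):
    """Two-point ZO estimator: from two function values, estimates the
    gradient of a Gaussian-smoothed surrogate of f (standard construction)."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)          # random Gaussian direction
    return (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u

# illustrative single-agent ZO gradient descent loop
x = np.array([1.0, -2.0])
for _ in range(1000):
    x -= 0.05 * zo_two_point_grad(lambda z: float(np.sum(z ** 2)), x)
```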
We propose a novel second-order ODE as the continuous-time limit of a Riemannian accelerated gradient-based method on a manifold with curvature bounded from below. This ODE can be seen as a generalization of the ODE derived for Euclidean spaces.
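The Euclidean ODE being generalized is presumably the continuous-time limit of Nesterov's accelerated gradient method identified by Su, Boyd, and Candès,
\[
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\big(X(t)\big) = 0,
\]
whose Riemannian counterpart replaces $\ddot{X}$ by a covariant acceleration and $\nabla f$ by the Riemannian gradient; the exact damping term used on the manifold depends on the curvature bound and is not given in this excerpt.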
We study gradient-based optimization methods obtained by direct Runge-Kutta discretization of the ordinary differential equation (ODE) describing the motion of a heavy ball under a constant friction coefficient. When the objective function is smooth of sufficiently high order, …
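To make the construction concrete, here is a minimal sketch (not the paper's exact scheme) of a direct Runge-Kutta discretization: the heavy-ball ODE $\ddot{x} + \gamma \dot{x} + \nabla f(x) = 0$ is written as a first-order system and integrated with classical RK4; the step size $h$ and friction $\gamma$ below are illustrative.

```python
import numpy as np

def heavy_ball_rk4(grad_f, x0, gamma=3.0, h=0.05, steps=2000):
    """Classical RK4 applied to the first-order system
       x' = v,  v' = -gamma * v - grad_f(x)   (heavy-ball ODE)."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)

    def field(x, v):
        return v, -gamma * v - grad_f(x)

    for _ in range(steps):
        k1x, k1v = field(x, v)
        k2x, k2v = field(x + 0.5 * h * k1x, v + 0.5 * h * k1v)
        k3x, k3v = field(x + 0.5 * h * k2x, v + 0.5 * h * k2v)
        k4x, k4v = field(x + h * k3x, v + h * k3v)
        x = x + (h / 6.0) * (k1x + 2 * k2x + 2 * k3x + k4x)
        v = v + (h / 6.0) * (k1v + 2 * k2v + 2 * k3v + k4v)
    return x

# usage: minimize f(z) = ||z - 1||^2, whose gradient is 2(z - 1)
x_min = heavy_ball_rk4(lambda z: 2.0 * (z - 1.0), x0=[5.0, -5.0])
```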
We introduce a new framework for unifying and systematizing the performance analysis of first-order black-box optimization algorithms for unconstrained convex minimization. The low per-iteration cost of first-order algorithms renders them …
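Frameworks of this kind typically study the class of fixed-step first-order black-box methods, which, in one common parameterization assumed here for illustration, generate iterates of the form
\[
x_{k+1} = x_k - \sum_{i=0}^{k} h_{k,i}\, \nabla f(x_i),
\]
so that a method is identified with its step-size coefficients $h_{k,i}$ and its worst-case performance over a given function class can be studied as a function of those coefficients.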