We introduce a class of stochastic algorithms for minimizing weakly convex functions over proximally smooth sets. As their main building blocks, the algorithms use simplified models of the objective function and the constraint set, along with a retraction operation to restore feasibility. All the proposed methods come equipped with a finite time efficiency guarantee in terms of a natural stationarity measure. We discuss consequences for nonsmooth optimization over smooth manifolds and over sets cut out by weakly convex inequalities.
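As a concrete illustration of this template, the following is a minimal sketch, not the exact scheme analyzed in the paper: a stochastic subgradient step on a weakly convex robust loss, followed by a retraction onto the constraint set. The set here is taken to be the unit sphere (a proximally smooth set whose nearest-point map serves as the retraction), and the data, minibatch size, and 1/sqrt(t) step size are illustrative assumptions only.

```python
# Sketch: stochastic subgradient step + retraction onto a proximally smooth set.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 10))            # synthetic data for the example
b = np.sign(A @ rng.normal(size=10))

def stochastic_subgradient(x, batch=16):
    """Minibatch subgradient of the weakly convex loss (1/m) * sum |a_i^T x - b_i|."""
    idx = rng.integers(0, A.shape[0], size=batch)
    r = A[idx] @ x - b[idx]
    return A[idx].T @ np.sign(r) / batch

def retract_to_sphere(x):
    """Nearest-point retraction onto the unit sphere."""
    return x / np.linalg.norm(x)

x = retract_to_sphere(rng.normal(size=10))
for t in range(1, 501):
    g = stochastic_subgradient(x)
    x = retract_to_sphere(x - (1.0 / np.sqrt(t)) * g)   # step, then restore feasibility
```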
We consider stochastic optimization problems where a smooth (and potentially nonconvex) objective is to be minimized using a stochastic first-order oracle. These types of problems arise in many settings, from simulation optimization to deep learning.
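To make the oracle abstraction concrete, here is a minimal sketch under assumed synthetic data: the oracle returns an unbiased stochastic gradient of a smooth nonconvex loss, and the solver interacts with the problem only through that call. The loss, data, and step-size schedule are hypothetical choices for illustration.

```python
# Sketch: a stochastic first-order oracle and a plain SGD loop driven by it.
import numpy as np

rng = np.random.default_rng(0)
w_true = rng.normal(size=20)
X = rng.normal(size=(1000, 20))
y = np.tanh(X @ w_true) + 0.1 * rng.normal(size=1000)

def oracle(w, batch=32):
    """Unbiased minibatch gradient of the smooth nonconvex loss
    (1/2m) * sum_i (tanh(x_i^T w) - y_i)^2."""
    idx = rng.integers(0, X.shape[0], size=batch)
    z = X[idx] @ w
    r = np.tanh(z) - y[idx]
    return X[idx].T @ (r * (1.0 - np.tanh(z) ** 2)) / batch

w = np.zeros(20)
for t in range(1, 1001):
    w -= (0.5 / np.sqrt(t)) * oracle(w)    # the solver only sees oracle outputs
```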
Optimization models with non-convex constraints arise in many tasks in machine learning, e.g., learning with fairness constraints or Neyman-Pearson classification with non-convex loss.
We introduce SPRING, a novel stochastic proximal alternating linearized minimization algorithm for solving a class of non-smooth and non-convex optimization problems.
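The following is a minimal sketch of one stochastic proximal alternating linearized step, not SPRING itself: it assumes a coupled least-squares term H(x, y) = (1/2n) * sum_i (a_i^T x * c_i^T y - b_i)^2 plus l1 penalties on each block, and it omits the variance-reduced gradient estimators that the algorithm builds on. All data and parameters below are illustrative.

```python
# Sketch: stochastic proximal alternating linearized updates over two blocks.
import numpy as np

rng = np.random.default_rng(1)
n, p, q = 500, 20, 15
A, C = rng.normal(size=(n, p)), rng.normal(size=(n, q))
b = rng.normal(size=n)

def soft_threshold(v, tau):
    """Proximal map of the l1 penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def stoch_grads(x, y, batch=32):
    """Minibatch gradients of the smooth coupling term with respect to each block."""
    idx = rng.integers(0, n, size=batch)
    u, v = A[idx] @ x, C[idx] @ y
    r = u * v - b[idx]
    gx = A[idx].T @ (r * v) / batch
    gy = C[idx].T @ (r * u) / batch
    return gx, gy

x, y = rng.normal(size=p), rng.normal(size=q)
lam, step = 0.01, 0.05
for _ in range(300):
    gx, _ = stoch_grads(x, y)
    x = soft_threshold(x - step * gx, step * lam)   # linearized step + prox on block x
    _, gy = stoch_grads(x, y)                        # fresh minibatch with updated x
    y = soft_threshold(y - step * gy, step * lam)   # linearized step + prox on block y
```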
Riemannian optimization has attracted considerable attention due to its wide range of practical applications. Riemannian stochastic first-order algorithms have been studied in the literature to solve large-scale machine learning problems over Riemannian manifolds.
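A minimal sketch of the basic Riemannian stochastic gradient step on a simple manifold, the unit sphere: project the stochastic Euclidean gradient onto the tangent space, move along it, then retract back to the manifold. The streaming PCA-style objective and the step-size schedule are assumptions made for illustration, not a method from any particular paper above.

```python
# Sketch: Riemannian SGD on the unit sphere (tangent projection + retraction).
import numpy as np

rng = np.random.default_rng(2)
d = 30
Sigma_root = rng.normal(size=(d, d)) / np.sqrt(d)

def sample_grad(x):
    z = Sigma_root @ rng.normal(size=d)        # one stochastic sample
    return -2.0 * (z @ x) * z                  # Euclidean gradient of -(z^T x)^2

def project_tangent(x, g):
    return g - (x @ g) * x                     # tangent space of the sphere at x

def retract(x):
    return x / np.linalg.norm(x)               # retraction by normalization

x = retract(rng.normal(size=d))
for t in range(1, 2001):
    rg = project_tangent(x, sample_grad(x))    # Riemannian stochastic gradient
    x = retract(x - (0.5 / np.sqrt(t)) * rg)   # step in the tangent direction, retract
```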
This paper studies stochastic optimization problems with polynomials. We propose an optimization model with sample averages and perturbations. Lasserre-type Moment-SOS relaxations are used to solve the sample average optimization.
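The sample-average-plus-perturbation idea can be sketched as follows for a hypothetical univariate problem: the expected polynomial objective is replaced by an empirical average of sampled coefficients, a small perturbation term is added, and the resulting polynomial is minimized globally. In one variable this can be done from the roots of the derivative; the multivariate case treated in the paper requires Lasserre-type Moment-SOS relaxations and an SDP solver, which are not shown here.

```python
# Sketch: sample-average approximation of a stochastic polynomial objective.
import numpy as np
from numpy.polynomial import polynomial as P

rng = np.random.default_rng(3)
samples = rng.normal(loc=1.0, scale=0.5, size=200)

# Illustrative integrand f(x, xi) = x^4 - 2*xi*x^2 + xi^2*x, averaged over samples
coeffs = np.zeros(5)                       # degree-4 polynomial, ascending order
for xi in samples:
    coeffs += np.array([0.0, xi**2, -2.0 * xi, 0.0, 1.0]) / len(samples)
coeffs[2] += 1e-3                          # small quadratic perturbation term

deriv = P.polyder(coeffs)                  # derivative coefficients (ascending)
crit = np.roots(deriv[::-1])               # np.roots expects descending order
crit = crit[np.isreal(crit)].real          # keep real critical points
x_star = crit[np.argmin(P.polyval(crit, coeffs))]
print("sample-average minimizer:", x_star)
```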