We provide a condition-based analysis of two interior-point methods for unconstrained geometric programs, a class of convex programs that arise naturally in applications including matrix scaling, matrix balancing, and entropy maximization. Our condition numbers are natural geometric quantities associated with the Newton polytope of the geometric program, and lead to diameter bounds on approximate minimizers. We also provide effective bounds on the condition numbers both in general and under combinatorial assumptions on the Newton polytope. In this way, we generalize the iteration complexity of recent interior-point methods for matrix scaling and matrix balancing. Recently, there has been much work on algorithms for certain optimization problems on Lie groups, known as capacity and scaling problems. For commutative groups, these problems reduce to unconstrained geometric programs, which serves as a particular source of motivation for our work.
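For orientation, a standard convex-form instance of the class treated here is the log-sum-exp program (the data $c_j$, $a_j$ below is generic notation for this sketch, not taken from the paper):
\[
\min_{x \in \mathbb{R}^n} \; \log \sum_{j=1}^{m} c_j \, e^{\langle a_j, x \rangle}, \qquad c_j > 0,\ a_j \in \mathbb{R}^n,
\]
whose Newton polytope is $\mathrm{conv}\{a_1,\dots,a_m\}$, the convex hull of the exponent vectors; the applications mentioned above are instances of this form for particular choices of the data.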
We propose a primal-dual interior-point method (IPM) with convergence to second-order stationary points (SOSPs) of nonlinear semidefinite optimization problems, abbreviated as NSDPs. As far as we know, the current algorithms for NSDPs only ensure convergence to first-order stationary points.
We introduce primal and dual stochastic gradient oracle methods for decentralized convex optimization problems. Both for primal and dual oracles, the proposed methods are optimal in terms of the number of communication steps. However, for all classes
First-order methods (FOMs) have recently been applied and analyzed for solving problems with complicated functional constraints. Existing works show that FOMs for functional constrained problems have lower-order convergence rates than those for unconstrained problems.
We introduce and study conic geometric programs (CGPs), which are convex optimization problems that unify geometric programs (GPs) and conic optimization problems such as semidefinite programs (SDPs). A CGP consists of a linear objective function tha
We introduce log-log convex programs, which are optimization problems with positive variables that become convex when the variables, objective functions, and constraint functions are replaced with their logs, which we refer to as a log-log transformation.
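As a concrete illustration (a minimal sketch, not code from the paper: it assumes CVXPY with its disciplined geometric programming mode, and the problem data is invented for the example), such a program can be stated directly in the positive variables and solved after the log-log transformation by passing gp=True:

    import cvxpy as cp

    # Positive variables; the log-log transformation acts on their logarithms.
    x = cp.Variable(pos=True)
    y = cp.Variable(pos=True)
    z = cp.Variable(pos=True)

    # A monomial objective with posynomial and monomial constraints: all of
    # these become convex once variables and functions are replaced by logs.
    objective = cp.Maximize(x * y * z)
    constraints = [
        4 * x * y * z + 2 * x * z <= 10,
        x <= 2 * y,
        y <= 2 * x,
        z >= 1,
    ]

    problem = cp.Problem(objective, constraints)
    problem.solve(gp=True)  # gp=True requests the log-log (geometric) transformation
    print(problem.value, x.value, y.value, z.value)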