Many iterative methods in optimization are fixed-point iterations with averaged operators. Since such methods converge at an $\mathcal{O}(1/k)$ rate with a constant determined by the averagedness coefficient, establishing small averagedness coefficients for operators is of broad interest. In this paper, we show that the averagedness coefficients of the composition of averaged operators by Ogura and Yamada (Numer Funct Anal Optim 32(1--2):113--137, 2002) and of the three-operator splitting by Davis and Yin (Set-Valued Var Anal 25(4):829--858, 2017) are tight. The analysis relies on the scaled relative graph, a geometric tool recently proposed by Ryu, Hannah, and Yin (arXiv:1902.09788, 2019).
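As a minimal illustration of the setting (not an example from the paper), the Krasnosel'skii--Mann iteration $x_{k+1} = (1-\alpha)x_k + \alpha N(x_k)$ applies an $\alpha$-averaged operator $T = (1-\alpha)I + \alpha N$ built from a nonexpansive map $N$; the fixed-point residual then vanishes at an $\mathcal{O}(1/k)$ rate whose constant depends on $\alpha$. The rotation map and step size below are illustrative choices:

```python
import numpy as np

# Nonexpansive map N: a 90-degree rotation in the plane.
# Its unique fixed point is the origin.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def N(x):
    return R @ x

# Krasnosel'skii-Mann iteration with the alpha-averaged operator
# T = (1 - alpha) I + alpha N.
alpha = 0.5
x = np.array([1.0, 0.0])
for k in range(200):
    x = (1 - alpha) * x + alpha * N(x)

print(np.linalg.norm(x))  # distance to the fixed point, near 0
```

For this particular $N$ the averaged iteration in fact converges geometrically, but for a general nonexpansive $N$ only the $\mathcal{O}(1/k)$ residual rate is guaranteed, which is why tight averagedness coefficients matter.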