Finite Step Performance of First-order Methods Using Interpolation Conditions Without Function Evaluations


Abstract

We present a procedure to numerically compute finite-step worst-case performance guarantees for a given algorithm applied to the unconstrained optimization of strongly convex functions with Lipschitz continuous gradients. The method serves as an alternative to the approach derived by Taylor, Hendrickx, and Glineur in [Math. Prog. 161 (1-2), 2017]. The difference is that our solution uses conditions for the interpolation of a set of points and gradient evaluations by the gradient of a function in the class of interest, whereas theirs uses conditions for the interpolation of a set of points, gradient evaluations, and function evaluations by a function in the class of interest. The motivation for this alternative is that, in many cases, neither the algorithm nor the performance metric of interest relies upon function evaluations. The primary development is a procedure that, when solving for the worst-case performance, avoids the factorial growth in the number of these conditions with the size of the set to be interpolated.
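As a point of reference (an illustration not contained in the abstract itself), the source of the factorial growth can be seen already in the plain convex case: by Rockafellar's theorem, a finite set of pairs \(\{(x_i, g_i)\}_{i=1}^{N}\) is interpolable by a subgradient of a convex function exactly when it is cyclically monotone, that is,
\[
  \sum_{k=1}^{m} \big\langle g_{i_k},\, x_{i_{k+1}} - x_{i_k} \big\rangle \;\le\; 0
  \qquad \text{for every cycle } i_1 \to i_2 \to \cdots \to i_m \to i_1, \quad x_{i_{m+1}} := x_{i_1},
\]
which imposes one inequality per cycle of distinct indices, and the number of such cycles grows factorially with \(N\). Gradient-only interpolation conditions for the strongly convex, Lipschitz-gradient class carry an analogous cycle-indexed structure, which is the growth the proposed procedure is designed to circumvent.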
