We examine popular gradient-based algorithms for nonlinear control in light of the modern complexity analysis of first-order optimization algorithms. The examination reveals that the complexity bounds can be stated cleanly in terms of calls to a computational oracle that is related to dynamic programming and implementable by gradient back-propagation using machine learning software libraries such as PyTorch or TensorFlow. Finally, we propose a regularized Gauss-Newton algorithm that enjoys worst-case complexity bounds and improved convergence behavior in practice. The accompanying PyTorch-based software library is publicly available.
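As an illustration only (not the authors' library): the oracle described here amounts to differentiating a rollout cost through the unrolled dynamics, which PyTorch's autograd performs by back-propagation. The dynamics and cost below are hypothetical toy stand-ins.

```python
import torch

def dynamics(x, u):
    # Assumed toy nonlinear dynamics, used only for this sketch.
    return torch.tanh(x + u)

def total_cost(x0, controls):
    # Accumulate a quadratic state/control cost along the rollout.
    x, cost = x0, 0.0
    for u in controls:
        x = dynamics(x, u)
        cost = cost + (x ** 2).sum() + 0.1 * (u ** 2).sum()
    return cost

horizon, dim = 20, 4
x0 = torch.randn(dim)
controls = torch.zeros(horizon, dim, requires_grad=True)

cost = total_cost(x0, controls)
cost.backward()  # back-propagation through the unrolled dynamics
print(controls.grad.shape)  # gradient w.r.t. the whole control sequence
```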
As a judicious counterpart to the classical maxcut, the anti-Cheeger cut has a more balanced structure, but few numerical results on it have been reported so far. In this paper, we propose a continuous iterative algorithm for the anti-Cheeger cut problem.
The fragility of deep neural networks to adversarially chosen inputs has motivated the need to revisit deep learning algorithms. Including adversarial examples during training is a popular defense mechanism against adversarial attacks.
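For concreteness, here is a minimal sketch of this defense using a standard FGSM perturbation as the attack; the model, budget eps, and data below are hypothetical placeholders, not the paper's setup.

```python
import torch
import torch.nn.functional as F

def fgsm_example(model, x, y, eps=0.03):
    # One signed-gradient step on the input to (approximately) maximize the loss.
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    return (x_adv + eps * x_adv.grad.sign()).detach()

def adversarial_training_step(model, optimizer, x, y, eps=0.03):
    # Train on the perturbed batch instead of the clean one.
    x_adv = fgsm_example(model, x, y, eps)
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage with a linear classifier on random data.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(784, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 1, 28, 28), torch.randint(0, 10, (8,))
print(adversarial_training_step(model, optimizer, x, y))
```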
Stochastic MPECs (mathematical programs with equilibrium constraints) have found increasing relevance for modeling a broad range of settings in engineering and statistics. Yet, there seem to be no efficient first/zeroth-order schemes equipped with non-asymptotic rate guarantees for resolving even deterministic variants of such problems.
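To make "zeroth-order" concrete, the sketch below shows a generic two-point random-direction gradient estimator that uses only function evaluations; it is not the paper's scheme, and f, mu, and samples are illustrative choices.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-3, samples=20, rng=np.random.default_rng(0)):
    # Average finite differences of f along random Gaussian directions.
    g = np.zeros_like(x)
    for _ in range(samples):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / samples

f = lambda x: np.sum(x ** 2)  # toy smooth objective
x = np.ones(5)
print(zo_gradient(f, x))      # approximates grad f(x) = 2x
```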
When are two algorithms the same? How can we be sure a recently proposed algorithm is novel and not a minor twist on an existing method? In this paper, we present a framework for reasoning about equivalence between a broad class of iterative algorithms.
We consider the problem of distributed secondary frequency regulation in power networks such that stability and an optimal power allocation are attained. This problem has been widely studied in the literature, and two main control schemes have been proposed.