Exponentially Convergent Algorithm Design for Constrained Distributed Optimization via Non-smooth Approach


Abstract

We consider minimizing a sum of non-smooth objective functions with set constraints in a distributed manner. For this problem, we propose, for the first time, a distributed algorithm with an exponential convergence rate. Using the exact penalty method, we equivalently reformulate the problem as a standard distributed optimization problem without consensus constraints. We then design a distributed projected subgradient algorithm with the help of differential inclusions. Furthermore, we show that the algorithm converges exponentially to the optimal solution when the objective functions are strongly convex.
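To make the problem setting concrete, the sketch below illustrates a standard discrete-time distributed projected subgradient iteration for minimizing a sum of non-smooth, strongly convex local functions over a common set constraint. It is not the paper's algorithm, which is continuous-time and based on differential inclusions with an exact penalty reformulation; all problem data here (the local functions f_i, the box constraint, the ring-graph mixing matrix W, and the step size) are hypothetical and chosen only for illustration.

```python
# Illustrative sketch only: a discrete-time distributed projected subgradient
# method for min_x sum_i f_i(x) over a box constraint. The paper's algorithm
# (continuous-time, differential-inclusion based, exact penalty) is not
# reproduced here. All data below is hypothetical.

import numpy as np

rng = np.random.default_rng(0)

n_agents, dim = 5, 2
targets = rng.normal(size=(n_agents, dim))      # hypothetical local data b_i

def subgradient(i, x):
    """Subgradient of f_i(x) = 0.5*||x - b_i||^2 + ||x - b_i||_1 (strongly convex, non-smooth)."""
    d = x - targets[i]
    return d + np.sign(d)

def project(x, lo=-1.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^dim (the common set constraint)."""
    return np.clip(x, lo, hi)

# Doubly stochastic mixing matrix for a ring graph of 5 agents.
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 1 / 3
    W[i, (i - 1) % n_agents] = 1 / 3
    W[i, (i + 1) % n_agents] = 1 / 3

x = rng.normal(size=(n_agents, dim))            # each agent's local estimate x_i
alpha = 0.05                                    # constant step size (illustrative)

for _ in range(500):
    mixed = W @ x                               # consensus (averaging with neighbors)
    grads = np.array([subgradient(i, mixed[i]) for i in range(n_agents)])
    x = project(mixed - alpha * grads)          # local projected subgradient step

print("agent estimates:\n", x)
print("max disagreement:", np.max(np.abs(x - x.mean(axis=0))))
```

Running this sketch, the agents' estimates cluster near a common constrained minimizer of the sum; the discrete scheme with a constant step only reaches a neighborhood of the optimum, whereas the paper's continuous-time design achieves exact convergence at an exponential rate.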
