This paper addresses distributed averaging problems for signed networks whose general directed topologies are represented by signed digraphs. A new class of improved Laplacian potential functions is proposed by introducing two notions for any signed digraph: its induced unsigned digraph and its mirror (undirected) signed graph. Based on these functions, two distributed averaging protocols are designed using nearest neighbor rules. It is shown that, with either of the designed protocols, signed-average consensus (respectively, state stability) is achieved if and only if the associated signed digraph of the signed network is structurally balanced (respectively, structurally unbalanced), regardless of whether weight balance is satisfied. Furthermore, the improved Laplacian potential functions can be exploited to solve fixed-time consensus problems for signed networks with directed topologies, for which a nonlinear distributed protocol is proposed that ensures bipartite consensus or state stability within a fixed time. Additionally, the convergence analyses of directed signed networks can be carried out with the Lyapunov stability method, by revealing the tight relationship between the convergence behaviors of directed signed networks and the properties of the improved Laplacian potential functions. Illustrative examples demonstrate the validity of the theoretical results for directed signed networks.
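To make the nearest-neighbor rule over a signed digraph concrete, the sketch below simulates the classical signed-consensus dynamics x' = -L x, where L = C - A is the standard signed Laplacian (C = diag of row sums of |A|). This is a minimal sketch of the well-known protocol family this abstract builds on, not the paper's exact improved protocols; the adjacency matrix A, the initial states, and the step size are hypothetical choices for illustration.

```python
import numpy as np

# Hypothetical adjacency matrix of a structurally balanced signed digraph:
# agents {1, 2} and {3, 4} form two camps; edges within a camp are positive,
# edges across camps are negative. The weights are directed and deliberately
# NOT weight balanced (in-degrees differ from out-degrees).
A = np.array([
    [ 0.0,  1.0,  0.0, -1.0],
    [ 2.0,  0.0, -1.0,  0.0],
    [ 0.0, -1.5,  0.0,  1.0],
    [-1.0,  0.0,  0.5,  0.0],
])

def signed_laplacian(A):
    # L = C - A, with C = diag(row sums of |A|): the standard signed Laplacian.
    return np.diag(np.abs(A).sum(axis=1)) - A

L = signed_laplacian(A)

# Nearest-neighbor rule x_i' = sum_j |a_ij| (sign(a_ij) * x_j - x_i),
# i.e. x' = -L x, integrated here with forward Euler.
x = np.array([1.0, -2.0, 3.0, 0.5])   # arbitrary initial states
dt, steps = 0.01, 5000
for _ in range(steps):
    x = x - dt * (L @ x)

# Bipartite consensus: agents 1 and 2 agree on some value c,
# agents 3 and 4 on -c.
print(x)
```

Flipping the sign of a single cross-camp weight (say, setting A[0, 3] = +1.0) makes the digraph structurally unbalanced; rerunning the same loop then drives all states to zero, matching the structural-balance dichotomy stated above.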