Pairwise interactions between individuals are taken as fundamental drivers of collective behavior responsible for group cohesion and decision-making. While an individual directly influences only a few neighbors, over time indirect influences penetrate a much larger group. The abiding question is how this spread of influence comes to affect the collective. One or a few individuals are often identified as leaders, being more influential than others. Transfer entropy and time-delayed mutual information are commonly used to identify underlying asymmetric interactions, such as leader-follower classification in aggregated individuals (cells, birds, fish, and other animals). However, these measures conflate distinct functional modes of information flow between individuals. Computing information measures conditioned on multiple agents requires properly sampling a probability distribution whose dimension grows exponentially with the number of agents conditioned on. Employing simple models of interacting self-propelled particles, we examine the pitfalls of using time-delayed mutual information and transfer entropy to quantify the strength of influence from a leader to a follower. Surprisingly, one must be wary of these pitfalls even for two interacting particles. As an alternative, we decompose transfer entropy and time-delayed mutual information into intrinsic, shared, and synergistic modes of information flow. The result not only properly reveals the underlying effective interactions, but also facilitates a more detailed diagnosis of how individual interactions lead to collective behavior. This exposes the role of individual and group memory in collective behavior. In addition, we demonstrate in a multi-agent system how knowledge of the decomposed information modes between a single pair of agents reveals the nature of many-body interactions without conditioning on additional agents.
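To make the two quantities concrete, the following minimal Python sketch (not the authors' code) computes plug-in estimates of time-delayed mutual information, I(X_t; Y_{t+tau}), and transfer entropy, T_{X->Y} = I(Y_{t+tau}; X_t | Y_t), for a toy two-agent leader-follower pair with binary (discretized) headings. The series length, noise level, and variable names are illustrative assumptions, not values from the paper.

    import numpy as np
    from collections import Counter

    def entropy(samples):
        # Plug-in Shannon entropy (bits) of a sequence of hashable symbols.
        counts = np.array(list(Counter(samples).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def time_delayed_mi(x, y, tau=1):
        # I(X_t; Y_{t+tau}): total time-lagged dependence of Y on X.
        xs, ys = x[:-tau], y[tau:]
        return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

    def transfer_entropy(x, y, tau=1):
        # T_{X->Y} = I(Y_{t+tau}; X_t | Y_t): information about Y's next state
        # carried by X's past beyond what Y's own past already provides.
        x_past, y_past, y_next = x[:-tau], y[:-tau], y[tau:]
        h_next_given_y = entropy(list(zip(y_next, y_past))) - entropy(y_past)
        h_next_given_xy = (entropy(list(zip(y_next, y_past, x_past)))
                           - entropy(list(zip(y_past, x_past))))
        return h_next_given_y - h_next_given_xy

    # Toy leader-follower pair: the follower copies the leader's binary heading
    # one step later, corrupted by 10% flip noise.
    rng = np.random.default_rng(0)
    leader = rng.integers(0, 2, size=20000)
    noise = (rng.random(20000) < 0.1).astype(int)
    follower = np.roll(leader, 1) ^ noise

    print("TDMI I(L_t; F_{t+1}) =", time_delayed_mi(leader, follower))
    print("TE   T_{L->F}        =", transfer_entropy(leader, follower))
    print("TE   T_{F->L}        =", transfer_entropy(follower, leader))

In this toy pair both quantities come out large in the leader-to-follower direction and near zero in reverse. The abstract's point is that such raw values can still mislead, because each mixes intrinsic, shared, and synergistic contributions that the proposed decomposition separates.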