Pairwise interactions between individuals are taken as fundamental drivers of collective behavior responsible for group cohesion and decision-making. While an individual directly influences only a few neighbors, over time indirect influences penetrate a much larger group. The abiding question is how this spread of influence comes to affect the collective. One or a few individuals are often identified as leaders, being more influential than others. Transfer entropy and time-delayed mutual information are used to identify underlying asymmetric interactions, such as leader-follower relationships among aggregated individuals: cells, birds, fish, and other animals. However, these measures conflate distinct functional modes of information flow between individuals. Computing them conditioned on multiple agents requires proper sampling of a probability distribution whose dimension grows exponentially with the number of agents conditioned on. Employing simple models of interacting self-propelled particles, we examine the pitfalls of using time-delayed mutual information and transfer entropy to quantify the strength of influence from a leader to a follower. Surprisingly, one must be wary of these pitfalls even for two interacting particles. As an alternative, we decompose transfer entropy and time-delayed mutual information into intrinsic, shared, and synergistic modes of information flow. The result not only properly reveals the underlying effective interactions, but also facilitates a more detailed diagnosis of how individual interactions lead to collective behavior. This exposes the role of individual and group memory in collective behaviors. In addition, we demonstrate in a multi-agent system how knowledge of the decomposed information modes between a single pair of agents reveals the nature of many-body interactions without conditioning on additional agents.
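As a concrete illustration of the two measures named above (a minimal sketch, not the paper's own particle models), the following estimates time-delayed mutual information I(X_{t-1}; Y_t) and transfer entropy I(X_{t-1}; Y_t | Y_{t-1}) for a hypothetical leader-follower pair via plug-in estimation on binned series. The coupling coefficients, noise levels, and bin count are all assumptions.

```python
# A minimal sketch, not the paper's models: plug-in estimates of
# time-delayed mutual information I(X_{t-1}; Y_t) and transfer entropy
# I(X_{t-1}; Y_t | Y_{t-1}) for a hypothetical leader-follower pair.
import numpy as np

rng = np.random.default_rng(0)

T = 100_000
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.normal(scale=0.5)                   # leader
    y[t] = 0.6 * y[t - 1] + 0.3 * x[t - 1] + rng.normal(scale=0.5)  # follower

def discretize(z, bins=8):
    """Map a real-valued series onto equal-probability bins."""
    edges = np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1])
    return np.digitize(z, edges)

def entropy(*cols):
    """Joint Shannon entropy (bits) of one or more discrete series."""
    joint = np.stack(cols, axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

xd, yd = discretize(x), discretize(y)
xp, yp, yn = xd[:-1], yd[:-1], yd[1:]  # X_{t-1}, Y_{t-1}, Y_t

tdmi = entropy(xp) + entropy(yn) - entropy(xp, yn)
te = entropy(xp, yp) + entropy(yn, yp) - entropy(xp, yn, yp) - entropy(yp)
print(f"TDMI = {tdmi:.3f} bits, TE = {te:.3f} bits")
```

Both values come out positive in the X to Y direction here; the abstract's warning is that neither number by itself separates intrinsic flow from the shared and synergistic contributions mixed into it.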
Information flow between components of a system takes many forms and is key to understanding the organization and functioning of large-scale, complex systems. We demonstrate three modalities of information flow from time series X to time series Y. Intrinsic information flow exists when the past of X is individually predictive of the present of Y, independent of Y's past; this is the mode most commonly considered to be information flow. Shared information flow exists when X's past is predictive of Y's present in the same manner as Y's past; this occurs due to synchronization or common driving, for example. Finally, synergistic information flow occurs when neither X's nor Y's past is predictive of Y's present on its own, but taken together they are. The two most broadly employed information-theoretic methods of quantifying information flow---time-delayed mutual information and transfer entropy---are each sensitive to a pair of these modalities: time-delayed mutual information to both intrinsic and shared flow, and transfer entropy to both intrinsic and synergistic flow. To quantify each mode individually we introduce our cryptographic flow ansatz, positing that intrinsic flow is synonymous with secret key agreement between X and Y. Based on this, we employ an easily computed secret-key-agreement bound---the intrinsic mutual information---to quantify the three flow modalities in a variety of systems, including asymmetric flows and financial markets.
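In compact notation (ours, not the authors' verbatim definitions, and treating the stated sensitivities as an additive decomposition for illustration), with X^- and Y^- the pasts of X and Y and Y_t the present of Y, the relations described above can be summarized as:

```latex
% Hedged summary in our own notation; \downarrow denotes the intrinsic
% (secret-key-agreement bound) mutual information of Maurer-Wolf type.
\begin{align*}
  \text{TDMI}_{X \to Y} &= I(X^-; Y_t) = \text{intrinsic} + \text{shared},\\
  \text{TE}_{X \to Y}   &= I(X^-; Y_t \mid Y^-) = \text{intrinsic} + \text{synergistic},\\
  I(X^-; Y_t \downarrow Y^-) &= \min_{p(\bar{y} \mid y^-)} I(X^-; Y_t \mid \bar{Y}) \approx \text{intrinsic}.
\end{align*}
```

The minimization over channels applied to Y's past is what makes the intrinsic mutual information computable in practice, and it upper-bounds the secret key that X's past and Y's present could agree on against an eavesdropper holding Y's past.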
Understanding and even defining what constitutes animal interactions remains a challenging problem. Correlational tools may be inappropriate for detecting communication between a set of many agents exhibiting nonlinear behavior. A different approach is to define coordinated motions in terms of an information-theoretic channel of direct causal information flow. In this work, we consider time series data obtained by an experimental protocol of optical tracking of the insect species Chironomus riparius. The data constitute reconstructed 3-D flight trajectories and kinematics of the insects. We present an application of the optimal causation entropy (oCSE) principle to identify direct causal relationships, or information channels, among the insects. The collection of channels inferred by oCSE describes a network of information flow within the swarm. We find that information channels with a long spatial range are more common than expected under the assumption that causal information flows should be spatially localized. The tools developed herein are general and applicable to the inference and study of intercommunication networks in a wide variety of natural settings.
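The abstract does not spell out the oCSE procedure itself; the following is a hedged sketch of the greedy scheme the principle names (aggregative discovery followed by progressive removal), with a fixed information-gain threshold standing in for the permutation-style significance test an actual implementation would use.

```python
# A hedged sketch of an oCSE-style greedy search; the fixed `threshold`
# replaces the statistical significance test used in practice.
import numpy as np

def cond_mi(a, b, cond, bins=6):
    """Plug-in conditional mutual information I(a; b | cond) in bits,
    with real-valued series mapped onto equal-probability bins."""
    def disc(z):
        edges = np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(z, edges)
    def H(cols):
        joint = np.stack(cols, axis=1)
        _, counts = np.unique(joint, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))
    a, b = disc(a), disc(b)
    c = [disc(z) for z in cond]
    if not c:
        return H([a]) + H([b]) - H([a, b])
    return H([a] + c) + H([b] + c) - H([a, b] + c) - H(c)

def ocse_parents(series, target, threshold=0.01):
    """Infer direct information channels into `target`.
    series: dict name -> 1-D array, all sharing one time index."""
    y_next = series[target][1:]
    past = {k: v[:-1] for k, v in series.items()}
    parents = []
    while True:  # aggregative discovery: add the most informative candidate
        candidates = [k for k in past if k not in parents]
        if not candidates:
            break
        cond = [past[p] for p in parents]
        gains = {k: cond_mi(past[k], y_next, cond) for k in candidates}
        best = max(gains, key=gains.get)
        if gains[best] < threshold:
            break
        parents.append(best)
    for p in list(parents):  # progressive removal: drop redundant parents
        others = [past[q] for q in parents if q != p]
        if cond_mi(past[p], y_next, others) < threshold:
            parents.remove(p)
    return parents
```

Run for every individual in turn (e.g. `ocse_parents({"m1": x1, "m2": x2, ...}, "m1")`, with hypothetical track names), the recovered parent sets assemble into the directed information-flow network the abstract describes.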
Opinion formation is an important element of social dynamics. It has been widely studied in recent years with tools from physics, mathematics and computer science. Here, a continuous model of opinion dynamics for multiple possible choices is analysed. Its main features are the inclusion of disagreement and the possibility of modulating information, both from one and from multiple sources. Our interest is in identifying the effect of the initial cohesion of the population, the interplay between cohesion and information extremism, and the effect of using multiple sources of information that can influence the system. As numerical simulations show, the final consensus, especially with external information, depends strongly on these factors. When no information is present, consensus or segregation is determined by the initial cohesion of the population. Interestingly, when only one source of information is present, consensus can in general be obtained only when this source is extremely mild, i.e. no single opinion is strongly promoted, or in the special case of a large initial cohesion and low information exposure. On the contrary, when multiple information sources are allowed, consensus can emerge even with an information source that is not extremely mild, i.e. one that carries a strong message, for a large range of initial conditions.
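The abstract does not state the model's equations, so the following is purely illustrative: a generic bounded-confidence-style update with one external information source, not the authors' model. Every parameter name and value here is our invention, chosen only to show how cohesion and information exposure enter such dynamics.

```python
# Purely illustrative stand-in, NOT the authors' model: bounded-confidence
# pairwise compromise plus occasional exposure to one external source.
import numpy as np

rng = np.random.default_rng(1)

N, steps = 200, 50_000
mu = 0.3             # convergence rate toward an interaction partner
eps = 0.25           # confidence bound: larger means more initial cohesion
info_opinion = 0.8   # the single opinion promoted by the external source
info_weight = 0.01   # per-step exposure to the external source

opinions = rng.uniform(0.0, 1.0, N)
for _ in range(steps):
    i, j = rng.integers(N, size=2)
    if abs(opinions[i] - opinions[j]) < eps:   # only nearby opinions interact
        mid = (opinions[i] + opinions[j]) / 2
        opinions[i] += mu * (mid - opinions[i])
        opinions[j] += mu * (mid - opinions[j])
    k = rng.integers(N)                        # occasional media exposure
    opinions[k] += info_weight * (info_opinion - opinions[k])

# A small spread indicates consensus; persistent clusters, segregation.
print(f"final mean {opinions.mean():.2f}, spread {opinions.std():.3f}")
```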
We study a system of self-propelled agents in which each agent has only a partial, rather than omnidirectional or panoramic, view of its sensor disc: the field of vision of each agent is a sector of a disc bounded by two radii and the included arc. The angle between these two radii is the view angle. Contrary to our intuition, we find that this non-omnidirectional view for swarm agents with periodic boundary conditions in the noiseless Vicsek model can accelerate the transient process of the emergence of the ordered state. One implication is that there are generally superfluous communications in the Vicsek model, which may even obstruct fast swarm emergence. This phenomenon may motivate further efforts to explore the underlying mechanism of emergence in self-propelled agents.
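A minimal sketch of the setup described above, under assumed parameter values: noiseless Vicsek dynamics on a periodic box in which each agent averages only the headings of neighbors inside a sector of opening `view` centered on its own heading.

```python
# A minimal sketch with assumed parameters, not the paper's experiments:
# noiseless Vicsek model with a restricted view sector per agent.
import numpy as np

rng = np.random.default_rng(2)

N, L, r, v0 = 300, 10.0, 1.0, 0.1
view = np.pi  # view angle in radians; 2*pi recovers the omnidirectional case
pos = rng.uniform(0, L, (N, 2))
theta = rng.uniform(-np.pi, np.pi, N)

def step(pos, theta):
    new_theta = np.empty(N)
    for i in range(N):
        d = pos - pos[i]
        d -= L * np.round(d / L)                 # periodic boundary conditions
        dist = np.hypot(d[:, 0], d[:, 1])
        # Bearing of each neighbor relative to agent i's heading.
        bearing = np.arctan2(d[:, 1], d[:, 0]) - theta[i]
        bearing = (bearing + np.pi) % (2 * np.pi) - np.pi
        seen = (dist < r) & (np.abs(bearing) <= view / 2)
        seen[i] = True                           # an agent always counts itself
        # Noiseless update: adopt the mean heading of visible neighbors.
        new_theta[i] = np.arctan2(np.sin(theta[seen]).mean(),
                                  np.cos(theta[seen]).mean())
    vel = v0 * np.stack([np.cos(new_theta), np.sin(new_theta)], axis=1)
    return (pos + vel) % L, new_theta

for _ in range(200):
    pos, theta = step(pos, theta)

# Polar order parameter: 1 means fully aligned (the ordered state).
phi = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
print(f"order parameter after 200 steps: {phi:.3f}")
```

Sweeping `view` below 2*pi and timing how quickly `phi` approaches 1 is the kind of comparison the abstract's acceleration claim refers to.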
We define Persistent Mutual Information (PMI) as the Mutual (Shannon) Information between the past history of a system and its evolution significantly later in the future. This quantifies how much past observations enable long-term prediction, which we propose as the primary signature of (Strong) Emergent Behaviour. The key feature of our definition of PMI is the omission of an interval of present time, so that the mutual information between close times is excluded: this renders PMI robust to superposed noise, chaotic behaviour, or graininess of data, distinguishing it from a range of established Complexity Measures. For the logistic map we compare predicted with measured long-time PMI data. We show that the measured PMI captures not just the period-doubling cascade but also the associated cascade of banded chaos, without confusion by the overlayer of chaotic decoration. We find that the standard map has apparently infinite PMI, but with well-defined fractal scaling which we can interpret in terms of the relative information codimension. Whilst our main focus is PMI over time, we can also apply the idea to PMI across space in spatially extended systems, as a generalisation of the notion of ordered phases.
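A minimal sketch of a PMI estimate for the logistic map (our binning and block choices: single symbols rather than full past and future blocks, a plug-in estimator, and an omitted interval of 100 steps).

```python
# A minimal sketch, not the paper's estimator: PMI for the logistic map
# as I(x_t ; x_{t+gap}) with an omitted interval of length `gap`.
import numpy as np

def logistic_series(r, n, burn=1000, x0=0.4):
    """Iterate x -> r*x*(1-x), discarding a transient of `burn` steps."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    out = np.empty(n)
    for t in range(n):
        x = r * x * (1 - x)
        out[t] = x
    return out

def pmi(series, gap=100, bins=16):
    """Plug-in estimate of I(x_t ; x_{t+gap}) in bits."""
    a, b = series[:-gap], series[gap:]
    edges = np.linspace(series.min(), series.max(), bins + 1)[1:-1]
    ad, bd = np.digitize(a, edges), np.digitize(b, edges)
    def H(cols):
        joint = np.stack(cols, axis=1)
        _, counts = np.unique(joint, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))
    return H([ad]) + H([bd]) - H([ad, bd])

# Period-4 regime: the phase of the cycle is remembered forever, so PMI
# stays near log2(4) = 2 bits even across the omitted interval.
x = logistic_series(r=3.52, n=100_000)
print(f"PMI at r=3.52: {pmi(x):.2f} bits")
# Fully chaotic regime: long-term prediction fails and PMI is near zero.
x = logistic_series(r=4.0, n=100_000)
print(f"PMI at r=4.00: {pmi(x):.2f} bits")
```

The omitted interval is what makes the first value informative: short-range determinism is excluded by construction, so the surviving 2 bits reflect genuinely persistent (phase-of-cycle) memory.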