Collective behavior, both in real biological systems and in theoretical models, often displays a rich combination of different kinds of order. A clear-cut and unique definition of phase based on the standard concept of order parameter may therefore be complicated, and it is made even trickier by the lack of thermodynamic equilibrium. Compression-based entropies have proved useful in recent years in describing the different phases of out-of-equilibrium systems. Here, we investigate the performance of a compression-based entropy, namely the Computable Information Density (CID), within the Vicsek model of collective motion. Our entropy is defined through a crude coarse-graining of the particle positions, so that the velocities, which play a key role in the model, enter only indirectly through the velocity-density coupling. We find that this entropy is a valid tool for distinguishing the various noise regimes, including the crossover between the aligned and misaligned phases of the velocities, even though velocities do not enter its computation. Furthermore, we unveil the subtle role of the time coordinate, unexplored in previous studies of the CID: a new encoding recipe, in which space and time locality are preserved on an equal footing, is shown to reduce the CID. Such an improvement is particularly significant when working with partial and/or corrupted data, as is often the case in real biological experiments.
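As a rough illustration of the quantities involved, the minimal Python sketch below shows how a CID estimate (ratio of losslessly compressed to uncompressed length) could be computed from coarse-grained particle positions, and how a frame-by-frame byte ordering differs from a space-time tiled ordering. This is not the authors' implementation: the choice of zlib as compressor, the 64x64 coarse-graining lattice, the tile size, and the random stand-in data are all assumptions made for illustration only.

```python
# Minimal sketch (illustrative assumptions, not the paper's code) of a
# Computable Information Density (CID) estimate for coarse-grained,
# Vicsek-like position data.
import zlib
import numpy as np

def coarse_grain(positions, box_size, n_cells):
    """Bin 2D particle positions into an n_cells x n_cells occupancy lattice."""
    idx = np.floor(positions / box_size * n_cells).astype(int) % n_cells
    lattice = np.zeros((n_cells, n_cells), dtype=np.uint8)
    np.add.at(lattice, (idx[:, 0], idx[:, 1]), 1)
    return lattice

def cid(byte_stream):
    """Ratio of compressed to uncompressed length under lossless compression."""
    return len(zlib.compress(byte_stream, 9)) / len(byte_stream)

def encode_frames(stack):
    """Serialize a (T, n, n) stack frame by frame: only spatial locality
    within each snapshot is preserved in the byte stream."""
    return stack.reshape(stack.shape[0], -1).tobytes()

def encode_spacetime(stack, tile=4):
    """Serialize the same stack in small space-time tiles, so that nearby
    sites of nearby frames end up adjacent in the byte stream."""
    T, n, _ = stack.shape
    tiles = (stack.reshape(T // tile, tile, n // tile, tile, n // tile, tile)
                  .transpose(0, 2, 4, 1, 3, 5))  # group (t, x, y) tiles together
    return np.ascontiguousarray(tiles).tobytes()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical stand-in for Vicsek trajectories: random positions per frame.
    stack = np.stack([coarse_grain(rng.uniform(0, 10, (512, 2)), 10.0, 64)
                      for _ in range(16)])
    print("frame-ordered CID   :", cid(encode_frames(stack)))
    print("space-time tiled CID:", cid(encode_spacetime(stack)))
```

With structured trajectories in place of the random stand-in data, the space-time tiled encoding is the kind of recipe that, as argued above, can expose temporal correlations to the compressor and thereby lower the measured CID.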