
The behaviour of information flow near criticality

Added by Matthijs Meijers
Publication date: 2019
Field: Physics
Language: English





Recent experiments have indicated that many biological systems self-organise near their critical point, which hints at a common design principle. While it has been suggested that information transmission is optimized near the critical point, it remains unclear how information transmission depends on the dynamics of the input signal, the distance over which the information needs to be transmitted, and the distance to the critical point. Here we employ stochastic simulations of a driven 2D Ising system and study the instantaneous mutual information and the information transmission rate between a driven input spin and an output spin. The instantaneous mutual information varies non-monotonically with the temperature, but increases monotonically with the correlation time of the input signal. In contrast, the information transmission rate exhibits a maximum as a function of the input correlation time. Moreover, there exists an optimal temperature that maximizes this maximum information transmission rate. It arises from a tradeoff between the need to respond quickly to changes in the input, so that more information can be transmitted per unit time, and the need to respond reliably. The optimal temperature lies above the critical point, but moves towards it as the distance between the input and output spins is increased.
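The setup described above lends itself to a compact Monte Carlo sketch. The snippet below is a minimal illustration, not the authors' code: it clamps one spin of a 2D Ising lattice to a binary telegraph signal, evolves the remaining spins with single-spin-flip Glauber (heat-bath) dynamics, and estimates the instantaneous mutual information between the input signal and an output spin a few lattice sites away. The lattice size, temperature, switching rate, and input-output distance are all arbitrary illustrative choices; estimating the information transmission rate would additionally require trajectory-level statistics rather than equal-time ones.

    import numpy as np

    rng = np.random.default_rng(0)

    # illustrative (assumed) parameters: lattice size, temperature, input correlation
    # time in sweeps, number of sweeps, and input-output distance in lattice units
    L, T, tau = 32, 2.6, 200.0
    n_sweeps, d = 5000, 4

    spins = rng.choice([-1, 1], size=(L, L))
    inp, out = (0, 0), (d, 0)          # driven input spin and read-out spin
    signal = 1                         # binary telegraph input s(t)

    def glauber_sweep(s, T, clamp):
        # one sweep of single-spin-flip Glauber (heat-bath) dynamics,
        # leaving the clamped input spin untouched
        for _ in range(s.size):
            i, j = rng.integers(L), rng.integers(L)
            if (i, j) == clamp:
                continue
            h = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
            p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))
            s[i, j] = 1 if rng.random() < p_up else -1

    counts = np.zeros((2, 2))          # joint histogram of (input signal, output spin)
    for t in range(n_sweeps):
        if rng.random() < 1.0 / tau:   # telegraph switching with rate 1/tau per sweep
            signal = -signal
        spins[inp] = signal
        glauber_sweep(spins, T, inp)
        counts[(signal + 1) // 2, (spins[out] + 1) // 2] += 1

    # plug-in estimate of the instantaneous mutual information (in bits)
    p = counts / counts.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    mi = np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz]))
    print(f"I(input; output spin at distance {d}) ~ {mi:.3f} bits at T = {T}")

Sweeping T and tau in such a scan is how the non-monotonic temperature dependence and the optimum in the input correlation time would be probed.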



Related research

Collective behavior, both in real biological systems and in theoretical models, often displays a rich combination of different kinds of order. A clear-cut and unique definition of phase based on the standard concept of an order parameter may therefore be complicated, and is made even trickier by the lack of thermodynamic equilibrium. Compression-based entropies have proven useful in recent years for describing the different phases of out-of-equilibrium systems. Here, we investigate the performance of a compression-based entropy, namely the Computable Information Density (CID), within the Vicsek model of collective motion. Our entropy is defined through a crude coarse-graining of the particle positions, in which the key role of velocities in the model enters only indirectly through the velocity-density coupling. We find that this entropy is a valid tool for distinguishing the various noise regimes, including the crossover between an aligned and a misaligned phase of the velocities, despite the fact that velocities are not used by this entropy. Furthermore, we unveil the subtle role of the time coordinate, unexplored in previous studies on the CID: a new encoding recipe, in which spatial and temporal locality are preserved on the same footing, is shown to reduce the CID. Such an improvement is particularly significant when working with partial and/or corrupted data, as is often the case in real biological experiments.
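As a rough illustration of what a compression-based entropy of coarse-grained positions looks like in practice, the toy function below bins particle positions onto an occupancy grid and reports compressed length over raw length. It uses zlib as a stand-in compressor (the CID proper is defined with a specific LZ77 encoding) and compresses a single snapshot rather than the space-time encoding discussed above; all parameter values are arbitrary.

    import zlib
    import numpy as np

    def cid(positions, box_size, n_bins):
        # crude coarse-graining: occupancy grid of the particle positions
        grid = np.zeros((n_bins, n_bins), dtype=np.uint8)
        idx = np.floor(positions / box_size * n_bins).astype(int) % n_bins
        grid[idx[:, 0], idx[:, 1]] = 1
        raw = grid.tobytes()
        # compression-based entropy: compressed length over raw length
        return len(zlib.compress(raw, 9)) / len(raw)

    # usage: a disordered configuration compresses worse than a clustered one
    rng = np.random.default_rng(1)
    disordered = rng.uniform(0.0, 1.0, size=(2000, 2))
    clustered = (0.5 + 0.05 * rng.standard_normal((2000, 2))) % 1.0
    print(cid(disordered, 1.0, 64), cid(clustered, 1.0, 64))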
We present a field-theoretic renormalization group (RG) analysis of a single flexible, screened polyelectrolyte chain (a Debye-Hückel chain) in a polar solvent. We point out that the Debye-Hückel chain may be mapped onto a local field theory which has the same fixed point as a generalised $n \to 1$ Potts model. Systematic analysis of the field theory shows that the system is one with two interplaying length scales, requiring the calculation of scaling functions as well as exponents to fully describe its physical behaviour. To illustrate this, we solve the RG equation and explicitly calculate the exponents and the mean end-to-end length of the chain.
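For orientation, a Debye-Hückel chain is usually defined by an Edwards-type Hamiltonian in which monomers interact through the screened Coulomb (Yukawa) potential; the generic form below is standard notation and not necessarily the exact normalisation used in this work.

    $$ \frac{H}{k_B T} = \frac{3}{2 b^2} \int_0^N \mathrm{d}s \left( \frac{\partial \mathbf{r}(s)}{\partial s} \right)^{2} + \frac{l_B f^2}{2} \int_0^N \mathrm{d}s \int_0^N \mathrm{d}s' \, \frac{e^{-\kappa \lvert \mathbf{r}(s) - \mathbf{r}(s') \rvert}}{\lvert \mathbf{r}(s) - \mathbf{r}(s') \rvert} $$

Here $b$ is the Kuhn length, $f$ the fraction of charged monomers, $l_B$ the Bjerrum length, and $\kappa^{-1}$ the Debye screening length; the two interplaying length scales mentioned above are then, presumably, the chain dimension and $\kappa^{-1}$.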
Critical exponents of the infinitely slowly driven Zhang model of self-organized criticality are computed for $d=2,3$, with particular emphasis on the various roughening exponents. Besides confirming recent estimates of some exponents, new quantities are monitored and their critical exponents computed. Among other results, it is shown that the three-dimensional exponents do not coincide with those of the Bak, Tang, and Wiesenfeld (abelian) model, and that the dynamical exponents computed from the correlation length and from the roughness of the energy profile do not necessarily coincide, as is usually implicitly assumed. An explanation for this is provided. The possibility of comparing these results with those obtained from Renormalization Group arguments is also briefly addressed.
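For readers unfamiliar with the model, the Zhang automaton replaces the integer heights of the Bak-Tang-Wiesenfeld sandpile with continuous energies: a random amount is added to a random site, and any site exceeding a threshold resets to zero and shares its energy equally among its neighbours. The sketch below (with arbitrary parameter values) implements this infinitely-slowly-driven dynamics and records avalanche sizes, the kind of raw data from which critical exponents such as those discussed above are typically extracted.

    import numpy as np
    from collections import deque

    rng = np.random.default_rng(2)
    L, E_c = 32, 1.0                   # lattice size and toppling threshold (assumed)
    E = np.zeros((L, L))               # continuous on-site energies

    def relax(E):
        # topple every over-threshold site: reset it to zero and share its energy
        # equally among the four neighbours; open boundaries dissipate energy
        active = deque(zip(*np.where(E >= E_c)))
        n_topplings = 0
        while active:
            i, j = active.popleft()
            if E[i, j] < E_c:
                continue
            share = E[i, j] / 4.0
            E[i, j] = 0.0
            n_topplings += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < L and 0 <= nj < L:
                    E[ni, nj] += share
                    if E[ni, nj] >= E_c:
                        active.append((ni, nj))
        return n_topplings

    # infinitely slow drive: add a small random energy grain, then relax completely
    sizes = []
    for _ in range(20000):
        i, j = rng.integers(L), rng.integers(L)
        E[i, j] += rng.uniform(0.0, 0.25)
        sizes.append(relax(E))

    sizes = np.array(sizes)
    print("mean avalanche size:", sizes[sizes > 0].mean())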
Information flow between components of a system takes many forms and is key to understanding the organization and functioning of large-scale, complex systems. We demonstrate three modalities of information flow from time series X to time series Y. Intrinsic information flow exists when the past of X is individually predictive of the present of Y, independent of Y's past; this is what is most commonly meant by information flow. Shared information flow exists when X's past is predictive of Y's present in the same manner as Y's past; this occurs due to synchronization or common driving, for example. Finally, synergistic information flow occurs when neither X's nor Y's past is predictive of Y's present on its own, but taken together they are. The two most broadly employed information-theoretic methods of quantifying information flow---time-delayed mutual information and transfer entropy---are each sensitive to a pair of these modalities: time-delayed mutual information to both intrinsic and shared flow, and transfer entropy to both intrinsic and synergistic flow. To quantify each mode individually we introduce our cryptographic flow ansatz, positing that intrinsic flow is synonymous with secret key agreement between X and Y. Based on this, we employ an easily computed secret-key-agreement bound---the intrinsic mutual information---to quantify the three flow modalities in a variety of systems, including asymmetric flows and financial markets.
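To make the two standard measures concrete, the sketch below gives naive plug-in estimators, for binary time series, of the time-delayed mutual information $I(X_{t-1}; Y_t)$ and the transfer entropy $T_{X \to Y} = I(X_{t-1}; Y_t \mid Y_{t-1})$. It only illustrates the quantities whose sensitivities are contrasted above; it is not the secret-key-agreement bound (intrinsic mutual information) introduced by the authors.

    import numpy as np

    def entropy(samples):
        # plug-in Shannon entropy (bits) of the joint symbols given as rows
        _, counts = np.unique(samples, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def time_delayed_mi(x, y):
        # I(X_{t-1}; Y_t): picks up intrinsic and shared flow
        a, b = x[:-1, None], y[1:, None]
        return entropy(a) + entropy(b) - entropy(np.hstack([a, b]))

    def transfer_entropy(x, y):
        # T_{X->Y} = I(X_{t-1}; Y_t | Y_{t-1}): picks up intrinsic and synergistic flow
        xp, yp, yf = x[:-1, None], y[:-1, None], y[1:, None]
        return (entropy(np.hstack([xp, yp])) + entropy(np.hstack([yp, yf]))
                - entropy(yp) - entropy(np.hstack([xp, yp, yf])))

    # toy check: Y copies X with a one-step delay, so both measures approach 1 bit
    rng = np.random.default_rng(3)
    x = rng.integers(0, 2, 10000)
    y = np.roll(x, 1)
    print(time_delayed_mi(x, y), transfer_entropy(x, y))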
The dominant reaction pathway (DRP) is a rigorous framework to microscopically compute the most probable trajectories in non-equilibrium transitions. In the low-temperature regime, such dominant pathways encode the information about the reaction mechanism and can be used to estimate non-equilibrium averages of arbitrary observables. At sufficiently high temperatures, on the other hand, the stochastic fluctuations around the dominant paths become important and have to be taken into account. In this work, we develop a technique to systematically include the effects of such fluctuations to order $k_B T$. This method is used to compute the probability for a transition to take place through a specific reaction channel and to evaluate the reaction rate.
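For context, in the overdamped limit the dominant reaction pathway is the minimiser of the effective action obtained from the Onsager-Machlup path weight (including the contribution of the functional measure); the standard form, written with $\beta = 1/(k_B T)$, diffusion coefficient $D$, and potential $U(x)$, reads

    $$ S_{\mathrm{eff}}[x] = \int_0^t \mathrm{d}\tau \left[ \frac{\dot{x}^2}{4D} + V_{\mathrm{eff}}(x) \right] + \frac{\beta}{2}\bigl[U(x_f) - U(x_i)\bigr], \qquad V_{\mathrm{eff}}(x) = \frac{D}{4}\left[ \beta^2 \lvert \nabla U \rvert^2 - 2\beta \nabla^2 U \right]. $$

On this reading, the order-$k_B T$ corrections referred to above correspond to Gaussian fluctuations around the minimising path; the precise expansion used in the work itself is not reproduced here.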