
Quantifying Emergence in terms of Persistent Mutual Information

Posted by: Prof. Robin C. Ball
Publication date: 2010
Research field: Physics
Paper language: English





We define Persistent Mutual Information (PMI) as the Mutual (Shannon) Information between the past history of a system and its evolution significantly later in the future. This quantifies how much past observations enable long-term prediction, which we propose as the primary signature of (Strong) Emergent Behaviour. The key feature of our definition of PMI is the omission of an interval of present time, so that the mutual information between close times is excluded: this renders PMI robust to superposed noise or chaotic behaviour or graininess of data, distinguishing it from a range of established Complexity Measures. For the logistic map we compare predicted with measured long-time PMI data. We show that measured PMI data captures not just the period-doubling cascade but also the associated cascade of banded chaos, without confusion by the overlayer of chaotic decoration. We find that the standard map has apparently infinite PMI, but with well defined fractal scaling which we can interpret in terms of the relative information codimension. Whilst our main focus is in terms of PMI over time, we can also apply the idea to PMI across space in spatially-extended systems as a generalisation of the notion of ordered phases.
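As a rough illustration of how a PMI-style quantity can be estimated in practice, the sketch below (Python, a minimal plug-in estimator under assumed settings, not the authors' procedure) generates a logistic-map time series and computes the binned mutual information between samples separated by a long gap, so that the intervening interval is skipped. The map parameter, gap length, bin count and histogram estimator are all illustrative assumptions.

import numpy as np

def logistic_series(r, n, x0=0.4, burn=1000):
    # Iterate the logistic map x -> r*x*(1-x) and return n samples after a burn-in.
    x = x0
    for _ in range(burn):
        x = r * x * (1.0 - x)
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

def mutual_information(a, b, bins=32):
    # Plug-in estimate of the Shannon mutual information I(a;b) in bits from a 2-D histogram.
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

# Crude PMI proxy: information between a sample and the sample `gap` steps later,
# with the interval in between deliberately left out of the estimate.
r, gap = 3.569, 500          # r near the period-doubling accumulation point (illustrative choice)
x = logistic_series(r, 200000)
past, future = x[:-gap], x[gap:]
print("estimated PMI proxy (bits):", mutual_information(past, future))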




Read also

Neuromorphic networks can be described in terms of coarse-grained variables, where emergent sustained behaviours spontaneously arise if stochasticity is properly taken into account. For example, it has recently been found that a directed linear chain of connected patches of neurons amplifies an input signal while also tuning its characteristic frequency. Here we study a generalization of such a simple model, introducing heterogeneity and variability in the parameter space and long-range interactions, which in turn break the preferential direction of information transmission of a directed chain. On one hand, enlarging the region of parameters leads to a more complex state space that we characterise analytically; moreover, we explicitly link the strength distribution of the non-local interactions with the frequency distribution of the network oscillations. On the other hand, we find that adding long-range interactions can cause the onset of novel phenomena, such as coherent and synchronous oscillations among all the interacting units, which can also coexist with the amplification of the signal.
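As a purely illustrative aside (a hypothetical toy, not the authors' model), the sketch below integrates a directed linear chain of coarse-grained rate units with additive noise, in which a slow weak input injected at the head of the chain grows in amplitude from stage to stage; every coefficient here is an assumption.

import numpy as np

# Hypothetical feed-forward chain x_1 -> x_2 -> ... -> x_N of noisy rate units.
# Each stage has gain coupling/decay > 1 at low frequency, so a slow input applied
# to the first unit is amplified as it propagates down the chain.
rng = np.random.default_rng(0)
N, steps, dt = 10, 20000, 0.01
decay, coupling, noise = 1.0, 1.2, 0.05               # assumed coefficients
x = np.zeros(N)
trace = np.zeros((steps, N))
for t in range(steps):
    drive = 0.1 * np.sin(2 * np.pi * 0.05 * t * dt)   # slow weak input on the first unit
    inp = np.concatenate(([drive], coupling * x[:-1]))  # strictly feed-forward coupling
    x += dt * (-decay * x + inp) + np.sqrt(dt) * noise * rng.standard_normal(N)
    trace[t] = x
print("std of first unit:", trace[:, 0].std(), "std of last unit:", trace[:, -1].std())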
We consider the estimation of an n-dimensional vector x from the knowledge of noisy and possibly non-linear element-wise measurements of xx^T, a very generic problem that contains, e.g., the stochastic 2-block model, submatrix localization, or the spike perturbation of random matrices. We use an interpolation method proposed by Guerra and later refined by Korada and Macris. We prove that the Bethe mutual information (related to the Bethe free energy and conjectured to be exact by Lesieur et al. on the basis of the non-rigorous cavity method) always yields an upper bound on the exact mutual information. We also provide a lower bound using a similar technique. For concreteness, we illustrate our findings on the sparse PCA problem, and observe that (a) our bounds match for a large region of parameters and (b) there exists a phase transition in a region where the spectrum remains uninformative. While we present only the case of rank-one symmetric matrix estimation, our proof technique is readily extendable to low-rank symmetric matrix or low-rank symmetric tensor estimation.
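For concreteness, one standard instance of this setup, written below in LaTeX, is the rank-one spiked Wigner model; the Gaussian noise model and the 1/\sqrt{n} scaling are assumptions chosen for illustration rather than details quoted from the abstract.

% Rank-one spiked matrix estimation: every pairwise product of the unknown
% vector x is observed through additive Gaussian noise.
\[
  Y_{ij} \;=\; \sqrt{\frac{\lambda}{n}}\, x_i x_j \;+\; \xi_{ij},
  \qquad \xi_{ij} \sim \mathcal{N}(0,1)\ \text{i.i.d.},\quad 1 \le i \le j \le n,
\]
% and the quantity being bounded is the per-variable mutual information
\[
  \frac{1}{n}\, I\big(\mathbf{x} \,;\, \mathbf{Y}\big).
\]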
We argue that the coordination of the activities of individual complex agents enables a system to develop and sustain complexity at a higher level. We exemplify relevant mechanisms through computer simulations of a toy system, a coupled map lattice with transmission delays. The coordination here is achieved through the synchronization of the chaotic operations of the individual elements, and on this basis regular behavior emerges at a longer temporal scale that is inaccessible to the uncoupled individual dynamics.
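The sketch below is a minimal toy of this kind: a ring of logistic maps where each site is coupled to the delayed state of its left neighbour. The lattice size, delay, coupling strength and map parameter are illustrative assumptions rather than the authors' settings, and the printed spread across the lattice simply indicates how close the units are to synchrony.

import numpy as np

# Ring of N chaotic logistic maps; each site mixes its own update with the update
# of its left neighbour as it was `delay` steps ago (a coupled map lattice with
# transmission delay). Sufficiently strong coupling can pull the units together.
rng = np.random.default_rng(1)
N, T, delay, eps, r = 50, 5000, 5, 0.4, 3.9
f = lambda x: r * x * (1.0 - x)              # local chaotic map
hist = rng.random((delay + 1, N))            # buffer of the last delay+1 lattice states
for t in range(T):
    now, past = hist[-1], hist[0]            # current state and state `delay` steps ago
    new = (1 - eps) * f(now) + eps * f(np.roll(past, 1))
    hist = np.vstack([hist[1:], new])
    if t >= T - 3:
        print("step", t, "spread across lattice:", new.std())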
In this paper, we study measures of quantum non-Markovianity based on the conditional mutual information. We obtain such measures by considering multiple parts of the total environment, so that the conditional mutual information can be defined in this multipartite setup. The benefit of this approach is that the conditional mutual information is closely related to recovery maps and Markov chains; we also point out its relation to the change of distinguishability. Along the way we study the properties of leaked information, the conditional mutual information that can flow back, and we use this leaked information to show that a correlated environment is necessary for the nonlocal memory effect.
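For reference, the central quantity here is the quantum conditional mutual information of a tripartite state, written below in terms of von Neumann entropies; the tripartite labelling is generic, with B standing for whichever part of the environment is conditioned on.

% Conditional mutual information of a tripartite state rho_{ABC},
% with S(rho) = -Tr(rho log rho) the von Neumann entropy:
\[
  I(A\!:\!C \,|\, B) \;=\; S(\rho_{AB}) + S(\rho_{BC}) - S(\rho_{B}) - S(\rho_{ABC}) \;\ge\; 0 ,
\]
% which vanishes exactly when A-B-C is a quantum Markov chain, i.e. when rho_{ABC}
% can be recovered from rho_{AB} by a channel acting on B alone.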
We study the statistical physics of a surprising phenomenon arising in large networks of excitable elements in response to noise: while at low noise, solutions remain in the vicinity of the resting state and large-noise solutions show asynchronous activity, the network displays orderly, perfectly synchronized periodic responses at intermediate levels of noise. We show that this phenomenon is fundamentally stochastic and collective in nature. Indeed, for noise and coupling within specific ranges, an asymmetry in the transition rates between a resting and an excited regime progressively builds up, leading to an increase in the fraction of excited neurons that eventually triggers a chain reaction associated with a macroscopic synchronized excursion and a collective return to rest, where this process starts afresh, thus yielding the observed periodic synchronized oscillations. We further uncover a novel anti-resonance phenomenon: noise-induced synchronized oscillations disappear when the system is driven by periodic stimulation with frequency within a specific range. In that anti-resonance regime, the system is optimal for measures of information capacity. This observation provides a new hypothesis accounting for the efficiency of Deep Brain Stimulation therapies in Parkinson's disease, a neurodegenerative disease characterized by an increased synchronization of brain motor circuits. We further discuss the universality of these phenomena in the class of stochastic networks of excitable elements with confining coupling, and illustrate this universality by analyzing various classical models of neuronal networks. Altogether, these results uncover universal mechanisms supporting a regularizing impact of noise in excitable systems, reveal a novel anti-resonance phenomenon in these systems, and propose a new hypothesis for the efficiency of high-frequency stimulation in Parkinson's disease.
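As a crude caricature of this mechanism (a toy model with assumed rates, not the authors' network), the sketch below simulates three-state stochastic excitable units cycling rest -> excited -> refractory -> rest, where excited units raise the excitation rate of resting ones; for suitable rates a few noise-triggered events can ignite macroscopic synchronized excursions followed by a collective return to rest.

import numpy as np

rng = np.random.default_rng(3)
N, steps, dt = 1000, 60000, 0.01
noise, coupling, decay, recover = 0.0002, 3.0, 1.0, 0.1   # assumed rates
state = np.zeros(N, dtype=int)            # 0 = rest, 1 = excited, 2 = refractory
excited_frac = np.zeros(steps)
for t in range(steps):
    m = np.mean(state == 1)               # current excited fraction (the collective feedback)
    u = rng.random(N)
    fire = (state == 0) & (u < (noise + coupling * m) * dt)   # rest -> excited
    tire = (state == 1) & (u < decay * dt)                    # excited -> refractory
    wake = (state == 2) & (u < recover * dt)                  # refractory -> rest
    state[fire], state[tire], state[wake] = 1, 2, 0
    excited_frac[t] = m
# Count macroscopic excursions: upward crossings of the 10% excited level.
crossings = np.sum((excited_frac[1:] > 0.1) & (excited_frac[:-1] <= 0.1))
print("macroscopic synchronized excursions observed:", int(crossings))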