Natural and artificial networks, from the cerebral cortex to large-scale power grids, face the challenge of converting noisy inputs into robust signals. The input fluctuations often exhibit complex yet statistically reproducible correlations that reflect underlying internal or environmental processes such as synaptic noise or atmospheric turbulence. This raises the practically and biophysically relevant question of whether and how noise filtering can be hard-wired directly into a network's architecture. By considering generic phase oscillator arrays under cost constraints, we explore here analytically and numerically the design, efficiency, and topology of noise-canceling networks. Specifically, we find that when the input fluctuations become more correlated in space or time, optimal network architectures become sparser and more hierarchically organized, resembling the vasculature in plants or animals. More broadly, our results provide concrete guiding principles for designing more robust and efficient power grids and sensor networks.
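The setup described above can be illustrated with a toy simulation, which is a minimal sketch rather than the paper's actual model: identical phase oscillators, diffusively coupled through a network, driven by Gaussian noise whose spatial correlation length `ell` is a free knob (`ell = 0` gives uncorrelated noise; large `ell` approaches common noise). All parameter values and the exponential correlation kernel are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def phase_variance(A, ell=0.0, sigma=0.5, T=5000, dt=0.01, seed=0):
    """Linearized phase dynamics theta' = -L theta + noise, with L the graph
    Laplacian of adjacency A. Returns the time-averaged variance of the phases
    across nodes, a proxy for how well the network filters the input noise.
    The noise covariance C_ij = exp(-|i - j| / ell) sets the spatial
    correlation length (assumed form; ell = 0 means uncorrelated noise)."""
    N = len(A)
    L = np.diag(A.sum(axis=1)) - A
    d = np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
    C = np.exp(-d / ell) if ell > 0 else np.eye(N)
    chol = np.linalg.cholesky(C + 1e-10 * np.eye(N))  # jitter for stability
    rng = np.random.default_rng(seed)
    theta = np.zeros(N)
    var = 0.0
    for _ in range(T):
        xi = chol @ rng.normal(size=N)  # spatially correlated noise sample
        theta += -dt * (L @ theta) + sigma * np.sqrt(dt) * xi
        var += np.var(theta)  # variance across nodes (uniform mode removed)
    return var / T

# Example: a ring of 10 oscillators under uncorrelated vs correlated noise.
N = 10
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0

v_uncorr = phase_variance(A, ell=0.0)
v_corr = phase_variance(A, ell=5.0)
```

Comparing `v_uncorr` and `v_corr` across different topologies (at fixed total link weight, i.e., fixed cost) is one way to probe numerically how the optimal filtering architecture depends on the noise correlation length.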
Synchronization is a widespread phenomenon observed in physical, biological, and social networks, which persists even under the influence of strong noise. Previous research on oscillators subject to common noise has shown that noise can actually facilitate synchronization, as correlations in the dynamics can be inherited from the noise itself. However, in many spatially distributed networks, such as the mammalian circadian system, the noise that different oscillators experience can be effectively uncorrelated. Here, we show that uncorrelated noise can in fact enhance synchronization when the oscillators are coupled. Strikingly, our analysis also shows that uncorrelated noise can be more effective than common noise in enhancing synchronization. We first establish these results theoretically for phase and phase-amplitude oscillators subject to additive noise, multiplicative noise, or both. We then confirm the predictions through experiments on coupled electrochemical oscillators. Our findings suggest that uncorrelated noise can promote rather than inhibit coherence in natural systems and that the same effect can be harnessed in engineered systems.
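The common-versus-uncorrelated comparison can be set up numerically with a standard noisy Kuramoto model. The sketch below uses Euler–Maruyama integration; the oscillator count, coupling, noise strength, and frequency spread are illustrative assumptions, not the regimes analyzed in the paper, so no particular ordering of the two order parameters is implied.

```python
import numpy as np

def simulate(N=20, K=1.0, sigma=0.3, common=False, T=2000, dt=0.01, seed=0):
    """Euler-Maruyama simulation of N all-to-all coupled Kuramoto phase
    oscillators with additive noise that is either common (one realization
    shared by all oscillators) or uncorrelated across oscillators.
    Returns the Kuramoto order parameter r in [0, 1] at the final time."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 0.1, N)        # heterogeneous natural frequencies
    theta = rng.uniform(0, 2 * np.pi, N)   # random initial phases
    for _ in range(T):
        # mean-field coupling: (K/N) * sum_j sin(theta_j - theta_i)
        coupling = (K / N) * np.sum(np.sin(theta[None, :] - theta[:, None]),
                                    axis=1)
        if common:
            xi = rng.normal() * np.ones(N)  # one shared noise realization
        else:
            xi = rng.normal(size=N)         # independent noise per oscillator
        theta += dt * (omega + coupling) + sigma * np.sqrt(dt) * xi
    return np.abs(np.mean(np.exp(1j * theta)))  # r = 1 means full synchrony

r_uncorr = simulate(common=False)
r_common = simulate(common=True)
```

Averaging `r` over many noise realizations and sweeping `sigma` at fixed `K` is the natural numerical experiment for comparing the two noise types.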
Behavioral homogeneity is often critical for the functioning of network systems of interacting entities. In power grids, whose stable operation requires generator frequencies to be synchronized--and thus homogeneous--across the network, previous work suggests that the stability of synchronous states can be improved by making the generators homogeneous. Here, we show that a substantial additional improvement is possible by instead making the generators suitably heterogeneous. We develop a general method for attributing this counterintuitive effect to converse symmetry breaking, a recently established phenomenon in which the system must be asymmetric to maintain a stable symmetric state. These findings constitute the first demonstration of converse symmetry breaking in real-world systems, and our method promises to enable identification of this phenomenon in other networks whose functions rely on behavioral homogeneity.
The relation between network structure and dynamics determines the behavior of complex systems in numerous domains. An important long-standing problem concerns the properties of the networks that optimize the dynamics with respect to a given performance measure. Here we show that such optimization can lead to sensitive dependence of the dynamics on the structure of the network. Specifically, using diffusively coupled systems as examples, we demonstrate that the stability of a dynamical state can exhibit sensitivity to unweighted structural perturbations (i.e., link removals and node additions) for undirected optimal networks and to weighted perturbations (i.e., small changes in link weights) for directed optimal networks. As mechanisms underlying this sensitivity, we identify discontinuous transitions occurring in the complement of undirected optimal networks and the prevalence of eigenvector degeneracy in directed optimal networks. These findings establish a unified characterization of networks optimized for dynamical stability, which we illustrate using Turing instability in activator-inhibitor systems, synchronization in power-grid networks, network diffusion, and several other network processes. Our results suggest that the network structure of a complex system operating near an optimum can potentially be fine-tuned for a significantly enhanced stability compared to what one might expect from simple extrapolation. On the other hand, they also suggest constraints on how close to the optimum the system can be in practice. Finally, the results have potential implications for biophysical networks, which have evolved under the competing pressures of optimizing fitness while remaining robust against perturbations.
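For diffusively coupled systems, a standard stability proxy is the spectrum of the graph Laplacian, so the effect of an unweighted structural perturbation can be probed by recomputing the smallest nonzero eigenvalue after a link removal. The sketch below does this for a small ring; it is a generic illustration of the stability measure, not the paper's optimization procedure.

```python
import numpy as np

def laplacian_gap(A):
    """Smallest nonzero Laplacian eigenvalue (algebraic connectivity) of an
    undirected network with adjacency matrix A. For diffusively coupled
    systems it governs how fast perturbations away from the homogeneous
    state decay, so a larger gap means a more stable state."""
    L = np.diag(A.sum(axis=1)) - A
    return np.sort(np.linalg.eigvalsh(L))[1]

# Ring of 6 nodes: every node coupled to its two neighbors.
N = 6
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0

lam2_before = laplacian_gap(A)       # ring: 2 * (1 - cos(2*pi/6)) = 1
A2 = A.copy()
A2[0, 1] = A2[1, 0] = 0.0            # unweighted perturbation: remove a link
lam2_after = laplacian_gap(A2)       # the ring becomes a path; the gap drops
```

Scanning all single-link removals of a candidate optimal network and recording the worst-case drop in the gap gives a simple numerical handle on the sensitivity discussed above.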
In self-organized criticality (SOC) models, as well as in standard phase transitions, criticality is only present for vanishing driving external fields $h \rightarrow 0$. Considering that this is rarely the case for natural systems, such a restriction poses a challenge to the explanatory power of these models. Moreover, in models of dissipative systems like earthquakes, forest fires, and neuronal networks, there is no true critical behavior, as expressed in clean power laws obeying finite-size scaling, but a scenario called dirty criticality or self-organized quasi-criticality (SOqC). Here, we propose simple homeostatic mechanisms which promote self-organization of coupling strengths, gains, and firing thresholds in neuronal networks. We show that near criticality can be reached and sustained even in the presence of external inputs because the firing thresholds adapt to and cancel the inputs, a phenomenon similar to perfect adaptation in sensory systems. Similar mechanisms can be proposed for the couplings and local thresholds in spin systems and cellular automata, which could lead to applications in earthquake, forest fire, stellar flare, voting and epidemic modeling.
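The threshold-adaptation idea can be illustrated with a deliberately stripped-down single-unit sketch, which is my own toy construction rather than the network model of the paper: the firing threshold slowly integrates the excess activity, so at the fixed point the firing rate equals a target value regardless of the constant external input, i.e., the threshold "cancels" the input, in the spirit of perfect adaptation.

```python
def adapt_threshold(h, steps=5000, tau=100.0, target=0.1):
    """Toy homeostatic rule (assumed form). A unit fires at rate
    f = max(h - theta, 0), and the threshold integrates the excess activity,
    theta <- theta + (f - target) / tau. The fixed point is f = target for
    any constant input h, with theta* = h - target absorbing the input.
    Returns the adapted firing rate and the final threshold."""
    theta = 0.0
    for _ in range(steps):
        f = max(h - theta, 0.0)            # thresholded firing rate
        theta += (f - target) / tau        # slow homeostatic adaptation
    return max(h - theta, 0.0), theta

# Different constant inputs; the rate adapts back to the same target level,
# while the threshold grows to absorb the larger input.
f1, th1 = adapt_threshold(h=1.0)
f2, th2 = adapt_threshold(h=2.0)
```

The deviation from the fixed point shrinks by a factor $(1 - 1/\tau)$ per step, so after a few thousand steps the rate sits at the target to numerical precision for either input.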
A scenario has recently been reported in which in order to stabilize complete synchronization of an oscillator network---a symmetric state---the symmetry of the system itself has to be broken by making the oscillators nonidentical. But how often does such behavior---which we term asymmetry-induced synchronization (AISync)---occur in oscillator networks? Here we present the first general scheme for constructing AISync systems and demonstrate that this behavior is the norm rather than the exception in a wide class of physical systems that can be seen as multilayer networks. Since a symmetric network in complete synchrony is the basic building block of cluster synchronization in more general networks, AISync should also be common in facilitating cluster synchronization by breaking the symmetry of the cluster subnetworks.