We study information processing in populations of Boolean networks with evolving connectivity and systematically explore the interplay between learning capability, robustness, network topology, and task complexity. We solve a long-standing open question and find computationally that, for large system sizes $N$, adaptive information processing drives the networks to a critical connectivity $K_{c}=2$. For finite-size networks, the connectivity approaches the critical value as a power law of the system size $N$. We show that network learning and generalization are optimized near criticality, given that the task complexity and the amount of information provided exceed threshold values. Both random and evolved networks exhibit maximal topological diversity near $K_{c}$. We hypothesize that this supports efficient exploration and robustness of solutions. This is also reflected in our observation that the variance of these values is maximal in critical network populations. Finally, we discuss implications of our results for determining the optimal topology of adaptive dynamical networks that solve computational tasks.
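The critical connectivity $K_{c}=2$ for unbiased Boolean functions is the value at which, in the standard annealed (Derrida) picture, a one-bit perturbation spreads on average to $K\cdot 2p(1-p)=K/2$ nodes per update. The sketch below is illustrative only (it is not the authors' evolutionary setup; the network size, trial count, and damage measure are arbitrary choices) and estimates this one-step damage growth for random Boolean networks of fixed in-degree $K$; values near 1 indicate marginal, i.e. critical, damage spreading.

```python
import numpy as np

def random_rbn(n, k, rng):
    """Random Boolean network: each node reads k random inputs and
    applies an unbiased random Boolean function (random lookup table)."""
    inputs = np.array([rng.choice(n, size=k, replace=False) for _ in range(n)])
    tables = rng.integers(0, 2, size=(n, 2 ** k))
    return inputs, tables

def step(state, inputs, tables):
    """Synchronously update all nodes."""
    k = inputs.shape[1]
    idx = (state[inputs] * (2 ** np.arange(k))).sum(axis=1)
    return tables[np.arange(len(state)), idx]

def damage_growth(n=500, k=2, trials=100, seed=0):
    """Average one-step spread of a single-bit perturbation.
    Values near 1 indicate criticality (expected at K = 2 for unbiased functions)."""
    rng = np.random.default_rng(seed)
    growth = []
    for _ in range(trials):
        inputs, tables = random_rbn(n, k, rng)
        s = rng.integers(0, 2, size=n)
        s_pert = s.copy()
        s_pert[rng.integers(n)] ^= 1          # flip one randomly chosen node
        d1 = (step(s, inputs, tables) != step(s_pert, inputs, tables)).sum()
        growth.append(d1)                     # initial Hamming distance is 1
    return float(np.mean(growth))

if __name__ == "__main__":
    for k in (1, 2, 3):
        print(k, damage_growth(k=k))
```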
Adaptation plays a fundamental role in shaping the structure of a complex network and improving its functional fitness. Even though increasing the level of synchronization in a biological system is often considered the main driving force for adaptation, there is evidence of negative effects induced by excessive synchronization. This indicates that coherence alone cannot explain all the structural features observed in many real-world networks. In this work, we propose an adaptive network model in which the dynamical evolution of the node states towards synchronization is coupled with an evolution of the link weights governed by an anti-Hebbian adaptive rule, which accounts for the presence of inhibitory effects in the system. We find that the emergent networks spontaneously develop the structural conditions to sustain explosive synchronization. Our results can shed light on the shaping mechanisms at the heart of the structural and dynamical organization of some relevant biological systems, namely brain networks, for which the emergence of explosive synchronization has been observed.
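A minimal sketch of the kind of co-evolving dynamics described above, assuming Kuramoto-type phase oscillators on a fully weighted graph and an anti-Hebbian rule that weakens links between phase-locked nodes, dw_ij/dt = eps * (-cos(theta_i - theta_j) - w_ij). The oscillator model, the exact adaptation rule, and all parameters are assumptions made for illustration, not the paper's definitions.

```python
import numpy as np

def simulate(n=100, eps=0.05, coupling=1.0, dt=0.01, steps=10000, seed=1):
    """Phase oscillators with co-evolving link weights (a sketch, not the paper's model).
    Anti-Hebbian adaptation: weights relax toward -cos(theta_i - theta_j),
    so links between synchronized pairs are suppressed."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, 2 * np.pi, n)
    omega = rng.normal(0, 1, n)                  # natural frequencies
    w = rng.uniform(0, 1, (n, n))
    w = (w + w.T) / 2                            # symmetric initial weights
    np.fill_diagonal(w, 0)
    for _ in range(steps):
        diff = theta[None, :] - theta[:, None]   # theta_j - theta_i
        theta += dt * (omega + (coupling / n) * (w * np.sin(diff)).sum(axis=1))
        w += dt * eps * (-np.cos(diff) - w)      # anti-Hebbian weight update
        np.fill_diagonal(w, 0)
    r = np.abs(np.exp(1j * theta).mean())        # Kuramoto order parameter
    return r, w

if __name__ == "__main__":
    r, w = simulate()
    print("order parameter:", round(r, 3))
```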
Typical properties of computing circuits composed of noisy logical gates are studied using statistical physics methodology. A growth model that gives rise to typical random Boolean functions is mapped onto a layered Ising spin system, which facilitates the study of such circuits' ability to represent arbitrary formulae with a given level of error, the tolerable level of gate noise, and its dependence on the formula depth and complexity, the gates used, and the properties of the function inputs. Bounds on circuit performance, derived in the information-theory literature for specific gates, are straightforwardly retrieved, generalized, and identified as the corresponding typical-case phase transitions. The framework is employed to derive results on error rates, function depth, and sensitivity, and their dependence on the gate type and noise model used, results that are difficult to obtain via the traditional methods of this field.
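As a toy counterpart to the typical-case analysis above (a Monte Carlo sketch rather than the statistical-physics calculation), the code below evaluates a balanced formula of noisy two-input NAND gates and measures how often its output disagrees with the noiseless evaluation as the formula depth grows. The choice of NAND gates, uniform random inputs, and an output-flip noise model are assumptions made for illustration.

```python
import random

def formula_output(depth, eps, inputs, noisy=True):
    """Evaluate a complete binary NAND formula of the given depth over
    `inputs` (length 2**depth). Each gate flips its output with
    probability eps when noisy."""
    layer = list(inputs)
    for _ in range(depth):
        nxt = []
        for a, b in zip(layer[0::2], layer[1::2]):
            out = 1 - (a & b)                       # NAND
            if noisy and random.random() < eps:
                out ^= 1                            # gate noise: flip the output
            nxt.append(out)
        layer = nxt
    return layer[0]

def error_rate(depth, eps, trials=2000):
    """Fraction of random inputs on which the noisy formula disagrees
    with its noiseless evaluation."""
    errors = 0
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(2 ** depth)]
        if formula_output(depth, eps, x) != formula_output(depth, 0.0, x, noisy=False):
            errors += 1
    return errors / trials

if __name__ == "__main__":
    for depth in (2, 4, 6, 8):
        print(depth, error_rate(depth, eps=0.05))
```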
We investigate analytically and numerically the critical line in undirected random Boolean networks with arbitrary degree distributions, including scale-free topologies of connections $P(k)\sim k^{-\gamma}$. We show that in infinite scale-free networks the transition between the frozen and chaotic phases occurs for $3<\gamma<3.5$. The observation is interesting for two reasons. First, since most critical phenomena in scale-free networks reveal their non-trivial character for $\gamma<3$, the position of the critical line in the Kauffman model appears to be an important exception to this rule. Second, since gene regulatory networks are characterized by scale-free topology with $\gamma<3$, the observation that in finite-size networks this transition moves towards smaller $\gamma$ is an argument for the Kauffman model as a good starting point for modeling real systems. We also explain that the unattainability of the critical line in numerical simulations of classical random graphs is due to percolation phenomena.
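A back-of-the-envelope illustration of why the window just above $\gamma=3$ is special (this is not the paper's calculation): assuming the usual annealed criterion in which damage spreads when $2p(1-p)\,\langle k(k-1)\rangle/\langle k\rangle>1$, the sketch computes the mean excess degree for a truncated power-law degree distribution and checks whether a critical bias $p$ exists. For $\gamma\le 3$ the mean excess degree diverges with the cutoff, pushing the critical bias toward zero, while for $\gamma$ well above 3 no critical bias exists at all; the precise boundary depends on ensemble details (minimum degree, undirected links, finite-size cutoff), which this rough criterion does not capture.

```python
import numpy as np

def excess_branching(gamma, kmin=1, kmax=10**6):
    """Mean excess degree <k(k-1)>/<k> for P(k) ~ k^-gamma, k >= kmin,
    truncated at kmax (a proxy for a large but finite network)."""
    k = np.arange(kmin, kmax + 1, dtype=float)
    p = k ** (-gamma)
    p /= p.sum()
    return (k * (k - 1) * p).sum() / (k * p).sum()

def critical_bias(gamma):
    """Solve 2p(1-p) * B = 1 for p, if a solution exists in (0, 1/2]."""
    B = excess_branching(gamma)
    q = 1.0 / B                      # required single-input sensitivity
    if q > 0.5:
        return None                  # no bias p in (0, 1) reaches this sensitivity
    return 0.5 * (1 - np.sqrt(1 - 2 * q))

if __name__ == "__main__":
    for gamma in (2.8, 3.1, 3.3, 3.6):
        print(gamma, critical_bias(gamma))
```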
Despite their apparent simplicity, random Boolean networks display a rich variety of dynamical behaviors. Much work has focused on the properties and abundance of attractors. The topologies of random Boolean networks with one input per node can be viewed as graphs of random maps. We introduce an approach to investigating random maps and to finding analytical results for attractors in random Boolean networks with the corresponding topology. Approximating some other non-chaotic networks as belonging to this class, we apply the analytic results to them and, for this approximation, observe strikingly good agreement in the numbers of attractors of various lengths. We also investigate observables related to the average number of attractors in relation to the typical number of attractors. Here, we find strong differences that highlight the difficulties in making direct comparisons between random Boolean networks and real systems. Furthermore, we demonstrate the power of our approach by deriving some results for random maps, including the distribution of the number of components in random maps and asymptotic expansions for cumulants up to fourth order.
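To make the "graphs of random maps" picture concrete, here is a small numerical sketch (not the analytical approach of the paper) that samples uniform random maps on N points and records their cycle lengths, i.e. the attractor lengths of the induced dynamics. Classical random-mapping results, such as the expected number of components growing roughly like (1/2) ln N, can be checked against such samples.

```python
import random
from collections import Counter

def attractor_lengths(n, rng=random):
    """Sample a uniform random map f: {0..n-1} -> {0..n-1} and return the
    lengths of its cycles (the attractors of iterating f)."""
    f = [rng.randrange(n) for _ in range(n)]
    color = [0] * n              # 0 = unvisited, else 1 + index of the walk that first saw it
    lengths = []
    for start in range(n):
        if color[start]:
            continue
        path, x = [], start
        while color[x] == 0:
            color[x] = start + 1
            path.append(x)
            x = f[x]
        if color[x] == start + 1:                    # the walk closed on itself: new cycle
            lengths.append(len(path) - path.index(x))
    return lengths

if __name__ == "__main__":
    counts = Counter()
    for _ in range(500):
        for L in attractor_lengths(1000):
            counts[L] += 1
    print(sorted(counts.items())[:10])               # abundance of short attractors
```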
We investigate the dynamics of two models of biological networks with purely suppressive interactions between the units: species interacting via niche competition and neurons coupled via inhibitory synapses. In both cases, power-law scaling of the density of states arises without any fine-tuning of the model parameters. These results argue against the increasingly popular notion that non-equilibrium living systems operate at special critical points, driven there by evolution so as to enable adaptive processing of input data.
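Purely as an illustration of the second model class (neurons coupled only through inhibition), and not the authors' model, the sketch below simulates binary threshold units that receive heterogeneous excitatory drive and interact solely via random inhibitory links, recording the population activity whose statistics one would then histogram; the update rule and all parameter values are arbitrary choices made for the sketch.

```python
import numpy as np

def inhibitory_network_activity(n=200, k=10, g=2.0, steps=5000, seed=0):
    """Binary threshold units with constant external drive and purely
    suppressive random couplings; returns the time series of the number
    of active units."""
    rng = np.random.default_rng(seed)
    W = np.zeros((n, n))
    for i in range(n):
        targets = rng.choice(n, size=k, replace=False)
        W[i, targets] = -g / k                  # inhibitory links only
    state = rng.integers(0, 2, size=n)
    drive = rng.uniform(0.5, 1.5, size=n)       # heterogeneous excitatory drive
    activity = []
    for _ in range(steps):
        inp = drive + W @ state
        state = (inp > 1.0).astype(int)         # threshold update
        activity.append(state.sum())
    return np.array(activity)

if __name__ == "__main__":
    a = inhibitory_network_activity()
    hist, edges = np.histogram(a, bins=30)
    print(hist)
```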