We present and study a probabilistic neural automaton in which the fraction of simultaneously updated neurons is a parameter, rho in (0, 1). For small rho, the system relaxes towards one of the attractors and shows great sensitivity to external stimuli; for rho >= rho_c, there is itinerancy among attractors. Tuning rho in this regime, oscillations may abruptly change from regular to chaotic and vice versa, which allows one to control the efficiency of the searching process. We discuss the similarity of the model's behavior to recent observations and the possible role of chaos in neurobiology.
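The partially parallel update rule described in this abstract can be sketched in a few lines. This is a minimal, zero-noise illustration (the paper's automaton is probabilistic): network size, pattern count, and the Hebbian couplings below are arbitrary assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 100, 3                          # neurons, stored patterns (illustrative)
patterns = rng.choice([-1, 1], size=(P, N))
# Hebbian couplings without self-interaction
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def step(s, rho):
    """Update a randomly chosen fraction rho of the neurons in parallel."""
    idx = rng.choice(N, size=max(1, int(rho * N)), replace=False)
    h = J[idx] @ s                     # local fields of the chosen neurons
    s = s.copy()
    s[idx] = np.where(h >= 0, 1, -1)   # deterministic (zero-noise) update
    return s

# start near pattern 0, with 10 of 100 spins flipped
s = patterns[0].copy()
s[rng.choice(N, size=10, replace=False)] *= -1

for _ in range(50):
    s = step(s, rho=0.1)               # small rho: relaxation to the attractor

overlap = s @ patterns[0] / N          # close to 1 after relaxation
```

For small rho the dynamics is nearly sequential and the state relaxes to the nearest stored pattern; the regime the abstract describes for rho >= rho_c would appear when a large fraction of neurons is updated at once.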
The dynamics of neural networks is often characterized by collective behavior and quasi-synchronous events, where a large fraction of neurons fire in short time intervals, separated by uncorrelated firing activity. These global temporal signals are c…
The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of both layered feedforward and fully connected neural network models with synaptic noise. These two types of architectures require a different method to be solved…
The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of layered feedforward neural network models with synaptic noise. It is shown that if the threshold is chosen appropriately as a function of the cross-talk noise…
We investigate the dynamical role of inhibitory and highly connected nodes (hubs) in the synchronization and input processing of leaky integrate-and-fire neural networks with short-term synaptic plasticity. We take advantage of a heterogeneous mean-field…
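A single leaky integrate-and-fire neuron, the building block of the networks in this abstract, can be simulated with forward-Euler integration. The membrane parameters below are illustrative textbook values, not those of the paper, and the sketch omits the synaptic plasticity and network coupling the study is about.

```python
# Forward-Euler integration of one leaky integrate-and-fire neuron:
#   tau * dV/dt = -(V - V_rest) + R * I,  with spike-and-reset at V_th.
tau, R = 20.0, 10.0                          # time constant (ms), resistance (MOhm)
V_rest, V_th, V_reset = -65.0, -50.0, -65.0  # resting, threshold, reset (mV)
dt, T = 0.1, 200.0                           # time step and duration (ms)
I = 2.0                                      # constant input current (nA), assumed

V = V_rest
spikes = []
for k in range(int(T / dt)):
    V += dt / tau * (-(V - V_rest) + R * I)  # leak towards V_rest + R*I
    if V >= V_th:
        spikes.append(k * dt)                # record spike time and reset
        V = V_reset

n_spikes = len(spikes)
```

With these values the steady-state voltage V_rest + R*I lies above threshold, so the neuron fires regularly at an interval of roughly tau * ln(R*I / (R*I - (V_th - V_rest))).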
For the retrieval dynamics of sparsely coded attractor associative memory models with synaptic noise, the inclusion of a macroscopic time-dependent threshold is studied. It is shown that if the threshold is chosen appropriately as a function of the cr…