The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of layered feedforward neural network models with synaptic noise. It is shown that if the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the stored patterns, adapting itself automatically in the course of the recall process, an autonomous functioning of the network is guaranteed. This self-control mechanism considerably improves the quality of retrieval, in particular the storage capacity, the basins of attraction and the mutual information content.
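As a rough illustration of the self-control mechanism described above, the following minimal simulation sketch (not taken from the paper) retrieves a sparse pattern in a layered feedforward network with covariance-rule couplings and an activity-dependent threshold. The threshold form theta_t = sqrt(-2 ln a) * sqrt(alpha * q_t), with q_t the instantaneous neural activity, is one self-consistent choice matching the scaling of the cross-talk noise in this setting; it is not necessarily the exact expression of the paper, and the network size, load alpha and input noise level are arbitrary illustrative values.

import numpy as np

rng = np.random.default_rng(0)

N      = 2000        # neurons per layer (illustrative)
a      = 0.05        # activity of the stored sparse patterns
alpha  = 0.2         # storage load: P = alpha * N patterns per layer
P      = int(alpha * N)
LAYERS = 10          # number of feedforward layers (= time steps)

# Independent sparse binary patterns in {0, 1} for every layer, P(xi = 1) = a.
xi = (rng.random((LAYERS + 1, P, N)) < a).astype(float)

def hebb(post, pre):
    """Covariance (Hebbian) couplings between two consecutive layers."""
    return (post - a).T @ (pre - a) / (a * (1.0 - a) * N)

# Input layer: pattern 0 corrupted by 10% random flips.
x = xi[0, 0].copy()
flip = rng.random(N) < 0.10
x[flip] = 1.0 - x[flip]

c_a = np.sqrt(-2.0 * np.log(a))   # assumed self-control prefactor c(a)

for t in range(LAYERS):
    J = hebb(xi[t + 1], xi[t])        # couplings feeding layer t+1
    h = J @ x                         # local fields in layer t+1
    q = x.mean()                      # macroscopic activity of the current layer
    theta = c_a * np.sqrt(alpha * q)  # adaptive threshold ~ c(a) * (cross-talk noise std)
    x = (h > theta).astype(float)     # threshold dynamics of layer t+1
    m = ((xi[t + 1, 0] - a) * (x - a)).mean() / (a * (1.0 - a))  # overlap with pattern 0
    print(f"layer {t + 1:2d}: overlap m = {m:+.3f}  activity = {x.mean():.3f}  theta = {theta:.3f}")

Starting from a corrupted input, the overlap with the stored pattern should grow from layer to layer while the threshold adjusts itself as the activity settles near a, with no parameter tuned by hand during recall.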
The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of both layered feedforward and fully connected neural network models with synaptic noise. These two types of architectures require a different method to be solved
For the retrieval dynamics of sparsely coded attractor associative memory models with synaptic noise the inclusion of a macroscopic time-dependent threshold is studied. It is shown that if the threshold is chosen appropriately as a function of the cr
The dynamics of neural networks is often characterized by collective behavior and quasi-synchronous events, where a large fraction of neurons fire in short time intervals, separated by uncorrelated firing activity. These global temporal signals are c
We calculate analytically the critical connectivity $K_c$ of Random Threshold Networks (RTN) for homogeneous and inhomogeneous thresholds, and confirm the results by numerical simulations. We find a super-linear increase of $K_c$ with the (average) a
The effects of a variable amount of random dilution of the synaptic couplings in Q-Ising multi-state neural networks with Hebbian learning are examined. A fraction of the couplings is explicitly allowed to be anti-Hebbian. Random dilution represents