
Multiplicative versus additive noise in multi-state neural networks

Posted by Jordi Busquets Blanco
Publication date: 2004
Research field: Physics
Paper language: English





The effects of a variable amount of random dilution of the synaptic couplings in Q-Ising multi-state neural networks with Hebbian learning are examined. A fraction of the couplings is explicitly allowed to be anti-Hebbian. Random dilution represents the dying or pruning of synapses and, hence, a static disruption of the learning process which can be considered as a form of multiplicative noise in the learning rule. Both parallel and sequential updating of the neurons can be treated. Symmetric dilution in the statics of the network is studied using the mean-field theory approach of statistical mechanics. General dilution, including asymmetric pruning of the couplings, is examined using the generating functional (path-integral) approach of disordered systems. It is shown that random dilution acts as additive Gaussian noise in the Hebbian learning rule, with zero mean and a variance depending on the connectivity of the network and on the symmetry. Furthermore, a scaling factor appears that essentially measures the average amount of anti-Hebbian couplings.
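To make the multiplicative-to-additive correspondence concrete, here is a minimal numerical sketch; it is my own illustration, not code from the paper, and all sizes and parameter values are arbitrary. Randomly diluting a Hebbian coupling matrix and subtracting the rescaled original leaves a residual that behaves like zero-mean noise whose variance is set by the connectivity, mirroring the result quoted above.

    # Illustrative sketch (not the paper's code): random dilution of Hebbian
    # couplings decomposes into a rescaled Hebbian part plus zero-mean noise.
    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 500, 25        # neurons, stored patterns (arbitrary values)
    c = 0.6               # connectivity: fraction of surviving couplings

    # Q=3 Ising patterns with states {-1, 0, +1}
    xi = rng.choice([-1.0, 0.0, 1.0], size=(P, N))

    # Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu
    J = xi.T @ xi / N
    np.fill_diagonal(J, 0.0)

    # Symmetric random dilution: keep each coupling with probability c
    keep = rng.random((N, N)) < c
    keep = np.triu(keep, 1)
    keep = keep | keep.T
    J_diluted = np.where(keep, J, 0.0)

    # Decomposition J_diluted = c*J + noise: the residual has zero mean and
    # a variance depending on the connectivity, i.e. it acts as additive
    # noise on a rescaled Hebbian rule.
    noise = J_diluted - c * J
    off = ~np.eye(N, dtype=bool)
    print("residual mean:", noise[off].mean())
    print("residual std :", noise[off].std())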




Read also

The principle of adaptation in a noisy retrieval environment is extended here to a diluted attractor neural network of Q-state neurons trained with noisy data. The network is adapted to an appropriate noisy training overlap and training activity, which are determined self-consistently by the optimized retrieval attractor overlap and activity. The optimized storage capacity and the corresponding retriever overlap are considerably enhanced by an adequate threshold in the states. Explicit results for improved optimal performance and new retriever phase diagrams are obtained for Q=3 and Q=4, with coexisting phases over a wide range of thresholds. Most of the interesting results are stable to replica-symmetry-breaking fluctuations.
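To give a feel for what "a threshold in the states" does for Q-state neurons, here is a toy sketch under the assumption of the commonly used zero-temperature Q-Ising gain function; the choice of b and the state set are illustrative, not taken from the paper. The updated state maximises s*h - b*s^2, so the threshold b sets how wide a band of local fields is mapped to the zero state.

    # Toy Q=3 neuron update with a threshold b in the states (illustration)
    import numpy as np

    def q_ising_update(h, states=(-1.0, 0.0, 1.0), b=0.5):
        """Pick, for each local field h, the state s maximising s*h - b*s^2."""
        s = np.asarray(states)
        idx = np.argmax(np.outer(np.atleast_1d(h), s) - b * s**2, axis=1)
        return s[idx]

    # Fields with |h| < b land in the zero state; raising b widens that band.
    print(q_ising_update(np.linspace(-2.0, 2.0, 9), b=0.5))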
D. Bolle, R. Heylen (2007)
The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of both layered feedforward and fully connected neural network models with synaptic noise. These two types of architectures require a different method to be solved numerically. In both cases it is shown that, if the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the stored patterns, adapting itself automatically in the course of the recall process, an autonomous functioning of the network is guaranteed. This self-control mechanism considerably improves the quality of retrieval, in particular the storage capacity, the basins of attraction and the mutual information content.
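A rough sketch of the self-control idea follows; it is an assumption-laden toy, not the paper's derivation. In particular, the functional form theta = sqrt(-2 ln a) * sigma is one choice that has been used for sparse patterns, where a is the pattern activity and sigma the cross-talk noise strength; the point is only that the threshold is recomputed from these two quantities at every recall step, with no external tuning.

    # Hedged sketch of a self-adapting threshold (form assumed, see above)
    import math

    def self_control_threshold(sigma_crosstalk, activity):
        """Recompute the threshold from the current cross-talk noise level
        and the activity a of the stored patterns."""
        return math.sqrt(-2.0 * math.log(activity)) * sigma_crosstalk

    # During recall the network re-estimates the cross-talk noise each step
    # and resets its own threshold (the noise values below are made up).
    for t, sigma in enumerate([0.40, 0.31, 0.22, 0.15]):
        theta = self_control_threshold(sigma, activity=0.1)
        print(f"step {t}: theta = {theta:.3f}")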
D. Bolle, R. Heylen (2006)
The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of layered feedforward neural network models with synaptic noise. It is shown that if the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the stored patterns, adapting itself automatically in the course of the recall process, an autonomous functioning of the network is guaranteed. This self-control mechanism considerably improves the quality of retrieval, in particular the storage capacity, the basins of attraction and the mutual information content.
The three-state Ising neural network with synchronous updating and variable dilution is discussed starting from the appropriate Hamiltonians. The thermodynamic and retrieval properties are examined using replica mean-field theory. Capacity-temperature phase diagrams are derived for several values of the pattern activity and different gradations of dilution, and the information content is calculated. The results are compared with those for sequential updating. The effect of self-coupling is established. Also the dynamics is studied using the generating function technique for both synchronous and sequential updating. Typical flow diagrams for the overlap order parameter are presented. The differences with the signal-to-noise approach are outlined.
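For orientation, here is a toy contrast between the two updating modes compared in this abstract; it is my own sketch, with an assumed zero-temperature gain function and a single-pattern Hebbian matrix, not the replica or generating-function treatment of the paper. Synchronous (parallel) updating applies the gain function to all neurons at once; sequential updating sweeps through them one at a time.

    # Toy three-state network: synchronous vs sequential updating (sketch)
    import numpy as np

    rng = np.random.default_rng(1)

    def gain(h, b=0.3):
        """Zero-temperature three-state gain: maximise s*h - b*s^2."""
        s = np.array([-1.0, 0.0, 1.0])
        return s[np.argmax(np.outer(np.atleast_1d(h), s) - b * s**2, axis=1)]

    def step_synchronous(J, s, b=0.3):
        return gain(J @ s, b)                  # all neurons updated at once

    def step_sequential(J, s, b=0.3):
        s = s.copy()
        for i in rng.permutation(len(s)):      # one neuron at a time
            s[i] = gain(J[i] @ s, b)[0]
        return s

    # Demo: store one pattern, start from a corrupted version, compare modes
    N = 20
    xi = rng.choice([-1.0, 0.0, 1.0], size=N)
    J = np.outer(xi, xi) / N
    np.fill_diagonal(J, 0.0)
    s0 = np.where(rng.random(N) < 0.8, xi, 0.0)   # 20% of sites zeroed out
    print("sync overlap:", step_synchronous(J, s0) @ xi / (xi @ xi))
    print("seq  overlap:", step_sequential(J, s0) @ xi / (xi @ xi))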
D. Bolle, P. Kozlowski (2001)
We show that for a particular choice of the coupling parameters the Ashkin-Teller spin-glass neural network model with the Hebb learning rule and one condensed pattern yields the same thermodynamic properties as the four-state anisotropic Potts-glass neural network model. This equivalence is not seen at the level of the Hamiltonians.