We show that discrete synaptic weights can be efficiently used for learning in large-scale neural systems, and lead to unanticipated computational performance. We focus on the representative case of learning random patterns with binary synapses in single-layer networks. The standard statistical analysis shows that this problem is exponentially dominated by isolated solutions that are extremely hard to find algorithmically. Here, we introduce a novel method that allows us to find analytical evidence for the existence of subdominant and extremely dense regions of solutions. Numerical experiments confirm these findings. We also show that the dense regions are surprisingly accessible by simple learning protocols, and that these synaptic configurations are robust to perturbations and generalize better than typical solutions. These outcomes extend to synapses with multiple states and to deeper neural architectures. The large deviation measure also suggests how to design novel algorithmic schemes for optimization based on local entropy maximization.
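To make the setting above concrete (random ±1 patterns stored in a single-layer network with binary synapses), here is a minimal Python sketch of a clipped-perceptron-style rule that keeps hidden integer states and exposes only their signs as the binary weights. The rule itself, the sizes N and P, and the cutoff K are illustrative assumptions; this is not the large-deviation method or the specific learning protocols referred to in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 1001, 200   # synapses (odd, to avoid ties) and patterns (assumed sizes)
K = 15             # cutoff of the hidden integer states (assumed)

# Random +-1 patterns with random +-1 labels: the storage problem above.
X = rng.choice([-1, 1], size=(P, N))
y = rng.choice([-1, 1], size=P)

h = rng.integers(-K, K + 1, size=N)      # hidden integer states
w = np.sign(h) + (h == 0)                # binary weights in {-1, +1}

errors = P
for epoch in range(500):
    errors = 0
    for mu in rng.permutation(P):
        if y[mu] * (w @ X[mu]) <= 0:                # pattern misclassified
            errors += 1
            h = np.clip(h + y[mu] * X[mu], -K, K)   # shift and clip hidden states
            w = np.sign(h) + (h == 0)               # weight = sign of hidden state
    if errors == 0:
        break
print(f"epoch {epoch + 1}: {errors} patterns misclassified")
```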
Stochastic neural networks are a prototypical computational device able to build a probabilistic representation of an ensemble of external stimuli. Building on the relationship between inference and learning, we derive a synaptic plasticity rule that …
We investigate the dynamics of continuous attractor neural networks (CANNs). Due to the translational invariance of their neuronal interactions, CANNs can hold a continuous family of stationary states. We systematically explore how their neutral stability …
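The role of translational invariance mentioned above can be illustrated with a toy threshold-linear ring network whose coupling depends only on the angular distance between neurons, so a stationary activity bump exists at every position on the ring. The model and all parameter values below are a generic sketch, not the specific CANN analyzed in this paper.

```python
import numpy as np

# Toy translation-invariant ring network with threshold-linear rate units.
N = 256
theta = np.linspace(-np.pi, np.pi, N, endpoint=False)

J0, J1 = -1.0, 3.0   # uniform inhibition plus local excitation (assumed values)
W = (J0 + J1 * np.cos(theta[:, None] - theta[None, :])) / N

dt, tau, I_ext = 0.1, 1.0, 0.05
rng = np.random.default_rng(1)
u = 0.1 * rng.standard_normal(N)          # small random initial state

for _ in range(3000):                     # relax to a stationary profile
    r = np.maximum(u, 0.0)                # threshold-linear firing rates
    u += (dt / tau) * (-u + W @ r + I_ext)

print(f"bump centered at theta = {theta[np.argmax(u)]:+.2f} rad")
# W depends only on theta_i - theta_j, so every rotation of this bump is
# also a stationary state: a continuous family of attractors, neutrally
# stable against shifts along the ring.
```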
We investigate the dynamical role of inhibitory and highly connected nodes (hubs) in the synchronization and input processing of leaky integrate-and-fire neural networks with short-term synaptic plasticity. We take advantage of a heterogeneous mean-field …
We introduce a model of generalized Hebbian learning and retrieval in oscillatory neural networks modeling cortical areas such as hippocampus and olfactory cortex. Recent experiments have shown that synaptic plasticity depends on spike timing, especially …
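For reference, the spike-timing dependence of plasticity mentioned above is commonly modeled with an exponential STDP window whose sign depends on the order of pre- and postsynaptic spikes. The amplitudes and time constants below are common illustrative values, not the generalized Hebbian kernel of this paper.

```python
import math

# Generic exponential STDP window (Bi & Poo-style); all values assumed.
A_plus, A_minus = 0.010, 0.012     # LTP / LTD amplitudes
tau_plus, tau_minus = 20.0, 20.0   # decay time constants in ms

def stdp(dt_ms):
    """Weight change for spike-time difference dt = t_post - t_pre (ms)."""
    if dt_ms > 0:                                    # pre fires before post: LTP
        return A_plus * math.exp(-dt_ms / tau_plus)
    return -A_minus * math.exp(dt_ms / tau_minus)    # post before pre: LTD

for dt in (-40, -10, 10, 40):
    print(f"dt = {dt:+d} ms -> dw = {stdp(dt):+.5f}")
```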
We study a network of spiking neurons with heterogeneous excitabilities connected via inhibitory delayed pulses. For globally coupled systems, the increase of the inhibitory coupling reduces the number of firing neurons by following a Winner Takes All …
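A minimal numerical sketch of this setting, assuming a simple Euler scheme: globally coupled leaky integrate-and-fire neurons with heterogeneous suprathreshold drives receive delayed inhibitory pulses, and sweeping the coupling strength g shows how stronger inhibition silences part of the population. The function name fraction_firing and all parameter values are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from collections import deque

def fraction_firing(g, N=200, T=500.0, dt=0.01, delay=0.1, seed=2):
    rng = np.random.default_rng(seed)
    a = rng.uniform(1.05, 1.25, size=N)        # heterogeneous drives, a_i > 1
    v = rng.uniform(0.0, 1.0, size=N)          # membrane potentials, threshold 1
    buf = deque([0] * int(round(delay / dt)))  # spike counts awaiting delivery
    fired = np.zeros(N, dtype=bool)
    steps = int(T / dt)
    for step in range(steps):
        kick = g * buf.popleft() / N           # delayed inhibitory pulses arrive
        v += dt * (a - v) - kick
        spikes = v >= 1.0
        v[spikes] = 0.0                        # fire-and-reset
        buf.append(int(spikes.sum()))
        if step > steps // 2:                  # skip the initial transient
            fired |= spikes
    return fired.mean()

# Increasing the inhibition should silence the least excitable neurons first.
for g in (0.0, 0.5, 1.0, 2.0, 4.0):
    print(f"g = {g:3.1f}: firing fraction = {fraction_firing(g):.2f}")
```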