
Degree assortativity in networks of spiking neurons

Added by Carlo Laing
Publication date: 2020
Field: Physics
Language: English





Degree assortativity refers to the increased or decreased probability of connecting two neurons based on their in- or out-degrees, relative to what would be expected by chance. We investigate the effects of such assortativity in a network of theta neurons. The Ott/Antonsen ansatz is used to derive equations for the expected state of each neuron, and these equations are then coarse-grained in degree space. We generate families of effective connectivity matrices parametrised by the assortativity coefficient and use singular value decompositions of these matrices to efficiently perform numerical bifurcation analysis of the coarse-grained equations. We find that of the four possible types of degree assortativity, two have no effect on the network's dynamics, while the other two can have a significant effect.
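
As an illustration of the quantity being varied, the sketch below (an assumption about conventions, not the authors' code) computes the four directed assortativity coefficients (in-in, in-out, out-in, out-out) directly from an adjacency matrix, assuming A[i, j] = 1 means a connection from neuron j to neuron i.

import numpy as np

def directed_assortativity(A, source="out", target="in"):
    # Pearson correlation, over all edges, between the chosen degree of the
    # sending neuron and the chosen degree of the receiving neuron.
    in_deg = A.sum(axis=1)    # row sums: in-degrees (with A[i, j] meaning j -> i)
    out_deg = A.sum(axis=0)   # column sums: out-degrees
    deg = {"in": in_deg, "out": out_deg}
    tgt_idx, src_idx = np.nonzero(A)   # edge list, one entry per connection
    x = deg[source][src_idx]           # degree at the sending end
    y = deg[target][tgt_idx]           # degree at the receiving end
    return np.corrcoef(x, y)[0, 1]

# Example: the four assortativity types on an (unassortative) random network.
rng = np.random.default_rng(0)
A = (rng.random((500, 500)) < 0.05).astype(float)
np.fill_diagonal(A, 0)
for s in ("in", "out"):
    for t in ("in", "out"):
        print(f"r({s}, {t}) = {directed_assortativity(A, source=s, target=t):+.3f}")

For a purely random network all four coefficients are close to zero; assortative rewiring drives the chosen coefficient away from zero while preserving the degree sequence.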

Related research

We consider the effects of correlations between the in- and out-degrees of individual neurons on the dynamics of a network of neurons. By using theta neurons, we can derive a set of coupled differential equations for the expected dynamics of neurons with the same in-degree. A Gaussian copula is used to introduce correlations between a neuron's in- and out-degree, and numerical bifurcation analysis is used to determine the effects of these correlations on the network's dynamics. For excitatory coupling we find that inducing positive correlations has a similar effect to increasing the coupling strength between neurons, while for inhibitory coupling it has the opposite effect. We also determine the propensity of various two- and three-neuron motifs to occur as correlations are varied and give a plausible explanation for the observed changes in dynamics.
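
For context, a minimal sketch of the Gaussian-copula construction mentioned above, assuming Poisson marginal degree distributions purely for illustration: correlated standard normals are mapped to uniforms and then through the inverse CDF of the chosen marginal, so that each neuron's in- and out-degrees carry the prescribed correlation.

import numpy as np
from scipy import stats

def correlated_degrees(n_neurons, mean_degree, rho, seed=0):
    # Draw correlated standard normals, map them to uniforms (the Gaussian
    # copula step), then through the inverse CDF of the marginal distribution.
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n_neurons)
    u = stats.norm.cdf(z)
    k_in = stats.poisson.ppf(u[:, 0], mean_degree).astype(int)
    k_out = stats.poisson.ppf(u[:, 1], mean_degree).astype(int)
    return k_in, k_out

k_in, k_out = correlated_degrees(2000, 20, rho=0.7)
print("sample in/out degree correlation:", np.corrcoef(k_in, k_out)[0, 1])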
We train spiking deep networks using leaky integrate-and-fire (LIF) neurons, and achieve state-of-the-art results for spiking networks on the CIFAR-10 and MNIST datasets. This demonstrates that biologically plausible spiking LIF neurons can be integrated into deep networks and perform as well as other spiking models (e.g. integrate-and-fire). We achieved this result by softening the LIF response function, such that its derivative remains bounded, and by training the network with noise to provide robustness against the variability introduced by spikes. Our method is general and could be applied to other neuron types, including those used on modern neuromorphic hardware. Our work brings more biological realism into modern image classification models, with the hope that these models can inform how the brain performs this difficult task. It also provides new methods for training deep networks to run on neuromorphic hardware, with the aim of fast, power-efficient image classification for robotics applications.
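
A minimal sketch of the kind of softening described, assuming the standard LIF firing-rate formula with a softplus in place of the hard rectification; the time constants and smoothing parameter here are illustrative, not the paper's exact values.

import numpy as np

def soft_lif_rate(j, tau_ref=0.002, tau_rc=0.02, gamma=0.05):
    # softplus(j - 1) replaces max(0, j - 1), so the rate curve is smooth
    # and its derivative stays bounded at the firing threshold (here 1).
    v = gamma * np.log1p(np.exp((j - 1.0) / gamma))
    return 1.0 / (tau_ref + tau_rc * np.log1p(1.0 / np.maximum(v, 1e-12)))

print(soft_lif_rate(np.array([0.5, 1.0, 1.5, 2.0])))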
We consider the deterministic evolution of a time-discretized spiking network of neurons with connection weights and delays, modeled as a discretized neural network of the generalized integrate-and-fire (gIF) type. The purpose is to study a class of algorithmic methods allowing one to calculate the proper parameters to reproduce exactly a given spike train generated by a hidden (unknown) neural network. This standard problem is known to be NP-hard when delays have to be calculated. We propose here a reformulation, expressed as a Linear-Programming (LP) problem, thus allowing an efficient resolution. This allows us to back-engineer a neural network, i.e. to find out, given a set of initial conditions, which parameters (i.e., connection weights in this case) allow us to simulate the network spike dynamics. More precisely, we make explicit the fact that back-engineering a spike train is a Linear (L) problem if the membrane potentials are observed and an LP problem if only spike times are observed, with a gIF model. Numerical robustness is discussed. We also explain how it is the use of a generalized IF neuron model, instead of a leaky IF model, that allows us to derive this algorithm. Furthermore, we point out that the L or LP adjustment mechanism is local to each unit and has the same structure as a Hebbian rule. Going a step further, this paradigm generalizes easily to the design of input-output spike train transformations. This means that we have a practical method to program a spiking network, i.e. find a set of parameters allowing us to exactly reproduce the network output, given an input. Numerical verifications and illustrations are provided.
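
The linear-programming viewpoint can be illustrated with a toy sketch, under assumptions that simplify away delays and the gIF details: the membrane potential is linear in the unknown weights, so observed spikes and silences become linear inequalities that an off-the-shelf LP solver can satisfy. All names and numbers below are illustrative, not the paper's formulation.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
T, n_pre = 200, 5                     # time steps, presynaptic inputs
X = rng.random((T, n_pre))            # known synaptic drive built from observed spikes
w_true = rng.normal(size=n_pre)       # hidden weights we try to recover
theta = 0.5                           # firing threshold (assumed)
margin = 0.0                          # a small positive margin would be used in practice
spikes = (X @ w_true) >= theta        # observed spike train of the target neuron

# Inequalities in the form A_ub @ w <= b_ub:
#   no spike at t:  X[t] @ w <= theta - margin
#   spike at t:    -X[t] @ w <= -theta
A_ub = np.vstack([X[~spikes], -X[spikes]])
b_ub = np.concatenate([np.full((~spikes).sum(), theta - margin),
                       np.full(spikes.sum(), -theta)])
res = linprog(c=np.zeros(n_pre), A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * n_pre)
print("feasible weights found:", res.success)

Any feasible solution reproduces the observed spike/silence pattern exactly; the recovered weights need not equal w_true, since the constraints generally admit a whole polytope of solutions.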
We show that the degree distributions of graphs do not suffice to characterize the synchronization of systems evolving on them. We prove that, for any given degree sequence satisfying certain conditions, there exists a connected graph having that degree sequence for which the first nontrivial eigenvalue of the graph Laplacian is arbitrarily close to zero. Consequently, complex dynamical systems defined on such graphs have poor synchronization properties. The result holds under quite mild assumptions, and shows that there exist classes of random, scale-free, regular, small-world, and other common network architectures which impede synchronization. The proof is based on a construction that also serves as an algorithm for building non-synchronizing networks having a prescribed degree distribution.
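
The spectral quantity at stake is easy to compute directly; the sketch below (with an illustrative example graph, not the paper's construction) evaluates the second-smallest eigenvalue of the Laplacian L = D - A for two cliques joined by a single edge, where it is close to zero even though the graph is connected.

import numpy as np

def lambda_2(A):
    # Second-smallest eigenvalue of the Laplacian of an undirected graph.
    L = np.diag(A.sum(axis=1)) - A
    return np.linalg.eigvalsh(L)[1]   # eigvalsh returns eigenvalues in ascending order

n = 20
A = np.zeros((2 * n, 2 * n))
A[:n, :n] = 1.0
A[n:, n:] = 1.0
np.fill_diagonal(A, 0.0)
A[0, n] = A[n, 0] = 1.0               # a single bridge edge between the cliques
print("lambda_2 =", lambda_2(A))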
By constructing a multicanonical Monte Carlo simulation, we obtain the full probability distribution $\rho_N(r)$ of the degree assortativity coefficient $r$ on configuration networks of size $N$ by using the multiple histogram reweighting method. We suggest that $\rho_N(r)$ obeys a large deviation principle, $\rho_N\left(r - r_N^*\right) \asymp e^{-N^{\xi} I\left(r - r_N^*\right)}$, where the rate function $I$ is convex and possesses its unique minimum at $r = r_N^*$, and $\xi$ is an exponent that scales $\rho_N$ with $N$. We show that $\xi = 1$ for Poisson random graphs, and $\xi \geq 1$ for scale-free networks, in which case $\xi$ is a decreasing function of the degree distribution exponent $\gamma$. Our results reveal that the fluctuations of $r$ exhibit an anomalous scaling with $N$ in highly heterogeneous networks.
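
For orientation, a naive Monte Carlo sketch (not the paper's multicanonical scheme) samples the assortativity coefficient $r$ over configuration-model graphs with a Poisson degree sequence; the multicanonical reweighting is what makes the rare tails of $\rho_N(r)$ accessible, which plain sampling like this cannot reach.

import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
N, mean_deg, n_samples = 200, 4, 500
deg = rng.poisson(mean_deg, N)
if deg.sum() % 2:                     # the configuration model needs an even degree sum
    deg[0] += 1

r_values = []
for s in range(n_samples):
    G = nx.configuration_model(deg.tolist(), seed=int(s))
    r_values.append(nx.degree_assortativity_coefficient(G))

print("mean r:", np.mean(r_values), " std:", np.std(r_values))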