In this work we study the combined effect of chemical and electrical synapses on synchronous behaviour and on the rate of information production (per unit time) in small networks of Hindmarsh-Rose (HR) neurons. We show that if the chemical synapses are excitatory, the larger the chemical synaptic strength, the smaller the electrical synaptic strength needed to achieve complete synchronisation, and for moderate synaptic strengths one should expect desynchronous behaviour. Conversely, if the chemical synapses are inhibitory, the larger the chemical synaptic strength, the larger the electrical synaptic strength needed to achieve complete synchronisation, and for moderate synaptic strengths one should expect synchronous behaviour. Finally, we show how to calculate semi-analytically an upper bound for the rate of information production per unit time (the Kolmogorov-Sinai entropy) in larger networks. As an application, we show that this upper bound grows linearly with the number of neurons in a network whose neurons are highly connected.
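The three-variable Hindmarsh-Rose model referenced above can be sketched as follows; the parameter values and the simple Euler integration are illustrative textbook choices, not taken from the paper.

```python
import numpy as np

def hr_step(state, I_ext, dt=0.01, a=1.0, b=3.0, c=1.0, d=5.0,
            r=0.006, s=4.0, x_rest=-1.6):
    """One Euler step of the standard 3-variable Hindmarsh-Rose neuron.
    x: membrane potential, y: fast recovery variable, z: slow adaptation."""
    x, y, z = state
    dx = y + b * x**2 - a * x**3 - z + I_ext
    dy = c - d * x**2 - y
    dz = r * (s * (x - x_rest) - z)
    return np.array([x + dt * dx, y + dt * dy, z + dt * dz])

# Simulate a single neuron with a constant input current in the bursting regime
state = np.array([-1.0, 0.0, 2.0])
trace = []
for _ in range(50000):
    state = hr_step(state, I_ext=3.25)
    trace.append(state[0])
trace = np.array(trace)
print(trace.min(), trace.max())  # membrane potential stays bounded
```

Coupling terms (an electrical term proportional to the voltage difference, and a sigmoidal chemical term) would be added to `dx` to build the networks studied in the paper.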
A great deal of research has been devoted to the investigation of neural dynamics in various network topologies. However, only a few studies have focused on the influence of autapses, synapses from a neuron onto itself via a closed loop, on neural synchronisation. Here, we build a random network of adaptive exponential integrate-and-fire neurons coupled by chemical synapses and equipped with autapses to study the effect of the latter on synchronous behaviour. We consider a time delay in the conductance of the pre-synaptic neuron for both excitatory and inhibitory connections. Interestingly, in neural networks consisting of both excitatory and inhibitory neurons, we uncover that synchronous behaviour depends on the synapse type. Our results provide evidence of the synchronous and desynchronous activities that emerge in random neural networks with excitatory and inhibitory chemical synapses, where neurons are equipped with autapses.
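A minimal sketch of the key ingredient, an adaptive exponential integrate-and-fire (AdEx) neuron with a delayed chemical autapse, might look as follows; all parameter names and values below are typical textbook assumptions, not the paper's.

```python
import numpy as np

def adex_autapse(T=1000.0, dt=0.1, I_ext=500.0, g_aut=10.0, tau_delay=5.0):
    """AdEx neuron with a delayed excitatory chemical autapse: the neuron's
    own spikes, after a delay tau_delay (ms), increment its synaptic
    conductance. Units: pA, mV, ms, nS, pF (illustrative values)."""
    C, g_L, E_L, V_T, D_T = 200.0, 12.0, -70.0, -50.0, 2.0   # membrane
    a, b, tau_w = 2.0, 60.0, 300.0                            # adaptation
    V_reset, V_peak, E_syn, tau_syn = -58.0, 0.0, 0.0, 2.7    # reset/synapse
    n, d = int(T / dt), int(tau_delay / dt)                   # steps, delay bins
    V, w, g = E_L, 0.0, 0.0
    spikes = np.zeros(n, dtype=bool)
    for t in range(n):
        # autaptic conductance driven by the neuron's own delayed spikes
        if t >= d and spikes[t - d]:
            g += g_aut
        g -= dt * g / tau_syn
        dV = (-g_L * (V - E_L) + g_L * D_T * np.exp((V - V_T) / D_T)
              - w - g * (V - E_syn) + I_ext) / C
        dw = (a * (V - E_L) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_peak:                    # spike: reset V, kick adaptation
            V, w = V_reset, w + b
            spikes[t] = True
    return spikes

sp = adex_autapse()
print(sp.sum(), "spikes in 1 s")
```

An inhibitory autapse would use a reversal potential well below rest (e.g. `E_syn = -80.0`), which is the distinction the abstract's synchronisation results hinge on.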
In this paper we argue that, in addition to the electrical and chemical signals propagating in the neurons of the brain, signal propagation also takes place in the form of biophoton production. This statement is supported by recent experimental confirmation of the photon-guiding properties of a single neuron. We investigate the interaction of mitochondrial biophotons with microtubules from a quantum-mechanical point of view. Our theoretical analysis indicates that the interaction of biophotons and microtubules causes transitions/fluctuations of microtubules between coherent and incoherent states. We elaborate on a significant relationship between the fluctuation function of microtubules and alpha-EEG diagrams, and argue that the role of biophotons in the brain merits special attention.
We show that discrete synaptic weights can be efficiently used for learning in large-scale neural systems, and lead to unanticipated computational performance. We focus on the representative case of learning random patterns with binary synapses in single-layer networks. The standard statistical analysis shows that this problem is exponentially dominated by isolated solutions that are extremely hard to find algorithmically. Here, we introduce a novel method that allows us to find analytical evidence for the existence of subdominant and extremely dense regions of solutions. Numerical experiments confirm these findings. We also show that the dense regions are surprisingly accessible by simple learning protocols, and that these synaptic configurations are robust to perturbations and generalize better than typical solutions. These outcomes extend to synapses with multiple states and to deeper neural architectures. The large deviation measure also suggests how to design novel algorithmic schemes for optimization based on local entropy maximization.
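As a toy illustration of the underlying learning problem (not the paper's large-deviation analysis or its algorithms), one can search for binary weights that classify P random patterns with a simple local search that accepts non-worsening single-weight flips:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 21, 10                            # N binary synapses (odd: no ties), P patterns
xi = rng.choice([-1, 1], size=(P, N))    # random input patterns
sigma = rng.choice([-1, 1], size=P)      # random target outputs

def n_errors(w):
    """Number of patterns the binary perceptron w misclassifies."""
    return int(np.sum(np.sign(xi @ w) != sigma))

w = rng.choice([-1, 1], size=N)          # random binary initial weights
for _ in range(20000):
    if n_errors(w) == 0:
        break
    i = rng.integers(N)
    w_flip = w.copy()
    w_flip[i] = -w_flip[i]
    if n_errors(w_flip) <= n_errors(w):  # accept non-worsening flips
        w = w_flip
print(n_errors(w), "errors remaining")
```

For loads well below the binary-perceptron capacity such a naive search often succeeds; the hardness the abstract refers to appears as the load grows and typical solutions become isolated.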
In this work, we study the dynamic range of a neuronal network modelled by a cellular automaton. We consider deterministic and non-deterministic rules to simulate electrical and chemical synapses. Chemical synapses have an intrinsic time delay and are susceptible to parameter variations governed by Hebbian learning rules. Our results show that chemical synapses can abruptly enhance the sensitivity of the neural network, an effect that becomes even more pronounced if learning rules are applied to the chemical synapses.
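The flavour of such a model can be conveyed with an excitable cellular automaton in which transmission through "chemical" synapses is probabilistic; the rules and parameters below are a simplified assumption (omitting the time delay and Hebbian adaptation described in the abstract):

```python
import numpy as np

rng = np.random.default_rng(2)

def step(states, adj, p_chem, n_states=5):
    """One update of an excitable cellular automaton:
    state 0 = quiescent, 1 = active, 2..n_states-1 = refractory.
    A quiescent cell is activated by each active neighbour with
    probability p_chem (a non-deterministic 'chemical' synapse)."""
    new = np.where(states > 0, (states + 1) % n_states, 0)  # cycle refractory
    active = states == 1
    for i in np.flatnonzero(states == 0):
        neigh_active = active[adj[i]]
        if np.any(rng.random(neigh_active.size) < (p_chem * neigh_active)):
            new[i] = 1
    return new

N = 200
adj = [rng.choice(N, size=10, replace=False) for _ in range(N)]  # random graph
states = np.zeros(N, dtype=int)
states[rng.choice(N, size=20, replace=False)] = 1   # initial stimulus
activity = []
for _ in range(100):
    states = step(states, adj, p_chem=0.3)
    activity.append(int((states == 1).sum()))
print(sum(activity), "total activations")
```

The dynamic range is then estimated by sweeping the stimulus rate and measuring the window of stimuli the mean activity can discriminate.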
We present a mathematical analysis of networks of Integrate-and-Fire neurons with adaptive conductances. Taking into account the realistic fact that the spike time is only known within some \textit{finite} precision, we propose a model where spikes are effective at times that are multiples of a characteristic time scale $\delta$, where $\delta$ can be \textit{arbitrarily} small (in particular, well below the numerical precision). We give a complete mathematical characterization of the model dynamics and obtain the following results. The asymptotic dynamics is composed of finitely many stable periodic orbits, whose number and period can be arbitrarily large and can diverge in a region of the synaptic-weight space traditionally called the edge of chaos, a notion given a precise mathematical definition in the present paper. Furthermore, except at the edge of chaos, there is a one-to-one correspondence between the membrane-potential trajectories and the raster plot; this shows that, in this case, the neural code lies entirely in the spikes. As a key tool, we introduce an order parameter, easy to compute numerically and closely related to a natural notion of entropy, that provides a relevant characterization of the computational capabilities of the network. This allows us to compare the computational capabilities of leaky Integrate-and-Fire models and conductance-based models. The present study considers networks with constant input and without time-dependent plasticity, but the framework has been designed to accommodate both extensions.
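A minimal sketch of the discretized picture, assuming a plain leaky Integrate-and-Fire network in which spikes take effect only on a grid of time bins (one step playing the role of $\delta$); all parameters below are illustrative, not the paper's model:

```python
import numpy as np

def simulate_lif(W, I, leak=0.9, theta=1.0, steps=200):
    """Discrete-time leaky Integrate-and-Fire network: spikes are effective
    only at integer multiples of the time bin (one step). Returns the
    raster plot (steps x N spike matrix)."""
    N = W.shape[0]
    V = np.zeros(N)
    raster = np.zeros((steps, N), dtype=int)
    for t in range(steps):
        fired = V >= theta                # threshold crossing this bin
        raster[t] = fired
        V[fired] = 0.0                    # reset after spiking
        V = leak * V + I + W @ fired      # leak, drive, synaptic kicks
    return raster

rng = np.random.default_rng(1)
N = 5
W = 0.2 * rng.standard_normal((N, N))     # random synaptic weights
raster = simulate_lif(W, I=0.15 * np.ones(N))
```

With constant input, trajectories of such a map eventually settle onto periodic orbits, which is the kind of asymptotic behaviour the abstract characterizes; the raster plot `raster` is the object the paper's one-to-one correspondence concerns.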