In this work, we study the dynamic range in a neuronal network modelled by a cellular automaton. We consider deterministic and non-deterministic rules to simulate electrical and chemical synapses. Chemical synapses have an intrinsic time delay and are susceptible to parameter variations governed by Hebbian learning rules. Our results show that chemical synapses can abruptly enhance the sensitivity of the neural network, an effect that becomes even more pronounced when Hebbian learning rules are applied to the chemical synapses.
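As a rough illustration of the kind of model the abstract above describes, the following is a minimal sketch of one update step of a Kinouchi–Copelli-style excitable cellular automaton with stochastic ("chemical") synapses. All names (`step`, `p_chem`, `p_ext`, `n_states`) and the specific update rule are illustrative assumptions, not taken from the paper:

```python
import random

def step(states, adj, p_chem, n_states=3, p_ext=0.0, rng=random):
    """One synchronous update of an excitable cellular automaton.

    states[i]: 0 = quiescent, 1 = active (spiking),
               2..n_states-1 = refractory.
    adj[i]:    list of presynaptic neighbours of neuron i.
    p_chem:    probability that an active neighbour excites a quiescent
               neuron (stochastic "chemical" coupling; p_chem = 1.0
               mimics a deterministic "electrical" synapse).
    p_ext:     probability of spontaneous activation by external drive.
    """
    nxt = []
    for i, s in enumerate(states):
        if s == 0:
            fired = rng.random() < p_ext  # external stimulus
            for j in adj[i]:
                # each active neighbour transmits with probability p_chem
                if states[j] == 1 and rng.random() < p_chem:
                    fired = True
            nxt.append(1 if fired else 0)
        else:
            # active -> refractory -> ... -> back to quiescent
            nxt.append((s + 1) % n_states)
    return nxt
```

The dynamic range of such a model is typically measured from the response curve of mean firing rate versus `p_ext`; with `p_chem = 1.0` and `p_ext = 0.0` the update above is fully deterministic, which makes the rule easy to test.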
Excessively high neural synchronisation has been associated with epileptic seizures, one of the most common brain diseases worldwide. A better understanding of neural synchronisation mechanisms can thus help control or even treat epilepsy. In this p
Computers have been endowed with a degree of human-like intelligence owing to the rapid development of artificial intelligence technology, represented by neural networks. Facing the challenge of making machines more imaginative, we consider a quant
We show that discrete synaptic weights can be efficiently used for learning in large scale neural systems, and lead to unanticipated computational performance. We focus on the representative case of learning random patterns with binary synapses in si
Spiking neural networks (SNNs) have attracted much attention due to their great potential for modeling time-dependent signals. The firing rate of spiking neurons is determined by a control rate that is fixed manually in advance; thus, whether the firing
This paper presents a new approach for assembling graph neural networks based on framelet transforms. The latter provide a multi-scale representation for graph-structured data. We decompose an input graph into low-pass and high-pass frequency coef