Computers have been endowed with a degree of human-like intelligence owing to the rapid development of artificial intelligence technology, represented by neural networks. Facing the challenge of making machines more imaginative, we consider a quantum stochastic neural network (QSNN) and propose a learning algorithm to update the parameters governing the network evolution. The QSNN can be applied to a class of classification problems; we investigate its performance on sentence classification and find that the coherent part of the quantum evolution can accelerate training and improve the accuracy of verse recognition, which can be regarded as a quantum-enhanced associative memory. In addition, the coherent QSNN is found to be more robust against both label noise and device noise, making it a more suitable option for practical implementation.
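To give a rough feel for the kind of evolution described above, the sketch below integrates a toy single-qubit Lindblad equation whose Hamiltonian supplies the coherent part and whose jump operator supplies the stochastic (dissipative) part, with the two strengths playing the role of trainable parameters. This is only an illustrative sketch; the operators, parameter names, and the single-qubit readout are assumptions, not the QSNN construction used in the work summarised above.

```python
import numpy as np

# Single-qubit operators for the toy model (assumed, not the paper's network).
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sm = np.array([[0, 1], [0, 0]], dtype=complex)   # lowering operator |0><1|

def evolve(rho, theta, gamma, t_final=1.0, dt=1e-3):
    """Euler integration of d(rho)/dt = -i[H, rho] + gamma * D[sm](rho).

    theta scales the coherent (Hamiltonian) part, gamma the dissipative part;
    both stand in for trainable parameters governing the network evolution.
    """
    H = theta * sx
    for _ in range(int(t_final / dt)):
        comm = H @ rho - rho @ H
        diss = sm @ rho @ sm.conj().T - 0.5 * (sm.conj().T @ sm @ rho + rho @ sm.conj().T @ sm)
        rho = rho + dt * (-1j * comm + gamma * diss)
    return rho

rho0 = np.array([[0, 0], [0, 1]], dtype=complex)  # start in |1><1|
rho = evolve(rho0, theta=0.8, gamma=0.3)
p0 = rho[0, 0].real                               # population of |0> as a binary readout
print("class =", int(p0 > 0.5))
```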
The task of classifying the entanglement properties of a multipartite quantum state poses a remarkable challenge due to the exponentially increasing number of ways in which quantum systems can share quantum correlations. Tackling such a challenge requires a combination of sophisticated theoretical and computational techniques. In this paper we combine machine-learning tools and the theory of quantum entanglement to perform entanglement classification for multipartite qubit systems in pure states. We use a parameterisation of quantum systems based on artificial neural networks in a restricted Boltzmann machine (RBM) architecture, known as Neural Network States (NNS), whose entanglement properties can be deduced via a constrained, reinforcement-learning procedure. In this way, Separable Neural Network States (SNNS) can be used to build entanglement witnesses for any target state.
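For orientation, the standard RBM ansatz for a neural network state assigns each spin configuration s an unnormalised amplitude psi(s) = exp(sum_i a_i s_i) * prod_j 2 cosh(b_j + sum_i W_ji s_i). The sketch below evaluates such amplitudes for a two-qubit example with random placeholder parameters; the sizes and values are assumptions, and the constrained reinforcement-learning procedure itself is not reproduced.

```python
import numpy as np
from itertools import product

# RBM (neural network state) ansatz with random placeholder parameters,
# not the states trained in the work summarised above.
n_visible, n_hidden = 2, 4
rng = np.random.default_rng(0)
a = rng.normal(scale=0.1, size=n_visible) + 1j * rng.normal(scale=0.1, size=n_visible)
b = rng.normal(scale=0.1, size=n_hidden) + 1j * rng.normal(scale=0.1, size=n_hidden)
W = rng.normal(scale=0.1, size=(n_hidden, n_visible)) + 1j * rng.normal(scale=0.1, size=(n_hidden, n_visible))

def amplitude(s):
    """Unnormalised RBM amplitude psi(s) for a spin configuration s in {-1, +1}^n."""
    s = np.asarray(s, dtype=float)
    return np.exp(a @ s) * np.prod(2 * np.cosh(b + W @ s))

# Assemble and normalise the state vector over all spin configurations.
configs = list(product([-1, 1], repeat=n_visible))
psi = np.array([amplitude(s) for s in configs])
psi /= np.linalg.norm(psi)
print(psi)
```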
Spiking neural networks (SNNs) have attracted much attention due to their great potential for modeling time-dependent signals. The firing rate of spiking neurons is determined by the control rate, which is fixed manually in advance; thus, whether the firing rate is adequate for modeling actual time series is a matter of luck. Although an adaptive control rate is desirable, achieving it is a non-trivial task because the control rate and the connection weights learned during training are usually entangled. In this paper, we show that the firing rate is related to the eigenvalue of the spike generation function. Inspired by this insight, by enabling the spike generation function to have adaptable eigenvalues rather than parametric control rates, we develop the Bifurcation Spiking Neural Network (BSNN), which has an adaptive firing rate and is insensitive to the setting of control rates. Experiments validate the effectiveness of BSNN on a broad range of tasks, showing that BSNN achieves superior performance to existing SNNs and is robust to the setting of control rates.
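As a loose illustration of the dependence on a control rate, the sketch below runs a discrete leaky integrate-and-fire neuron whose membrane update u <- decay * u + I has eigenvalue `decay`; fixing `decay` by hand corresponds to a manually set control rate, while treating it as trainable corresponds roughly to the adaptive-eigenvalue idea. The update rule, names, and values are assumptions, not the BSNN equations.

```python
import numpy as np

def run_spiking_neuron(inputs, weights, decay, threshold=1.0):
    """Discrete leaky integrate-and-fire neuron with decay factor `decay`
    (the eigenvalue of the linear membrane update)."""
    u = 0.0
    spikes = []
    for x in inputs:                       # inputs: (T, n_in) time series
        u = decay * u + weights @ x        # leaky integration of the weighted input
        s = float(u >= threshold)          # fire when the membrane crosses threshold
        u = u * (1.0 - s)                  # reset after a spike
        spikes.append(s)
    return np.array(spikes)

rng = np.random.default_rng(1)
x = rng.random((100, 3))                   # toy input currents
w = rng.normal(size=3)
print("firing rate:", run_spiking_neuron(x, w, decay=0.9).mean())
```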
Excessively high neural synchronisation has been associated with epileptic seizures, one of the most common brain diseases worldwide. A better understanding of neural synchronisation mechanisms can thus help control or even treat epilepsy. In this paper, we study neural synchronisation in a random network where nodes are neurons with excitatory and inhibitory synapses, and the neural activity of each node is described by the adaptive exponential integrate-and-fire model. In this framework, we verify that a decrease in the influence of inhibition can generate synchronisation originating from a pattern of desynchronised spikes. The transition from desynchronous spikes to synchronous bursts of activity, induced by varying the synaptic coupling, emerges in a hysteresis loop due to bistability, where abnormal (excessively synchronous) regimes exist. We verify that, for parameters in the bistability regime, a square current pulse can trigger excessively high (abnormal) synchronisation, a process that can reproduce features of epileptic seizures. We then show that it is possible to suppress such abnormal synchronisation by applying a small-amplitude external current to less than 10% of the neurons in the network. Our results demonstrate that external electrical stimulation not only can trigger synchronous behaviour but, more importantly, can be used as a means to reduce abnormal synchronisation and thus effectively control or treat epileptic seizures.
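For reference, the single-neuron dynamics used in this kind of study are the adaptive exponential integrate-and-fire (AdEx) equations, C dV/dt = -g_L(V - E_L) + g_L Delta_T exp((V - V_T)/Delta_T) - w + I and tau_w dw/dt = a(V - E_L) - w, with a reset and an adaptation increment at each spike. The Euler-integration sketch below uses the textbook parameter values of Brette and Gerstner rather than the network's, and the constant input current is an assumption.

```python
import numpy as np

# Adaptive exponential integrate-and-fire (AdEx) neuron, Euler-integrated.
# Textbook parameter values (Brette & Gerstner 2005), not those of the study above.
C, gL, EL = 281.0, 30.0, -70.6        # pF, nS, mV
VT, dT = -50.4, 2.0                   # mV
tau_w, a, b = 144.0, 4.0, 80.5        # ms, nS, pA
V_reset, V_peak = -70.6, 20.0         # mV

def simulate(I=800.0, T=500.0, dt=0.01):
    """Return spike times (ms) for a constant input current I (pA)."""
    V, w, spikes = EL, 0.0, []
    for step in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * dT * np.exp((V - VT) / dT) - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V, w = V + dt * dV, w + dt * dw
        if V >= V_peak:                # spike: reset membrane, increment adaptation
            V, w = V_reset, w + b
            spikes.append(step * dt)
    return spikes

print("first spike times (ms):", simulate()[:5])
```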
In this work, we study the dynamic range in a neuronal network modelled by a cellular automaton. We consider deterministic and non-deterministic rules to simulate electrical and chemical synapses. Chemical synapses have an intrinsic time delay and are susceptible to parameter variations guided by Hebbian learning rules. Our results show that chemical synapses can abruptly enhance the sensitivity of the neural network, an effect that becomes even more pronounced when the learning rules are applied to the chemical synapses.
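A minimal cellular-automaton sketch of an excitable network of this kind (in the spirit of the Kinouchi-Copelli model) is given below, keeping only the probabilistic "chemical" coupling and the usual dynamic-range definition Delta = 10 log10(S_0.9 / S_0.1). The transition rules, network topology, and parameter values are standard conventions assumed for illustration, not the exact rules of the study, and time delays and Hebbian updates are omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
N, K, P_CHEM, REFRAC = 1000, 10, 0.08, 5         # neurons, neighbours, transmission prob., refractory steps
neighbours = rng.integers(0, N, size=(N, K))     # random network (assumed topology)

def response(stim_rate, T=1000):
    """Mean firing rate of the automaton under a Poisson stimulus of rate stim_rate per step."""
    state = np.zeros(N, dtype=int)               # 0 = rest, 1 = excited, 2..REFRAC = refractory
    fired = 0
    for _ in range(T):
        excited = state == 1
        drive = excited[neighbours].any(axis=1) & (rng.random(N) < P_CHEM)  # probabilistic coupling
        external = rng.random(N) < stim_rate
        new_state = np.where(state > 0, (state + 1) % (REFRAC + 1), 0)      # advance refractory clock
        new_state[(state == 0) & (drive | external)] = 1                    # resting neurons get excited
        state = new_state
        fired += excited.sum()
    return fired / (N * T)

stimuli = np.logspace(-5, 0, 20)
F = np.array([response(s) for s in stimuli])
lo = F.min() + 0.1 * (F.max() - F.min())
hi = F.min() + 0.9 * (F.max() - F.min())
S10, S90 = np.interp([lo, hi], F, stimuli)       # assumes the response curve is monotone
print("dynamic range (dB):", 10 * np.log10(S90 / S10))
```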
Sensory predictions by the brain in all modalities take place as a result of bottom-up and top-down connections both within the neocortex and between the neocortex and the thalamus. The bottom-up connections in the cortex are responsible for learning, pattern recognition, and object classification, and have been widely modelled using artificial neural networks (ANNs). Here, we present a neural network architecture modelled on the top-down corticothalamic connections and the behaviour of the thalamus: a corticothalamic neural network (CTNN), consisting of an auto-encoder connected to a difference engine with a threshold. We demonstrate that the CTNN is input-agnostic, multi-modal, robust during partial occlusion of one or more sensory inputs, and has significantly higher processing efficiency than other predictive coding models, proportional to the number of sequentially similar inputs in a sequence. This increased efficiency could be highly significant in more complex implementations of this architecture, where the predictive nature of the cortex will allow most of the incoming data to be discarded.
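The wiring of the components named above can be sketched as follows: a predictor stands in for the trained auto-encoder, and a thresholded difference engine decides whether the current input needs full processing or whether the cached prediction can be reused, which is what makes the efficiency scale with the number of sequentially similar inputs. The gating logic, class names, and the identity stand-in for the auto-encoder are assumptions made for illustration, not the published implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

class CorticothalamicSketch:
    """Toy sketch of the CTNN wiring: predictor + thresholded difference engine."""

    def __init__(self, dim, threshold):
        self.prediction = np.zeros(dim)
        self.threshold = threshold
        self.reprocessed = 0                         # how often full processing ran

    def autoencode(self, x):
        # Placeholder for the trained auto-encoder; a well-trained auto-encoder
        # approximately reproduces its input, so identity is used here.
        return x.copy()

    def step(self, x):
        error = np.abs(x - self.prediction).mean()   # difference engine
        if error > self.threshold:                   # prediction failed: full processing
            self.prediction = self.autoencode(x)
            self.reprocessed += 1
        return self.prediction                       # otherwise reuse the cached prediction

ctnn = CorticothalamicSketch(dim=16, threshold=0.05)
stream = np.repeat(rng.random((5, 16)), 20, axis=0)  # 5 distinct inputs, each seen 20 times
for x in stream:
    ctnn.step(x)
print("fraction of inputs fully processed:", ctnn.reprocessed / len(stream))
```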