
Mingkun Xu, Yujie Wu, Lei Deng (2021)
Biological spiking neurons with intrinsic dynamics underlie the brain's powerful capabilities for representing and learning from multimodal information in complex environments. Despite recent tremendous progress in spiking neural networks (SNNs) for Euclidean-space tasks, it remains challenging to exploit SNNs for non-Euclidean data such as graphs, mainly due to the lack of an effective modeling framework and useful training techniques. Here we present a general spike-based modeling framework that enables the direct training of SNNs for graph learning. Through spatial-temporal unfolding of the spiking data flows of node features, we incorporate graph convolution filters into spiking dynamics and formalize a synergistic learning paradigm. Considering the unique features of spike representation and spiking dynamics, we propose a spatial-temporal feature normalization (STFN) technique suited to SNNs that accelerates convergence. We instantiate our methods in two spiking graph models, graph convolution SNNs and graph attention SNNs, and validate their performance on three node-classification benchmarks: Cora, Citeseer, and Pubmed. Our models achieve performance comparable to state-of-the-art graph neural network (GNN) models at much lower computational cost, demonstrating clear benefits for execution on neuromorphic hardware and opening up neuromorphic applications in graph scenarios.
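The abstract above describes STFN only at a high level; as a minimal NumPy sketch, one plausible reading is a normalization whose statistics are pooled jointly over the temporal axis and the node (spatial) axis of the unfolded node features. The function name `stfn`, the tensor layout, and the absence of learnable affine parameters are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def stfn(x, eps=1e-5):
    """Sketch of a spatial-temporal feature normalization (assumed form).

    x: array of shape (T, N, F) -- timesteps, nodes, feature channels.
    Each feature channel is normalized with statistics pooled over both
    the temporal (T) and spatial/node (N) axes, so the graph-convolved
    input current driving each spiking neuron has zero mean and unit
    variance across the whole spatio-temporal unfolding.
    """
    mean = x.mean(axis=(0, 1), keepdims=True)  # per-channel mean over T and N
    var = x.var(axis=(0, 1), keepdims=True)    # per-channel variance over T and N
    return (x - mean) / np.sqrt(var + eps)

# Toy graph-convolved node features: 4 timesteps, 5 nodes, 3 channels.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=(4, 5, 3))
y = stfn(x)
```

After normalization, each channel's statistics over the pooled axes are approximately zero mean and unit variance, which is the property that would stabilize the spiking dynamics during direct training.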
Hanle Zheng, Yujie Wu, Lei Deng (2020)
Spiking neural networks (SNNs) promise bio-plausible coding of spatio-temporal information and event-driven signal processing, which makes them well suited to energy-efficient implementation in neuromorphic hardware. However, the unique working mode of SNNs makes them harder to train than traditional networks. Currently, there are two main routes to training deep, high-performance SNNs. The first is to convert a pre-trained ANN model into its SNN version, which usually requires a long coding window to converge and cannot exploit spatio-temporal features during training to solve temporal tasks. The other is to train SNNs directly in the spatio-temporal domain; but because of the binary spike activity of the firing function and the problem of vanishing or exploding gradients, current methods are restricted to shallow architectures and therefore struggle to harness large-scale datasets (e.g. ImageNet). To this end, we propose a threshold-dependent batch normalization (tdBN) method based on the emerging spatio-temporal backpropagation, termed STBP-tdBN, enabling the direct training of very deep SNNs and the efficient implementation of their inference on neuromorphic hardware. With the proposed method and an elaborated shortcut connection, we significantly extend directly trained SNNs from shallow structures (<10 layers) to a very deep structure (50 layers). Furthermore, we theoretically analyze the effectiveness of our method using Block Dynamical Isometry theory. Finally, we report superior accuracy results, including 93.15% on CIFAR-10, 67.8% on DVS-CIFAR10, and 67.05% on ImageNet with very few timesteps. To the best of our knowledge, this is the first work to explore directly trained deep SNNs with high performance on ImageNet.
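The "threshold-dependent" part of tdBN can be sketched as batch normalization whose statistics are pooled over both the batch and timestep axes, with the normalized value rescaled toward the firing threshold. A minimal NumPy illustration follows; the function name `td_batchnorm`, the tensor layout, and the treatment of `alpha` as a fixed scalar (rather than a learnable parameter) are assumptions for illustration.

```python
import numpy as np

def td_batchnorm(x, v_th=1.0, alpha=1.0, eps=1e-5):
    """Sketch of threshold-dependent batch normalization (assumed form).

    x: pre-synaptic inputs of shape (T, B, C) -- timesteps, batch, channels.
    Statistics are pooled over both the temporal (T) and batch (B) axes,
    and the normalized value is rescaled by alpha * v_th so activations
    sit at the scale of the firing threshold v_th rather than unit scale.
    """
    mean = x.mean(axis=(0, 1), keepdims=True)  # per-channel mean over T and B
    var = x.var(axis=(0, 1), keepdims=True)    # per-channel variance over T and B
    return alpha * v_th * (x - mean) / np.sqrt(var + eps)

# Toy pre-activations: 4 timesteps, 8 samples, 3 channels.
rng = np.random.default_rng(1)
x = rng.normal(loc=-1.0, scale=2.0, size=(4, 8, 3))
y = td_batchnorm(x, v_th=0.5, alpha=2.0)
```

The design intent this sketch captures is that inputs to the spiking neuron are centered and scaled relative to the threshold, so spikes are neither saturated nor starved regardless of depth.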
The orchestration of diverse synaptic plasticity mechanisms across different timescales produces complex cognitive processes. To achieve comparable cognitive complexity in memristive neuromorphic systems, devices capable of emulating short- and long-term plasticity (STP and LTP, respectively) concomitantly are essential. However, this fundamental bionic trait has not been reported in any existing memristor, in which STP and LTP can only be induced selectively because they cannot be decoupled via different loci and mechanisms. In this work, we report the first demonstration of truly concomitant STP and LTP in a three-terminal memristor that uses independent physical phenomena to represent each form of plasticity. The emerging layered material Bi2O2Se is used in a memristor for the first time, opening up the prospect of ultra-thin, high-speed, and low-power neuromorphic devices. The concerted action of STP and LTP in our memristor allows full-range modulation of the transient synaptic efficacy, from depression to facilitation, by stimulus frequency or intensity, providing a versatile device platform for implementing neuromorphic functions. A recurrent neural circuitry model is developed to simulate the intricate sleep-wake cycle autoregulation process, in which the concomitance of STP and LTP is posited as a key enabler of this neural homeostasis. This work sheds new light on the highly sophisticated computational capabilities of memristors and their prospects for realizing advanced neuromorphic functions.
