The standard nature of computing is currently being challenged by a range of problems that are starting to hinder technological progress. One of the strategies proposed to address some of these problems is to develop novel brain-inspired processing methods and technologies and to apply them to a wide range of application scenarios. This is an extremely challenging endeavor that requires researchers in multiple disciplines to combine their efforts and to co-design, at the same time, the processing methods, the supporting computing architectures, and their underlying technologies. The journal "Neuromorphic Computing and Engineering" (NCE) has been launched to support this new community in this effort and to provide a forum and repository for presenting and discussing its latest advances. Through close collaboration with our colleagues on the editorial team, the scope and characteristics of NCE have been designed to ensure it serves a growing transdisciplinary and dynamic community across academia and industry.
Neuromorphic computing systems use non-volatile memory (NVM) to implement high-density and low-energy synaptic storage. The elevated voltages and currents needed to operate NVMs cause aging of the CMOS-based transistors in each neuron and synapse circuit in
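To make the idea of NVM-based synaptic storage concrete, the following is a minimal sketch of how synaptic weights stored as device conductances in a crossbar yield a weighted sum directly in the array. The array size, conductance range, and read voltage are illustrative assumptions, not values from the abstract above.

```python
# Sketch: synaptic weights held as NVM conductances; a column's output current
# is the weighted sum of the input spikes (Ohm's law per device, Kirchhoff's
# current law per column), computed inside the memory array itself.
import numpy as np

rng = np.random.default_rng(0)

n_pre, n_post = 4, 3                             # presynaptic / postsynaptic neurons
g_min, g_max = 1e-6, 1e-4                        # assumed device conductance range (S)
G = rng.uniform(g_min, g_max, (n_pre, n_post))   # one NVM cell per synapse

v_read = 0.2                                     # assumed read voltage (V)
spikes = np.array([1, 0, 1, 1])                  # binary input spike vector

# Column currents = (spike vector * read voltage) @ conductance matrix:
# the multiply-accumulate happens in place, without moving weights to a CPU.
i_col = (spikes * v_read) @ G
print("column currents (A):", i_col)
```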
Modern computation based on the von Neumann architecture is today a mature, cutting-edge science. In this architecture, processing and memory units are implemented as separate blocks that exchange data intensively and continuously. This data transfer
Neuromorphic computing systems such as DYNAPs and Loihi have recently been introduced to the computing community to improve the performance and energy efficiency of machine learning programs, especially those implemented using Spiking Neural Networks.
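As a reference point for the spiking neural networks (SNNs) mentioned above, here is a minimal sketch of a single leaky integrate-and-fire (LIF) neuron, the kind of dynamic that chips such as Loihi implement in hardware. The time step, membrane constants, threshold, and input level are illustrative assumptions only.

```python
# Sketch: one LIF neuron driven by a constant input current.
# The membrane potential leaks toward rest, integrates the input,
# and emits a spike (then resets) whenever it crosses the threshold.
import numpy as np

dt = 1e-3                      # simulation step (s)
tau_m = 20e-3                  # membrane time constant (s)
v_rest, v_th, v_reset = 0.0, 1.0, 0.0
T = 200                        # number of steps

v = v_rest
spike_times = []
i_in = 1.2 * np.ones(T)        # constant suprathreshold input (arbitrary units)

for t in range(T):
    # Leaky integration of the membrane potential.
    v += dt / tau_m * (-(v - v_rest) + i_in[t])
    if v >= v_th:              # threshold crossing: emit a spike and reset
        spike_times.append(t * dt)
        v = v_reset

print(f"{len(spike_times)} spikes, first at t = {spike_times[0]:.3f} s")
```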
The deployment of the next-generation computing platform at the ExaFlops scale requires solving new technological challenges, mainly related to the impressive number (up to 10^6) of compute elements required. This impacts system power consumption, in
Scientific computing sometimes involves computation on sensitive data. Depending on the data and the execution environment, the HPC (high-performance computing) user or data provider may require confidentiality and/or integrity guarantees. To study t