
All-Chalcogenide Programmable All-Optical Deep Neural Networks

Posted by Volker Sorger
Publication date: 2021
Research language: English





Deep learning algorithms are revolutionising many aspects of modern life. Typically, they are implemented in CMOS-based hardware with severely limited memory access times and inefficient data routing. All-optical neural networks without any electro-optic …



Read also

Optical implementation of artificial neural networks has been attracting great attention due to its potential for parallel computation at the speed of light. Although all-optical deep neural networks (AODNNs) with a few neurons have recently been experimentally demonstrated with acceptable errors, the feasibility of large-scale AODNNs remains unknown because errors might accumulate inevitably with an increasing number of neurons and connections. Here, we demonstrate a scalable AODNN with programmable linear operations and tunable nonlinear activation functions. We verify its scalability by measuring and analyzing errors propagating from a single neuron to the entire network. The feasibility of AODNNs is further confirmed by recognizing handwritten digits and fashion items, respectively.
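As a rough numerical illustration of the error-accumulation question raised here (not the paper's measurement procedure), a minimal Python sketch can inject a small per-neuron relative error into each layer of a toy network and compare the perturbed output with a clean pass; the layer sizes, tanh activation and 1% Gaussian error model below are all assumptions.

import numpy as np

# Toy model: each layer applies a programmable linear operation followed by a
# nonlinear activation; optionally each neuron's output is perturbed by a
# small multiplicative error. All sizes and noise levels are assumed.
rng = np.random.default_rng(0)
layer_sizes = [16, 16, 16, 10]                      # assumed network dimensions
weights = [rng.standard_normal((o, i)) / np.sqrt(i)
           for i, o in zip(layer_sizes[:-1], layer_sizes[1:])]
per_neuron_error = 0.01                             # assumed 1% relative error

def forward(x, noise_rng=None):
    for W in weights:
        x = np.tanh(W @ x)                          # stand-in nonlinear activation
        if noise_rng is not None:
            x = x * (1 + per_neuron_error * noise_rng.standard_normal(x.shape))
    return x

x0 = rng.standard_normal(layer_sizes[0])
clean = forward(x0)
noisy = forward(x0, noise_rng=np.random.default_rng(1))
print("relative output error:", np.linalg.norm(noisy - clean) / np.linalg.norm(clean))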
Software implementation, via neural networks, of brain-inspired computing approaches underlies many important modern-day computational tasks, from image processing to speech recognition, artificial intelligence and deep learning applications. Yet, differing from real neural tissue, traditional computing architectures physically separate the core computing functions of memory and processing, making fast, efficient and low-energy brain-like computing difficult to achieve. To overcome such limitations, an attractive and alternative goal is to design direct hardware mimics of brain neurons and synapses which, when connected in appropriate networks (or neuromorphic systems), process information in a way more fundamentally analogous to that of real brains. Here we present an all-optical approach to achieving such a goal. Specifically, we demonstrate an all-optical spiking neuron device and connect it, via an integrated photonics network, to photonic synapses to deliver a small-scale all-optical neurosynaptic system capable of supervised and unsupervised learning. Moreover, we exploit wavelength division multiplexing techniques to implement a scalable circuit architecture for photonic neural networks, successfully demonstrating pattern recognition directly in the optical domain using a photonic system comprising 140 elements. Such optical implementations of neurosynaptic networks promise access to the high speed and bandwidth inherent to optical systems, which would be very attractive for the direct processing of telecommunication and visual data in the optical domain.
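For intuition only, a generic leaky integrate-and-fire model can stand in for the spiking behaviour described above; the sketch below is a behavioural toy rather than the phase-change photonic device physics or the WDM circuit of the paper, and the weights, leak, threshold and input spike statistics are assumed.

import numpy as np

# Generic leaky integrate-and-fire neuron driven by weighted "synaptic" inputs.
rng = np.random.default_rng(0)
n_synapses, n_steps = 4, 200
weights = rng.uniform(0.1, 0.5, n_synapses)          # assumed synaptic weights
inputs = rng.random((n_steps, n_synapses)) < 0.05    # random input spike trains

leak, threshold, v = 0.9, 1.0, 0.0
output_spikes = []
for t in range(n_steps):
    v = leak * v + weights @ inputs[t].astype(float) # integrate inputs with leak
    if v >= threshold:                               # fire and reset at threshold
        output_spikes.append(t)
        v = 0.0
print("output spike times:", output_spikes)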
Shikang Li, Baohua Ni, Xue Feng (2021)
An optical neural network is proposed and demonstrated with a programmable matrix transformation and the nonlinear activation function of photodetection (square-law detection). Based on discrete phase-coherent spatial modes, the dimensionality of the programmable optical matrix operations is 30–37, implemented with spatial light modulators. With this architecture, all-optical classification tasks of handwritten digits, objects and depth images are performed on the same platform with high accuracy. Due to the parallel nature of matrix multiplication, the processing speed of the proposed architecture is potentially as high as 7.4T–74T FLOPs per second (with a 10–100 GHz detector).
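A minimal sketch of this kind of pipeline, with an assumed mode count and a random stand-in for the programmed matrix, models the linear transformation on complex field amplitudes followed by square-law detection as the activation:

import numpy as np

# Programmable complex-valued matrix transformation on optical field amplitudes,
# followed by square-law (intensity) photodetection as the nonlinearity.
# The mode counts and the random "programmed" matrix are assumptions.
rng = np.random.default_rng(0)
dim_in, dim_out = 36, 10                              # assumed mode counts
T = (rng.standard_normal((dim_out, dim_in))
     + 1j * rng.standard_normal((dim_out, dim_in))) / np.sqrt(2 * dim_in)

def optical_layer(field):
    # Matrix transformation implemented optically (e.g. via an SLM),
    # then square-law detection: the detector measures |amplitude|^2.
    return np.abs(T @ field) ** 2

x = rng.standard_normal(dim_in) + 0j                  # assumed input field
print(optical_layer(x))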
We introduce an all-optical Diffractive Deep Neural Network (D2NN) architecture that can learn to implement various functions after deep-learning-based design of passive diffractive layers that work collectively. We experimentally demonstrated the success of this framework by creating 3D-printed D2NNs that learned to implement handwritten digit classification and the function of an imaging lens in the terahertz spectrum. With the existing plethora of 3D-printing and other lithographic fabrication methods as well as spatial light modulators, this all-optical deep learning framework can perform, at the speed of light, various complex functions that computer-based neural networks can implement, and will find applications in all-optical image analysis, feature detection and object classification, also enabling new camera designs and optical components that can learn to perform unique tasks using D2NNs.
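One way to picture a single passive layer of such a network is a learned phase mask followed by free-space propagation simulated with the angular spectrum method; in the sketch below the grid size, pixel pitch, wavelength, propagation distance and random phase values are illustrative assumptions rather than the paper's parameters.

import numpy as np

# One diffractive layer: apply a (learned) phase mask, then propagate the field
# to the next plane with the angular spectrum method. Evanescent components are
# simply clamped for brevity.
N = 128                       # grid points per side (assumed)
pitch = 400e-6                # pixel pitch in metres (assumed)
wavelength = 750e-6           # ~0.4 THz, i.e. 750 um (assumed)
distance = 0.03               # propagation distance in metres (assumed)

fx = np.fft.fftfreq(N, d=pitch)
FX, FY = np.meshgrid(fx, fx)
kz = 2 * np.pi * np.sqrt(np.maximum(0.0, 1 / wavelength**2 - FX**2 - FY**2))
H = np.exp(1j * kz * distance)              # angular-spectrum transfer function

def diffractive_layer(field, phase_mask):
    field = field * np.exp(1j * phase_mask)  # passive learned phase modulation
    return np.fft.ifft2(np.fft.fft2(field) * H)

rng = np.random.default_rng(0)
field0 = np.ones((N, N), dtype=complex)      # plane-wave input (assumed)
mask = rng.uniform(0, 2 * np.pi, (N, N))     # stands in for trained phase values
intensity = np.abs(diffractive_layer(field0, mask)) ** 2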
Metasurfaces have become a promising means for manipulating optical wavefronts in flat and high-performance optical devices. Conventional metasurface device design relies on trial-and-error methods to obtain a target electromagnetic (EM) response, an approach that demands significant effort to investigate the enormous number of possible meta-atom structures. In this paper, a deep neural network approach is introduced that significantly improves both speed and accuracy compared to techniques currently used to assemble metasurface-based devices. Our neural network approach overcomes three key challenges that have limited previous neural-network-based design schemes: input/output vector dimensional mismatch, accurate EM-wave phase prediction, and adaptation to 3-D dielectric structures, and can be generically applied to a wide variety of metasurface device designs across the entire electromagnetic spectrum. Using this new methodology, examples of neural networks capable of producing on-demand designs for meta-atoms, metasurface filters, and phase-change reconfigurable metasurfaces are demonstrated.
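The forward-prediction part of such a scheme can be pictured as a small fully connected network mapping meta-atom geometry parameters to a predicted transmission amplitude and phase; the sketch below uses untrained random weights and made-up geometry parameters purely for illustration, whereas a real model would be fitted to full-wave EM simulations of candidate structures.

import numpy as np

# Small fully connected network: geometry parameters -> (amplitude, phase).
# Layer sizes, inputs and the random, untrained weights are placeholders.
rng = np.random.default_rng(0)
sizes = [4, 64, 64, 2]        # e.g. (width, length, height, period) -> (amp, phase)
params = [(rng.standard_normal((o, i)) / np.sqrt(i), np.zeros(o))
          for i, o in zip(sizes[:-1], sizes[1:])]

def predict_response(geometry):
    x = np.asarray(geometry, dtype=float)
    for k, (W, b) in enumerate(params):
        x = W @ x + b
        if k < len(params) - 1:
            x = np.maximum(x, 0.0)           # ReLU hidden layers
    amp = 1 / (1 + np.exp(-x[0]))            # transmission amplitude in [0, 1]
    phase = np.pi * np.tanh(x[1])            # transmission phase in (-pi, pi)
    return amp, phase

print(predict_response([0.2, 0.3, 0.6, 0.45]))   # illustrative geometry values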