Unitary learning is a backpropagation scheme that updates unitary weights in a fully connected, deep complex-valued neural network, satisfying the physical unitary prior of a diffractive deep neural network (D2NN). However, because unitary weights are square matrices, the processed signal has a fixed, limited dimension, which hinders generalization. To address the overfitting that arises from the small number of samples loaded into a D2NN, an optical phase-dropout trick is implemented. Phase dropout in unitary space, evolved from complex dropout and admitting a statistical interpretation, is formulated for the first time. A synthetic mask generated from random point apertures with random phase shifts, together with its smoothed modulation, prunes redundant links by incompletely sampling the input optical field at each diffractive layer. The physical features of the synthetic mask under different nonlinear activations are elucidated in detail. The equivalence between the digital and diffractive models determines compound modulations that can physically circumvent the nonlinear activations implemented in the D2NN. Numerical experiments verify that optical phase dropout improves the accuracy of the D2NN on 2D classification and recognition tasks.
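As a minimal sketch of the phase-dropout idea (not the authors' implementation), one can build a complex mask from random point apertures with random phase shifts, smooth it, and apply it to the field entering each diffractive layer; the field is assumed to be a flattened complex vector, and keep_prob and sigma are illustrative hyperparameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def phase_dropout_mask(shape, keep_prob=0.8, sigma=1.0, rng=None):
    """Illustrative synthetic mask: random point apertures with random
    phase shifts, smoothed so the modulation varies gradually.
    keep_prob and sigma are assumed values, not taken from the paper."""
    rng = np.random.default_rng() if rng is None else rng
    aperture = (rng.random(shape) < keep_prob).astype(float)   # random point apertures
    phase = rng.uniform(0.0, 2.0 * np.pi, shape)               # random phase shifts
    mask = aperture * np.exp(1j * phase)
    # Smooth real and imaginary parts separately to emulate a smoothed modulation.
    return gaussian_filter(mask.real, sigma) + 1j * gaussian_filter(mask.imag, sigma)

def diffractive_layer(field, unitary_weight, mask):
    """Incompletely sample the input optical field with the mask,
    then apply the (unitary) layer transform; training-time use only."""
    return unitary_weight @ (mask.ravel() * field)
```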
We introduce an all-optical Diffractive Deep Neural Network (D2NN) architecture that can learn to implement various functions after deep learning-based design of passive diffractive layers that work collectively. We experimentally demonstrated the success of this framework by creating 3D-printed D2NNs that learned to implement handwritten digit classification and the function of an imaging lens at the terahertz spectrum. With the existing plethora of 3D-printing and other lithographic fabrication methods as well as spatial light modulators, this all-optical deep learning framework can perform, at the speed of light, various complex functions that computer-based neural networks can implement, and will find applications in all-optical image analysis, feature detection and object classification, also enabling new camera designs and optical components that can learn to perform unique tasks using D2NNs.
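A hedged sketch of how such a cascade of passive layers can be simulated: free-space propagation between layers via the standard angular spectrum method, followed by a learned phase-only modulation at each layer, with intensity measured at the output plane. The propagation model and parameter names are assumptions, not the paper's exact pipeline.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dz, dx):
    """Free-space propagation over distance dz via the angular spectrum
    method (a standard, assumed propagation model)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    kz_sq = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
    H = np.exp(2j * np.pi * dz * np.sqrt(np.maximum(kz_sq, 0.0)))  # evanescent part dropped
    return np.fft.ifft2(np.fft.fft2(field) * H)

def d2nn_forward(field, phase_layers, wavelength, dz, dx):
    """Cascade of passive phase-only layers; the phases are what training learns."""
    for phase in phase_layers:
        field = angular_spectrum_propagate(field, wavelength, dz, dx)
        field = field * np.exp(1j * phase)           # phase-only modulation at this layer
    field = angular_spectrum_propagate(field, wavelength, dz, dx)
    return np.abs(field) ** 2                         # detector measures intensity
```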
Deep learning with coherent diffraction has seen remarkable progress, benefiting from the fact that matrix multiplication can be executed optically in parallel and with little power consumption. A coherent optical field, propagating as a complex-valued entity, can be shaped into a task-oriented output through statistical inference. In this paper, we present a unitary learning protocol for deep diffractive neural networks that satisfies the physical unitary prior of coherent diffraction. Unitary learning is a backpropagation scheme that updates unitary weights by translating gradients between Euclidean and Riemannian space. The temporal-space evolution characteristic of unitary learning is formulated and elucidated. In particular, a compatibility condition for selecting nonlinear activations in complex space is unveiled, encompassing the fundamental sigmoid, tanh, and quasi-ReLU in complex space. As a preliminary application, a deep diffractive neural network with unitary learning is implemented on 2D classification and verification tasks.
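A minimal sketch of the Euclidean-to-Riemannian gradient translation on the unitary group: the Euclidean gradient is mapped to a skew-Hermitian direction and the weight is retracted so it remains exactly unitary. The Cayley-transform retraction and the learning rate below are assumptions for illustration, not necessarily the paper's exact recipe.

```python
import numpy as np

def unitary_update(W, euclid_grad, lr=0.01):
    """One illustrative unitary-learning step: translate the Euclidean
    gradient into a skew-Hermitian Riemannian direction, then retract
    with a Cayley transform so W stays unitary."""
    # Skew-Hermitian generator A = G W^H - W G^H (Riemannian gradient direction).
    A = euclid_grad @ W.conj().T - W @ euclid_grad.conj().T
    I = np.eye(W.shape[0], dtype=W.dtype)
    # Cayley transform keeps the updated weight exactly unitary.
    return np.linalg.solve(I + (lr / 2) * A, I - (lr / 2) * A) @ W
```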
In optical devices like diffraction gratings and Fresnel lenses, the light wavefront is engineered through structuring of the device surface morphology, within thicknesses comparable to the light wavelength. Fabrication of such diffractive optical elements involves highly accurate multi-step lithographic processes that effectively set in stone both the device morphology and its optical functionality. In this work, we introduce shapeshifting diffractive optical elements directly written on an erasable photoresist. We first develop a lithographic configuration that allows writing/erasing cycles of aligned optical elements directly in the light path. Then, we show the realization of complex diffractive gratings with arbitrary combinations of grating vectors. Finally, we demonstrate a shapeshifting diffractive lens that is reconfigured in the light path in order to change the imaging parameters of an optical system.
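For illustration only, a grating pattern combining several grating vectors can be described as a weighted superposition of plane-wave phases; the linear superposition, the function name, and the parameters below are assumptions, not the paper's writing recipe.

```python
import numpy as np

def multi_grating_phase(shape, grating_vectors, weights, dx=1.0):
    """Illustrative phase profile combining several grating vectors
    (kx, ky) with given complex weights."""
    ny, nx = shape
    y, x = np.mgrid[0:ny, 0:nx] * dx
    field = np.zeros(shape, dtype=complex)
    for (kx, ky), w in zip(grating_vectors, weights):
        field += w * np.exp(1j * (kx * x + ky * y))
    return np.angle(field)   # phase pattern to be written on the photoresist
```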
Atomic systems have long provided a useful material platform with unique quantum properties. The efficient light-matter interaction in atomic vapors has led to numerous seminal scientific achievements, including accurate and precise metrology and quantum devices. In the last few decades, the field of thin optical elements with minuscule features has been extensively studied, demonstrating an unprecedented ability to control photonic degrees of freedom, both linearly and nonlinearly, with applications spanning from photography and spatial light modulators to cataract-surgery implants. Hybridizing atoms with such thin devices may offer a new material system that endows traditional vapor cells with enhanced functionality. Here, we fabricate and demonstrate chip-scale quantum diffractive optical elements which map atomic states to the spatial distribution of diffracted light. Two foundational diffractive elements, lamellar gratings and Fresnel lenses, are hybridized with atomic channels containing hot atomic vapors, which exhibit exceptionally strong frequency-dependent behavior. Providing the design tools for chip-scale atomic diffractive optical elements opens a path toward a variety of compact, thin quantum-optical elements.
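For reference, the Fresnel lens mentioned above imparts, in the thin-lens approximation, a quadratic phase wrapped to 2π. The sketch below uses that textbook formula with illustrative parameters; it is not the fabricated device's actual design.

```python
import numpy as np

def fresnel_lens_phase(shape, dx, wavelength, focal_length):
    """Quadratic lens phase wrapped to [0, 2*pi); standard thin-lens
    approximation, used only to illustrate the element's function."""
    ny, nx = shape
    y = (np.arange(ny) - ny / 2) * dx
    x = (np.arange(nx) - nx / 2) * dx
    X, Y = np.meshgrid(x, y)
    phase = -np.pi * (X**2 + Y**2) / (wavelength * focal_length)
    return np.mod(phase, 2 * np.pi)
```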
Artificial neural networks (ANNs) are now widely used in industrial applications and are playing increasingly important roles in fundamental research. Although most ANN hardware systems are electronic, optical implementations are particularly attractive because of their intrinsic parallelism and low energy consumption. Here, we propose and demonstrate a fully functional all-optical neural network (AONN), in which linear operations are programmed by spatial light modulators and Fourier lenses, and optical nonlinear activation functions are realized with electromagnetically induced transparency in laser-cooled atoms. Moreover, the errors of different optical neurons are independent, so the AONN can scale up to a larger system while the final error remains at a level similar to that of a single neuron. We confirm its capability and feasibility for machine learning by successfully classifying the ordered and disordered phases of a typical statistical Ising model. The demonstrated AONN scheme can be used to construct ANNs of various architectures with intrinsic parallel computation at the speed of light.
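A hedged sketch of the kind of linear operation a spatial light modulator plus Fourier lens pair can realize: one lens maps the field to its spatial spectrum, the SLM applies a programmable mask, and a second lens transforms back. The mask acts as the learned "weight"; this is a generic 4f-style simulation, not the paper's exact optical setup.

```python
import numpy as np

def fourier_lens_linear_layer(field, slm_mask):
    """Illustrative 4f-style linear operation: lens -> SLM mask in the
    Fourier plane -> lens. slm_mask is the programmable (learned) element."""
    spectrum = np.fft.fftshift(np.fft.fft2(field))    # first Fourier lens
    spectrum = spectrum * slm_mask                     # SLM modulation
    return np.fft.ifft2(np.fft.ifftshift(spectrum))    # second Fourier lens
```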