
242 - Jiaxi Cheng, Zhenhao Cen, 2021
Plasmon-induced transparency (PIT) displays complex nonlinear dynamics that give rise to critical phenomena in areas such as nonlinear waves. However, such nonlinear solutions depend sensitively on the choice of parameters and potentials in the Schrödinger equation. Despite this complexity, the machine learning community has developed remarkably efficient regression methods for predicting complicated datasets. Here, we consider a recurrent neural network (RNN) approach to predict the complex propagation of nonlinear solitons in plasmon-induced transparency metamaterial systems with applied potentials, bypassing the need for analytical and numerical treatment of a guiding model. We demonstrate the success of this scheme in predicting the propagation of nonlinear solitons solely from a given initial condition and potential, and we show excellent agreement between simulations and the predictions of long short-term memory (LSTM) artificial neural networks. The framework presented in this work opens a new perspective on applying RNNs to quantum systems and nonlinear waves governed by Schrödinger-type equations, for example the nonlinear dynamics of cold-atom systems and nonlinear fiber optics.
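The numerical baseline that such an LSTM surrogate would learn to replace can be illustrated with a standard split-step Fourier integrator for a nonlinear Schrödinger equation with an external potential. The sketch below is illustrative only, not the authors' code: the equation form, grid, and parameters are assumptions chosen so that a sech-shaped bright soliton is a known stationary solution of the zero-potential case.

```python
import numpy as np

def split_step_nlse(psi0, V, dz, n_steps, dx):
    """Propagate psi under i psi_z = -psi_xx/2 + V psi - |psi|^2 psi
    with the symmetric split-step Fourier method (illustrative sketch)."""
    N = len(psi0)
    k = 2 * np.pi * np.fft.fftfreq(N, d=dx)          # spectral wavenumbers
    half_disp = np.exp(-1j * (k**2 / 2) * dz / 2)    # half-step dispersion factor
    psi = psi0.astype(complex)
    history = [psi.copy()]
    for _ in range(n_steps):
        psi = np.fft.ifft(half_disp * np.fft.fft(psi))   # half dispersion step
        psi = psi * np.exp(1j * (np.abs(psi)**2 - V) * dz)  # nonlinearity + potential
        psi = np.fft.ifft(half_disp * np.fft.fft(psi))   # half dispersion step
        history.append(psi.copy())
    return np.array(history)

# Bright soliton initial condition sech(x) with zero potential (assumed toy setup).
x = np.linspace(-20, 20, 512, endpoint=False)
dx = x[1] - x[0]
psi0 = 1.0 / np.cosh(x)
hist = split_step_nlse(psi0, np.zeros_like(x), dz=0.01, n_steps=200, dx=dx)
```

In the RNN scheme described above, pairs of (initial condition, potential) and the resulting `history` arrays from a solver like this would serve as supervised training data, with the network then predicting the full propagation directly.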
Plasmon-induced transparency (PIT) in advanced materials has attracted extensive attention in both theoretical and applied physics. Here, we considered a scheme that can produce PIT and studied the characteristics of ultraslow, low-power magnetic solitons. The PIT metamaterial is constructed as an array of unit cells, each consisting of two coupled varactor-loaded split-ring resonators. Simulations verified that ultraslow magnetic solitons can be generated in this type of metamaterial. Various numerical methods can be applied to solve the governing nonlinear equations, but they are typically validated against exact solutions, which are difficult to acquire, and the results depend on the initial conditions and propagation distance. In this article, an artificial neural network (ANN) was used as a supervised learning model to predict the evolution and final mathematical expressions through training on samples with disparate initial conditions. Specifically, the influence of the number of hidden layers was discussed, and the learning results obtained with several training algorithms were analyzed and compared. Our research opens a route toward employing machine learning algorithms to save time in both physical and engineering applications of Schrödinger-type systems.
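The supervised-regression setup this abstract describes can be sketched with a minimal one-hidden-layer network trained by plain gradient descent. This is a toy illustration under assumed choices (a sine target, 16 tanh units, mean-squared-error loss), not the authors' architecture; varying the hidden-layer count or swapping the optimizer is exactly the kind of comparison the article reports.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy supervised task (assumed): regress y = sin(x) from sampled points.
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
y = np.sin(X)

# One hidden layer of 16 tanh units, linear output (illustrative sizes).
W1 = rng.normal(0.0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                 # forward pass
    pred = h @ W2 + b2
    err = pred - y                           # dLoss/dpred (MSE, up to factor 2)
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h**2)         # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1           # gradient-descent update
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

After training, `mse` drops well below the variance of the target, showing the fit; the same loop structure accommodates deeper stacks or other update rules for the comparisons mentioned above.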
