
Stochastic Physics-Informed Neural Networks (SPINN): A Moment-Matching Framework for Learning Hidden Physics within Stochastic Differential Equations

Published by: Ali Mesbah
Publication date: 2021
Research field: Informatics Engineering
Paper language: English





Stochastic differential equations (SDEs) are used to describe a wide variety of complex stochastic dynamical systems. Learning the hidden physics within SDEs is crucial for unraveling fundamental understanding of the stochastic and nonlinear behavior of these systems. We propose a flexible and scalable framework for training deep neural networks to learn constitutive equations that represent hidden physics within SDEs. The proposed stochastic physics-informed neural network framework (SPINN) relies on uncertainty propagation and moment-matching techniques along with state-of-the-art deep learning strategies. SPINN first propagates stochasticity through the known structure of the SDE (i.e., the known physics) to predict the time evolution of statistical moments of the stochastic states. SPINN learns (deep) neural network representations of the hidden physics by matching the predicted moments to those estimated from data. Recent advances in automatic differentiation and mini-batch gradient descent are leveraged to establish the unknown parameters of the neural networks. We demonstrate SPINN on three benchmark in-silico case studies and analyze the framework's robustness and numerical stability. SPINN provides a promising new direction for systematically unraveling the hidden physics of multivariate stochastic dynamical systems with multiplicative noise.
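To make the moment-matching idea concrete, here is a minimal sketch, assuming a scalar SDE dx = (a x + h(x)) dt + sigma x dW with a known linear drift term a x and multiplicative noise sigma x, where h(x) is the hidden physics represented by a neural network. The Euler moment propagation, the mean-field closure E[h(x)] ~= h(m), the placeholder data moments, and all names below are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

# Hypothetical scalar SDE:  dx = (a*x + h(x)) dt + sigma*x dW,
# where h(x) is the hidden physics represented by a neural network.
a, sigma, dt = -0.5, 0.2, 0.01          # assumed known physics / noise level

hidden = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

def propagate_moments(m, v, n_steps):
    """Euler update of the mean m and variance v through the known SDE
    structure, using the closure assumption E[h(x)] ~= h(m)."""
    for _ in range(n_steps):
        drift = a * m + hidden(m.unsqueeze(-1)).squeeze(-1)
        v = v + (2.0 * a * v + sigma**2 * (m**2 + v)) * dt   # linearized variance dynamics
        m = m + drift * dt
    return m, v

# empirical moments estimated from trajectory data (placeholder values)
data_m = torch.tensor([0.9, 0.8, 0.7])
data_v = torch.tensor([0.010, 0.018, 0.024])

opt = torch.optim.Adam(hidden.parameters(), lr=1e-3)
for _ in range(1000):
    m, v = torch.tensor(1.0), torch.tensor(0.0)   # known initial condition
    loss = torch.tensor(0.0)
    for k in range(len(data_m)):
        m, v = propagate_moments(m, v, n_steps=10)
        # moment-matching loss: predicted vs. estimated moments at each checkpoint
        loss = loss + (m - data_m[k])**2 + (v - data_v[k])**2
    opt.zero_grad(); loss.backward(); opt.step()
```

In the actual framework, the moment predictions come from a dedicated uncertainty-propagation scheme and are matched over mini-batches of data; the loop above only illustrates how automatic differentiation ties a moment-matching loss back to the network parameters.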




Read also

Recently, researchers have utilized neural networks to accurately solve partial differential equations (PDEs), enabling mesh-free scientific computation. Unfortunately, network performance drops when encountering a highly nonlinear domain. To improve generalizability, we introduce the novel approach of employing multi-task learning techniques, namely an uncertainty-weighted loss and gradient surgery, in the context of learning PDE solutions. The multi-task scheme exploits the benefits of learning shared representations, controlled by cross-stitch modules, between multiple related PDEs, which are obtained by varying the PDE parameterization coefficients, in order to generalize better on the original PDE. To encourage the network to pay closer attention to the highly nonlinear regions that are more challenging to learn, we also propose adversarial training for generating supplementary high-loss samples distributed similarly to the original training data. In the experiments, our proposed methods are found to be effective and to reduce the error on unseen data points compared to previous approaches in various PDE examples, including high-dimensional stochastic PDEs.
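As one illustration of the uncertainty-weighting idea described above, here is a minimal sketch of a Kendall-style uncertainty-weighted multi-task loss; the class name, the number of tasks, and the placeholder residual losses are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Hypothetical Kendall-style uncertainty-weighted multi-task loss, one common
# way to balance residual losses across several related PDE parameterizations.
class UncertaintyWeightedLoss(nn.Module):
    def __init__(self, n_tasks):
        super().__init__()
        # s_i = log(sigma_i^2): one learnable log-variance per PDE task
        self.log_vars = nn.Parameter(torch.zeros(n_tasks))

    def forward(self, task_losses):
        # weight each task loss by exp(-s_i) and regularize with s_i
        total = torch.tensor(0.0)
        for i, loss_i in enumerate(task_losses):
            total = total + torch.exp(-self.log_vars[i]) * loss_i + self.log_vars[i]
        return total

# usage: combine the PDE residual losses of, e.g., three coefficient settings
weighting = UncertaintyWeightedLoss(n_tasks=3)
residual_losses = [torch.tensor(0.8), torch.tensor(0.3), torch.tensor(0.5)]
total_loss = weighting(residual_losses)
total_loss.backward()                      # gradients reach the log-variances
```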
Ling Guo, Hao Wu, Tao Zhou (2021)
We introduce in this work the normalizing field flows (NFF) for learning random fields from scattered measurements. More precisely, we construct a bijective transformation (a normalizing flow characterized by neural networks) between a Gaussian random field with the Karhunen-Loève (KL) expansion structure and the target stochastic field, where the KL expansion coefficients and the invertible networks are trained by maximizing the sum of the log-likelihood on scattered measurements. This NFF model can be used to solve data-driven forward, inverse, and mixed forward/inverse stochastic partial differential equations in a unified framework. We demonstrate the capability of the proposed NFF model for learning non-Gaussian processes and different types of stochastic partial differential equations.
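To illustrate the change-of-variables training objective underlying such flows, here is a minimal sketch that fits a single invertible affine map by maximum likelihood on scattered measurements; the one-layer flow, the synthetic data, and all names are illustrative assumptions and not the NFF architecture, which pairs the flow with a KL expansion of the latent Gaussian field.

```python
import math
import torch
import torch.nn as nn

# Minimal sketch of maximum-likelihood training of an invertible map via the
# change-of-variables formula; a stand-in for the flow between a Gaussian
# (KL-expansion) latent field and the target field. The single affine layer
# and synthetic measurements are illustrative assumptions.
class AffineFlow(nn.Module):
    def __init__(self):
        super().__init__()
        self.log_scale = nn.Parameter(torch.zeros(1))
        self.shift = nn.Parameter(torch.zeros(1))

    def inverse_and_logdet(self, y):
        # map measurements y back to the Gaussian latent z and return log|det J|
        z = (y - self.shift) * torch.exp(-self.log_scale)
        log_det = -self.log_scale.expand_as(y)
        return z, log_det

flow = AffineFlow()
opt = torch.optim.Adam(flow.parameters(), lr=1e-2)
y = torch.randn(256) * 2.0 + 3.0          # scattered measurements (synthetic)

for _ in range(500):
    z, log_det = flow.inverse_and_logdet(y)
    # log-likelihood = standard-normal log-density of z + Jacobian correction
    log_prob = -0.5 * z**2 - 0.5 * math.log(2.0 * math.pi) + log_det
    loss = -log_prob.mean()
    opt.zero_grad(); loss.backward(); opt.step()
```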
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems. The concept of PINNs is expanded to learn not only the solution of one particular differential equation but the solutions to a class of problems. We demonstrate this idea by estimating the coercive field of permanent magnets which depends on the width and strength of local defects. When the neural network incorporates the physics of magnetization reversal, training can be achieved in an unsupervised way. There is no need to generate labeled training data. The presented test cases have been rigorously studied in the past. Thus, a detailed and easy comparison with analytical solutions is made. We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
Xu Liu, Xiaoya Zhang, Wei Peng (2021)
Physics-informed neural networks (PINNs) have been widely used to solve various scientific computing problems. However, large training costs limit PINNs for some real-time applications. Although some works have been proposed to improve the training efficiency of PINNs, few consider the influence of initialization. To this end, we propose a New Reptile initialization based Physics-Informed Neural Network (NRPINN). The original Reptile algorithm is a meta-learning initialization method based on labeled data. PINNs can be trained with less labeled data, or even without any labeled data, by adding partial differential equations (PDEs) as a penalty term in the loss function. Inspired by this idea, we propose the new Reptile initialization to sample more tasks from the parameterized PDEs and adapt the penalty term of the loss. The new Reptile initialization can acquire initialization parameters from related tasks by supervised, unsupervised, and semi-supervised learning; PINNs started from these parameters can then solve PDEs efficiently. Moreover, the new Reptile initialization can also be used for variants of PINNs. Finally, we demonstrate and verify NRPINN on both forward problems, including the Poisson, Burgers, and Schrödinger equations, and inverse problems, where unknown parameters in the PDEs are estimated. Experimental results show that NRPINN training is much faster and achieves higher accuracy than PINNs with other initialization methods.
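To show the flavor of a Reptile-style initialization loop, here is a minimal sketch in which a copy of the network is adapted on each sampled task and the shared initialization is nudged toward the adapted weights; the task sampling, the placeholder task_loss, and the hyperparameters are assumptions for illustration, since a real NRPINN task would use the PDE residual penalty.

```python
import copy
import torch
import torch.nn as nn

# Hypothetical Reptile-style initialization loop: sample tasks from a
# parameterized PDE family, adapt a copy of the network on each task, and
# nudge the shared initialization toward the adapted weights.
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
meta_lr, inner_lr, inner_steps = 0.1, 1e-2, 5

def task_loss(model, coeff):
    # placeholder objective parameterized by the sampled task coefficient;
    # an actual NRPINN task would use the PDE residual with this coefficient
    x = torch.linspace(0.0, 1.0, 32).unsqueeze(-1)
    return ((model(x) - coeff * torch.sin(x)) ** 2).mean()

for _ in range(100):
    coeff = float(torch.empty(1).uniform_(0.5, 2.0))      # sample a PDE task
    adapted = copy.deepcopy(net)
    inner_opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    for _ in range(inner_steps):
        loss = task_loss(adapted, coeff)
        inner_opt.zero_grad(); loss.backward(); inner_opt.step()
    # Reptile update: move the initialization toward the task-adapted weights
    with torch.no_grad():
        for p, p_adapted in zip(net.parameters(), adapted.parameters()):
            p += meta_lr * (p_adapted - p)
```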
Motivated by recent research on Physics-Informed Neural Networks (PINNs), we make the first attempt to introduce PINNs for the numerical simulation of elliptic Partial Differential Equations (PDEs) on 3D manifolds. PINNs are one of the deep learning-based techniques. Based on the data and physical models, PINNs introduce standard feedforward neural networks (NNs) to approximate the solutions to the PDE systems. By using automatic differentiation, the PDE system can be explicitly encoded into the NNs and, consequently, the sum of mean squared residuals from the PDEs can be minimized with respect to the NN parameters. In this study, the residual in the loss function can be constructed validly by automatic differentiation because of the relationship between the surface differential operators $\nabla_S / \Delta_S$ and the standard Euclidean differential operators $\nabla / \Delta$. We first consider the unit sphere as the surface to investigate the numerical accuracy and convergence of the PINNs with different training example sizes and network depths. Further examples on different complex manifolds are provided to verify the robustness of the PINNs.
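As a small illustration of constructing a surface operator via automatic differentiation, the sketch below projects the Euclidean gradient of a network output onto the tangent plane of the unit sphere, grad_S u = grad u - (n . grad u) n with outward normal n = x on the sphere; the network, sample points, and toy loss are illustrative placeholders, not the paper's setup.

```python
import torch
import torch.nn as nn

# Minimal sketch of a surface gradient on the unit sphere via automatic
# differentiation: project the Euclidean gradient onto the tangent plane,
# grad_S u = grad u - (n . grad u) n, with normal n = x on the unit sphere.
net = nn.Sequential(nn.Linear(3, 32), nn.Tanh(), nn.Linear(32, 1))

x = torch.randn(128, 3)
x = x / x.norm(dim=1, keepdim=True)       # random points on the unit sphere
x.requires_grad_(True)

u = net(x)                                # candidate solution u(x)
grad_u = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
normal_part = (grad_u * x).sum(dim=1, keepdim=True)
surface_grad = grad_u - normal_part * x   # tangential (surface) gradient

# a PDE residual loss would be built from such surface operators; here we
# simply penalize the surface-gradient magnitude to show the autodiff path
loss = surface_grad.pow(2).sum(dim=1).mean()
loss.backward()
```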
