Many engineering problems involve learning hidden dynamics from indirect observations, where the physical processes are described by systems of partial differential equations (PDEs). Gradient-based optimization methods are considered scalable and efficient for learning hidden dynamics. However, one of the most time-consuming and error-prone tasks is deriving and implementing the gradients, especially in systems of PDEs where gradients from different systems must be correctly integrated together. To this end, we present a novel technique, called intelligent automatic differentiation (IAD), which leverages the modern machine learning tool $\texttt{TensorFlow}$ to compute gradients automatically and conduct optimization efficiently. Moreover, IAD allows us to integrate specially designed state adjoint method codes to achieve better performance. Numerical tests demonstrate the feasibility of IAD for learning hidden dynamics in complicated systems of PDEs; additionally, by incorporating custom-built state adjoint method codes in IAD, we significantly accelerate both the forward and inverse simulations.
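To make the idea concrete, here is a minimal, hypothetical sketch (not the paper's code) of the IAD workflow: an explicit finite-difference solver for a 1-D heat equation is written in TensorFlow, and `tf.GradientTape` supplies the gradient of the data misfit with respect to a hidden diffusion coefficient, which is then recovered by gradient-based optimization.

```python
import math
import tensorflow as tf

# Hypothetical toy problem: recover a hidden diffusion coefficient c in
# u_t = c * u_xx from an observed final state, letting TensorFlow's
# automatic differentiation supply the gradient of the misfit.
n, steps, dt = 50, 100, 1e-4
dx = 1.0 / n
x = tf.linspace(0.0, 1.0, n)
u0 = tf.sin(math.pi * x)                 # initial condition

def solve(c):
    """Explicit finite-difference time stepping, differentiable w.r.t. c."""
    u = u0
    for _ in range(steps):
        u_xx = (tf.roll(u, -1, 0) - 2.0 * u + tf.roll(u, 1, 0)) / dx**2
        u = u + dt * c * u_xx
    return u

u_obs = solve(tf.constant(1.2))          # synthetic "observations", true c = 1.2

c = tf.Variable(0.5)                     # initial guess for the hidden coefficient
opt = tf.keras.optimizers.Adam(0.05)
for _ in range(200):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean((solve(c) - u_obs) ** 2)
    opt.apply_gradients([(tape.gradient(loss, c), c)])
print(float(c))                          # should approach the true value 1.2
```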
Based on a class of associative algebras with zero-divisors, which we call real-like algebras, we introduce a way of defining automatic differentiation and present different ways of performing automatic differentiation to compute the first, the second, …
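The simplest algebra of this kind is arguably the dual numbers $a + b\epsilon$ with $\epsilon^2 = 0$. The sketch below (our illustration, not the paper's construction) shows how arithmetic in this zero-divisor algebra carries out forward-mode automatic differentiation of first derivatives.

```python
import math

# Dual numbers a + b*eps with eps**2 == 0: arithmetic on the eps part
# implements forward-mode differentiation via the usual calculus rules.
class Dual:
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b          # value part and derivative part
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.a + o.a, self.b + o.b)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.a * o.a, self.a * o.b + self.b * o.a)  # product rule
    __rmul__ = __mul__

def sin(x):
    return Dual(math.sin(x.a), math.cos(x.a) * x.b)  # chain rule for sine

# d/dx [x*sin(x) + 3x] at x = 2: seed the eps part of x with 1.
x = Dual(2.0, 1.0)
y = x * sin(x) + 3 * x
print(y.a, y.b)   # derivative part equals sin(2) + 2*cos(2) + 3
```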
In this note, we report the backpropagation formula for the complex-valued singular value decomposition (SVD). This formula is an important ingredient for a complete automatic differentiation (AD) infrastructure in terms of complex numbers, and it is also …
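As a hedged demonstration (our example, not the note's derivation), PyTorch implements a backward rule for the complex-valued SVD, so a real scalar loss built from the factors can be back-propagated through the decomposition:

```python
import torch

# Differentiate a real-valued loss through a complex SVD; the backward pass
# invokes the framework's complex SVD backpropagation formula.
torch.manual_seed(0)
A = torch.randn(4, 3, dtype=torch.complex128, requires_grad=True)
U, S, Vh = torch.linalg.svd(A, full_matrices=False)

loss = S.sum()     # real-valued loss on the (real) singular values
loss.backward()    # uses the complex SVD backward rule

# For this particular loss the gradient is known in closed form: it is
# proportional to U @ Vh, up to the framework's convention for complex
# gradients.
print(A.grad)
```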
We propose a method to describe consistent equations of state (EOS) for arbitrary systems. Complex EOS are traditionally obtained by fitting suitable analytical expressions to thermophysical data. A key aspect of EOS is that the relationships between …
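A sketch of the consistency idea (our illustration, with an assumed toy ideal-gas free energy standing in for a fitted model): if every property is derived from a single free-energy function by automatic differentiation, the thermodynamic relationships among them hold by construction.

```python
import jax
import jax.numpy as jnp

# Toy Helmholtz free energy A(T, V) per mole for an ideal gas, up to
# additive reference terms; R is the gas constant.
R = 8.314

def helmholtz(T, V):
    return -R * T * jnp.log(V) - 1.5 * R * T * jnp.log(T)

# Derived properties follow from AD: P = -dA/dV, S = -dA/dT, so they are
# mutually consistent by construction.
pressure = jax.grad(lambda T, V: -helmholtz(T, V), argnums=1)
entropy = jax.grad(lambda T, V: -helmholtz(T, V), argnums=0)

T, V = 300.0, 0.024
print(pressure(T, V), R * T / V)   # AD pressure matches the analytic R*T/V
print(entropy(T, V))               # entropy derived from the same free energy
```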
The successes of deep learning, variational inference, and many other fields have been aided by specialized implementations of reverse-mode automatic differentiation (AD) to compute gradients of mega-dimensional objectives. The AD techniques underlying …
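For reference, the core of such a reverse-mode implementation can be stated in a few lines. The following generic tape sketch (not any particular library's code) records elementary operations during the forward pass and accumulates adjoints in reverse topological order.

```python
# Minimal tape-based reverse-mode AD: the forward pass records each
# elementary operation and its local derivatives; backward() replays the
# graph in reverse, accumulating adjoints via the chain rule.
class Var:
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, list(parents), 0.0
    def __add__(self, o):
        return Var(self.value + o.value, [(self, 1.0), (o, 1.0)])
    def __mul__(self, o):
        return Var(self.value * o.value, [(self, o.value), (o, self.value)])
    def backward(self):
        # Topologically order the recorded graph, then push adjoints in
        # reverse so every node's gradient is complete before it is used.
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p, _ in v.parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, local in v.parents:
                p.grad += local * v.grad

x, y = Var(2.0), Var(3.0)
z = x * y + x * x      # dz/dx = y + 2x = 7, dz/dy = x = 2
z.backward()
print(x.grad, y.grad)
```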
In mathematics and computer algebra, automatic differentiation (AD) is a set of techniques to evaluate the derivative of a function specified by a computer program. AD exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations (addition, subtraction, multiplication, division, etc.) and elementary functions (exp, log, sin, cos, etc.). By applying the chain rule repeatedly to these operations, derivatives can be computed automatically and accurately to working precision.
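For instance, differentiating f(x) = sin(x**2) amounts to tracing its elementary operations and applying the chain rule to each one, as in this small trace:

```python
import math

# The program for f(x) = sin(x**2) as a sequence of elementary operations,
# differentiated by applying the chain rule to each step in turn.
def f_and_df(x):
    v1 = x * x                 # elementary op: multiply
    d1 = 2.0 * x               # chain rule: d(v1)/dx
    v2 = math.sin(v1)          # elementary op: sine
    d2 = math.cos(v1) * d1     # chain rule: d(v2)/dx
    return v2, d2

val, der = f_and_df(1.5)
print(val, der)
print(math.cos(1.5 ** 2) * 2 * 1.5)   # closed form agrees to working precision
```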