
Learning a Reduced Basis of Dynamical Systems using an Autoencoder

Posted by David Sondak
Publication date: 2020
Research field: Physics
Paper language: English





Machine learning models have emerged as powerful tools in physics and engineering. Although these models are flexible, a fundamental challenge remains: how to connect new machine learning models with known physics. In this work, we present an autoencoder with latent space penalization, which discovers finite dimensional manifolds underlying the partial differential equations of physics. We test this method on the Kuramoto-Sivashinsky (K-S), Korteweg-de Vries (KdV), and damped KdV equations. We show that the resulting optimal latent space of the K-S equation is consistent with the dimension of the inertial manifold. The results for the KdV equation imply that there is no reduced latent space, which is consistent with the truly infinite dimensional dynamics of the KdV equation. In the case of the damped KdV equation, we find that the number of active dimensions decreases with increasing damping coefficient. We then uncover a nonlinear basis representing the manifold of the latent space for the K-S equation.
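The abstract does not spell out the form of the latent-space penalty, so the sketch below assumes a simple L1 (sparsity) penalty on the latent code, a common way to let unneeded latent dimensions collapse toward zero. The layer widths, penalty weight `lam`, and the PyTorch implementation are illustrative choices, not the authors' architecture.

```python
import torch
import torch.nn as nn

class PenalizedAutoencoder(nn.Module):
    """Autoencoder whose latent activations are L1-penalized so that
    unneeded latent dimensions are driven toward zero."""

    def __init__(self, n_grid: int, n_latent: int, width: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_grid, width), nn.ReLU(),
            nn.Linear(width, n_latent),
        )
        self.decoder = nn.Sequential(
            nn.Linear(n_latent, width), nn.ReLU(),
            nn.Linear(width, n_grid),
        )

    def forward(self, u):
        z = self.encoder(u)
        return self.decoder(z), z


def loss_fn(u, u_hat, z, lam=1e-3):
    # Reconstruction error plus a sparsity penalty on the latent code.
    return nn.functional.mse_loss(u_hat, u) + lam * z.abs().mean()


# Usage on snapshots u of shape (batch, n_grid), e.g. discretized K-S fields.
model = PenalizedAutoencoder(n_grid=64, n_latent=20)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
u = torch.randn(32, 64)          # placeholder snapshots
opt.zero_grad()
u_hat, z = model(u)
loss = loss_fn(u, u_hat, z)
loss.backward()
opt.step()
```

After training, counting how many latent coordinates remain significantly nonzero gives an estimate of the effective dimension of the underlying manifold.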




Read also

We study the problem of predicting rare critical transition events for a class of slow-fast nonlinear dynamical systems. The state of the system of interest is described by a slow process, whereas a faster process drives its evolution and induces critical transitions. By taking advantage of recent advances in reservoir computing, we present a data-driven method to predict the future evolution of the state. We show that our method is capable of predicting a critical transition event at least several numerical time steps in advance. We demonstrate the success as well as the limitations of our method using numerical experiments on three examples of systems, ranging from low dimensional to high dimensional. We discuss the mathematical and broader implications of our results.
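As a rough illustration of the reservoir-computing ingredient, the sketch below implements a generic leaky echo state network with a ridge-regression readout trained for one-step-ahead prediction. The reservoir size, leak rate, spectral radius, and the placeholder trajectory are assumptions for illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a 3-dimensional observed state, a 500-node reservoir.
n_in, N, leak, ridge = 3, 500, 0.3, 1e-6

W_in = rng.uniform(-0.5, 0.5, (N, n_in))
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # set spectral radius to 0.9

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence and collect its states."""
    r, states = np.zeros(N), []
    for u in inputs:
        r = (1 - leak) * r + leak * np.tanh(W @ r + W_in @ u)
        states.append(r.copy())
    return np.array(states)

# Placeholder trajectory standing in for the observed slow state.
t = np.linspace(0, 40, 2000)
X = np.stack([np.sin(t), np.cos(t), np.sin(0.5 * t)], axis=1)

R = run_reservoir(X[:-1])                            # reservoir states driven by x_k
W_out = np.linalg.solve(R.T @ R + ridge * np.eye(N), R.T @ X[1:])  # ridge readout
x_pred = R[-1] @ W_out                               # one-step-ahead prediction
```

Iterating the prediction by feeding the readout back as input gives multi-step forecasts, which is how an approaching transition would be flagged several steps in advance.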
Stochastic dynamical systems with continuous symmetries arise commonly in nature and often give rise to coherent spatio-temporal patterns. However, because of their random locations, these patterns are not well captured by current order reduction techniques and a large number of modes is typically necessary for an accurate solution. In this work, we introduce a new methodology for efficient order reduction of such systems by combining (i) the method of slices, a symmetry reduction tool, with (ii) any standard order reduction technique, resulting in efficient mixed symmetry-dimensionality reduction schemes. In particular, using the Dynamically Orthogonal (DO) equations in the second step, we obtain a novel nonlinear Symmetry-reduced Dynamically Orthogonal (SDO) scheme. We demonstrate the performance of the SDO scheme on stochastic solutions of the 1D Korteweg-de Vries and 2D Navier-Stokes equations.
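To make the symmetry-reduction step concrete, here is a minimal sketch of the method of slices for a 1-D periodic field with translation symmetry: each snapshot is shifted so the phase of its first Fourier mode vanishes, and the aligned snapshots are then compressed with plain POD, substituted here for the Dynamically Orthogonal equations the paper uses in the second step. The snapshot shapes and mode count are placeholders.

```python
import numpy as np

def symmetry_reduce(snapshots):
    """Method of slices for a 1-D periodic field with translation symmetry:
    shift each snapshot so the phase of its first Fourier mode is zero."""
    n = snapshots.shape[1]
    U_hat = np.fft.fft(snapshots, axis=1)
    phases = np.angle(U_hat[:, 1])                  # phase of the template mode
    k = np.fft.fftfreq(n) * n                       # integer wavenumbers
    shifted_hat = U_hat * np.exp(-1j * np.outer(phases, k))
    return np.real(np.fft.ifft(shifted_hat, axis=1))

def pod_modes(snapshots, n_modes):
    """Standard POD (SVD of mean-subtracted snapshots) on the reduced data."""
    mean = snapshots.mean(axis=0)
    _, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
    return mean, Vt[:n_modes], s

# Usage: reduce the symmetry first, then compress the aligned snapshots.
U = np.random.randn(200, 128)                       # placeholder snapshots u(x, t)
U_sliced = symmetry_reduce(U)
mean, modes, sing_vals = pod_modes(U_sliced, n_modes=10)
```

Because the random translation is removed before the basis is computed, far fewer modes are needed to represent the coherent pattern than when POD is applied to the raw snapshots.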
Scaling regions -- intervals on a graph where the dependent variable depends linearly on the independent variable -- abound in dynamical systems, notably in calculations of invariants like the correlation dimension or a Lyapunov exponent. In these applications, scaling regions are generally selected by hand, a process that is subjective and often challenging due to noise, algorithmic effects, and confirmation bias. In this paper, we propose an automated technique for extracting and characterizing such regions. Starting with a two-dimensional plot -- e.g., the values of the correlation integral, calculated using the Grassberger-Procaccia algorithm over a range of scales -- we create an ensemble of intervals by considering all possible combinations of endpoints, generating a distribution of slopes from least-squares fits weighted by the length of the fitting line and the inverse square of the fit error. The mode of this distribution gives an estimate of the slope of the scaling region (if it exists). The endpoints of the intervals that correspond to the mode provide an estimate for the extent of that region. When there is no scaling region, the distributions will be wide and the resulting error estimates for the slope will be large. We demonstrate this method for computations of dimension and Lyapunov exponent for several dynamical systems, and show that it can be useful in selecting values for the parameters in time-delay reconstructions.
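A compact sketch of the interval-ensemble idea described above: fit every candidate interval, weight each slope by the interval length and the inverse square of its fit error, and read off the mode of the weighted slope distribution. The function name, bin count, and minimum interval length below are illustrative choices, not the paper's implementation.

```python
import numpy as np

def scaling_region(x, y, min_pts=5, n_bins=200):
    """Estimate the slope of a scaling region of the curve (x, y) by pooling
    least-squares fits over all candidate intervals of at least min_pts points."""
    slopes, weights, spans = [], [], []
    n = len(x)
    for i in range(n - min_pts):
        for j in range(i + min_pts, n):
            xs, ys = x[i:j + 1], y[i:j + 1]
            A = np.vstack([xs, np.ones_like(xs)]).T
            coef, res, *_ = np.linalg.lstsq(A, ys, rcond=None)
            err = np.sqrt(res[0] / len(xs)) if res.size else 0.0
            slopes.append(coef[0])
            # Longer fits with smaller residuals receive more weight.
            weights.append((xs[-1] - xs[0]) / max(err, 1e-12) ** 2)
            spans.append((xs[0], xs[-1]))
    slopes, weights = np.array(slopes), np.array(weights)
    hist, edges = np.histogram(slopes, bins=n_bins, weights=weights)
    k = np.argmax(hist)
    in_mode = (slopes >= edges[k]) & (slopes <= edges[k + 1])
    mode_slope = 0.5 * (edges[k] + edges[k + 1])
    return mode_slope, [spans[m] for m in np.nonzero(in_mode)[0]]
```

Applied to log C(ε) versus log ε from the Grassberger-Procaccia algorithm, the returned mode estimates the correlation dimension and the associated spans indicate the extent of the scaling region; a broad, multimodal slope distribution signals that no scaling region exists.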
We extend the scope of the dynamical theory of extreme values to cover phenomena that do not happen instantaneously, but evolve over a finite, albeit unknown at the onset, time interval. We consider complex dynamical systems, composed of many individual subsystems linked by a network of interactions. As a specific example of the general theory, a model of neural network, introduced to describe the electrical activity of the cerebral cortex, is analyzed in detail: on the basis of this analysis we propose a novel definition of neuronal cascade, a physiological phenomenon of primary importance. We derive extreme value laws for the statistics of these cascades, both from the point of view of exceedances (that satisfy critical scaling theory) and of block maxima.
In this paper, we consider modeling missing dynamics with a nonparametric non-Markovian model, constructed using the theory of kernel embedding of conditional distributions on appropriate Reproducing Kernel Hilbert Spaces (RKHS), equipped with orthonormal basis functions. Depending on the choice of the basis functions, the resulting closure model from this nonparametric modeling formulation is in the form of a parametric model. This suggests that the success of various parametric modeling approaches that were proposed in various domains of applications can be understood through the RKHS representations. When the missing dynamical terms evolve faster than the relevant observable of interest, the proposed approach is consistent with the effective dynamics derived from the classical averaging theory. In the linear Gaussian case without the time-scale gap, we will show that the proposed non-Markovian model with a very long memory yields an accurate estimation of the nontrivial autocovariance function for the relevant variable of the full dynamics. Supporting numerical results on instructive nonlinear dynamics show that the proposed approach is able to replicate high-dimensional missing dynamical terms on problems with and without the separation of temporal scales.
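The conditional-mean embedding underlying such a closure model can be estimated, in its simplest finite-sample form, by kernel ridge regression of the missing term on a delay-embedded history of the observable. The Gaussian kernel, memory length, and toy data below are simplified stand-ins for the paper's orthonormal-basis RKHS construction, not its actual estimator.

```python
import numpy as np

def rbf_kernel(A, B, length=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

def fit_closure(history, missing, reg=1e-6):
    """Kernel ridge estimate of E[missing term | delayed observations],
    a finite-sample stand-in for the conditional-mean embedding."""
    K = rbf_kernel(history, history)
    alpha = np.linalg.solve(K + reg * np.eye(len(K)), missing)
    return lambda h_new: rbf_kernel(h_new, history) @ alpha

# Placeholder data: x_t is observed, g_t is the unresolved term to be learned.
T, memory = 500, 5
x = np.cumsum(np.random.randn(T))
g = np.sin(x)[memory:]                                   # pretend missing dynamics
H = np.stack([x[i:T - memory + i] for i in range(memory)], axis=1)  # delay embedding
closure = fit_closure(H, g)
g_hat = closure(H[:10])                                  # predictions on new histories
```

The memory length plays the role of the non-Markovian window: a longer delay embedding lets the fitted closure capture memory effects when there is no time-scale separation.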