
On the approximation of basins of attraction using deep neural networks

Posted by Joniald Shena
Publication date: 2021
Research field: Physics
Paper language: English





The basin of attraction is the set of initial points that eventually converge to a given attracting set. Knowledge of the basins is important for understanding the dynamical behavior of a system of interest. In this work, we address the problem of reconstructing the basins of attraction of a multistable system using only labeled data. To this end, we view the problem as a classification task and use a deep neural network as a classifier that predicts the attractor corresponding to any given initial condition. Additionally, we provide a method for obtaining an approximation of the basin boundary of the underlying system using the trained classification model. Finally, we provide evidence relating the complexity of the structure of the basins of attraction to the quality of the obtained reconstructions, via the concept of basin entropy. We demonstrate the application of the proposed method on the Lorenz system in a bistable regime.
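
As a concrete illustration of this classification framing, the following minimal sketch (not the authors' code; the bistable Lorenz parameters, sampling ranges, and network size are illustrative assumptions) labels sampled initial conditions by the fixed point they converge to and fits a small neural-network classifier; the classifier's decision boundary then serves as an approximation of the basin boundary:

```python
# Minimal sketch (not the authors' code): reconstruct Lorenz basins with a
# classifier. Assumed parameters: sigma=10, beta=8/3, rho=20, for which the
# two fixed points C± are stable and the system is bistable.
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.neural_network import MLPClassifier

sigma, beta, rho = 10.0, 8.0 / 3.0, 20.0
c = np.sqrt(beta * (rho - 1.0))  # fixed points C± = (±c, ±c, rho - 1)

def lorenz(t, s):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

def label(ic):
    """Integrate until the trajectory settles; label by the nearer fixed point."""
    sol = solve_ivp(lorenz, (0.0, 100.0), ic, rtol=1e-6)
    x_end = sol.y[0, -1]
    return 0 if abs(x_end - c) < abs(x_end + c) else 1

rng = np.random.default_rng(0)
X = rng.uniform(-20, 20, size=(1000, 3))   # labeled initial conditions
y = np.array([label(ic) for ic in X])

# A small MLP as the stand-in classifier; depth/width are illustrative.
clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=2000).fit(X, y)
print("train accuracy:", clf.score(X, y))
```

Evaluating `clf.predict` on a dense grid then recovers the basins; points where the predicted class changes between neighboring grid cells trace out the approximate basin boundary.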




Read also

We study partition of networks into basins of attraction based on a steepest ascent search for the node of highest degree. Each node is associated with, or attracted to, its neighbor of maximal degree, as long as the degree is increasing. A node that has no neighbors of higher degree is a peak, attracting all the nodes in its basin. Maximally random scale-free networks exhibit different behavior based on their degree-distribution exponent $\gamma$: for small $\gamma$ (broad distribution) networks are dominated by a giant basin, whereas for large $\gamma$ (narrow distribution) there are numerous basins, with peaks attracting mainly their nearest neighbors. We derive expressions for the first two moments of the number of basins. We also obtain the complete distribution of basin sizes for a class of hierarchical deterministic scale-free networks that resemble random nets. Finally, we generalize the problem to regular networks and lattices where all degrees are equal, and thus the attractiveness of a node must be determined by an assigned weight, rather than the degree. We derive the complete distribution of basins of attraction resulting from randomly assigned weights in one-dimensional chains.
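
A minimal sketch of this steepest-ascent partition (an illustration under assumed conventions, e.g. arbitrary tie-breaking, not the paper's code), run on a scale-free test network:

```python
# Minimal sketch (assumed, not from the paper): partition a graph into basins
# by steepest ascent on node degree; a node with no higher-degree neighbor
# is a peak, and every node is assigned to the peak its ascent reaches.
import networkx as nx

def degree_basins(G):
    peak = {}
    def ascend(v):
        if v in peak:
            return peak[v]
        best = max(G[v], key=G.degree, default=v)  # highest-degree neighbor
        # v is a peak unless some neighbor has strictly higher degree;
        # degrees strictly increase along the ascent, so this terminates.
        peak[v] = v if G.degree(best) <= G.degree(v) else ascend(best)
        return peak[v]
    for v in G:
        ascend(v)
    basins = {}
    for v, p in peak.items():
        basins.setdefault(p, []).append(v)
    return basins

G = nx.barabasi_albert_graph(1000, 2, seed=1)  # a scale-free test network
sizes = sorted(map(len, degree_basins(G).values()), reverse=True)
print("number of basins:", len(sizes), "largest basin:", sizes[0])
```

Replacing the test graph with configuration-model networks of varying degree exponent would probe the giant-basin versus many-basins regimes described above.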
We study the expressivity of deep neural networks. Measuring a network's complexity by its number of connections or by its number of neurons, we consider the class of functions for which the error of best approximation with networks of a given complexity decays at a certain rate as the complexity budget increases. Using results from classical approximation theory, we show that this class can be endowed with a (quasi-)norm that makes it a linear function space, called an approximation space. We establish that allowing the networks to have certain types of skip connections does not change the resulting approximation spaces. We also discuss the role of the network's nonlinearity (also known as the activation function) on the resulting spaces, as well as the role of depth. For the popular ReLU nonlinearity and its powers, we relate the newly constructed spaces to classical Besov spaces. The established embeddings highlight that some functions of very low Besov smoothness can nevertheless be well approximated by neural networks, if these networks are sufficiently deep.
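
For reference, the standard approximation-space construction from classical approximation theory reads as follows (notation assumed here, not quoted from the paper), with $E_n(f)$ the error of best approximation by networks of complexity at most $n$:

```latex
% Standard (DeVore-style) approximation space; notation assumed, not quoted
% from the paper. N_n: networks of complexity at most n; X: ambient space.
E_n(f) \;=\; \inf_{\Phi \in \mathcal{N}_n} \| f - \Phi \|_X,
\qquad
\| f \|_{A^{\alpha}_{q}(X)} \;=\;
\begin{cases}
\Bigl( \sum_{n \ge 1} \bigl[ n^{\alpha} E_n(f) \bigr]^{q} \, n^{-1} \Bigr)^{1/q}, & 0 < q < \infty,\\[4pt]
\sup_{n \ge 1} \; n^{\alpha} E_n(f), & q = \infty,
\end{cases}
```

so $A^{\alpha}_{q}(X)$ collects the $f \in X$ whose best-approximation error decays like $n^{-\alpha}$; the quasi-norm, rather than a norm, reflects the nonlinearity of the classes $\mathcal{N}_n$.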
We present an experiment that systematically probes the basins of attraction of two fixed points of a nonlinear nanomechanical resonator and maps them out with high resolution. We observe a separatrix which progressively alters shape for varying drive strength and changes the relative areas of the two basins of attraction. The observed separatrix is blurred due to ambient fluctuations, including residual noise in the drive system, which cause uncertainty in the preparation of an initial state close to the separatrix. We find a good agreement between the experimentally mapped and theoretically calculated basins of attraction.
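
The following sketch mimics such a basin-mapping protocol numerically, using a damped double-well oscillator as a simple stand-in for the bistable resonator (the model and parameters are assumptions, not the experimental system):

```python
# Minimal sketch (a stand-in model, not the experimental resonator): map the
# two basins of the damped double-well oscillator x'' + c x' - x + x^3 = 0,
# whose stable fixed points (±1, 0) play the role of the two resonator states.
import numpy as np
from scipy.integrate import solve_ivp

c = 0.25  # damping (illustrative)

def f(t, s):
    x, v = s
    return [v, -c * v + x - x**3]

xs = np.linspace(-2, 2, 50)   # grid of prepared initial states
vs = np.linspace(-2, 2, 50)
basin = np.zeros((len(vs), len(xs)), dtype=int)
for i, v0 in enumerate(vs):
    for j, x0 in enumerate(xs):
        sol = solve_ivp(f, (0.0, 100.0), [x0, v0], rtol=1e-6)
        basin[i, j] = 0 if sol.y[0, -1] > 0 else 1  # which well it settles into
# The 0/1 boundary in `basin` traces the separatrix; adding a small stochastic
# term to f would blur it, analogous to the noise observed in the experiment.
```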
Recently, physics-driven deep learning methods have shown particular promise for the prediction of physical fields, especially to reduce the dependency on large amounts of pre-computed training data. In this work, we target the physics-driven learning of complex flow fields with high resolutions. We propose the use of convolutional neural network (CNN) based U-net architectures to efficiently represent and reconstruct the input and output fields, respectively. By introducing Navier-Stokes equations and boundary conditions into loss functions, the physics-driven CNN is designed to predict corresponding steady flow fields directly. In particular, this prevents many of the difficulties associated with approaches employing fully connected neural networks. Several numerical experiments are conducted to investigate the behavior of the CNN approach, and the results indicate that a first-order accuracy has been achieved. Specifically for the case of a flow around a cylinder, different flow regimes can be learned and the adhered twin-vortices are predicted correctly. The numerical results also show that the training for multiple cases is accelerated significantly, especially for the difficult cases at low Reynolds numbers, and when limited reference solutions are used as supplementary learning targets.
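
A minimal PyTorch sketch of the physics-driven idea (a plain CNN and only the continuity residual, standing in for the paper's U-net and full Navier-Stokes loss; all names and sizes are illustrative):

```python
# Minimal sketch (assumed architecture and loss, not the paper's model): a
# small CNN maps a boundary-condition field to a 2-D velocity field and is
# trained with a physics residual only (here just continuity, div u = 0).
import torch
import torch.nn as nn

net = nn.Sequential(                     # stand-in for the paper's U-net
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 2, 3, padding=1),      # outputs the (u, v) velocity field
)

def divergence(uv, h=1.0):
    """Central finite differences for du/dx + dv/dy on the grid interior."""
    u, v = uv[:, 0:1], uv[:, 1:2]
    du_dx = (u[:, :, :, 2:] - u[:, :, :, :-2]) / (2 * h)
    dv_dy = (v[:, :, 2:, :] - v[:, :, :-2, :]) / (2 * h)
    return du_dx[:, :, 1:-1, :] + dv_dy[:, :, :, 1:-1]

bc = torch.zeros(1, 1, 64, 64)           # encoded boundary conditions (illustrative)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(100):
    opt.zero_grad()
    uv = net(bc)
    loss = divergence(uv).pow(2).mean()  # physics residual as the only loss
    loss.backward()
    opt.step()
# The paper's loss additionally includes the momentum equations and boundary
# terms; only the continuity residual is kept here for brevity.
```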
In this paper, we investigate geometric properties of monotone systems by studying their isostables and basins of attraction. Isostables are boundaries of specific forward-invariant sets defined by the so-called Koopman operator, which provides a linear infinite-dimensional description of a nonlinear system. First, we study the spectral properties of the Koopman operator and the associated semigroup in the context of monotone systems. Our results generalize the celebrated Perron-Frobenius theorem to the nonlinear case and allow us to derive geometric properties of isostables and basins of attraction. Additionally, we show that under certain conditions we can characterize the bounds on the basins of attraction under parametric uncertainty in the vector field. We discuss computational approaches to estimate isostables and basins of attraction and illustrate the results on two- and four-state monotone systems.
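
As an illustration of the Koopman viewpoint (standard dynamic mode decomposition on simulated data, not the paper's computational method), the dominant Koopman eigenfunction of a linear monotone system can be estimated from snapshot pairs; its level sets approximate the isostables around the stable fixed point:

```python
# Minimal sketch (standard DMD on simulated data; an illustration, not the
# paper's algorithm): estimate the dominant Koopman eigenfunction, whose
# level sets {x : |phi(x)| = const} give the isostables.
import numpy as np
from scipy.linalg import expm

A = np.array([[-1.0, 0.5], [0.5, -1.0]])   # Metzler matrix: a monotone system
dt = 0.1
F = expm(A * dt)                            # exact flow map over one step

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(2, 500))       # snapshots
Y = F @ X                                   # snapshots advanced by dt

K = Y @ np.linalg.pinv(X)                   # DMD estimate of the Koopman matrix
evals, evecs = np.linalg.eig(K.T)           # left eigenvectors of K
i = np.argmax(np.abs(evals))                # slowest (dominant) mode
w = evecs[:, i].real                        # nonnegative here, in the spirit of
                                            # the Perron-Frobenius property
phi = lambda x: w @ x                       # dominant eigenfunction (linear case)
print("dominant eigenvalue:", evals[i], "vs exp(lambda1*dt):", np.exp(-0.5 * dt))
```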