
A Method for Representing Periodic Functions and Enforcing Exactly Periodic Boundary Conditions with Deep Neural Networks

Added by Suchuan Dong
Publication date: 2020
Research language: English





We present a simple and effective method for representing periodic functions and enforcing exactly the periodic boundary conditions for solving differential equations with deep neural networks (DNN). The method stems from some simple properties of function compositions involving periodic functions. It essentially composes a DNN-represented arbitrary function with a set of independent periodic functions with adjustable (training) parameters. We distinguish two types of periodic conditions: those imposing the periodicity requirement on the function and all its derivatives (to infinite order), and those imposing periodicity on the function and its derivatives up to a finite order $k$ ($k \geqslant 0$). The former will be referred to as $C^{\infty}$ periodic conditions, and the latter $C^{k}$ periodic conditions. We define operations that constitute a $C^{\infty}$ periodic layer and a $C^{k}$ periodic layer (for any $k \geqslant 0$). A deep neural network with a $C^{\infty}$ (or $C^{k}$) periodic layer incorporated as the second layer automatically and exactly satisfies the $C^{\infty}$ (or $C^{k}$) periodic conditions. We present extensive numerical experiments on ordinary and partial differential equations with $C^{\infty}$ and $C^{k}$ periodic boundary conditions to verify and demonstrate that the proposed method indeed enforces exactly, to machine accuracy, the periodicity for the DNN solution and its derivatives.
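The composition idea can be sketched in code. The snippet below is a minimal illustration, written in PyTorch as an assumption (the class name, feature count, and the cosine parameterization are illustrative, not the authors' exact layer definition): a $C^{\infty}$ periodic layer produces cosine features whose angular frequency is fixed by the prescribed period, with trainable amplitudes and phases, and an ordinary network is composed on top of these features.

```python
import math
import torch
import torch.nn as nn

class CInfPeriodicLayer(nn.Module):
    """Sketch of a C^infinity periodic layer: the 1D input x is mapped to
    cosine features with fixed angular frequency 2*pi/L (L is the prescribed
    period) and trainable amplitudes/phases, so any network composed on top
    of these features is automatically L-periodic, together with all of its
    derivatives."""

    def __init__(self, num_features: int, period: float):
        super().__init__()
        self.omega = 2.0 * math.pi / period                    # fixed by the period
        self.amp = nn.Parameter(torch.ones(num_features))      # trainable amplitudes
        self.phase = nn.Parameter(torch.zeros(num_features))   # trainable phases

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, 1); output has shape (batch, num_features)
        return self.amp * torch.cos(self.omega * x + self.phase)

# The periodic layer sits right after the input, as in the paper's construction;
# the remainder is an ordinary feed-forward stack.
model = nn.Sequential(
    CInfPeriodicLayer(num_features=32, period=2.0),
    nn.Linear(32, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
```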



Related research

In recent work it has been established that deep neural networks are capable of approximating solutions to a large class of parabolic partial differential equations without incurring the curse of dimension. However, all this work has been restricted to problems formulated on the whole Euclidean domain. On the other hand, most problems in engineering and the sciences are formulated on finite domains and subjected to boundary conditions. The present paper considers an important such model problem, namely the Poisson equation on a domain $D \subset \mathbb{R}^d$ subject to Dirichlet boundary conditions. It is shown that deep neural networks are capable of representing solutions of that problem without incurring the curse of dimension. The proofs are based on a probabilistic representation of the solution to the Poisson equation as well as a suitable sampling method.
M. Gadella, L. P. Lara, J. Negro (2016)
We compare three different methods to obtain solutions of Sturm-Liouville problems: a successive approximation method and two other iterative methods. We look for solutions with periodic or anti-periodic boundary conditions. With numerical tests on the Mathieu equation, we compare the efficiency of these three methods. As an application, we perform a numerical analysis of a model for carbon nanotubes.
Implicitly defined, continuous, differentiable signal representations parameterized by neural networks have emerged as a powerful paradigm, offering many possible benefits over conventional representations. However, current network architectures for such implicit neural representations are incapable of modeling signals with fine detail, and fail to represent a signal's spatial and temporal derivatives, despite the fact that these are essential to many physical signals defined implicitly as the solution to partial differential equations. We propose to leverage periodic activation functions for implicit neural representations and demonstrate that these networks, dubbed sinusoidal representation networks or Sirens, are ideally suited for representing complex natural signals and their derivatives. We analyze Siren activation statistics to propose a principled initialization scheme and demonstrate the representation of images, wavefields, video, sound, and their derivatives. Further, we show how Sirens can be leveraged to solve challenging boundary value problems, such as particular Eikonal equations (yielding signed distance functions), the Poisson equation, and the Helmholtz and wave equations. Lastly, we combine Sirens with hypernetworks to learn priors over the space of Siren functions.
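For concreteness, the sketch below shows one Siren-style layer: a linear map followed by $\sin(\omega_0 \cdot)$, with the principled uniform weight initialization described by the Siren authors (first-layer bound $1/\mathrm{fan\_in}$, deeper layers $\sqrt{6/\mathrm{fan\_in}}/\omega_0$). The class name and the default $\omega_0 = 30$ are working assumptions for illustration, not a definitive implementation.

```python
import math
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """One Siren-style layer: y = sin(omega_0 * (W x + b)). The uniform
    weight initialization follows the scheme proposed by the Siren authors;
    omega_0 = 30 is their commonly used default."""

    def __init__(self, in_features: int, out_features: int,
                 is_first: bool = False, omega_0: float = 30.0):
        super().__init__()
        self.omega_0 = omega_0
        self.linear = nn.Linear(in_features, out_features)
        with torch.no_grad():
            if is_first:
                bound = 1.0 / in_features
            else:
                bound = math.sqrt(6.0 / in_features) / omega_0
            self.linear.weight.uniform_(-bound, bound)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sin(self.omega_0 * self.linear(x))

# A small Siren mapping 2D coordinates to a scalar signal value.
siren = nn.Sequential(
    SineLayer(2, 64, is_first=True),
    SineLayer(64, 64),
    nn.Linear(64, 1),
)
```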
In this paper, we introduce a new approach based on distance fields to exactly impose boundary conditions in physics-informed deep neural networks. The challenges in satisfying Dirichlet boundary conditions in meshfree and particle methods are well-known. This issue is also pertinent in the development of physics-informed neural networks (PINN) for the solution of partial differential equations. We introduce geometry-aware trial functions in artificial neural networks to improve the training in deep learning for partial differential equations. To this end, we use concepts from constructive solid geometry (R-functions) and generalized barycentric coordinates (mean value potential fields) to construct $\phi$, an approximate distance function to the boundary of a domain. To exactly impose homogeneous Dirichlet boundary conditions, the trial function is taken as $\phi$ multiplied by the PINN approximation, and its generalization via transfinite interpolation is used to a priori satisfy inhomogeneous Dirichlet (essential), Neumann (natural), and Robin boundary conditions on complex geometries. In doing so, we eliminate modeling error associated with the satisfaction of boundary conditions in a collocation method and ensure that kinematic admissibility is met pointwise in a Ritz method. We present numerical solutions for linear and nonlinear boundary-value problems over domains with affine and curved boundaries. Benchmark problems in 1D for linear elasticity, advection-diffusion, and beam bending; and in 2D for the Poisson equation, biharmonic equation, and the nonlinear Eikonal equation are considered. The approach extends to higher dimensions, and we showcase its use by solving a Poisson problem with homogeneous Dirichlet boundary conditions over the 4D hypercube. This study provides a pathway for meshfree analysis to be conducted on the exact geometry without domain discretization.
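A minimal 1D sketch of the construction is shown below; it is an illustrative assumption, not the paper's R-function or transfinite-interpolation machinery. The trial function is a boundary interpolant $g$ plus $\phi$ times the network output, so the Dirichlet data hold exactly for any network parameters.

```python
import torch
import torch.nn as nn

# 1D model problem on [0, 1] with u(0) = a and u(1) = b. The distance-like
# function phi(x) = x * (1 - x) vanishes on the boundary, and g(x) linearly
# interpolates the boundary data, so the trial function below satisfies the
# Dirichlet conditions exactly for any network parameters.
a, b = 0.0, 1.0
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

def trial_solution(x: torch.Tensor) -> torch.Tensor:
    g = a * (1.0 - x) + b * x   # matches u at x = 0 and x = 1
    phi = x * (1.0 - x)         # approximate distance to the boundary
    return g + phi * net(x)     # exact boundary satisfaction by construction
```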
We developed a micromagnetic method for modeling magnetic systems with periodic boundary conditions along an arbitrary number of dimensions. The main feature is an adaptation of the Ewald summation technique for evaluation of long-range dipolar interactions. The method was applied to investigate the hysteresis process in hard-soft magnetic nanocomposites with various geometries. The dependence of the results on different micromagnetic parameters was studied. We found that for layered structures with an out-of-plane hard phase easy axis the hysteretic properties are very sensitive to the strength of the interlayer exchange coupling, as long as the spontaneous magnetization for the hard phase is significantly smaller than for the soft phase. The origin of this behavior was discussed. Additionally, we investigated the soft phase size optimizing the energy product of hard-soft nanocomposites.
