
A user-guide to Gridap -- grid-based approximation of partial differential equations in Julia

Published by: Francesc Verdugo, PhD
Publication date: 2019
Research field: Informatics Engineering
Paper language: English





We present Gridap, a new scientific software library for the numerical approximation of partial differential equations (PDEs) using grid-based approximations. Gridap is an open-source software project exclusively written in the Julia programming language. The main motivation behind the development of this library is to provide an easy-to-use framework for the development of complex PDE solvers in a dynamically typed style without sacrificing the performance of statically typed languages. This work is a tutorial-driven user guide to the library. It covers several popular linear and nonlinear PDE systems for scalar and vector fields, single- and multi-field problems, and conforming and nonconforming finite element discretizations on structured and unstructured meshes of simplices and hexahedra.
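To give a flavor of the tutorial-driven usage described above, the following is a minimal sketch of a Poisson solve written in the style of the public Gridap tutorials. It is not a listing from the paper, and the names used (CartesianDiscreteModel, ReferenceFE, TestFESpace, AffineFEOperator, Measure, writevtk) follow the documented Gridap API, whose details have varied across releases; treat it as illustrative rather than version-exact.

using Gridap

# Structured 10x10 mesh of the unit square
model = CartesianDiscreteModel((0,1,0,1), (10,10))

# First-order Lagrangian FE spaces with Dirichlet data on the whole boundary
reffe = ReferenceFE(lagrangian, Float64, 1)
V = TestFESpace(model, reffe; conformity=:H1, dirichlet_tags="boundary")
g(x) = 0.0                     # Dirichlet value
U = TrialFESpace(V, g)

# Integration domain and quadrature measure
Ω = Triangulation(model)
dΩ = Measure(Ω, 2)

# Weak form of -Δu = f with f ≡ 1: a(u,v) = ∫ ∇v⋅∇u dΩ, l(v) = ∫ v f dΩ
f(x) = 1.0
a(u, v) = ∫( ∇(v)⋅∇(u) )*dΩ
l(v) = ∫( v*f )*dΩ

# Assemble the affine FE operator and solve the linear system
op = AffineFEOperator(a, l, U, V)
uh = solve(op)

# Export the discrete solution for visualization (e.g. ParaView)
writevtk(Ω, "poisson_results", cellfields=["uh"=>uh])

The same pattern (model, FE spaces, measures, weak form, operator, solve) carries over to the nonlinear and multi-field examples covered in the guide.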


Read also

Phase retrieval refers to the recovery of signals from the magnitudes (and not the phases) of linear measurements. While there has been a recent explosion in development of phase retrieval methods, the lack of a common interface has made it difficult to compare new methods against the current state-of-the-art. PhasePack is a software library that creates a common interface for a wide range of phase retrieval schemes. PhasePack also provides a test bed for phase retrieval methods using both synthetic data and publicly available empirical datasets.
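In the standard notation for this problem (the symbols $a_i$, $x$, $b_i$ and $m$ below are generic and are not taken from the abstract), phase retrieval can be stated as

\[ \text{find } x \in \mathbb{C}^{n} \quad \text{such that} \quad b_i = \lvert \langle a_i, x \rangle \rvert, \qquad i = 1, \dots, m, \]

i.e. only the magnitudes of the linear measurements are observed, not their phases.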
In this paper we present the theoretical framework needed to justify the use of a kernel-based collocation method (meshfree approximation method) to estimate the solution of high-dimensional stochastic partial differential equations (SPDEs). Using an implicit time-stepping scheme, we transform stochastic parabolic equations into stochastic elliptic equations. We focus on the numerical solution of the elliptic equations at each time step. The estimator of the solution of the elliptic equations is given as a linear combination of reproducing kernels derived from the differential and boundary operators of the SPDE, centered at collocation points chosen by the user. The random expansion coefficients are computed by solving a random system of linear equations. Numerical experiments demonstrate the feasibility of the method.
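Schematically, and with notation introduced here rather than taken from the abstract, the collocation estimator at a given time step has the form

\[ \hat{u}(x) \;=\; \sum_{j=1}^{N} c_j \, k_j(x), \]

where the basis functions $k_j$ are reproducing kernels adapted to the differential and boundary operators of the SPDE and centered at the user-chosen collocation points, and the random coefficients $c_1, \dots, c_N$ are obtained by solving a linear system driven by the (random) data of the elliptic problem.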
Applications that exploit the architectural details of high-performance computing (HPC) systems have become increasingly important in academia and industry over the past two decades. The most important hardware development of the last decade in HPC has been the General Purpose Graphics Processing Unit (GPGPU), a class of massively parallel devices that now contributes the majority of computational power in the top 500 supercomputers. As these systems grow, small costs such as latency, due to the fixed cost of memory accesses and communication, accumulate in a large simulation and become a significant barrier to performance. The swept time-space decomposition rule is a communication-avoiding technique for time-stepping stencil update formulas that attempts to reduce latency costs. This work extends the swept rule by targeting heterogeneous CPU/GPU architectures representative of current and future HPC systems. We compare our approach to a naive decomposition scheme on two test equations using an MPI+CUDA pattern with 40 processes over two nodes containing one GPU. The swept rule produces speedups of 1.9 to 23 for the heat equation and 1.1 to 2.0 for the Euler equations, using the same processors and work distribution, and with the best possible configurations. These results show the potential effectiveness of the swept rule for different equations and numerical schemes on massively parallel computing systems that incur substantial latency costs.
DiffEqFlux.jl is a library for fusing neural networks and differential equations. In this work we describe differential equations from the viewpoint of data science and discuss the complementary nature between machine learning models and differential equations. We demonstrate the ability to incorporate DifferentialEquations.jl-defined differential equation problems into a Flux-defined neural network, and vice versa. The advantages of being able to use the entire DifferentialEquations.jl suite for this purpose are demonstrated by counterexamples where simple integration strategies fail but the sophisticated integration strategies provided by the DifferentialEquations.jl library succeed. This is followed by a demonstration of delay differential equations and stochastic differential equations inside of neural networks. We show high-level functionality for defining neural ordinary differential equations (neural networks embedded into the differential equation) and describe the extra models in the Flux model zoo, which include neural stochastic differential equations. We conclude by discussing the various adjoint methods used for backpropagation through the differential equation solvers. DiffEqFlux.jl is an important contribution to the area, as it allows the full weight of the differential equation solvers developed from decades of research in the scientific computing field to be readily applied to the challenges posed by machine learning and data science.
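As an illustration of the neural-ODE functionality mentioned above, the following is a minimal sketch in the spirit of the DiffEqFlux.jl documentation. The constructors shown (Chain, Dense, NeuralODE, Tsit5) and their exact signatures have changed across releases of Flux.jl and DiffEqFlux.jl, so the snippet is indicative rather than version-exact.

using DiffEqFlux, OrdinaryDiffEq, Flux

# A small neural network parameterizing the vector field du/dt = NN(u)
dudt = Chain(Dense(2, 16, tanh), Dense(16, 2))

# Wrap it as a neural ODE integrated with an adaptive Runge-Kutta solver
tspan = (0.0f0, 1.5f0)
node = NeuralODE(dudt, tspan, Tsit5(), saveat=0.1f0)

# Evaluating the neural ODE from an initial condition returns a solution object
# that can be placed inside a Flux loss function and differentiated end to end
u0 = Float32[2.0, 0.0]
sol = node(u0)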
We propose a physical analogy between finding the solution of an ordinary differential equation (ODE) and an $N$-particle problem in statistical mechanics. It uses the fact that solving an ODE is equivalent to obtaining the minimum of a functional. We then link these two notions, proposing this functional to be the interaction potential energy, or thermodynamic potential, of an equivalent particle problem. Therefore, solving this statistical mechanics problem amounts to solving the ODE. If only one solution exists, our method provides the unique solution of the ODE. In the case of an eigenvalue equation, where infinitely many solutions exist, we obtain the absolute minimum of the corresponding functional, i.e. the fundamental mode. As a result, it is possible to establish a general relationship between statistical mechanics and ODEs which allows one not only to solve them from a physical perspective but also to obtain all relevant thermodynamic equilibrium variables of the particle system related to the differential equation.