Relying on the classical connection between Backward Stochastic Differential Equations (BSDEs) and non-linear parabolic partial differential equations (PDEs), we propose a new probabilistic learning scheme for solving high-dimensional semi-linear parabolic PDEs. This scheme is inspired by the machine learning approach developed with deep neural networks in Han et al. [32]. Our algorithm is based on a Picard iteration scheme in which a sequence of linear-quadratic optimisation problems is solved by means of a stochastic gradient descent (SGD) algorithm. In the framework of a linear specification of the approximation space, we manage to prove a convergence result for our scheme under a smallness condition. In practice, in order to be able to treat high-dimensional examples, we employ sparse grid approximation spaces. In the case of periodic coefficients and using pre-wavelet basis functions, we obtain an upper bound on the global complexity of our method. It shows in particular that the curse of dimensionality is tamed, in the sense that in order to achieve a root mean squared error of order $\epsilon$, for a prescribed precision $\epsilon$, the complexity of the Picard algorithm grows polynomially in $\epsilon^{-1}$ up to some logarithmic factor $|\log(\epsilon)|$ which grows linearly with respect to the PDE dimension. Various numerical results are presented to validate the performance of our method and to compare it with some recent machine learning schemes proposed in Han et al. [20] and Huré et al. [37].
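To make the structure of such a scheme concrete, the following is a minimal, hypothetical Python/NumPy sketch of a Picard iteration of this type: the driver is frozen at the previous iterate, so each sub-problem becomes a linear-quadratic optimisation over the coefficients of a linear feature expansion, which is then attacked by plain SGD. The driver f, terminal condition g, polynomial feature basis and all hyper-parameters below are illustrative stand-ins only; in particular the basis is not the sparse-grid pre-wavelet construction analysed in the paper.

```python
# Hypothetical illustration only: a Picard/SGD loop in the spirit of the scheme
# described above. Basis, driver, terminal condition and hyper-parameters are
# toy stand-ins, not the paper's sparse-grid pre-wavelet construction.
import numpy as np

d, T, N = 5, 1.0, 20                 # PDE dimension, horizon, time steps
dt = T / N
rng = np.random.default_rng(0)

def f(t, x, y, z):                   # semi-linear driver f(t, x, u, sigma^T grad u)
    return -y * np.mean(z, axis=-1)  # toy choice

def g(x):                            # terminal condition u(T, .) = g
    return np.tanh(x.sum(axis=-1))

def features(t, x):                  # linear approximation space phi(t, x)
    tt = np.full(x.shape[:-1] + (1,), t)
    return np.concatenate([np.ones_like(tt), tt, x,
                           (x ** 2).sum(-1, keepdims=True)], axis=-1)

def grad_features(x):                # d/dx of the features (sigma = Id here)
    K = d + 3
    G = np.zeros(x.shape[:-1] + (K, d))
    for i in range(d):
        G[..., 2 + i, i] = 1.0       # d/dx_i of the coordinate feature x_i
    G[..., K - 1, :] = 2.0 * x       # d/dx of |x|^2
    return G

def picard_sgd(n_picard=5, n_sgd=1000, batch=128, lr=5e-3):
    theta_prev = np.zeros(d + 3)                     # previous Picard iterate
    for _ in range(n_picard):
        theta = theta_prev.copy()
        for _ in range(n_sgd):
            # forward Brownian paths X_{n+1} = X_n + dW_n, started at 0
            dW = rng.normal(scale=np.sqrt(dt), size=(batch, N, d))
            X = np.concatenate([np.zeros((batch, 1, d)),
                                np.cumsum(dW, axis=1)], axis=1)
            driver, mart = 0.0, 0.0
            A = -features(0.0, X[:, 0])              # d(residual)/d(theta)
            for n in range(N):
                phi, G = features(n * dt, X[:, n]), grad_features(X[:, n])
                y_bar = phi @ theta_prev             # Y along the previous iterate
                z_bar = np.einsum("bkd,k->bd", G, theta_prev)
                driver += f(n * dt, X[:, n], y_bar, z_bar) * dt   # frozen driver
                mart += np.einsum("bkd,bd,k->b", G, dW[:, n], theta)
                A -= np.einsum("bkd,bd->bk", G, dW[:, n])
            # linear-quadratic sub-problem:
            # minimise E[(g(X_T) + int f dt - Y_0 - int Z dW)^2] over theta
            res = g(X[:, -1]) + driver - features(0.0, X[:, 0]) @ theta - mart
            theta -= lr * 2.0 * np.mean(res[:, None] * A, axis=0)  # SGD step
        theta_prev = theta                 # new iterate freezes the driver next
    return theta_prev

theta = picard_sgd()
print("approximate u(0, 0):", features(0.0, np.zeros((1, d)))[0] @ theta)
```

Because the driver is evaluated at the previous Picard iterate, the empirical loss in each inner loop is quadratic in the new coefficients; the sketch mirrors only this algorithmic skeleton (outer Picard loop, inner SGD on a quadratic loss), not the convergence analysis or the complexity bounds of the paper.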