
Weak convergence rates of spectral Galerkin approximations for SPDEs with nonlinear diffusion coefficients

Added by Ryan Kurniawan
Publication date: 2014
Language: English





Strong convergence rates for (temporal, spatial, and noise) numerical approximations of semilinear stochastic evolution equations (SEEs) with smooth and regular nonlinearities are well understood in the scientific literature. Weak convergence rates for numerical approximations of such SEEs have been investigated for about 11 years and are far from being well understood: roughly speaking, no essentially sharp weak convergence rates are known for parabolic SEEs with nonlinear diffusion coefficient functions; see Remark 2.3 in [A. Debussche, Weak approximation of stochastic partial differential equations: the nonlinear case, Math. Comp. 80 (2011), no. 273, 89-117] for details. In this article we solve the weak convergence problem that emerged from Debussche's article in the case of spectral Galerkin approximations and establish essentially sharp weak convergence rates for spatial spectral Galerkin approximations of semilinear SEEs with nonlinear diffusion coefficient functions. Our solution to the weak convergence problem does not use Malliavin calculus. Rather, a key ingredient in our solution to the weak convergence problem that emerged from Debussche's article is the use of appropriately modified ...
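To make the object of study concrete, the following is a minimal sketch of a spatial spectral Galerkin approximation for a semilinear stochastic heat equation with a nonlinear diffusion coefficient. The choices of F and B, the linear-implicit Euler time stepping, and the noise truncation are illustrative assumptions made for this example; they are not the scheme or the setting analysed in the article.

import numpy as np

# Minimal sketch: spatial spectral Galerkin approximation of a semilinear
# stochastic heat equation dX_t = (Delta X_t + F(X_t)) dt + B(X_t) dW_t
# on (0,1) with Dirichlet boundary conditions.  All concrete choices below
# (F, B, time stepping, noise truncation) are illustrative assumptions.

N = 64                       # number of Galerkin (sine) modes retained
M = 256                      # grid points used to evaluate nonlinearities
T, steps = 1.0, 2000
dt = T / steps

x = (np.arange(M) + 0.5) / M
k = np.arange(1, N + 1)
lam = (np.pi * k) ** 2                                 # eigenvalues of -Delta
e = np.sqrt(2.0) * np.sin(np.pi * np.outer(k, x))      # eigenfunctions e_k(x_j)

def F(u):                    # assumed smooth drift nonlinearity
    return u - u ** 3

def B(u):                    # assumed nonlinear diffusion coefficient
    return 1.0 / (1.0 + u ** 2)

X = np.zeros(N)              # spectral coefficients, X_0(x) = sin(pi x)
X[0] = 1.0 / np.sqrt(2.0)

rng = np.random.default_rng(0)
for _ in range(steps):
    u = X @ e                                          # Galerkin state on the grid
    dbeta = rng.normal(0.0, np.sqrt(dt), N)            # Brownian increments per mode
    w = dbeta @ e                                      # truncated cylindrical noise increment
    drift = e @ F(u) / M                               # project F(X) onto the first N modes
    noise = e @ (B(u) * w) / M                         # project B(X) dW onto the first N modes
    X = (X + dt * drift + noise) / (1.0 + dt * lam)    # linear-implicit Euler step

print("approx L2 norm of the Galerkin approximation at time T:", np.linalg.norm(X))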



Related research

A new class of explicit Milstein schemes, which approximate stochastic differential equations (SDEs) with superlinearly growing drift and diffusion coefficients, is proposed in this article. It is shown, under very mild conditions, that these explicit schemes converge in $\mathcal{L}^p$ to the solution of the corresponding SDEs with an optimal rate.
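For illustration, the following is a minimal sketch of an explicit, tamed Milstein-type step for a scalar SDE with superlinearly growing coefficients; the particular taming factor and coefficients are assumptions chosen for the example, not necessarily the scheme proposed in the article.

import numpy as np

# Minimal sketch of a tamed Milstein-type step for dX_t = b(X_t) dt + s(X_t) dW_t
# with superlinearly growing coefficients; the taming factor below is one common
# variant used here for illustration only.

def b(x):                     # superlinear drift (double-well type)
    return x - x ** 3

def s(x):                     # superlinear diffusion coefficient
    return 0.5 * x ** 2

def ds(x):                    # derivative of the diffusion coefficient
    return x

def tamed_milstein(x0, T, n, rng):
    h = T / n
    x = x0
    for _ in range(n):
        dW = rng.normal(0.0, np.sqrt(h))
        incr = (b(x) * h
                + s(x) * dW
                + 0.5 * s(x) * ds(x) * (dW ** 2 - h))   # Milstein correction term
        x = x + incr / (1.0 + h * abs(x) ** 2)          # taming keeps the step bounded
    return x

rng = np.random.default_rng(1)
samples = np.array([tamed_milstein(1.0, 1.0, 1000, rng) for _ in range(200)])
print("sample mean and std at time T:", samples.mean(), samples.std())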
Motivated by the results of \cite{sabanis2015}, we propose explicit Euler-type schemes for SDEs with random coefficients driven by Lévy noise when the drift and diffusion coefficients can grow super-linearly. As an application of our results, one can construct explicit Euler-type schemes for SDEs with delays (SDDEs) which are driven by Lévy noise and have super-linear coefficients. Strong convergence results are established and their rate of convergence is shown to be equal to that of the classical Euler scheme. It is proved that the optimal rate of convergence is achieved for $\mathcal{L}^2$-convergence, which is consistent with the corresponding results available in the literature.
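The sketch below shows one explicit Euler-type step for an SDE with superlinear drift driven by a Brownian motion plus a compound Poisson jump part; the jump model, taming factor, and coefficients are illustrative assumptions, not the Lévy noise class or the scheme of the article.

import numpy as np

# Minimal sketch of an explicit (tamed) Euler-type step for an SDE driven by
# Brownian motion plus compound Poisson jumps, with superlinear drift.
# Jump law, taming, and coefficients are illustrative assumptions only.

def drift(x):
    return -x ** 3            # superlinear, one-sided Lipschitz drift

def tamed_euler_with_jumps(x0, T, n, jump_rate, rng):
    h = T / n
    x = x0
    for _ in range(n):
        dW = rng.normal(0.0, np.sqrt(h))
        n_jumps = rng.poisson(jump_rate * h)             # Poisson number of jumps in the step
        dJ = rng.normal(0.0, 0.5, n_jumps).sum()         # compound Poisson increment
        x = x + h * drift(x) / (1.0 + h * abs(drift(x))) + dW + dJ
    return x

rng = np.random.default_rng(2)
print(tamed_euler_with_jumps(1.0, 1.0, 1000, jump_rate=2.0, rng=rng))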
A conjecture appears in \cite{milsteinscheme}, in the form of a remark, stating that it is possible to construct, in a specified way, explicit numerical schemes of arbitrarily high order to approximate the solutions of SDEs with superlinear coefficients. We answer this conjecture affirmatively for the case of order 1.5 approximations and show that the suggested methodology works. Moreover, we explore the case of Hölder continuous derivatives for the diffusion coefficients.
Shot noise processes have been extensively studied due to their mathematical properties and their relevance in several applications. Here, we consider nonnegative shot noise processes and prove their weak convergence to Lévy-driven Ornstein-Uhlenbeck (OU) processes, whose features depend on the underlying jump distributions. Among others, we obtain the OU-Gamma and OU-Inverse Gaussian processes, having gamma and inverse Gaussian processes as background Lévy processes, respectively. Then, we derive the necessary conditions guaranteeing the diffusion limit to a Gaussian OU process, show that they are not met unless negative jumps are allowed with probability going to zero, and quantify the error incurred when replacing the shot noise with the Gaussian OU process and with the non-Gaussian OU processes. The results offer a new class of models to be used instead of the commonly applied Gaussian OU processes to approximate synaptic input currents, membrane voltages or conductances modelled by shot noise in single neuron modelling.
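As a rough illustration, the sketch below simulates a nonnegative shot noise process with an exponential decay kernel; increasing the arrival rate while shrinking the jump amplitudes mimics the scaling under which such processes approach an Ornstein-Uhlenbeck limit. The arrival rate, amplitude law, and scaling used here are illustrative assumptions.

import numpy as np

# Minimal sketch of a nonnegative shot noise process
# Z_t = sum_{t_i <= t} a_i * exp(-(t - t_i)/tau),
# with Poisson arrival times t_i and i.i.d. nonnegative amplitudes a_i.
# Rate, amplitude distribution, and scaling are illustrative assumptions.

def shot_noise(T, dt, rate, tau, amp_scale, rng):
    n = int(T / dt)
    z = np.zeros(n)
    decay = np.exp(-dt / tau)
    for i in range(1, n):
        n_arrivals = rng.poisson(rate * dt)                  # Poisson arrivals in (t, t+dt]
        jump = rng.exponential(amp_scale, n_arrivals).sum()  # nonnegative jump amplitudes
        z[i] = z[i - 1] * decay + jump                       # exponential kernel recursion
    return z

rng = np.random.default_rng(3)
# raising the rate while shrinking the amplitudes mimics the OU scaling regime
z_coarse = shot_noise(T=10.0, dt=1e-3, rate=50.0,   tau=0.5, amp_scale=0.2,   rng=rng)
z_fine   = shot_noise(T=10.0, dt=1e-3, rate=5000.0, tau=0.5, amp_scale=0.002, rng=rng)
print("means (both fluctuate around rate*mean_amplitude*tau = 5):",
      z_coarse.mean(), z_fine.mean())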
We consider, on the torus, the scaling limit of stochastic 2D (inviscid) fluid dynamical equations with transport noise to deterministic viscous equations. Quantitative estimates on the convergence rates are provided by combining analytic and probabilistic arguments, in particular heat kernel properties and maximal estimates for stochastic convolutions. Similar ideas are applied to the stochastic 2D Keller-Segel model, yielding an explicit choice of noise that keeps the blow-up probability below any given threshold. Our approach also gives rise to a mixing property for stochastic linear transport equations and to dissipation enhancement in the viscous case.
