
Optimal convergence rates in the averaging principle for slow-fast SPDEs driven by multiplicative noise

 Added by Xiaobin Sun
 Publication date 2021

In this paper, we study a class of slow-fast stochastic partial differential equations with multiplicative Wiener noise. Under appropriate conditions, we prove that the slow component converges to the solution of the corresponding averaged equation with optimal orders 1/2 and 1 in the strong and weak sense, respectively. The main technique is based on the Poisson equation.
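For context, the generic form of a slow-fast system of this type can be sketched as follows; the operators $A$, $B$ and the coefficients $f$, $g$, $h$, $\sigma$ below are illustrative assumptions, not the paper's exact setting:

```latex
% Generic slow-fast system with multiplicative noise (assumed form):
\begin{equation*}
\begin{aligned}
dX^{\varepsilon}_t &= \bigl[A X^{\varepsilon}_t + f(X^{\varepsilon}_t, Y^{\varepsilon}_t)\bigr]\,dt
  + g(X^{\varepsilon}_t)\,dW^{1}_t, \\
dY^{\varepsilon}_t &= \frac{1}{\varepsilon}\bigl[B Y^{\varepsilon}_t + h(X^{\varepsilon}_t, Y^{\varepsilon}_t)\bigr]\,dt
  + \frac{1}{\sqrt{\varepsilon}}\,\sigma(X^{\varepsilon}_t, Y^{\varepsilon}_t)\,dW^{2}_t.
\end{aligned}
\end{equation*}
% Averaged equation: replace f by its average against the invariant
% measure \mu^{x} of the frozen fast equation:
\begin{equation*}
d\bar{X}_t = \bigl[A \bar{X}_t + \bar{f}(\bar{X}_t)\bigr]\,dt + g(\bar{X}_t)\,dW^{1}_t,
\qquad \bar{f}(x) = \int f(x,y)\,\mu^{x}(dy).
\end{equation*}
% The stated optimal rates then read, for smooth test functions \phi:
\begin{equation*}
\sup_{t \in [0,T]} \mathbb{E}\,\bigl\|X^{\varepsilon}_t - \bar{X}_t\bigr\| \le C\,\varepsilon^{1/2}
\ \text{(strong)}, \qquad
\bigl|\mathbb{E}\,\phi(X^{\varepsilon}_t) - \mathbb{E}\,\phi(\bar{X}_t)\bigr| \le C\,\varepsilon
\ \text{(weak)}.
\end{equation*}
```

In the standard Poisson-equation approach, the fluctuation $f(x,y) - \bar{f}(x)$ is controlled by solving $-\mathcal{L}^{x}\Phi(x,\cdot) = f(x,\cdot) - \bar{f}(x)$, where $\mathcal{L}^{x}$ is the generator of the frozen fast process with the slow variable fixed at $x$.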



Related research

Wei Hong, Shihu Li, Wei Liu (2021)
In this paper, we aim to study the asymptotic behaviour for a class of McKean-Vlasov stochastic partial differential equations with slow and fast time-scales. Using the variational approach and the classical Khasminskii time discretization, we show that the slow component strongly converges to the solution of the associated averaged equation. In particular, the corresponding convergence rates are also obtained. The main results can be applied to demonstrate the averaging principle for various McKean-Vlasov nonlinear SPDEs, such as the stochastic porous medium equation, the stochastic $p$-Laplace equation, and some McKean-Vlasov stochastic differential equations.
In this paper, the strong averaging principle is established for a class of slow-fast SPDEs with H\"{o}lder continuous drift driven by $\alpha$-stable processes, using Zvonkin's transformation and the classical Khasminskii time discretization method. As an application, an example is provided to illustrate our result.
Xiaobin Sun, Yingchao Xie (2021)
In this paper, the averaging principle is studied for a class of multiscale stochastic partial differential equations driven by $\alpha$-stable processes, where $\alpha\in(1,2)$. Using the technique of the Poisson equation, the orders of strong and weak convergence are shown to be $1-1/\alpha$ and $1-r$ for any $r\in(0,1)$, respectively. The main results extend the Wiener-noise case considered by Br\'{e}hier in [6] and Ge et al. in [17] to $\alpha$-stable processes, and the finite-dimensional case considered by Sun et al. in [39] to the infinite-dimensional setting.
We consider on the torus the scaling limit of stochastic 2D (inviscid) fluid dynamical equations with transport noise to deterministic viscous equations. Quantitative estimates on the convergence rates are provided by combining analytic and probabilistic arguments, especially heat kernel properties and maximal estimates for stochastic convolutions. Similar ideas are applied to the stochastic 2D Keller-Segel model, yielding an explicit choice of noise ensuring that the blow-up probability is less than any given threshold. Our approach also yields a mixing property for stochastic linear transport equations and dissipation enhancement in the viscous case.
In this paper, we study the averaging principle for a class of stochastic differential equations driven by $\alpha$-stable processes with slow and fast time-scales, where $\alpha\in(1,2)$. We prove that the strong and weak convergence orders are $1-1/\alpha$ and $1$, respectively. We show, by a simple example, that $1-1/\alpha$ is the optimal strong convergence rate.