
Central Limit Theorem And Moderate Deviation Principle For Inviscid Stochastic Burgers Equation

Published by: Zhengyan Wu
Publication date: 2021
Language: English

We establish a central limit theorem and prove a moderate deviation principle for the inviscid stochastic Burgers equation. Due to the lack of a viscous term, this is done in the framework of kinetic solutions. The weak convergence method and the doubling of variables method play a key role.
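For orientation, a hedged sketch of a standard small-noise setting behind such results (the notation below is assumed for illustration and is not taken from the paper) is $$ \partial_t u^{\varepsilon} + \partial_x\Big(\tfrac{1}{2}(u^{\varepsilon})^{2}\Big) = \sqrt{\varepsilon}\,\sigma(u^{\varepsilon})\,\dot W(t,x), \qquad u^{\varepsilon}(0)=u_0, $$ where the central limit theorem describes the fluctuation $(u^{\varepsilon}-u^{0})/\sqrt{\varepsilon}$ around the deterministic limit $u^{0}$, and the moderate deviation principle is a large deviation principle for $(u^{\varepsilon}-u^{0})/(\sqrt{\varepsilon}\,\lambda(\varepsilon))$ with a scaling satisfying $\lambda(\varepsilon)\to\infty$ and $\sqrt{\varepsilon}\,\lambda(\varepsilon)\to 0$ as $\varepsilon\to 0$.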


Read also

Our purpose is to prove a central limit theorem for countable nonhomogeneous Markov chains under the condition of uniform convergence of the transition probability matrices in the Cesàro sense. Furthermore, we obtain a corresponding moderate deviation theorem for countable nonhomogeneous Markov chains via the Gärtner-Ellis theorem and the exponential equivalence method.
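As a hedged illustration only (the precise uniformity condition used in the paper may differ), convergence of transition matrices $P_k=(p_k(i,j))$ to $P=(p(i,j))$ in the Cesàro sense is typically understood entrywise through the Cesàro averages, $$ \lim_{n\to\infty}\frac{1}{n}\sum_{k=1}^{n} p_k(i,j)=p(i,j), $$ with uniformity referring to this limit holding uniformly over the countable state space.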
The present work deals with the global solvability as well as the asymptotic analysis of the stochastic generalized Burgers-Huxley (SGBH) equation perturbed by space-time white noise on a bounded interval of $\mathbb{R}$. We first prove the existence of a unique mild as well as strong solution to the SGBH equation and then obtain the existence of an invariant measure. Later, we establish two major properties of the Markovian semigroup associated with the solutions of the SGBH equation, namely irreducibility and the strong Feller property. These two properties guarantee the uniqueness of the invariant measure and ergodicity as well. Then, under further assumptions on the noise coefficient, we discuss the ergodic behavior of the solution of the SGBH equation by providing a large deviation principle (LDP) of Donsker-Varadhan type for the occupation measure at large times, which describes the exact rate of exponential convergence.
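As a brief reminder in standard (assumed) notation, the occupation measure of the solution over a time horizon $T$ is $L_T(A)=\frac{1}{T}\int_0^T \mathbf{1}_A\big(u(s)\big)\,ds$, and a Donsker-Varadhan type LDP quantifies the exponential rate at which $L_T$ concentrates around the invariant measure as $T\to\infty$.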
Xiaobin Sun, Ran Wang, Lihu Xu (2018)
A Freidlin-Wentzell type large deviation principle is established for stochastic partial differential equations with slow and fast time-scales, where the slow component is a one-dimensional stochastic Burgers equation with small noise and the fast component is a stochastic reaction-diffusion equation. Our approach is via the weak convergence criterion developed in [3].
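For concreteness, a hedged sketch of a slow-fast system of the indicated type (the coefficients $b$, $f$, $g$ and the scale parameter $\delta(\varepsilon)$ below are assumptions for illustration, not the paper's exact setting) is $$ du^{\varepsilon}=\Big[\partial_x^{2}u^{\varepsilon}-\tfrac{1}{2}\partial_x(u^{\varepsilon})^{2}+b(u^{\varepsilon},v^{\varepsilon})\Big]dt+\sqrt{\varepsilon}\,dW^{1}_t,\qquad dv^{\varepsilon}=\frac{1}{\delta(\varepsilon)}\Big[\partial_x^{2}v^{\varepsilon}+f(u^{\varepsilon},v^{\varepsilon})\Big]dt+\frac{g(u^{\varepsilon},v^{\varepsilon})}{\sqrt{\delta(\varepsilon)}}\,dW^{2}_t, $$ where the slow Burgers component $u^{\varepsilon}$ carries the small noise of order $\sqrt{\varepsilon}$ and the fast reaction-diffusion component $v^{\varepsilon}$ evolves on the accelerated time scale $1/\delta(\varepsilon)$.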
A. Guillin, R. Liptser (2005)
Taking into account the likeness of moderate deviations (MD) and central limit theorems (CLT), we develop an approach, which has proved effective for the CLT, for the MD analysis of the family $$ S^\kappa_t=\frac{1}{t^\kappa}\int_0^t H(X_s)\,ds, \qquad t\to\infty, $$ for an ergodic diffusion process $X_t$ under $0.5<\kappa<1$ and appropriate $H$. We mean a decomposition with a ``corrector'': $$ \frac{1}{t^\kappa}\int_0^t H(X_s)\,ds = \mathrm{corrector}+\frac{1}{t^\kappa}\underbrace{M_t}_{\text{martingale}}, $$ and show that, as in the CLT analysis, the corrector is negligible, now in the MD scale, and the main contribution to the MD is brought by the family $\frac{1}{t^\kappa}M_t$, $t\to\infty$. From Bayer and Freidlin, \cite{BF}, up to Wu's papers \cite{Wu1}-\cite{WuH}, the Laplace transform dominates the MD study. In this paper, we replace the Laplace technique by one that allows us to give conditions providing the MD in terms of the ``drift-diffusion'' parameters and $H$. However, verification of these conditions depends heavily on the specifics of the diffusion model. That is why the paper is named ``Examples ...''.
Zihua Guo, Baoxiang Wang (2008)
Considering the Cauchy problem for the Korteweg-de Vries-Burgers equation $$ u_t+u_{xxx}+\epsilon |\partial_x|^{2\alpha}u+(u^2)_x=0, \qquad u(0)=\phi, $$ where $0<\epsilon,\alpha\leq 1$ and $u$ is a real-valued function, we show that it is globally well-posed in $H^s$ ($s>s_\alpha$) and uniformly globally well-posed in $H^s$ ($s>-3/4$) for all $\epsilon \in (0,1)$. Moreover, we prove that for any $T>0$, its solution converges in $C([0,T]; H^s)$ to that of the KdV equation as $\epsilon$ tends to 0.