
85 - Peng Chen , Qi-Man Shao , Lihu Xu 2020
We revisit the classical Lindeberg principle in a Markov process setting to establish a universal probability approximation framework via Itô's formula and the Markov semigroup. As applications, we consider approximating a family of online stochastic gradient descents (SGDs) by a stochastic differential equation (SDE) driven by additive Brownian motion, and obtain an approximation error with explicit dependence on the dimension, which makes it possible to analyse high-dimensional models. We also apply our framework to study stable approximation and normal approximation and obtain their optimal convergence rates (up to a logarithmic correction for normal approximation).
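To illustrate the kind of approximation described in this abstract, the sketch below compares one step of online SGD on a quadratic loss with one Euler-Maruyama step of an SDE driven by additive Brownian motion. This is a minimal illustration, not the paper's construction; the quadratic loss, the names `A`, `sigma`, and `eta`, and the noise scaling are all assumptions made for the example.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's): online SGD on a
# quadratic loss vs. an Euler-Maruyama step of the approximating SDE
#   dX_t = -A X_t dt + sigma dB_t   (additive Brownian noise).
rng = np.random.default_rng(0)
d = 5                  # dimension
A = np.eye(d)          # Hessian of the quadratic loss (illustrative choice)
sigma = 0.1            # additive noise level
eta = 0.01             # SGD step size / SDE time step

def sgd_step(x):
    # one online SGD step with a noisy gradient estimate of A x;
    # the noise is scaled so its per-step size matches the Euler step below
    noisy_grad = A @ x + (sigma / np.sqrt(eta)) * rng.standard_normal(d)
    return x - eta * noisy_grad

def euler_sde_step(x):
    # Euler-Maruyama discretization of the SDE with additive Brownian noise
    return x - eta * (A @ x) + sigma * np.sqrt(eta) * rng.standard_normal(d)
```

With this scaling, both updates share the drift `-eta * A x` and a Gaussian increment of standard deviation `sigma * sqrt(eta)`, which is the heuristic behind approximating the SGD trajectory by the SDE.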
88 - Xiaobin Sun , Ran Wang , Lihu Xu 2018
A Freidlin-Wentzell type large deviation principle is established for stochastic partial differential equations with slow and fast time-scales, where the slow component is a one-dimensional stochastic Burgers equation with small noise and the fast component is a stochastic reaction-diffusion equation. Our approach is via the weak convergence criterion developed in [3].
95 - Xiao Fang , Qi-Man Shao , Lihu Xu 2018
Stein's method has been widely used for probability approximations. However, in the multi-dimensional setting, most of the results are for multivariate normal approximation or for test functions with bounded second- or higher-order derivatives. For a class of multivariate limiting distributions, we use Bismut's formula in Malliavin calculus to control the derivatives of the Stein equation solutions by the first derivative of the test function. Combined with Stein's exchangeable pair approach, we obtain a general theorem for multivariate approximations with near-optimal error bounds on the Wasserstein distance. We apply the theorem to the unadjusted Langevin algorithm.
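For readers unfamiliar with the unadjusted Langevin algorithm (ULA) mentioned at the end of this abstract, the sketch below shows its basic iteration: an Euler discretization of the Langevin SDE targeting a density proportional to exp(-U(x)). The standard-Gaussian target (U(x) = |x|²/2) and the step size `eta` are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

# Minimal ULA sketch: Euler discretization of the Langevin SDE
#   dX_t = -grad U(X_t) dt + sqrt(2) dB_t,
# whose invariant density is proportional to exp(-U(x)).
rng = np.random.default_rng(1)
d = 2       # dimension (illustrative)
eta = 0.05  # step size (illustrative)

def grad_U(x):
    # gradient of U(x) = |x|^2 / 2, i.e. a standard Gaussian target
    return x

def ula_step(x):
    # one unadjusted Langevin step: gradient drift plus Gaussian noise,
    # with no Metropolis accept/reject correction (hence "unadjusted")
    return x - eta * grad_U(x) + np.sqrt(2 * eta) * rng.standard_normal(d)

x = np.zeros(d)
for _ in range(1000):
    x = ula_step(x)
```

Because the discretization is never corrected, ULA samples from a biased approximation of the target; quantifying that bias in Wasserstein distance is exactly the kind of question the theorem above addresses.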
