
Convergence of the Population Dynamics algorithm in the Wasserstein metric

Publication date: 2017
Language: English





We study the convergence of the population dynamics algorithm, which produces sample pools of random variables having a distribution that closely approximates that of the {\em special endogenous solution} to a stochastic fixed-point equation of the form: $$R \stackrel{\mathcal{D}}{=} \Phi(Q, N, \{C_i\}, \{R_i\}),$$ where $(Q, N, \{C_i\})$ is a real-valued random vector with $N \in \mathbb{N}$, and $\{R_i\}_{i \in \mathbb{N}}$ is a sequence of i.i.d. copies of $R$, independent of $(Q, N, \{C_i\})$; the symbol $\stackrel{\mathcal{D}}{=}$ denotes equality in distribution. Specifically, we show its convergence in the Wasserstein metric of order $p$ ($p \geq 1$) and prove the consistency of estimators based on the sample pool produced by the algorithm.
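
A concrete illustration may help. The following Python sketch implements one instance of the population dynamics recursion, assuming the linear (smoothing-transform) map $\Phi(Q, N, \{C_i\}, \{R_i\}) = Q + \sum_{i=1}^{N} C_i R_i$; the distributions of $(Q, N, \{C_i\})$, the pool size and the number of iterations are illustrative choices, not values taken from the paper.

# Population dynamics sketch for the linear instance R =_D Q + sum_{i<=N} C_i R_i.
# All distributional choices and tuning constants below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sample_qnc():
    """Draw one realization of (Q, N, {C_i}); purely illustrative choices."""
    q = rng.normal()                      # Q ~ N(0, 1)
    n = rng.poisson(2)                    # N ~ Poisson(2)
    c = rng.uniform(0.0, 0.5, size=n)     # C_i ~ Uniform(0, 0.5): contractive on average
    return q, n, c

def population_dynamics(k=5_000, m=15):
    """Return a pool of k samples approximating the special endogenous solution."""
    pool = np.zeros(k)                    # bootstrap the recursion from R = 0
    for _ in range(m):
        new_pool = np.empty(k)
        for j in range(k):
            q, n, c = sample_qnc()
            # draw the N "children" uniformly, with replacement, from the previous pool
            r = pool[rng.integers(0, k, size=n)]
            new_pool[j] = q + np.dot(c, r)   # apply Phi to (Q, N, {C_i}, {R_i})
        pool = new_pool
    return pool

pool = population_dynamics()
# Estimators based on the pool are plain sample averages,
# e.g. estimates of E[R] and P(R > 1):
print(pool.mean(), (pool > 1).mean())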



Related research


We analyze the convergence properties of the Wang-Landau algorithm. This sampling method belongs to the general class of adaptive importance sampling strategies which use the free energy along a chosen reaction coordinate as a bias. Such algorithms are very helpful to enhance the sampling properties of Markov Chain Monte Carlo algorithms, when the dynamics is metastable. We prove the convergence of the Wang-Landau algorithm and an associated central limit theorem.
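
As a rough illustration of the flat-histogram mechanism behind the Wang-Landau algorithm (a discrete toy, not the continuous free-energy-biased version analyzed in the work above), the following Python sketch estimates the density of states of binary strings of length $L$ with energy equal to the number of ones, for which the exact answer is the binomial coefficient $\binom{L}{E}$; the model and all tuning constants are illustrative assumptions.

# Wang-Landau sketch: estimate log g(E), the log-number of configurations with
# energy E, by penalizing visited energy levels until the visit histogram is flat.
# Toy model and tuning constants are illustrative assumptions.
import numpy as np
from math import comb, log

rng = np.random.default_rng(1)
L = 12
x = rng.integers(0, 2, size=L)            # current configuration (bits)
E = int(x.sum())                          # its energy: number of ones

log_g = np.zeros(L + 1)                   # running estimate of log g(E)
hist = np.zeros(L + 1)                    # visit histogram for the current stage
log_f = 1.0                               # modification factor, shrunk stage by stage

while log_f > 1e-3:
    for _ in range(10_000):
        i = rng.integers(L)               # propose a single bit flip
        E_new = E + (1 - 2 * x[i])
        # accept with probability min(1, g(E)/g(E_new)), flattening visits over E
        if rng.random() < np.exp(min(0.0, log_g[E] - log_g[E_new])):
            x[i] ^= 1
            E = E_new
        log_g[E] += log_f                 # penalize the level just visited
        hist[E] += 1
    if hist.min() > 0.8 * hist.mean():    # histogram roughly flat: shrink f
        hist[:] = 0
        log_f /= 2.0

exact = np.array([log(comb(L, e)) for e in range(L + 1)])
print(np.round((log_g - log_g[0]) - (exact - exact[0]), 2))   # deviations should be small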
Feng-Yu Wang (2021)
The convergence rate in Wasserstein distance is estimated for the empirical measures of symmetric semilinear SPDEs. Unlike the finite-dimensional case, where the convergence is of algebraic order in time, in the present situation the convergence is of logarithmic order, with a power given by the eigenvalues of the underlying linear operator.
We consider a sequence of independent and identically distributed random samples from an absolutely continuous probability measure in one dimension with unbounded density. We establish a new rate of convergence of the $\infty$-Wasserstein distance between the empirical measure of the samples and the true distribution, which extends the previous convergence result by Trillos and Slep\v{c}ev to the case where the true distribution has an unbounded density.
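
In one dimension the $\infty$-Wasserstein distance between the empirical measure and the true law admits the quantile representation $W_\infty(\mu_n, \mu) = \sup_{t \in (0,1)} |F_n^{-1}(t) - F^{-1}(t)|$, which makes it directly computable. The Python sketch below evaluates it for samples from the Beta(1/2, 1/2) law, an illustrative choice of a distribution with bounded support and unbounded density; the distribution and sample sizes are assumptions made only for illustration.

# W_infty between the empirical measure of n samples and a known 1D law,
# via the quantile-function representation.  Beta(1/2, 1/2) is illustrative only.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(2)
dist = beta(0.5, 0.5)                     # bounded support, density unbounded at 0 and 1

def w_infty_empirical(samples, dist):
    """W_infty(empirical measure of samples, dist) in one dimension."""
    x = np.sort(samples)
    n = len(x)
    # F_n^{-1} is piecewise constant, equal to x[i] on (i/n, (i+1)/n]; since
    # F^{-1} is monotone, the sup over each interval is attained at an endpoint.
    lo = dist.ppf(np.arange(n) / n)
    hi = dist.ppf(np.arange(1, n + 1) / n)
    return max(np.abs(x - lo).max(), np.abs(x - hi).max())

for n in (100, 1_000, 10_000):
    samples = dist.rvs(size=n, random_state=rng)
    print(n, w_infty_empirical(samples, dist))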
Panpan Ren, Feng-Yu Wang (2020)
The following type of exponential convergence is proved for (non-degenerate or degenerate) McKean-Vlasov SDEs: $$W_2(\mu_t, \mu_\infty)^2 + {\rm Ent}(\mu_t|\mu_\infty) \le c\,{\rm e}^{-\lambda t} \min\big\{W_2(\mu_0, \mu_\infty)^2, {\rm Ent}(\mu_0|\mu_\infty)\big\}, \quad t \ge 1,$$ where $c, \lambda > 0$ are constants, $\mu_t$ is the distribution of the solution at time $t$, $\mu_\infty$ is the unique invariant probability measure, ${\rm Ent}$ is the relative entropy and $W_2$ is the $L^2$-Wasserstein distance. In particular, this type of exponential convergence holds for some (non-degenerate or degenerate) granular media type equations, generalizing those studied in [CMV, GLW] on the exponential convergence in a mean field entropy.
Feng-Yu Wang (2020)
Let $X_t$ be the (reflecting) diffusion process generated by $L := \Delta + \nabla V$ on a complete connected Riemannian manifold $M$, possibly with a boundary $\partial M$, where $V \in C^1(M)$ is such that $\mu(dx) := e^{V(x)}\,dx$ is a probability measure. We estimate the convergence rate for the empirical measure $\mu_t := \frac{1}{t}\int_0^t \delta_{X_s}\,ds$ under the Wasserstein distance. As a typical example, when $M = \mathbb{R}^d$ and $V(x) = c_1 - c_2|x|^p$ for some constants $c_1 \in \mathbb{R}$, $c_2 > 0$ and $p > 1$, explicit upper and lower bounds are presented for the convergence rate, which are of sharp order when either $d < \frac{4(p-1)}{p}$, or $d \ge 4$ and $p \to \infty$.
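
A rough numerical illustration of the typical example above in $d = 1$: the sketch below simulates the overdamped dynamics $dX_t = \nabla V(X_t)\,dt + \sqrt{2}\,dW_t$ with $V(x) = -c_2|x|^p$ by Euler-Maruyama, forms the occupation (empirical) measure over growing time windows, and estimates its distance to $\mu$ via quantile functions; the constants, the step size, the time horizon, and the use of $W_1$ rather than $W_2$ are illustrative simplifications rather than choices from the paper.

# Empirical measure of a 1D diffusion vs. its invariant law mu ~ exp(V).
# Constants, discretization and the use of W_1 are illustrative assumptions.
import numpy as np
from scipy.stats import gennorm

rng = np.random.default_rng(3)
c2, p = 1.0, 4.0
target = gennorm(beta=p, scale=c2 ** (-1.0 / p))   # density proportional to exp(-c2 |x|^p)

def simulate_path(T=200.0, dt=1e-3, x0=0.0):
    """Euler-Maruyama discretization of dX = V'(X) dt + sqrt(2) dW."""
    n = int(T / dt)
    x = np.empty(n)
    x[0] = x0
    for k in range(1, n):
        drift = -c2 * p * np.abs(x[k - 1]) ** (p - 1) * np.sign(x[k - 1])
        x[k] = x[k - 1] + drift * dt + np.sqrt(2.0 * dt) * rng.normal()
    return x

def w1_to_target(samples, grid=2_000):
    """1D W_1 between the empirical measure of `samples` and `target`, via quantiles."""
    u = (np.arange(grid) + 0.5) / grid
    return np.mean(np.abs(np.quantile(samples, u) - target.ppf(u)))

path = simulate_path()
for frac in (0.1, 0.5, 1.0):              # occupation measure over the first frac*T time units
    print(frac, w1_to_target(path[: int(frac * len(path))]))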