
We present a large deviation principle for the entropy penalized Mather problem when the Lagrangian $L$ is generic (in this case the Mather measure $\mu$ is unique and the support of $\mu$ is the Aubry set). Consider, for each value of $\epsilon$ and $h$, the entropy penalized Mather problem $\min\{\int_{\mathbb{T}^N\times\mathbb{R}^N} L(x,v)\,d\mu(x,v)+\epsilon S[\mu]\}$, where the entropy $S$ is given by $S[\mu]=\int_{\mathbb{T}^N\times\mathbb{R}^N}\mu(x,v)\ln\frac{\mu(x,v)}{\int_{\mathbb{R}^N}\mu(x,w)\,dw}\,dx\,dv$, and the minimization is performed over the space of probability densities $\mu(x,v)$ that satisfy the holonomy constraint. It follows from results of D. Gomes and E. Valdinoci that there exists a minimizing measure $\mu_{\epsilon,h}$ which converges to the Mather measure $\mu$. We prove a large deviation principle for $\lim_{\epsilon,h\to 0}\epsilon\ln\mu_{\epsilon,h}(A)$, where $A\subset\mathbb{T}^N\times\mathbb{R}^N$. The deviation function $I$ is given by $I(x,v)=L(x,v)+\nabla\phi_0(x)\cdot v-\bar{H}_0$, where $\phi_0$ is the unique viscosity solution for $L$.
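To see why $I$ is a natural deviation function, note that under the sign convention assumed here, in which $\phi_0$ solves the cell problem $\bar{H}_0=\min_{v\in\mathbb{R}^N}\{L(x,v)+\nabla\phi_0(x)\cdot v\}$ at its points of differentiability (the precise convention follows the paper), one immediately obtains

$$I(x,v)=L(x,v)+\nabla\phi_0(x)\cdot v-\bar{H}_0\;\ge\;0,$$

with equality exactly at the minimizing velocities; in particular, under this convention $I$ vanishes on the support of the Mather measure $\mu$, which is consistent with the large deviation statement above.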
In this paper we present an upper bound for the decay of correlations of the stationary stochastic process associated with the Entropy Penalized Method. Let $L(x,v):\mathbb{T}^N\times\mathbb{R}^N\to\mathbb{R}$ be a Lagrangian of the form $L(x,v)=\frac{1}{2}|v|^2-U(x)+\langle P,v\rangle$. For each value of $\epsilon$ and $h$, consider the operator $\mathcal{G}[\phi](x):=-\epsilon h\ln\left[\int_{\mathbb{R}^N}e^{-\frac{hL(x,v)+\phi(x+hv)}{\epsilon h}}\,dv\right]$, as well as the reversed operator $\bar{\mathcal{G}}[\phi](x):=-\epsilon h\ln\left[\int_{\mathbb{R}^N}e^{-\frac{hL(x+hv,-v)+\phi(x+hv)}{\epsilon h}}\,dv\right]$, both acting on continuous functions $\phi:\mathbb{T}^N\to\mathbb{R}$. Denote by $\phi_{\epsilon,h}$ the solution of $\mathcal{G}[\phi_{\epsilon,h}]=\phi_{\epsilon,h}+\lambda_{\epsilon,h}$, and by $\bar\phi_{\epsilon,h}$ the solution of $\bar{\mathcal{G}}[\bar\phi_{\epsilon,h}]=\bar\phi_{\epsilon,h}+\lambda_{\epsilon,h}$. In order to analyze the decay of correlations of this process, we show that the operator $\mathcal{L}(\phi)(x)=\int e^{-\frac{hL(x,v)}{\epsilon}}\phi(x+hv)\,dv$ has a maximal eigenvalue isolated from the rest of the spectrum.
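As a concrete illustration of how the additive eigenvalue problem $\mathcal{G}[\phi_{\epsilon,h}]=\phi_{\epsilon,h}+\lambda_{\epsilon,h}$ can be approximated numerically, the following Python sketch discretizes the operator $\mathcal{G}$ on a one-dimensional torus and runs a fixed-point iteration with a normalization step. The potential $U(x)=\cos(2\pi x)$, the choice $P=0$, the grid parameters, and the iteration scheme are assumptions made for this sketch only; they are not taken from the paper, whose analysis rests on the spectral properties of the operator $\mathcal{L}$.

import numpy as np

# Illustrative 1-D discretization of the entropy-penalized operator
#   G[phi](x) = -eps*h * ln( integral over v of exp(-(h*L(x,v) + phi(x+h*v)) / (eps*h)) dv ),
# with L(x,v) = |v|^2/2 - U(x) + P*v, on the torus [0, 1).
# The potential U, the value of P, the grids, and the iteration below are
# assumptions of this sketch, not the construction used in the paper.

eps, h, P = 0.1, 0.1, 0.0
nx, nv, vmax = 200, 401, 6.0

x = np.linspace(0.0, 1.0, nx, endpoint=False)
v = np.linspace(-vmax, vmax, nv)
dv = v[1] - v[0]

def U(xx):
    # sample periodic potential (an assumption of this sketch)
    return np.cos(2.0 * np.pi * xx)

def L(xx, vv):
    return 0.5 * vv**2 - U(xx) + P * vv

def G(phi):
    # Apply the discretized soft-min operator G to a grid function phi.
    out = np.empty_like(phi)
    for i, xi in enumerate(x):
        shifted = np.interp((xi + h * v) % 1.0, x, phi, period=1.0)  # phi(x + h*v), periodic
        a = -(h * L(xi, v) + shifted) / (eps * h)
        amax = a.max()  # log-sum-exp stabilization
        out[i] = -eps * h * (amax + np.log(np.sum(np.exp(a - amax)) * dv))
    return out

# Fixed-point iteration with a normalization step for G[phi] = phi + lambda
# (one standard scheme for additive eigenvalue problems; convergence is assumed here).
phi = np.zeros(nx)
lam = 0.0
for _ in range(500):
    Gphi = G(phi)
    lam = np.mean(Gphi - phi)          # current estimate of lambda_{eps,h}
    phi_new = Gphi - lam
    phi_new -= phi_new.mean()          # fix the additive constant in phi
    if np.max(np.abs(phi_new - phi)) < 1e-9:
        phi = phi_new
        break
    phi = phi_new

print("grid approximation of lambda_{eps,h}:", lam)

The log-sum-exp stabilization inside G is the standard way to evaluate such soft-min integrals without overflow, and the normalization by the mean simply fixes the additive constant in $\phi_{\epsilon,h}$, which is only determined up to constants.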
