
Flows, currents, and cycles for Markov Chains: large deviation asymptotics

Published by: Alessandra Faggionato
Publication date: 2014
Research field: Physics
Research language: English





We consider a continuous time Markov chain on a countable state space. We prove a joint large deviation principle (LDP) of the empirical measure and current in the limit of a large time interval. The proof is based on results on the joint large deviations of the empirical measure and flow obtained in [BFG]. By improving such results we also show, under additional assumptions, that the LDP holds with the strong L^1 topology on the space of currents. We deduce a general version of the Gallavotti-Cohen (GC) symmetry for the current field and show that it implies the so-called fluctuation theorem for the GC functional. We also analyze the large deviation properties of generalized empirical currents associated to a fundamental basis in the cycle space, which, as we show, are given by the first class homological coefficients in the graph underlying the Markov chain. Finally, we discuss some examples in detail.
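For orientation, the central objects can be sketched as follows (these are the standard definitions in this literature; the paper's precise conventions may differ). For a chain X_t with jump rates r(y,z), the empirical measure, empirical flow, and empirical current over a time window [0,T] are

    \mu_T(y) = \frac{1}{T}\int_0^T \mathbf{1}\{X_t = y\}\,dt, \qquad
    Q_T(y,z) = \frac{1}{T}\,\#\{\text{jumps } y \to z \text{ in } [0,T]\}, \qquad
    J_T(y,z) = Q_T(y,z) - Q_T(z,y).

The Gallavotti-Cohen functional is the accumulated log-ratio of forward and backward rates along the trajectory, W_T = \sum_{\text{jumps } y \to z} \log\frac{r(y,z)}{r(z,y)}, and the fluctuation theorem mentioned above is, in its usual form, the symmetry I(-w) = I(w) + w for the rate function I of W_T/T, i.e. P(W_T/T \approx w)\,/\,P(W_T/T \approx -w) \asymp e^{Tw}.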


Read also

We recover the Donsker-Varadhan large deviations principle (LDP) for the empirical measure of a continuous time Markov chain on a countable (finite or infinite) state space from the joint LDP for the empirical measure and the empirical flow proved in [2].
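In sketch form (the symbol I below is an assumed name for the joint rate function, not notation taken from the paper): if (\mu_T, Q_T) satisfies an LDP with joint rate function I(\mu, Q), the contraction principle gives an LDP for \mu_T alone with rate function

    I_{DV}(\mu) = \inf_{Q} I(\mu, Q),

and recovering the Donsker-Varadhan result amounts to identifying this infimum with the classical variational formula I_{DV}(\mu) = \sup_{u > 0} \big( -\sum_x \mu(x)\,\tfrac{Lu(x)}{u(x)} \big), where L is the generator of the chain.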
C. Landim, 2018
We review recent results on the metastable behavior of continuous-time Markov chains derived through the characterization of Markov chains as unique solutions of martingale problems.
Our purpose is to prove a central limit theorem for countable nonhomogeneous Markov chains under the condition of uniform convergence of the transition probability matrices in the Cesàro sense. Furthermore, we obtain a corresponding moderate deviation theorem by means of the Gärtner-Ellis theorem and the exponential equivalence method.
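For reference, the Gärtner-Ellis route can be summarized as follows (a generic statement of the theorem, not the paper's specific hypotheses): if the scaled cumulant generating function

    \Lambda(\lambda) = \lim_{n \to \infty} \frac{1}{n} \log \mathbb{E}\big[e^{\lambda S_n}\big]

exists, is finite, and is differentiable (plus the usual regularity conditions), then S_n/n satisfies an LDP with rate function given by the Legendre-Fenchel transform I(x) = \sup_{\lambda} \{\lambda x - \Lambda(\lambda)\}. Moderate deviations follow by applying this at an intermediate scaling, and exponential equivalence transfers the LDP between processes whose difference is superexponentially small.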
We improve upon all known lower bounds on the critical fugacity and critical density of the hard sphere model in dimensions two and higher. As the dimension tends to infinity our improvements are by factors of $2$ and $1.7$, respectively. We make these improvements by utilizing techniques from theoretical computer science to show that a certain Markov chain for sampling from the hard sphere model mixes rapidly at low enough fugacities. We then prove an equivalence between optimal spatial and temporal mixing for hard spheres to deduce our results.
Computing the stationary distributions of a continuous-time Markov chain (CTMC) involves solving a set of linear equations. In most cases of interest, the number of equations is infinite or too large, and the equations cannot be solved analytically or numerically. Several approximation schemes overcome this issue by truncating the state space to a manageable size. In this review, we first give a comprehensive theoretical account of the stationary distributions and their relation to the long-term behaviour of CTMCs that is readily accessible to non-experts and free of the irreducibility assumptions made in standard texts. We then review truncation-based approximation schemes for CTMCs with infinite state spaces, paying particular attention to the schemes' convergence and the errors they introduce, and we illustrate their performance with an example of a stochastic reaction network of relevance in biology and chemistry. We conclude by discussing computational trade-offs associated with error control and several open questions.
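To make the truncation idea concrete, here is a minimal sketch in Python (the birth-death rates, the M/M/1 illustration, and the truncation size are illustrative assumptions, not a scheme from the review): truncate the generator to the first n states and solve pi Q = 0 together with the normalization sum(pi) = 1.

    import numpy as np

    def truncated_stationary(rate_up, rate_down, n):
        """Approximate the stationary distribution of a birth-death CTMC
        by truncating its generator to the states {0, ..., n-1}.

        rate_up(i): jump rate i -> i+1; rate_down(i): jump rate i -> i-1.
        """
        Q = np.zeros((n, n))
        for i in range(n):
            if i + 1 < n:
                Q[i, i + 1] = rate_up(i)
            if i - 1 >= 0:
                Q[i, i - 1] = rate_down(i)
            Q[i, i] = -Q[i].sum()          # rows of a generator sum to zero

        # Solve pi Q = 0 together with sum(pi) = 1 by stacking the
        # normalization constraint and using least squares.
        A = np.vstack([Q.T, np.ones(n)])
        b = np.zeros(n + 1)
        b[-1] = 1.0
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pi

    # Illustration: M/M/1 queue with arrival rate 0.8 and service rate 1.0;
    # the exact stationary law is geometric, pi_k = 0.2 * 0.8**k.
    pi = truncated_stationary(lambda i: 0.8, lambda i: 1.0, n=50)
    print(pi[:5])

For this M/M/1 example the truncated answer should be close to the geometric law pi_k = 0.2 * 0.8^k; quantifying the error introduced by such truncations is precisely what the reviewed schemes address.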