
Statistics of transitions for Markov chains with periodic forcing

Published by Damien Landon
Publication date: 2013
Language: English
Author: Samuel Herrmann





The influence of a time-periodic forcing on a stochastic process shows up essentially in the large-time behaviour of its paths. The statistics of transitions in a simple Markov chain model make it possible to quantify this influence. In particular, the first Floquet multiplier of the associated generating function can be computed explicitly and related to the equilibrium probability measure of an associated process in higher dimension. An application to stochastic resonance is presented.
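As a rough illustration of the kind of quantity involved (not the authors' construction), the Python sketch below builds the monodromy matrix of a two-state Markov chain whose jump rates are modulated with period T, tilts the off-diagonal rates by a parameter s so that the time-ordered propagator tracks the generating function of the number of transitions, and reads off the Floquet multipliers as its eigenvalues. The rate functions, the period and the tilt parameter are illustrative assumptions.

import numpy as np

T_period = 1.0        # forcing period (assumed)
s = 0.9               # tilt parameter of the generating function (assumed)
n_steps = 4000        # time discretisation of one period

def tilted_generator(t, s):
    # Two-state generator with periodically modulated jump rates; the
    # off-diagonal entries are multiplied by s so that the time-ordered
    # propagator tracks E[s^(number of transitions)]. Rates are illustrative.
    a = 1.0 + 0.5 * np.sin(2.0 * np.pi * t / T_period)   # rate 0 -> 1
    b = 1.5 + 0.5 * np.cos(2.0 * np.pi * t / T_period)   # rate 1 -> 0
    return np.array([[-a, s * a],
                     [s * b, -b]])

# Monodromy matrix over one period, built from short-time Euler propagators.
dt = T_period / n_steps
M = np.eye(2)
for k in range(n_steps):
    M = M @ (np.eye(2) + tilted_generator(k * dt, s) * dt)

# The eigenvalues of M are the Floquet multipliers of the tilted dynamics; the
# largest one plays the role of the first Floquet multiplier of the generating
# function of the number of transitions over one period.
print(np.sort(np.linalg.eigvals(M).real)[::-1])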




Read also

We prove a new concentration inequality for U-statistics of order two for uniformly ergodic Markov chains. Working with bounded and $\pi$-canonical kernels, we show that we can recover the convergence rate of Arcones and Giné, who proved a concentration result for U-statistics of independent random variables and canonical kernels. Our result allows the kernels $h_{i,j}$ to depend on the indices in the sums, which prevents the use of standard blocking tools. Our proof relies on an inductive analysis in which we use martingale techniques, uniform ergodicity, Nummelin splitting and a Bernstein-type inequality. Assuming further that the Markov chain starts from its invariant distribution, we prove a Bernstein-type concentration inequality that provides a sharper convergence rate for small variance terms.
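As a minimal sketch of the object bounded in the preceding abstract (not its proof or constants), the code below simulates a two-state, hence uniformly ergodic, Markov chain and evaluates an order-two U-statistic with a bounded kernel centred under the stationary distribution in each argument. The transition matrix, kernel and sample size are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Small two-state chain (hence uniformly ergodic); transition matrix assumed.
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])
n = 2000

# Stationary distribution pi of P (left Perron eigenvector), used to centre
# the kernel so that it is pi-canonical (degenerate) in each argument.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()
m = pi @ np.arange(2)            # stationary mean of the state

def h(x, y):
    # Bounded, symmetric product kernel, centred under pi in each argument.
    return (x - m) * (y - m)

# Simulate one trajectory X_0, ..., X_{n-1}.
X = np.empty(n, dtype=int)
X[0] = 0
for k in range(1, n):
    X[k] = rng.choice(2, p=P[X[k - 1]])

# Order-two U-statistic: average of h(X_i, X_j) over pairs i < j.
H = h(X[:, None], X[None, :])
U = np.triu(H, k=1).sum() / (n * (n - 1) / 2)
print(U)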
When dealing with finite Markov chains in discrete time, the focus often lies on convergence behaviour, and one tries to make different copies of the chain meet as fast as possible and then stick together. There is, however, a very peculiar kind of finite discrete-time Markov chain for which two copies started in different states can be coupled so as to meet almost surely in finite time, yet their distributions keep a total variation distance bounded away from 0, even in the limit as time goes to infinity. We show that the supremum of the total variation distance that can be maintained in this context is $\frac{1}{2}$.
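For orientation only, here is a minimal sketch of the quantity discussed above: the total variation distance between the time-t laws of two copies of a chain started in different states, computed for a generic ergodic three-state chain (for which, unlike the peculiar chains of the paper, the distance tends to 0). The transition matrix is an arbitrary assumption.

import numpy as np

# Generic ergodic three-state chain; for this example the distance decays to 0.
P = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.5, 0.5],
              [0.3, 0.0, 0.7]])

mu = np.array([1.0, 0.0, 0.0])   # law of the copy started in state 0
nu = np.array([0.0, 0.0, 1.0])   # law of the copy started in state 2

for t in range(0, 60, 10):
    tv = 0.5 * np.abs(mu - nu).sum()
    print(f"t = {t:2d}   total variation distance = {tv:.4f}")
    for _ in range(10):
        mu, nu = mu @ P, nu @ P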
C. Landim (2018)
We review recent results on the metastable behavior of continuous-time Markov chains derived through the characterization of Markov chains as unique solutions of martingale problems.
We introduce the space of virtual Markov chains (VMCs) as a projective limit of the spaces of all finite-state-space Markov chains (MCs), in the same way that the space of virtual permutations is the projective limit of the spaces of all permutations of finite sets. We introduce the notions of a virtual initial distribution (VID) and a virtual transition matrix (VTM), and we show that the law of any VMC is uniquely characterized by a VID-VTM pair satisfying a certain compatibility condition. Lastly, we study various properties of compact convex sets associated to the theory of VMCs, including the fact that the Birkhoff-von Neumann theorem fails in the virtual setting.
We formulate some simple conditions under which a Markov chain may be approximated by the solution to a differential equation, with quantifiable error probabilities. The role of a choice of coordinate functions for the Markov chain is emphasised. The general theory is illustrated in three examples: the classical stochastic epidemic, a population process model with fast and slow variables, and core-finding algorithms for large random hypergraphs.
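As a hedged illustration of such a fluid-limit approximation (the model, rates and the crude Poisson/Euler discretisation below are assumptions, not the paper's examples or error bounds), the sketch compares the scaled infective count of a simple stochastic SIR epidemic chain with the solution of its limiting differential equation.

import numpy as np

rng = np.random.default_rng(1)

# Illustrative stochastic SIR epidemic on N individuals (parameters assumed).
N = 1000
beta, gamma = 1.5, 1.0       # infection and recovery rates (assumed)
S, I = N - 10, 10            # initial susceptibles and infectives
dt, T = 0.01, 10.0
steps = int(T / dt)

sim = np.zeros(steps)        # scaled infectives from the Markov chain
ode = np.zeros(steps)        # deterministic (ODE) approximation, Euler scheme
s, i = S / N, I / N

for k in range(steps):
    sim[k], ode[k] = I / N, i
    # Chain: approximate the number of events on [t, t + dt] by Poisson counts.
    infections = min(rng.poisson(beta * S * I / N * dt), S)
    recoveries = min(rng.poisson(gamma * I * dt), I)
    S, I = S - infections, I + infections - recoveries
    # Fluid limit: ds/dt = -beta*s*i, di/dt = beta*s*i - gamma*i.
    s, i = s - beta * s * i * dt, i + (beta * s * i - gamma * i) * dt

print("max |chain - ODE| over the horizon:", np.abs(sim - ode).max())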