
Statistics of transitions for Markov chains with periodic forcing

Added by Damien Landon · Publication date: 2013 · Language: English





The influence of a time-periodic forcing on stochastic processes is essentially visible in the large-time behaviour of their paths. The statistics of transitions in a simple Markov chain model make it possible to quantify this influence. In particular, the first Floquet multiplier of the associated generating function can be computed explicitly and related to the equilibrium probability measure of an associated process in higher dimension. An application to stochastic resonance is presented.
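
As a purely illustrative sketch (not the model or the computation from the paper), the following simulates a two-state Markov chain whose switching probabilities are modulated by a sinusoidal forcing and records the phase at which transitions occur; all parameter names and values (T_period, eps, amp) are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state chain with periodically modulated switching
# probabilities (illustrative parameters, not the paper's exact model).
T_period = 50          # forcing period, in time steps
eps = 0.05             # baseline switching probability
amp = 0.04             # modulation amplitude (keeps probabilities in (0, 1))
n_steps = 200_000

def switch_prob(t, state):
    # The forcing favours 0 -> 1 and 1 -> 0 transitions at opposite phases.
    phase = 2 * np.pi * t / T_period
    sign = 1.0 if state == 0 else -1.0
    return eps + sign * amp * np.sin(phase)

state = 0
transition_times = []
for t in range(n_steps):
    if rng.random() < switch_prob(t, state):
        state = 1 - state
        transition_times.append(t)

# Statistics of transitions: how the switches distribute over the forcing phase.
phases = np.array(transition_times) % T_period
hist, _ = np.histogram(phases, bins=10, range=(0, T_period))
print("transitions per phase bin:", hist)
print("mean transitions per forcing period:",
      len(transition_times) / (n_steps / T_period))
```

In this toy setting the phase histogram concentrates around the phases where the forcing makes switching most likely, which is the kind of transition statistics the abstract refers to; the Floquet-multiplier computation itself is not reproduced here.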




Related research

We prove a new concentration inequality for U-statistics of order two for uniformly ergodic Markov chains. Working with bounded and $\pi$-canonical kernels, we show that we can recover the convergence rate of Arcones and Giné, who proved a concentration result for U-statistics of independent random variables and canonical kernels. Our result allows for a dependence of the kernels $h_{i,j}$ on the indices in the sums, which prevents the use of standard blocking tools. Our proof relies on an inductive analysis where we use martingale techniques, uniform ergodicity, Nummelin splitting and a Bernstein-type inequality. Assuming further that the Markov chain starts from its invariant distribution, we prove a Bernstein-type concentration inequality that provides a sharper convergence rate for small variance terms.
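
As a hedged illustration of the object studied (not taken from the paper), the sketch below computes an order-two U-statistic with a bounded kernel along a trajectory of a small, uniformly ergodic chain; the transition matrix and the kernel h are assumptions chosen for the example, and this particular kernel is not centred ($\pi$-canonical) as the theorem requires.

```python
import numpy as np

rng = np.random.default_rng(1)

# A small uniformly ergodic chain on {0, 1, 2} (hypothetical transition matrix).
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

def sample_chain(n, x0=0):
    xs = np.empty(n, dtype=int)
    xs[0] = x0
    for i in range(1, n):
        xs[i] = rng.choice(3, p=P[xs[i - 1]])
    return xs

def u_statistic(xs, h):
    # Order-two U-statistic: average of h(X_i, X_j) over all pairs i != j.
    n = len(xs)
    total = 0.0
    for i in range(n):
        for j in range(n):
            if i != j:
                total += h(xs[i], xs[j])
    return total / (n * (n - 1))

# Bounded stand-in kernel; the concentration result concerns how this
# quantity fluctuates around its mean as the trajectory length grows.
h = lambda x, y: np.cos(x - y)

xs = sample_chain(400)
print("order-two U-statistic along the chain:", u_statistic(xs, h))
```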
When dealing with finite Markov chains in discrete time, the focus often lies on convergence behaviour: one tries to make different copies of the chain meet as fast as possible and then stick together. There is, however, a very peculiar kind of discrete finite Markov chain for which two copies started in different states can be coupled to meet almost surely in finite time, yet their distributions keep a total variation distance bounded away from 0, even in the limit as time goes to infinity. We show that the supremum of the total variation distance that can be maintained in this context is $\frac{1}{2}$.
C. Landim, 2018
We review recent results on the metastable behavior of continuous-time Markov chains derived through the characterization of Markov chains as unique solutions of martingale problems.
We introduce the space of virtual Markov chains (VMCs) as a projective limit of the spaces of all finite state space Markov chains (MCs), in the same way that the space of virtual permutations is the projective limit of the spaces of all permutations of finite sets. We introduce the notions of a virtual initial distribution (VID) and a virtual transition matrix (VTM), and we show that the law of any VMC is uniquely characterized by a pair of a VID and a VTM satisfying a certain compatibility condition. Lastly, we study various properties of compact convex sets associated with the theory of VMCs, including the fact that the Birkhoff-von Neumann theorem fails in the virtual setting.
We formulate some simple conditions under which a Markov chain may be approximated by the solution to a differential equation, with quantifiable error probabilities. The role of a choice of coordinate functions for the Markov chain is emphasised. The general theory is illustrated in three examples: the classical stochastic epidemic, a population process model with fast and slow variables, and core-finding algorithms for large random hypergraphs.
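
As a rough illustration of this differential-equation approximation (assumed parameters, not the paper's construction), the sketch below simulates the classical stochastic SIR epidemic as a Markov jump chain and compares its final susceptible fraction with a simple Euler integration of the limiting ODE in the density coordinates (s, i) = (S/N, I/N).

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative sketch with assumed parameters: the classical stochastic
# epidemic as a Markov jump chain, next to its deterministic approximation.
N = 10_000            # population size
beta, gamma = 1.5, 0.5
S, I = N - 10, 10
t = 0.0

# Gillespie-style simulation of the jump chain: infections at rate
# beta*S*I/N, recoveries at rate gamma*I.
while I > 0 and t < 25.0:
    rate_inf = beta * S * I / N
    rate_rec = gamma * I
    total = rate_inf + rate_rec
    t += rng.exponential(1.0 / total)
    if rng.random() < rate_inf / total:
        S, I = S - 1, I + 1
    else:
        I -= 1

print("stochastic final susceptible fraction:", S / N)

# Deterministic approximation: s' = -beta*s*i, i' = beta*s*i - gamma*i,
# integrated with an explicit Euler scheme.
s, i, dt = (N - 10) / N, 10 / N, 1e-3
for _ in range(int(25.0 / dt)):
    s, i = s - dt * beta * s * i, i + dt * (beta * s * i - gamma * i)

print("ODE final susceptible fraction:", s)
```

For a large population the two final fractions should be close, which is the kind of quantifiable approximation the abstract describes.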