
Almost triangular Markov chains on $\mathbb{N}$

Posted by Jean-Francois Marckert
Publication date: 2021
Research language: English





A transition matrix $[U_{i,j}]_{i,j\geq 0}$ on $\mathbb{N}$ is said to be almost upper triangular if $U_{i,j}>0\Rightarrow j\geq i-1$, so that the increments of the corresponding Markov chains are at least $-1$; a transition matrix $[L_{i,j}]_{i,j\geq 0}$ is said to be almost lower triangular if $L_{i,j}>0\Rightarrow j\leq i+1$, so that the increments of the corresponding Markov chains are at most $+1$. In the present paper, we characterize recurrence, positive recurrence and the invariant distribution for the class of almost triangular transition matrices. The upper case turns out to be the simplest in many ways, with existence and uniqueness of invariant measures, whereas in the lower case neither existence nor uniqueness is guaranteed. We present the time-reversal connection between almost upper and almost lower triangular transition matrices, which provides classes of integrable lower triangular transition matrices. These results encompass the case of birth and death processes (BDP), famous Markov chains (or processes) taking their values in $\mathbb{N}$, which are simultaneously almost upper and almost lower triangular, and whose study was initiated by Karlin & McGregor in the 1950s. They found invariant measures and criteria for recurrence and null recurrence, among others; their approach relies on profound connections they discovered between the theory of BDP, the spectral properties of their transition matrices, the moment problem, and the theory of orthogonal polynomials. Our approach is mainly combinatorial and uses elementary algebraic methods; it is somewhat more direct and does not use the same tools.
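For illustration, a birth and death chain on $\mathbb{N}$ with birth probabilities $p_i$, death probabilities $q_i$ and holding probabilities $r_i$ (with $p_i+q_i+r_i=1$ and $q_0=0$; this notation is assumed here for the sketch and is not taken from the paper) has the tridiagonal transition matrix
$$P=\begin{pmatrix} r_0 & p_0 & 0 & 0 & \cdots\\ q_1 & r_1 & p_1 & 0 & \cdots\\ 0 & q_2 & r_2 & p_2 & \cdots\\ \vdots & & \ddots & \ddots & \ddots \end{pmatrix},$$
so that $P_{i,j}>0\Rightarrow |j-i|\leq 1$, making $P$ simultaneously almost upper and almost lower triangular. In this special case the classical invariant measure $\pi_i=\pi_0\prod_{k=1}^{i} p_{k-1}/q_k$ (a standard BDP fact, not specific to this paper's method) gives a concrete instance of the invariant measures characterized above.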


Read also

When dealing with finite Markov chains in discrete time, the focus often lies on convergence behavior, and one tries to make different copies of the chain meet as fast as possible and then stick together. There is, however, a very peculiar kind of discrete finite Markov chain for which two copies started in different states can be coupled to meet almost surely in finite time, yet their distributions keep a total variation distance bounded away from 0, even in the limit as time goes to infinity. We show that the supremum of the total variation distance that can be maintained in this context is $\frac{1}{2}$.
C. Landim, 2018
We review recent results on the metastable behavior of continuous-time Markov chains derived through the characterization of Markov chains as unique solutions of martingale problems.
We introduce the space of virtual Markov chains (VMCs) as a projective limit of the spaces of all finite state space Markov chains (MCs), in the same way that the space of virtual permutations is the projective limit of the spaces of all permutations of finite sets. We introduce the notions of a virtual initial distribution (VID) and a virtual transition matrix (VTM), and we show that the law of any VMC is uniquely characterized by a pair consisting of a VID and a VTM that satisfy a certain compatibility condition. Lastly, we study various properties of compact convex sets associated with the theory of VMCs, including the fact that the Birkhoff-von Neumann theorem fails in the virtual setting.
The aim of this paper is to develop a general theory for the class of skip-free Markov chains on a denumerable state space. This encompasses their potential theory via an explicit characterization of their potential kernel, expressed in terms of a family of fundamental excessive functions defined by means of the theory of the Martin boundary. We also describe their fluctuation theory, generalizing the celebrated fluctuation identities that were obtained by using the Wiener-Hopf factorization for the specific case of skip-free random walks. We proceed by resorting to the concept of similarity to identify the class of skip-free Markov chains whose transition operator has only real and simple eigenvalues. We manage to find a set of sufficient and easy-to-check conditions on the one-step transition probabilities for a Markov chain to belong to this class. We also study several properties of this class, including their spectral expansions given in terms of a Riesz basis, derive a necessary and sufficient condition for this class to exhibit a separation cutoff, and give a tighter bound on the convergence rate to stationarity than existing results.
G. Morvai, B. Weiss, 2007
We describe estimators $\chi_n(X_0,X_1,\ldots,X_n)$ which, when applied to an unknown stationary process taking values in a countable alphabet ${\cal X}$, converge almost surely to $k$ if the process is a $k$-th order Markov chain and to infinity otherwise.