We prove a new concentration inequality for U-statistics of order two for uniformly ergodic Markov chains. Working with bounded and $\pi$-canonical kernels, we show that we can recover the convergence rate of Arcones and Giné, who proved a concentration result for U-statistics of independent random variables and canonical kernels. Our result allows the kernels $h_{i,j}$ to depend on the indices in the sums, which prevents the use of standard blocking tools. Our proof relies on an inductive analysis where we use martingale techniques, uniform ergodicity, Nummelin splitting, and a Bernstein-type inequality. Assuming further that the Markov chain starts from its invariant distribution, we prove a Bernstein-type concentration inequality that provides a sharper convergence rate for small variance terms.
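For orientation, a sketch of the objects involved, with notation that is ours rather than the paper's: an order-two U-statistic of the chain $(X_i)$ with index-dependent kernels, and the $\pi$-canonical (degenerate) condition with respect to the invariant distribution $\pi$, read as
\[
U_n = \sum_{1 \le i < j \le n} h_{i,j}(X_i, X_j),
\qquad
\int h_{i,j}(x, y)\, \pi(\mathrm{d}y) = \int h_{i,j}(x, y)\, \pi(\mathrm{d}x) = 0 \quad \text{for all } x, y.
\]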
In this note, we present a version of Hoeffding's inequality in a continuous-time setting, where the data stream comes from a uniformly ergodic diffusion process. Similar to the well-studied case of Hoeffding's inequality for discrete-time uniformly ergodic Markov chains, …
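For comparison, the classical Hoeffding inequality for independent variables $Y_i \in [a_i, b_i]$ (a standard fact, stated here only as the discrete prototype being extended) reads
\[
\mathbb{P}\Big( \Big| \sum_{i=1}^{n} \big( Y_i - \mathbb{E} Y_i \big) \Big| \ge t \Big) \le 2 \exp\Big( - \frac{2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \Big), \qquad t > 0;
\]
the note's version replaces the independent sample by a data stream from a uniformly ergodic diffusion.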
We prove that moderate deviations for empirical measures of countable nonhomogeneous Markov chains hold under the assumption of uniform convergence of the transition probability matrices in the Cesàro sense.
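One common way to formalize this assumption, with notation that is ours and may differ from the paper's: writing $P_k$ for the $k$-th transition probability matrix and $P$ for a limiting stochastic matrix, uniform convergence in the Cesàro sense asks that
\[
\frac{1}{n} \sum_{k=1}^{n} P_k \longrightarrow P \qquad (n \to \infty),
\]
uniformly in the matrix entries.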
The influence of a time-periodic forcing on stochastic processes is essentially revealed in the large-time behaviour of their paths. The statistics of transitions in a simple Markov chain model make it possible to quantify this influence. In particular, …
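As a purely illustrative example of such a model, ours rather than the paper's: a two-state chain whose transition matrix is modulated with period $T$,
\[
P_t = \begin{pmatrix} 1 - a(t) & a(t) \\ b(t) & 1 - b(t) \end{pmatrix},
\qquad a(t) = a_0 + \varepsilon \sin(2\pi t / T), \quad b(t) = b_0 + \varepsilon \sin(2\pi t / T),
\]
with $0 < a(t), b(t) < 1$; over long runs the empirical transition counts pick up the period-$T$ forcing.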
Our purpose is to prove a central limit theorem for countable nonhomogeneous Markov chains under the condition of uniform convergence of the transition probability matrices in the Cesàro sense. Furthermore, we obtain a …
We describe estimators $\chi_n(X_0,X_1,\ldots,X_n)$ which, when applied to an unknown stationary process taking values in a countable alphabet $\mathcal{X}$, converge almost surely to $k$ if the process is a $k$-th order Markov chain and to infinity otherwise.
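The abstract does not give the construction of $\chi_n$. As a loose illustration of the broader task (order selection for a finite-alphabet Markov chain from data), here is a minimal BIC-style sketch in Python; all names are ours, and a BIC penalty requires a bounded candidate range, unlike the estimator described above.

# Illustrative BIC-style Markov order selection -- NOT the estimator chi_n from
# the abstract, whose construction is not given here. Assumes a finite alphabet
# and a bounded candidate range of orders, which the paper's estimator avoids.
from collections import Counter
from math import log

def log_likelihood(xs, m):
    """Maximized log-likelihood of xs under an order-m Markov model."""
    ctx = Counter()        # counts of length-m contexts
    trans = Counter()      # counts of (context, next symbol) pairs
    for i in range(m, len(xs)):
        c = tuple(xs[i - m:i])
        ctx[c] += 1
        trans[(c, xs[i])] += 1
    # Plug-in MLE: each transition probability is n / ctx[c]
    return sum(n * log(n / ctx[c]) for (c, _), n in trans.items())

def estimate_order(xs, max_order=5):
    """Pick the order minimizing a BIC-type penalized criterion."""
    alphabet = len(set(xs))
    size = len(xs)
    def bic(m):
        n_params = (alphabet ** m) * (alphabet - 1)
        return -2 * log_likelihood(xs, m) + n_params * log(size)
    return min(range(max_order + 1), key=bic)

# Example: a 1st-order chain alternating deterministically between 0 and 1.
print(estimate_order([0, 1] * 500))  # expected output: 1

On this alternating sequence the criterion selects order $1$; the paper's estimators are designed for the harder setting with no a priori bound on the order.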