
On estimating the memory for finitarily Markovian processes

Posted by Gusztav Morvai
Published: 2007
Research field: informatics engineering
Paper language: English





Finitarily Markovian processes are those processes $\{X_n\}_{n=-\infty}^{\infty}$ for which there is a finite $K$ ($K = K(\{X_n\}_{n=-\infty}^0)$) such that the conditional distribution of $X_1$ given the entire past is equal to the conditional distribution of $X_1$ given only $\{X_n\}_{n=1-K}^0$. The least such value of $K$ is called the memory length. We give a rather complete analysis of the problems of universally estimating the least such value of $K$, both in the backward sense that we have just described and in the forward sense, where one observes successive values of $\{X_n\}$ for $n \geq 0$ and asks for the least value $K$ such that the conditional distribution of $X_{n+1}$ given $\{X_i\}_{i=n-K+1}^n$ is the same as the conditional distribution of $X_{n+1}$ given $\{X_i\}_{i=-\infty}^n$. We allow for finite or countably infinite alphabet size.
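
As a concrete (if naive) illustration of the forward-sense problem, the sketch below estimates the memory length of a finite-alphabet process from a sample path by comparing empirical conditional distributions of increasing order against a deep-context reference. Everything here, including the names estimate_memory_length, the fixed horizon k_max, and the threshold tol, is an illustrative assumption; the paper constructs genuinely universal estimators, which this plug-in scheme is not.

    import numpy as np
    from collections import Counter, defaultdict

    def empirical_conditional(sample, k):
        # Empirical law of the next symbol given the preceding k symbols.
        counts = defaultdict(Counter)
        for i in range(k, len(sample)):
            counts[tuple(sample[i - k:i])][sample[i]] += 1
        return {ctx: {a: c / sum(ctr.values()) for a, c in ctr.items()}
                for ctx, ctr in counts.items()}

    def estimate_memory_length(sample, k_max=6, tol=0.05):
        # Smallest k whose order-k conditional law agrees (up to tol) with
        # the order-k_max law, the latter standing in for "the entire past".
        ref = empirical_conditional(sample, k_max)
        for k in range(k_max):
            coarse = empirical_conditional(sample, k)
            gap = 0.0
            for ctx, dist in ref.items():
                short = coarse.get(ctx[k_max - k:], {})
                gap = max(gap, max(abs(p - short.get(a, 0.0))
                                   for a, p in dist.items()))
            if gap < tol:
                return k
        return k_max

    # Toy usage: a binary chain of memory length 2 (X_n depends on X_{n-2} only).
    rng = np.random.default_rng(0)
    x = [0, 1]
    for _ in range(50_000):
        x.append(int(x[-2] ^ (rng.random() < 0.1)))
    print(estimate_memory_length(x))   # expect 2

The toy process also shows why comparing only adjacent orders would fail: its order-1 law coincides with its order-0 law, yet the memory length is 2.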




Read also

We prove several results concerning classifications, based on successive observations $(X_1, \ldots, X_n)$ of an unknown stationary and ergodic process, for membership in a given class of processes, such as the class of all finite order Markov chains.
This paper introduces the concept of random context representations for the transition probabilities of a finite-alphabet stochastic process. Processes with these representations generalize context tree processes (a.k.a. variable length Markov chains), and are proven to coincide with processes whose transition probabilities are almost surely continuous functions of the (infinite) past. This is similar to a classical result by Kalikow about continuous transition probabilities. Existence and uniqueness of a minimal random context representation are proven, and an estimator of the transition probabilities based on this representation is shown to have very good pastwise adaptivity properties. In particular, it achieves minimax performance, up to logarithmic factors, for binary renewal processes with bounded $2+\gamma$ moments.
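
The simplest special case, the context tree process mentioned above, is easy to picture: the transition law depends on a suffix of the past whose length varies with the past itself. A minimal hypothetical sketch follows; the tree and its probabilities are invented for illustration and are unrelated to the estimator of the paper.

    # A toy variable-length Markov chain over {0, 1}: the relevant context
    # is the shortest suffix of the past found in the tree.
    CONTEXTS = {
        (1,):   {0: 0.3, 1: 0.7},   # any past ending in 1
        (0, 0): {0: 0.8, 1: 0.2},   # past ending in 00
        (1, 0): {0: 0.5, 1: 0.5},   # past ending in 10
    }

    def transition(past):
        # Return P(X_{n+1} = . | past) by matching the shortest stored suffix.
        for k in range(1, len(past) + 1):
            suffix = tuple(past[-k:])
            if suffix in CONTEXTS:
                return CONTEXTS[suffix]
        raise ValueError("past not covered by the context tree")

    print(transition([0, 1, 0, 0]))   # matches context (0, 0): {0: 0.8, 1: 0.2}

Roughly speaking, in a random context representation the length of the relevant suffix is itself allowed to be random, which is what lets such representations capture every process with almost surely continuous transition probabilities.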
Yiming Ding, Xuyan Xiang (2016)
Long memory, or long range dependence, is an important phenomenon that may arise in the analysis of time series or spatial data. Most definitions of long memory for a stationary process $X = \{X_1, X_2, \cdots\}$ are based on the second-order properties of the process. The excess entropy of a stationary process is the sum of the redundancies, which is related to the rate of convergence of the conditional entropy $H(X_n \mid X_{n-1}, \cdots, X_1)$ to the entropy rate. It is proved that the excess entropy is identical to the mutual information between the past and the future when the entropy $H(X_1)$ is finite. We propose defining a stationary process to be long memory if its excess entropy is infinite. Since the definition of excess entropy requires only a very weak moment condition on the distribution of the process, it can be applied to processes whose distributions lack a bounded second moment. A significant property of excess entropy is that it is invariant under invertible transformations, which lets one obtain the excess entropy of a stationary process from that of another process. For stationary Gaussian processes, the excess entropy characterization of long memory relates well to the popular characterizations. It is proved that the excess entropy of fractional Gaussian noise is infinite if the Hurst parameter $H \in (1/2, 1)$.
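
In symbols, writing $h = \lim_{n \to \infty} H(X_n \mid X_{n-1}, \cdots, X_1)$ for the entropy rate, the quantity at the center of this abstract is

$$ E \,=\, \sum_{n=1}^{\infty} \big( H(X_n \mid X_{n-1}, \cdots, X_1) - h \big) \,=\, I\big( (X_n)_{n \le 0} \,;\, (X_n)_{n \ge 1} \big), $$

where the sum collects the redundancies and the second equality, identifying $E$ with the mutual information between past and future, holds whenever $H(X_1)$ is finite. The proposed definition then declares the process long memory exactly when $E = \infty$.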
Salvatore Micciché (2008)
In this paper we give explicit examples of power-law correlated stationary Markovian processes $y(t)$ whose stationary pdf shows tails which are Gaussian or exponential. These processes are obtained by simply performing a coordinate transformation of a specific power-law correlated additive process $x(t)$, already known in the literature, whose pdf shows power-law tails $1/x^a$. We give analytical and numerical evidence that although the new processes (i) are Markovian and (ii) have Gaussian or exponential tails, their autocorrelation function still shows a power-law decay $\langle y(t)\, y(t+T) \rangle \sim 1/T^b$, where $b$ grows with $a$ according to a law compatible with $b = a/2 - c$, $c$ being a numerical constant. When $a < 2(1+c)$ the process $y(t)$, although Markovian, is long-range correlated. Our results help clarify that, even in the context of Markovian processes, long-range dependencies are not necessarily associated with the occurrence of extreme events. Moreover, our results can be relevant for the modeling of complex systems with long memory. In fact, we provide simple processes associated with Langevin equations, thus showing that long-memory effects can be modeled in the context of continuous-time stationary Markovian processes.
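
The effect of such a coordinate transformation is easy to reproduce numerically: a monotone map of a stationary series reshapes its marginal pdf while leaving the temporal dependence structure intact. The sketch below applies a rank-based Gaussianizing transform to a toy heavy-tailed AR(1) surrogate; the surrogate is an assumption for illustration only, not the paper's $x(t)$, which are specific power-law correlated Langevin processes.

    import numpy as np
    from scipy.stats import norm

    def autocorr(x, lags):
        x = x - x.mean()
        v = (x ** 2).mean()
        return [(x[:-t] * x[t:]).mean() / v for t in lags]

    def gaussianize(x):
        # Monotone rank transform: keeps the dynamics, makes the marginal N(0, 1).
        ranks = np.argsort(np.argsort(x))
        return norm.ppf((ranks + 0.5) / len(x))

    rng = np.random.default_rng(1)
    n = 50_000
    x = np.zeros(n)
    for i in range(1, n):                     # AR(1) driven by Student-t noise:
        x[i] = 0.95 * x[i - 1] + rng.standard_t(df=3)   # heavy-tailed marginal

    y = gaussianize(x)                        # Gaussian marginal, same dynamics
    print(autocorr(x, [1, 10, 100]))
    print(autocorr(y, [1, 10, 100]))          # decay shape essentially preserved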
G. Morvai, B. Weiss (2008)
The problem of extracting as much information as possible from a sequence of observations of a stationary stochastic process $X_0, X_1, \ldots, X_n$ has been considered by many authors from different points of view. It has long been known, through the work of D. Bailey, that no universal estimator for $\textbf{P}(X_{n+1} \mid X_0, X_1, \ldots, X_n)$ can be found which converges to the true conditional probability almost surely. Despite this result, for restricted classes of processes, or for sequences of estimators along stopping times, universal estimators can be found. We present here a survey of some of the recent work that has been done along these lines.
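
For concreteness, a natural first attempt at estimating $\textbf{P}(X_{n+1} \mid X_0, \ldots, X_n)$ is the plug-in scheme sketched below, with a fixed context length k chosen purely for illustration; Bailey's result says that no scheme of any kind can converge almost surely for every stationary ergodic process, which is why the surveyed work restricts the process class, lets the context grow with $n$, or evaluates only along stopping times.

    from collections import Counter, defaultdict

    def plug_in_estimate(x, k, a):
        # Empirical P(X_{n+1} = a | last k observations), from x = (X_0, ..., X_n).
        counts, ctx_counts = defaultdict(Counter), Counter()
        for i in range(k, len(x)):
            ctx = tuple(x[i - k:i])
            counts[ctx][x[i]] += 1
            ctx_counts[ctx] += 1
        current = tuple(x[-k:])
        if ctx_counts[current] == 0:
            return None   # the current context has no earlier occurrences
        return counts[current][a] / ctx_counts[current]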