A binary renewal process is a stochastic process $\{X_n\}$ taking values in $\{0,1\}$ where the lengths of the runs of 1s between successive zeros are independent. After observing $\{X_0,X_1,\ldots,X_n\}$ one would like to predict the future behavior, and the problem of universal estimators is to do so without any prior knowledge of the distribution. We prove a variety of results of this type, including universal estimates for the expected time to renewal as well as estimates for the conditional distribution of the time to renewal. Some of our results require a moment condition on the time to renewal, and we show by an explicit construction how some moment condition is necessary.
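As a toy illustration only (not the estimator constructed in the paper), the following sketch simulates a binary renewal process under a hypothetical geometric run-length law and forms the naive plug-in estimate of the expected time to renewal, i.e. the average observed gap between successive zeros; the function name and the choice of run-length law are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_binary_renewal(n, run_length):
    """Generate n symbols of a binary renewal process: each 0 is
    followed by an independent run of 1s of length run_length()."""
    out = []
    while len(out) < n:
        out.append(0)
        out.extend([1] * run_length())
    return np.array(out[:n])

# Hypothetical run-length law: shifted-geometric number of 1s per run.
x = sample_binary_renewal(100_000, lambda: int(rng.geometric(0.3)) - 1)

# Naive plug-in estimate of the expected time to renewal: the mean gap
# between successive zeros observed in the sample.
zeros = np.flatnonzero(x == 0)
print("estimated mean time to renewal:", np.diff(zeros).mean())
```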
For a stationary stochastic process $\{X_n\}$ with values in some set $A$, a finite word $w \in A^K$ is called a memory word if the conditional probability of $X_0$ given the past is constant on the cylinder set defined by $X_{-K}^{-1}=w$. It is called a minimal memory word if no proper suffix of $w$ is also a memory word. For example, in a $K$-step Markov process all words of length $K$ are memory words, but they are not necessarily minimal. We consider the problem of determining the lengths of the longest minimal memory words and the shortest memory words of an unknown process $\{X_n\}$ based on sequentially observing the outputs of a single sample $\{\xi_1,\xi_2,\ldots,\xi_n\}$. We will give a universal estimator which converges almost surely to the length of the longest minimal memory word and show that no such universal estimator exists for the length of the shortest memory word. The alphabet $A$ may be finite or countable.
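To make the definition concrete, here is a minimal empirical sketch (not the paper's estimator) of how one might test whether words of a given length act as memory words: for a sample from a 1-step Markov chain, extending a length-1 context one step further into the past should barely change the empirical conditional distribution. The transition law `P`, the count threshold, and the helper `cond_freq` are all assumptions of this sketch.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)

# Sample a 1-step binary Markov chain; every word of length 1 is a
# memory word here, so the gaps printed below should be near zero.
P = {0: 0.9, 1: 0.2}          # hypothetical P(X_{t+1}=1 | X_t)
x = [0]
for _ in range(200_000):
    x.append(int(rng.random() < P[x[-1]]))

def cond_freq(seq, k, min_count=500):
    """Empirical P(next symbol = 1 | last k symbols), per context."""
    counts = defaultdict(lambda: [0, 0])
    for i in range(k, len(seq)):
        c = counts[tuple(seq[i-k:i])]
        c[0] += 1
        c[1] += seq[i]
    return {w: n1 / n for w, (n, n1) in counts.items() if n >= min_count}

# A word w is (empirically) a memory word if prepending one more past
# symbol leaves the conditional distribution essentially unchanged.
f1, f2 = cond_freq(x, 1), cond_freq(x, 2)
for w2, p in f2.items():
    print(w2, "suffix", w2[1:], "gap:", abs(p - f1[w2[1:]]))
```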
Finitarily Markovian processes are those processes $\{X_n\}_{n=-\infty}^{\infty}$ for which there is a finite $K$ ($K = K(\{X_n\}_{n=-\infty}^0)$) such that the conditional distribution of $X_1$ given the entire past is equal to the conditional distribution of $X_1$ given only $\{X_n\}_{n=1-K}^0$. The least such value of $K$ is called the memory length. We give a rather complete analysis of the problems of universally estimating the least such value of $K$, both in the backward sense that we have just described and in the forward sense, where one observes successive values of $\{X_n\}$ for $n \geq 0$ and asks for the least value $K$ such that the conditional distribution of $X_{n+1}$ given $\{X_i\}_{i=n-K+1}^n$ is the same as the conditional distribution of $X_{n+1}$ given $\{X_i\}_{i=-\infty}^n$. We allow for finite or countably infinite alphabet size.
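A crude plug-in version of forward memory-length estimation can be sketched as follows: report the smallest $k$ for which length-$k$ contexts predict the next symbol as well as length-$(k+1)$ contexts, up to a tolerance. This is an illustrative assumption-laden sketch, not the paper's universal scheme; the transition law, the tolerance `tol`, and the count threshold are all hypothetical.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(2)

# Sample a binary 2-step Markov chain (true memory length K = 2).
P = {(0, 0): 0.1, (0, 1): 0.6, (1, 0): 0.4, (1, 1): 0.9}
x = [0, 0]
for _ in range(300_000):
    x.append(int(rng.random() < P[(x[-2], x[-1])]))

def cond_freq(seq, k, min_count=1000):
    counts = defaultdict(lambda: [0, 0])
    for i in range(k, len(seq)):
        c = counts[tuple(seq[i-k:i])]
        c[0] += 1
        c[1] += seq[i]
    return {w: n1 / n for w, (n, n1) in counts.items() if n >= min_count}

def estimate_memory_length(seq, k_max=5, tol=0.05):
    """Smallest k whose length-k contexts predict (within tol) as well
    as length-(k+1) contexts; tol is an arbitrary illustrative choice."""
    for k in range(k_max):
        fk, fk1 = cond_freq(seq, k), cond_freq(seq, k + 1)
        # Rare suffixes dropped by min_count are skipped via the default.
        if all(abs(p - fk.get(w[1:], p)) < tol for w, p in fk1.items()):
            return k
    return k_max

print("estimated memory length:", estimate_memory_length(x))  # expect 2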
The forward estimation problem for stationary and ergodic time series $\{X_n\}_{n=0}^{\infty}$ taking values from a finite alphabet $\mathcal{X}$ is to estimate the probability that $X_{n+1}=x$ based on the observations $X_i$, $0\le i\le n$, without prior knowledge of the distribution of the process $\{X_n\}$. We present a simple procedure $g_n$ which is evaluated on the data segment $(X_0,\ldots,X_n)$ and for which $\mathrm{error}(n) = |g_n(x)-P(X_{n+1}=x \mid X_0,\ldots,X_n)| \to 0$ almost surely for a subclass of all stationary and ergodic time series, while for the full class the Cesàro average of the error tends to zero almost surely and, moreover, the error tends to zero in probability.
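In the spirit of such schemes (though not the paper's $g_n$), one can estimate the conditional probability by the empirical frequency of the symbol after earlier occurrences of the current context, with a context length that grows slowly with $n$. The slowly growing choice of $k$ below is an assumption of this sketch.

```python
import numpy as np

def g_n(seq, x, k):
    """Empirical estimate of P(X_{n+1}=x | last k symbols): frequency
    of x following earlier occurrences of the current length-k context."""
    ctx = seq[-k:] if k > 0 else []
    hits = follows = 0
    for i in range(len(seq) - k):
        if seq[i:i+k] == ctx:
            hits += 1
            follows += (seq[i+k] == x)
    return follows / hits if hits else 0.0

rng = np.random.default_rng(3)
seq = list((rng.random(50_000) < 0.5).astype(int))  # i.i.d. fair bits
n = len(seq)
k = max(1, int(np.log2(n) / 2))   # context length growing like (log n)/2
print("g_n estimate of P(next = 1):", g_n(seq, 1, k))  # should be near 0.5
```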
The forecasting problem for a stationary and ergodic binary time series $\{X_n\}_{n=0}^{\infty}$ is to estimate the probability that $X_{n+1}=1$ based on the observations $X_i$, $0\le i\le n$, without prior knowledge of the distribution of the process $\{X_n\}$. It is known that this is not possible if one estimates at all values of $n$. We present a simple procedure which will attempt to make such a prediction infinitely often at carefully selected stopping times chosen by the algorithm. We show that the proposed procedure is consistent under certain conditions, and we estimate the growth rate of the stopping times.
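The flavor of prediction along stopping times can be conveyed by a simple recurrence-based rule: predict only at times when the current context has already been visited often enough. The selection rule below (a fixed context length and a visit threshold) is an illustrative assumption, not the paper's stopping-time construction.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(4)
p = {0: 0.3, 1: 0.7}                      # hypothetical 1-step Markov law
seq = [0]
for _ in range(20_000):
    seq.append(int(rng.random() < p[seq[-1]]))

K, MIN_VISITS = 3, 50
counts = defaultdict(lambda: [0, 0])      # context -> [visits, ones next]
stopping_times, estimates = [], []
for n in range(K, len(seq) - 1):
    ctx = tuple(seq[n-K+1:n+1])
    visits, ones = counts[ctx]
    if visits >= MIN_VISITS:              # predict only at these times
        stopping_times.append(n)
        estimates.append(ones / visits)
    counts[ctx][0] += 1                   # update after predicting,
    counts[ctx][1] += seq[n+1]            # so no future data is used

print(len(stopping_times), "predictions; last estimate:", estimates[-1])
```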
We prove several results concerning classification, based on successive observations $(X_1,\ldots,X_n)$ of an unknown stationary and ergodic process, of whether the process belongs to a given class, such as the class of all finite-order Markov chains.
Bailey showed that the general pointwise forecasting problem for stationary and ergodic time series has a negative solution. However, it is known that for Markov chains the problem can be solved. Morvai showed that there is a stopping-time sequence $\{\lambda_n\}$ such that $P(X_{\lambda_n+1}=1 \mid X_0,\ldots,X_{\lambda_n})$ can be estimated from the samples $(X_0,\ldots,X_{\lambda_n})$ in such a way that the difference between the conditional probability and the estimate vanishes along these stopping times for all stationary and ergodic binary time series. We will show that it is not possible to estimate the above conditional probability along a stopping-time sequence for all stationary and ergodic binary time series in a pointwise sense in such a way that, if the time series turns out to be a Markov chain, the predictor will eventually predict for all $n$.
The forward prediction problem for a binary time series $\{X_n\}_{n=0}^{\infty}$ is to estimate the probability that $X_{n+1}=1$ based on the observations $X_i$, $0\le i\le n$, without prior knowledge of the distribution of the process $\{X_n\}$. It is known that this is not possible if one estimates at all values of $n$. We present a simple procedure which will attempt to make such a prediction infinitely often at carefully selected stopping times chosen by the algorithm. The growth rate of the stopping times is also exhibited.
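To see empirically why the growth rate of the stopping times matters, the sketch below records the times at which a recurrence-based rule (a hypothetical stand-in for the paper's scheme) is first willing to predict: with $2^K$ contexts and a visit threshold, roughly $2^K \cdot \texttt{MIN\_VISITS}$ symbols pass before the first stopping time.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(7)
bits = (rng.random(200_000) < 0.5).astype(int)   # i.i.d. fair coin

K, MIN_VISITS = 8, 20
visits = defaultdict(int)
stops = []
for n in range(K, len(bits)):
    ctx = tuple(bits[n-K:n])
    if visits[ctx] >= MIN_VISITS:   # rule is willing to predict here
        stops.append(n)
    visits[ctx] += 1

print("first stopping time:", stops[0], " total stops:", len(stops))
```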
Consider a stationary real-valued time series $\{X_n\}_{n=0}^{\infty}$ with a priori unknown distribution. The goal is to estimate the conditional expectation $E(X_{n+1} \mid X_0,\ldots,X_n)$ based on the observations $(X_0,\ldots,X_n)$ in a pointwise consistent way. It is well known that this is not possible at all values of $n$. We will estimate it along stopping times.
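For real-valued series, a natural illustrative device (again, an assumption-laden sketch rather than the paper's estimator) is to average the successors of past time points whose recent history is uniformly close to the current one, and to predict only when enough such matches exist; the match radius `EPS`, window length `K`, and match threshold are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical AR(1) series: E(X_{n+1} | past) = 0.8 * X_n.
x = np.zeros(50_000)
for t in range(1, len(x)):
    x[t] = 0.8 * x[t-1] + rng.normal(scale=0.5)

K, EPS, MIN_MATCHES = 2, 0.1, 30
n = len(x) - 1
ctx = x[n-K+1:n+1]
# Past positions whose length-K history is within EPS of ctx (sup norm).
past = np.array([i for i in range(K - 1, n)
                 if np.max(np.abs(x[i-K+1:i+1] - ctx)) < EPS])
if len(past) >= MIN_MATCHES:              # our "stopping time" condition
    est = x[past + 1].mean()              # average of matched successors
    print("estimate:", est, " target 0.8*X_n =", 0.8 * x[n])
else:
    print("too few matches; defer prediction to a later time")
```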
This paper considers estimation of a univariate density from an individual numerical sequence. It is assumed that (i) the limiting relative frequencies of the numerical sequence are governed by an unknown density, and (ii) there is a known upper bound for the variation of the density on an increasing sequence of intervals. A simple estimation scheme is proposed, and is shown to be $L_1$ consistent when (i) and (ii) apply. In addition, it is shown that there is no consistent estimation scheme for the set of individual sequences satisfying only condition (i).
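A minimal sketch of the kind of scheme involved, assuming a histogram estimator with bins that shrink as the data grows (the paper's actual scheme and its use of the variation bound are not reproduced here): under a variation bound on the true density, such piecewise-constant estimates can be $L_1$ consistent. The triangular test density and the $n^{1/3}$ bin count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(6)
# Individual sequence whose limiting relative frequencies follow a
# hypothetical triangular density f(x) = 2x on [0, 1].
xs = rng.triangular(0.0, 1.0, 1.0, size=100_000)

def histogram_density(data, a, b, n_bins):
    """Piecewise-constant density estimate on [a, b]."""
    counts, edges = np.histogram(data, bins=n_bins, range=(a, b))
    width = (b - a) / n_bins
    return counts / (len(data) * width), edges

n = len(xs)
dens, edges = histogram_density(xs, 0.0, 1.0, n_bins=int(n ** (1/3)))
mids = (edges[:-1] + edges[1:]) / 2
l1 = np.sum(np.abs(dens - 2 * mids)) * (edges[1] - edges[0])
print("empirical L1 error against the true density 2x:", l1)
```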