
The dimension of ergodic random sequences

Posted by: Mathieu Hoyrup
Publication date: 2011
Research field: Informatics engineering
Paper language: English
Author: Mathieu Hoyrup





Let $\mu$ be a computable ergodic shift-invariant measure over the Cantor space. Providing a constructive proof of the Shannon-McMillan-Breiman theorem, Vyugin proved that if a sequence $x$ is Martin-Löf random w.r.t. $\mu$, then the strong effective dimension $\mathrm{Dim}(x)$ of $x$ equals the entropy of $\mu$. Whether its effective dimension $\dim(x)$ also equals the entropy was left as an open question. In this paper we settle this problem, providing a positive answer. A key step in the proof consists in extending recent results on Birkhoff's ergodic theorem for Martin-Löf random sequences.
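For context, a standard formulation (not spelled out in the abstract) characterizes the two dimensions of a sequence $x$ via prefix-free Kolmogorov complexity $K$:

$$\dim(x) = \liminf_{n\to\infty} \frac{K(x_1 \dots x_n)}{n}, \qquad \mathrm{Dim}(x) = \limsup_{n\to\infty} \frac{K(x_1 \dots x_n)}{n},$$

while the Shannon-McMillan-Breiman theorem states that $-\frac{1}{n}\log\mu([x_1 \dots x_n]) \to h(\mu)$ for $\mu$-almost every $x$, where $h(\mu)$ is the entropy of $\mu$. The paper's result is that $\dim(x) = \mathrm{Dim}(x) = h(\mu)$ for every Martin-Löf random $x$, i.e. the liminf and limsup above coincide.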




Read also

Daniil Ryabko (2012)
In this work a method for statistical analysis of time series is proposed, which is used to obtain solutions to some classical problems of mathematical statistics under the only assumption that the process generating the data is stationary ergodic. Namely, three problems are considered: goodness-of-fit (or identity) testing, process classification, and the change point problem. For each of the problems a test is constructed that is asymptotically accurate when the data are generated by stationary ergodic processes. The tests are based on empirical estimates of distributional distance.
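The abstract does not define the distance; one common formulation for distributions $\rho_1, \rho_2$ of stationary processes fixes an enumeration $(B_k)_{k\ge 1}$ of all cylinder sets and summable positive weights $w_k$ (e.g. $w_k = 2^{-k}$), and sets

$$d(\rho_1, \rho_2) = \sum_{k=1}^{\infty} w_k \, \bigl|\rho_1(B_k) - \rho_2(B_k)\bigr|.$$

The empirical estimate replaces each $\rho_i(B_k)$ with the frequency of the pattern $B_k$ in the observed sample; for stationary ergodic processes these frequencies converge almost surely, which is what drives the asymptotic accuracy of the tests.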
We investigate the asymptotic rates of length-$n$ binary codes with VC-dimension at most $dn$ and minimum distance at least $\delta n$. Two upper bounds are obtained, one as a simple corollary of a result by Haussler and the other via a shortening approach combining the Sauer-Shelah lemma and the linear programming bound. Two lower bounds are given using Gilbert-Varshamov type arguments over constant-weight and Markov-type sets.
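For reference, the two classical ingredients named above take the following forms (standard statements, not reproduced in the abstract). The Sauer-Shelah lemma bounds any family $\mathcal{F} \subseteq \{0,1\}^n$ of VC-dimension at most $d$ by

$$|\mathcal{F}| \le \sum_{i=0}^{d} \binom{n}{i},$$

while a Gilbert-Varshamov type argument yields binary codes of minimum distance $\delta n$ and rate at least $1 - h(\delta)$, with $h$ the binary entropy function; the paper's bounds refine these under the joint VC-dimension and distance constraints.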
In 1959, Rényi proposed the information dimension and the $d$-dimensional entropy to measure the information content of general random variables. This paper proposes a generalization of information dimension to stochastic processes by defining the information dimension rate as the entropy rate of the uniformly-quantized stochastic process divided by minus the logarithm of the quantizer step size $1/m$ in the limit as $m\to\infty$. It is demonstrated that the information dimension rate coincides with the rate-distortion dimension, defined as twice the rate-distortion function $R(D)$ of the stochastic process divided by $-\log(D)$ in the limit as $D\downarrow 0$. It is further shown that, among all multivariate stationary processes with a given (matrix-valued) spectral distribution function (SDF), the Gaussian process has the largest information dimension rate, and that the information dimension rate of multivariate stationary Gaussian processes is given by the average rank of the derivative of the SDF. The presented results reveal that the fundamental limits of almost zero-distortion recovery via compressible signal pursuit and almost lossless analog compression are different in general.
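Written out, the two quantities the abstract equates are (with $\langle X_t \rangle_m$ denoting the process quantized with step size $1/m$, $H'(\cdot)$ the entropy rate, and $R(D)$ the rate-distortion function):

$$d(\{X_t\}) = \lim_{m\to\infty} \frac{H'(\langle X_t \rangle_m)}{\log m}, \qquad \dim_R(\{X_t\}) = \lim_{D\downarrow 0} \frac{2R(D)}{-\log D},$$

and the paper shows $d(\{X_t\}) = \dim_R(\{X_t\})$. Note that $-\log(1/m) = \log m$, which explains the denominator on the left.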
Daniel J. Katz (2018)
Pseudorandom sequences are used extensively in communications and remote sensing. Correlation provides one measure of pseudorandomness, and low correlation is an important factor determining the performance of digital sequences in applications. We consider the problem of constructing pairs $(f,g)$ of sequences such that both $f$ and $g$ have low mean square autocorrelation and $f$ and $g$ have low mean square mutual crosscorrelation. We focus on aperiodic correlation of binary sequences, and review recent contributions along with some historical context.
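Under one standard convention (assumed here, not stated in the abstract), binary sequences take entries in $\{-1, +1\}$ and are extended by zero outside their support, so the aperiodic crosscorrelation of $f$ and $g$ at shift $s$ is

$$C_{f,g}(s) = \sum_{j \in \mathbb{Z}} f_j \, g_{j+s},$$

with autocorrelation $C_{f,f}$; "mean square" then refers to summing $C_{f,g}(s)^2$ over shifts (nonzero shifts in the autocorrelation case), typically normalized by $C_{f,f}(0)^2 = n^2$ to give a demerit factor.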
The authors have recently defined the Rényi information dimension rate $d(\{X_t\})$ of a stationary stochastic process $\{X_t,\, t\in\mathbb{Z}\}$ as the entropy rate of the uniformly-quantized process divided by minus the logarithm of the quantizer step size $1/m$ in the limit as $m\to\infty$ (B. Geiger and T. Koch, "On the information dimension rate of stochastic processes," in Proc. IEEE Int. Symp. Inf. Theory (ISIT), Aachen, Germany, June 2017). For Gaussian processes with a given spectral distribution function $F_X$, they showed that the information dimension rate equals the Lebesgue measure of the set of harmonics where the derivative of $F_X$ is positive. This paper extends this result to multivariate Gaussian processes with a given matrix-valued spectral distribution function $F_{\mathbf{X}}$. It is demonstrated that the information dimension rate equals the average rank of the derivative of $F_{\mathbf{X}}$. As a side result, it is shown that the scale and translation invariance of information dimension carries over from random variables to stochastic processes.
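Written out with the harmonic variable normalized to $[-1/2, 1/2)$ (a convention assumed here, not fixed by the abstract), the scalar and matrix-valued results read

$$d(\{X_t\}) = \lambda\bigl(\{\theta : F'_X(\theta) > 0\}\bigr), \qquad d(\{\mathbf{X}_t\}) = \int_{-1/2}^{1/2} \operatorname{rank}\bigl(F'_{\mathbf{X}}(\theta)\bigr)\, d\theta,$$

where $\lambda$ denotes Lebesgue measure; the integral is the "average rank" of the derivative of the SDF, and it reduces to the scalar expression when the process is one-dimensional.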