Quantifying the similarity between symbolic sequences is a traditional problem in Information Theory which requires comparing the frequencies of symbols in different sequences. In numerous modern applications, ranging from DNA over music to texts, the distribution of symbol frequencies is characterized by heavy-tailed distributions (e.g., Zipf's law). The large number of low-frequency symbols in these distributions poses major difficulties to the estimation of the similarity between sequences; e.g., they hinder an accurate finite-size estimation of entropies. Here we show analytically how the systematic (bias) and statistical (fluctuation) errors in these estimations depend on the sample size~$N$ and on the exponent~$\gamma$ of the heavy-tailed distribution. Our results are valid for the Shannon entropy $(\alpha=1)$, its corresponding similarity measures (e.g., the Jensen-Shannon divergence), and also for measures based on the generalized entropy of order $\alpha$. For small values of $\alpha$, including $\alpha=1$, the errors decay more slowly than the $1/N$ decay observed in short-tailed distributions. For $\alpha$ larger than a critical value $\alpha^\ast = 1+1/\gamma \leq 2$, the $1/N$ decay is recovered. We show the practical significance of our results by quantifying the evolution of the English language over the last two centuries using a complete $\alpha$-spectrum of measures. We find that frequent words change more slowly than less frequent words and that $\alpha=2$ provides the most robust measure to quantify language change.
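The finite-size effects described above can be illustrated numerically. The following is a minimal sketch (not taken from the paper) that draws samples from a Zipf-like distribution with exponent $\gamma$ and compares the naive plug-in estimate of the order-$\alpha$ entropy against its true value as $N$ grows; it assumes the Havrda-Charvát/Tsallis form $H_\alpha = (1-\sum_i p_i^\alpha)/(\alpha-1)$ (Shannon in the limit $\alpha \to 1$) and a finite alphabet, and the helper names (`zipf_pmf`, `plugin_estimate`) are purely illustrative.

```python
import numpy as np

def zipf_pmf(n_symbols, gamma):
    """Zipf-like pmf p(r) ~ r^{-gamma} over a finite alphabet of size n_symbols."""
    ranks = np.arange(1, n_symbols + 1)
    weights = ranks ** (-gamma)
    return weights / weights.sum()

def entropy_alpha(p, alpha):
    """Generalized entropy of order alpha (Shannon entropy for alpha == 1)."""
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

def plugin_estimate(samples, n_symbols, alpha):
    """Naive (maximum-likelihood) plug-in estimator from observed symbol frequencies."""
    counts = np.bincount(samples, minlength=n_symbols)
    return entropy_alpha(counts / counts.sum(), alpha)

rng = np.random.default_rng(0)
gamma, n_symbols = 1.0, 100_000          # illustrative Zipf exponent and alphabet size
p = zipf_pmf(n_symbols, gamma)

# With gamma = 1 the critical order is alpha* = 1 + 1/gamma = 2,
# so alpha = 1 should show a slow bias decay while alpha = 2 approaches the 1/N regime.
for alpha in (1.0, 2.0):
    h_true = entropy_alpha(p, alpha)
    for n in (10**3, 10**4, 10**5):
        estimates = [plugin_estimate(rng.choice(n_symbols, size=n, p=p), n_symbols, alpha)
                     for _ in range(20)]
        bias = np.mean(estimates) - h_true
        print(f"alpha={alpha:.0f}  N={n:>6}  bias={bias:+.4f}  std={np.std(estimates):.4f}")
```

Averaging the plug-in estimate over repeated draws separates the systematic error (bias) from the statistical error (spread across draws), which is the decomposition whose $N$- and $\gamma$-dependence the abstract refers to.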