Symbolic sequences with long-range correlations are expected to result in a slow regression to a steady state of entropy increase. However, we prove that a fast transition to a constant rate of entropy increase can be obtained in this case as well, provided that the extensive entropy of Tsallis with entropic index q is adopted, thereby resulting in a new form of entropy that we shall refer to as the Kolmogorov-Sinai-Tsallis (KST) entropy. We assume that the same symbol, either 1 or -1, is repeated in strings of length l, with the probability distribution p(l) proportional to 1/(l^mu). The numerical evaluation of the KST entropy suggests that at mu = 2 a sort of abrupt transition might occur. For values of mu in the range 1 < mu < 2 the entropic index q is expected to vanish, as a consequence of the fact that in this case the average length <l> diverges, thereby breaking the balance between determinism and randomness in favor of determinism. In the region mu > 2 the entropic index q seems to depend on mu through the power-law expression q = (mu - 2)^alpha with alpha approximately 0.13 (q = 1 for mu > 3). It is argued that this phase-transition-like property signals the onset of the thermodynamic regime at mu = 2.
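A minimal numerical sketch of the kind of evaluation described above, assuming a Python implementation: string lengths are drawn from p(l) proportional to 1/l^mu (truncated at a large cutoff so it can be sampled), the sign of each string is drawn afresh at random (an assumption, since the abstract does not fix this), and the Tsallis entropy of the empirical word distribution is tracked as a function of word length. The values mu = 2.5, q = 0.8, the cutoff, and the sequence length are purely illustrative, and this is a generic reconstruction rather than the authors' procedure.

    import numpy as np
    from collections import Counter

    def generate_sequence(mu, n_symbols, rng, l_max=10_000):
        # Concatenate strings of identical symbols (+1 or -1) whose lengths l
        # follow p(l) ~ 1/l**mu, truncated at l_max so the law can be sampled.
        lengths = np.arange(1, l_max + 1)
        p = lengths.astype(float) ** (-mu)
        p /= p.sum()
        seq = []
        while len(seq) < n_symbols:
            l = rng.choice(lengths, p=p)
            sign = rng.choice([-1, 1])   # assumed: sign chosen at random per string
            seq.extend([sign] * l)
        return np.array(seq[:n_symbols])

    def tsallis_entropy(probs, q):
        # S_q = (1 - sum_i p_i**q) / (q - 1); reduces to Shannon entropy as q -> 1.
        if abs(q - 1.0) < 1e-12:
            return float(-np.sum(probs * np.log(probs)))
        return float((1.0 - np.sum(probs ** q)) / (q - 1.0))

    def kst_entropy_curve(seq, q, max_word=8):
        # Tsallis entropy of the empirical distribution of words of length N;
        # roughly linear growth in N signals a constant (KST) entropy rate.
        out = []
        for N in range(1, max_word + 1):
            words = Counter(tuple(seq[i:i + N]) for i in range(len(seq) - N + 1))
            probs = np.array(list(words.values()), dtype=float)
            probs /= probs.sum()
            out.append(tsallis_entropy(probs, q))
        return out

    rng = np.random.default_rng(0)
    seq = generate_sequence(mu=2.5, n_symbols=200_000, rng=rng)
    print(kst_entropy_curve(seq, q=0.8))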
Symbolic relative entropy, an efficient nonlinear complexity parameter that measures probabilistic divergences between symbolic sequences, is proposed for our nonlinear dynamics analysis of heart rates in which equal states are taken into account. Equalities are not rare in discr
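The truncated abstract does not spell out the exact definition, so the following is only a minimal sketch of the idea, assuming a Python implementation, a three-symbol alphabet that keeps an explicit "equal" state, and a Kullback-Leibler-style divergence between word distributions as a stand-in for the paper's parameter; the names symbolize, word_distribution, symbolic_relative_entropy and the values of tol and m are illustrative assumptions.

    import numpy as np
    from collections import Counter

    def symbolize(rr, tol=0.01):
        # Map successive differences of an RR-interval series to symbols while
        # keeping an explicit "equal" state: 0 = equal within tol, 1 = up, 2 = down.
        d = np.diff(np.asarray(rr, dtype=float))
        return np.where(np.abs(d) <= tol, 0, np.where(d > 0, 1, 2))

    def word_distribution(sym, m=3):
        # Empirical probability of each word of length m in the symbol sequence.
        words = Counter(tuple(sym[i:i + m]) for i in range(len(sym) - m + 1))
        total = sum(words.values())
        return {w: c / total for w, c in words.items()}

    def symbolic_relative_entropy(sym_a, sym_b, m=3, eps=1e-12):
        # Kullback-Leibler-style divergence between the word distributions of two
        # symbolized sequences; eps floors unseen words to avoid division by zero.
        p = word_distribution(sym_a, m)
        q = word_distribution(sym_b, m)
        return sum(pw * np.log(pw / q.get(w, eps)) for w, pw in p.items())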
Symbolic execution is a powerful technique for program analysis. However, it has many limitations in practical applicability: the path explosion problem, which encumbers scalability; the need for language-specific implementations; the inability to handle com
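As a toy illustration of the path explosion problem mentioned above, and not something taken from the paper: a program with n independent two-way branches has 2^n feasible path conditions, so even enumerating the paths, before any constraint solving, grows exponentially. A minimal Python sketch (the function name count_paths is purely illustrative):

    from itertools import product

    def count_paths(n_branches):
        # Each independent two-way branch doubles the number of path conditions
        # a symbolic executor must explore.
        return sum(1 for _ in product((True, False), repeat=n_branches))

    for n in (4, 8, 16):
        print(n, "branches ->", count_paths(n), "paths")   # 16, 256, 65536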
Statistical thermodynamics of small systems shows dramatic differences from that of normal systems. In parallel with the recently presented steady-state thermodynamic formalism for the master equation and the Fokker-Planck equation, we show that a ``thermodynamic theory
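For reference only, and not the paper's own equations: in standard notation the two evolution equations named above read

\[
\frac{dp_i(t)}{dt} = \sum_{j \neq i} \big[ W_{ij}\, p_j(t) - W_{ji}\, p_i(t) \big],
\qquad
\frac{\partial P(x,t)}{\partial t} = -\frac{\partial}{\partial x}\big[ A(x)\, P(x,t) \big]
  + \frac{1}{2}\frac{\partial^2}{\partial x^2}\big[ B(x)\, P(x,t) \big],
\]

where W_{ij} is the transition rate from state j to state i and A(x), B(x) are drift and diffusion coefficients; a steady-state distribution is one for which the left-hand sides vanish.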
A measure called Physical Complexity is established and calculated for a population of sequences, based on statistical physics, automata theory, and information theory. It is a measure of the quantity of information in an organism's genome. It is base
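A minimal sketch of one common way such a population-based measure is computed, assuming a Python implementation and an aligned population of equal-length sequences: per-site entropies across the population are summed and subtracted from the sequence length, so conserved (informative) sites raise the complexity. The normalization by the log of the alphabet size and the toy population are illustrative assumptions, not necessarily the paper's exact definition.

    import numpy as np
    from collections import Counter

    def per_site_entropy(population):
        # Shannon entropy of each aligned site, in units of log(alphabet size),
        # estimated from symbol frequencies across the population.
        k = len(set("".join(population)))
        n = len(population)
        ent = []
        for site in zip(*population):
            p = np.array(list(Counter(site).values()), dtype=float) / n
            ent.append(float(-(p * np.log(p)).sum() / np.log(k)))
        return np.array(ent)

    def physical_complexity(population):
        # Sequence length minus the summed per-site entropy: fully conserved sites
        # contribute 1 each, maximally variable sites contribute 0.
        H = per_site_entropy(population)
        return len(H) - H.sum()

    pop = ["GATTACA", "GATTACA", "GATAACA", "GATCACA"]   # toy aligned population
    print(physical_complexity(pop))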
In quantum many-body systems, a Hamiltonian is called an ``extensive entropy generator'' if, starting from a random product state, the entanglement entropy obeys a volume law at long times with overwhelming probability. We prove that (i) any Hamiltonian
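One standard way to make the quoted definition concrete, with the constant c and the notion of "overwhelming probability" left as placeholders rather than the paper's precise statement: for a Hamiltonian H, a subsystem A with complement \bar A, and a random product state |\psi\rangle = \bigotimes_i |\phi_i\rangle,

\[
S_A(t) = -\,\mathrm{Tr}\!\left[\rho_A(t)\ln\rho_A(t)\right],
\qquad
\rho_A(t) = \mathrm{Tr}_{\bar A}\!\left( e^{-iHt}\,|\psi\rangle\langle\psi|\,e^{iHt} \right),
\]

and H is an extensive entropy generator if, with overwhelming probability over the random product state, S_A(t) \ge c\,|A| for some constant c > 0 at sufficiently long times.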