The Boltzmann-Gibbs entropy, as proved by Shannon and Khinchin, is characterised by four axioms, the fourth of which concerns additivity. The group-theoretic entropies use formal group theory to replace this axiom with a more general composability axiom. As has been pointed out before, generalised entropies depend crucially on the number of allowed degrees of freedom $N$. The functional form of group entropies is restricted (though not uniquely determined) by assuming extensivity on the equal-probability ensemble, which leads to classes of functionals corresponding to sub-exponential, exponential or super-exponential dependence of the phase space volume $W$ on $N$. We review the ensuing entropies, discuss the composability axiom, relate them to the Gibbs paradox discussion and explain why group entropies may be particularly relevant from an information-theoretic perspective.
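The composability axiom invoked above can be stated explicitly; the following is a minimal sketch of the standard formulation, with the Tsallis case as the best-known nonadditive example. For a system composed of two statistically independent subsystems $A$ and $B$, the entropy of the whole is required to depend only on the entropies of the parts,
\[
S(A \times B) = \Phi\big(S(A),\, S(B)\big),
\]
where $\Phi(x,y)$ is a formal group law, i.e. it satisfies
\[
\Phi(x,0) = x, \qquad \Phi(x,y) = \Phi(y,x), \qquad \Phi\big(\Phi(x,y),z\big) = \Phi\big(x,\Phi(y,z)\big).
\]
Additivity is recovered as the special case $\Phi(x,y) = x + y$ (Boltzmann-Gibbs), while the Tsallis entropy $S_q$ composes according to $\Phi(x,y) = x + y + (1-q)\,xy$, reducing to the additive law as $q \to 1$.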
We introduce a class of information measures based on group entropies, allowing us to describe the information-theoretical properties of complex systems. These entropic measures are nonadditive, and are mathematically deduced from a series of natural
The Marginally Rigid State is a candidate paradigm for what makes granular material a state of matter distinct from both liquid and solid. Coordination number is identified as a discriminating characteristic, and for rough-surfaced particles we show
Sharp two- and three-dimensional phase-transition magnetization curves are obtained by an iterative renormalization-group coupling of Ising chains, which are solved exactly. The chains by themselves do not have a phase transition or non-zero magnet
Discrete amorphous materials are best described in terms of arbitrary networks which can be embedded in three-dimensional space. Investigating the thermodynamic equilibrium as well as the non-equilibrium behavior of such materials around second order pha
The stochastic entropy generated during the evolution of a system interacting with an environment may be separated into three components, but only two of these have a non-negative mean. The third component of entropy production is associated with the