Zipf's law, unbounded complexity and open-ended evolution


Abstract

A major problem for evolutionary theory is understanding the so-called open-ended nature of evolutionary change, from its definition to its origins. Open-ended evolution (OEE) refers to the unbounded increase in complexity that seems to characterise evolution on multiple scales. This property appears to be a defining feature of both biological and technological evolution, and is strongly tied to the generative potential associated with combinatorics, which allows systems to grow and expand their available state spaces. Interestingly, many complex systems presumably displaying OEE, from language to proteins, share a common statistical property: the presence of Zipf's law. Given an inventory of basic items (such as words or protein domains) required to build more complex structures (sentences or proteins), Zipf's law tells us that most of these elements are rare whereas a few of them are extremely common. Using Algorithmic Information Theory, in this paper we provide a fundamental definition of open-endedness, which can be understood as a set of postulates. Its statistical counterpart, based on standard Shannon information theory, has the structure of a variational problem which is shown to lead to Zipf's law as the expected consequence of an evolutionary process displaying OEE. We further explore the problem of information conservation through an OEE process and conclude that statistical information (standard Shannon information) is not conserved, resulting in the paradoxical situation in which the increase of information content has the effect of erasing itself. We prove that this paradox is resolved if we consider non-statistical forms of information. This last result implies that standard information theory may not be a suitable theoretical framework to explore the persistence and increase of the information content in OEE systems.
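As a quick numerical illustration of the rank-frequency statement of Zipf's law (a sketch for intuition only, not code or data from the paper; the sample size and exponent are arbitrary choices), the following Python snippet draws items from a discrete power law and checks that the empirical frequency of the r-th most common item decays roughly as r^(-1):

```python
# Minimal sketch of Zipf's law: a few items are extremely common, most are rare,
# and frequency falls off roughly as rank**(-1). Illustrative only.
import numpy as np

rng = np.random.default_rng(42)

# numpy's zipf sampler draws integers k with P(k) proportional to k**(-a), a > 1.
# For a = 2 the corresponding rank-frequency exponent is 1/(a-1) = 1.
samples = rng.zipf(a=2.0, size=200_000)

# Empirical frequencies, sorted from most to least common item.
_, counts = np.unique(samples, return_counts=True)
freqs = np.sort(counts)[::-1]
ranks = np.arange(1, len(freqs) + 1)

# Log-log slope over the first 100 ranks; close to -1 for Zipf-like data.
mask = ranks <= 100
slope, _ = np.polyfit(np.log(ranks[mask]), np.log(freqs[mask]), 1)
print(f"rank-frequency exponent ~ {slope:.2f}")
```

On a log-log plot of frequency versus rank, such data falls approximately on a straight line of slope near -1, which is the statistical signature the abstract refers to.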
