
Maximum entropy generation in open systems: the Fourth Law?

Added by: Prof. Umberto Lucia
Publication date: 2010
Field: Physics
Language: English
Authors: Umberto Lucia





This paper develops an analytical and rigorous formulation of the maximum entropy generation principle. The result is suggested as the Fourth Law of Thermodynamics.
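The abstract does not reproduce the paper's derivation. For orientation only, here is a minimal LaTeX sketch of the standard open-system entropy balance on which such a principle is built, written in textbook notation rather than the author's own:

```latex
% Entropy balance for an open control volume (standard textbook form,
% not the paper's notation): \dot{S}_g is the entropy generation rate.
\begin{equation}
  \dot{S}_g \;=\; \frac{dS}{dt}
  \;-\; \sum_i \frac{\dot{Q}_i}{T_i}
  \;-\; \sum_{\mathrm{in}} \dot{m}\, s
  \;+\; \sum_{\mathrm{out}} \dot{m}\, s
  \;\geq\; 0 .
\end{equation}
% In this notation, a maximum entropy generation principle asserts that
% the stationary state actually reached by the open system extremizes
% (maximizes) \int_0^\tau \dot{S}_g \, dt over admissible paths.
```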



Related research

D. Sornette (2007)
We briefly review the concepts underlying complex systems and probability distributions. The latter are often taken as the first quantitative characteristics of complex systems, allowing one to detect the possible occurrence of regularities and providing a step toward a classification of the different levels of organization (the "universality classes"). A rapid survey covers the Gaussian law, the power law, and the stretched exponential distributions. The fascination with power laws is then explained, starting from the statistical physics approach to critical phenomena, out-of-equilibrium phase transitions, and self-organized criticality, and ending with a large but not exhaustive list of mechanisms leading to power-law distributions. A check-list for testing and qualifying a power-law distribution from your data is described in 7 steps. This essay enlarges the description of distributions by proposing that "kings", i.e., events even beyond the extrapolation of the power-law tail, may reveal information which is complementary and perhaps sometimes even more important than the power-law distribution. We conclude with a list of future directions.
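The essay's 7-step check-list is not reproduced here. As a minimal sketch of its first steps (estimating the exponent and measuring goodness of fit), the following Python snippet implements the standard continuous maximum-likelihood estimator for the power-law exponent together with a Kolmogorov-Smirnov distance; the function name and synthetic data are illustrative, not taken from the essay:

```python
import numpy as np

def fit_power_law(x, xmin):
    """MLE of alpha for a continuous power law p(x) ~ x^-alpha, x >= xmin,
    plus the KS distance between data and fitted model as a crude
    goodness-of-fit check."""
    tail = np.sort(x[x >= xmin])
    n = tail.size
    alpha = 1.0 + n / np.sum(np.log(tail / xmin))
    # Empirical CDF vs. fitted model CDF on the tail
    ecdf = np.arange(1, n + 1) / n
    mcdf = 1.0 - (tail / xmin) ** (1.0 - alpha)
    ks = np.max(np.abs(ecdf - mcdf))
    return alpha, ks

# Synthetic check: draw from a pure power law via inverse transform
rng = np.random.default_rng(0)
alpha_true, xmin = 2.5, 1.0
x = xmin * (1.0 - rng.random(10_000)) ** (-1.0 / (alpha_true - 1.0))
print(fit_power_law(x, xmin))   # alpha close to 2.5, small KS distance
```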
A theory of symbolic dynamic systems with long-range correlations, based on the binary N-step Markov chains developed earlier in Phys. Rev. Lett. 90, 110601 (2003), is generalized to the biased case (unequal numbers of zeros and unities in the chain). In the model, the conditional probability that the i-th symbol in the chain equals zero (or unity) is a linear function of the number of unities (zeros) among the preceding N symbols. The correlation and distribution functions, as well as the variance of the number of symbols in words of arbitrary length L, are obtained analytically and verified by numerical simulations. A self-similarity of the studied stochastic process is revealed, and the similarity group transformation of the chain parameters is presented. The diffusion Fokker-Planck equation governing the distribution function of the L-words is explored. If the persistent correlations are not extremely strong, the distribution function is shown to be Gaussian, with a variance that depends nonlinearly on L. An equation connecting the memory function and the correlation function of the additive Markov chain is presented. This equation allows one to reconstruct the memory function from the correlation function of the system. The effectiveness and robustness of the proposed method are demonstrated by simple model examples. Memory functions of concrete coarse-grained literary texts are found, and their universal power-law behavior at long distances is revealed.
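A minimal simulation sketch of a chain of this type follows, assuming a simple linear rule p(1) = b + mu*k/N for the probability of the next symbol given k unities among the preceding N symbols; the parameters and normalization are illustrative, not the paper's:

```python
import numpy as np

def simulate_chain(n_steps, N=10, b=0.4, mu=0.3, seed=0):
    """Binary N-step Markov chain: the probability that the next symbol
    is 1 is a linear function of the fraction k/N of ones among the
    preceding N symbols. b (bias) and mu (memory strength) are
    illustrative parameters, not those used in the paper."""
    rng = np.random.default_rng(seed)
    chain = list(rng.integers(0, 2, N))       # random initial word
    for _ in range(n_steps):
        k = sum(chain[-N:])                   # ones in the last N symbols
        p1 = min(max(b + mu * k / N, 0.0), 1.0)
        chain.append(1 if rng.random() < p1 else 0)
    return np.array(chain[N:])

def word_variance(chain, L):
    """Variance of the number of ones in non-overlapping words of length L;
    super-linear growth in L signals persistent long-range correlations."""
    words = chain[: len(chain) // L * L].reshape(-1, L)
    return words.sum(axis=1).var()

c = simulate_chain(200_000)
for L in (10, 100, 1000):
    print(L, word_variance(c, L))
```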
In recent years, researchers have come to appreciate the difficulty of fitting power-law distributions properly. These difficulties are greater in Zipf's systems, owing to the discreteness of the variables and to the existence of two representations for these systems […]
A theory of additive Markov chains with long-range memory, proposed earlier in Phys. Rev. E 68, 061107 (2003), is developed and used to describe the statistical properties of long-range correlated systems. The convenient characteristic of such systems, the memory function, and its relation to the correlation properties of the system are examined. Various methods for finding the memory function from the correlation function are proposed. The inverse problem (calculation of the correlation function from a prescribed memory function) is also solved. This is demonstrated for an analytically solvable model of a system with a step-wise memory function.
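The connecting equation has the structure of a Yule-Walker system, so the memory function can be obtained from the correlation function by solving a small linear system. The following Python sketch assumes the relation K(r) = sum over r' of F(r')*K(r-r'), which is a paraphrase of the cited equation; the exact form and normalization in the paper may differ:

```python
import numpy as np

def memory_from_correlation(K, N):
    """Solve the Yule-Walker-type linear system
        K(r) = sum_{r'=1..N} F(r') * K(r - r'),   r = 1..N,
    for the memory function F, given the correlation function K(0..N).
    This mirrors the structure of the equation in the cited work; the
    normalization used there may differ."""
    A = np.empty((N, N))
    for r in range(1, N + 1):
        for rp in range(1, N + 1):
            A[r - 1, rp - 1] = K[abs(r - rp)]
    return np.linalg.solve(A, K[1 : N + 1])

# Example: an exponentially decaying correlation function, for which the
# solution is F = (0.5, 0, 0, ...), i.e. one-step memory.
N = 5
K = 0.5 ** np.arange(N + 1)
print(memory_from_correlation(K, N))
```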
A theory of additive Markov chains with long-range memory is used to describe the correlation properties of coarse-grained literary texts. The complex structure of the correlations in texts is revealed: antipersistent correlations at small distances, L < 300, and persistent ones at L > 300 define this nontrivial structure. For some concrete examples of literary texts, the memory functions are obtained and their power-law behavior at long distances is revealed. This property is shown to be a cause of the self-similarity of texts under the decimation procedure.
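As a sketch of the coarse-graining step, the following Python snippet maps a text to a binary chain and estimates its two-point correlation function; the split rule and the file path are placeholders, not the scheme used in the paper:

```python
import numpy as np

def coarse_grain(text, split="m"):
    """Map a text to a binary sequence: 1 if the letter falls in the
    second half of the alphabet, 0 otherwise. The split point is an
    illustrative choice; the cited work explores several schemes."""
    letters = [c for c in text.lower() if c.isalpha()]
    return np.array([1 if c > split else 0 for c in letters])

def correlation(seq, max_lag):
    """Normalized two-point correlation K(r) of a binary sequence."""
    s = seq - seq.mean()
    var = s.var()
    return np.array([np.mean(s[:-r] * s[r:]) / var
                     for r in range(1, max_lag + 1)])

# Any long plain-text file will do; the path below is a placeholder.
# text = open("book.txt", encoding="utf-8").read()
text = "to be or not to be that is the question " * 500
print(correlation(coarse_grain(text), 10))
```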