
Thermodynamics and computation during collective motion near criticality

Posted by Emanuele Crosato
Publication date: 2017
Language: English





We study self-organisation of collective motion as a thermodynamic phenomenon, in the context of the first law of thermodynamics. It is expected that the coherent ordered motion typically self-organises in the presence of changes in the (generalised) internal energy and of (generalised) work done on, or extracted from, the system. We aim to explicitly quantify changes in these two quantities in a system of simulated self-propelled particles, and contrast them with changes in the system's configuration entropy. In doing so, we adapt a thermodynamic formulation of the curvatures of the internal energy and the work, with respect to two parameters that control the particles' alignment. This allows us to systematically investigate the behaviour of the system by varying the two control parameters to drive the system across a kinetic phase transition. Our results identify critical regimes and show that during the phase transition, where the configuration entropy of the system decreases, the rates of change of the work and of the internal energy also decrease, while their curvatures diverge. Importantly, the reduction of entropy achieved through expenditure of work is shown to peak at criticality. We relate this both to a thermodynamic efficiency and the significance of the increased order with respect to a computational path. Additionally, this study provides an information-geometric interpretation of the curvature of the internal energy as the difference between two curvatures: the curvature of the free entropy, captured by the Fisher information, and the curvature of the configuration entropy.
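For concreteness, the decomposition in the last sentence can be sketched for a Gibbs (exponential-family) model of the particle configurations; the notation below is illustrative rather than the paper's own, and sign conventions for the generalised internal energy vary:

```latex
% Gibbs model over configurations x, with natural parameters
% \theta = (\theta_1, \theta_2) playing the role of the two alignment
% control parameters; \psi is the free entropy (log-partition function).
p(x \mid \theta) = \exp\!\big(\theta \cdot \phi(x) - \psi(\theta)\big),
\qquad
\psi(\theta) = \ln \textstyle\sum_x e^{\theta \cdot \phi(x)} .

% Configuration entropy and a generalised internal energy (one common convention):
S(\theta) = -\langle \ln p \rangle = \psi(\theta) - \theta \cdot \langle \phi \rangle,
\qquad
U(\theta) = \theta \cdot \langle \phi \rangle = \psi(\theta) - S(\theta) .

% Hence the curvature (Hessian in \theta) of U is a difference of two curvatures,
% the free-entropy curvature being exactly the Fisher information matrix:
\nabla_\theta^2 U
  = \nabla_\theta^2 \psi - \nabla_\theta^2 S
  = F(\theta) - \nabla_\theta^2 S,
\qquad
F(\theta) = \mathrm{Cov}_\theta[\phi(x)] .
```

In this reading, the divergence of the Fisher information F(θ) at the critical point is what drives the diverging curvatures reported in the abstract.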


Read also

Information processing typically occurs via the composition of modular units, such as universal logic gates. The benefit of modular information processing, in contrast to globally integrated information processing, is that complex global computations are more easily and flexibly implemented via a series of simpler, localized information processing operations which only control and change local degrees of freedom. We show that, despite these benefits, there are unavoidable thermodynamic costs to modularity: costs that arise directly from the operation of localized processing and that go beyond Landauer's dissipation bound for erasing information. Integrated computations can achieve Landauer's bound, however, when they globally coordinate the control of all of an information reservoir's degrees of freedom. Unfortunately, global correlations among the information-bearing degrees of freedom are easily lost by modular implementations. This is costly, since such correlations are a thermodynamic fuel. We quantify the minimum irretrievable dissipation of modular computations in terms of the difference between the change in global nonequilibrium free energy, which captures these global correlations, and the local (marginal) change in nonequilibrium free energy, which bounds modular work production. This modularity dissipation is proportional to the amount of additional work required to perform the computational task modularly. It has immediate consequences for physically embedded transducers, known as information ratchets. We show how to circumvent modularity dissipation by designing internal ratchet states that capture the global correlations and patterns in the ratchet's information reservoir. Designed in this way, information ratchets match the optimum thermodynamic efficiency of globally integrated computations.
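Schematically, and in our notation rather than the authors' (assuming an energy that is additive over the reservoir's degrees of freedom), the free energy stored in global correlations and the resulting modular dissipation can be written as:

```latex
% Nonequilibrium free energy of a distribution \rho over the reservoir:
F_{\mathrm{neq}}[\rho] = \langle E \rangle_\rho - k_B T\, S[\rho] .

% With additive energy, the joint and marginalised descriptions differ only
% through the total correlation I[\rho] = \sum_i S[\rho_i] - S[\rho] \ge 0:
F_{\mathrm{neq}}^{\mathrm{glob}} - F_{\mathrm{neq}}^{\mathrm{marg}}
  = k_B T\, I[\rho] .

% A modular protocol is blind to I, so correlations destroyed during the
% computation lower-bound its extra, irretrievable dissipation:
\Sigma_{\mathrm{mod}} \;\gtrsim\;
  k_B T \big( I_{\mathrm{initial}} - I_{\mathrm{final}} \big) .
```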
We review the observations and the basic laws describing the essential aspects of collective motion -- being one of the most common and spectacular manifestations of coordinated behavior. Our aim is to provide a balanced discussion of the various facets of this highly multidisciplinary field, including experiments, mathematical methods and models for simulations, so that readers with a variety of backgrounds can get both the basics and a broader, more detailed picture of the field. The observations we report on include systems consisting of units ranging from macromolecules through metallic rods and robots to groups of animals and people. Some emphasis is put on models that are simple and realistic enough to reproduce the numerous related observations and are useful for developing concepts for a better understanding of the complexity of systems consisting of many simultaneously moving entities. As such, these models allow the establishment of a few fundamental principles of flocking. In particular, it is demonstrated that, in spite of considerable differences, a number of deep analogies exist between equilibrium statistical physics systems and those made of self-propelled (in most cases living) units. In both cases only a few well-defined macroscopic/collective states occur, and the transitions between these states follow a similar scenario, involving discontinuity and algebraic divergences.
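A minimal flocking simulation of the kind surveyed here is short enough to sketch. The update rule below is the standard Vicsek alignment step (parameter values are ours, purely illustrative); sweeping the noise amplitude eta moves the system across the ordered-to-disordered kinetic phase transition discussed above:

```python
import numpy as np

def vicsek(n=300, L=10.0, r=1.0, v=0.3, eta=0.3, steps=500, seed=0):
    """Minimal Vicsek-style flocking sketch; returns the polar order parameter
    (1 = perfectly aligned flock, ~0 = disordered motion)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0, L, size=(n, 2))
    ang = rng.uniform(-np.pi, np.pi, size=n)
    for _ in range(steps):
        # Pairwise displacements under periodic (minimum-image) boundaries.
        d = pos[:, None, :] - pos[None, :, :]
        d -= L * np.round(d / L)
        adj = ((d ** 2).sum(-1) < r ** 2).astype(float)  # neighbourhood matrix
        # Each particle adopts the mean heading of its neighbours (self included),
        # plus uniform angular noise of amplitude eta.
        ang = np.arctan2(adj @ np.sin(ang), adj @ np.cos(ang))
        ang += eta * rng.uniform(-np.pi, np.pi, size=n)
        pos = (pos + v * np.column_stack((np.cos(ang), np.sin(ang)))) % L
    return np.hypot(np.sin(ang).mean(), np.cos(ang).mean())

for eta in (0.1, 0.5, 1.0):
    print(f"eta={eta:.1f}  order parameter={vicsek(eta=eta):.2f}")
```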
We give exact formulae for a wide family of complexity measures that capture the organization of hidden nonlinear processes. The spectral decomposition of operator-valued functions leads to closed-form expressions involving the full eigenvalue spectrum of the mixed-state presentation of a process's epsilon-machine causal-state dynamic. Measures include correlation functions, power spectra, past-future mutual information, transient and synchronization informations, and many others. As a result, a direct and complete analysis of intrinsic computation is now available for the temporal organization of finitary hidden Markov models and nonlinear dynamical systems with generating partitions, and for the spatial organization in one-dimensional systems, including spin systems, cellular automata, and complex materials via chaotic crystallography.
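The full epsilon-machine machinery is beyond a snippet, but its simplest ingredient, reading an information measure off the spectrum of a transition operator, can be illustrated for a two-state Markov chain (a toy case of our own construction, not the paper's closed forms):

```python
import numpy as np

# Entropy rate of a finite Markov chain via the spectrum of its transition
# matrix: the stationary distribution is the left eigenvector for eigenvalue 1.
T = np.array([[0.9, 0.1],
              [0.4, 0.6]])                  # row-stochastic transition matrix

vals, vecs = np.linalg.eig(T.T)             # left eigen-problem of T
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()                              # normalised stationary distribution

logT = np.where(T > 0, np.log2(np.where(T > 0, T, 1.0)), 0.0)
h = -(pi[:, None] * T * logT).sum()         # entropy rate, bits per symbol
print(f"stationary pi = {pi}, entropy rate h = {h:.4f} bits/step")
```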
David H. Wolpert (2019)
One of the major resource requirements of computers - ranging from biological cells to human brains to high-performance (engineered) computers - is the energy used to run them. Those costs of performing a computation have long been a focus of research in physics, going back to the early work of Landauer. One of the most prominent aspects of computers is that they are inherently nonequilibrium systems. However, the early research was done when nonequilibrium statistical physics was in its infancy, which meant the work was formulated in terms of equilibrium statistical physics. Since then there have been major breakthroughs in nonequilibrium statistical physics, which are allowing us to investigate the myriad aspects of the relationship between statistical physics and computation, extending well beyond the issue of how much work is required to erase a bit. In this paper I review some of this recent work on the 'stochastic thermodynamics of computation'. After reviewing the salient parts of information theory, computer science theory, and stochastic thermodynamics, I summarize what has been learned about the entropic costs of performing a broad range of computations, extending from bit erasure to loop-free circuits to logically reversible circuits to information ratchets to Turing machines. These results reveal new, challenging engineering problems for how to design computers to have minimal thermodynamic costs. They also allow us to start to combine computer science theory and stochastic thermodynamics at a foundational level, thereby expanding both.
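The baseline all of this builds on is Landauer's bound: erasing one bit costs at least k_B T ln 2 of work. The number is worth having at hand (standard constants; room temperature assumed):

```python
import math

# Landauer's bound: minimum work to erase one bit is k_B * T * ln(2).
k_B = 1.380649e-23            # Boltzmann constant, J/K (exact, 2019 SI)
T = 300.0                     # room temperature, K
w_min = k_B * T * math.log(2)
print(f"minimum erasure work at {T:.0f} K: {w_min:.3e} J per bit")
# -> roughly 2.87e-21 J; real devices dissipate many orders of magnitude more.
```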
Path-dependent stochastic processes are often non-ergodic, and observables can no longer be computed within the ensemble picture. The resulting mathematical difficulties pose severe limits to the analytical understanding of path-dependent processes. Their statistics is typically non-multinomial, in the sense that the multiplicity of the occurrence of states is not a multinomial factor. The maximum entropy principle is tightly related to multinomial processes, non-interacting systems, and to the ensemble picture; it loses its meaning for path-dependent processes. Here we show that an equivalent to the ensemble picture exists for path-dependent processes, such that the non-multinomial statistics of the underlying dynamical process is, by construction, captured correctly in a functional that plays the role of a relative entropy. We demonstrate this for self-reinforcing Polya urn processes, which explicitly generalise multinomial statistics. We demonstrate the adequacy of this constructive approach towards non-multinomial counterparts of entropy by computing frequency and rank distributions of Polya urn processes. We show how the microscopic update rules of a path-dependent process allow us to explicitly construct a non-multinomial entropy functional that, when maximized, predicts the time-dependent distribution function.
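A self-reinforcing Polya urn of the kind analysed here is easy to simulate, which makes the path dependence tangible; the parameters below are illustrative, not the authors':

```python
import numpy as np

def polya_counts(colors=5, delta=2, draws=10_000, seed=1):
    """Self-reinforcing Polya urn: draw a colour with probability proportional
    to its current count, then return the ball plus `delta` extra of the same
    colour. The reinforcement makes the process path-dependent and its
    statistics non-multinomial, as discussed above."""
    rng = np.random.default_rng(seed)
    counts = np.ones(colors)                 # start with one ball of each colour
    for _ in range(draws):
        p = counts / counts.sum()
        counts[rng.choice(colors, p=p)] += delta
    return counts

counts = polya_counts()
freqs = np.sort(counts / counts.sum())[::-1]
print("rank-ordered colour frequencies:", np.round(freqs, 3))
```

Different seeds produce very different rank distributions from identical initial conditions, which is exactly the ensemble-picture breakdown the abstract describes.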