Structure-forming systems are ubiquitous in nature, ranging from atoms building molecules to the self-assembly of colloidal amphiphilic particles. Understanding the underlying thermodynamics of such systems remains an important problem. Here we derive the entropy for structure-forming systems, which differs from the Boltzmann-Gibbs entropy by a term that explicitly captures clustered states. For large systems and low concentrations, the approach is equivalent to the grand-canonical ensemble; for small systems, we find significant deviations. We derive the detailed fluctuation theorem and the Crooks work fluctuation theorem for structure-forming systems. The connection to the theory of particle self-assembly is discussed. We apply the results to several physical systems. We present the phase diagram for patchy particles described by the Kern-Frenkel potential. We show that the Curie-Weiss model with molecule structures exhibits a first-order phase transition.
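For readers unfamiliar with the Kern-Frenkel model mentioned above, the following is a minimal sketch of the one-patch pair potential (a hard-core square well modulated by patch alignment). All parameter values (`sigma`, `lam`, `eps`, `cos_delta`) are illustrative assumptions, not the ones used in the paper.

```python
import numpy as np

def kern_frenkel(r_vec, n1, n2, sigma=1.0, lam=1.5, eps=1.0, cos_delta=0.9):
    """Kern-Frenkel pair potential for two one-patch particles.

    r_vec      : vector from particle 1 to particle 2
    n1, n2     : unit vectors giving each particle's patch orientation
    sigma      : hard-core diameter
    lam*sigma  : outer range of the square well
    eps        : well depth
    cos_delta  : cosine of the patch half-opening angle
    """
    r = np.linalg.norm(r_vec)
    if r < sigma:
        return np.inf          # hard-core overlap
    if r > lam * sigma:
        return 0.0             # outside the square-well range
    rhat = r_vec / r
    # Attraction only if each patch points toward the other particle.
    if np.dot(n1, rhat) >= cos_delta and np.dot(n2, -rhat) >= cos_delta:
        return -eps
    return 0.0
```

With both patches facing each other at contact range the pair gains the well energy `-eps`; rotating either patch outside its cone switches the bond off, which is what makes the valence of such particles tunable.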
We study the thermodynamics and critical behavior of su($m|n$) supersymmetric spin chains of Haldane-Shastry type with a chemical potential term. We obtain a closed-form expression for the partition function and deduce a description of the spectrum in terms of the supersymmetric version of Haldane's motifs, which we apply to obtain an analytic expression for the free energy per site in the thermodynamic limit. By studying the low-temperature behavior of the free energy, we characterize the critical behavior of the chains with $1\le m,n\le 2$, determining the critical regions and the corresponding central charge. We also show that in the su($2|1$), su($1|2$) and su($2|2$) chains the bosonic or fermionic densities can undergo first-order (discontinuous) phase transitions at $T=0$, in contrast with the previously studied su(2) case.
Understanding the thermodynamics of the duplication process is a fundamental step towards a comprehensive physical theory of biological systems. However, the immense complexity of real cells obscures the fundamental tensions between energy gradients and entropic contributions that underlie duplication. The study of synthetic, feasible systems that reproduce some of the key ingredients of living entities while avoiding the major sources of biological complexity is highly relevant for deepening our understanding of the fundamental thermodynamic processes underlying life and its prevalence. In this paper, an abstract -- yet realistic -- synthetic system made of small protocell aggregates is studied in detail. A fundamental relation between free-energy and entropic gradients is derived for a general, non-equilibrium scenario, setting the thermodynamic conditions for the occurrence and prevalence of duplication phenomena. This relation makes explicit how the energy gradients invested in creating and maintaining the structural -- and, eventually, functional -- elements of the system must always compensate the entropic gradients, whose contributions come from changes in the translational, configurational and macrostate entropies, as well as from dissipation due to irreversible transitions. Work/energy relations are also derived, defining lower bounds on the energy required for a duplication event to take place. A specific example based on real ternary emulsions is provided in order to grasp the orders of magnitude involved in the problem. It is found that the minimal work that must be invested in the system to trigger a duplication event is of order $\sim 10^{-13}\,{\rm J}$. Without aiming to describe a truly biological process of duplication, this theoretical contribution seeks to explicitly define and identify the key actors that participate in it.
When noninteracting fermions are confined in a $D$-dimensional region of volume $\mathrm{O}(L^D)$ and subjected to a continuous (or piecewise continuous) potential $V$ that decays sufficiently fast with distance, the ground-state energy of the system does not depend on $V$ in the thermodynamic limit. Here, we discuss this theorem from several perspectives and derive a proof, valid in $D$ dimensions, for radially symmetric potentials. We find that this universality property holds under a quite mild condition on $V$, with or without bound states, and that it extends to thermal states. Moreover, it leads to an interesting analogy between Anderson's orthogonality catastrophe and first-order quantum phase transitions.
We present a rigorous theory that explains the origin of the unexpected periodic behavior seen in the consecutive differences between prime numbers. We also check our findings numerically to ensure that they hold for finite sequences of primes, such as those that would eventually appear in applications. Finally, our theory allows us to make connections with three different but important topics: the Hardy-Littlewood conjecture, the statistical mechanics of spin systems, and the celebrated Sierpinski fractal.
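The periodicity in consecutive prime differences mentioned above is easy to observe numerically. The following sketch (an illustrative companion, not the authors' analysis) tallies the gaps between consecutive primes below an arbitrarily chosen cutoff; gaps that are multiples of 6 turn out to be markedly over-represented.

```python
from collections import Counter

def primes_up_to(n):
    """Return all primes <= n via a simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            # Mark all multiples of p starting at p*p as composite.
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return [i for i, is_prime in enumerate(sieve) if is_prime]

ps = primes_up_to(10 ** 6)
gaps = [b - a for a, b in zip(ps, ps[1:])]
hist = Counter(gaps)

# Print the frequency of the smallest gaps; note the spikes at multiples of 6.
for g in sorted(hist)[:8]:
    print(g, hist[g])
```

Running this shows, for instance, that the gap 6 occurs more often than the gaps 2 and 4 combined in this range, a numerical fingerprint of the periodic structure the abstract refers to.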
The nature of the behaviour of an isolated many-body quantum system periodically driven in time has been an open question since the beginning of quantum mechanics. After an initial transient, such a system is known to synchronize with the driving; in contrast to the non-driven case, no fundamental principle has been proposed for constructing the resulting non-equilibrium state. Here, we analytically show that, for a class of integrable systems, the relevant ensemble is constructed by maximizing an appropriately defined entropy subject to constraints, which we explicitly identify. This result constitutes a generalisation of the concepts of equilibrium statistical mechanics to a class of far-from-equilibrium systems that until now was mainly accessible using ad hoc methods.
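As a toy illustration of the maximum-entropy construction described above (not the paper's integrable-system calculation), the following sketch finds the distribution over a hypothetical set of discrete levels that maximizes the Shannon entropy subject to a single mean-value constraint. The maximizer has the Gibbs form $p_n \propto e^{-\beta E_n}$, and the Lagrange multiplier $\beta$ is found by bisection; all names and values are assumptions made for the example.

```python
import numpy as np

def max_entropy_distribution(levels, target_mean, beta_lo=-50.0, beta_hi=50.0):
    """Maximum-entropy distribution over `levels` with fixed mean value.

    Maximizing Shannon entropy subject to <E> = target_mean yields
    p_n proportional to exp(-beta * E_n); beta is found by bisection,
    using the fact that the constrained mean decreases monotonically in beta.
    """
    levels = np.asarray(levels, dtype=float)

    def mean_at(beta):
        w = np.exp(-beta * (levels - levels.min()))  # shift for numerical stability
        return (levels * w).sum() / w.sum()

    lo, hi = beta_lo, beta_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > target_mean:
            lo = mid   # mean too high: need larger beta
        else:
            hi = mid   # mean too low: need smaller beta
    beta = 0.5 * (lo + hi)
    w = np.exp(-beta * (levels - levels.min()))
    return beta, w / w.sum()
```

For example, constraining three equally spaced levels to their midpoint mean returns $\beta \approx 0$ and the uniform distribution, as expected; several conserved charges would require one multiplier per constraint, which is the structure of the constrained ensembles the abstract refers to.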