We discuss a non-reversible Markov chain Monte Carlo (MCMC) algorithm for particle systems, in which the direction of motion evolves deterministically. This sequential direction-sweep MCMC generalizes the widespread MCMC sweep methods for particle or spin indices. Sequential direction-sweep MCMC can be applied to a wide range of reversible or non-reversible Markov chains, such as the Metropolis algorithm or the event-chain Monte Carlo algorithm. For a simplified two-dimensional dipole model, we show rigorously that sequential MCMC leaves the stationary probability distribution unchanged, yet profoundly modifies the Markov-chain trajectory. Long excursions with persistent rotation in one direction alternate with long sequences of rapid zigzags that result in persistent rotation in the opposite direction. We show that sequential MCMC can have shorter mixing times than algorithms with random updates of directions. We point out possible applications of sequential MCMC in polymer physics and in molecular simulation.
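As a toy illustration of the direction-sweep idea (not the paper's dipole model), the sketch below samples a two-dimensional harmonic potential with a Metropolis filter whose proposal direction cycles deterministically through a fixed set of angles instead of being drawn at random; the function name `sweep_metropolis` and all parameter choices are hypothetical.

```python
import math
import random

def sweep_metropolis(n_steps, beta=1.0, step=0.5, n_dir=8, seed=0):
    """Metropolis sampling of a 2D harmonic potential U(x, y) = (x^2 + y^2)/2.

    The proposal direction is not drawn at random: it cycles deterministically
    through n_dir equally spaced angles (a sequential direction sweep), while
    the symmetric move length keeps every individual step reversible, so the
    stationary Boltzmann distribution is unchanged."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    samples = []
    for t in range(n_steps):
        theta = 2.0 * math.pi * (t % n_dir) / n_dir   # deterministic sweep
        r = step * (2.0 * rng.random() - 1.0)         # symmetric displacement
        xn, yn = x + r * math.cos(theta), y + r * math.sin(theta)
        d_u = 0.5 * (xn * xn + yn * yn - x * x - y * y)
        if d_u <= 0.0 or rng.random() < math.exp(-beta * d_u):
            x, y = xn, yn                             # Metropolis acceptance
        samples.append((x, y))
    return samples
```

Because each individual step uses a symmetric proposal along the current sweep direction, the chain satisfies detailed balance step by step; the deterministic rotation of directions only changes the trajectory, not the target distribution.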
The event-chain Monte Carlo (ECMC) method is an irreversible Markov process based on the factorized Metropolis filter and the concept of lifted Markov chains. Here, ECMC is applied to all-atom models of multi-particle interactions that include the long-ranged Coulomb potential. We discuss a line-charge model for the Coulomb potential and demonstrate its equivalence with the standard Coulomb model with tin-foil boundary conditions. Efficient factorization schemes for the potentials used in all-atom water models are presented, before we discuss the best choice of lifting schemes for factors of more than three particles. The factorization and lifting schemes are then applied to simulations of point-charge and charged-dipole Coulomb gases, as well as to small systems of liquid water. For a locally charge-neutral system in three dimensions, the algorithmic complexity is O(N log N) in the number N of particles. As a particle-particle method, ECMC achieves this complexity without the interpolating mesh required for the efficient implementation of other modern Coulomb algorithms. An event-driven, cell-veto-based implementation samples the equilibrium Boltzmann distribution using neither time-step approximations nor spatial cutoffs on the range of the interaction potentials. We discuss prospects and challenges for ECMC in soft condensed-matter and biological physics.
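The factorized Metropolis filter underlying ECMC accepts a move with the product of per-factor Metropolis probabilities, rather than applying a single filter to the total energy change. A minimal sketch of both filters (function names are illustrative, not from the papers):

```python
import math

def metropolis_filter(d_us, beta=1.0):
    """Standard Metropolis filter: accept with min(1, exp(-beta * sum dU))."""
    return min(1.0, math.exp(-beta * sum(d_us)))

def factorized_metropolis_filter(d_us, beta=1.0):
    """Factorized Metropolis filter: the acceptance probability factorizes
    over the energy changes dU_f of the individual interaction factors."""
    p = 1.0
    for d_u in d_us:
        p *= min(1.0, math.exp(-beta * d_u))
    return p
```

The factorized acceptance probability is never larger than the standard one; the two coincide when all factor energy changes have the same sign, and differ when increases and decreases of factor energies would otherwise compensate.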
We describe a simple method that can be used to sample the rare fluctuations of discrete-time Markov chains. We focus on the case of Markov chains with well-defined steady-state measures, and derive expressions for the large-deviation rate functions (and upper bounds on such functions) for dynamical quantities extensive in the length of the Markov chain. We illustrate the method using a series of simple examples, and use it to study the fluctuations of a lattice-based model of active matter that can undergo motility-induced phase separation.
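As a concrete instance of the kind of large-deviation computation described above, the scaled cumulant generating function of a dynamical quantity extensive in the chain length (here the "activity", i.e. the number of state changes of a two-state chain) can be obtained as the logarithm of the largest eigenvalue of a tilted transition matrix; the rate function then follows by Legendre transform. This is a generic textbook construction, not the specific sampling method of the paper, and the parameter names are hypothetical:

```python
import math

def scgf(s, p=0.3, q=0.6, iters=2000):
    """Scaled cumulant generating function lambda(s) for the number of state
    changes of a two-state Markov chain with flip probabilities p (0 -> 1)
    and q (1 -> 0): the log of the largest eigenvalue of the tilted matrix,
    computed by power iteration."""
    # Tilted transition matrix: flip (off-diagonal) entries weighted by e^s.
    t = [[1.0 - p, p * math.exp(s)],
         [q * math.exp(s), 1.0 - q]]
    v = [1.0, 1.0]
    lam = 1.0
    for _ in range(iters):
        w = [t[0][0] * v[0] + t[0][1] * v[1],
             t[1][0] * v[0] + t[1][1] * v[1]]
        lam = max(w)                 # Perron eigenvalue estimate
        v = [w[0] / lam, w[1] / lam]
    return math.log(lam)
```

At s = 0 the tilted matrix is the stochastic transition matrix itself, so lambda(0) = 0; positive (negative) s biases the chain towards more (fewer) state changes.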
We analyze the convergence of the irreversible event-chain Monte Carlo algorithm for continuous spin models in the presence of topological excitations. In the two-dimensional XY model, we show that the local nature of the Markov-chain dynamics leads to slow decay of vortex-antivortex correlations, while spin waves decorrelate very quickly. Using a Fréchet description of the maximum vortex-antivortex distance, we quantify the contributions of topological excitations to the equilibrium correlations, and show that they vary from a dynamical critical exponent z ∼ 2 at the critical temperature to z ∼ 0 in the limit of zero temperature. We confirm the event-chain algorithm's fast relaxation (corresponding to z = 0) of spin waves in the harmonic approximation to the XY model. Mixing times (describing the approach towards equilibrium from the least favorable initial state), however, remain much larger than equilibrium correlation times at low temperatures. We also describe the respective influence of topological monopole-antimonopole excitations and of spin waves on the event-chain dynamics in the three-dimensional Heisenberg model.
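The vortex and antivortex excitations discussed above are conventionally located by the winding number of the spin angles around an elementary plaquette. A minimal sketch of this standard construction (the function and its argument convention are illustrative, not the paper's code):

```python
import math

def vorticity(theta_a, theta_b, theta_c, theta_d):
    """Winding number of XY spin angles around one plaquette, with the four
    sites visited in counter-clockwise order a -> b -> c -> d -> a.
    Returns 0 for a smooth spin-wave configuration, +1 for a vortex,
    and -1 for an antivortex."""
    def wrap(d):
        # Map an angle difference into (-pi, pi].
        return (d + math.pi) % (2.0 * math.pi) - math.pi

    total = (wrap(theta_b - theta_a) + wrap(theta_c - theta_b)
             + wrap(theta_d - theta_c) + wrap(theta_a - theta_d))
    return round(total / (2.0 * math.pi))
```

Summing each angle difference only after wrapping it into (-pi, pi] is what makes the result a topological integer rather than zero by construction.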
We study the continuous one-dimensional hard-sphere model and present irreversible local Markov chains that mix on faster time scales than the reversible heat-bath or Metropolis algorithms. The mixing time scales appear to fall into two distinct universality classes, both faster than for reversible local Markov chains. The event-chain algorithm, the infinitesimal limit of one of these Markov chains, belongs to the class presenting the fastest decay. For the lattice-gas limit of the hard-sphere model, reversible local Markov chains correspond to the symmetric simple exclusion process (SEP) with periodic boundary conditions. The two universality classes for irreversible Markov chains are realized by the totally asymmetric simple exclusion process (TASEP), and by a faster variant (lifted TASEP) that we propose here. Lifted Markov chains and the recently introduced factorized Metropolis acceptance rule extend the irreversible Markov chains discussed here to general pair interactions and to higher dimensions.
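The TASEP mentioned above is simple to simulate directly. The sketch below implements one random-sequential Monte Carlo sweep of the TASEP on a ring (plain TASEP only; the lifted variant proposed in the paper is not reproduced here, and the function name is hypothetical):

```python
import random

def tasep_sweep(occ, rng):
    """One random-sequential sweep of the totally asymmetric simple exclusion
    process (TASEP) on a ring: L random site picks; the particle on a picked
    site hops one site to the right whenever the target site is empty."""
    length = len(occ)
    for _ in range(length):
        i = rng.randrange(length)
        j = (i + 1) % length           # periodic boundary conditions
        if occ[i] == 1 and occ[j] == 0:
            occ[i], occ[j] = 0, 1      # exclusion: at most one particle/site
    return occ
```

Particle number is conserved, and all moves go in one direction, which is precisely what makes the dynamics irreversible while still preserving the uniform stationary measure on the ring.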
An overarching action principle, the principle of minimal free action, exists for ergodic Markov-chain dynamics. Using this principle and the detailed fluctuation theorem, we construct a dynamic ensemble theory for non-equilibrium steady states (NESS) of Markov chains, in full analogy with equilibrium canonical ensemble theory. Concepts such as energy, free energy, Boltzmann macro-states, entropy, and the thermodynamic limit all have their dynamic counterparts. For reversible Markov chains, minimization of the Boltzmann free action yields thermal equilibrium states, and hence provides a dynamic justification of the principle of minimal free energy. For irreversible Markov chains, minimization of the Boltzmann free action selects the stable NESS and determines its macroscopic properties, including entropy production. A quadratic approximation of the free action leads to linear-response theory with reciprocal relations built in. Hence, insofar as non-equilibrium phenomena can be modeled as Markov processes, minimal free action serves as a basic principle for both equilibrium and non-equilibrium statistical physics.