Understanding the generative mechanism of a natural system is a vital component of the scientific method. Here, we investigate one of the fundamental steps toward this goal by presenting the minimal generator of an arbitrary binary Markov process. This is a class of processes whose predictive model is well known. Surprisingly, the generative model requires three distinct topologies for different regions of parameter space. We show that a previously proposed generator for a particular set of binary Markov processes is, in fact, not minimal. Our results shed the first quantitative light on the relative (minimal) costs of prediction and generation. We find, for instance, that the difference between prediction and generation is maximized when the process is approximately independent and identically distributed.
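For concreteness, the following is a minimal sketch (not taken from the paper) of the classical predictive side of this comparison: a binary Markov process parameterized by assumed transition probabilities p = Pr(1|0) and q = Pr(0|1), with the statistical complexity (the entropy of the stationary state distribution) as the standard minimal predictive memory against which generative costs are compared. Function names and parameter values are illustrative only.

```python
# Minimal sketch (not from the paper): predictive costs of a binary Markov
# process with transition probabilities p = Pr(1|0) and q = Pr(0|1).
import numpy as np

def binary_entropy(x):
    """Shannon entropy (in bits) of a Bernoulli(x) distribution."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

def predictive_costs(p, q):
    """Stationary distribution, entropy rate, and statistical complexity of the
    two-state chain that flips 0 -> 1 with probability p and 1 -> 0 with q."""
    pi0 = q / (p + q)          # stationary probability of symbol 0
    pi1 = p / (p + q)          # stationary probability of symbol 1
    entropy_rate = pi0 * binary_entropy(p) + pi1 * binary_entropy(q)
    # Memory of the optimal predictor that stores the last symbol (p != 1 - q).
    statistical_complexity = binary_entropy(pi0)
    return (pi0, pi1), entropy_rate, statistical_complexity

# Near-i.i.d. regime (p close to 1 - q), where the prediction/generation gap is
# reported to be largest; the exact values here are purely illustrative.
print(predictive_costs(p=0.49, q=0.51))
```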
Tensor network (TN) techniques - often used in the context of quantum many-body physics - have shown promise as a tool for tackling machine learning (ML) problems. The application of TNs to ML, however, has mostly focused on supervised and unsupervised learning. Yet, with their direct connection to hidden Markov chains, TNs are also naturally suited to Markov decision processes (MDPs), which provide the foundation for reinforcement learning (RL). Here we introduce a general TN formulation of finite, episodic and discrete MDPs. We show how this formulation allows us to exploit algorithms developed for TNs for policy optimisation, the key aim of RL. As an application we consider the issue - formulated as an RL problem - of finding a stochastic evolution that satisfies specific dynamical conditions, using the simple example of random walk excursions as an illustration.
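As a schematic illustration (not the paper's construction): for a finite, episodic MDP with a fixed stochastic policy, the probability of a state-action trajectory factorises into policy tensors pi[s, a] and transition tensors P[s, a, s'], so expectations over trajectories reduce to tensor contractions. All array names, dimensions and parameter values below are assumptions made for the example.

```python
# Illustrative sketch: expected return of a fixed policy in a finite, episodic
# MDP computed by contracting policy and transition tensors (a simple TN view).
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions, horizon = 3, 2, 4

# Random transition kernel P[s, a, s'] and policy pi[s, a], rows normalised.
P = rng.random((n_states, n_actions, n_states))
P /= P.sum(axis=2, keepdims=True)
pi = rng.random((n_states, n_actions))
pi /= pi.sum(axis=1, keepdims=True)
reward = rng.random((n_states, n_actions))   # r(s, a)
rho0 = np.full(n_states, 1.0 / n_states)     # initial state distribution

# One-step "transfer matrix": M[s, s'] = sum_a pi[s, a] * P[s, a, s'].
M = np.einsum('sa,sap->sp', pi, P)

# Per-step expected reward under the policy: r_pi[s] = sum_a pi[s, a] * r(s, a).
r_pi = np.einsum('sa,sa->s', pi, reward)

# Contract the chain of transfer matrices to get the expected episodic return.
expected_return, state_dist = 0.0, rho0
for _ in range(horizon):
    expected_return += state_dist @ r_pi
    state_dist = state_dist @ M

print(expected_return)
```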
This PhD thesis deals with the Markov picture of developed turbulence from the theoretical point of view. The thesis consists of two parts. The first part introduces stochastic thermodynamics; the second part aims at transferring the concepts of stochastic thermodynamics to developed turbulence. / Central to stochastic thermodynamics are Markov processes. An elementary example is Brownian motion. In contrast to macroscopic thermodynamics, the work done and the entropy produced along single trajectories of a Brownian particle are random quantities. Statistical properties of such fluctuating quantities are central in the field of stochastic thermodynamics. Prominent results are so-called fluctuation theorems, which express the balance between production and consumption of entropy and generalise the second law. / Turbulent cascades of eddies are assumed to be the predominant mechanism of turbulence generation and determine the statistical properties of developed turbulent flows. An intriguing phenomenon of developed turbulence, known as small-scale intermittency, is the occurrence of violent small-scale fluctuations in the flow velocity that exceed any Gaussian prediction. / In analogy to Brownian motion, it is demonstrated in the thesis how the assumption of the Markov property leads to a Markov process for the turbulent cascade that is equivalent to the seminal K62 model. Beyond the K62 model, it is demonstrated how many other models of turbulence can be written as Markov processes, including scaling laws, multiplicative cascades, multifractal models and field-theoretic approaches. Based on the various Markov processes, the production of entropy along the cascade and the corresponding fluctuation theorems are discussed. In particular, experimental data indicates that entropy consumption is linked to small-scale intermittency, and a connection between entropy consumption and an inverse cascade is suggested.
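As a hedged numerical illustration of an integral fluctuation relation of the kind invoked above (a textbook Brownian-motion example, not the thesis' turbulence setting): for a Brownian particle in a harmonic trap dragged at constant speed the free-energy change vanishes, so the work statistics satisfy <exp(-W/kT)> = 1. All parameter values below are illustrative.

```python
# Overdamped Langevin simulation of a dragged harmonic trap; checks the
# integral fluctuation relation <exp(-W/kT)> = 1 (Jarzynski with Delta F = 0).
import numpy as np

rng = np.random.default_rng(1)
k, gamma, kT = 1.0, 1.0, 1.0          # trap stiffness, friction, thermal energy
v, t_final, dt = 0.5, 2.0, 1e-3       # drag speed, protocol duration, time step
n_traj = 5000

n_steps = int(t_final / dt)
x = rng.normal(0.0, np.sqrt(kT / k), size=n_traj)   # equilibrium initial state
work = np.zeros(n_traj)

for step in range(n_steps):
    lam = v * step * dt                              # trap centre position
    # dW = (dU/d lambda) * d lambda, with U = (k/2)(x - lambda)^2.
    work += -k * (x - lam) * v * dt
    noise = rng.normal(size=n_traj)
    x += -(k / gamma) * (x - lam) * dt + np.sqrt(2 * kT * dt / gamma) * noise

print(np.mean(np.exp(-work / kT)))                   # should be close to 1
```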
Non-equilibrium Markov State Modeling (MSM) has recently been proposed [Phys. Rev. E 94, 053001 (2016)] as a possible route to construct a physical theory of sliding friction from a long steady-state atomistic simulation: the approach constructs a small set of collective variables that obey a transition-matrix-based equation of motion and faithfully describe the slow motions of the system. A crucial question is whether this approach can be extended from the original small-size 1D demonstration to larger and more realistic systems, without an inordinate increase in the number and complexity of the collective variables. Here we present a direct application of the MSM scheme to the sliding of an island made of over 1000 harmonically bound particles across a 2D periodic potential. Based on a totally unprejudiced phase-space metric and without requiring any special doctoring, we find that here too the scheme allows extracting a very small number of slow variables, necessary and sufficient to describe the dynamics of island sliding.
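The following is a generic sketch of the MSM building blocks referred to here (not the paper's specific pipeline): estimating a transition matrix from a discretised trajectory at a chosen lag time and reading off the slow relaxation modes from its eigenvalues. The toy three-state trajectory stands in for a clustered sliding or molecular-dynamics trajectory; all numbers are illustrative.

```python
# Generic Markov State Model construction: count matrix -> transition matrix ->
# implied timescales of the slow modes.
import numpy as np

def estimate_transition_matrix(dtraj, n_states, lag):
    """Row-normalised count matrix T[i, j] ~ Pr(state j at t+lag | state i at t)."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(dtraj[:-lag], dtraj[lag:]):
        counts[i, j] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def implied_timescales(T, lag):
    """Relaxation timescales t_k = -lag / ln|lambda_k| of the non-stationary modes."""
    eigvals = np.sort(np.abs(np.linalg.eigvals(T)))[::-1]
    return -lag / np.log(eigvals[1:])        # skip the stationary eigenvalue 1

# Toy discrete trajectory hopping between 3 metastable states.
rng = np.random.default_rng(2)
T_true = np.array([[0.98, 0.01, 0.01],
                   [0.02, 0.97, 0.01],
                   [0.01, 0.02, 0.97]])
dtraj = [0]
for _ in range(50000):
    dtraj.append(rng.choice(3, p=T_true[dtraj[-1]]))
dtraj = np.array(dtraj)

T_hat = estimate_transition_matrix(dtraj, n_states=3, lag=10)
print(implied_timescales(T_hat, lag=10))
```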
Markov State Modeling has recently emerged as a key technique for analyzing rare events in thermal-equilibrium molecular simulations and finding metastable states. Here we export this technique to the study of friction, where strongly non-equilibrium events are induced by an external force. The approach is benchmarked on the well-studied Frenkel-Kontorova model, where we demonstrate the unprejudiced identification of the minimal basis of microscopic states necessary to describe sliding, stick-slip and dissipation. The steps necessary for the application to realistic frictional systems are highlighted.
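For reference, a minimal sketch of a driven Frenkel-Kontorova chain of the kind used as a benchmark, evolved with simple overdamped dynamics; the spring constant, substrate strength, driving force and chain length below are illustrative choices, not the paper's parameters.

```python
# Driven Frenkel-Kontorova chain: harmonic springs plus a sinusoidal substrate
# potential and a uniform external force, integrated with an overdamped Euler step.
import numpy as np

def fk_forces(x, k_spring=1.0, a=1.0, V0=0.2, b=1.0, f_ext=0.05):
    """Forces on a chain with springs of natural length a, substrate potential
    V(x) = (V0/2)(1 - cos(2 pi x / b)), and external driving force f_ext."""
    spring = np.zeros_like(x)
    spring[:-1] += k_spring * (x[1:] - x[:-1] - a)   # pull from the right bond
    spring[1:]  -= k_spring * (x[1:] - x[:-1] - a)   # reaction on the left bond
    substrate = -(np.pi * V0 / b) * np.sin(2 * np.pi * x / b)
    return spring + substrate + f_ext

x = np.arange(32) * 1.05          # slightly incommensurate initial spacing
for _ in range(10000):
    x += 0.01 * fk_forces(x)      # overdamped dynamics, unit friction
print(np.mean(np.diff(x)))        # average lattice spacing after relaxation
```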
For a finite-state Markov process and a finite collection $\{\Gamma_k,\, k \in K\}$ of subsets of its state space, let $\tau_k$ be the first time the process visits the set $\Gamma_k$. We derive explicit/recursive formulas for the joint density and tail probabilities of the stopping times $\{\tau_k,\, k \in K\}$. The formulas are natural generalizations of those associated with the jump times of a simple Poisson process. We give a numerical example and indicate the relevance of our results to credit risk modeling.
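A hedged numerical cross-check of the simplest discrete-time analogue (a single target set only, whereas the paper treats joint laws of several hitting times): the tail probability $\Pr(\tau_\Gamma > n)$ equals $\alpha_D P_D^n \mathbf{1}$, with $P_D$ the transition matrix restricted to states outside $\Gamma$. The chain and the set $\Gamma$ below are invented for the example.

```python
# Tail probability of a first hitting time for a finite-state, discrete-time
# Markov chain, via the substochastic matrix on the complement of the target set.
import numpy as np

P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.1, 0.3, 0.6]])          # illustrative 3-state transition matrix
alpha = np.array([1.0, 0.0, 0.0])        # start in state 0
Gamma = {2}                              # target set: first visit to state 2
keep = [s for s in range(len(P)) if s not in Gamma]

P_D = P[np.ix_(keep, keep)]              # substochastic block outside Gamma
alpha_D = alpha[keep]

def tail_probability(n):
    """Pr(tau_Gamma > n) for a chain started outside Gamma."""
    return alpha_D @ np.linalg.matrix_power(P_D, n) @ np.ones(len(keep))

print([round(tail_probability(n), 4) for n in range(6)])
```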