Research on the use of information geometry (IG) in modern physics has witnessed significant advances recently. In this review article, we report on the utilization of IG methods to define measures of complexity in both classical and, whenever available, quantum physical settings. A paradigmatic example of a dramatic change in complexity is given by phase transitions (PTs). Hence we review both global and local aspects of PTs, described in terms of the scalar curvature of the parameter manifold and the components of the metric tensor, respectively. We also report on the behavior of geodesic paths on the parameter manifold used to gain insight into the dynamics of PTs. Going further, we survey measures of complexity arising in the geometric framework. In particular, we quantify the complexity of networks in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. We are also concerned with complexity measures that account for the interactions among a given number of parts of a system that cannot be described in terms of a smaller number of parts. Finally, we investigate complexity measures of entropic motion on curved statistical manifolds that arise from a probabilistic description of physical systems in the presence of limited information. The Kullback-Leibler divergence, the distance to an exponential family, and the volumes of curved parameter manifolds are examples of essential IG notions exploited in our discussion of complexity. We conclude by discussing strengths, limits, and possible future applications of IG methods to the physics of complexity.
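For orientation, the two IG notions most heavily used above can be stated in generic form (the conventions here are the standard ones and may differ in detail from those adopted in the review): for a family of distributions $p_\theta(x)$ labeled by points $\theta$ of a parameter manifold,
\[
D_{\mathrm{KL}}(p_\theta \,\|\, p_{\theta'}) = \int p_\theta(x)\,\ln\frac{p_\theta(x)}{p_{\theta'}(x)}\,dx ,
\qquad
g_{ij}(\theta) = \int p_\theta(x)\,\frac{\partial \ln p_\theta(x)}{\partial \theta^i}\,\frac{\partial \ln p_\theta(x)}{\partial \theta^j}\,dx ,
\]
so that $D_{\mathrm{KL}}(p_\theta \,\|\, p_{\theta+d\theta}) \approx \tfrac{1}{2}\, g_{ij}(\theta)\, d\theta^i d\theta^j$. The Fisher-Rao metric $g_{ij}$ supplies the scalar curvature and geodesics invoked for phase transitions, and its Riemannian volume element $\sqrt{\det g(\theta)}\, d\theta$ is the quantity integrated to obtain the volume-based network complexity measure mentioned above.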
In this thesis we present a few theoretical studies of models of self-organized criticality. Following a brief introduction to self-organized criticality, we discuss three main problems. The first problem concerns growing patterns formed in the abelian sandpile model (ASM). The patterns exhibit proportionate growth, where different parts of the pattern grow at the same rate, keeping the overall shape unchanged. This non-trivial property, often found in biological growth, has received increasing attention in recent years. In this thesis, we present a mathematical characterization of a large class of such patterns in terms of discrete holomorphic functions. In the second problem, we discuss a well-known model of self-organized criticality introduced by Zhang in 1989. We present an exact analysis of the model and quantitatively explain an intriguing property known as the emergence of quasi-units. In the third problem, we introduce an operator algebra to determine the steady state of a class of stochastic sandpile models.
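For readers unfamiliar with the ASM itself, a minimal sketch of its standard definition on a square lattice is given below; this illustrates only the toppling dynamics, not the pattern characterization developed in the thesis, and the lattice size L and grain number N are arbitrary illustrative choices.

    import numpy as np

    def relax(z, threshold=4):
        """Stabilize an abelian sandpile configuration z (2D integer array).
        Any site holding at least `threshold` grains topples: it loses
        `threshold` grains per toppling and sends one to each of its four
        neighbours; grains toppled across the open boundary are lost. The
        stable configuration is independent of the toppling order (abelian)."""
        z = z.copy()
        while True:
            unstable = np.argwhere(z >= threshold)
            if len(unstable) == 0:
                return z
            for i, j in unstable:
                n = z[i, j] // threshold          # topple this site n times at once
                z[i, j] -= n * threshold
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < z.shape[0] and 0 <= nj < z.shape[1]:
                        z[ni, nj] += n

    # Dropping N grains on a single site of an empty lattice and relaxing
    # produces the growing patterns whose proportionate growth is studied above.
    L, N = 61, 5000
    z = np.zeros((L, L), dtype=int)
    z[L // 2, L // 2] = N
    stable = relax(z)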
Many stochastic complex systems are characterized by the fact that their configuration space does not grow exponentially as a function of the degrees of freedom. The use of scaling expansions is a natural way to measure the asymptotic growth of the configuration space volume in terms of the scaling exponents of the system. These scaling exponents can, in turn, be used to define universality classes that uniquely determine the statistics of a system; every system belongs to one of these classes. Here we derive the information geometry of scaling expansions of sample spaces. In particular, we present the deformed logarithms and the metric in a systematic and coherent way. We observe a phase transition in the curvature. The phase transition can be well measured by the characteristic length r, corresponding to a ball of radius 2r having the same curvature as the statistical manifold. A characteristic length that increases with the size of the system is associated with sub-exponential sample space growth, which occurs in strongly constrained and correlated complex systems. A decreasing characteristic length corresponds to super-exponential sample space growth, which occurs, for example, in systems that develop structure as they evolve. Constant curvature corresponds to exponential sample space growth, which is associated with multinomial statistics and for which traditional Boltzmann-Gibbs, or Shannon, statistics applies. This allows us to characterize transitions between statistical manifolds corresponding to different families of probability distributions.
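A familiar example of the deformed logarithms referred to above, quoted here only as an illustration and not necessarily the specific family singled out by the scaling expansion, is the Tsallis q-logarithm,
\[
\ln_q(x) = \frac{x^{1-q} - 1}{1 - q}, \qquad \lim_{q \to 1} \ln_q(x) = \ln x ,
\]
which reduces to the ordinary logarithm of Boltzmann-Gibbs/Shannon statistics at q = 1. In the present framework, the appropriate deformation is dictated by how the sample-space volume W(N) grows with the degrees of freedom N: sub-exponentially, exponentially, or super-exponentially.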
Stochastic entropy production, which quantifies the difference between the probabilities of trajectories of a stochastic dynamics and of their time reversals, has a central role in nonequilibrium thermodynamics. In probability theory, a change in the statistical properties of observables can be represented by a change in the probability measure. We consider operators on the space of probability measures that induce changes in the statistical properties of a process, and formulate entropy productions in terms of these change-of-probability-measure (CPM) operators. This mathematical underpinning of the origin of entropy productions allows us to organize the various forms of fluctuation relations: all entropy productions have a non-negative mean value, admit the integral fluctuation theorem, and satisfy a rather general fluctuation relation. Other results, such as the transient fluctuation theorem and detailed fluctuation theorems, are then derived from the general fluctuation relation by imposing further constraints on the operator. We use a discrete-time, discrete-state-space Markov process to draw the contradistinction among three reversals of a process: time reversal, protocol reversal, and the dual process. The properties of their corresponding CPM operators are examined, and the domains of validity of various fluctuation relations for entropy productions in physics and chemistry are revealed. We also show that our CPM operator formalism helps us rather easily extend other fluctuation relations for excess work and heat, discuss the martingale properties of entropy productions, and derive the stochastic integral formulas for entropy productions in constant-noise diffusion processes with the Girsanov theorem. Our formalism provides a general and concise way to study the properties of entropy-related quantities in stochastic thermodynamics and information theory.
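In the standard path-space notation (generic here, not necessarily that of the paper), the entropy production along a trajectory $\omega$ is the logarithmic Radon-Nikodym derivative between the path measure of the process and that of its reversal, and the integral fluctuation theorem follows from the normalization of the reversed measure:
\[
\Sigma[\omega] = \ln \frac{\mathcal{P}[\omega]}{\widetilde{\mathcal{P}}[\widetilde{\omega}]},
\qquad
\big\langle e^{-\Sigma} \big\rangle = \sum_{\omega} \mathcal{P}[\omega]\, e^{-\Sigma[\omega]} = \sum_{\omega} \widetilde{\mathcal{P}}[\widetilde{\omega}] = 1 ,
\]
from which $\langle \Sigma \rangle \ge 0$ follows by Jensen's inequality. In the CPM language, $e^{-\Sigma}$ is the change-of-measure factor induced by the corresponding operator, and the choice of reversal (time reversal, protocol reversal, or the dual process) determines which entropy production is obtained.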
We study the time-averaged flow in a model of particles that randomly hop on a finite directed graph. In the limit where the number of particles and the time window go to infinity while the graph remains finite, the large-deviation rate functional of the average flow is given by a variational formulation involving paths of the density and flow. We give sufficient conditions under which the large deviations of a given time-averaged flow are determined by paths that are constant in time. We then consider a class of models on a discrete ring for which it is possible to show that a better strategy is obtained by producing a time-dependent path. This phenomenon, called a dynamical phase transition, is known to occur for some particle systems in the hydrodynamic scaling limit, and is thus extended here to the setting of a finite graph.
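Schematically, and in generic notation rather than that of the paper, the rate functional of the time-averaged flow j has the variational form
\[
I(j) = \lim_{T \to \infty}\, \inf \left\{ \frac{1}{T} \int_0^T \Phi(\rho_t, q_t)\, dt \;:\; \frac{1}{T} \int_0^T q_t\, dt = j \right\},
\]
where the infimum runs over paths of the density $\rho_t$ and flow $q_t$, and $\Phi$ is the joint density-flow cost per unit time. When the infimum is attained by time-constant paths this reduces to $\inf_{\rho} \Phi(\rho, j)$; the dynamical phase transition discussed above is precisely the regime in which a time-dependent path achieves a strictly lower cost.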
Large deviation theory and instanton calculus for stochastic systems are widely used to gain insight into the evolution and probability of rare events. At their core lies the realization that rare events are, under the right circumstances, dominated by their least unlikely realization. Their computation through a saddle-point approximation of the path integral for the corresponding stochastic field theory then reduces an inefficient stochastic sampling problem to a deterministic optimization problem: finding the path of smallest action, the instanton. In the presence of heavy tails, though, standard algorithms to compute the instanton critically fail to converge. The reason for this failure is the divergence of the scaled cumulant generating function (CGF) due to a non-convex large deviation rate function. We propose a solution to this problem by convexifying the rate function through a nonlinear reparametrization of the observable, which allows us to compute instantons even in the presence of super-exponential or algebraic tail decay. The approach is generalizable to other situations where the existence of the CGF is required, such as exponential tilting in importance sampling for Monte Carlo algorithms. We demonstrate the proposed formalism by applying it to rare events in several stochastic systems with heavy tails, including extreme power spikes in fiber optics induced by soliton formation.
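In generic notation (the symbols below are illustrative, not the paper's), the saddle-point reduction reads
\[
\mathbb{P}\big(f[\phi] = a\big) \asymp e^{-S[\phi^*_a]/\varepsilon},
\qquad
\phi^*_a = \arg\min_{\phi \,:\, f[\phi] = a} S[\phi],
\]
with $\varepsilon$ the small noise parameter, $S$ the action of the stochastic field theory, and $\phi^*_a$ the instanton for the observable value $a$. Standard instanton algorithms locate $\phi^*_a$ by tilting with a dual parameter conjugate to $a$, which presupposes a finite scaled CGF $\lambda(k) = \sup_a [\,k a - I(a)\,]$; for heavy-tailed observables $I(a)$ is non-convex and $\lambda(k)$ diverges. The remedy proposed above is instead to optimize for a nonlinearly reparametrized observable $g(f[\phi])$ whose rate function is convex, and to recover $I(a)$ from the rate function of $g$ evaluated at $g(a)$.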