Big data often has emergent structure that exists at multiple levels of abstraction, which are useful for characterizing complex interactions and dynamics of the observations. Here, we consider multiple levels of abstraction via a multiresolution geometry of data points at different granularities. To construct this geometry we define a time-inhomogeneous diffusion process that effectively condenses data points together to uncover nested groupings at larger and larger granularities. This inhomogeneous process creates a deep cascade of intrinsic low pass filters on the data affinity graph that are applied in sequence to gradually eliminate local variability while adjusting the learned data geometry to increasingly coarser resolutions. We provide visualizations to exhibit our method as a continuously-hierarchical clustering with directions of eliminated variation highlighted at each step. The utility of our algorithm is demonstrated via neuronal data condensation, where the constructed multiresolution data geometry uncovers the organization, grouping, and connectivity between neurons.
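The core idea of condensation can be sketched in a few lines. The following is a minimal, illustrative implementation (function names, the kernel bandwidth `sigma`, and the toy data are our own assumptions, not the paper's exact algorithm): a row-stochastic diffusion operator acts as a low-pass filter on the affinity graph, and applying it to the coordinates themselves contracts points into nested groupings at coarser and coarser resolutions.

```python
import numpy as np

# Hypothetical sketch of diffusion condensation: repeatedly build a
# row-stochastic diffusion operator from the current point positions and
# apply it to the points themselves, so local variability is filtered out
# and clusters gradually contract.
def condense(X, sigma=0.5, steps=10):
    X = X.astype(float).copy()
    trajectory = [X.copy()]
    for _ in range(steps):
        # Gaussian affinity between the current (partially condensed) points.
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        K = np.exp(-d2 / sigma**2)
        P = K / K.sum(axis=1, keepdims=True)  # row-stochastic diffusion operator
        X = P @ X                             # one intrinsic low-pass filtering step
        trajectory.append(X.copy())
    return trajectory

# Two well-separated blobs: points within each blob condense toward its
# mean, while the blobs themselves stay apart.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
traj = condense(X, sigma=0.5, steps=20)
```

Inspecting `traj` at successive steps gives the continuously-hierarchical clustering view: early iterates preserve fine structure, later ones only the coarse grouping.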
A classic theorem in the theory of connections on principal fiber bundles states that the evaluation of all holonomy functions gives enough information to characterize the bundle structure (among those sharing the same structure group and base manifold) and the connection up to a bundle equivalence map. This result and other important properties of holonomy functions have encouraged their use as the primary ingredient in the construction of families of quantum gauge theories. However, in these applications the set of holonomy functions used is often a discrete proper subset of the set of holonomy functions needed for the characterization theorem to hold. We show that the evaluation of a discrete set of holonomy functions does not characterize the bundle and does not constrain the connection modulo gauge appropriately. We exhibit a discrete set of functions of the connection and prove that, in the abelian case, their evaluation characterizes the bundle structure (up to equivalence) and constrains the connection modulo gauge up to local details ignored when working at a given scale. The main ingredient is the Lie-algebra-valued curvature function $F_S(A)$ defined below. It covers the holonomy function in the sense that $\exp F_S(A) = \mathrm{Hol}(l = \partial S, A)$.
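The covering relation quoted at the end of the abstract is modeled on a standard abelian identity (stated here for a $U(1)$ connection with Lie-algebra-valued $A$, a convention we assume for illustration): by Stokes' theorem the holonomy of a contractible loop $l = \partial S$ is determined by the curvature integrated over any surface $S$ it bounds,

```latex
\mathrm{Hol}(\partial S, A)
  \;=\; \exp\!\Big(\oint_{\partial S} A\Big)
  \;=\; \exp\!\Big(\int_{S} F\Big),
\qquad F = \mathrm{d}A .
```

In this reading, $F_S(A)$ plays the role of $\int_S F$: it carries strictly more information than the holonomy, since the exponential map forgets the integer ambiguity in the surface integral.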
We consider the application of fluctuation relations to the dynamics of coarse-grained systems, as might arise in a hypothetical experiment in which a system is monitored with a low-resolution measuring apparatus. We analyze a stochastic, Markovian jump process with a specific structure that lends itself naturally to coarse-graining. A perturbative analysis yields a reduced stochastic jump process that approximates the coarse-grained dynamics of the original system. This leads to a non-trivial fluctuation relation that is approximately satisfied by the coarse-grained dynamics. We illustrate our results by computing the large deviations of a particular stochastic jump process. Our results highlight the possibility that observed deviations from fluctuation relations might be due to the presence of unobserved degrees of freedom.
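The large-deviation computation mentioned above can be illustrated on a generic model (our own toy example, not necessarily the process studied in the paper): for the net current on a 3-state ring with forward rate $p$ and backward rate $q$, the scaled cumulant generating function $\lambda(s)$ is the largest eigenvalue of the $s$-tilted generator, and the Gallavotti-Cohen fluctuation symmetry reads $\lambda(s) = \lambda(-s - \ln(p/q))$.

```python
import numpy as np

# Illustrative toy model: current large deviations on a 3-state ring.
# The tilted generator weights forward jumps by e^{+s} and backward
# jumps by e^{-s}; its largest eigenvalue is the SCGF lambda(s).
def scgf(s, p=2.0, q=0.5, n=3):
    L = np.zeros((n, n))
    for i in range(n):
        L[(i + 1) % n, i] = p * np.exp(s)    # forward jump, tilted by e^{+s}
        L[(i - 1) % n, i] = q * np.exp(-s)   # backward jump, tilted by e^{-s}
        L[i, i] = -(p + q)                   # total escape rate from state i
    return np.linalg.eigvals(L).real.max()

# Fluctuation symmetry lambda(s) = lambda(-s - ln(p/q)); an apparent
# violation of such a relation in coarse-grained data could signal
# unobserved degrees of freedom.
lhs = scgf(0.4)
rhs = scgf(-0.4 - np.log(2.0 / 0.5))
```

Here the symmetry holds exactly; the abstract's point is that after coarse-graining only an approximate version of such a relation survives.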
Developing effective descriptions of the microscopic dynamics of many physical phenomena can both dramatically enhance their computational exploration and lead to a more fundamental understanding of the underlying physics. Previously, an effective description of a driven interface in the presence of mobile impurities, based on an Ising variant model and a single empirical coarse variable, was partially successful; yet it underlined the necessity of selecting additional coarse variables in certain parameter regimes. In this paper we use a data mining approach to help identify the coarse variables required. We discuss the implementation of this diffusion map approach, the selection of a similarity measure between system snapshots required in the approach, and the correspondence between empirically selected and automatically detected coarse variables. We conclude by illustrating the use of the diffusion map variables in assisting the atomistic simulations, and we discuss the translation of information between fine and coarse descriptions using lifting and restriction operators.
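The diffusion map construction referred to above can be sketched as follows (a minimal version with an illustrative similarity measure and bandwidth `epsilon`, not the paper's exact pipeline): pairwise distances between system snapshots define a Markov matrix whose leading non-trivial eigenvectors serve as automatically detected coarse variables.

```python
import numpy as np

# Minimal diffusion map sketch: from a pairwise distance matrix D between
# snapshots, build a Gaussian kernel, row-normalize it into a Markov matrix,
# and use the leading non-trivial eigenvectors as coarse variables.
def diffusion_map(D, epsilon, n_coords=2):
    K = np.exp(-D**2 / epsilon)              # kernel from the similarity measure
    P = K / K.sum(axis=1, keepdims=True)     # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial constant eigenvector (eigenvalue 1).
    return vecs[:, 1:1 + n_coords] * vals[1:1 + n_coords]

# Snapshots lying along a one-dimensional curve: the first diffusion
# coordinate should parametrize position along the curve.
t = np.linspace(0, 1, 50)
X = np.stack([np.cos(2 * t), np.sin(2 * t)], axis=1)   # snapshots on an arc
D = np.sqrt(((X[:, None] - X[None, :]) ** 2).sum(-1))
coords = diffusion_map(D, epsilon=0.05)
```

The correspondence test described in the abstract amounts to checking that coordinates like `coords[:, 0]` track the empirically selected coarse variable.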
The fluctuation-dissipation theorem is a central result in statistical mechanics and is usually formulated for systems described by diffusion processes. In this paper, we propose a generalization for a wider class of stochastic processes, namely the class of Markov processes that satisfy detailed balance and a large-deviation principle. The generalized fluctuation-dissipation theorem characterizes the deterministic limit of such a Markov process as a generalized gradient flow, a mathematical tool to model a purely irreversible dynamics via a dissipation potential and an entropy function: these are expressed in terms of the large-deviation dynamic rate function of the Markov process and its stationary distribution. We exploit the generalized fluctuation-dissipation theorem to develop a new method of coarse-graining and test it in the context of the passage from the diffusion in a double-well potential to the jump process that describes the simple reaction $A \rightleftarrows B$ (Kramers escape problem).
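The double-well test case admits a compact numerical sketch (with an illustrative potential and noise strength of our choosing): diffusion in a symmetric double well $V(x) = (x^2-1)^2/4$ at small noise $\varepsilon$ is coarse-grained to a two-state jump process $A \rightleftarrows B$ whose rate is given by the Kramers escape formula.

```python
import numpy as np

# Hypothetical sketch of the diffusion-to-jump coarse-graining: a symmetric
# double well V(x) = (x^2 - 1)^2 / 4 has minima at x = +-1 and a barrier at
# x = 0; at small noise eps, well-to-well hopping is a Poisson jump process
# with the Kramers rate sqrt(V''(min) |V''(saddle)|) / (2 pi) * exp(-dV/eps).
def kramers_rate(eps):
    V = lambda x: (x**2 - 1) ** 2 / 4.0
    d2V = lambda x: 3 * x**2 - 1                 # second derivative V''(x)
    barrier = V(0.0) - V(1.0)                    # barrier height dV = 1/4
    prefac = np.sqrt(d2V(1.0) * abs(d2V(0.0))) / (2 * np.pi)
    return prefac * np.exp(-barrier / eps)

# Reduced generator of the two-state jump process A <-> B (symmetric wells,
# so forward and backward rates coincide).
k = kramers_rate(eps=0.05)
Q = np.array([[-k, k], [k, -k]])
```

The exponential sensitivity of `k` to `eps` is what makes the deterministic (small-noise) limit a genuinely irreversible, gradient-flow-like dynamics.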
We study the coarse-graining approach for deriving a generator for the evolution of an open quantum system over a finite time interval. The approach does not require a secular approximation but nevertheless generally leads to a generator of Lindblad-Gorini-Kossakowski-Sudarshan form. By combining the formalism with Full Counting Statistics, we demonstrate a consistent thermodynamic framework once the switching work required for coupling to and decoupling from the reservoir is included. In particular, we can write the second law in standard form, with the only difference that heat currents must be defined with respect to the reservoir. We illustrate our findings with simple but pedagogical examples.
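The defining property of a Lindblad-Gorini-Kossakowski-Sudarshan generator can be checked numerically on a generic example (a decaying qubit of our own choosing, not the paper's model): the right-hand side $\dot\rho = -i[H,\rho] + L\rho L^\dagger - \tfrac12\{L^\dagger L, \rho\}$ preserves both the trace and the Hermiticity of the density matrix.

```python
import numpy as np

# Generic illustration of a LGKS (Lindblad) generator for a qubit:
# rho' = -i[H, rho] + L rho L^dag - (1/2){L^dag L, rho}.
def lindblad_rhs(rho, H, L):
    comm = -1j * (H @ rho - rho @ H)                    # coherent part
    diss = (L @ rho @ L.conj().T
            - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    return comm + diss

H = np.array([[1.0, 0.0], [0.0, -1.0]])                 # sigma_z Hamiltonian
L = np.sqrt(0.3) * np.array([[0.0, 1.0], [0.0, 0.0]])   # decay operator sigma_-
rho = np.array([[0.25, 0.1], [0.1, 0.75]], dtype=complex)
drho = lindblad_rhs(rho, H, L)
# Trace preservation: d(tr rho)/dt = 0, and drho stays Hermitian.
```

These two properties (together with complete positivity, not checked here) are what make the LGKS form the natural target of the coarse-graining procedure.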