We provide an information-geometric formulation of classical mechanics on the Riemannian manifold of probability distributions, an affine manifold endowed with a dually flat connection. In a non-parametric formalism, we consider the full set of positive probability functions on a finite sample space, and we give an explicit expression for the tangent and cotangent spaces over the statistical manifold in terms of a Hilbert bundle structure that we call the Statistical Bundle. In this setting, we compute velocities and accelerations of a one-dimensional statistical model using the canonical dual pair of parallel transports, and we define a coherent formalism for Lagrangian and Hamiltonian mechanics on the bundle. Finally, in a series of examples, we show how our formalism provides a consistent framework for accelerated natural-gradient dynamics on the probability simplex, paving the way for direct applications in optimization, game theory, and neural networks.
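To make the last claim concrete, here is a minimal Python sketch of Fisher-Rao natural-gradient descent on the probability simplex, using an exponentiated-gradient retraction. It illustrates only the plain (non-accelerated) dynamics; the objective, step size, and function names are hypothetical and not taken from the paper.

```python
import numpy as np

def natural_gradient_step(p, euclid_grad, lr=0.1):
    """One Fisher-Rao natural-gradient step on the open probability simplex.

    For the categorical model, the inverse Fisher metric turns a Euclidean
    gradient g into the tangent direction p_i * (g_i - <p, g>); the
    exponentiated-gradient retraction below follows that direction while
    keeping p strictly positive and normalized.
    """
    centered = euclid_grad - np.dot(p, euclid_grad)  # project onto the tangent space
    w = p * np.exp(-lr * centered)                   # move along the natural direction
    return w / w.sum()                               # renormalize (retraction)

# Hypothetical objective: f(p) = 0.5 * ||p - target||^2, with gradient p - target.
target = np.array([0.6, 0.3, 0.1])
p = np.full(3, 1.0 / 3.0)
for _ in range(300):
    p = natural_gradient_step(p, p - target)
print(np.round(p, 3))  # close to the target distribution
```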
A framework for statistical-mechanical analysis of quantum Hamiltonians is introduced. The approach is based upon a gradient flow equation in the space of Hamiltonians such that the eigenvectors of the initial Hamiltonian evolve toward those of the r...
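The truncated abstract does not reproduce the flow equation itself. As a hedged illustration of the general idea only, the sketch below integrates the classic Brockett double-bracket flow, a well-known isospectral gradient flow whose stationary points commute with a reference matrix N; this is not necessarily the flow the paper introduces, and all names and step sizes are illustrative.

```python
import numpy as np

def double_bracket_flow(H0, N, dt=1e-3, steps=20000):
    """Euler integration of the Brockett double-bracket flow
    dH/dt = [H, [H, N]], with [A, B] = AB - BA.

    The continuous flow preserves the spectrum of H0 while driving H toward
    a matrix that commutes with N, so the eigenvectors of H align with those
    of the reference matrix N.
    """
    H = H0.copy()
    for _ in range(steps):
        B = H @ N - N @ H              # [H, N]
        H = H + dt * (H @ B - B @ H)   # Euler step of [H, [H, N]]
    return H

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
H0 = (A + A.T) / 2                       # a random symmetric "Hamiltonian"
N = np.diag([4.0, 3.0, 2.0, 1.0])        # reference with distinct eigenvalues
print(np.round(double_bracket_flow(H0, N), 3))  # approximately diagonal
```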
The main result of this note is a characterization of the Poisson commutativity of Hamilton functions in terms of their principal action functions.
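For background (standard definitions only; the note's actual characterization is not reproduced here), Poisson commutativity of two Hamilton functions F and G, and the Hamilton-Jacobi equation satisfied by a principal action function S, read:

```latex
\{F, G\} \;=\; \sum_{i=1}^{n}\left(
  \frac{\partial F}{\partial q^{i}}\frac{\partial G}{\partial p_{i}}
  - \frac{\partial F}{\partial p_{i}}\frac{\partial G}{\partial q^{i}}
\right) = 0,
\qquad
\frac{\partial S}{\partial t} + H\!\left(q, \frac{\partial S}{\partial q}, t\right) = 0 .
```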
We show that there exists an underlying manifold with a conformal metric and compatible connection form, and a metric-type Hamiltonian (which we call the geometrical picture) that can be put into correspondence with the usual Hamilton-Lagrange mechanics...
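A familiar example of such a correspondence, offered only as orientation and not necessarily the paper's construction, is the Jacobi-Maupertuis picture: at fixed energy E, trajectories of a natural Hamiltonian are, up to time reparametrization, geodesics of a conformally flat metric with a purely metric Hamiltonian:

```latex
H(q,p) = \frac{1}{2m}\,\delta^{ij} p_i p_j + V(q),
\qquad
g_{ij}(q) = 2m\,\bigl(E - V(q)\bigr)\,\delta_{ij},
\qquad
H_g(q,p) = \tfrac{1}{2}\, g^{ij}(q)\, p_i p_j .
```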
We generalize standard credal set models for imprecise probabilities to include higher-order credal sets -- confidences about confidences. In doing so, we specify how an agent's higher-order confidences (credal sets) update upon observing an event. Our ...
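For background on the first-order case (the paper's higher-order update rule is not shown in this truncated abstract), a credal set represented by finitely many distributions is commonly updated by conditioning each member on the observed event and discarding members that give it probability zero. The sketch below assumes that finite representation.

```python
import numpy as np

def update_credal_set(credal_set, event):
    """Condition every member of a credal set on an observed event.

    credal_set: list of probability vectors over a finite sample space
    event: boolean mask of the outcomes consistent with the observation
    Members assigning the event probability zero are discarded
    (the 'regular extension' of conditioning).
    """
    updated = []
    for p in credal_set:
        prob_event = p[event].sum()
        if prob_event > 0:
            q = np.where(event, p, 0.0) / prob_event  # Bayes rule per member
            updated.append(q)
    return updated

# Two extreme points of a first-order credal set over three outcomes.
prior = [np.array([0.5, 0.3, 0.2]), np.array([0.2, 0.3, 0.5])]
event = np.array([True, True, False])   # observe: outcome 2 did not occur
posterior = update_credal_set(prior, event)
```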
A central issue in many statistical learning problems is selecting an appropriate model from a set of candidate models. Large models tend to inflate the variance (overfitting), while small models tend to introduce bias (underfitting) for a given ...
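To make the trade-off concrete, here is a minimal sketch using AIC to pick a polynomial degree. AIC is one standard criterion of this kind, offered as an illustration rather than the method this abstract goes on to propose; the data-generating setup is invented.

```python
import numpy as np

def aic_select(x, y, max_degree=8):
    """Choose a polynomial degree by AIC = n*log(RSS/n) + 2k.

    The RSS term falls as the model grows (less bias), while the 2k
    penalty rises with each extra parameter (guarding against variance
    inflation), implementing the trade-off described above.
    """
    n = len(y)
    best_degree, best_aic = 0, np.inf
    for degree in range(max_degree + 1):
        coeffs = np.polyfit(x, y, degree)
        rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
        aic = n * np.log(rss / n) + 2 * (degree + 1)
        if aic < best_aic:
            best_degree, best_aic = degree, aic
    return best_degree

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)
y = 1.0 - 2.0 * x + 3.0 * x**2 + 0.1 * rng.standard_normal(50)
print(aic_select(x, y))  # typically recovers the true degree, 2
```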