Matrix scaling is a classical problem with a wide range of applications. It is known that the Sinkhorn algorithm for matrix scaling can be interpreted as alternating e-projections from the viewpoint of classical information geometry. Recently, a generalization of matrix scaling to completely positive maps called operator scaling has been found to appear in various fields of mathematics and computer science, and the Sinkhorn algorithm has been extended to operator scaling. In this paper, the operator Sinkhorn algorithm is analyzed from the viewpoint of quantum information geometry through the Choi representation of completely positive maps. The operator Sinkhorn algorithm is shown to coincide with alternating e-projections with respect to the symmetric logarithmic derivative metric, a Riemannian metric on the space of quantum states relevant to quantum estimation theory. Other types of alternating e-projection algorithms are also derived from different information-geometric structures on the positive definite cone.
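The classical matrix Sinkhorn iteration underlying this abstract (not the operator version studied in the paper) can be sketched as follows; the function name and iteration count are illustrative choices, not from the source:

```python
import numpy as np

def sinkhorn(A, iters=200):
    """Alternately normalize rows and columns of a positive matrix;
    the iterates converge to a doubly stochastic matrix."""
    A = np.array(A, dtype=float)
    for _ in range(iters):
        A = A / A.sum(axis=1, keepdims=True)  # e-projection onto row constraints
        A = A / A.sum(axis=0, keepdims=True)  # e-projection onto column constraints
    return A

B = sinkhorn([[1.0, 2.0], [3.0, 4.0]])
```

Each normalization step is exactly the alternating e-projection the abstract refers to: projecting onto the set of matrices with prescribed row sums, then onto the set with prescribed column sums.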
We prove a correspondence between the information geometry of a signal filter and a Kähler manifold: the information geometry of a minimum-phase linear system with a finite complex cepstrum norm is a Kähler manifold, and the square of the complex cepstrum norm of the signal filter corresponds to the Kähler potential. The Hermitian structure of the Kähler manifold is explicitly emergent if and only if the impulse response function of the highest degree in $z$ is constant in the model parameters. The Kählerian information geometry admits more efficient calculation of the metric tensor and the Ricci tensor. Moreover, the $\alpha$-generalization of the geometric tensors is linear in $\alpha$. Finding Bayesian predictive priors, such as superharmonic priors, is also more tractable because Laplace-Beltrami operators on Kähler manifolds take much simpler forms than those on non-Kähler manifolds. Several time series models are studied within the Kählerian information geometry.
This paper develops computable metrics to assign priorities for information collection on network systems made up of binary components. Components are worth inspecting because their condition states are uncertain and the functioning of the system depends on them. The Value of Information (VoI) quantifies the impact of information on decision making under uncertainty, accounting for the precision of the observation, the available actions, and the expected economic loss. VoI-based metrics for system-level and component-level maintenance actions, referred to as global and local metrics, respectively, are introduced, analyzed, and applied to series and parallel systems. The computational complexity of applying them to general networks is discussed and, to tame the complexity of the local metric assessment, a heuristic is presented and its performance is compared on several case studies.
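For a single binary component, the VoI idea reduces to a simple difference of expected costs. The following is a minimal illustration under assumed costs and a perfect observation; the function name and cost model are hypothetical and not the paper's metrics:

```python
def value_of_information(p_fail, c_repair, c_fail):
    """Value of perfect information for one binary component:
    expected cost without inspection minus expected cost when the
    component state is revealed before acting (hedged illustration)."""
    # Without observation: choose the cheaper of always repairing
    # or accepting the expected failure loss.
    cost_prior = min(c_repair, p_fail * c_fail)
    # With a perfect observation: act only when failure is observed,
    # again taking the cheaper option in that state.
    cost_posterior = p_fail * min(c_repair, c_fail)
    return cost_prior - cost_posterior
```

The result is always nonnegative: information can never make a rational decision maker worse off in expectation.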
This paper takes a strongly geometrical approach to the Fisher distance, a measure of dissimilarity between two probability distribution functions. The Fisher distance, like other divergence measures, is also used in many applications to establish a proper data average. The main purpose is to widen the range of possible interpretations and relations of the Fisher distance and its associated geometry for prospective applications. The focus is on statistical models of normal probability distribution functions, where the connection with classical hyperbolic geometry is exploited to derive closed forms for the Fisher distance in several cases. Connections with the well-known Kullback-Leibler divergence measure are also devised.
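The hyperbolic connection mentioned above yields a standard closed form for univariate normals: under the map $(\mu, \sigma) \mapsto (\mu/\sqrt{2}, \sigma)$ the Fisher metric becomes twice the Poincaré half-plane metric. A sketch of that formula (the function name is an illustrative choice):

```python
import math

def fisher_distance_normal(mu1, s1, mu2, s2):
    """Fisher-Rao distance between two univariate normal distributions,
    computed via the Poincare half-plane model."""
    # Rescale the mean so the Fisher metric is 2x the half-plane metric.
    x1, x2 = mu1 / math.sqrt(2), mu2 / math.sqrt(2)
    # Hyperbolic distance in the half-plane, scaled by sqrt(2).
    arg = 1 + ((x1 - x2) ** 2 + (s1 - s2) ** 2) / (2 * s1 * s2)
    return math.sqrt(2) * math.acosh(arg)
```

For equal means the formula collapses to $\sqrt{2}\,|\ln(\sigma_2/\sigma_1)|$, which is a convenient sanity check.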
Many stochastic complex systems are characterized by the fact that their configuration space does not grow exponentially as a function of the degrees of freedom. The use of scaling expansions is a natural way to measure the asymptotic growth of the configuration space volume in terms of the scaling exponents of the system. These scaling exponents can, in turn, be used to define universality classes that uniquely determine the statistics of a system; every system belongs to one of these classes. Here we derive the information geometry of scaling expansions of sample spaces. In particular, we present the deformed logarithms and the metric in a systematic and coherent way. We observe a phase transition for the curvature. The phase transition can be well measured by the characteristic length $r$, corresponding to a ball of radius $2r$ having the same curvature as the statistical manifold. An increasing characteristic length with respect to the size of the system is associated with sub-exponential sample space growth, which occurs in strongly constrained and correlated complex systems. A decreasing characteristic length corresponds to super-exponential sample space growth, which occurs, for example, in systems that develop structure as they evolve. Constant curvature means exponential sample space growth, which is associated with multinomial statistics, where traditional Boltzmann-Gibbs or Shannon statistics applies. This allows us to characterize transitions between statistical manifolds corresponding to different families of probability distributions.
The Fredholm integral equations of the first kind are a classical example of ill-posed problems in the sense of Hadamard. If the integral operator is self-adjoint and admits a set of eigenfunctions, then a formal solution can be written in terms of eigenfunction expansions. One possible method of regularization consists in truncating this formal expansion after restricting the class of admissible solutions through a priori global bounds. In this paper we reconsider various possible methods of truncation from the viewpoint of the $\varepsilon$-coverings of compact sets.
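In a discretized setting, truncating the eigenfunction expansion corresponds to a truncated SVD solve. The sketch below is a generic illustration under the assumption of a discretized operator $K$; the function name and truncation level are illustrative, not the paper's methods:

```python
import numpy as np

def truncated_solution(K, g, k):
    """Truncated-expansion solution of K f = g: keep only the k largest
    singular components and discard the rest as a regularization."""
    U, s, Vt = np.linalg.svd(K)
    # Expansion coefficients of g in the left singular basis,
    # divided by the singular values (only the k retained modes).
    coeffs = (U.T @ g)[:k] / s[:k]
    return Vt[:k].T @ coeffs
```

Small singular values amplify data noise when inverted, so discarding the tail of the expansion stabilizes the formal solution at the price of a truncation bias.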