For generic systems exhibiting power-law behaviors, and hence multiscale dependencies, we propose a new yet simple tool to analyze multifractality and intermittency, motivated by the observation that these concepts are directly related to the deformation of a probability density function from Gaussian at large scales to non-Gaussian at smaller scales. Our framework is based on information theory and uses Shannon entropy and Kullback-Leibler divergence. We present an extensive application to three-dimensional fully developed turbulence, seen here as a paradigmatic complex system in which intermittency was historically defined; moreover, the concepts of scale invariance and multifractality have been extensively studied and, most importantly, benchmarked in this field. We compute our measure on experimental Eulerian velocity measurements, as well as on synthetic processes and a phenomenological model of fluid turbulence. Our approach is very general and does not require any underlying model of the system, although it can probe the relevance of such a model.
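To make the construction concrete, here is a minimal sketch of the kind of measure described above: estimate the PDF of the increments of a signal at a given scale and compute its Kullback-Leibler divergence from a Gaussian with the same mean and variance. The signal below is plain Brownian motion (a placeholder, not the experimental data of the paper), so the divergence stays near zero at every scale; intermittent data would instead show it growing as the scale decreases.

```python
import numpy as np
from scipy.stats import entropy, norm

rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(2**16))  # placeholder signal; its increments are Gaussian at all scales

def kl_from_gaussian(signal, scale, bins=64):
    """KL divergence between the empirical PDF of increments at `scale`
    and a Gaussian with the same mean and standard deviation."""
    incr = signal[scale:] - signal[:-scale]
    hist, edges = np.histogram(incr, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    gauss = norm.pdf(centers, loc=incr.mean(), scale=incr.std())
    mask = (hist > 0) & (gauss > 0)  # drop empty bins before taking logs
    return entropy(hist[mask], gauss[mask])  # scipy renormalizes both arguments

for scale in (1, 4, 16, 64, 256):
    print(scale, kl_from_gaussian(x, scale))
```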
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence.
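For a discrete pair of distributions $P$ and $Q$, the Rényi divergence of order $\alpha \neq 1$ is $D_\alpha(P\|Q) = \frac{1}{\alpha-1}\ln\sum_i p_i^\alpha q_i^{1-\alpha}$, and it converges to the Kullback-Leibler divergence as $\alpha \to 1$. A short sketch of both quantities and the limit (the example distributions are arbitrary):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """D_alpha(P||Q) = log(sum p^alpha * q^(1-alpha)) / (alpha - 1), alpha != 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.log(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1)

def kl_divergence(p, q):
    """Kullback-Leibler divergence, the alpha -> 1 limit of D_alpha."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * np.log(p / q))

p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
print(renyi_divergence(p, q, 0.5), renyi_divergence(p, q, 2.0))
print(renyi_divergence(p, q, 0.999), kl_divergence(p, q))  # nearly equal
```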
We propose a method to fuse posterior distributions learned from heterogeneous datasets. Our algorithm relies on a mean-field assumption for both the fused model and the individual dataset posteriors and proceeds using a simple assign-and-average approach.
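As an illustration of the "average" half of such a scheme (the assignment step, which aligns parameters across datasets, is assumed to have been carried out already), here is a sketch of fusing mean-field Gaussian posteriors by precision-weighted averaging, i.e. taking the normalized product of the per-dataset Gaussian factors. This is a standard construction, not necessarily the paper's exact update.

```python
import numpy as np

def fuse_gaussian_posteriors(means, variances):
    """Fuse K mean-field Gaussian posteriors over the same (already aligned)
    parameters: the normalized product of Gaussians is Gaussian, with
    precision equal to the sum of the individual precisions."""
    means, variances = np.asarray(means, float), np.asarray(variances, float)
    precisions = 1.0 / variances
    fused_var = 1.0 / precisions.sum(axis=0)
    fused_mean = fused_var * (precisions * means).sum(axis=0)
    return fused_mean, fused_var

# Two dataset posteriors over a 2-parameter model.
mu = [[0.0, 1.0], [0.2, 0.8]]
var = [[1.0, 0.5], [0.5, 0.5]]
print(fuse_gaussian_posteriors(mu, var))
```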
We introduce hardness in relative entropy, a new notion of hardness for search problems which on the one hand is satisfied by all one-way functions and on the other hand implies both next-block pseudoentropy and inaccessible entropy, two forms of computational entropy.
Bayesian nonparametric statistics is an area of considerable research interest. While there has recently been extensive effort devoted to developing Bayesian nonparametric procedures for model checking, the use of the Dirichlet process, in its simplest form, …
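A common way to realize a draw from the Dirichlet process in its simplest form is the truncated stick-breaking construction; the sketch below assumes a standard-normal base measure and concentration parameter $\alpha = 2$ (illustrative choices, not tied to the model-checking procedure of the abstract).

```python
import numpy as np

def stick_breaking(alpha, n_atoms, rng):
    """Truncated stick-breaking weights of a Dirichlet process draw:
    w_k = beta_k * prod_{j<k} (1 - beta_j), with beta_k ~ Beta(1, alpha)."""
    betas = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    return betas * remaining

rng = np.random.default_rng(0)
weights = stick_breaking(alpha=2.0, n_atoms=100, rng=rng)
atoms = rng.standard_normal(100)  # atom locations drawn from the N(0, 1) base measure
print(weights.sum())  # close to 1; the truncation discards a small tail
```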
Kullback-Leibler (KL) divergence is one of the most important divergence measures between probability distributions. In this paper, we investigate the properties of KL divergence between Gaussians. Firstly, for any two $n$-dimensional Gaussians $\mathcal{N}_1$ and $\mathcal{N}_2$, …
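For reference, the KL divergence between two $n$-dimensional Gaussians has the closed form $\mathrm{KL}(\mathcal{N}(\mu_0,\Sigma_0)\,\|\,\mathcal{N}(\mu_1,\Sigma_1)) = \tfrac{1}{2}\big[\operatorname{tr}(\Sigma_1^{-1}\Sigma_0) + (\mu_1-\mu_0)^\top \Sigma_1^{-1}(\mu_1-\mu_0) - n + \ln\frac{\det\Sigma_1}{\det\Sigma_0}\big]$. A direct sketch:

```python
import numpy as np

def kl_gaussians(mu0, S0, mu1, S1):
    """Closed-form KL(N(mu0, S0) || N(mu1, S1)) for n-dimensional Gaussians."""
    n = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(S0)  # log-determinants, for numerical stability
    _, logdet1 = np.linalg.slogdet(S1)
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - n + logdet1 - logdet0)

# Sanity check: with identity covariances the divergence is ||mu1 - mu0||^2 / 2.
mu0, mu1 = np.zeros(3), np.array([1.0, 0.0, 0.0])
I = np.eye(3)
print(kl_gaussians(mu0, I, mu1, I))  # 0.5
```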