The classical problem of moments is addressed by the maximum entropy approach for one-dimensional discrete distributions. A numerical technique of adaptive support approximation is proposed to reconstruct such distributions in the region where the main part of the probability mass is located.
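As a hedged sketch of the reconstruction step described above: the maximum-entropy pmf matching a set of raw moments has the exponential-family form $p_k \propto \exp(-\sum_j \lambda_j k^j)$, and the multipliers $\lambda_j$ can be found by minimizing the convex dual of the entropy problem. The function name `maxent_discrete` and the use of SciPy's BFGS are illustrative choices, not the authors' implementation; the adaptive support approximation is only approximated here by a fixed large truncation of the support.

```python
import numpy as np
from scipy.optimize import minimize

def maxent_discrete(moments, support):
    """Maximum-entropy pmf on `support` matching the given raw moments.

    moments : target raw moments (mu_1, ..., mu_m)
    support : candidate support points (a large truncation stands in for
              the paper's adaptive support approximation)
    """
    k = np.asarray(support, float)
    mu = np.asarray(moments, float)
    # feature matrix: phi_j(k) = k^(j+1)
    Phi = np.vstack([k ** (j + 1) for j in range(len(mu))])

    def dual(lam):
        # convex dual objective: log partition function + lam . mu
        logits = -lam @ Phi
        return np.logaddexp.reduce(logits) + lam @ mu

    res = minimize(dual, np.zeros(len(mu)), method="BFGS")
    logits = -res.x @ Phi
    return np.exp(logits - np.logaddexp.reduce(logits))
```

For example, matching only the first moment (mean 3) on {0, ..., 199} yields a truncated geometric-type distribution whose mean reproduces the constraint.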
We develop the complex-analytic viewpoint on the tree convolutions studied by the second author and Weihua Liu in An operad of non-commutative independences defined by trees (Dissertationes Mathematicae, 2020, doi:10.4064/dm797-6-2020), which generalize the free, boolean, monotone, and orthogonal convolutions. In particular, for each rooted subtree $\mathcal{T}$ of the $N$-regular tree (with vertices labeled by alternating strings), we define the convolution $\boxplus_{\mathcal{T}}(\mu_1,\dots,\mu_N)$ for arbitrary probability measures $\mu_1,\dots,\mu_N$ on $\mathbb{R}$ using a certain fixed-point equation for the Cauchy transforms. The convolution operations respect the operad structure of the tree operad from doi:10.4064/dm797-6-2020. We prove a general limit theorem for iterated $\mathcal{T}$-free convolutions similar to Bercovici and Pata's results in the free case in Stable laws and domains of attraction in free probability (Annals of Mathematics, 1999, doi:10.2307/121080), and we deduce limit theorems for measures in the domain of attraction of each of the classical stable laws.
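The tree convolutions specialize to the classical free convolution, whose Cauchy-transform fixed-point equation (Belinschi-Bercovici subordination) can be iterated numerically. The sketch below is only this simplest instance, not the paper's general $\mathcal{T}$-construction, and all names are illustrative.

```python
import numpy as np

def free_convolve_cauchy(G1, G2, z, iters=5000):
    """Cauchy transform of the free convolution mu1 [+] mu2 at a point z
    in the upper half-plane, via the Belinschi-Bercovici subordination
    fixed point (illustrative special case of the abstract's equations).

    G1, G2 : Cauchy transforms of the two measures, as callables
    """
    F1 = lambda w: 1.0 / G1(w)   # reciprocal Cauchy transform F = 1/G
    F2 = lambda w: 1.0 / G2(w)
    h1 = lambda w: F1(w) - w
    h2 = lambda w: F2(w) - w
    w = z
    # iterate w -> z + h2(z + h1(w)); the limit is the subordination
    # function omega_1(z), and G_{mu1 [+] mu2}(z) = 1/F1(omega_1(z))
    for _ in range(iters):
        w = z + h2(z + h1(w))
    return 1.0 / F1(w)
```

As a sanity check, the free convolution of two symmetric Bernoulli laws on $\{\pm 1\}$ (with $G(z) = z/(z^2-1)$) is the arcsine law on $[-2,2]$, whose density at $0$ is $1/(2\pi)$; evaluating the returned transform just above the real axis recovers this density via $-\operatorname{Im} G/\pi$.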
We consider general systems of ordinary differential equations with monotonic Gibbs entropy, and introduce an entropic scheme that simply imposes an entropy fix after every time step of any existing time integrator. It is proved that in the general case, our entropy fix has only an infinitesimal influence on the numerical order of the original scheme, and in many circumstances it can be shown that the fix does not affect the numerical order at all. Numerical experiments on the linear Fokker-Planck equation and the nonlinear Boltzmann equation are carried out to support our numerical analysis.
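A minimal sketch of the post-step idea, with a hypothetical fix of my own choosing (the abstract does not specify the fix): take one step of any integrator, and if the discrete Gibbs entropy rose, shrink the step by bisection toward the previous state until monotonicity is restored. Forward Euler on a two-state master equation serves as the toy system.

```python
import numpy as np

def gibbs_entropy(p, pi):
    # relative (Gibbs) entropy H(p) = sum_i p_i log(p_i / pi_i),
    # nonincreasing in time for the exact dynamics
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / pi[m])))

def euler_with_entropy_fix(p, Q, dt, pi):
    """One forward-Euler step for dp/dt = Q p, followed by a hypothetical
    blending 'entropy fix': if entropy increased, bisect the update back
    toward the old state until it is nonincreasing again."""
    p_new = p + dt * (Q @ p)
    H_old = gibbs_entropy(p, pi)
    theta = 1.0
    while gibbs_entropy(p + theta * (p_new - p), pi) > H_old and theta > 1e-12:
        theta *= 0.5
    return p + theta * (p_new - p)
```

Because the blend only shortens the step by a factor $\theta$ when triggered, it plausibly perturbs the update at the size of the entropy violation, which is consistent with (though no substitute for) the order analysis claimed in the abstract.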
We consider the problem of estimating Shannon's entropy $H$ from discrete data, in cases where the number of possible symbols is unknown or even countably infinite. The Pitman-Yor process, a generalization of the Dirichlet process, provides a tractable prior distribution over the space of countably infinite discrete distributions, and has found major applications in Bayesian non-parametric statistics and machine learning. Here we show that it also provides a natural family of priors for Bayesian entropy estimation, due to the fact that moments of the induced posterior distribution over $H$ can be computed analytically. We derive formulas for the posterior mean (Bayes least squares estimate) and variance under Dirichlet and Pitman-Yor process priors. Moreover, we show that a fixed Dirichlet or Pitman-Yor process prior implies a narrow prior distribution over $H$, meaning the prior strongly determines the entropy estimate in the under-sampled regime. We derive a family of continuous mixing measures such that the resulting mixture of Pitman-Yor processes produces an approximately flat prior over $H$. We show that the resulting Pitman-Yor Mixture (PYM) entropy estimator is consistent for a large class of distributions. We explore the theoretical properties of the resulting estimator, and show that it performs well both in simulation and in application to real data.
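One of the closed forms the abstract alludes to is the classical posterior mean of $H$ under a finite symmetric Dirichlet prior (the Wolpert-Wolf formula), $E[H \mid n] = \psi(N + A + 1) - \sum_i \frac{n_i + \alpha}{N + A}\,\psi(n_i + \alpha + 1)$ with $A = K\alpha$. A minimal sketch, not the paper's PYM estimator:

```python
import numpy as np
from scipy.special import digamma

def dirichlet_entropy_mean(counts, alpha=1.0):
    """Posterior mean of Shannon entropy (in nats) under a symmetric
    Dirichlet(alpha) prior over K symbols, given observed counts
    (Wolpert-Wolf formula)."""
    n = np.asarray(counts, float)
    N, K = n.sum(), n.size
    A = N + K * alpha            # total posterior concentration
    w = (n + alpha) / A          # posterior mean probabilities
    return float(digamma(A + 1) - np.sum(w * digamma(n + alpha + 1)))
```

In the well-sampled regime the estimate approaches the plug-in entropy, e.g. for counts `[5000, 5000]` it is close to $\log 2$; the abstract's point is precisely that in the under-sampled regime a fixed `alpha` dominates the estimate, motivating the mixture construction.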
Solutions to conservation laws satisfy the monotonicity property: the number of local extrema is a non-increasing function of time, and local maximum/minimum values decrease/increase monotonically in time. This paper investigates this property from a numerical standpoint. We introduce a class of fully discrete, high-order accurate difference schemes in space and time, called generalized monotone schemes. Convergence toward the entropy solution is proven via a new technique of proof, assuming only that the initial data has finitely many extrema and that the flux-function is strictly convex. We define discrete paths of extrema by tracking local extremum values in the approximate solution. In the course of the analysis we establish the pointwise convergence of the trace of the solution along a path of extremum. As a corollary, we obtain a proof of convergence for a MUSCL-type scheme that is second-order accurate away from sonic points and extrema.
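The monotonicity property can be observed numerically with a classical monotone scheme. The sketch below uses the Engquist-Osher flux for Burgers' equation (an illustrative choice, not the paper's generalized schemes) and counts local extrema at each step; for monotone schemes this count should not increase.

```python
import numpy as np

def engquist_osher_step(u, lam):
    """One step of the (monotone) Engquist-Osher scheme for Burgers'
    equation u_t + (u^2/2)_x = 0 on a periodic grid; lam = dt/dx."""
    up, um = np.maximum(u, 0.0), np.minimum(u, 0.0)
    # numerical flux F_{i+1/2} = f(u_i^+) + f(u_{i+1}^-), f(u) = u^2/2
    F = 0.5 * up**2 + 0.5 * np.roll(um, -1)**2
    return u - lam * (F - np.roll(F, 1))

def count_extrema(u, tol=1e-10):
    # number of sign changes in consecutive differences (periodic),
    # ignoring differences below tol
    d = np.diff(np.concatenate([u, u[:1]]))
    s = np.sign(np.where(np.abs(d) < tol, 0.0, d))
    s = s[s != 0]
    return int(np.sum(s != np.roll(s, 1))) if s.size else 0
```

Running from smooth periodic data with four extrema, the count stays non-increasing as shocks form and extrema merge, mirroring the property proven in the paper.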
Two-sided bounds are explored for concentration functions and Rényi entropies in the class of discrete log-concave probability distributions. They are used to derive certain variants of the entropy power inequalities.
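For concreteness, the quantities involved are easy to compute for a discrete log-concave law such as the geometric distribution: the Rényi entropy $H_\alpha(p) = \frac{1}{1-\alpha}\log\sum_i p_i^\alpha$ is non-increasing in $\alpha$, and its $\alpha \to \infty$ limit is $-\log \max_i p_i$, i.e. minus the log of the concentration function's value at a point. A minimal sketch (not the paper's bounds):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha of a pmf p, in nats."""
    p = np.asarray(p, float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log(p)))   # Shannon limit alpha -> 1
    if np.isinf(alpha):
        return float(-np.log(p.max()))         # min-entropy = -log max_i p_i
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))
```

For the geometric law with $q = 1/2$, one gets $H_\infty = \log 2$ and $H_2 = \log 3$ exactly, with $H_{1/2} \ge H_1 \ge H_2 \ge H_\infty$ as the general monotonicity predicts.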
Alexander Andreychenko, Linar Mikeev, Verena Wolf (2014). "Maximum Entropy Reconstruction for Discrete Distributions with Unbounded Support".