A common task in physics, information theory, and other fields is the analysis of properties of subsystems of a given system. Given the covariance matrix $M$ of a system of $n$ coupled variables, the covariance matrices of the subsystems are principal submatrices of $M$. The rapid growth with $n$ of the set of principal submatrices makes it impractical to study each submatrix exhaustively for even modestly sized systems. It is therefore of great interest to derive methods for approximating the distributions of important submatrix properties for a given matrix. Motivated by the importance of differential entropy as a systemic measure of disorder, we study the distribution of log-determinants of principal $k \times k$ submatrices when the covariance matrix has bounded condition number. We derive upper bounds for the right tail and the variance of the distribution of minors, and we use these in turn to derive upper bounds on the standard error of the sample mean of subsystem entropy. Our results demonstrate that, despite the rapid growth of the set of subsystems with $n$, the number of samples needed to bound the sampling error is asymptotically independent of $n$. Instead, it suffices to increase the number of samples in linear proportion to $k$ to achieve a desired sampling accuracy.
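A minimal Monte Carlo sketch of the sampling scheme described above is given below, assuming a Gaussian system so that the differential entropy of a subsystem with principal covariance submatrix $M_S$ is $\tfrac{1}{2}\log\bigl((2\pi e)^k \det M_S\bigr)$; the function name and the example covariance matrix are ours, not from the paper.

import numpy as np

def subsystem_entropy_samples(M, k, num_samples, seed=None):
    # Monte Carlo samples of Gaussian subsystem entropy: for each draw,
    # pick a random size-k index set S, extract the principal submatrix
    # M[S, S], and evaluate 0.5 * (k*log(2*pi*e) + log det M[S, S]),
    # using slogdet for a numerically stable log-determinant.
    rng = np.random.default_rng(seed)
    n = M.shape[0]
    entropies = np.empty(num_samples)
    for i in range(num_samples):
        idx = rng.choice(n, size=k, replace=False)
        sub = M[np.ix_(idx, idx)]
        _, logdet = np.linalg.slogdet(sub)
        entropies[i] = 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)
    return entropies

# Example: a well-conditioned covariance matrix with n = 50, k = 5.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
M = A @ A.T + 50.0 * np.eye(50)   # diagonal shift keeps the condition number bounded
samples = subsystem_entropy_samples(M, k=5, num_samples=200, seed=1)
stderr = samples.std(ddof=1) / np.sqrt(len(samples))
print(f"sample mean entropy = {samples.mean():.3f}, standard error = {stderr:.3f}")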
Two-sided bounds are explored for concentration functions and Renyi entropies in the class of discrete log-concave probability distributions. They are used to derive certain variants of the entropy power inequalities.
We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an application, we derive concentration bounds for the intri
We consider the task of estimating the entropy of $k$-ary distributions from samples in the streaming model, where space is limited. Our main contribution is an algorithm that requires $O\left(\frac{k \log(1/\varepsilon)^2}{\varepsilon^3}\right)$ samples
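For reference only, a minimal sketch of a naive plug-in estimator for the entropy of a $k$-ary distribution from i.i.d. samples follows; it is not the streaming algorithm referred to above (it stores all $k$ symbol counts in memory), and the function name is ours.

import numpy as np

def plugin_entropy(samples, k):
    # Naive plug-in (empirical) entropy estimate in nats: count symbol
    # frequencies over the alphabet {0, ..., k-1} and evaluate the
    # Shannon entropy of the empirical distribution.
    counts = np.bincount(samples, minlength=k)
    p = counts[counts > 0] / len(samples)
    return float(-(p * np.log(p)).sum())

# Example: uniform samples over k = 8 symbols; the true entropy is log 8 ≈ 2.079 nats.
rng = np.random.default_rng(0)
samples = rng.integers(0, 8, size=100_000)
print(plugin_entropy(samples, k=8))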
We establish a discrete analog of the Renyi entropy comparison due to Bobkov and Madiman. For log-concave variables on the integers, the min entropy is within $\log e$ of the usual Shannon entropy. Additionally, we investigate the entropic Rogers-Shephard
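In symbols, writing $H$ for the Shannon entropy and $H_\infty$ for the min entropy (notation ours), the comparison stated above reads
$$0 \le H(X) - H_\infty(X) \le \log e \qquad \text{for every log-concave random variable } X \text{ on } \mathbb{Z}.$$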
We prove that if $(P_x)_{x \in \mathscr{X}}$ is a family of probability measures which satisfy the log-Sobolev inequality and whose pairwise chi-squared divergences are uniformly bounded, and $\mu$ is any mixing distribution on $\mathscr{X}$, then the mix