How low can the joint entropy of $n$ $d$-wise independent (for $d \ge 2$) discrete random variables be, subject to given constraints on the individual distributions (say, no value may be taken by a variable with probability greater than $p$, for $p < 1$)? This question has been posed and partially answered in a recent work of Babai. In this paper we improve some of his bounds, prove new bounds in a wider range of parameters and show matching upper bounds in some special cases. In particular, we prove tight lower bounds for the min-entropy (as well as the entropy) of pairwise and three-wise independent balanced binary variables for infinitely many values of $n$.
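As a concrete illustration of why the joint entropy can be far below $n$, consider the standard XOR construction (a textbook example, not taken from the abstract above): from $k$ uniform seed bits, the parities over all $2^k - 1$ nonempty subsets give $n = 2^k - 1$ unbiased, pairwise independent bits whose joint entropy is only $k \approx \log_2(n+1)$. A minimal sketch verifying this for $k = 3$:

```python
from collections import Counter
from math import log2

k = 3                                  # number of uniform seed bits
subsets = range(1, 2 ** k)             # bitmasks of nonempty subsets of [k]
n = len(subsets)                       # n = 2^k - 1 = 7 variables

def parity(x: int) -> int:
    """Parity (XOR) of the set bits of x."""
    return bin(x).count("1") & 1

# One outcome per equally likely seed r: X_S(r) = parity of r restricted to S.
outcomes = [tuple(parity(r & s) for s in subsets) for r in range(2 ** k)]

# Each variable is unbiased: exactly half the seeds give 1.
assert all(sum(o[j] for o in outcomes) == 2 ** (k - 1) for j in range(n))

# Pairwise independence: every pair of variables is uniform on {0,1}^2,
# because distinct nonzero linear functionals over GF(2) are jointly uniform.
for i in range(n):
    for j in range(i + 1, n):
        counts = Counter((o[i], o[j]) for o in outcomes)
        assert all(counts[ab] == 2 ** (k - 2)
                   for ab in [(0, 0), (0, 1), (1, 0), (1, 1)])

# The seed is recoverable from the outcome (the singleton subsets return its
# bits), so the joint distribution is uniform on 2^k points: entropy = k bits,
# versus n = 2^k - 1 bits for fully independent unbiased variables.
assert len(set(outcomes)) == 2 ** k
joint_entropy = log2(len(set(outcomes)))
print(n, joint_entropy)
```

The construction shows the lower bound of order $\log n$ is attainable for pairwise independence; the abstracts above concern how tight such bounds are under higher-order ($d$-wise) independence.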
In this note, we prove a tight lower bound on the joint entropy of $n$ unbiased Bernoulli random variables which are $n/2$-wise independent. For general $k$-wise independence, we give new lower bounds by adapting Navon and Samorodnitsky's Fourier proof …
The total influence of a function is a central notion in the analysis of Boolean functions, and characterizing functions that have small total influence is one of the most fundamental questions associated with it. The KKL theorem and the Friedgut junta theorem …
Let $\Omega$ be a bounded closed convex set in $\mathbb{R}^d$ with non-empty interior, and let $\mathcal{C}_r(\Omega)$ be the class of convex functions on $\Omega$ with $L^r$-norm bounded by $1$. We obtain sharp estimates of the $\epsilon$-entropy of $\mathcal{C}_r(\Omega)$ …
We show that the number of independent sets in cocomparability graphs can be counted in linear time, as can the number of cliques in comparability graphs. By contrast, counting cliques in cocomparability graphs and counting independent sets in comparability graphs …
We show that the entropy of a message can be tested in a device-independent way. Specifically, we consider a prepare-and-measure scenario with classical or quantum communication, and develop two different methods for placing lower bounds on the communication …