How low can the joint entropy of $n$ $d$-wise independent (for $d \ge 2$) discrete random variables be, subject to given constraints on the individual distributions (say, no value may be taken by a variable with probability greater than $p$, for $p<1$)? This question has been posed and partially answered in a recent work of Babai. In this paper we improve some of his bounds, prove new bounds in a wider range of parameters, and show matching upper bounds in some special cases. In particular, we prove tight lower bounds for the min-entropy (as well as the entropy) of pairwise and three-wise independent balanced binary variables for infinitely many values of $n$.
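As a concrete illustration of the kind of extremal example involved (not taken from the paper itself), the classical XOR construction produces $n = 2^k - 1$ pairwise independent unbiased bits with joint entropy only $\log_2(n+1)$ bits. The following Python sketch builds that construction and verifies the entropy and pairwise independence by brute force; the parameter $k$ and all helper names are illustrative choices.

```python
# Illustrative sketch (not from the paper): the classical XOR construction of
# pairwise independent unbiased bits. From k independent fair seed bits, the
# XORs over all nonempty index sets give n = 2**k - 1 unbiased bits that are
# pairwise independent, yet their joint entropy is only k = log2(n+1) bits.
from itertools import combinations, product
from collections import Counter
import math

k = 4  # number of independent seed bits (illustrative value)
subsets = [S for r in range(1, k + 1) for S in combinations(range(k), r)]
n = len(subsets)  # n = 2**k - 1 output variables

def outputs(seed):
    """Y_S = XOR of the seed bits indexed by S, for every nonempty S."""
    return tuple(sum(seed[i] for i in S) % 2 for S in subsets)

# Joint distribution of (Y_S)_S when the seed is uniform over {0,1}^k.
joint = Counter(outputs(seed) for seed in product([0, 1], repeat=k))
probs = [c / 2 ** k for c in joint.values()]
H_joint = -sum(p * math.log2(p) for p in probs)

# Pairwise independence: every pair (Y_S, Y_T) must be uniform on {0,1}^2.
def pair_counts(a, b):
    return Counter((outputs(s)[a], outputs(s)[b]) for s in product([0, 1], repeat=k))

pairwise_ok = all(
    set(pair_counts(a, b).values()) == {2 ** k // 4}
    for a, b in combinations(range(n), 2)
)

print(f"n = {n}, joint entropy = {H_joint:.3f} bits, log2(n+1) = {math.log2(n + 1):.3f}")
print("pairwise independent and unbiased:", pairwise_ok)
```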
In this note, we prove a tight lower bound on the joint entropy of $n$ unbiased Bernoulli random variables which are $n/2$-wise independent. For general $k$-wise independence, we give new lower bounds by adapting Navon and Samorodnitsky's Fourier proof of the LP bound on error-correcting codes. This counts as partial progress on a problem posed by Gavinsky and Pudlak.
The total influence of a function is a central notion in the analysis of Boolean functions, and characterizing functions that have small total influence is one of the most fundamental questions associated with it. The KKL theorem and the Friedgut junta theorem give a strong characterization of such functions whenever the bound on the total influence is $o(\log n)$. However, both results become useless when the total influence of the function is $\omega(\log n)$. The only case in which this logarithmic barrier has been broken for an interesting class of functions was proved by Bourgain and Kalai, who focused on functions that are symmetric under large enough subgroups of $S_n$. In this paper, we build on and improve the techniques of the Bourgain--Kalai paper and establish new concentration results on the Fourier spectrum of Boolean functions with small total influence. Our results include: 1. A quantitative improvement of the Bourgain--Kalai result regarding the total influence of functions that are transitively symmetric. 2. A slightly weaker version of the Fourier--Entropy Conjecture of Friedgut and Kalai. This weaker version implies in particular that the Fourier spectrum of a constant-variance Boolean function $f$ is concentrated on $2^{O(I[f]\log I[f])}$ characters, improving an earlier result of Friedgut. Removing the $\log I[f]$ factor would essentially resolve the Fourier--Entropy Conjecture, as well as settle a conjecture of Mansour regarding the Fourier spectrum of polynomial-size DNF formulas. Our concentration result has new implications in learning theory: it implies that the class of functions whose total influence is at most $K$ is agnostically learnable in time $2^{O(K\log K)}$, using membership queries.
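For readers unfamiliar with the central quantity, the following sketch (standard definitions only, not code from the paper) computes the total influence of a small Boolean function both combinatorially and via the Fourier identity $I[f]=\sum_S |S|\,\hat f(S)^2$, using majority on five bits as the example; the two values agree.

```python
# Sketch using standard definitions: total influence of f: {0,1}^n -> {-1,1},
# computed combinatorially and from the Fourier spectrum.
from itertools import combinations, product

n = 5
def f(x):                               # majority of n = 5 bits, as a +/-1 value
    return 1 if sum(x) > n / 2 else -1

points = list(product([0, 1], repeat=n))

# Combinatorial definition: Inf_i[f] = Pr_x[f(x) != f(x with bit i flipped)].
def influence(i):
    flips = sum(f(x) != f(tuple(b ^ (j == i) for j, b in enumerate(x))) for x in points)
    return flips / len(points)

total_influence = sum(influence(i) for i in range(n))

# Fourier definition: fhat(S) = E_x[f(x) * chi_S(x)], chi_S(x) = (-1)^{sum_{i in S} x_i}.
def fhat(S):
    return sum(f(x) * (-1) ** sum(x[i] for i in S) for x in points) / len(points)

spectral = sum(
    len(S) * fhat(S) ** 2
    for r in range(n + 1) for S in combinations(range(n), r)
)

print(f"I[f] (combinatorial)   = {total_influence:.4f}")
print(f"sum_S |S| fhat(S)^2    = {spectral:.4f}")   # the two agree
```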
Let $\Omega$ be a bounded closed convex set in $\mathbb{R}^d$ with non-empty interior, and let $\mathcal{C}_r(\Omega)$ be the class of convex functions on $\Omega$ with $L^r$-norm bounded by $1$. We obtain sharp estimates of the $\epsilon$-entropy of $\mathcal{C}_r(\Omega)$ under $L^p(\Omega)$ metrics, $1\le p<r\le\infty$. In particular, the results imply that the universal lower bound $\epsilon^{-d/2}$ is also an upper bound for all $d$-polytopes, and the universal upper bound of $\epsilon^{-\frac{d-1}{2}\cdot\frac{pr}{r-p}}$ for $p>\frac{dr}{d+(d-1)r}$ is attained by the closed unit ball. While a general convex body can be approximated by inscribed polytopes, the entropy rate does not carry over to the limiting body. Our results have applications to questions concerning rates of convergence of nonparametric estimators of high-dimensional shape-constrained functions.
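In display form, the two regimes quoted above read as follows (a restatement of the stated bounds, with $\log N(\epsilon,\cdot,\cdot)$ denoting the $\epsilon$-entropy, i.e. the logarithm of the covering number):

```latex
% Restatement of the bounds above, for 1 <= p < r <= infty.
\[
  \log N\bigl(\epsilon,\ \mathcal{C}_r(\Omega),\ L^p(\Omega)\bigr)
    \;\asymp\; \epsilon^{-d/2}
  \qquad \text{when $\Omega$ is a $d$-polytope},
\]
\[
  \log N\bigl(\epsilon,\ \mathcal{C}_r(B^d),\ L^p(B^d)\bigr)
    \;\asymp\; \epsilon^{-\frac{d-1}{2}\cdot\frac{pr}{r-p}}
  \qquad \text{for } p>\tfrac{dr}{d+(d-1)r},
  \ \text{where $B^d$ is the closed unit ball}.
\]
```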
We show that the number of independent sets in cocomparability graphs can be counted in linear time, as can the number of cliques in comparability graphs. By contrast, counting cliques in cocomparability graphs and counting independent sets in comparability graphs are #P-complete. We extend these results to counting maximal cliques and independent sets. We also consider the fixed-parameter
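The abstract does not spell out the algorithms, but one direction is easy to illustrate: in a comparability graph whose edges carry a transitive orientation, cliques are exactly the chains of the underlying partial order, so a single pass over a topological order counts them in time linear in the size of the oriented graph. The Python sketch below assumes such an orientation is given as a transitively closed DAG; the function and variable names are illustrative, not the paper's.

```python
# Illustrative sketch only: counting cliques in a comparability graph, assuming
# a transitive orientation of its edges is supplied as a DAG. Cliques correspond
# to chains of the partial order, counted by one pass over a topological order.
from collections import defaultdict

def count_cliques(num_vertices, arcs):
    """arcs: list of (u, v) with u -> v in a transitively closed orientation.
    Returns the number of non-empty cliques (= chains of the partial order)."""
    preds, succs = defaultdict(list), defaultdict(list)
    indeg = [0] * num_vertices
    for u, v in arcs:
        preds[v].append(u)
        succs[u].append(v)
        indeg[v] += 1

    # Kahn's algorithm for a topological order.
    order = []
    stack = [v for v in range(num_vertices) if indeg[v] == 0]
    while stack:
        u = stack.pop()
        order.append(u)
        for w in succs[u]:
            indeg[w] -= 1
            if indeg[w] == 0:
                stack.append(w)

    # chains_end[v] = number of chains whose maximum element is v.
    chains_end = [0] * num_vertices
    for v in order:
        chains_end[v] = 1 + sum(chains_end[u] for u in preds[v])
    return sum(chains_end)

# Example: the transitive closure of the chain 0 < 1 < 2 has 7 non-empty chains.
print(count_cliques(3, [(0, 1), (1, 2), (0, 2)]))  # prints 7
```

Counting independent sets in a cocomparability graph reduces to the same computation on the complement, which is a comparability graph.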
We show that the entropy of a message can be tested in a device-independent way. Specifically, we consider a prepare-and-measure scenario with classical or quantum communication, and develop two different methods for placing lower bounds on the communication entropy, given observable data. The first method is based on the framework of causal inference networks. The second technique, based on convex optimization, shows that quantum communication provides an advantage over classical, in the sense of requiring a lower entropy to reproduce given data. These ideas may serve as a basis for novel applications in device-independent quantum information processing.