We introduce a notion called entropic independence for distributions $\mu$ defined on pure simplicial complexes, i.e., subsets of size $k$ of a ground set of elements. Informally, we call a background measure $\mu$ entropically independent if, for any (possibly randomly chosen) set $S$, the relative entropy of an element of $S$ drawn uniformly at random carries at most an $O(1/k)$ fraction of the relative entropy of $S$, a constant multiple of its ``share of entropy.'' Entropic independence is the natural analog of spectral independence, another recently established notion, if one replaces variance by entropy. In our main result, we show that $\mu$ is entropically independent exactly when a transformed version of the generating polynomial of $\mu$ can be upper bounded by its linear tangent, a property implied by concavity of this transformation. We further show that this concavity is equivalent to spectral independence under arbitrary external fields, an assumption that also goes by the name of fractional log-concavity. Our result can be seen as a new tool to establish entropy contraction from much simpler variance contraction inequalities. A key differentiating feature of our result is that we make no assumptions on the marginals of $\mu$ or the degrees of the underlying graphical model when $\mu$ is based on one. We leverage our results to derive tight modified log-Sobolev inequalities for multi-step down-up walks on fractionally log-concave distributions. As our main application, we establish the tight mixing time of $O(n\log n)$ for Glauber dynamics on Ising models with interaction matrix of operator norm smaller than $1$, improving upon the prior quadratic dependence on $n$.
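To make the informal definition concrete, one plausible formalization (our reading of the variance-to-entropy analogy; the constant and the operator notation are assumptions, not quoted from the paper) uses the down operator $D_{k\to 1}$ that maps a distribution $\nu$ on size-$k$ sets to the law of a uniformly random element of a $\nu$-sampled set: $\mu$ is entropically independent with constant $\alpha$ if, for every distribution $\nu$ on the size-$k$ faces,

$$\mathcal{D}_{\mathrm{KL}}\big(\nu D_{k\to 1} \,\big\|\, \mu D_{k\to 1}\big) \;\le\; \frac{1}{\alpha k}\,\mathcal{D}_{\mathrm{KL}}\big(\nu \,\big\|\, \mu\big).$$

As a concrete illustration of the Ising application, the following is a minimal sketch of single-site Glauber dynamics for $\mu(\sigma) \propto \exp(\sigma^{\mathsf{T}} J \sigma / 2)$; the function name, the coupling values, and the step count are illustrative assumptions, not the paper's code.

    import numpy as np

    def glauber_step(sigma, J, rng):
        # One single-site update for mu(sigma) ~ exp(sigma^T J sigma / 2),
        # where sigma is in {-1,+1}^n and J is symmetric with zero diagonal.
        n = sigma.shape[0]
        i = rng.integers(n)                       # choose a uniformly random site
        m = J[i] @ sigma                          # local field at site i
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * m))   # P(sigma_i = +1 | rest)
        sigma[i] = 1 if rng.random() < p_plus else -1

    # Hypothetical usage: a chain coupling with operator norm < 1.
    rng = np.random.default_rng(0)
    n = 100
    J = np.zeros((n, n))
    idx = np.arange(n - 1)
    J[idx, idx + 1] = J[idx + 1, idx] = 0.3       # ||J||_op <= 0.6 < 1
    sigma = rng.choice(np.array([-1, 1]), size=n)
    for _ in range(int(5 * n * np.log(n))):       # O(n log n) steps per the abstract
        glauber_step(sigma, J, rng)

With $\|J\|_{\mathrm{op}} < 1$, the abstract's result says that on the order of $n\log n$ such updates suffice to mix.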
We show fully polynomial time randomized approximation schemes (FPRAS) for counting matchings of a given size, or more generally sampling/counting monomer-dimer systems, in planar, not-necessarily-bipartite graphs. While perfect matchings on planar graphs …
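For context, the FPRAS guarantee being claimed here is the standard one (a textbook definition, not specific to this abstract): on input a planar graph $G$, a size $k$, and an accuracy $\varepsilon > 0$, the algorithm outputs a random $\hat{Z}$ with

$$\Pr\big[(1-\varepsilon)\,Z \le \hat{Z} \le (1+\varepsilon)\,Z\big] \ge \tfrac{3}{4}, \qquad \text{in time } \mathrm{poly}(|G|, 1/\varepsilon),$$

where $Z$ is the number of matchings of size $k$ in $G$.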
The log-concave maximum likelihood estimator (MLE) problem asks: for a set of points $X_1,\ldots,X_n \in \mathbb{R}^d$, which log-concave density maximizes their likelihood? We present a characterization of the log-concave MLE that leads to an algorithm …
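A standard way to write the problem as stated (the notation $\mathcal{F}_d$ is ours): the log-concave MLE is

$$\hat{f}_n \in \operatorname*{arg\,max}_{f \in \mathcal{F}_d}\; \sum_{i=1}^{n} \log f(X_i), \qquad \mathcal{F}_d = \Big\{ e^{\varphi} : \varphi\colon \mathbb{R}^d \to [-\infty,\infty) \text{ concave},\ \int e^{\varphi} = 1 \Big\}.$$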
In his work on hypocoercivity, Villani [18] considers in particular convergence to equilibrium for the kinetic Langevin process. While his convergence results in $L^2$ are given in a quite general setting, convergence in entropy requires some bounded …
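For reference, the kinetic Langevin process in question is usually written as the degenerate diffusion (this normalization, with friction $\gamma > 0$ and unit temperature, is one common convention)

$$dX_t = V_t\,dt, \qquad dV_t = -\nabla U(X_t)\,dt - \gamma V_t\,dt + \sqrt{2\gamma}\,dB_t,$$

whose invariant measure is proportional to $e^{-U(x) - |v|^2/2}$; hypocoercivity techniques are needed because the noise acts only on the velocity variable.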
We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an application, we derive concentration bounds for the intri…
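For reference, the standard definition: a probability sequence $(p_k)_{k\ge 0}$ on the nonnegative integers is ultra log-concave if $k!\,p_k$ is log-concave in $k$, i.e.,

$$k\,p_k^2 \;\ge\; (k+1)\,p_{k-1}\,p_{k+1} \qquad \text{for all } k \ge 1;$$

equivalently, the ratio of $(p_k)$ to any Poisson probability mass function is log-concave, which is why Poisson-type concentration is the natural benchmark for this class.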
We investigate various geometric and functional inequalities for the class of log-concave probability sequences. We prove dilation inequalities for log-concave probability measures on the integers. A functional analog of this geometric inequality is …
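For reference, the class in question is defined by a weaker condition than the ultra log-concavity above: a probability sequence $(p_k)$ supported on a contiguous set of integers is log-concave if

$$p_k^2 \;\ge\; p_{k-1}\,p_{k+1} \qquad \text{for all } k.$$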