We introduce a notion called entropic independence for distributions $\mu$ defined on pure simplicial complexes, i.e., subsets of size $k$ of a ground set of elements. Informally, we call a background measure $\mu$ entropically independent if for any (possibly randomly chosen) set $S$, the relative entropy of an element of $S$ drawn uniformly at random carries at most an $O(1/k)$ fraction of the relative entropy of $S$, a constant multiple of its ``share of entropy.'' Entropic independence is the natural analog of spectral independence, another recently established notion, if one replaces variance by entropy. In our main result, we show that $\mu$ is entropically independent exactly when a transformed version of the generating polynomial of $\mu$ can be upper bounded by its linear tangent, a property implied by concavity of the said transformation. We further show that this concavity is equivalent to spectral independence under arbitrary external fields, an assumption that also goes by the name of fractional log-concavity. Our result can be seen as a new tool for establishing entropy contraction from the much simpler variance contraction inequalities. A key differentiating feature of our result is that we make no assumptions on the marginals of $\mu$ or the degrees of the underlying graphical model when $\mu$ is based on one. We leverage our results to derive tight modified log-Sobolev inequalities for multi-step down-up walks on fractionally log-concave distributions. As our main application, we establish the tight mixing time of $O(n\log n)$ for Glauber dynamics on Ising models with interaction matrix of operator norm smaller than $1$, improving upon the prior quadratic dependence on $n$.
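As a rough sketch of the informal statement above (in our own notation; the paper's exact parameterization may differ), write $D_{k\to 1}$ for the operator mapping a size-$k$ set to a uniformly random one of its elements. Entropic independence then asks that for every distribution $\nu$ on size-$k$ sets,
\[
D_{\mathrm{KL}}\!\left(\nu D_{k\to 1} \,\middle\|\, \mu D_{k\to 1}\right) \;\le\; \frac{C}{k}\, D_{\mathrm{KL}}(\nu \,\|\, \mu),
\]
so a single uniformly random element of $S \sim \nu$ carries at most a $C/k$ fraction of the relative entropy of $S$; replacing relative entropy by variance yields the analogous spectral-independence-type contraction.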
We show fully polynomial time randomized approximation schemes (FPRAS) for counting matchings of a given size, or more generally sampling/counting monomer-dimer systems, in planar (not necessarily bipartite) graphs. While perfect matchings on planar graphs can be counted exactly in polynomial time, counting non-perfect matchings was shown to be #P-hard by Jerrum [Jer87], who also raised the question of whether efficient approximate counting is possible. We answer this affirmatively by showing that the multi-site Glauber dynamics on the set of monomers in a monomer-dimer system always mixes rapidly, and that this dynamics can be implemented efficiently on downward-closed families of graphs where counting perfect matchings is tractable. As further applications of our results, we show how to sample efficiently using multi-site Glauber dynamics from partition-constrained strongly Rayleigh distributions and from nonsymmetric determinantal point processes. In order to analyze mixing properties of the multi-site Glauber dynamics, we establish two notions for generating polynomials of discrete set-valued distributions: sector-stability and fractional log-concavity. These notions generalize well-studied properties like real-stability and log-concavity, but unlike them robustly degrade under useful transformations applied to the distribution. We relate these notions to pairwise correlations in the underlying distribution and to the notion of spectral independence introduced by [ALO20], providing a new tool for establishing spectral independence based on the geometry of polynomials. As a byproduct of our techniques, we show that polynomials avoiding roots in a sector of the complex plane must satisfy what we call fractional log-concavity; this extends a classic result of Gårding [Gar59], who showed that homogeneous polynomials with no roots in a half-plane must be log-concave over the positive orthant.
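As a point of reference (stated in our own notation, with conventions that may differ from the paper's), for a distribution $\mu$ on subsets with multivariate generating polynomial $g_\mu(z) = \sum_S \mu(S) \prod_{i\in S} z_i$, fractional log-concavity with parameter $\alpha \in (0,1]$ can be phrased as concavity of the map
\[
z \;\longmapsto\; \log g_\mu\!\left(z_1^{\alpha}, \dots, z_n^{\alpha}\right)
\]
over the positive orthant; the case $\alpha = 1$ recovers ordinary log-concavity of $\log g_\mu$, and sector-stability plays the role for general $\alpha$ that real-stability plays when $\alpha = 1$.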
The log-concave maximum likelihood estimator (MLE) problem asks: for a set of points $X_1,\dots,X_n \in \mathbb{R}^d$, which log-concave density maximizes their likelihood? We present a characterization of the log-concave MLE that leads to an algorithm with runtime $\mathrm{poly}(n, d, \frac{1}{\epsilon}, r)$ to compute a log-concave distribution whose log-likelihood is at most $\epsilon$ less than that of the MLE, where $r$ is a parameter of the problem bounded by the $\ell_2$ norm of the vector of log-likelihoods of the MLE evaluated at $X_1,\dots,X_n$.
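For concreteness, the optimization problem in question (written in our notation) is
\[
\hat f \;\in\; \operatorname*{arg\,max}_{f \text{ log-concave density on } \mathbb{R}^d} \;\frac{1}{n} \sum_{i=1}^{n} \log f(X_i),
\]
and the algorithm returns a log-concave density whose objective value is within $\epsilon$ of this maximum.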
In his work on hypocoercivity, Villani [18] considers in particular convergence to equilibrium for the kinetic Langevin process. While his convergence results in $L^2$ are given in a quite general setting, convergence in entropy requires a boundedness condition on the Hessian of the Hamiltonian. We show here how to get rid of this assumption in the study of the hypocoercive entropic relaxation to equilibrium for the Langevin diffusion. Our method relies on a generalization of the multipliers method to entropy and on an adequate functional inequality. As a byproduct, we also give tractable conditions for this functional inequality, which is a particular instance of a weighted logarithmic Sobolev inequality, to hold.
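For reference, the kinetic Langevin process in question is (up to normalization conventions that vary between references) the diffusion on positions and velocities
\[
dX_t = V_t\,dt, \qquad dV_t = -\nabla H(X_t)\,dt - V_t\,dt + \sqrt{2}\,dB_t,
\]
whose invariant measure is proportional to $e^{-H(x) - |v|^2/2}\,dx\,dv$; the boundedness assumption removed here concerns the Hessian $\nabla^2 H$ of the Hamiltonian.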
We establish concentration inequalities for the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an application, we derive concentration bounds for the intrinsic volumes of a convex body, which generalize and improve a result of Lotz, McCoy, Nourdin, Peccati, and Tropp (2019).
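Recall (in standard notation) that a random variable $X$ supported on the non-negative integers, with probability mass function $p$, is ultra log-concave when
\[
k\,p(k)^2 \;\ge\; (k+1)\,p(k-1)\,p(k+1) \qquad \text{for all } k \ge 1,
\]
equivalently when $k \mapsto k!\,p(k)$ is log-concave, i.e., when $p$ is log-concave relative to the Poisson distribution; the Poisson distribution is thus the natural benchmark for the concentration bounds above.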
We investigate various geometric and functional inequalities for the class of log-concave probability sequences. We prove dilation inequalities for log-concave probability measures on the integers. A functional analog of this geometric inequality is derived, giving large and small deviation inequalities from a median in terms of a modulus of regularity. Our methods are of independent interest: we find that log-affine sequences are the extreme points of the set of log-concave sequences belonging to a half-space slice of the simplex. This amounts to a discrete analog of the localization lemma of Lovász and Simonovits. Further applications of this lemma are used to produce a discrete version of the Prékopa-Leindler inequality, to derive large deviation inequalities for log-concave measures about their mean, and to provide insight into the stability of generalized log-concavity under convolution.
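For context, the continuous Prékopa-Leindler inequality that the discrete version parallels states that if $f, g, h \colon \mathbb{R}^n \to [0,\infty)$ and $\lambda \in (0,1)$ satisfy $h\big((1-\lambda)x + \lambda y\big) \ge f(x)^{1-\lambda} g(y)^{\lambda}$ for all $x, y$, then
\[
\int_{\mathbb{R}^n} h \;\ge\; \left(\int_{\mathbb{R}^n} f\right)^{1-\lambda} \left(\int_{\mathbb{R}^n} g\right)^{\lambda};
\]
the discrete statement here concerns sequences on the integers, where the interpolation point $(1-\lambda)x + \lambda y$ must be handled with suitable care.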