We prove upper bounds on deterministic communication complexity in terms of log of the rank and simp
In this work we introduce, both for classical communication complexity and query complexity, a modification of the partition bound introduced by Jain and Klauck [2010]. We call it the public-coin partition bound. We show that (the logarithm to the base two of) its communication complexity and query complexi
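For orientation, the (non-public-coin) partition bound of Jain and Klauck referenced above can be written as a linear program over nonnegative weights $w_{R,z}$ on pairs of a combinatorial rectangle $R$ and an output $z$; the formulation below is one standard way to state it and may differ from the paper's exact normalization:
$$
\mathrm{prt}_{\varepsilon}(f) \;=\; \min \sum_{(R,z)} w_{R,z}
\quad\text{s.t.}\quad
\sum_{R \ni (x,y)} w_{R,\,f(x,y)} \;\geq\; 1-\varepsilon \ \ \forall (x,y) \in \mathrm{dom}(f),
\qquad
\sum_{z}\sum_{R \ni (x,y)} w_{R,z} \;=\; 1 \ \ \forall (x,y),
\qquad
w_{R,z} \;\geq\; 0,
$$
and $\log_2$ of the optimal value lower bounds the $\varepsilon$-error randomized communication complexity (with an analogous program, over subcubes, in the query setting).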
Three decades of research in communication complexity have led to the invention of a number of techniques to lower bound randomized communication complexity. The majority of these techniques involve properties of large submatrices (rectangles) of the truth-table matrix defining a communication problem. The only technique that does not quite fit is information complexity, which has been investigated over the last decade. Here, we connect information complexity to one of the most powerful rectangular techniques: the recently-introduced smooth corruption (or smooth rectangle) bound. We show that the former subsumes the latter under rectangular input distributions. We conjecture that this subsumption holds more generally, under arbitrary distributions, which would resolve the long-standing direct sum question for randomized communication. As an application, we obtain an optimal $\Omega(n)$ lower bound on the information complexity---under the {\em uniform distribution}---of the so-called orthogonality problem (ORT), which is in turn closely related to the much-studied Gap-Hamming-Distance (GHD). The proof of this bound is along the lines of recent communication lower bounds for GHD, but we encounter a surprising amount of additional technical detail.
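The abstract does not define the two partial functions it names, so the following Python sketch records one common formulation from the literature purely as a reference point; the specific gap thresholds ($n/2 \pm \sqrt{n}$ for GHD, $\sqrt{n}$ versus $2\sqrt{n}$ for ORT) are illustrative assumptions and may not match the paper's exact conventions.

    import math

    # Hedged sketch (not from the paper): one common formulation of the two
    # partial functions named above. The gap thresholds used here are
    # illustrative assumptions and may differ from the paper's conventions.

    def gap_hamming(x, y):
        """GHD on n-bit 0/1 strings: 1 if the Hamming distance is large,
        0 if it is small, None when the promise is violated."""
        n = len(x)
        dist = sum(a != b for a, b in zip(x, y))
        if dist >= n / 2 + math.sqrt(n):
            return 1
        if dist <= n / 2 - math.sqrt(n):
            return 0
        return None  # input outside the promise; the partial function is undefined

    def orthogonality(x, y):
        """ORT on +1/-1 vectors: 1 if the inner product is close to zero
        (nearly orthogonal), 0 if it is far from zero, None otherwise."""
        n = len(x)
        ip = sum(a * b for a, b in zip(x, y))
        if abs(ip) <= math.sqrt(n):
            return 1
        if abs(ip) >= 2 * math.sqrt(n):
            return 0
        return None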
Let $f: \{0,1\}^n \to \{0, 1\}$ be a boolean function, and let $f_{\land}(x, y) = f(x \land y)$ denote the AND-function of $f$, where $x \land y$ denotes bit-wise AND. We study the deterministic communication complexity of $f_{\land}$ and show that, up to a $\log n$ factor, it is bounded by a polynomial in the logarithm of the real rank of the communication matrix of $f_{\land}$. This comes within a $\log n$ factor of establishing the log-rank conjecture for AND-functions with no assumptions on $f$. Our result stands in contrast with previous results on special cases of the log-rank conjecture, which needed significant restrictions on $f$ such as monotonicity or low $\mathbb{F}_2$-degree. Our techniques can also be used to prove (within a $\log n$ factor) a lifting theorem for AND-functions, stating that the deterministic communication complexity of $f_{\land}$ is polynomially related to the AND-decision tree complexity of $f$. The results rely on a new structural result regarding boolean functions $f:\{0, 1\}^n \to \{0, 1\}$ with a sparse polynomial representation, which may be of independent interest. We show that if the polynomial computing $f$ has few monomials then the set system of the monomials has a small hitting set, of size poly-logarithmic in its sparsity. We also establish extensions of this result to multi-linear polynomials $f:\{0,1\}^n \to \mathbb{R}$ with a larger range.
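As a concrete illustration of the objects in the statement (not taken from the paper), the following short Python sketch builds the communication matrix of $f_{\land}$ for a small example $f$ and reports its real rank, the rank whose logarithm appears in the bound.

    import itertools
    import numpy as np

    # Minimal sketch (not from the paper): build the 2^n x 2^n communication
    # matrix M[x, y] = f(x AND y) for a small example f and report its real rank.

    n = 3
    def f(z):
        return int(any(z))  # example inner function: OR of the bits of z

    inputs = list(itertools.product([0, 1], repeat=n))
    M = np.array([[f(tuple(a & b for a, b in zip(x, y))) for y in inputs]
                  for x in inputs])
    print("real rank of the communication matrix of f_AND:", np.linalg.matrix_rank(M))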
We study the communication complexity of computing functions $F:\{0,1\}^n \times \{0,1\}^n \rightarrow \{0,1\}$ in the memoryless communication model. Here, Alice is given $x \in \{0,1\}^n$, Bob is given $y \in \{0,1\}^n$, and their goal is to compute $F(x,y)$ subject to the following constraint: at every round, Alice receives a message from Bob and her reply to Bob depends solely on the message received and her input $x$; the same applies to Bob. The cost of computing $F$ in this model is the maximum number of bits exchanged in any round between Alice and Bob (on the worst-case input $x, y$). In this paper, we also consider variants of our memoryless model wherein one party is allowed to have memory, the parties are allowed to communicate quantum bits, or only one player is allowed to send messages. We show that our memoryless communication model captures the garden-hose model of computation by Buhrman et al. (ITCS 2013), space-bounded communication complexity by Brody et al. (ITCS 2013), and the overlay communication complexity by Papakonstantinou et al. (CCC 2014). Thus the memoryless communication complexity model provides a unified framework to study space-bounded communication models. We establish the following: (1) we show that the memoryless communication complexity of $F$ equals the logarithm of the size of the smallest bipartite branching program computing $F$ (up to a factor of 2); (2) we show that memoryless communication complexity equals garden-hose complexity; (3) we exhibit various exponential separations between these memoryless communication models. We end with an intriguing open question: can we find an explicit function $F$ and a universal constant $c>1$ for which the memoryless communication complexity is at least $c \log n$? Note that $c \geq 2+\varepsilon$ would imply an $\Omega(n^{2+\varepsilon})$ lower bound for general formula size, improving upon the best lower bound by Ne\v{c}iporuk from 1966.
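To make the model concrete, here is a hypothetical toy example (not from the paper): a generic driver in which each party's reply depends only on the incoming message and its own input, instantiated with a simple equality protocol whose messages carry an index and one bit, so each round costs roughly $\log_2 n + O(1)$ bits.

    # Hypothetical toy example, assuming nothing beyond the model described above:
    # every reply is a function of (incoming message, own input) only, so all
    # state must travel inside the message itself.

    def run_memoryless(alice_next, bob_next, x, y, first_msg):
        msg, turn = first_msg, "alice"
        while True:
            msg = alice_next(msg, x) if turn == "alice" else bob_next(msg, y)
            if msg[0] == "halt":          # a halting message carries the output
                return msg[1]
            turn = "bob" if turn == "alice" else "alice"

    # Equality via index-carrying messages: each message holds an index and one
    # bit, so every round costs roughly log2(n) + O(1) bits.
    def alice_next(msg, x):
        _, i = msg                        # ("ask", i): reveal bit x[i]
        return ("bit", i, x[i])

    def bob_next(msg, y):
        _, i, bit = msg
        if bit != y[i]:
            return ("halt", 0)            # mismatch found: EQ(x, y) = 0
        if i + 1 == len(y):
            return ("halt", 1)            # every position matched: EQ(x, y) = 1
        return ("ask", i + 1)             # request the next position

    print(run_memoryless(alice_next, bob_next, [0, 1, 1, 0], [0, 1, 1, 0], ("ask", 0)))  # prints 1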
We give improved separations for the query complexity analogue of the log-approximate-rank conjecture, i.e., we show that there are a plethora of total Boolean functions on $n$ input bits, each of which has approximate Fourier sparsity at most $O(n^3)$ and randomized parity decision tree complexity $\Theta(n)$. This improves upon the recent work of Chattopadhyay, Mande and Sherif (JACM 2020) both qualitatively (in terms of designing a large number of examples) and quantitatively (improving the gap from quartic to cubic). We leave open the problem of proving a randomized communication complexity lower bound for XOR compositions of our examples. A linear lower bound would lead to new and improved refutations of the log-approximate-rank conjecture. Moreover, if any of these compositions had even a sub-linear-cost randomized communication protocol, it would demonstrate that randomized parity decision tree complexity does not lift to randomized communication complexity in general (with the XOR gadget).
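For readers unfamiliar with the measure, the short Python sketch below computes the exact Fourier sparsity (number of nonzero coefficients over the parity basis) of a small Boolean function; the abstract's quantity is the approximate sparsity, i.e. the smallest sparsity of any function pointwise close to $f$, for which the exact sparsity is an upper bound. This is an illustration only, not code from the paper.

    import itertools

    # Illustration only: the exact Fourier sparsity of a Boolean function f,
    # i.e. the number of nonzero coefficients of the +1/-1 encoding of f
    # over the parity (character) basis.

    def fourier_sparsity(f, n):
        sparsity = 0
        for S in itertools.product([0, 1], repeat=n):                # character chi_S
            coeff = 0.0
            for x in itertools.product([0, 1], repeat=n):
                chi = (-1) ** sum(s & xi for s, xi in zip(S, x))     # chi_S(x) = (-1)^{<S,x>}
                coeff += ((-1) ** f(x)) * chi                        # +1/-1 encoding of f(x)
            coeff /= 2 ** n
            if abs(coeff) > 1e-9:
                sparsity += 1
        return sparsity

    # Example: the 3-bit AND has all 2^3 = 8 parity coefficients nonzero.
    print(fourier_sparsity(lambda x: int(all(x)), 3))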