Help bits are some limited trusted information about an instance or instances of a computational problem that may reduce the computational complexity of solving that instance or instances. In this paper, we study the value of help bits in the settings of randomized and average-case complexity. Amir, Beigel, and Gasarch (1990) show that for constant $k$, if $k$ instances of a decision problem can be efficiently solved using less than $k$ bits of help, then the problem is in P/poly. We extend this result to the setting of randomized computation: We show that the decision problem is in P/poly if using $\ell$ help bits, $k$ instances of the problem can be efficiently solved with probability greater than $2^{\ell-k}$. The same result holds if using less than $k(1 - h(\alpha))$ help bits (where $h(\cdot)$ is the binary entropy function), we can efficiently solve a $(1-\alpha)$ fraction of the instances correctly with non-vanishing probability. We also extend these two results to non-constant but logarithmic $k$. In this case, however, instead of showing that the problem is in P/poly we show that it satisfies $k$-membership comparability, a notion known to be related to solving $k$ instances using less than $k$ bits of help. Next we consider the setting of average-case complexity: Assume that we can solve $k$ instances of a decision problem using some help bits whose entropy is less than $k$ when the $k$ instances are drawn independently from a particular distribution. Then we can efficiently solve an instance drawn from that distribution with probability better than $1/2$. Finally, we show that in the case where $k$ is super-logarithmic, assuming $k$-membership comparability of a decision problem, one cannot prove that the problem is in P/poly by a black-box proof.
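For reference, the bound $k(1 - h(\alpha))$ above uses the standard binary entropy function:

```latex
h(\alpha) \;=\; -\alpha \log_2 \alpha \;-\; (1-\alpha)\log_2(1-\alpha), \qquad \alpha \in [0,1],
```

with the convention $0 \log_2 0 = 0$. Since $h(\alpha) < 1$ for every $\alpha \neq 1/2$, the threshold $k(1-h(\alpha))$ is a positive fraction of $k$ whenever $\alpha \neq 1/2$, so the stated result applies with a nontrivial help-bit budget.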
How many bits of information are revealed by a learning algorithm for a concept class of VC-dimension $d$? Previous works have shown that even for $d=1$ the amount of information may be unbounded (tend to $\infty$ with the universe size). Can it be th
Information-theoretic methods have proven to be a very powerful tool in communication complexity, in particular giving an elegant proof of the linear lower bound for the two-party disjointness function, and tight lower bounds on disjointness in the m
Three decades of research in communication complexity have led to the invention of a number of techniques to lower bound randomized communication complexity. The majority of these techniques involve properties of large submatrices (rectangles) of the
We define nondeterministic communication complexity in the model of communication complexity with help of Babai, Hayes and Kimmel. We use it to prove logarithmic lower bounds on the NOF communication complexity of explicit graph functions, which are
We study the composition question for bounded-error randomized query complexity: Is $R(f \circ g) = \Omega(R(f)\,R(g))$ for all Boolean functions $f$ and $g$? We show that inserting a simple Boolean function $h$, whose query complexity is only $\Theta(\log R(g))$, in