
When Is Amplification Necessary for Composition in Randomized Query Complexity?

Added by Robin Kothari
Publication date: 2020
Language: English





Suppose we have randomized decision trees for an outer function $f$ and an inner function $g$. The natural approach for obtaining a randomized decision tree for the composed function $(f\circ g^n)(x^1,\ldots,x^n)=f(g(x^1),\ldots,g(x^n))$ involves amplifying the success probability of the decision tree for $g$, so that a union bound can be used to bound the error probability over all the coordinates. The amplification introduces a logarithmic factor cost overhead. We study the question: When is this log factor necessary? We show that when the outer function is parity or majority, the log factor can be necessary, even for models that are more powerful than plain randomized decision trees. Our results are related to, but qualitatively strengthen in various ways, known results about decision trees with noisy inputs.
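For context, the amplification approach gives the folklore upper bound sketched below (the constants are illustrative, not taken from the paper): amplify each evaluation of $g$ to error at most $1/(10n)$ by taking a majority vote over $O(\log n)$ independent repetitions (a Chernoff bound), run an error-$1/5$ algorithm for $f$ on the amplified answers, and take a union bound over the at most $n$ inner evaluations; the overall error is then at most $1/5+1/10<1/3$, giving
$$R_{1/3}(f\circ g^n) \in O\bigl(R_{1/3}(f)\cdot R_{1/3}(g)\cdot \log n\bigr).$$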



Related research

Let the randomized query complexity of a relation for error probability $\epsilon$ be denoted by $R_\epsilon(\cdot)$. We prove that for any relation $f \subseteq \{0,1\}^n \times \mathcal{R}$ and Boolean function $g:\{0,1\}^m \rightarrow \{0,1\}$, $R_{1/3}(f\circ g^n) = \Omega(R_{4/9}(f)\cdot R_{1/2-1/n^4}(g))$, where $f \circ g^n$ is the relation obtained by composing $f$ and $g$. We also show that $R_{1/3}\left(f \circ \left(g^{\oplus}_{O(\log n)}\right)^n\right)=\Omega(\log n \cdot R_{4/9}(f) \cdot R_{1/3}(g))$, where $g^{\oplus}_{O(\log n)}$ is the function obtained by composing the xor function on $O(\log n)$ bits and $g^t$.
Let $R_\epsilon(\cdot)$ stand for the bounded-error randomized query complexity with error $\epsilon > 0$. For any relation $f \subseteq \{0,1\}^n \times S$ and partial Boolean function $g \subseteq \{0,1\}^m \times \{0,1\}$, we show that $R_{1/3}(f \circ g^n) \in \Omega(R_{4/9}(f) \cdot \sqrt{R_{1/3}(g)})$, where $f \circ g^n \subseteq (\{0,1\}^m)^n \times S$ is the composition of $f$ and $g$. We give an example of a relation $f$ and partial Boolean function $g$ for which this lower bound is tight. We prove our composition theorem by introducing a new complexity measure, the max conflict complexity $\bar{\chi}(g)$ of a partial Boolean function $g$. We show $\bar{\chi}(g) \in \Omega(\sqrt{R_{1/3}(g)})$ for any (partial) function $g$ and $R_{1/3}(f \circ g^n) \in \Omega(R_{4/9}(f) \cdot \bar{\chi}(g))$; these two bounds imply our composition result. We further show that $\bar{\chi}(g)$ is always at least as large as the sabotage complexity of $g$, introduced by Ben-David and Kothari.
We prove two new results about the randomized query complexity of composed functions. First, we show that the randomized composition conjecture is false: there are families of partial Boolean functions $f$ and $g$ such that $R(f\circ g)\ll R(f) R(g)$. In fact, we show that the left hand side can be polynomially smaller than the right hand side (though in our construction, both sides are polylogarithmic in the input size of $f$). Second, we show that for all $f$ and $g$, $R(f\circ g)=\Omega(\mathop{noisyR}(f)\cdot R(g))$, where $\mathop{noisyR}(f)$ is a measure describing the cost of computing $f$ on noisy oracle inputs. We show that this composition theorem is the strongest possible of its type: for any measure $M(\cdot)$ satisfying $R(f\circ g)=\Omega(M(f)R(g))$ for all $f$ and $g$, it must hold that $\mathop{noisyR}(f)=\Omega(M(f))$ for all $f$. We also give a clean characterization of the measure $\mathop{noisyR}(f)$: it satisfies $\mathop{noisyR}(f)=\Theta(R(f\circ \mathrm{gapmaj}_n)/R(\mathrm{gapmaj}_n))$, where $n$ is the input size of $f$ and $\mathrm{gapmaj}_n$ is the $\sqrt{n}$-gap majority function on $n$ bits.
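One direction of this characterization already follows from the stated composition theorem (a quick check, with constants suppressed): taking $g=\mathrm{gapmaj}_n$ gives $R(f\circ \mathrm{gapmaj}_n)=\Omega(\mathop{noisyR}(f)\cdot R(\mathrm{gapmaj}_n))$, i.e. $\mathop{noisyR}(f)=O(R(f\circ \mathrm{gapmaj}_n)/R(\mathrm{gapmaj}_n))$; the matching bound in the other direction is what makes the characterization tight.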
Let $f:\{0,1\}^n \rightarrow \{0,1\}$ be a Boolean function. The certificate complexity $C(f)$ is a complexity measure that is quadratically tight for the zero-error randomized query complexity $R_0(f)$: $C(f) \leq R_0(f) \leq C(f)^2$. In this paper we study a new complexity measure that we call expectational certificate complexity $EC(f)$, which is also a quadratically tight bound on $R_0(f)$: $EC(f) \leq R_0(f) = O(EC(f)^2)$. We prove that $EC(f) \leq C(f) \leq EC(f)^2$ and show that there is a quadratic separation between the two, thus $EC(f)$ gives a tighter upper bound for $R_0(f)$. The measure is also related to the fractional certificate complexity $FC(f)$ as follows: $FC(f) \leq EC(f) = O(FC(f)^{3/2})$. This also connects to an open question by Aaronson whether $FC(f)$ is a quadratically tight bound for $R_0(f)$, as $EC(f)$ is in fact a relaxation of $FC(f)$. In the second part of the work, we upper bound the distributional query complexity $D^\mu_\epsilon(f)$ for product distributions $\mu$ by the square of the query corruption bound ($\mathrm{corr}_\epsilon(f)$), which improves upon a result of Harsha, Jain and Radhakrishnan [2015]. A similar statement for communication complexity is open.
Let $f\subseteq\{0,1\}^n\times\Xi$ be a relation and $g:\{0,1\}^m\to\{0,1,*\}$ be a promise function. This work investigates the randomised query complexity of the relation $f\circ g^n\subseteq\{0,1\}^{m\cdot n}\times\Xi$, which can be viewed as one of the most general cases of composition in the query model (letting $g$ be a relation seems to result in a rather unnatural definition of $f\circ g^n$). We show that for every such $f$ and $g$, $$\mathcal{R}(f\circ g^n) \in \Omega(\mathcal{R}(f)\cdot\sqrt{\mathcal{R}(g)}),$$ where $\mathcal{R}$ denotes the randomised query complexity. On the other hand, we demonstrate a relation $f_0$ and a promise function $g_0$ such that $\mathcal{R}(f_0)\in\Theta(\sqrt{n})$, $\mathcal{R}(g_0)\in\Theta(n)$ and $\mathcal{R}(f_0\circ g_0^n)\in\Theta(n)$ $-$ that is, our composition statement is tight. To the best of our knowledge, there was no known composition theorem for the randomised query complexity of relations or promise functions (and for the special case of total functions our lower bound gives a multiplicative improvement of $\sqrt{\log n}$).
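As a quick arithmetic check of the tightness claim: for this pair, $\mathcal{R}(f_0)\cdot\sqrt{\mathcal{R}(g_0)} \in \Theta(\sqrt{n})\cdot\Theta(\sqrt{n}) = \Theta(n)$, which matches $\mathcal{R}(f_0\circ g_0^n)\in\Theta(n)$, so the lower bound $\Omega(\mathcal{R}(f)\cdot\sqrt{\mathcal{R}(g)})$ is attained.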
