
A composition theorem for randomized query complexity via max conflict complexity

Added by Troy Lee
Publication date: 2018
Language: English





Let $R_\epsilon(\cdot)$ stand for the bounded-error randomized query complexity with error $\epsilon > 0$. For any relation $f \subseteq \{0,1\}^n \times S$ and partial Boolean function $g \subseteq \{0,1\}^m \times \{0,1\}$, we show that $R_{1/3}(f \circ g^n) \in \Omega(R_{4/9}(f) \cdot \sqrt{R_{1/3}(g)})$, where $f \circ g^n \subseteq (\{0,1\}^m)^n \times S$ is the composition of $f$ and $g$. We give an example of a relation $f$ and partial Boolean function $g$ for which this lower bound is tight. We prove our composition theorem by introducing a new complexity measure, the max conflict complexity $\bar\chi(g)$ of a partial Boolean function $g$. We show $\bar\chi(g) \in \Omega(\sqrt{R_{1/3}(g)})$ for any (partial) function $g$ and $R_{1/3}(f \circ g^n) \in \Omega(R_{4/9}(f) \cdot \bar\chi(g))$; these two bounds imply our composition result. We further show that $\bar\chi(g)$ is always at least as large as the sabotage complexity of $g$, introduced by Ben-David and Kothari.
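Spelled out, the two intermediate bounds above chain directly into the main theorem (a restatement in display form, using only quantities defined in the abstract; here $(x^1,\ldots,x^n,s) \in f \circ g^n$ if and only if $(g(x^1),\ldots,g(x^n),s) \in f$, defined whenever each $x^i$ lies in the domain of $g$):

$$R_{1/3}(f \circ g^n) \in \Omega\bigl(R_{4/9}(f) \cdot \bar\chi(g)\bigr) \quad\text{and}\quad \bar\chi(g) \in \Omega\bigl(\sqrt{R_{1/3}(g)}\bigr) \;\Longrightarrow\; R_{1/3}(f \circ g^n) \in \Omega\bigl(R_{4/9}(f) \cdot \sqrt{R_{1/3}(g)}\bigr).$$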


Related research

Let the randomized query complexity of a relation for error probability $\epsilon$ be denoted by $R_\epsilon(\cdot)$. We prove that for any relation $f \subseteq \{0,1\}^n \times \mathcal{R}$ and Boolean function $g:\{0,1\}^m \rightarrow \{0,1\}$, $R_{1/3}(f \circ g^n) = \Omega(R_{4/9}(f) \cdot R_{1/2-1/n^4}(g))$, where $f \circ g^n$ is the relation obtained by composing $f$ and $g$. We also show that $R_{1/3}\left(f \circ \left(g^{\oplus}_{O(\log n)}\right)^n\right) = \Omega(\log n \cdot R_{4/9}(f) \cdot R_{1/3}(g))$, where $g^{\oplus}_{O(\log n)}$ is the function obtained by composing the xor function on $O(\log n)$ bits and $g^t$.
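For context, the first bound above measures the inner function at the large error $1/2 - 1/n^4$. By standard majority-vote amplification (a textbook fact, not a claim of this abstract), boosting error $1/2 - \delta$ down to $1/3$ costs $O(1/\delta^2)$ independent repetitions, so the two error regimes are related by

$$R_{1/3}(g) = O\bigl(n^8 \cdot R_{1/2 - 1/n^4}(g)\bigr).$$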
Swagato Sanyal, 2018
Let $R(\cdot)$ stand for the bounded-error randomized query complexity. We show that for any relation $f \subseteq \{0,1\}^n \times \mathcal{S}$ and partial Boolean function $g \subseteq \{0,1\}^n \times \{0,1\}$, $R_{1/3}(f \circ g^n) = \Omega(R_{4/9}(f) \cdot \sqrt{R_{1/3}(g)})$. Independently of us, Gavinsky, Lee and Santha proved this result, and an example demonstrated in their work shows that this bound is optimal. We prove our result by introducing a novel complexity measure called the conflict complexity of a partial Boolean function $g$, denoted by $\chi(g)$, which may be of independent interest. We show that $\chi(g) = \Omega(\sqrt{R(g)})$ and $R(f \circ g^n) = \Omega(R(f) \cdot \chi(g))$.
We prove two new results about the randomized query complexity of composed functions. First, we show that the randomized composition conjecture is false: there are families of partial Boolean functions $f$ and $g$ such that $R(f \circ g) \ll R(f) R(g)$. In fact, we show that the left-hand side can be polynomially smaller than the right-hand side (though in our construction, both sides are polylogarithmic in the input size of $f$). Second, we show that for all $f$ and $g$, $R(f \circ g) = \Omega(\mathrm{noisyR}(f) \cdot R(g))$, where $\mathrm{noisyR}(f)$ is a measure describing the cost of computing $f$ on noisy oracle inputs. We show that this composition theorem is the strongest possible of its type: for any measure $M(\cdot)$ satisfying $R(f \circ g) = \Omega(M(f) R(g))$ for all $f$ and $g$, it must hold that $\mathrm{noisyR}(f) = \Omega(M(f))$ for all $f$. We also give a clean characterization of the measure $\mathrm{noisyR}(f)$: it satisfies $\mathrm{noisyR}(f) = \Theta(R(f \circ \mathrm{gapmaj}_n)/R(\mathrm{gapmaj}_n))$, where $n$ is the input size of $f$ and $\mathrm{gapmaj}_n$ is the $\sqrt{n}$-gap majority function on $n$ bits.
Suppose we have randomized decision trees for an outer function $f$ and an inner function $g$. The natural approach for obtaining a randomized decision tree for the composed function $(f \circ g^n)(x^1,\ldots,x^n) = f(g(x^1),\ldots,g(x^n))$ involves amplifying the success probability of the decision tree for $g$, so that a union bound can be used to bound the error probability over all the coordinates. The amplification introduces a logarithmic factor cost overhead. We study the question: when is this log factor necessary? We show that when the outer function is parity or majority, the log factor can be necessary, even for models that are more powerful than plain randomized decision trees. Our results are related to, but qualitatively strengthen in various ways, known results about decision trees with noisy inputs.
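To make the natural approach concrete, here is a minimal Python sketch (our illustration, not code from the paper) of the amplify-then-union-bound strategy. The names eval_g, eval_f, noisy_or and the constant 24 in the repetition count are illustrative assumptions; each call to eval_g stands in for one run of a 1/3-error randomized decision tree for g.

import math
import random
from typing import Callable, List, Sequence


def amplified_g(eval_g: Callable[[Sequence[int]], int],
                x: Sequence[int],
                repetitions: int) -> int:
    """Majority vote over independent runs of a 1/3-error evaluator for g."""
    votes = sum(eval_g(x) for _ in range(repetitions))
    return 1 if 2 * votes > repetitions else 0


def compose_naively(eval_f: Callable[[List[int]], int],
                    eval_g: Callable[[Sequence[int]], int],
                    inputs: List[Sequence[int]]) -> int:
    """Evaluate (f o g^n) on (x^1, ..., x^n) with the log-factor overhead."""
    n = len(inputs)
    # O(log n) repetitions push each coordinate's error to O(1/n) (Chernoff bound),
    # so a union bound over the n coordinates keeps the overall error constant.
    repetitions = max(1, math.ceil(24 * math.log(max(n, 2))))
    ys = [amplified_g(eval_g, x, repetitions) for x in inputs]
    return eval_f(ys)


if __name__ == "__main__":
    # Toy instance: outer f = parity on n bits, inner g = OR on 3 bits,
    # where the evaluator for g flips its answer with probability 1/3.
    def noisy_or(x: Sequence[int]) -> int:
        correct = int(any(x))
        return correct if random.random() > 1 / 3 else 1 - correct

    xs = [(0, 1, 0), (0, 0, 0), (1, 1, 0), (0, 0, 1)]
    print(compose_naively(lambda ys: sum(ys) % 2, noisy_or, xs))

The repetitions variable is exactly where the $O(\log n)$ overhead enters; whether this factor is ever avoidable is the question the abstract studies.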
Let $f:\{0,1\}^n \rightarrow \{0,1\}$ be a Boolean function. The certificate complexity $C(f)$ is a complexity measure that is quadratically tight for the zero-error randomized query complexity $R_0(f)$: $C(f) \leq R_0(f) \leq C(f)^2$. In this paper we study a new complexity measure that we call expectational certificate complexity $EC(f)$, which is also a quadratically tight bound on $R_0(f)$: $EC(f) \leq R_0(f) = O(EC(f)^2)$. We prove that $EC(f) \leq C(f) \leq EC(f)^2$ and show that there is a quadratic separation between the two, thus $EC(f)$ gives a tighter upper bound for $R_0(f)$. The measure is also related to the fractional certificate complexity $FC(f)$ as follows: $FC(f) \leq EC(f) = O(FC(f)^{3/2})$. This also connects to an open question by Aaronson whether $FC(f)$ is a quadratically tight bound for $R_0(f)$, as $EC(f)$ is in fact a relaxation of $FC(f)$. In the second part of the work, we upper bound the distributional query complexity $D^\mu_\epsilon(f)$ for product distributions $\mu$ by the square of the query corruption bound ($\mathrm{corr}_\epsilon(f)$), which improves upon a result of Harsha, Jain and Radhakrishnan [2015]. A similar statement for communication complexity is open.
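Collected in one place, the relations stated in this abstract are

$$FC(f) \leq EC(f) \leq C(f) \leq EC(f)^2, \qquad EC(f) \leq R_0(f) = O(EC(f)^2), \qquad EC(f) = O(FC(f)^{3/2}).$$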