
On Exponential Time Lower Bound of Knapsack under Backtracking

Added by Xin Li
Publication date: 2007
Language: English





M. Alekhnovich et al. have recently proposed a model of algorithms, called the BT (backtracking) model, which generalizes both the priority model of Borodin, Nielsen and Rackoff and a simple dynamic programming model due to Woeginger. The BT model comes in three variants: fixed, adaptive, and fully adaptive. Alekhnovich et al. proved exponential time lower bounds for exact and approximation algorithms for the Knapsack problem under the adaptive BT model. Their exact lower bound is $\Omega(2^{0.5n}/\sqrt{n})$; in this paper we slightly improve the exact lower bound to about $\Omega(2^{0.69n}/\sqrt{n})$, using the same technique with the related parameters optimized.
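For a rough sense of scale (a back-of-the-envelope comparison, not part of the abstract above), the improvement in the exponent corresponds to a larger base of the exponential: $2^{0.5} \approx 1.41$ while $2^{0.69} \approx 1.61$, so the bound strengthens from roughly $\Omega(1.41^n/\sqrt{n})$ to roughly $\Omega(1.61^n/\sqrt{n})$.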




Read More

MapReduce (and its open source implementation Hadoop) has become the de facto platform for processing large data sets. MapReduce offers a streamlined computational framework by interleaving sequential and parallel computation while hiding underlying system issues from the programmer. Due to the popularity of MapReduce, there have been attempts in the theoretical computer science community to understand the power and limitations of the MapReduce framework. In the most widely studied MapReduce models, each machine has memory sub-linear in the input size to the problem, and hence cannot see the entire input. This restriction places many limitations on the algorithms that can be developed for the model; however, the current understanding of these restrictions is still limited. In this paper, our goal is to work towards understanding problems which do not admit efficient algorithms in the MapReduce model. We study the basic question of determining whether a graph is connected or not. We concentrate on instances of this problem where an algorithm must determine whether a graph consists of a single cycle or of two disconnected cycles. In this problem, locally every part of the graph looks similar, and the goal is to determine the global structure of the graph. We consider a natural class of algorithms that can store/process/transfer information only in the form of paths, and show that no randomized algorithm in this class can answer the decision question in a sub-logarithmic number of rounds. Currently, there are no absolute super-constant lower bounds on the number of rounds known for any problem in MapReduce. We introduce some of the first lower bounds for a natural graph problem, albeit for a restricted class of algorithms. We believe our result makes progress towards understanding the limitations of MapReduce.
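To make the one-cycle-versus-two-cycles instance concrete, here is a minimal Python sketch (illustrative only; the construction and names are my own, not from the paper) that generates the two kinds of inputs as edge lists. Locally both look identical, since every vertex has degree 2; distinguishing them requires discovering the global structure.

    import random

    def single_cycle(n):
        # One cycle through all n vertices, visited in a random order.
        order = list(range(n))
        random.shuffle(order)
        return [(order[i], order[(i + 1) % n]) for i in range(n)]

    def two_cycles(n):
        # Two disjoint cycles of length n/2 each (n is assumed even).
        order = list(range(n))
        random.shuffle(order)
        half = n // 2
        first, second = order[:half], order[half:]
        edges = [(first[i], first[(i + 1) % half]) for i in range(half)]
        edges += [(second[i], second[(i + 1) % half]) for i in range(half)]
        return edges

    # Both inputs have n edges and every vertex has degree exactly 2.
    print(len(single_cycle(10)), len(two_cycles(10)))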
We prove that with high probability over the choice of a random graph $G$ from the Erdős-Rényi distribution $G(n,1/2)$, a natural $n^{O(\varepsilon^2 \log n)}$-time, degree $O(\varepsilon^2 \log n)$ sum-of-squares semidefinite program cannot refute the existence of a valid $k$-coloring of $G$ for $k = n^{1/2+\varepsilon}$. Our result implies that the refutation guarantee of the basic semidefinite program (a close variant of the Lovász theta function) cannot be appreciably improved by a natural $o(\log n)$-degree sum-of-squares strengthening, and this is tight up to a $n^{o(1)}$ slack in $k$. To the best of our knowledge, this is the first lower bound for coloring $G(n,1/2)$ for even a single round strengthening of the basic SDP in any SDP hierarchy. Our proof relies on a new variant of instance-preserving non-pointwise complete reduction within SoS from coloring a graph to finding large independent sets in it. Our proof is (perhaps surprisingly) short, simple and does not require complicated spectral norm bounds on random matrices with dependent entries that have been otherwise necessary in the proofs of many similar results [BHK+16, HKP+17, KB19, GJJ+20, MRX20]. Our result formally holds for a constraint system where vertices are allowed to belong to multiple color classes; we leave the extension to the formally stronger formulation of coloring, where vertices must belong to unique color classes, as an outstanding open problem.
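As an illustrative instantiation (my own arithmetic, not taken from the abstract): setting $\varepsilon = 0.1$, the statement says that the degree $O(0.01 \log n)$ sum-of-squares relaxation cannot refute $k$-colorability for $k = n^{0.6}$, even though the chromatic number of $G(n,1/2)$ is $(1+o(1))\, n/(2\log_2 n)$ with high probability, so the relaxation's refutation threshold is off from the truth by roughly a factor of $n^{0.4}$ up to polylogarithmic terms.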
Xiaogang Liu (2015)
Francis Castro et al. [2] computed the exact divisibility of families of exponential sums associated to binomials $F(X) = aX^{d_1} + bX^{d_2}$ over $\mathbb{F}_p$, and presented a conjecture for related work. Here we study this question.
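For readers unfamiliar with the object, the exponential sum associated to $F$ over $\mathbb{F}_p$ is typically $S(F) = \sum_{x \in \mathbb{F}_p} e^{2\pi i F(x)/p}$, and the divisibility in question refers to divisibility by powers of $p$. The short Python sketch below, which is only illustrative and not from the paper, evaluates $S(F)$ numerically for a small binomial.

    import cmath

    def exponential_sum(p, a, b, d1, d2):
        # S(F) = sum over x in F_p of exp(2*pi*i*F(x)/p),
        # where F(X) = a*X^d1 + b*X^d2.
        total = 0 + 0j
        for x in range(p):
            fx = (a * pow(x, d1, p) + b * pow(x, d2, p)) % p
            total += cmath.exp(2j * cmath.pi * fx / p)
        return total

    # Example: F(X) = 2*X^3 + X over F_7.
    print(exponential_sum(7, 2, 1, 3, 1))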
Henry Yuen (2013)
The problem of distinguishing between a random function and a random permutation on a domain of size $N$ is important in theoretical cryptography, where the security of many primitives depends on the problem's hardness. We study the quantum query complexity of this problem, and show that any quantum algorithm that solves it with bounded error must make $\Omega(N^{1/5}/\log N)$ queries to the input function. Our lower bound proof uses a combination of the Collision Problem lower bound and Ambainis's adversary theorem.
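For intuition only (my own illustration of the classical analogue, not the paper's quantum algorithm): classically one can distinguish the two cases by querying random inputs and watching for a collision, since a permutation never maps two distinct inputs to the same output, whereas a random function is likely to do so after about $\sqrt{N}$ queries by the birthday bound.

    import random

    def guess_is_function(oracle, domain_size, n_queries):
        # Classical heuristic distinguisher: report True ("random function")
        # if two distinct queried inputs collide on the same output.
        seen = {}
        for _ in range(n_queries):
            x = random.randrange(domain_size)
            y = oracle(x)
            if y in seen and seen[y] != x:
                return True   # a collision rules out a permutation
            seen[y] = x
        return False          # no collision seen: guess "permutation"

    N = 10_000
    table = [random.randrange(N) for _ in range(N)]   # random function
    perm = random.sample(range(N), N)                 # random permutation
    print(guess_is_function(lambda x: table[x], N, 300))   # usually True
    print(guess_is_function(lambda x: perm[x], N, 300))    # always False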
We prove that with high probability over the choice of a random graph $G$ from the Erdős-Rényi distribution $G(n,1/2)$, the $n^{O(d)}$-time degree-$d$ Sum-of-Squares semidefinite programming relaxation for the clique problem will give a value of at least $n^{1/2 - c(d/\log n)^{1/2}}$ for some constant $c>0$. This yields a nearly tight $n^{1/2-o(1)}$ bound on the value of this program for any degree $d = o(\log n)$. Moreover, we introduce a new framework that we call pseudo-calibration to construct Sum-of-Squares lower bounds. This framework is inspired by taking a computational analog of Bayesian probability theory. It yields a general recipe for constructing good pseudo-distributions (i.e., dual certificates for the Sum-of-Squares semidefinite program), and sheds further light on the ways in which this hierarchy differs from others.
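For scale (my own remark, not part of the abstract): the clique number of $G(n,1/2)$ is $(2+o(1))\log_2 n$ with high probability, so a degree-$d$ Sum-of-Squares value of $n^{1/2 - c(d/\log n)^{1/2}}$ means that at any constant degree the relaxation overestimates the true clique number by a factor of $n^{1/2 - o(1)}$.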