
Quantum algorithms for graph problems with cut queries

Added by Troy Lee
Publication date: 2020
Language: English





Let $G$ be an $n$-vertex graph with $m$ edges. When asked a subset $S$ of vertices, a cut query on $G$ returns the number of edges of $G$ that have exactly one endpoint in $S$. We show that there is a bounded-error quantum algorithm that determines all connected components of $G$ after making $O(\log(n)^6)$ many cut queries. In contrast, it follows from results in communication complexity that any randomized algorithm, even just to decide whether the graph is connected or not, must make at least $\Omega(n/\log(n))$ many cut queries. We further show that with $O(\log(n)^8)$ many cut queries a quantum algorithm can with high probability output a spanning forest for $G$. En route to proving these results, we design quantum algorithms for learning a graph using cut queries. We show that a quantum algorithm can learn a graph with maximum degree $d$ after $O(d \log(n)^2)$ many cut queries, and can learn a general graph with $O(\sqrt{m} \log(n)^{3/2})$ many cut queries. These two upper bounds are tight up to the poly-logarithmic factors, and compare to $\Omega(dn)$ and $\Omega(m/\log(n))$ lower bounds on the number of cut queries needed by a randomized algorithm for the same problems, respectively. The key ingredients in our results are the Bernstein-Vazirani algorithm, approximate counting with OR queries, and learning sparse vectors from inner products as in compressed sensing.
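The cut-query model above is easy to simulate classically, and a small sketch helps make the "inner products" connection concrete: three cut queries suffice to learn $|N(v) \cap S|$, the number of neighbours of a vertex $v$ inside a set $S$, via the identity $\mathrm{cut}(S \cup \{v\}) = \mathrm{cut}(S) + \deg(v) - 2|N(v) \cap S|$. This is roughly the kind of linear information that Bernstein-Vazirani-style and compressed-sensing-style subroutines exploit. The helper names below (`cut_query`, `neighbors_in`) and the edge-list representation are illustrative choices, not part of the paper.

```python
# A classical simulation of the cut-query model, for illustration only.
def cut_query(edges, S):
    """Number of edges with exactly one endpoint in the vertex set S."""
    S = set(S)
    return sum((u in S) != (v in S) for u, v in edges)

def neighbors_in(edges, v, S):
    """Recover |N(v) ∩ S| for a vertex v outside S from three cut queries,
    using cut(S ∪ {v}) = cut(S) + deg(v) - 2|N(v) ∩ S|."""
    S = set(S)
    assert v not in S
    deg_v = cut_query(edges, {v})                      # cut({v}) = deg(v)
    return (cut_query(edges, S) + deg_v - cut_query(edges, S | {v})) // 2

# Path 0-1-2-3 plus the chord (0, 2).
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
print(cut_query(edges, {1, 2}))           # 3 edges leave the set {1, 2}
print(neighbors_in(edges, 0, {1, 2, 3}))  # vertex 0 has 2 neighbours in {1, 2, 3}
```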



Related research

The area of computing with uncertainty considers problems where some information about the input elements is uncertain, but can be obtained using queries. For example, instead of the weight of an element, we may be given an interval that is guaranteed to contain the weight, and a query can be performed to reveal the weight. While previous work has considered models where queries are asked either sequentially (adaptive model) or all at once (non-adaptive model), and the goal is to minimize the number of queries that are needed to solve the given problem, we propose and study a new model where $k$ queries can be made in parallel in each round, and the goal is to minimize the number of query rounds. We use competitive analysis and present upper and lower bounds on the number of query rounds required by any algorithm in comparison with the optimal number of query rounds. Given a set of uncertain elements and a family of $m$ subsets of that set, we present an algorithm for determining the value of the minimum of each of the subsets that requires at most $(2+\varepsilon) \cdot \mathrm{opt}_k + \mathrm{O}\left(\frac{1}{\varepsilon} \cdot \lg m\right)$ rounds for every $0<\varepsilon<1$, where $\mathrm{opt}_k$ is the optimal number of rounds, as well as nearly matching lower bounds. For the problem of determining the $i$-th smallest value and identifying all elements with that value in a set of uncertain elements, we give a $2$-round-competitive algorithm. We also show that the problem of sorting a family of sets of uncertain elements admits a $2$-round-competitive algorithm and this is the best possible.
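As a minimal, purely illustrative sketch of the round model (not the $(2+\varepsilon)$-competitive algorithm of the paper): the routine below determines the exact minimum value of a single set of uncertain elements, issuing at most $k$ queries per round and greedily querying the unresolved elements with the smallest interval lower bounds. The function names and the greedy strategy are assumptions made for illustration.

```python
def min_value_in_rounds(intervals, reveal, k):
    """Determine the exact minimum value of a set of uncertain elements.

    intervals[i] = (lo_i, hi_i) is known up front; reveal(i) queries element i
    and returns its true value w_i in [lo_i, hi_i].  At most k queries are
    issued per round.  Strategy (illustrative only): each round, query the
    unresolved elements with the smallest lower bounds.  Returns (min, rounds).
    """
    known = {}                                  # index -> revealed value
    rounds = 0
    while True:
        best = min(known.values(), default=float("inf"))
        # Elements whose interval could still hide a value below `best`.
        open_ids = [i for i in range(len(intervals))
                    if i not in known and intervals[i][0] < best]
        if not open_ids:
            return best, rounds                 # the minimum is now certain
        open_ids.sort(key=lambda i: intervals[i][0])
        rounds += 1
        for i in open_ids[:k]:                  # one parallel round, <= k queries
            known[i] = reveal(i)

# Example: three uncertain weights, two parallel queries allowed per round.
true_w = [5.0, 3.5, 7.0]
intervals = [(4.0, 6.0), (3.0, 4.0), (6.5, 8.0)]
print(min_value_in_rounds(intervals, lambda i: true_w[i], k=2))   # (3.5, 1)
```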
Graph cut problems form a fundamental problem type in combinatorial optimization, and are a central object of study in both theory and practice. In addition, the study of fairness in Algorithmic Design and Machine Learning has recently received significant attention, with many different notions proposed and analyzed in a variety of contexts. In this paper we initiate the study of fairness for graph cut problems by giving the first fair definitions for them, and subsequently we demonstrate appropriate algorithmic techniques that yield a rigorous theoretical analysis. Specifically, we incorporate two different definitions of fairness, namely demographic and probabilistic individual fairness, in a particular cut problem modeling disaster containment scenarios. Our results include a variety of approximation algorithms with provable theoretical guarantees.
We study two variants of \textsc{Maximum Cut}, which we call \textsc{Connected Maximum Cut} and \textsc{Maximum Minimal Cut}, in this paper. In these problems, given an unweighted graph, the goal is to compute a maximum cut satisfying some connectivity requirements. Both problems are known to be NP-complete even on planar graphs whereas \textsc{Maximum Cut} on planar graphs is solvable in polynomial time. We first show that these problems are NP-complete even on planar bipartite graphs and split graphs. Then we give parameterized algorithms using graph parameters such as clique-width, tree-width, and twin-cover number. Finally, we obtain FPT algorithms with respect to the solution size.
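The abstract leaves the exact connectivity requirement unspecified; in one common formulation of \textsc{Connected Maximum Cut}, one side of the cut must induce a connected subgraph. The exhaustive-search sketch below is only a reference implementation of that formulation for tiny graphs (all names are illustrative), and it also shows why the requirement matters: on the path $0$-$1$-$2$-$3$ the ordinary maximum cut has value 3, but the best connected cut has value 2.

```python
from itertools import combinations

def is_connected(edges, subset):
    """Check that the subgraph induced by `subset` is connected (DFS)."""
    subset = set(subset)
    if not subset:
        return False
    adj = {v: set() for v in subset}
    for u, v in edges:
        if u in subset and v in subset:
            adj[u].add(v)
            adj[v].add(u)
    start = next(iter(subset))
    seen, stack = {start}, [start]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return seen == subset

def connected_max_cut(vertices, edges):
    """Exhaustive search: maximise the number of cut edges over all proper,
    non-empty S ⊂ V such that the induced subgraph G[S] is connected."""
    best, best_S = -1, None
    for r in range(1, len(vertices)):
        for S in map(set, combinations(vertices, r)):
            if not is_connected(edges, S):
                continue
            value = sum((u in S) != (v in S) for u, v in edges)
            if value > best:
                best, best_S = value, S
    return best, best_S

# Path 0-1-2-3: the ordinary max cut is 3 (S = {0, 2}), but {0, 2} is not
# connected, so the connected maximum cut is only 2 (e.g. S = {1}).
path = [(0, 1), (1, 2), (2, 3)]
print(connected_max_cut([0, 1, 2, 3], path))   # (2, {1})
```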
Petr Kolman, 2017
Given a graph $G=(V,E)$ with two distinguished vertices $s,t \in V$ and an integer parameter $L>0$, an {\em $L$-bounded cut} is a subset $F$ of edges (vertices) such that every path between $s$ and $t$ in $G \setminus F$ has length more than $L$. The task is to find an $L$-bounded cut of minimum cardinality. Though the problem is very simple to state and has been studied since the beginning of the 70s, it is not much understood yet. The problem is known to be $\mathcal{NP}$-hard to approximate within a small constant factor even for $L \geq 4$ (for $L \geq 5$ for the vertex cuts). On the other hand, the best known approximation algorithm for general graphs has approximation ratio only $\mathcal{O}(n^{2/3})$ in the edge case, and $\mathcal{O}(\sqrt{n})$ in the vertex case, where $n$ denotes the number of vertices. We show that for planar graphs, it is possible to solve both the edge- and the vertex-version of the problem optimally in time $\mathcal{O}(L^{3L}n)$. That is, the problem is fixed parameter tractable (FPT) with respect to $L$ on planar graphs. Furthermore, we show that the problem remains FPT even for bounded genus graphs, a super class of planar graphs. Our second contribution deals with approximations of the vertex version of the problem. We describe an algorithm that, for a given graph $G$, its tree decomposition of treewidth $\tau$ and vertices $s$ and $t$, computes a $\tau$-approximation of the minimum $L$-bounded $s$-$t$ vertex cut; if the decomposition is not given, then the approximation ratio is $\mathcal{O}(\tau \sqrt{\log \tau})$. For graphs with treewidth bounded by $\mathcal{O}(n^{1/2-\epsilon})$ for any $\epsilon>0$, but not by a constant, this is the best approximation in terms of $n$ that we are aware of.
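For concreteness, a brute-force reference implementation of the edge version of the $L$-bounded cut problem is sketched below (exponential time, tiny instances only; all names are illustrative): it enumerates edge subsets in order of size and returns the first one whose removal makes every $s$-$t$ path longer than $L$.

```python
from collections import deque
from itertools import combinations

def st_distance(n, edges, s, t):
    """Unweighted shortest-path length from s to t via BFS (inf if unreachable)."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist.get(t, float("inf"))

def min_L_bounded_edge_cut(n, edges, s, t, L):
    """Smallest edge set F such that every s-t path in G - F is longer than L.
    Brute force over edge subsets in order of size; tiny instances only."""
    for size in range(len(edges) + 1):
        for F in combinations(edges, size):
            remaining = [e for e in edges if e not in set(F)]
            if st_distance(n, remaining, s, t) > L:
                return set(F)

# Two s-t paths of length 2 and one of length 4; with L = 3 both short paths
# must be destroyed, so two edges are necessary and sufficient.
edges = [(0, 1), (1, 5), (0, 2), (2, 5), (0, 3), (3, 4), (4, 6), (6, 5)]
print(min_L_bounded_edge_cut(7, edges, 0, 5, L=3))   # e.g. {(0, 1), (0, 2)}
```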
Consider the following 2-respecting min-cut problem. Given a weighted graph $G$ and its spanning tree $T$, find the minimum cut among the cuts that contain at most two edges in $T$. This problem is an important subroutine in Karger's celebrated randomized near-linear-time min-cut algorithm [STOC'96]. We present a new approach for this problem which can be easily implemented in many settings, leading to the following randomized min-cut algorithms for weighted graphs.
* An $O(m\frac{\log^2 n}{\log\log n} + n\log^6 n)$-time sequential algorithm: this improves Karger's $O(m \log^3 n)$ and $O(m\frac{(\log^2 n)\log(n^2/m)}{\log\log n} + n\log^6 n)$ bounds when the input graph is not extremely sparse or dense. Improvements over Karger's bounds were previously known only under a rather strong assumption that the input graph is simple [Henzinger et al. SODA'17; Ghaffari et al. SODA'20]. For unweighted graphs with parallel edges, our bound can be improved to $O(m\frac{\log^{1.5} n}{\log\log n} + n\log^6 n)$.
* An algorithm requiring $\tilde O(n)$ cut queries to compute the min-cut of a weighted graph: this answers an open problem by Rubinstein et al. [ITCS'18], who obtained a similar bound for simple graphs.
* A streaming algorithm that requires $\tilde O(n)$ space and $O(\log n)$ passes to compute the min-cut: the only previous non-trivial exact min-cut algorithm in this setting is the 2-pass $\tilde O(n)$-space algorithm on simple graphs [Rubinstein et al., ITCS'18] (observed by Assadi et al. [STOC'19]).
In contrast to Karger's 2-respecting min-cut algorithm, which deploys sophisticated dynamic programming techniques, our approach exploits some cute structural properties so that it only needs to compute the values of $\tilde O(n)$ cuts corresponding to removing $\tilde O(n)$ pairs of tree edges, an operation that can be done quickly in many settings.
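To make the definition concrete, here is a brute-force checker for the 2-respecting min-cut problem (not the algorithm of the paper, and exponential in the number of vertices): it enumerates all vertex subsets $S$, keeps only cuts crossed by at most two tree edges, and returns the cheapest one. All names and the weighted-edge-list representation are illustrative.

```python
from itertools import combinations

def crossing(edge_list, S):
    """Weighted edges of edge_list with exactly one endpoint in S."""
    return [(u, v, w) for (u, v, w) in edge_list if (u in S) != (v in S)]

def min_2_respecting_cut(vertices, weighted_edges, tree_edges):
    """Brute force: cheapest cut (S, complement of S) crossed by at most two
    edges of the spanning tree T.  Both edge lists hold (u, v, weight) triples,
    with tree_edges a subset of weighted_edges."""
    best_value, best_S = float("inf"), None
    verts = list(vertices)
    for r in range(1, len(verts)):
        for S in map(set, combinations(verts, r)):
            if len(crossing(tree_edges, S)) > 2:
                continue                         # cut does not 2-respect T
            value = sum(w for _, _, w in crossing(weighted_edges, S))
            if value < best_value:
                best_value, best_S = value, S
    return best_value, best_S

# 4-cycle 0-1-2-3-0 with one heavy edge; T is the path 0-1-2-3.
graph = [(0, 1, 1.0), (1, 2, 5.0), (2, 3, 1.0), (3, 0, 1.0)]
tree = [(0, 1, 1.0), (1, 2, 5.0), (2, 3, 1.0)]
print(min_2_respecting_cut(range(4), graph, tree))  # (2.0, {0}): edges (0,1), (3,0)
```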