
Dynamic Set Cover: Improved Amortized and Worst-Case Update Time

 Added by Sayan Bhattacharya
Publication date: 2020
Language: English





In the dynamic minimum set cover problem, the challenge is to minimize the update time while guaranteeing a close to optimal $\min(O(\log n), f)$ approximation factor. (Throughout, $m$, $n$, $f$, and $C$ are parameters denoting the maximum number of sets, the number of elements, the frequency, and the cost range.) In the high-frequency range, when $f = \Omega(\log n)$, this was achieved by a deterministic $O(\log n)$-approximation algorithm with $O(f \log n)$ amortized update time [Gupta et al. STOC17]. In the low-frequency range, the line of work by Gupta et al. [STOC17], Abboud et al. [STOC19], and Bhattacharya et al. [ICALP15, IPCO17, FOCS19] led to a deterministic $(1+\epsilon)f$-approximation algorithm with $O(f \log(Cn)/\epsilon^2)$ amortized update time. In this paper we improve the latter update time and provide the first bounds that subsume (and sometimes improve) the state-of-the-art dynamic vertex cover algorithms. We obtain:

1. A $(1+\epsilon)f$-approximation ratio in $O(f \log^2(Cn)/\epsilon^3)$ worst-case update time: no non-trivial worst-case update time was previously known for dynamic set cover. Our bound subsumes and improves by a logarithmic factor the $O(\log^3 n/\mathrm{poly}(\epsilon))$ worst-case update time for unweighted dynamic vertex cover (i.e., when $f = 2$ and $C = 1$) by Bhattacharya et al. [SODA17].

2. A $(1+\epsilon)f$-approximation ratio in $O\big((f^2/\epsilon^3) + (f/\epsilon^2)\log C\big)$ amortized update time: this result improves the previous $O(f \log(Cn)/\epsilon^2)$ update time bound for most values of $f$ in the low-frequency range, i.e., whenever $f = o(\log n)$. It is the first bound that is independent of $m$ and $n$. It subsumes the constant amortized update time of Bhattacharya and Kulkarni [SODA19] for unweighted dynamic vertex cover (i.e., when $f = 2$ and $C = 1$).
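As a point of reference for the guarantees above, here is a minimal Python sketch of the classical static primal-dual $f$-approximation for unweighted set cover: repeatedly pick an uncovered element and take all (at most $f$) sets containing it. This is the textbook baseline whose approximation factor the dynamic algorithms preserve under updates; it is not the paper's algorithm, and the input names sets_of and elements_of are hypothetical.

# Static primal-dual f-approximation for unweighted set cover (textbook
# baseline, not the paper's dynamic scheme). Assumed (hypothetical) inputs:
#   elements       -- iterable of all elements to cover
#   sets_of[e]     -- the sets containing element e (at most f of them)
#   elements_of[s] -- the elements contained in set s
def f_approx_set_cover(elements, sets_of, elements_of):
    cover, covered = set(), set()
    for e in elements:
        if e in covered:
            continue
        # e is "tight": take every set containing it. No chosen set contains
        # two tight elements, so OPT >= #tight elements, while we pay at most
        # f sets per tight element -- an f-approximation.
        for s in sets_of[e]:
            if s not in cover:
                cover.add(s)
                covered.update(elements_of[s])
    return cover

# Example: elements 1, 2, 3 with sets A = {1, 2} and B = {2, 3}; here f = 2.
sets_of = {1: ["A"], 2: ["A", "B"], 3: ["B"]}
elements_of = {"A": [1, 2], "B": [2, 3]}
print(f_approx_set_cover([1, 2, 3], sets_of, elements_of))  # {'A', 'B'}

The difficulty addressed by the paper is maintaining such a guarantee while elements are inserted and deleted, without recomputing the cover from scratch.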



Related Research

Consider the following distance query for an $n$-node graph $G$ undergoing edge insertions and deletions: given two sets of nodes $I$ and $J$, return the distances between every pair of nodes in $I \times J$. This query is rather general and captures several…
Given a directed weighted graph $G=(V,E)$ undergoing vertex insertions \emph{and} deletions, the All-Pairs Shortest Paths (APSP) problem asks to maintain a data structure that processes updates efficiently and returns after each update the distance matrix of the current version of $G$. In two breakthrough results, Demetrescu and Italiano [STOC 03] presented an algorithm that requires $\tilde{O}(n^2)$ \emph{amortized} update time, and Thorup showed in [STOC 05] that \emph{worst-case} update time $\tilde{O}(n^{2+3/4})$ can be achieved. In this article, we make substantial progress on the problem. We present the following new results: (1) We present the first deterministic data structure that breaks the $\tilde{O}(n^{2+3/4})$ worst-case update time bound by Thorup, which has been standing for almost 15 years. We improve the worst-case update time to $\tilde{O}(n^{2+5/7}) = \tilde{O}(n^{2.71..})$, and to $\tilde{O}(n^{2+3/5}) = \tilde{O}(n^{2.6})$ for unweighted graphs. (2) We present a simple deterministic algorithm with $\tilde{O}(n^{2+3/4})$ worst-case update time ($\tilde{O}(n^{2+2/3})$ for unweighted graphs), and a simple Las Vegas algorithm with worst-case update time $\tilde{O}(n^{2+2/3})$ ($\tilde{O}(n^{2+1/2})$ for unweighted graphs) that works against a non-oblivious adversary. Both data structures require $\tilde{O}(n^2)$ space. These are the first exact dynamic algorithms with truly subcubic update time \emph{and} space usage. This makes significant progress on an open question posed in multiple articles [COCOON 01, STOC 03, ICALP 04, Encyclopedia of Algorithms 08] and is critical to algorithms in practice [TALG 06], where large space usage is prohibitive. Moreover, they match the worst-case update time of the best previous algorithms, and the second algorithm improves upon a Monte Carlo algorithm in a weaker adversary model with the same running time [SODA 17].
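For contrast with the subcubic bounds above, the trivial fully dynamic APSP baseline simply recomputes all distances after every update, e.g. with Floyd-Warshall in $O(n^3)$ worst-case time per update. A minimal sketch, assuming an $n \times n$ weight matrix w with math.inf for absent edges (a hypothetical input format):

import math

# Recompute the full distance matrix from scratch; calling this after every
# update gives O(n^3) worst-case update time -- the naive bound that the
# truly subcubic results above improve on.
def floyd_warshall(w):
    n = len(w)
    d = [row[:] for row in w]       # distances, initialized to edge weights
    for v in range(n):
        d[v][v] = min(d[v][v], 0)   # a node reaches itself at cost 0
    for k in range(n):              # allow k as an intermediate node
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d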
This paper gives a new deterministic algorithm for the dynamic Minimum Spanning Forest (MSF) problem in the EREW PRAM model, where the goal is to maintain an MSF of a weighted graph with $n$ vertices and $m$ edges while supporting edge insertions and deletions. We show that one can solve the dynamic MSF problem using $O(\sqrt{n})$ processors and $O(\log n)$ worst-case update time, for a total of $O(\sqrt{n} \log n)$ work. This improves on the work of Ferragina [IPPS 1995], which costs $O(\log n)$ worst-case update time and $O(n^{2/3} \log\frac{m}{n})$ work.
We present the first algorithm for maintaining a maximal independent set (MIS) of a fully dynamic graph (one undergoing both edge insertions and deletions) in polylogarithmic time. Our algorithm is randomized and, per update, takes $O(\log^2 \Delta \cdot \log^2 n)$ expected time. Furthermore, the algorithm can be adjusted to have $O(\log^2 \Delta \cdot \log^4 n)$ worst-case update time with high probability. Here, $n$ denotes the number of vertices and $\Delta$ is the maximum degree in the graph. The MIS problem in fully dynamic graphs has attracted significant attention after a breakthrough result of Assadi, Onak, Schieber, and Solomon [STOC18], who presented an algorithm with $O(m^{3/4})$ update time (and thus broke the natural $\Omega(m)$ barrier), where $m$ denotes the number of edges in the graph. This result was improved in a series of subsequent papers, though the update time remained polynomial. In particular, the fastest algorithm prior to our work had $\widetilde{O}(\min\{\sqrt{n}, m^{1/3}\})$ update time [Assadi et al. SODA19]. Our algorithm maintains the lexicographically first MIS over a random order of the vertices. As a result, the same algorithm also maintains a 3-approximation of correlation clustering. We also show that a simpler variant of our algorithm can be used to maintain a random-order lexicographically first maximal matching in the same update time.
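The object maintained here, the lexicographically first MIS over a random vertex order, has a simple static counterpart: fix a uniformly random permutation of the vertices and greedily add each vertex none of whose earlier-ranked neighbors was added. A minimal sketch, assuming an adjacency dict adj mapping each vertex to its set of neighbors (a hypothetical input format):

import random

# Greedy MIS over a random vertex order: the static version of the structure
# that the dynamic algorithm above maintains under edge updates.
def random_order_mis(adj):
    order = list(adj)
    random.shuffle(order)   # the random order the dynamic analysis exploits
    mis = set()
    for v in order:
        # Scanning in rank order, any neighbor already in `mis` has smaller
        # rank, so this membership test implements "lexicographically first".
        if all(u not in mis for u in adj[v]):
            mis.add(v)
    return mis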
We consider the problem of maintaining an approximate maximum independent set of geometric objects under insertions and deletions. We present data structures that maintain a constant-factor approximate maximum independent set for broad classes of fat objects in $d$ dimensions, where $d$ is assumed to be a constant, in sublinear \textit{worst-case} update time. This gives the first results for dynamic independent set in a wide variety of geometric settings, such as disks, fat polygons, and their high-dimensional equivalents. For axis-aligned squares and hypercubes, our result improves upon all (recently announced) previous works. We obtain, in particular, a dynamic $(4+\epsilon)$-approximation for squares, with $O(\log^4 n)$ worst-case update time. Our result is obtained via a two-level approach. First, we develop a dynamic data structure which stores all objects and provides an approximate independent set when queried, with output-sensitive running time. We show that via standard methods such a structure can be used to obtain a dynamic algorithm with \textit{amortized} update time bounds. Then, to obtain worst-case update time algorithms, we develop a generic deamortization scheme that with each insertion/deletion keeps (i) the update time bounded and (ii) the number of changes in the independent set constant. We show that such a scheme is applicable to fat objects by showing an appropriate generalization of a separator theorem. Interestingly, we show that our deamortization scheme is also necessary in order to obtain worst-case update bounds: if for a class of objects our scheme is not applicable, then no constant-factor approximation with sublinear worst-case update time is possible. We show that such a lower bound applies even for seemingly simple classes of geometric objects, including axis-aligned rectangles in the plane.
