
More Dynamic Data Structures for Geometric Set Cover with Sublinear Update Time

Added by Qizheng He
Publication date: 2021
Language: English





We study geometric set cover problems in dynamic settings, allowing insertions and deletions of points and objects. We present the first dynamic data structure that can maintain an $O(1)$-approximation in sublinear update time for set cover for axis-aligned squares in 2D. More precisely, we obtain randomized update time $O(n^{2/3+\delta})$ for an arbitrarily small constant $\delta>0$. Previously, a dynamic geometric set cover data structure with sublinear update time was known only for unit squares by Agarwal, Chang, Suri, Xiao, and Xue [SoCG 2020]. If only an approximate size of the solution is needed, then we can also obtain sublinear amortized update time for disks in 2D and halfspaces in 3D. As a byproduct, our techniques for dynamic set cover also yield an optimal randomized $O(n\log n)$-time algorithm for static set cover for 2D disks and 3D halfspaces, improving our earlier $O(n\log n(\log\log n)^{O(1)})$ result [SoCG 2020].
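To make the underlying static problem concrete, here is a minimal baseline sketch in Python. The point/square representation and the names `covers` and `greedy_square_cover` are illustrative assumptions, not from the paper: plain greedy only guarantees an $O(\log n)$-approximation and must be rerun from scratch after changes, whereas the paper's data structure maintains an $O(1)$-approximation under insertions and deletions of points and squares in $O(n^{2/3+\delta})$ time per update.

```python
# Hypothetical baseline for static set cover with axis-aligned squares.
# A square is (x, y, side) with lower-left corner (x, y); a point is (px, py).

def covers(square, point):
    """Return True if the axis-aligned square contains the point."""
    x, y, s = square
    px, py = point
    return x <= px <= x + s and y <= py <= y + s

def greedy_square_cover(points, squares):
    """Greedy O(log n)-approximation: repeatedly pick the square that
    covers the most still-uncovered points."""
    uncovered = set(points)
    chosen = []
    while uncovered and squares:
        best = max(squares, key=lambda sq: sum(covers(sq, p) for p in uncovered))
        hit = {p for p in uncovered if covers(best, p)}
        if not hit:          # remaining points cannot be covered by any square
            break
        chosen.append(best)
        uncovered -= hit
    return chosen
```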


Related research

We improve the running times of $O(1)$-approximation algorithms for the set cover problem in geometric settings, specifically, covering points by disks in the plane, or covering points by halfspaces in three dimensions. In the unweighted case, Agarwal and Pan [SoCG 2014] gave a randomized $O(n\log^4 n)$-time, $O(1)$-approximation algorithm, by using variants of the multiplicative weight update (MWU) method combined with geometric data structures. We simplify the data structure requirement in one of their methods and obtain a deterministic $O(n\log^3 n\log\log n)$-time algorithm. With further new ideas, we obtain a still faster randomized $O(n\log n(\log\log n)^{O(1)})$-time algorithm. For the weighted problem, we also give a randomized $O(n\log^4 n\log\log n)$-time, $O(1)$-approximation algorithm, by simple modifications to the MWU method and the quasi-uniform sampling technique.
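The MWU idea referenced above can be illustrated with a toy reweighting loop in the style of Brönnimann–Goodrich. The sketch below is an assumption-laden illustration (the function names, the sample size $8k$, and the candidate-size cap $4k$ are not from the paper): points missed by a candidate cover built from a weighted sample have their weights doubled. The actual algorithms replace the brute-force steps here with geometric data structures to reach near-linear running time.

```python
import random

def mwu_cover(points, objects, covers, k, max_rounds=200):
    """Toy MWU/reweighting loop for geometric set cover (illustrative only).
    covers(obj, p) -> bool tests containment; k is a guess of the optimum size."""
    pts = list(points)
    weight = {p: 1.0 for p in pts}
    for _ in range(max_rounds):
        # Draw a weighted sample of points and cover it greedily with O(k) objects.
        sample = random.choices(pts, weights=[weight[p] for p in pts], k=8 * k)
        candidate, left = [], set(sample)
        while left and len(candidate) < 4 * k:
            best = max(objects, key=lambda o: sum(covers(o, p) for p in left))
            candidate.append(best)
            left -= {p for p in left if covers(best, p)}
        missed = [p for p in pts if not any(covers(o, p) for o in candidate)]
        if not missed:
            return candidate          # candidate covers every point
        for p in missed:              # double the weight of uncovered points
            weight[p] *= 2.0
    return None                       # the guess k was likely too small
```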
Minati De, Abhiruk Lahiri (2016)
In this paper, we study two classic optimization problems: minimum geometric dominating set and set cover. Both problems have been studied for different types of objects for a long time. These problems become APX-hard when the objects are axis-parallel rectangles, ellipses, $\alpha$-fat objects of constant description complexity, and convex polygons. On the other hand, a PTAS (polynomial-time approximation scheme) is known for them when the objects are disks or unit squares. Surprisingly, a PTAS was unknown even for arbitrary squares. For homothetic sets of convex objects, an $O(k^4)$-approximation algorithm is known for the dominating set problem, where $k$ is the number of corners in a convex object. On the other hand, a QPTAS (quasi-polynomial-time approximation scheme) was obtained only very recently for the covering problem when the objects are pseudodisks. For both problems, obtaining a PTAS remains open for a large class of objects. For the dominating set problem, we prove that the popular local search algorithm leads to a $(1+\varepsilon)$-approximation when the objects are a homothetic set of convex objects (which includes arbitrary squares, $k$-regular polygons, translated and scaled copies of a convex set, etc.) in $n^{O(1/\varepsilon^2)}$ time. The same technique leads to a PTAS for the geometric covering problem when the objects are convex pseudodisks (which includes disks, unit-height rectangles, homothetic convex objects, etc.). As a consequence, we obtain an easy-to-implement approximation algorithm for both problems for a large class of objects, significantly improving the best known approximation guarantees.
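The local search template behind such results is simple to state: start from any feasible solution and repeatedly try to swap out $b$ objects for at most $b-1$ others, with $b$ chosen as a function of $\varepsilon$ (roughly $b = O(1/\varepsilon^2)$). The Python sketch below is a naive illustration under these assumptions, using a generic `covers` predicate and brute-force swap enumeration; it is not an efficient or paper-specific implementation.

```python
from itertools import combinations

def local_search_cover(points, objects, covers, b=2):
    """Toy local search for geometric covering: try to replace b chosen
    objects by at most b-1 others while keeping all points covered."""
    def feasible(sol):
        return all(any(covers(o, p) for o in sol) for p in points)

    solution = list(objects)          # trivially feasible if the points are coverable
    improved = True
    while improved:
        improved = False
        for out in combinations(solution, b):
            rest = [o for o in solution if o not in out]
            for size in range(b):     # try adding at most b-1 replacement objects
                for add in combinations(objects, size):
                    cand = rest + list(add)
                    if feasible(cand):
                        solution, improved = cand, True
                        break
                if improved:
                    break
            if improved:
                break
    return solution
```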
In the dynamic minimum set cover problem, a challenge is to minimize the update time while guaranteeing close to the optimal $\min(O(\log n), f)$ approximation factor. (Throughout, $m$, $n$, $f$, and $C$ are parameters denoting the maximum number of sets, number of elements, frequency, and the cost range.) In the high-frequency range, when $f=\Omega(\log n)$, this was achieved by a deterministic $O(\log n)$-approximation algorithm with $O(f \log n)$ amortized update time [Gupta et al. STOC17]. In the low-frequency range, the line of work by Gupta et al. [STOC17], Abboud et al. [STOC19], and Bhattacharya et al. [ICALP15, IPCO17, FOCS19] led to a deterministic $(1+\epsilon)f$-approximation algorithm with $O(f \log (Cn)/\epsilon^2)$ amortized update time. In this paper we improve the latter update time and provide the first bounds that subsume (and sometimes improve) the state-of-the-art dynamic vertex cover algorithms. We obtain: 1. $(1+\epsilon)f$-approximation ratio in $O(f\log^2 (Cn)/\epsilon^3)$ worst-case update time: no non-trivial worst-case update time was previously known for dynamic set cover. Our bound subsumes and improves by a logarithmic factor the $O(\log^3 n/\text{poly}(\epsilon))$ worst-case update time for unweighted dynamic vertex cover (i.e., when $f=2$ and $C=1$) by Bhattacharya et al. [SODA17]. 2. $(1+\epsilon)f$-approximation ratio in $O\left((f^2/\epsilon^3)+(f/\epsilon^2) \log C\right)$ amortized update time: this result improves the previous $O(f \log (Cn)/\epsilon^2)$ update time bound for most values of $f$ in the low-frequency range, i.e., whenever $f=o(\log n)$. It is the first that is independent of $m$ and $n$. It subsumes the constant amortized update time of Bhattacharya and Kulkarni [SODA19] for unweighted dynamic vertex cover (i.e., when $f = 2$ and $C = 1$).
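For intuition, the classic static primal-dual argument behind an $f$-approximation fits in a few lines: raise the dual of each uncovered element until some containing set becomes tight, and take a tight set. The dynamic algorithms above essentially maintain a (near-)tight dual of this kind under insertions and deletions. The sketch below is the standard static version with illustrative names, not the papers' update procedures.

```python
def primal_dual_f_approx(elements, sets, cost):
    """Static primal-dual f-approximation for weighted set cover.
    'sets' maps a set id to its element set; 'cost' maps a set id to its cost;
    f is the maximum number of sets any element belongs to."""
    residual = dict(cost)              # remaining slack of each set
    cover, covered = [], set()
    for e in elements:
        if e in covered:
            continue
        containing = [s for s in sets if e in sets[s]]
        if not containing:             # element cannot be covered at all
            continue
        # Raise the dual of e until the cheapest containing set becomes tight.
        s_star = min(containing, key=lambda s: residual[s])
        raise_by = residual[s_star]
        for s in containing:
            residual[s] -= raise_by
        cover.append(s_star)           # s_star is now tight; it pays for itself
        covered |= sets[s_star]
    return cover
```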
We study metric data structures for curves in doubling spaces, such as trajectories of moving objects in Euclidean $\mathbb{R}^d$, where the distance between two curves is measured using the discrete Fréchet distance. We design data structures in an asymmetric setting where the input is a curve (or a set of $n$ curves), each of complexity $m$, and the queries are with curves of complexity $k \ll m$. We show that there exist approximate data structures that are independent of the input size $N = d \cdot n \cdot m$, and we study how to maintain them dynamically if the input is given in a stream. Concretely, we study two types of data structures: (i) distance oracles, where the task is to store a compressed version of the input curve, which can be used to answer queries for the distance of a query curve to the input curve, and (ii) nearest-neighbor data structures, where the task is to preprocess a set of input curves to answer queries for the input curve closest to the query curve. In both cases we are interested in approximation. For curves embedded in Euclidean $\mathbb{R}^d$ with constant $d$, our distance oracle uses space in $\mathcal{O}((k \log(\epsilon^{-1}) \epsilon^{-d})^k)$ ($\epsilon$ is the precision parameter). The oracle performs $(1+\epsilon)$-approximate queries in time in $\mathcal{O}(k^2)$ and is deterministic. We show how to maintain this distance oracle in the stream using polylogarithmic additional memory. In the stream, we can dynamically answer distance queries to the portion of the stream seen so far in $\mathcal{O}(k^4 \log^2 m)$ time. We apply our techniques to the second problem, approximate near-neighbor (ANN) data structures, and achieve an exponential improvement in the dependency on the complexity of the input curves compared to the state of the art.
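The exact quantity these oracles approximate is the discrete Fréchet distance, computable by the standard quadratic dynamic program shown below (using Python's `math.dist` for Euclidean distances). This is only the exhaustive baseline; the data structures above answer $(1+\epsilon)$-approximate queries much faster by storing a compressed version of the input curve.

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def discrete_frechet(P, Q):
    """Standard O(|P|*|Q|) dynamic program for the discrete Frechet distance
    between two polygonal curves given as lists of points."""
    n, m = len(P), len(Q)
    INF = float("inf")
    D = [[INF] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            d = dist(P[i], Q[j])
            if i == 0 and j == 0:
                best_prev = 0.0
            else:
                best_prev = min(D[i - 1][j] if i > 0 else INF,
                                D[i][j - 1] if j > 0 else INF,
                                D[i - 1][j - 1] if i > 0 and j > 0 else INF)
            D[i][j] = max(d, best_prev)   # keep the larger of the current leash
    return D[n - 1][m - 1]                # and the best predecessor state
```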
We consider the problem of maintaining an approximate maximum independent set of geometric objects under insertions and deletions. We present data structures that maintain a constant-factor approximate maximum independent set for broad classes of fat objects in $d$ dimensions, where $d$ is assumed to be a constant, in sublinear worst-case update time. This gives the first results for dynamic independent set in a wide variety of geometric settings, such as disks, fat polygons, and their high-dimensional equivalents. For axis-aligned squares and hypercubes, our result improves upon all (recently announced) previous works. We obtain, in particular, a dynamic $(4+\epsilon)$-approximation for squares, with $O(\log^4 n)$ worst-case update time. Our result is obtained via a two-level approach. First, we develop a dynamic data structure which stores all objects and provides an approximate independent set when queried, with output-sensitive running time. We show that via standard methods such a structure can be used to obtain a dynamic algorithm with amortized update time bounds. Then, to obtain worst-case update time algorithms, we develop a generic deamortization scheme that with each insertion/deletion keeps (i) the update time bounded and (ii) the number of changes in the independent set constant. We show that such a scheme is applicable to fat objects by showing an appropriate generalization of a separator theorem. Interestingly, we show that our deamortization scheme is also necessary in order to obtain worst-case update bounds: if for a class of objects our scheme is not applicable, then no constant-factor approximation with sublinear worst-case update time is possible. We show that such a lower bound applies even for seemingly simple classes of geometric objects, including axis-aligned rectangles in the plane.
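As a static point of reference for these dynamic guarantees, a constant-factor approximation for fat objects such as axis-aligned squares can be obtained by a smallest-first greedy. The sketch below is that folklore baseline with illustrative names and a (x, y, side) square representation; the contribution of the paper is maintaining a comparable guarantee under insertions and deletions in sublinear worst-case time.

```python
def greedy_independent_squares(squares):
    """Static greedy baseline: keep the smallest remaining square and discard
    everything it intersects. For axis-aligned squares this smallest-first rule
    gives a constant-factor approximation of the maximum independent set."""
    def intersects(a, b):
        ax, ay, asz = a
        bx, by, bsz = b
        return not (ax + asz < bx or bx + bsz < ax or
                    ay + asz < by or by + bsz < ay)

    independent = []
    for sq in sorted(squares, key=lambda s: s[2]):    # smallest side length first
        if all(not intersects(sq, chosen) for chosen in independent):
            independent.append(sq)
    return independent
```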
