
On the hardness of computing an average curve

Added by Martijn Struijs
Publication date: 2019
Language: English





We study the complexity of clustering curves under $k$-median and $k$-center objectives in the metric space of the Fréchet distance and related distance measures. Building upon recent hardness results for the minimum-enclosing-ball problem under the Fréchet distance, we show that the $1$-median problem is also NP-hard. Furthermore, we show that the $1$-median problem is W[1]-hard with the number of curves as parameter. We show this under the discrete and continuous Fréchet and Dynamic Time Warping (DTW) distance. This yields an independent proof of an earlier result by Bulteau et al. from 2018 for a variant of DTW that uses squared distances, where the new proof is both simpler and more general. On the positive side, we give approximation algorithms for problem variants where the center curve may have complexity at most $\ell$ under the discrete Fréchet distance. In particular, for fixed $k$, $\ell$ and $\varepsilon$, we give $(1+\varepsilon)$-approximation algorithms for the $(k,\ell)$-median and $(k,\ell)$-center objectives and a polynomial-time exact algorithm for the $(k,\ell)$-center objective.
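As background for the distance measures above, here is a minimal, illustrative sketch of the standard dynamic program for the discrete Fréchet distance between two polygonal curves (Eiter and Mannila, 1994). It is not the paper's algorithm, and the curves `P`, `Q` are made-up examples; DTW is obtained by replacing the outer max with a sum over the matched pairs.

```python
import math
from itertools import product

def discrete_frechet(P, Q):
    """Discrete Frechet distance between polygonal curves P and Q, given as
    lists of (x, y) points, via the standard O(|P||Q|) dynamic program."""
    n, m = len(P), len(Q)
    d = lambda p, q: math.dist(p, q)
    # ca[i][j] = discrete Frechet distance of the prefixes P[:i+1], Q[:j+1]
    ca = [[0.0] * m for _ in range(n)]
    for i, j in product(range(n), range(m)):
        cost = d(P[i], Q[j])
        if i == 0 and j == 0:
            ca[i][j] = cost
        elif i == 0:
            ca[i][j] = max(ca[0][j - 1], cost)
        elif j == 0:
            ca[i][j] = max(ca[i - 1][0], cost)
        else:
            ca[i][j] = max(min(ca[i - 1][j], ca[i - 1][j - 1], ca[i][j - 1]), cost)
    return ca[n - 1][m - 1]

# Example: two similar zig-zag curves (illustrative data only)
P = [(0, 0), (1, 1), (2, 0), (3, 1)]
Q = [(0, 0.2), (1, 0.9), (2, 0.1), (3, 1.2)]
print(discrete_frechet(P, Q))
```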



Related research


In this work, we show the first worst-case to average-case reduction for the classical $k$-SUM problem. A $k$-SUM instance is a collection of $m$ integers, and the goal of the $k$-SUM problem is to find a subset of $k$ elements that sums to $0$. In the average-case version, the $m$ elements are chosen uniformly at random from some interval $[-u,u]$. We consider the total setting where $m$ is sufficiently large (with respect to $u$ and $k$), so that we are guaranteed (with high probability) that solutions must exist. Much of the appeal of $k$-SUM, in particular connections to problems in computational geometry, extends to the total setting. The best known algorithm in the average-case total setting is due to Wagner (following the approach of Blum-Kalai-Wasserman), and achieves a run-time of $u^{O(1/\log k)}$. This beats the known (conditional) lower bounds for worst-case $k$-SUM, raising the natural question of whether it can be improved even further. However, in this work, we show a matching average-case lower bound, by showing a reduction from worst-case lattice problems, thus introducing a new family of techniques into the field of fine-grained complexity. In particular, we show that any algorithm solving average-case $k$-SUM on $m$ elements in time $u^{o(1/\log k)}$ will give a super-polynomial improvement in the complexity of algorithms for lattice problems.
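To make the average-case total setting above concrete, the following hypothetical sketch samples $m$ integers uniformly from $[-u,u]$ and searches for a $k$-subset summing to $0$ by brute force. The function names and parameters are illustrative, and this is emphatically not the $u^{O(1/\log k)}$ Wagner/BKW-style algorithm.

```python
import random
from itertools import combinations

def random_ksum_instance(m, u, seed=0):
    """Average-case k-SUM instance: m integers drawn uniformly from [-u, u]."""
    rng = random.Random(seed)
    return [rng.randint(-u, u) for _ in range(m)]

def ksum_bruteforce(xs, k):
    """Return a k-subset of xs summing to 0, or None.
    O(m^k) time; only meant to illustrate the problem definition."""
    for combo in combinations(range(len(xs)), k):
        if sum(xs[i] for i in combo) == 0:
            return [xs[i] for i in combo]
    return None

# In the "total" regime m is large relative to u, so a solution exists w.h.p.
xs = random_ksum_instance(m=200, u=1000)
print(ksum_bruteforce(xs, k=3))
```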
Let $\mathcal{L}$ be an arrangement of $n$ lines in the Euclidean plane. The $k$-level of $\mathcal{L}$ consists of all vertices $v$ of the arrangement which have exactly $k$ lines of $\mathcal{L}$ passing below $v$. The complexity (the maximum size) of the $k$-level in a line arrangement has been widely studied. In 1998 Dey proved an upper bound of $O(n \cdot (k+1)^{1/3})$. Due to the correspondence between lines in the plane and great circles on the sphere, the asymptotic bounds carry over to arrangements of great circles on the sphere, where the $k$-level denotes the vertices at distance at most $k$ to a marked cell, the south pole. We prove an upper bound of $O((k+1)^2)$ on the expected complexity of the $k$-level in great-circle arrangements if the south pole is chosen uniformly at random among all cells. We also consider arrangements of great $(d-1)$-spheres on the sphere $\mathbb{S}^d$ which are orthogonal to a set of random points on $\mathbb{S}^d$. In this model, we prove that the expected complexity of the $k$-level is of order $\Theta((k+1)^{d-1})$.
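For intuition about the object being counted, here is a small brute-force sketch (not from the paper) that computes the size of the $k$-level directly from the definition: an arrangement vertex belongs to the $k$-level if exactly $k$ lines pass strictly below it. Lines are given as (slope, intercept) pairs and assumed to be in general position.

```python
import random
from itertools import combinations

def k_level_size(lines, k, eps=1e-9):
    """Count vertices of the k-level of a line arrangement: intersection
    points with exactly k lines strictly below them. Brute force, O(n^3);
    assumes general position (no parallel or concurrent triples)."""
    count = 0
    for (a1, b1), (a2, b2) in combinations(lines, 2):
        if abs(a1 - a2) < eps:
            continue  # parallel lines create no vertex
        x = (b2 - b1) / (a1 - a2)
        y = a1 * x + b1
        below = sum(1 for (a, b) in lines if a * x + b < y - eps)
        if below == k:
            count += 1
    return count

rng = random.Random(1)
lines = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(20)]
print([k_level_size(lines, k) for k in range(5)])
```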
Let $\mathcal{P}$ be an $\mathcal{H}$-polytope in $\mathbb{R}^d$ with vertex set $V$. The vertex centroid is defined as the average of the vertices in $V$. We prove that computing the vertex centroid of an $\mathcal{H}$-polytope is #P-hard. Moreover, we show that even just checking whether the vertex centroid lies in a given halfspace is already #P-hard for $\mathcal{H}$-polytopes. We also consider the problem of approximating the vertex centroid by finding a point within an $\epsilon$ distance from it and prove this problem to be #P-easy by showing that, given an oracle for counting the number of vertices of an $\mathcal{H}$-polytope, one can approximate the vertex centroid in polynomial time. We also show that any algorithm approximating the vertex centroid to any "sufficiently non-trivial" (for example, constant) distance can be used to construct a fully polynomial approximation scheme for approximating the centroid and also an output-sensitive polynomial algorithm for the Vertex Enumeration problem. Finally, we show that for unbounded polyhedra the vertex centroid cannot be approximated to a distance of $d^{1/2-\delta}$ for any fixed constant $\delta>0$.
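To illustrate the quantity in question, here is an assumption-laden sketch that computes the vertex centroid of a tiny $\mathcal{H}$-polytope $\{x : Ax \le b\}$ by brute-force vertex enumeration over $d$-subsets of facets. The exponential blow-up is consistent with the #P-hardness stated above; the helper name is made up and the code is only suitable for very small examples.

```python
import numpy as np
from itertools import combinations

def vertex_centroid(A, b, eps=1e-9):
    """Vertex centroid of the H-polytope {x : Ax <= b}: enumerate every
    d-subset of constraints, solve it as an equality system, and keep the
    feasible, non-duplicate solutions as vertices. Exponential in general."""
    A, b = np.asarray(A, float), np.asarray(b, float)
    m, d = A.shape
    vertices = []
    for idx in combinations(range(m), d):
        Ai, bi = A[list(idx)], b[list(idx)]
        if abs(np.linalg.det(Ai)) < eps:
            continue  # degenerate subset, no unique intersection point
        x = np.linalg.solve(Ai, bi)
        if np.all(A @ x <= b + eps) and not any(np.allclose(x, v) for v in vertices):
            vertices.append(x)
    return np.mean(vertices, axis=0)

# Unit square {0 <= x <= 1, 0 <= y <= 1}: the centroid of its 4 vertices is (0.5, 0.5)
A = [[1, 0], [-1, 0], [0, 1], [0, -1]]
b = [1, 0, 1, 0]
print(vertex_centroid(A, b))
```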
We define and study a discrete process that generalizes the convex-layer decomposition of a planar point set. Our process, which we call homotopic curve shortening (HCS), starts with a closed curve (which might self-intersect) in the presence of a set $P \subset \mathbb{R}^2$ of point obstacles, and evolves in discrete steps, where each step consists of (1) taking shortcuts around the obstacles, and (2) reducing the curve to its shortest homotopic equivalent. We find experimentally that, if the initial curve is held fixed and $P$ is chosen to be either a very fine regular grid or a uniformly random point set, then HCS behaves at the limit like the affine curve-shortening flow (ACSF). This connection between ACSF and HCS generalizes the link between ACSF and convex-layer decomposition (Eppstein et al., 2017; Calder and Smart, 2020), which is restricted to convex curves. We prove that HCS satisfies some properties analogous to those of ACSF: HCS is invariant under affine transformations, preserves convexity, and does not increase the total absolute curvature. Furthermore, the number of self-intersections of a curve, or intersections between two curves (appropriately defined), does not increase. Finally, if the initial curve is simple, then the number of inflection points (appropriately defined) does not increase.
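HCS generalizes the convex-layer decomposition mentioned above; the sketch below shows only that special case, onion peeling of a planar point set using scipy's convex hull, and is not an implementation of HCS itself.

```python
import numpy as np
from scipy.spatial import ConvexHull

def convex_layers(points):
    """Convex-layer ("onion peeling") decomposition of a planar point set:
    repeatedly take the convex hull and remove its vertices. Assumes the
    points are in general position."""
    pts = np.asarray(points, float)
    layers = []
    while len(pts) >= 3:
        hull = ConvexHull(pts)
        layers.append(pts[hull.vertices])
        pts = np.delete(pts, hull.vertices, axis=0)
    if len(pts):
        layers.append(pts)  # leftover 1-2 points form the innermost layer
    return layers

rng = np.random.default_rng(0)
P = rng.uniform(0, 1, size=(50, 2))
for i, layer in enumerate(convex_layers(P)):
    print(f"layer {i}: {len(layer)} points")
```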
Throughout this paper, a persistence diagram $\mathcal{P}$ is composed of a set $P$ of planar points (each corresponding to a topological feature) above the line $Y=X$, as well as the line $Y=X$ itself, i.e., $\mathcal{P} = P \cup \{(x,y) \mid y=x\}$. Given a set of persistence diagrams $\mathcal{P}_1, \ldots, \mathcal{P}_m$, for data reduction purposes, one way to summarize their topological features is to first compute their center $\mathcal{C}$ under the bottleneck distance. We consider two discre…
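As context for the setup above, here is a brute-force sketch of the bottleneck distance between two tiny persistence diagrams: each diagram is augmented with the diagonal projections of the other's points, and the largest $\ell_\infty$ matching cost is minimized over all perfect matchings. The diagrams used are illustrative, and the exhaustive search only scales to a handful of points.

```python
import math
from itertools import permutations

def bottleneck_distance(P, Q):
    """Bottleneck distance between two small persistence diagrams P and Q
    (lists of (birth, death) points above the diagonal). Each diagram is
    augmented with diagonal projections of the other's points; two diagonal
    copies can always be matched at cost 0. Brute force over matchings."""
    proj = lambda p: ((p[0] + p[1]) / 2,) * 2  # closest point on the diagonal
    A = [(p, False) for p in P] + [(proj(q), True) for q in Q]
    B = [(q, False) for q in Q] + [(proj(p), True) for p in P]

    def cost(a, b):
        (pa, a_diag), (pb, b_diag) = a, b
        if a_diag and b_diag:
            return 0.0  # diagonal-to-diagonal matches are free
        return max(abs(pa[0] - pb[0]), abs(pa[1] - pb[1]))  # l_inf distance

    best = math.inf
    for perm in permutations(range(len(B))):
        best = min(best, max(cost(A[i], B[perm[i]]) for i in range(len(A))))
    return best

P = [(0.0, 1.0), (0.2, 0.5)]
Q = [(0.1, 1.1)]
print(bottleneck_distance(P, Q))  # 0.15: the short bar is matched to the diagonal
```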
