
On the computational tractability of a geographic clustering problem arising in redistricting

Published by: Philip Klein
Publication date: 2020
Research field: Informatics engineering
Paper language: English





Redistricting is the problem of dividing a state into a number $k$ of regions, called districts. Voters in each district elect a representative. The primary criteria are: each district is connected, district populations are equal (or nearly equal), and districts are compact. There are multiple competing definitions of compactness, usually minimizing some quantity. One measure that has recently been promoted by Duchin and others is the number of cut edges. In redistricting, one is given atomic regions out of which each district must be built. The populations of the atomic regions are given. Consider the graph with one vertex per atomic region (with weight equal to the region's population) and an edge between atomic regions that share a boundary. A districting plan is a partition of vertices into $k$ parts, each connected, of nearly equal weight. The districts are considered compact to the extent that the plan minimizes the number of edges crossing between different parts. Consider two problems: find the most compact districting plan, and sample districting plans under a compactness constraint uniformly at random. Both problems are NP-hard, so we restrict the input graph to have branchwidth at most $w$. (A planar graph's branchwidth is bounded by its diameter.) If both $k$ and $w$ are bounded by constants, the problems are solvable in polynomial time. Assume vertices have weight 1. One would like algorithms whose running times are of the form $O(f(k,w)\, n^c)$ for some constant $c$ independent of $k$ and $w$, in which case the problems are said to be fixed-parameter tractable with respect to $k$ and $w$. We show that, under a complexity-theoretic assumption, no such algorithms exist. However, we do give algorithms with running time $O(c^w n^{k+1})$. Thus if the diameter of the graph is moderately small and the number of districts is very small, our algorithm is usable.
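To make the graph model concrete, here is a minimal Python sketch (names such as `cut_edges` and `is_valid_plan` are illustrative, not from the paper) that evaluates a districting plan on the adjacency graph described above: it checks that every district is connected and nearly population-balanced, and counts the cut edges that the compactness measure minimizes.

```python
from collections import deque

def cut_edges(edges, plan):
    """Count edges whose endpoints lie in different districts."""
    return sum(1 for u, v in edges if plan[u] != plan[v])

def is_connected_district(adj, plan, district):
    """BFS check that the vertices assigned to `district` induce a connected subgraph."""
    part = [v for v in adj if plan[v] == district]
    if not part:
        return False
    seen, queue = {part[0]}, deque([part[0]])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if plan[w] == district and w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == len(part)

def is_valid_plan(adj, weights, plan, k, tol=1):
    """All k districts connected, with district weights within `tol` of one another."""
    totals = [0] * k
    for v, d in plan.items():
        totals[d] += weights[v]
    balanced = max(totals) - min(totals) <= tol
    return balanced and all(is_connected_district(adj, plan, d) for d in range(k))

# Toy instance: a 2x3 grid of unit-weight atomic regions split into k=2 districts.
adj = {0: [1, 3], 1: [0, 2, 4], 2: [1, 5], 3: [0, 4], 4: [1, 3, 5], 5: [2, 4]}
edges = [(u, v) for u in adj for v in adj[u] if u < v]
weights = {v: 1 for v in adj}
plan = {0: 0, 1: 0, 3: 0, 2: 1, 4: 1, 5: 1}
print(is_valid_plan(adj, weights, plan, k=2), cut_edges(edges, plan))  # True 3
```

Checking a single plan is easy; the paper's hardness results concern optimizing and sampling over the exponentially many such plans.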




Read also

In this paper we investigate the parameterized complexity of the Maximum-Duo Preservation String Mapping Problem, the complement of the Minimum Common String Partition Problem. We show that this problem is fixed-parameter tractable when parameterized by the number $k$ of conserved duos, by first giving a parameterized algorithm based on the color-coding technique and then presenting a reduction to a kernel of size $O(k^6)$.
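
As a concrete illustration of the objective (a sketch with hypothetical names, not code from that paper): a mapping of string A onto string B can be viewed as a permutation pi of positions with B[pi[i]] == A[i], and the duo at position i is conserved when the two adjacent characters stay adjacent, i.e. pi[i+1] == pi[i] + 1.

```python
def conserved_duos(a, b, pi):
    """Count duos of `a` kept adjacent by the position mapping `pi` into `b`."""
    assert sorted(a) == sorted(b) and all(b[pi[i]] == a[i] for i in range(len(a)))
    return sum(1 for i in range(len(a) - 1) if pi[i + 1] == pi[i] + 1)

a, b = "abcab", "ababc"
pi = [2, 3, 4, 0, 1]  # a[0..2]="abc" -> b[2..4], a[3..4]="ab" -> b[0..1]
print(conserved_duos(a, b, pi))  # 3: the duos "ab", "bc", "ab" stay adjacent
```
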
We consider the problem of estimating a vector of discrete variables $(\theta_1,\dots,\theta_n)$, based on noisy observations $Y_{uv}$ of the pairs $(\theta_u,\theta_v)$ on the edges of a graph $G=([n],E)$. This setting comprises a broad family of statistical estimation problems, including group synchronization on graphs, community detection, and low-rank matrix estimation. A large body of theoretical work has established sharp thresholds for weak and exact recovery, and sharp characterizations of the optimal reconstruction accuracy in such models, focusing however on the special case of Erdős--Rényi-type random graphs. The single most important finding of this line of work is the ubiquity of an information-computation gap. Namely, for many models of interest, a large gap is found between the optimal accuracy achievable by any statistical method and the optimal accuracy achieved by known polynomial-time algorithms. Moreover, this gap is generally believed to be robust to small amounts of additional side information revealed about the $\theta_i$'s. How does the structure of the graph $G$ affect this picture? Is the information-computation gap a general phenomenon, or does it only apply to specific families of graphs? We prove that the picture is dramatically different for graph sequences converging to amenable graphs (including, for instance, $d$-dimensional grids). We consider a model in which an arbitrarily small fraction of the vertex labels is revealed, and show that a linear-time local algorithm can achieve reconstruction accuracy that is arbitrarily close to the information-theoretic optimum. We contrast this with the case of random graphs. Indeed, focusing on group synchronization on random regular graphs, we prove that the information-computation gap persists even when a small amount of side information is revealed.
A graph is said to be a König graph if the size of its maximum matching is equal to the size of its minimum vertex cover. The König Edge Deletion problem asks if in a given graph there exists a set of at most $k$ edges whose deletion results in a König graph. While the vertex version of the problem (König Vertex Deletion) was shown to be fixed-parameter tractable more than a decade ago, the fixed-parameter tractability of the König Edge Deletion problem has been open since then, and it has been conjectured to be W[1]-hard in several papers. In this paper, we settle the conjecture by proving it W[1]-hard. We prove that a variant of this problem, where we are given a graph $G$ and a maximum matching $M$ and we want a $k$-sized König edge deletion set that is disjoint from $M$, is fixed-parameter tractable.
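
To make the König property concrete, here is a brute-force sketch (exponential-time, suitable for small graphs only; names are illustrative, not from that paper) that compares the maximum matching size against the minimum vertex cover size:

```python
from itertools import combinations

def max_matching_size(edges):
    """Largest set of pairwise vertex-disjoint edges (exhaustive search)."""
    for r in range(len(edges), 0, -1):
        for subset in combinations(edges, r):
            used = [v for e in subset for v in e]
            if len(used) == len(set(used)):  # edges share no endpoint
                return r
    return 0

def min_vertex_cover_size(vertices, edges):
    """Smallest vertex set touching every edge (exhaustive search)."""
    for r in range(len(vertices) + 1):
        for cover in combinations(vertices, r):
            s = set(cover)
            if all(u in s or v in s for u, v in edges):
                return r

def is_konig(vertices, edges):
    return max_matching_size(edges) == min_vertex_cover_size(vertices, edges)

# A 4-cycle is bipartite, hence König (matching 2 = cover 2);
# a triangle is not (matching 1 < cover 2).
print(is_konig(range(4), [(0, 1), (1, 2), (2, 3), (3, 0)]))  # True
print(is_konig(range(3), [(0, 1), (1, 2), (2, 0)]))          # False
```
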
The Hospitals/Residents problem (HR) is a many-to-one matching problem whose solution concept is stability. It is widely used in assignment systems such as assigning medical students (residents) to hospitals. To resolve imbalance in the number of residents assigned to hospitals, an extension called HR with regional caps (HRRC) was introduced. In this problem, a positive integer (called a regional cap) is associated with a subset of hospitals (called a region), and the total number of residents assigned to hospitals in a region must be at most its regional cap. Kamada and Kojima defined strong stability for HRRC and demonstrated that a strongly stable matching does not necessarily exist. Recently, Aziz et al. proved that the problem of determining whether a strongly stable matching exists is NP-complete in general. In this paper, we refine Aziz et al.'s result by investigating the computational complexity of the problem in terms of the length of preference lists, the size of regions, and whether or not regions can overlap, and we completely classify the tractable and intractable cases.
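
As a small illustration of the regional-cap constraint (a hedged sketch with hypothetical names, not from that paper), the following checks whether an assignment of residents to hospitals respects a set of regional caps:

```python
def respects_regional_caps(assignment, regions):
    """assignment: resident -> hospital; regions: list of (set_of_hospitals, cap)."""
    return all(
        sum(1 for h in assignment.values() if h in hospitals) <= cap
        for hospitals, cap in regions
    )

# Three residents, two hospitals forming one region with cap 2.
assignment = {"r1": "h1", "r2": "h1", "r3": "h2"}
regions = [({"h1", "h2"}, 2)]
print(respects_regional_caps(assignment, regions))  # False: 3 residents in the region
```
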
The problem of finding the maximum number of vertex-disjoint uni-color paths in an edge-colored graph (called MaxCDP) has recently been introduced in the literature, motivated by applications in social network analysis. In this paper we investigate how the complexity of the problem depends on graph parameters (namely, the number of vertices to remove to make the graph a collection of disjoint paths, and the size of the vertex cover of the graph), which makes sense since graphs in social networks are not random and have structure. The problem was known to be hard to approximate in polynomial time and not fixed-parameter tractable (FPT) for the natural parameter. Here, we show that it is still hard to approximate, even in FPT time. Finally, we introduce a new variant of the problem, called MaxCDDP, whose goal is to find the maximum number of vertex-disjoint and color-disjoint uni-color paths. We extend some of the results of MaxCDP to this new variant, and we prove that, unlike MaxCDP, MaxCDDP is already hard on graphs at distance two from disjoint paths.