
Multi-way sparsest cut problem on trees with a control on the number of parts and outliers

Posted by Ramin Javadi
Publication date: 2017
Research field: Informatics engineering
Paper language: English





Given a graph, the sparsest cut problem asks for a subset of vertices whose edge expansion (the normalized cut given by the subset) is minimized. In this paper, we study a generalization of this problem which seeks $k$ disjoint subsets of vertices (clusters), all of whose edge expansions are small and, furthermore, the number of vertices left outside the subsets (outliers) is also small. We prove that although this problem is NP-hard for trees, it can be solved in polynomial time for all weighted trees, provided that we restrict the search space to subsets which induce connected subgraphs. The proposed algorithm is based on dynamic programming and runs in $O(k^2 n^3)$ time in the worst case, where $n$ is the number of vertices and $k$ is the number of clusters. It also runs in linear time when the number of clusters and the number of outliers are bounded by a constant.
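For concreteness, the following short Python sketch computes the edge expansion of a vertex subset in an edge-weighted tree (here taken as the total weight of edges leaving the subset divided by the number of its vertices). This is only an illustration of the objective under an assumed data structure, not the paper's dynamic-programming algorithm; the representation and function name are made up for the example.

def edge_expansion(adj, S):
    # adj: dict mapping each vertex to a list of (neighbor, edge_weight) pairs.
    # S: nonempty set of vertices.
    if not S:
        raise ValueError("S must be nonempty")
    cut_weight = 0.0
    for u in S:
        for v, w in adj[u]:
            if v not in S:
                cut_weight += w  # edge (u, v) crosses the cut; counted once since u is in S and v is not
    return cut_weight / len(S)

# Tiny example: the path 0-1-2-3 with unit edge weights.
adj = {0: [(1, 1.0)], 1: [(0, 1.0), (2, 1.0)], 2: [(1, 1.0), (3, 1.0)], 3: [(2, 1.0)]}
print(edge_expansion(adj, {0, 1}))  # one crossing edge over two vertices -> 0.5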




Read also

Given a weighted graph $G=(V,E)$ with weight functions $c:E\to \mathbb{R}_+$ and $\pi:V\to \mathbb{R}_+$, and a subset $U\subseteq V$, the normalized cut value for $U$ is defined as the sum of the weights of edges exiting $U$ divided by the weight of vertices in $U$. The {\it mean isoperimetry problem}, $\mathsf{ISO}^1(G,k)$, for a weighted graph $G$ is a generalization of the classical uniform sparsest cut problem in which, given a parameter $k$, the objective is to find $k$ disjoint nonempty subsets of $V$ minimizing the average normalized cut value of the parts. The robust version of the problem seeks an optimizer where the number of vertices that fall out of the subpartition is bounded by some given integer $0 \leq \rho \leq |V|$. Our main result states that $\mathsf{ISO}^1(G,k)$, as well as its robust version, $\mathsf{CRISO}^1(G,k,\rho)$, subject to the condition that each part of the subpartition induces a connected subgraph, are solvable in time $O(k^2 \rho^2 \pi(V(T))^3)$ on any weighted tree $T$, in which $\pi(V(T))$ is the sum of the vertex-weights. This result implies that $\mathsf{ISO}^1(G,k)$ is strongly polynomial-time solvable on weighted trees when the vertex-weights are polynomially bounded, and may be compared to the fact that the problem is NP-hard for weighted trees in general. Also, using this, we show that both mentioned problems, $\mathsf{ISO}^1(G,k)$ and $\mathsf{CRISO}^1(G,k,\rho)$, as well as the ordinary robust mean isoperimetry problem $\mathsf{RISO}^1(G,k,\rho)$, admit polynomial-time $O(\log^{1.5}|V| \log\log |V|)$-approximation algorithms for weighted graphs with polynomially bounded weights, using the Räcke-Shah tree cut sparsifier.
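For concreteness, the objective described above can be written as follows; this is a reconstruction from the definitions in the abstract, so the exact normalization is an assumption:

$$ \mathsf{ISO}^1(G,k) \;=\; \min \; \frac{1}{k} \sum_{i=1}^{k} \frac{c(\partial U_i)}{\pi(U_i)}, $$

where the minimum ranges over $k$ disjoint nonempty subsets $U_1,\dots,U_k \subseteq V$, $c(\partial U_i)$ is the total weight of edges exiting $U_i$, and $\pi(U_i) = \sum_{v \in U_i} \pi(v)$.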
The (non-uniform) sparsest cut problem is the following graph-partitioning problem: given a supply graph, and demands on pairs of vertices, delete some subset of supply edges to minimize the ratio of the supply edges cut to the total demand of the pairs separated by this deletion. Despite much effort, there are only a handful of nontrivial classes of supply graphs for which constant-factor approximations are known. We consider the problem for planar graphs, and give a $(2+\varepsilon)$-approximation algorithm that runs in quasipolynomial time. Our approach defines a new structural decomposition of an optimal solution using a patching primitive. We combine this decomposition with a Sherali-Adams-style linear programming relaxation of the problem, which we then round. This should be compared with the polynomial-time approximation algorithm of Rao (1999), which uses the metric linear programming relaxation and $\ell_1$-embeddings, and achieves an $O(\sqrt{\log n})$-approximation in polynomial time.
Petr Kolman (2017)
Given a graph $G=(V,E)$ with two distinguished vertices $s,t\in V$ and an integer parameter $L>0$, an {\em $L$-bounded cut} is a subset $F$ of edges (vertices) such that every path between $s$ and $t$ in $G\setminus F$ has length more than $L$. The task is to find an $L$-bounded cut of minimum cardinality. Though the problem is very simple to state and has been studied since the beginning of the 70s, it is not much understood yet. The problem is known to be $\mathcal{NP}$-hard to approximate within a small constant factor even for $L\geq 4$ (for $L\geq 5$ for the vertex cuts). On the other hand, the best known approximation algorithm for general graphs has approximation ratio only $\mathcal{O}(n^{2/3})$ in the edge case, and $\mathcal{O}(\sqrt{n})$ in the vertex case, where $n$ denotes the number of vertices. We show that for planar graphs, it is possible to solve both the edge- and the vertex-version of the problem optimally in time $\mathcal{O}(L^{3L}n)$. That is, the problem is fixed-parameter tractable (FPT) with respect to $L$ on planar graphs. Furthermore, we show that the problem remains FPT even for bounded-genus graphs, a superclass of planar graphs. Our second contribution deals with approximations of the vertex version of the problem. We describe an algorithm that, given a graph $G$, its tree decomposition of treewidth $\tau$, and vertices $s$ and $t$, computes a $\tau$-approximation of the minimum $L$-bounded $s$-$t$ vertex cut; if the decomposition is not given, then the approximation ratio is $\mathcal{O}(\tau \sqrt{\log \tau})$. For graphs with treewidth bounded by $\mathcal{O}(n^{1/2-\epsilon})$ for any $\epsilon>0$, but not by a constant, this is the best approximation in terms of $n$ that we are aware of.
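To make the definition of an $L$-bounded cut concrete, the following Python sketch (an illustration under an assumed graph representation, not part of the paper) checks whether a set of deleted edges is an $L$-bounded edge cut, i.e. whether every remaining $s$-$t$ path has more than $L$ edges:

from collections import deque

def is_L_bounded_edge_cut(adj, F, s, t, L):
    # adj: dict mapping each vertex to its neighbors (unweighted graph).
    # F: set of frozenset({u, v}) edges deleted from the graph.
    # Every s-t path in G \ F is longer than L iff the BFS distance from s to t exceeds L.
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if frozenset((u, v)) in F or v in dist:
                continue
            dist[v] = dist[u] + 1
            queue.append(v)
    return dist.get(t, float("inf")) > L

# Example: the path 0-1-2-3.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(is_L_bounded_edge_cut(adj, set(), 0, 3, 3))                # False: a path with exactly 3 edges remains
print(is_L_bounded_edge_cut(adj, {frozenset((1, 2))}, 0, 3, 3))  # True: s and t are disconnected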
Search trees are commonly used to implement access operations to a set of stored keys. If this set is static and the probabilities of membership queries are known in advance, then one can precompute an optimal search tree, namely one that minimizes the expected access cost. For a non-key query, a search tree can determine its approximate location by returning the inter-key interval containing the query. This is in contrast to other dictionary data structures, like hash tables, that only report a failed search. We address the question: what is the additional cost of determining approximate locations for non-key queries? We prove that for two-way comparison trees this additional cost is at most 1. Our proof is based on a novel probabilistic argument that involves converting a search tree that does not identify non-key queries into a random tree that does.
We study the NP-hard \textsc{$k$-Sparsest Cut} problem ($k$SC) in which, given an undirected graph $G = (V, E)$ and a parameter $k$, the objective is to partition the vertex set into $k$ subsets whose maximum edge expansion is minimized. Herein, the edge expansion of a subset $S \subseteq V$ is defined as the sum of the weights of edges exiting $S$ divided by the number of vertices in $S$. Another problem that has been investigated is the \textsc{$k$-Small-Set Expansion} problem ($k$SSE), which aims to find a subset with minimum edge expansion with a restriction on the size of the subset. We extend previous studies on $k$SC and $k$SSE by inspecting their parameterized complexity. On the positive side, we present two FPT algorithms for both the $k$SSE and 2SC problems: the first is parameterized by the treewidth of the input graph and uses exponential space, and the second is parameterized by the vertex cover number of the input graph and uses polynomial space. Moreover, we consider the unweighted version of the $k$SC problem where $k \geq 2$ is fixed and propose two FPT algorithms parameterized by the treewidth and the vertex cover number of the input graph. We also propose a randomized FPT algorithm for $k$SSE when parameterized by $k$ and the maximum degree of the input graph combined. Its derandomization is done efficiently. On the negative side, we first prove that for all fixed integers $k, \tau \geq 3$, the problem $k$SC is NP-hard for graphs with vertex cover number at most $\tau$. We also show that $k$SC is W[1]-hard when parameterized by the treewidth of the input graph and the number $k$ of components combined, using a reduction from \textsc{Unary Bin Packing}. Furthermore, we prove that $k$SC remains NP-hard for graphs with maximum degree three and also for graphs with degeneracy two. Finally, we prove that the unweighted $k$SSE is W[1]-hard for the parameter $k$.