In this paper, we consider the Minimum-Load $k$-Clustering/Facility Location (MLkC) problem where we are given a set $P$ of $n$ points in a metric space that we have to cluster and an integer $k$ that denotes the number of clusters. Additionally, we are given a set $F$ of cluster centers in the same metric space. The goal is to select a set $C \subseteq F$ of $k$ centers and assign each point in $P$ to a center in $C$, such that the maximum load over all centers is minimized. Here the load of a center is the sum of the distances between it and the points assigned to it. Although clustering/facility location problems have a rich literature, the minimum-load objective has not been studied substantially, and hence MLkC has remained a poorly understood problem. More interestingly, the problem is notoriously hard even in some special cases, including the one in line metrics, as shown by Ahmadian et al. [ACM Trans. Algo. 2018]. They also show APX-hardness of the problem in the plane. On the other hand, the best-known approximation factor for MLkC is $O(k)$, even in the plane. In this work, we study a fair version of MLkC inspired by the work of Chierichetti et al. [NeurIPS, 2017], which generalizes MLkC. Here the input points are colored by one of $\ell$ colors denoting the group they belong to. MLkC is the special case with $\ell=1$. For this problem, we obtain a $3$-approximation in $f(k,\ell)\cdot n^{O(1)}$ time. Moreover, our scheme leads to an improved $(1 + \epsilon)$-approximation in the case of the Euclidean norm, and in this case, the running time depends only polynomially on the dimension $d$. Our results imply the same approximations for MLkC with running time $f(k)\cdot n^{O(1)}$, achieving the first constant-factor approximations for this problem in general and Euclidean metric spaces.
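To make the objective concrete, here is a minimal Python sketch (not the paper's algorithm) that evaluates the MLkC objective for a given choice of centers and assignment: each center's load is the sum of distances of the points assigned to it, and the objective is the maximum load. The Euclidean metric and all names are illustrative assumptions.

```python
import math

def dist(p, q):
    """Euclidean distance; any metric could be plugged in here."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def max_load(points, centers, assignment):
    """assignment[i] is the index (into centers) of the center serving points[i]."""
    loads = [0.0] * len(centers)
    for i, p in enumerate(points):
        loads[assignment[i]] += dist(p, centers[assignment[i]])
    return max(loads)

# Toy example: two centers, four points on a line.
points = [(0.0,), (1.0,), (4.0,), (6.0,)]
centers = [(0.0,), (5.0,)]
assignment = [0, 0, 1, 1]  # first two points -> center 0, last two -> center 1
print(max_load(points, centers, assignment))  # loads are 1.0 and 2.0 -> prints 2.0
```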
In clustering problems, a central decision-maker is given a complete metric graph over vertices and must provide a clustering of vertices that minimizes some objective function. In fair clustering problems, vertices are endowed with a color (e.g., membership in a group), and the features of a valid clustering might also include the representation of colors in that clustering. Prior work in fair clustering assumes complete knowledge of group membership. In this paper, we generalize prior work by assuming imperfect knowledge of group membership through probabilistic assignments. We present clustering algorithms in this more general setting with approximation ratio guarantees. We also address the problem of metric membership, where different groups have a notion of order and distance. Experiments are conducted using our proposed algorithms as well as baselines to validate our approach and also surface nuanced concerns when group membership is not known deterministically.
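As a small illustration of the probabilistic-membership setting (not the paper's algorithm), the sketch below computes the expected color representation of a cluster when each vertex carries a probability distribution over colors; all names and the toy data are illustrative assumptions.

```python
def expected_color_counts(cluster, color_probs):
    """cluster: list of vertex ids; color_probs[v]: dict mapping color -> probability."""
    counts = {}
    for v in cluster:
        for color, prob in color_probs[v].items():
            counts[color] = counts.get(color, 0.0) + prob
    return counts

# Each vertex's group membership is known only probabilistically.
color_probs = {
    0: {"red": 0.9, "blue": 0.1},
    1: {"red": 0.4, "blue": 0.6},
    2: {"red": 0.2, "blue": 0.8},
}
print(expected_color_counts([0, 1, 2], color_probs))  # {'red': 1.5, 'blue': 1.5}
```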
We present an $\left(e^{O(p)} \frac{\log \ell}{\log\log \ell}\right)$-approximation algorithm for socially fair clustering with the $\ell_p$-objective. In this problem, we are given a set of points in a metric space. Each point belongs to one (or several) of $\ell$ groups. The goal is to find a $k$-medians, $k$-means, or, more generally, $\ell_p$-clustering that is simultaneously good for all of the groups. More precisely, we need to find a set of $k$ centers $C$ so as to minimize the maximum over all groups $j$ of $\sum_{u \text{ in group } j} d(u,C)^p$. The socially fair clustering problem was independently proposed by Ghadiri, Samadi, and Vempala [2021] and Abbasi, Bhaskara, and Venkatasubramanian [2021]. Our algorithm improves and generalizes their $O(\ell)$-approximation algorithms for the problem. The natural LP relaxation for the problem has an integrality gap of $\Omega(\ell)$. In order to obtain our result, we introduce a strengthened LP relaxation and show that it has an integrality gap of $\Theta(\frac{\log \ell}{\log\log\ell})$ for a fixed $p$. Additionally, we present a bicriteria approximation algorithm, which generalizes the bicriteria approximation of Abbasi et al. [2021].
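For concreteness, the following sketch (independent of the LP-based algorithm above) simply evaluates the socially fair $\ell_p$ objective for a candidate center set $C$: the maximum over groups $j$ of $\sum_{u \text{ in group } j} d(u,C)^p$. The Euclidean metric and all names are illustrative assumptions.

```python
import math

def d_to_centers(u, C):
    """Distance from point u to its closest center in C."""
    return min(math.dist(u, c) for c in C)

def socially_fair_cost(groups, C, p):
    """groups: list of lists of points; returns max_j sum_{u in group j} d(u,C)^p."""
    return max(sum(d_to_centers(u, C) ** p for u in group) for group in groups)

groups = [[(0.0, 0.0), (1.0, 0.0)], [(5.0, 5.0)]]
C = [(0.0, 0.0), (5.0, 4.0)]
print(socially_fair_cost(groups, C, p=2))  # both group costs are 1.0 -> prints 1.0
```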
Given a set of $n$ terminals, which are points in $d$-dimensional Euclidean space, the minimum Manhattan network problem (MMN) asks for a minimum-length rectilinear network that connects each pair of terminals by a Manhattan path, that is, a path consisting of axis-parallel segments whose total length equals the pair's Manhattan distance. Even for $d=2$, the problem is NP-hard, but constant-factor approximations are known. For $d \ge 3$, the problem is APX-hard; it is known to admit, for any $\epsilon > 0$, an $O(n^\epsilon)$-approximation. In the generalized minimum Manhattan network problem (GMMN), we are given a set $R$ of $n$ terminal pairs, and the goal is to find a minimum-length rectilinear network such that each pair in $R$ is connected by a Manhattan path. GMMN is a generalization of both MMN and the well-known rectilinear Steiner arborescence problem (RSA). So far, only special cases of GMMN have been considered. We present an $O(\log^{d+1} n)$-approximation algorithm for GMMN (and, hence, MMN) in $d \ge 2$ dimensions and an $O(\log n)$-approximation algorithm for 2D. We show that an existing $O(\log n)$-approximation algorithm for RSA in 2D generalizes easily to $d>2$ dimensions.
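As a small illustration of the Manhattan-path condition (not part of the approximation algorithms above), the sketch below checks that a rectilinear path between two terminals uses only axis-parallel segments and has total length equal to the terminals' Manhattan distance; names and the toy paths are illustrative.

```python
def l1(p, q):
    """Manhattan (L1) distance between two points."""
    return sum(abs(a - b) for a, b in zip(p, q))

def is_manhattan_path(path):
    """path: list of points; consecutive points define the segments."""
    axis_parallel = all(
        sum(a != b for a, b in zip(p, q)) == 1 for p, q in zip(path, path[1:])
    )
    total_length = sum(l1(p, q) for p, q in zip(path, path[1:]))
    return axis_parallel and total_length == l1(path[0], path[-1])

# An L-shaped path between (0,0) and (3,2) has length 5 = their Manhattan distance.
print(is_manhattan_path([(0, 0), (3, 0), (3, 2)]))          # True
print(is_manhattan_path([(0, 0), (3, 0), (3, 2), (2, 2)]))  # False (detour)
```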
We consider the $k$-clustering problem with $\ell_p$-norm cost, which includes $k$-median, $k$-means and $k$-center cost functions, under an individual notion of fairness proposed by Jung et al. [2020]: given a set of points $P$ of size $n$, a set of $k$ centers induces a fair clustering if for every point $v \in P$, $v$ can find a center among its $n/k$ closest neighbors. Recently, Mahabadi and Vakilian [2020] showed how to get a $(p^{O(p)},7)$-bicriteria approximation for the problem of fair $k$-clustering with $\ell_p$-norm cost: every point finds a center within distance at most $7$ times its distance to its $(n/k)$-th closest neighbor and the $\ell_p$-norm cost of the solution is at most $p^{O(p)}$ times the cost of an optimal fair solution. In this work, for any $\varepsilon>0$, we present an improved $(16^p + \varepsilon, 3)$-bicriteria approximation for fair $k$-clustering with $\ell_p$-norm cost. To achieve our guarantees, we extend the framework of [Charikar et al., 2002, Swamy, 2016] and devise a $16^p$-approximation algorithm for facility location with $\ell_p$-norm cost under a matroid constraint, which might be of independent interest. In addition, our approach suggests a reduction from our individually fair clustering to a clustering with a group fairness requirement proposed by Kleindessner et al. [2019], which is essentially the median matroid problem [Krishnaswamy et al., 2011].
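To make the bicriteria guarantee concrete, here is a minimal sketch (a checker only, not the approximation algorithm) that tests whether every point has a center within $\beta$ times the distance to its $\lceil n/k \rceil$-th closest neighbor; the Euclidean metric, the exact neighbor convention, and all names are illustrative assumptions.

```python
import math

def fairness_radius(i, points, k):
    """Distance from points[i] to its ceil(n/k)-th closest other point."""
    n = len(points)
    dists = sorted(math.dist(points[i], points[j]) for j in range(n) if j != i)
    return dists[math.ceil(n / k) - 1]

def is_beta_fair(points, centers, k, beta):
    """True if every point has a center within beta times its fairness radius."""
    return all(
        min(math.dist(points[i], c) for c in centers)
        <= beta * fairness_radius(i, points, k)
        for i in range(len(points))
    )

points = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (10.0, 10.0)]
centers = [(0.5, 0.0), (10.0, 10.0)]
print(is_beta_fair(points, centers, k=2, beta=3))  # True for this toy instance
```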
Given $n$ points in $d$-dimensional Euclidean space, the Minimum Enclosing Ball (MEB) problem is to find the ball with the smallest radius which contains all $n$ points. We give an $O(nd\mathcal{Q}/\sqrt{\epsilon})$ approximation algorithm for producing an enclosing ball whose radius is at most $\epsilon$ away from the optimum (where $\mathcal{Q}$ is an upper bound on the norm of the points). This improves existing results using \emph{coresets}, which yield an $O(nd/\epsilon)$ greedy algorithm. Finding the Minimum Enclosing Convex Polytope (MECP) is a related problem wherein a convex polytope of a fixed shape is given and the aim is to find the smallest magnification of the polytope which encloses the given points. For this problem we present an $O(mnd\mathcal{Q}/\epsilon)$ approximation algorithm, where $m$ is the number of faces of the polytope. Our algorithms borrow heavily from convex duality and recently developed techniques in non-smooth optimization, and are in contrast with existing methods which rely on geometric arguments. In particular, we specialize the excessive gap framework of \citet{Nesterov05a} to obtain our results.
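For illustration only, the following sketch implements the classic coreset-style furthest-point greedy for approximate MEB (in the spirit of the greedy baseline the abstract compares against), not the excessive-gap method developed in the paper; names and the toy instance are illustrative assumptions.

```python
import math

def approx_meb_center(points, eps):
    """Approximate MEB center via O(1/eps^2) furthest-point update steps."""
    c = list(points[0])
    for i in range(1, math.ceil(1.0 / eps ** 2) + 1):
        far = max(points, key=lambda p: math.dist(c, p))  # furthest point from c
        c = [ci + (pi - ci) / (i + 1) for ci, pi in zip(c, far)]  # step toward it
    return c

points = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.0)]
c = approx_meb_center(points, eps=0.1)
radius = max(math.dist(c, p) for p in points)
# The reported radius is within (1 + eps) of the optimum, which is 1.0 for this triangle.
print(c, radius)
```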