Maintaining Perfect Matchings at Low Cost


Abstract

The min-cost matching problem is highly sensitive to small changes of the input. Even in a simple setting, e.g., when the costs come from the metric on the line, adding two nodes to the input may change the optimal solution completely. On the other hand, one expects that small changes in the input should incur only small changes to the constructed solution, measured as the number of modified edges. We introduce a two-stage model in which we study the trade-off between the quality and the robustness of solutions. In the first stage we are given a set of nodes in a metric space and we must compute a perfect matching. In the second stage $2k$ new nodes appear and we must adapt the solution to a perfect matching for the new instance. We say that an algorithm is $(\alpha,\beta)$-robust if the solutions constructed in both stages are $\alpha$-approximate with respect to min-cost perfect matchings, and if the number of edges deleted from the first-stage matching is at most $\beta k$. Hence, $\alpha$ measures the quality of the algorithm and $\beta$ its robustness. In this setting we aim to balance both measures by deriving algorithms with constant $\alpha$ and $\beta$. We show that there exists an algorithm that is $(3,1)$-robust for any metric if the number $2k$ of arriving nodes is known in advance. For the case where $k$ is unknown, the situation is significantly more involved. We study this setting under the metric on the line and devise a $(10,2)$-robust algorithm that constructs a solution with a recursive structure that carefully balances cost and redundancy.
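To illustrate the sensitivity claim in the line-metric setting, the following is a minimal sketch (not the paper's algorithm; the function name is illustrative). It uses the standard fact that on the line a min-cost perfect matching pairs consecutive points in sorted order, and shows how the arrival of just two nodes ($k=1$) can replace every interior edge of the optimal matching.

```python
def min_cost_matching_on_line(points):
    """Min-cost perfect matching of an even-sized point set on the line:
    sort the points and pair consecutive ones."""
    pts = sorted(points)
    matching = [(pts[i], pts[i + 1]) for i in range(0, len(pts), 2)]
    cost = sum(b - a for a, b in matching)
    return matching, cost

# First stage: points 1..8; the optimum pairs (1,2), (3,4), (5,6), (7,8).
stage1, cost1 = min_cost_matching_on_line(range(1, 9))

# Second stage: two new nodes arrive (k = 1). The new optimum keeps none of
# the old interior edges, so the optimal solution changes almost completely.
stage2, cost2 = min_cost_matching_on_line(list(range(1, 9)) + [0.5, 8.5])

print(stage1, cost1)  # [(1, 2), (3, 4), (5, 6), (7, 8)]  cost 4
print(stage2, cost2)  # [(0.5, 1), (2, 3), (4, 5), (6, 7), (8, 8.5)]  cost 4.0
```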
