It has been observed independently by many researchers that the isolating cut lemma of Li and Panigrahi [FOCS 2020] can be easily extended to obtain new algorithms for finding the non-trivial minimizer of a symmetric submodular function and solving the hypergraph minimum cut problem. This note contains these observations.
Submodular function minimization (SFM) is a fundamental discrete optimization problem which generalizes many well-known problems, has applications in various fields, and can be solved in polynomial time. Owing to applications in computer vision and machine learning, fast SFM algorithms are highly desirable. The current fastest algorithms [Lee, Sidford, Wong, FOCS 2015] run in $O(n^{2}\log nM\cdot\textrm{EO} + n^{3}\log^{O(1)}nM)$ time and $O(n^{3}\log^{2}n\cdot \textrm{EO} + n^{4}\log^{O(1)}n)$ time respectively, where $M$ is the largest absolute value of the function (assuming the range is integers) and $\textrm{EO}$ is the time taken to evaluate the function on any set. Although the best known lower bound on the query complexity is only $\Omega(n)$, the current shortest non-deterministic proof certifying the optimum value of a function requires $\Theta(n^{2})$ function evaluations. The main contributions of this paper are subquadratic SFM algorithms. For integer-valued submodular functions, we give an SFM algorithm which runs in $O(nM^{3}\log n\cdot\textrm{EO})$ time, giving the first nearly linear time algorithm in any known regime. For real-valued submodular functions with range in $[-1,1]$, we give an algorithm which in $\tilde{O}(n^{5/3}\cdot\textrm{EO}/\varepsilon^{2})$ time returns an $\varepsilon$-additive approximate solution. At their heart, our algorithms are projected stochastic subgradient descent methods on the Lovász extension of submodular functions, where we crucially exploit submodularity and data structures to obtain fast, i.e. sublinear time, subgradient updates. The latter is crucial for beating the $n^{2}$ bound, as we show that algorithms which access only subgradients of the Lovász extension, and these include the theoretically best algorithms mentioned above, must make $\Omega(n)$ subgradient calls (even for functions whose range is $\{-1,0,1\}$).
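To make the generic technique behind this abstract concrete, the following is a minimal sketch of projected subgradient descent on the Lovász extension, using the classical greedy (Edmonds) subgradient and thresholding of the final iterate to recover a set. It is an illustration only: the helper names, step size, and iteration count are assumptions, and each plain subgradient below costs about $n$ evaluations of $f$, which is exactly the cost the paper's data structures are designed to beat; the sublinear-time updates are not reproduced here.

    import numpy as np

    def lovasz_subgradient(f, x):
        # Greedy subgradient of the Lovasz extension at x: sort coordinates in
        # decreasing order and take marginal gains along the induced chain of sets.
        # Costs n+1 evaluations of f (NOT the sublinear updates from the abstract).
        n = len(x)
        g = np.zeros(n)
        prev_val, S = f(frozenset()), set()
        for i in np.argsort(-x):
            S.add(int(i))
            cur_val = f(frozenset(S))
            g[i] = cur_val - prev_val
            prev_val = cur_val
        return g

    def subgradient_sfm(f, n, steps=500, eta=0.05):
        # Projected subgradient descent on [0,1]^n, then rounding: every level
        # set {i : x_i >= theta} of the final iterate is checked against f.
        x = np.full(n, 0.5)
        for _ in range(steps):
            x = np.clip(x - eta * lovasz_subgradient(f, x), 0.0, 1.0)
        best_set, best_val = frozenset(), f(frozenset())
        for theta in sorted(set(x.tolist())):
            S = frozenset(int(i) for i in np.flatnonzero(x >= theta))
            if f(S) < best_val:
                best_set, best_val = S, f(S)
        return best_set, best_val

For instance, calling subgradient_sfm with f set to the cut function of a small weighted graph (f(S) = total weight of edges leaving S) returns a near-minimum cut; the step size and iteration count above are arbitrary illustrative choices rather than tuned parameters.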
We consider submodular function minimization in the oracle model: given black-box access to a submodular set function $f:2^{[n]}\rightarrow \mathbb{R}$, find an element of $\arg\min_S \{f(S)\}$ using as few queries to $f(\cdot)$ as possible. State-of-the-art algorithms succeed with $\tilde{O}(n^2)$ queries [LeeSW15], yet the best-known lower bound has never been improved beyond $n$ [Harvey08]. We provide a query lower bound of $2n$ for submodular function minimization, a $3n/2-2$ query lower bound for the non-trivial minimizer of a symmetric submodular function, and a $\binom{n}{2}$ query lower bound for the non-trivial minimizer of an asymmetric submodular function. Our $3n/2-2$ lower bound results from a connection between SFM lower bounds and a novel concept we term the cut dimension of a graph. Interestingly, this yields a $3n/2-2$ cut-query lower bound for finding the global mincut in an undirected, weighted graph, but we also prove it cannot yield a lower bound better than $n+1$ for $s$-$t$ mincut, even in a directed, weighted graph.
This paper bridges discrete and continuous optimization approaches for decomposable submodular function minimization, in both the standard and parametric settings. We provide improved running times for this problem by reducing it to a number of calls to a maximum flow oracle. When each function in the decomposition acts on $O(1)$ elements of the ground set $V$ and is polynomially bounded, our running time is, up to polylogarithmic factors, equal to that of solving maximum flow in a sparse graph with $O(\vert V \vert)$ vertices and polynomial integral capacities. We achieve this by providing a simple iterative method which can optimize to high precision any convex function defined on the submodular base polytope, provided we can efficiently minimize it on the base polytope corresponding to the cut function of a certain graph that we construct. We solve this minimization problem by lifting the solutions of a parametric cut problem, which we obtain via a new efficient combinatorial reduction to maximum flow. This reduction is of independent interest and implies some previously unknown bounds for the parametric minimum $s,t$-cut problem in multiple settings.
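The reduction above bottoms out in calls to a maximum flow oracle. As a point of reference only, the following hedged sketch shows the standard way a single minimum $s$-$t$ cut is extracted from a max-flow computation, here via networkx; the graph, vertex names, and capacities are illustrative assumptions, and this does not reproduce the paper's parametric reduction.

    import networkx as nx

    # Illustrative directed graph with small integral capacities (assumed data).
    G = nx.DiGraph()
    G.add_edge("s", "a", capacity=3)
    G.add_edge("s", "b", capacity=2)
    G.add_edge("a", "t", capacity=2)
    G.add_edge("b", "t", capacity=3)
    G.add_edge("a", "b", capacity=1)

    # A max-flow computation yields a minimum s-t cut: the partition consists of
    # the vertices reachable from s in the residual graph and the rest.
    cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
    print(cut_value, sorted(source_side), sorted(sink_side))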
In this paper we provide improved running times and oracle complexities for approximately minimizing a submodular function. Our main result is a randomized algorithm which, given any submodular function defined on $n$ elements with range $[-1, 1]$, computes an $\epsilon$-additive approximate minimizer in $\tilde{O}(n/\epsilon^2)$ oracle evaluations with high probability. This improves over the $\tilde{O}(n^{5/3}/\epsilon^2)$ oracle evaluation algorithm of Chakrabarty et al. (STOC 2017) and the $\tilde{O}(n^{3/2}/\epsilon^2)$ oracle evaluation algorithm of Hamoudi et al. Further, we leverage a generalization of this result to obtain efficient algorithms for minimizing a broad class of nonconvex functions. For any function $f$ with domain $[0, 1]^n$ that satisfies $\frac{\partial^2 f}{\partial x_i \partial x_j} \le 0$ for all $i \neq j$ and is $L$-Lipschitz with respect to the $L^\infty$-norm, we give an algorithm that computes an $\epsilon$-additive approximate minimizer with $\tilde{O}(n \cdot \mathrm{poly}(L/\epsilon))$ function evaluations with high probability.
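A standard example of a function in this nonconvex class (a well-known fact, not a claim taken from the abstract) is the multilinear extension of a submodular set function $g$:
$$ G(x) \;=\; \sum_{S\subseteq[n]} g(S)\,\prod_{i\in S}x_i\prod_{i\notin S}(1-x_i), \qquad x\in[0,1]^n, $$
for which
$$ \frac{\partial^2 G}{\partial x_i\,\partial x_j} \;=\; \mathbb{E}\bigl[g(R\cup\{i,j\}) - g(R\cup\{i\}) - g(R\cup\{j\}) + g(R)\bigr] \;\le\; 0 \qquad (i\neq j), $$
where $R$ contains each element $k\neq i,j$ independently with probability $x_k$; the inequality is exactly the submodularity of $g$.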
We study the problem of maximizing a monotone $k$-submodular function $f$ under a knapsack constraint, where a $k$-submodular function is a natural generalization of a submodular function to $k$ dimensions. We present a deterministic $(\frac{1}{2}-\frac{1}{2e})$-approximation algorithm that evaluates $f$ $O(n^5k^4)$ times.