Geometric Rescaling Algorithms for Submodular Function Minimization

Abstract

We present a new class of polynomial-time algorithms for submodular function minimization (SFM), as well as a unified framework to obtain strongly polynomial SFM algorithms. Our new algorithms are based on simple iterative methods for the minimum-norm problem, such as the conditional gradient and the Fujishige-Wolfe algorithms. We exhibit two techniques to turn simple iterative methods into polynomial-time algorithms. Firstly, we use the geometric rescaling technique, which has recently gained attention in linear programming. We adapt this technique to SFM and obtain a weakly polynomial bound O((n^4 · EO + n^5) log(nL)). Secondly, we exhibit a general combinatorial black-box approach to turn any strongly polynomial εL-approximate SFM oracle into a strongly polynomial exact SFM algorithm. This framework can be applied to a wide range of combinatorial and continuous algorithms, including pseudo-polynomial ones. In particular, we can obtain strongly polynomial algorithms by a repeated application of the conditional gradient or of the Fujishige-Wolfe algorithm. Combined with the geometric rescaling technique, the black-box approach provides an O((n^5 · EO + n^6) log^2 n) algorithm. Finally, we show that one of the techniques we develop in the paper, "sliding", can also be combined with the cutting-plane method of Lee, Sidford, and Wong [27], yielding a simplified variant of their O(n^3 log^2 n · EO + n^4 log^{O(1)} n) algorithm.

Centrum Wiskunde & Informatica. Supported by NWO Veni grant 639.071.510. London School of Economics. Supported by EPSRC First Grant EP/M02797X/1.
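To make the minimum-norm connection concrete, the following sketch (ours, not code from the paper) runs plain conditional gradient (Frank-Wolfe) iterations over the base polytope B(f), using Edmonds' greedy algorithm as the linear optimization oracle and Fujishige's threshold rule to read off a set. The function names, the toy cut function, and the iteration and tolerance parameters are illustrative assumptions; without the paper's rescaling or black-box framework, this routine is only an approximate SFM oracle.

```python
# Illustrative sketch (not the paper's rescaled algorithm): conditional
# gradient (Frank-Wolfe) toward the minimum-norm point of the base polytope
# B(f), with Edmonds' greedy algorithm as linear oracle and Fujishige's
# threshold rule to extract a set. Assumes f is normalized (f(empty) = 0)
# and given as a Python callable on frozensets over the ground set {0,...,n-1}.

import numpy as np


def greedy_vertex(f, n, weights):
    """Edmonds' greedy algorithm: a vertex of B(f) minimizing <weights, x>."""
    order = np.argsort(weights)        # process elements by increasing weight
    x = np.zeros(n)
    prefix = set()
    prev = 0.0
    for i in order:
        prefix.add(int(i))
        val = f(frozenset(prefix))
        x[i] = val - prev              # marginal value f(S + i) - f(S)
        prev = val
    return x


def min_norm_point(f, n, iterations=1000, tol=1e-12):
    """Conditional gradient iterations toward argmin { ||x||^2 : x in B(f) }."""
    x = greedy_vertex(f, n, np.zeros(n))     # start from an arbitrary vertex
    for _ in range(iterations):
        s = greedy_vertex(f, n, x)           # LMO: gradient of ||x||^2 / 2 is x
        d = s - x
        gap = -x.dot(d)                      # Frank-Wolfe duality gap
        if gap <= tol:
            break
        step = min(1.0, gap / d.dot(d))      # exact line search on [0, 1]
        x = x + step * d
    return x


def approximate_sfm(f, n, iterations=1000):
    """Threshold the (approximate) min-norm point: return {i : x_i < 0}.

    With the exact min-norm point this is the unique minimal minimizer of f
    (Fujishige's theorem); after finitely many iterations it is only an
    approximate minimizer, i.e. the kind of approximate oracle the paper's
    black-box framework starts from.
    """
    x = min_norm_point(f, n, iterations)
    return frozenset(i for i in range(n) if x[i] < 0)


if __name__ == "__main__":
    # Toy example (hypothetical test data): cut function of the path 0-1-2.
    edges = [(0, 1), (1, 2)]
    def cut(S):
        return sum(1 for u, v in edges if (u in S) != (v in S))
    print(approximate_sfm(cut, 3))     # frozenset(): the empty set, a minimizer of value 0
```

With exact arithmetic and enough iterations the threshold {i : x_i < 0} recovers the minimal minimizer of f; the abstract's point is precisely that such slowly converging iterates can be turned into exact, (strongly) polynomial-time SFM via geometric rescaling or the black-box framework.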

Cite this paper

@article{Dadush2017GeometricRA,
  title   = {Geometric Rescaling Algorithms for Submodular Function Minimization},
  author  = {Daniel Dadush and L{\'a}szl{\'o} A. V{\'e}gh and Giacomo Zambelli},
  journal = {CoRR},
  volume  = {abs/1707.05065},
  year    = {2017}
}