@inproceedings{Chakrabarty2017SubquadraticSF,
  title={Subquadratic submodular function minimization},
  author={Deeparnab Chakrabarty and Yin Tat Lee and Aaron Sidford and Sam Chiu-wai Wong},
  booktitle={Proceedings of the 49th Annual ACM SIGACT Symposium on Theory of Computing},
  year={2017}
}
Submodular function minimization (SFM) is a fundamental discrete optimization problem which generalizes many well-known problems, has applications in various fields, and can be solved in polynomial time. Owing to applications in computer vision and machine learning, fast SFM algorithms are highly desirable. The current fastest algorithms [Lee, Sidford, Wong, 2015] run in O(n^2 log(nM) · EO + n^3 log^{O(1)}(nM)) time and O(n^3 log^2 n · EO + n^4 log^{O(1)} n) time respectively, where M is the largest absolute value of…
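To make the oracle model concrete: an SFM algorithm only accesses f through an evaluation oracle (EO) that returns f(S) for a query set S, and running times count EO calls separately. Below is a minimal sketch of this setup for a graph cut function (a standard submodular example); the exponential brute-force baseline is the trivial solver that the algorithms above improve on with polynomially many EO calls. The graph, function names, and edge weights are illustrative, not from the paper.

```python
from itertools import combinations

def cut_value(edges, subset):
    """Number of edges with exactly one endpoint in `subset`.
    Graph cut functions are a classical example of submodular functions."""
    s = set(subset)
    return sum(1 for u, v in edges if (u in s) != (v in s))

def brute_force_sfm(f, ground_set):
    """Minimize f over all 2^n subsets, counting evaluation-oracle (EO) calls.
    This exponential baseline is what polynomial-time SFM algorithms replace."""
    n = len(ground_set)
    calls = 0
    best_set, best_val = None, float("inf")
    for r in range(n + 1):
        for subset in combinations(ground_set, r):
            calls += 1          # one EO call per queried set
            val = f(subset)
            if val < best_val:
                best_set, best_val = set(subset), val
    return best_set, best_val, calls

edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
f = lambda s: cut_value(edges, s)
S, v, calls = brute_force_sfm(f, range(4))  # minimum cut value is 0 (empty set)
```

For a cut function the minimum is attained at the empty set (value 0), and the brute-force solver makes exactly 2^n EO calls, which is the cost the subquadratic-query algorithms discussed above avoid.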

Paper Mentions

A lower bound for parallel submodular minimization
• Computer Science
• STOC
• 2020
This paper gives the first non-trivial lower bound on the parallel runtime of submodular minimization, showing that no o(log n / log log n)-adaptive algorithm with poly(n) queries solves the problem.
Minimizing a Submodular Function from Samples
• Computer Science, Mathematics
• NIPS
• 2017
There is a class of submodular functions with range in [0, 1] such that, despite being PAC-learnable and minimizable in polynomial time, no algorithm can obtain an approximation strictly better than 1/2 − o(1) using polynomially many samples drawn from any distribution.
Decomposable Submodular Function Minimization via Maximum Flow
• Computer Science
• ICML
• 2021
This paper bridges discrete and continuous optimization approaches for decomposable submodular function minimization, in both the standard and parametric settings, and provides improved running times…
Submodular Minimization Under Congruency Constraints
• Mathematics, Computer Science
• Comb.
• 2019
It is shown that efficient SFM is possible even for a significantly larger class than parity constraints, by introducing a new approach that combines techniques from combinatorial optimization, combinatorics, and number theory.
Submodular Minimization Under Congruency Constraints
• Computer Science, Mathematics
• SODA
• 2018
It is shown that efficient SFM is possible even for a significantly larger class than parity constraints, by introducing a new approach that combines techniques from combinatorial optimization, combinatorics, and number theory.
Submodular Maximization via Gradient Ascent: The Case of Deep Submodular Functions
• Computer Science, Medicine
• NeurIPS
• 2018
This work shows that the multilinear extension of any DSF has a natural and computationally attainable concave relaxation that can be optimized using gradient ascent, with an approximation guarantee and a running time of O(n^2/ε^2) plus the time for pipage rounding.
Geometric Rescaling Algorithms for Submodular Function Minimization
• Computer Science, Mathematics
• SODA
• 2018
A new class of polynomial-time algorithms for submodular function minimization (SFM) is presented, along with a unified framework for obtaining strongly polynomial SFM algorithms, which can be applied to a wide range of combinatorial and continuous algorithms, including pseudo-polynomial ones.
Optimal approximation for unconstrained non-submodular minimization
• Computer Science, Mathematics
• ICML
• 2020
It is shown that a projected subgradient method can perform well even for certain non-submodular functions, and it is proved that in this model the approximation obtained is the best possible with a subexponential number of queries.
Submodular Function Minimization with Noisy Evaluation Oracle
• Shinji Ito
• Computer Science, Mathematics
• NeurIPS
• 2019
An algorithm with an additive error bound is provided, together with a worst-case lower bound of Ω(n/√T); these together imply that the algorithm achieves an optimal error bound up to a constant factor.

References

Showing 1–10 of 65 references
Provable Submodular Minimization using Wolfe's Algorithm
• Mathematics, Computer Science
• NIPS
• 2014
A maiden convergence analysis of Wolfe's algorithm is given, and a robust version of Fujishige's theorem is proved, showing that an O(1/n^2)-approximate solution to the min-norm point on the base polytope implies exact submodular minimization.
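The Fujishige-Wolfe idea above can be sketched in a few lines: find the minimum-norm point of the base polytope B(f), then threshold it at zero to read off a minimizer of f. The sketch below substitutes plain Frank-Wolfe for Wolfe's min-norm-point algorithm (both use Edmonds' greedy step as the linear oracle over B(f)); the example function, iteration count, and all names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

V = [0, 1, 2]
EDGES = [(0, 1), (1, 2)]

def f(subset):
    """A small submodular function: cut of a 3-node path plus a modular
    bonus of -2 for including node 2 (so f(empty set) = 0)."""
    s = set(subset)
    cut = sum(1 for u, v in EDGES if (u in s) != (v in s))
    return cut - 2 * (2 in s)

def greedy_vertex(w):
    """Edmonds' greedy algorithm: sort coordinates by w ascending and take
    marginal gains; returns the vertex of B(f) minimizing <w, x>."""
    order = sorted(V, key=lambda i: w[i])
    x = np.zeros(len(V))
    prefix, prev = [], 0.0
    for i in order:
        prefix.append(i)
        val = f(prefix)
        x[i] = val - prev
        prev = val
    return x

# Frank-Wolfe on min ||x||^2 over B(f): the gradient is 2x, so the linear
# minimization oracle is exactly one greedy step with weights x.
x = greedy_vertex(np.zeros(len(V)))
for t in range(5000):
    q = greedy_vertex(x)
    gamma = 2.0 / (t + 2)
    x = (1 - gamma) * x + gamma * q

# Fujishige's theorem: thresholding the (approximate) min-norm point at 0
# recovers a minimizer of f.
S = {i for i in V if x[i] < 0}
```

For this toy function the unique minimizer is the full ground set {0, 1, 2} with value -2, and the thresholded approximate min-norm point recovers it; the paper's robustness result is what justifies using an approximate rather than exact min-norm point.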
A simple combinatorial algorithm for submodular function minimization
• Computer Science
• SODA
• 2009
This is the first fully combinatorial submodular function minimization algorithm that does not rely on the scaling method; it can be implemented in strongly polynomial time using only additions, subtractions, comparisons, and oracle calls for function evaluation.
Submodular Optimization with Submodular Cover and Submodular Knapsack Constraints
• Computer Science, Mathematics
• NIPS
• 2013
It is shown that both these problems are closely related and that an approximation algorithm solving one can be used to obtain an approximation guarantee for the other; hardness results for both problems are provided, showing that the approximation factors are tight up to log-factors.
Submodular Approximation: Sampling-based Algorithms and Lower Bounds
• Mathematics, Computer Science
• 2008 49th Annual IEEE Symposium on Foundations of Computer Science
• 2008
This work introduces several generalizations of classical computer science problems obtained by replacing simpler objective functions with general submodular functions, and presents an algorithm for approximately learning submodular functions with special structure, whose guarantee is close to the lower bound.
Efficient Minimization of Decomposable Submodular Functions
• Computer Science, Mathematics
• NIPS
• 2010
This paper develops an algorithm, SLG, that can efficiently minimize decomposable submodular functions with tens of thousands of variables, applies it to synthetic benchmarks and a joint classification-and-segmentation task, and shows that it outperforms state-of-the-art general-purpose submodular minimization algorithms by several orders of magnitude.
A Tight Linear Time (1/2)-Approximation for Unconstrained Submodular Maximization
• Computer Science, Mathematics
• 2012 IEEE 53rd Annual Symposium on Foundations of Computer Science
• 2012
This work presents a simple randomized linear-time algorithm achieving a tight approximation guarantee of 1/2, thus matching the known hardness result of Feige et al.
Curvature and Optimal Algorithms for Learning and Minimizing Submodular Functions
• Computer Science, Mathematics
• NIPS
• 2013
It is shown that the complexity of all three problems connected to machine learning depends on the "curvature" of the submodular function, and lower and upper bounds are provided that refine and improve previous results.
On the Convergence Rate of Decomposable Submodular Function Minimization
• Computer Science, Mathematics
• NIPS
• 2014
It is shown that the algorithm converges linearly, and upper and lower bounds on the rate of convergence are provided; the analysis relies on the geometry of submodular polyhedra and draws on results from spectral graph theory.
Learning with Submodular Functions: A Convex Optimization Perspective
• F. Bach
• Computer Science, Mathematics
• Found. Trends Mach. Learn.
• 2013
In Learning with Submodular Functions: A Convex Optimization Perspective, the theory of submodular functions is presented in a self-contained way from a convex analysis perspective, establishing tight links between certain polyhedra, combinatorial optimization, and convex optimization problems.
Random Coordinate Descent Methods for Minimizing Decomposable Submodular Functions
• Computer Science
• ICML
• 2015
This paper uses random coordinate descent methods to obtain algorithms with faster linear convergence rates and cheaper iteration costs; the resulting algorithms converge in significantly fewer iterations.