Corpus ID: 1079747

Random Coordinate Descent Methods for Minimizing Decomposable Submodular Functions

@article{Ene2015RandomCD,
  title={Random Coordinate Descent Methods for Minimizing Decomposable Submodular Functions},
  author={Alina Ene and Huy L. Nguyen},
  journal={ArXiv},
  year={2015},
  volume={abs/1502.02643}
}
Submodular function minimization is a fundamental optimization problem that arises in several applications in machine learning and computer vision. The problem is known to be solvable in polynomial time, but general-purpose algorithms have high running times and are unsuitable for large-scale problems. Recent work has used convex optimization techniques to obtain very practical algorithms for minimizing functions that are sums of "simple" functions. In this paper, we use random coordinate…
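The abstract's approach (random coordinate descent on a decomposable objective) can be illustrated with a minimal sketch. This is not the paper's algorithm, which operates on the dual of the decomposable SFM problem; it only shows the basic RCD update on a toy smooth decomposable objective, with per-coordinate Lipschitz step sizes. All names (`random_coordinate_descent`, `grad_i`) are illustrative.

```python
import numpy as np

def random_coordinate_descent(grad_i, lipschitz, x0, steps=5000, seed=0):
    """Minimize a smooth convex f by updating one random coordinate per step.

    grad_i(x, i) returns the i-th partial derivative; lipschitz[i] is a
    per-coordinate Lipschitz constant, so 1/lipschitz[i] is a safe step size.
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    n = x.size
    for _ in range(steps):
        i = rng.integers(n)                  # pick a coordinate uniformly at random
        x[i] -= grad_i(x, i) / lipschitz[i]  # standard RCD coordinate step
    return x

# Toy decomposable objective: f(x) = sum_j 0.5*(x[j] - b[j])**2 + 0.25*||x||^2
b = np.array([1.0, -2.0, 3.0])
grad_i = lambda x, i: (x[i] - b[i]) + 0.5 * x[i]
L = np.full(3, 1.5)  # each partial derivative is 1.5-Lipschitz
x_star = random_coordinate_descent(grad_i, L, np.zeros(3))  # converges to b / 1.5
```

For this separable quadratic, each coordinate step is exact, so the iterate reaches the minimizer b / 1.5 once every coordinate has been sampled.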
Citations

Quadratic Decomposable Submodular Function Minimization
This work introduces a new convex optimization problem, termed quadratic decomposable submodular function minimization, describes an objective that may be optimized via random coordinate descent methods and projections onto cones, and establishes the linear convergence rate of the RCD algorithm.
Fast Decomposable Submodular Function Minimization using Constrained Total Variation
A modified convex problem requiring a constrained version of the total variation oracles is considered, which can be solved with significantly fewer calls to the simple minimization oracles.
Quadratic Decomposable Submodular Function Minimization: Theory and Practice
A new convex optimization problem, termed quadratic decomposable submodular function minimization (QDSFM), allows modeling a number of learning tasks on graphs and hypergraphs; two new applications of QDSFM are described: hypergraph-adapted PageRank and semi-supervised learning.
Decomposable Submodular Function Minimization via Maximum Flow
This paper bridges discrete and continuous optimization approaches for decomposable submodular function minimization, in both the standard and parametric settings, and provides improved running times…
Geometric Rescaling Algorithms for Submodular Function Minimization
A new class of polynomial-time algorithms for submodular function minimization (SFM) is presented, as well as a unified framework to obtain strongly polynomial SFM algorithms, which can be applied to a wide range of combinatorial and continuous algorithms, including pseudo-polynomial ones.
Subquadratic submodular function minimization
For integer-valued submodular functions, this paper gives an SFM algorithm which runs in O(nM³ log n · EO) time, giving the first nearly linear time algorithm in any known regime.
Minimizing a Submodular Function from Samples
There is a class of submodular functions with range in [0, 1] such that, despite being PAC-learnable and minimizable in polynomial time, no algorithm can obtain an approximation strictly better than 1/2 − o(1) using polynomially many samples drawn from any distribution.
Greed is good: greedy optimization methods for large-scale structured problems
This dissertation shows that greedy coordinate descent and Kaczmarz methods have efficient implementations and can be faster than their randomized counterparts for certain common problem structures in machine learning, and shows linear convergence for greedy (block) coordinate descent methods under a revived relaxation of strong convexity from 1963.
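The greedy coordinate selection contrasted with the randomized variant above can be sketched as follows. This is a minimal illustration of the Gauss-Southwell rule (update the coordinate with the largest gradient magnitude) on a toy quadratic, not code from the dissertation; names are illustrative.

```python
import numpy as np

def greedy_coordinate_descent(grad, lipschitz, x0, steps=200):
    """Gauss-Southwell rule: update the coordinate with largest |gradient|."""
    x = x0.astype(float).copy()
    for _ in range(steps):
        g = grad(x)
        i = int(np.argmax(np.abs(g)))  # greedy coordinate choice
        x[i] -= g[i] / lipschitz       # safe step with a global Lipschitz constant
    return x

# Quadratic test problem: f(x) = 0.5 x^T A x - b^T x with A positive definite
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
x_gs = greedy_coordinate_descent(grad, lipschitz=3.0, x0=np.zeros(2))
```

On strongly convex problems this rule converges linearly, and its per-iteration progress is at least that of uniform random selection, at the cost of computing the full gradient to pick the coordinate.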
Convex Optimization for Parallel Energy Minimization
This work reformulates the quadratic energy minimization problem as a total variation denoising problem which, when viewed geometrically, enables the use of projection- and reflection-based convex methods, and performs an extensive empirical evaluation comparing state-of-the-art combinatorial algorithms and convex optimization techniques.

References

Showing 1-10 of 21 references
On the Convergence Rate of Decomposable Submodular Function Minimization
It is shown that the algorithm converges linearly, and upper and lower bounds on the rate of convergence are provided; the analysis relies on the geometry of submodular polyhedra and draws on results from spectral graph theory.
Efficient Minimization of Decomposable Submodular Functions
This paper develops an algorithm, SLG, that can efficiently minimize decomposable submodular functions with tens of thousands of variables; applied to synthetic benchmarks and a joint classification-and-segmentation task, it outperforms the state-of-the-art general-purpose submodular minimization algorithms by several orders of magnitude.
Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
  • Y. Nesterov
  • Mathematics, Computer Science
  • SIAM J. Optim.
  • 2012
Surprisingly enough, for certain classes of objective functions, the proposed methods for solving huge-scale optimization problems are better than the standard worst-case bounds for deterministic algorithms.
A push-relabel framework for submodular function minimization and applications to parametric optimization
This paper improves the running time of Schrijver's algorithm by designing a push-relabel framework for submodular function minimization (SFM), and extends this algorithm to carry out parametric minimization for a strong map sequence of submodular functions in the same asymptotic running time as a single SFM.
A Faster Scaling Algorithm for Minimizing Submodular Functions
Combinatorial strongly polynomial algorithms for minimizing submodular functions have been developed by Iwata, Fleischer, and Fujishige (IFF) and by Schrijver. The IFF algorithm employs a scaling…
Learning with Submodular Functions: A Convex Optimization Perspective
  • F. Bach
  • Computer Science, Mathematics
  • Found. Trends Mach. Learn.
  • 2013
In Learning with Submodular Functions: A Convex Optimization Perspective, the theory of submodular functions is presented in a self-contained way from a convex analysis perspective, presenting tight links between certain polyhedra, combinatorial optimization, and convex optimization problems.
Submodular functions and convexity
In "continuous" optimization, convex functions play a central role, and linear programming may be viewed as the optimization of very special (linear) objective functions over very special convex domains (polyhedra).
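The link between submodularity and convexity summarized above is concrete: a set function is submodular exactly when its Lovász extension is convex. A small self-contained sketch below checks the submodular inequality for a graph cut function by brute force and evaluates its Lovász extension by the standard greedy (sorting) formula. The triangle graph and all function names are illustrative, not from the cited paper.

```python
from itertools import combinations
import numpy as np

EDGES = [(0, 1), (1, 2), (0, 2)]  # a triangle graph (illustrative example)

def cut_value(S):
    """Graph cut function: number of edges with exactly one endpoint in S."""
    return sum(1 for u, v in EDGES if (u in S) != (v in S))

def is_submodular(f, ground):
    """Brute-force check of f(A) + f(B) >= f(A|B) + f(A&B) for all A, B."""
    subsets = [frozenset(c)
               for r in range(len(ground) + 1)
               for c in combinations(ground, r)]
    return all(f(A) + f(B) >= f(A | B) + f(A & B)
               for A in subsets for B in subsets)

def lovasz_extension(f, x):
    """Evaluate the Lovasz extension of f at x via the greedy formula.

    Sort coordinates in decreasing order of x and accumulate the marginal
    gains of f along the resulting chain of sets. The extension agrees
    with f on indicator vectors and is convex iff f is submodular.
    """
    order = np.argsort(-x)
    S, prev, total = set(), f(frozenset()), 0.0
    for i in order:
        S.add(int(i))
        val = f(frozenset(S))
        total += x[int(i)] * (val - prev)
        prev = val
    return total

submodular = is_submodular(cut_value, {0, 1, 2})              # cut functions are submodular
ext = lovasz_extension(cut_value, np.array([1.0, 1.0, 0.0]))  # equals cut({0, 1}) = 2
```

On the indicator vector of {0, 1}, the extension reproduces the cut value 2, illustrating that minimizing the extension over [0, 1]^n solves the discrete problem.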
Reflection methods for user-friendly submodular optimization
This work proposes a new method that exploits the existing decomposability of submodular functions, solves both the continuous and discrete formulations of the problem, and therefore has applications in learning, inference, and reconstruction.
A Combinatorial Algorithm Minimizing Submodular Functions in Strongly Polynomial Time
  • A. Schrijver
  • Mathematics, Computer Science
  • J. Comb. Theory, Ser. B
  • 2000
We give a strongly polynomial-time algorithm minimizing a submodular function f given by a value-giving oracle. The algorithm does not use the ellipsoid method or any other linear programming method.
Minimizing a sum of submodular functions
  • V. Kolmogorov
  • Computer Science, Mathematics
  • Discret. Appl. Math.
  • 2012
This work casts the problem of minimizing a function represented as a sum of submodular terms in an auxiliary graph, in such a way that applying most existing SFM algorithms would rely only on these subroutines, and shows how to improve the complexity in the case when the function contains cardinality-dependent terms.