Corpus ID: 8268885

Curvature and Optimal Algorithms for Learning and Minimizing Submodular Functions

@inproceedings{Iyer2013CurvatureAO,
  title={Curvature and Optimal Algorithms for Learning and Minimizing Submodular Functions},
  author={Rishabh K. Iyer and Stefanie Jegelka and Jeff A. Bilmes},
  booktitle={NIPS},
  year={2013}
}
We investigate three related and important problems connected to machine learning: approximating a submodular function everywhere, learning a submodular function (in a PAC-like setting [28]), and constrained minimization of submodular functions. We show that the complexity of all three problems depends on the "curvature" of the submodular function, and provide lower and upper bounds that refine and improve previous results [2, 6, 8, 27]. Our proof techniques are fairly generic. We either use a… 
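The "curvature" the abstract refers to can be made concrete. Below is a minimal sketch (assuming the standard definition of total curvature of a monotone submodular function, with a coverage function chosen purely as an illustrative example):

```python
def curvature(f, V):
    """Total curvature kappa_f = 1 - min_j f(j | V \\ {j}) / f(j | {}),
    taken over elements j with positive singleton gain, for a monotone
    submodular set function f: frozenset -> float."""
    full = frozenset(V)
    ratios = []
    for j in V:
        singleton_gain = f(frozenset([j])) - f(frozenset())
        if singleton_gain > 0:
            last_gain = f(full) - f(full - {j})  # gain of j when added last
            ratios.append(last_gain / singleton_gain)
    return 1.0 - min(ratios)

# Illustrative coverage function: f(S) = size of the union of the sets
# indexed by S (a classic monotone submodular function).
cover = {0: {"a", "b"}, 1: {"b", "c"}, 2: {"c"}}
def f_cover(S):
    return len(set().union(*(cover[j] for j in S))) if S else 0

print(curvature(f_cover, [0, 1, 2]))           # fully curved: 1.0
print(curvature(lambda S: len(S), [0, 1, 2]))  # modular function: 0.0
```

A modular function has curvature 0 and a matroid rank function can have curvature up to 1; the paper's point is that approximation bounds interpolate between these extremes.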

Figures and Tables from this paper

Optimal approximation for submodular and supermodular optimization with bounded curvature

TLDR
It is proved that the approximation results obtained are the best possible in the value oracle model, even in the case of a cardinality constraint.

Optimal approximation for unconstrained non-submodular minimization

TLDR
It is shown that a projected subgradient method can perform well even for certain non-submodular functions, and it is proved that in this model the approximation obtained is the best possible with a subexponential number of queries.
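To give a feel for the subgradient approach mentioned here: for a submodular f, the Lovász extension has an easily computed subgradient (sort the coordinates, take marginal gains along the resulting chain), so unconstrained minimization can be sketched as projected subgradient descent over [0, 1]^n with threshold rounding. This is a toy sketch, not the paper's algorithm; the modular test objective is an invented example:

```python
import numpy as np

def lovasz_subgradient(f, x):
    """A subgradient of the Lovász extension of f at x: sort coordinates
    in decreasing order and take the marginal gains along that chain."""
    order = np.argsort(-x)
    g = np.empty_like(x, dtype=float)
    prefix, prev = set(), f(frozenset())
    for i in order:
        prefix.add(int(i))
        val = f(frozenset(prefix))
        g[i] = val - prev
        prev = val
    return g

def minimize_submodular(f, n, iters=100, step=0.1):
    """Projected subgradient descent on [0, 1]^n, rounding each iterate
    by scanning its threshold (prefix) sets and keeping the best one."""
    x = np.full(n, 0.5)
    best_S, best_val = frozenset(), f(frozenset())
    for _ in range(iters):
        order = np.argsort(-x)
        prefix = set()
        for i in order:
            prefix.add(int(i))
            v = f(frozenset(prefix))
            if v < best_val:
                best_val, best_S = v, frozenset(prefix)
        g = lovasz_subgradient(f, x)
        x = np.clip(x - step * g, 0.0, 1.0)
    return best_S, best_val

# Toy modular (hence submodular) objective: distance of S to {0, 1}.
f = lambda S: len(S ^ {0, 1})
S, val = minimize_submodular(f, n=3)
print(S, val)
```

For exact minimization one would use a combinatorial or Fujishige–Wolfe-style algorithm; the subgradient scheme above only illustrates why the Lovász extension makes the continuous relaxation tractable.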

Near Optimal algorithms for constrained submodular programs with discounted cooperative costs

TLDR
This work provides a tighter connection between theory and practice by enabling theoretically satisfying guarantees for a rich class of expressible, natural, and useful submodular cost models.

The Power of Optimization from Samples

TLDR
This paper shows that for any monotone submodular function with curvature c there is a (1 - c)/(1 + c - c^2) approximation algorithm for maximization under cardinality constraints when polynomially-many samples are drawn from the uniform distribution over feasible sets.

Minimizing a Submodular Function from Samples

TLDR
There is a class of submodular functions with range in [0, 1] such that, despite being PAC-learnable and minimizable in polynomial-time, no algorithm can obtain an approximation strictly better than 1/2 − o(1) using polynomially-many samples drawn from any distribution.

Approximate Submodular Functions and Performance Guarantees

TLDR
This work considers the problem of maximizing non-negative non-decreasing set functions and introduces a novel concept of $\delta$-approximation of a function, which is used to define the space of submodular functions that lie within an approximation error.

Approximate Submodularity and its Applications: Subset Selection, Sparse Approximation and Dictionary Selection

TLDR
The submodularity ratio is introduced as a measure of how "close" to submodular a set function f is, and it is shown that when f has submodularity ratio γ, the greedy algorithm for maximizing f provides a (1 − e^{−γ})-approximation.
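The greedy guarantee described here is easy to state in code. A minimal sketch of plain greedy under a cardinality constraint (the coverage instance is an invented example; for a function with submodularity ratio γ this algorithm enjoys the (1 − e^{−γ}) bound described above):

```python
def greedy_max(f, V, k):
    """Greedy maximization of a monotone set function f under |S| <= k:
    repeatedly add the element with the largest marginal gain."""
    S = set()
    for _ in range(k):
        gains = {j: f(S | {j}) - f(S) for j in V if j not in S}
        if not gains:
            break
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break
        S.add(best)
    return S

# Invented coverage instance: pick k index sets maximizing union size.
cover = {0: {"a", "b", "c"}, 1: {"a"}, 2: {"d"}}
def f_cover(S):
    return len(set().union(*(cover[j] for j in S))) if S else 0

print(greedy_max(f_cover, [0, 1, 2], k=2))  # picks 0 then 2, covering a,b,c,d
```

When f is submodular (γ = 1) this recovers the classic (1 − 1/e) guarantee of Nemhauser, Wolsey, and Fisher.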

Algorithms for Optimizing the Ratio of Submodular Functions

TLDR
It is shown that ratio-of-submodular (RS) optimization can be solved with bounded approximation factors; a hardness bound is provided, and the tightest algorithm matches the lower bound up to a log factor.

Monotone Closure of Relaxed Constraints in Submodular Optimization: Connections Between Minimization and Maximization

TLDR
This work presents a relaxation formulation and a simple rounding strategy that, based on the monotone closure of relaxed constraints, reveal analogies between minimization and maximization problems, include known results as special cases, and extend to a wider range of settings.
...

References

SHOWING 1-10 OF 57 REFERENCES

Fast Semidifferential-based Submodular Function Optimization

TLDR
This work presents a practical and powerful new framework for both unconstrained and constrained submodular function optimization based on discrete semidifferentials (sub- and super-differentials) and takes steps towards providing a unifying paradigm applicable to both submodular minimization and maximization.

Algorithms for Approximate Minimization of the Difference Between Submodular Functions, with Applications

TLDR
This work shows, both empirically and theoretically, that the per-iteration cost of the new algorithms is much lower, and that they can be used to efficiently minimize a difference between submodular functions under various combinatorial constraints, a problem not previously addressed.

Submodular Approximation: Sampling-based Algorithms and Lower Bounds

  • Zoya Svitkina, L. Fleischer
  • Computer Science, Mathematics
    2008 49th Annual IEEE Symposium on Foundations of Computer Science
  • 2008
TLDR
This work introduces several generalizations of classical computer science problems obtained by replacing simpler objective functions with general submodular functions, and presents an algorithm for approximately learning submodular functions with special structure, whose guarantee is close to the lower bound.

A Tight Linear Time (1/2)-Approximation for Unconstrained Submodular Maximization

TLDR
This work presents a simple randomized linear time algorithm achieving a tight approximation guarantee of 1/2, thus matching the known hardness result of Feige et al.

Submodular Optimization with Submodular Cover and Submodular Knapsack Constraints

TLDR
It is shown that both these problems are closely related and an approximation algorithm solving one can be used to obtain an approximation guarantee for the other, and hardness results for both problems are provided, thus showing that the approximation factors are tight up to log-factors.

Submodularity and Curvature: The Optimal Algorithm

TLDR
This paper analyzes the continuous greedy algorithm and proves that it gives a (1/c)(1 − e^{−c})-approximation for any matroid, shows that this holds for a relaxed notion of curvature (curvature with respect to the optimum), and proves that any better approximation under these conditions would require an exponential number of value queries.

Submodular function maximization via the multilinear relaxation and contention resolution schemes

TLDR
A broadly applicable framework for maximizing linear and submodular functions subject to independence constraints is developed and it is shown that contention resolution schemes are an effective way to round a fractional solution, even when f is non-monotone.

Submodular Function Minimization under Covering Constraints

  • S. Iwata, K. Nagano
  • Mathematics, Computer Science
    2009 50th Annual IEEE Symposium on Foundations of Computer Science
  • 2009
TLDR
This paper presents a rounding 2-approximation algorithm for the submodular vertex cover problem based on the half-integrality of the continuous relaxation problem, and shows that the rounding can be performed by one application of submodular function minimization on a ring family.

Learning Fourier Sparse Set Functions

TLDR
It is proved that if one chooses O(k log^4 |P|) sets uniformly at random, then with high probability observing any k-sparse function on those sets is sufficient to recover that function exactly, and that other properties, such as symmetry or submodularity, imply structure in the Fourier spectrum that can be exploited to further reduce sample complexity.
...