• Corpus ID: 16524401

Polyhedral aspects of Submodularity, Convexity and Concavity

@article{Iyer2015PolyhedralAO,
  title={Polyhedral aspects of Submodularity, Convexity and Concavity},
  author={Rishabh K. Iyer and Jeff A. Bilmes},
  journal={ArXiv},
  year={2015},
  volume={abs/1506.07329}
}
Seminal work by Edmonds and Lovász shows the strong connection between submodularity and convexity. Submodular functions have tight modular lower bounds, and subdifferentials in a manner akin to convex functions. They also admit poly-time algorithms for minimization and satisfy the Fenchel duality theorem and the Discrete Separation Theorem, both of which are fundamental characteristics of convex functions. Submodular functions also show signs similar to concavity. Submodular maximization…
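
For readers skimming this page, the two standard facts the abstract alludes to can be stated compactly (a sketch in standard notation, not drawn from the paper's own formulation):

    % Submodularity via the lattice inequality: a set function
    % f : 2^V -> R is submodular iff, for all A, B \subseteq V,
    \[
      f(A) + f(B) \;\ge\; f(A \cup B) + f(A \cap B).
    \]
    % Lovász's characterization connecting submodularity to convexity:
    % the Lovász extension \hat{f} : [0,1]^V \to \mathbb{R} of f is convex
    % if and only if f is submodular.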

Approximate Submodularity and Its Implications in Discrete Optimization

TLDR
It is shown that previous analyses of mixed-integer sets, such as the submodular knapsack polytope, can be extended to the approximate submodularity setting, and it is demonstrated that greedy algorithm bounds based on notions of approximate submodularity are competitive with those in the literature.

Batch greedy maximization of non-submodular functions: Guarantees and applications to experimental design

TLDR
This work proposes and analyzes batch greedy heuristics for cardinality-constrained maximization of non-submodular, non-decreasing set functions, and provides a novel reinterpretation of the classical greedy algorithm using the minorize-maximize (MM) principle.
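
For context on the "classical greedy algorithm" mentioned above, here is a minimal sketch of cardinality-constrained greedy maximization of a monotone set function (the name greedy_max and the toy coverage objective are illustrative, not taken from the cited paper, which studies batch variants of this scheme):

    # Classical greedy: repeatedly add the element with the largest marginal gain.
    def greedy_max(f, ground_set, k):
        """Greedily build S with |S| <= k to (approximately) maximize f(S)."""
        S = set()
        for _ in range(min(k, len(ground_set))):
            best = max((e for e in ground_set if e not in S),
                       key=lambda e: f(S | {e}) - f(S))
            S.add(best)
        return S

    # Toy usage with a coverage objective (illustrative only).
    sets = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"d"}}
    cover = lambda S: len(set().union(*(sets[i] for i in S)))
    print(greedy_max(cover, list(sets), 2))   # -> {1, 2} (covers 3 items)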

Restricted Strong Convexity Implies Weak Submodularity

TLDR
This work shows that greedy algorithms perform within a constant factor of the best possible subset-selection solution for a broad class of general objective functions.
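
The weak submodularity in this line of work is usually quantified by a submodularity ratio in the style of Das and Kempe; one common (global) form, stated here as a hedged reminder rather than quoted from the cited manuscript:

    % f (monotone, f(\emptyset)=0) is \gamma-weakly submodular if, for all
    % disjoint L, S \subseteq V,
    \[
      \sum_{j \in S} \bigl( f(L \cup \{j\}) - f(L) \bigr)
      \;\ge\; \gamma \, \bigl( f(L \cup S) - f(L) \bigr),
    \]
    % and greedy subset selection then enjoys roughly a (1 - e^{-\gamma})
    % approximation guarantee under a cardinality constraint.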

Submodularity In Machine Learning and Artificial Intelligence

TLDR
A gentle review of submodularity and supermodularity and their properties, and how submodularity is useful for clustering, data partitioning, parallel machine learning, active and semi-supervised learning, probabilistic modeling, and structured norms and loss functions.

Stochastic Submodular Maximization: The Case of Coverage Functions

TLDR
This model captures situations where the discrete objective arises as an empirical risk, or is given as an explicit stochastic model, and yields solutions that are guaranteed to match the optimal approximation guarantees, while reducing the computational cost by several orders of magnitude, as demonstrated empirically.

Slack and Margin Rescaling as Convex Extensions of Supermodular Functions

TLDR
This analysis framework shows that, while neither margin nor slack rescaling dominates the other, known bounds on supermodular functions can be used to derive extensions that dominate both, indicating possible directions for defining novel structured output prediction surrogates.

Robust Submodular Minimization with Applications to Cooperative Modeling

TLDR
This is the first work to study the minimization version under a broad range of combinatorial constraints, including cardinality, knapsack, and matroid constraints, as well as graph-based constraints such as cuts, paths, matchings, and trees.

Submodlib: A Submodular Optimization Library

TLDR
SUBMODLIB is an open-source, easy-to-use, efficient and scalable Python library for submodular optimization with a C++ optimization engine; it finds applications in summarization, data subset selection, hyperparameter tuning, efficient training, and more.

Approximating Nash social welfare under Rado valuations

TLDR
The approach gives the first constant-factor approximation algorithm for the asymmetric case under Rado valuations, provided that the maximum ratio between the weights is bounded by a constant.

The Lovász Hinge: A Novel Convex Surrogate for Submodular Losses

TLDR
The Lovász hinge is convex and yields an extension, requiring only oracle accesses to the loss function to compute a gradient or cutting-plane, even for non-supermodular loss functions.

References

SHOWING 1-10 OF 53 REFERENCES

Submodular Functions: Extensions, Distributions, and Algorithms. A Survey

TLDR
The purpose of this survey is to highlight the connection between extensions, distributions, relaxations, and optimization in the context of submodular functions, and to present the first constant-factor approximation algorithm for minimizing symmetric submodular functions subject to a cardinality constraint.

Discrete convex analysis

TLDR
This work follows Rockafellar's conjugate duality approach to convex/nonconvex programs in nonlinear optimization, while technically relying on fundamental theorems of a matroid-theoretic nature.

Submodular Approximation: Sampling-based Algorithms and Lower Bounds

  • Zoya Svitkina, L. Fleischer
  • Computer Science, Mathematics
    2008 49th Annual IEEE Symposium on Foundations of Computer Science
  • 2008
TLDR
This work introduces several generalizations of classical computer science problems obtained by replacing simpler objective functions with general submodular functions, and presents an algorithm for approximately learning submodular functions with special structure, whose guarantee is close to the lower bound.

Theory of submodular programs: A fenchel-type min-max theorem and subgradients of submodular functions

TLDR
A convex (or concave) conjugate function of a submodular (or supermodular) function is defined, and a Fenchel-type min-max theorem for submodular and supermodular functions is shown.
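
One common way to state the Fenchel-type min-max theorem referenced here (a sketch following Fujishige's conventions; the signs and conjugate definitions vary between texts):

    % For submodular f and supermodular g on 2^V with f(\emptyset)=g(\emptyset)=0:
    \[
      \min_{X \subseteq V} \bigl( f(X) - g(X) \bigr)
      \;=\;
      \max_{x \in \mathbb{R}^V} \bigl( g^{\circ}(x) - f^{\bullet}(x) \bigr),
    \]
    % where f^{\bullet}(x) = \max_{X \subseteq V} ( x(X) - f(X) ) is the convex
    % conjugate, g^{\circ}(x) = \min_{X \subseteq V} ( x(X) - g(X) ) is the
    % concave conjugate, and x(X) = \sum_{i \in X} x_i.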

Submodular Optimization with Submodular Cover and Submodular Knapsack Constraints

TLDR
It is shown that both these problems are closely related and an approximation algorithm solving one can be used to obtain an approximation guarantee for the other, and hardness results for both problems are provided, thus showing that the approximation factors are tight up to log-factors.

Polyhedrally Tight Set Functions and Discrete Convexity

This paper studies the class of polyhedrally tight functions in terms of the basic theorems on convex functions over ℝⁿ, such as the Fenchel Duality Theorem, the Separation Theorem, etc.

Learning with Submodular Functions: A Convex Optimization Perspective

  • F. Bach
  • Computer Science
    Found. Trends Mach. Learn.
  • 2013
TLDR
In Learning with Submodular Functions: A Convex Optimization Perspective, the theory of submodular functions is presented in a self-contained way from a convex analysis perspective, presenting tight links between certain polyhedra, combinatorial optimization and convex optimization problems.
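
The tight link between the polyhedra and convex optimization mentioned here is usually made concrete through Edmonds' greedy ordering, which also evaluates the Lovász extension; a minimal sketch (function names are illustrative, and f(∅) = 0 is assumed):

    # Evaluate the Lovász extension \hat{f}(w): sort the coordinates of w in
    # decreasing order and accumulate marginal gains along the resulting chain.
    def lovasz_extension(f, w):
        order = sorted(w, key=w.get, reverse=True)   # Edmonds' greedy ordering
        total, prefix, prev = 0.0, set(), f(set())   # assumes f(empty set) = 0
        for e in order:
            prefix.add(e)
            cur = f(prefix)
            total += w[e] * (cur - prev)             # weight times marginal gain
            prev = cur
        return total

    # On 0/1 indicator vectors the extension agrees with f itself.
    f = lambda S: min(len(S), 2)                     # a simple submodular function
    print(lovasz_extension(f, {"a": 1.0, "b": 1.0, "c": 0.0}))   # -> 2.0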

Algorithms for Approximate Minimization of the Difference Between Submodular Functions, with Applications

TLDR
This work shows, empirically and theoretically, that the per-iteration cost of the new algorithms is much lower, and that they can be used to efficiently minimize a difference between submodular functions under various combinatorial constraints, a problem not previously addressed.

Curvature and Optimal Algorithms for Learning and Minimizing Submodular Functions

TLDR
It is shown that the complexity of all three problems connected to machine learning depends on the "curvature" of the submodular function, and lower and upper bounds are provided that refine and improve previous results.
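
The "curvature" referred to here is, in most of this literature, the total curvature of a monotone nondecreasing submodular f with f(∅) = 0 (stated as a reminder, not quoted from the cited paper):

    \[
      \kappa_f \;=\; 1 - \min_{j \in V,\ f(\{j\}) > 0}
        \frac{f(V) - f(V \setminus \{j\})}{f(\{j\})},
    \]
    % so \kappa_f = 0 for modular f, and larger values indicate stronger
    % diminishing returns.
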
...