
Parametric Maxflows for Structured Sparse Learning with Convex Relaxations of Submodular Functions

@article{Kawahara2015ParametricMF,
  title={Parametric Maxflows for Structured Sparse Learning with Convex Relaxations of Submodular Functions},
  author={Y. Kawahara and Yutaro Yamaguchi},
  journal={ArXiv},
  year={2015},
  volume={abs/1509.03946}
}
The proximal problem for structured penalties obtained via convex relaxations of submodular functions is known to be equivalent to minimizing separable convex functions over the corresponding submodular polyhedra. In this paper, we reveal a comprehensive class of structured penalties for which this problem can be solved via an efficiently solvable class of parametric maxflow problems. We then show that the parametric maxflow algorithm proposed by Gallo et al. and its variants…
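
For context, the equivalence mentioned in the abstract is the standard duality between the proximal problem for the Lovász extension $f$ of a submodular function $F$ and a separable projection onto the scaled base polytope (see the Bach monograph cited below):

$$\min_{x \in \mathbb{R}^n} \tfrac{1}{2}\|x - z\|_2^2 + \lambda f(x) \quad\Longleftrightarrow\quad \min_{s \in \lambda B(F)} \tfrac{1}{2}\|z - s\|_2^2, \qquad x^\star = z - s^\star,$$

where $B(F) = \{ s \in \mathbb{R}^n : s(A) \le F(A)\ \forall A \subseteq V,\ s(V) = F(V) \}$ is the base polytope of $F$.

When $F$ is a graph-cut function, each level set $\{i : x^\star_i > \alpha\}$ minimizes $\lambda F(A) + \sum_{i \in A}(\alpha - z_i)$ (up to ties at $\alpha$), and each such problem is a minimum $s$-$t$ cut; this is the entry point for parametric maxflow. The Python sketch below illustrates this reduction by re-solving a min-cut on a grid of levels $\alpha$. It is a minimal illustration, not the algorithm of Gallo et al. (which shares flow computations across all parameter values); the use of networkx, the grid resolution, and the helper name prox_cut_penalty are assumptions made for this example.

import numpy as np
import networkx as nx

def prox_cut_penalty(z, edges, weights, lam=1.0, num_levels=50):
    """Approximate argmin_x 0.5*||x - z||^2 + lam*f(x), where f is the
    Lovasz extension of the graph-cut function F(A) = sum of w_ij over
    edges (i, j) with exactly one endpoint in A."""
    n = len(z)
    x = np.full(n, z.min())
    for alpha in np.linspace(z.min(), z.max(), num_levels):
        G = nx.DiGraph()
        G.add_nodes_from(["s", "t"])
        c = alpha - z  # linear coefficients of lam*F(A) + sum_{i in A} c_i
        for i in range(n):
            if c[i] < 0:    # including i in A lowers the objective: tie i to the source
                G.add_edge("s", i, capacity=-c[i])
            elif c[i] > 0:  # including i in A raises the objective: tie i to the sink
                G.add_edge(i, "t", capacity=c[i])
        for (i, j), w in zip(edges, weights):
            G.add_edge(i, j, capacity=lam * w)
            G.add_edge(j, i, capacity=lam * w)
        _, (source_side, _) = nx.minimum_cut(G, "s", "t")
        for i in source_side - {"s"}:  # level set {i : x*_i >= alpha}, up to ties
            x[i] = alpha
    return x

# Example: a chain graph with two plateaus; the penalty shrinks the jump.
z = np.array([0.0, 0.1, 2.0, 2.1])
print(prox_cut_penalty(z, edges=[(0, 1), (1, 2), (2, 3)],
                       weights=[1.0, 1.0, 1.0], lam=0.3))

On a chain graph this behaves like a total-variation prox and returns a roughly piecewise-constant vector; the grid resolution controls the approximation quality.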


References

Learning with Submodular Functions: A Convex Optimization Perspective
  • F. Bach
  • Computer Science
    Found. Trends Mach. Learn.
  • 2013
TLDR
In Learning with Submodular Functions: A Convex Optimization Perspective, the theory of submodular functions is presented in a self-contained way from a convex analysis perspective, establishing tight links between certain polyhedra, combinatorial optimization, and convex optimization problems.
Convex and Network Flow Optimization for Structured Sparsity
TLDR
Two different strategies are presented, showing that the proximal operator associated with a sum of $\ell_\infty$-norms can be computed exactly in polynomial time by solving a quadratic min-cost flow problem, allowing the use of accelerated proximal gradient methods.
Convex Relaxation for Combinatorial Penalties
TLDR
This paper considers a model simultaneously penalized by a set-function on the support of the unknown parameter vector, representing prior knowledge on supports, and regularized by an $\ell_p$-norm, and shows that the natural combinatorial optimization problems obtained may be relaxed into convex optimization problems.
Structured sparsity-inducing norms through submodular functions
TLDR
This paper shows that for nondecreasing submodular set-functions, the corresponding convex envelope can be obtained from its Lovász extension, a common tool in submodular analysis, and defines a family of polyhedral norms, for which it provides generic algorithmic tools and theoretical results.
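For reference (standard material, not specific to this entry), the Lovász extension admits a simple closed form: for $x \in \mathbb{R}^n$ with components ordered as $x_{\sigma(1)} \ge \dots \ge x_{\sigma(n)}$,

$$f(x) = \sum_{k=1}^{n} x_{\sigma(k)} \left[ F(\{\sigma(1), \dots, \sigma(k)\}) - F(\{\sigma(1), \dots, \sigma(k-1)\}) \right],$$

and for a nondecreasing submodular $F$ the associated structured norm is $\Omega(x) = f(|x|)$.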
Shaping Level Sets with Submodular Functions
  • F. Bach
  • Mathematics, Computer Science
    NIPS
  • 2011
TLDR
By selecting specific submodular functions, this work gives a new interpretation to known norms, such as the total variation, and defines new norms, in particular ones based on order statistics, with applications to clustering and outlier detection, and ones based on noisy cuts in graphs, with application to change-point detection in the presence of outliers.
Structured sparsity through convex optimization
TLDR
It is shown that the $\ell_1$-norm can be extended to structured norms built on either disjoint or overlapping groups of variables, leading to a flexible framework that can deal with various structures.
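As a concrete instance (a standard example, not taken from this paper), choosing a collection of groups $\mathcal{G}$ with weights $d_g > 0$ and the cover function $F(A) = \sum_{g \in \mathcal{G} : g \cap A \neq \emptyset} d_g$ yields the $\ell_1/\ell_\infty$ group norm

$$\Omega(x) = \sum_{g \in \mathcal{G}} d_g \, \|x_g\|_\infty,$$

which handles both disjoint and overlapping groups within the submodular framework.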
On fast approximate submodular minimization
TLDR
A fast approximate method to minimize arbitrary submodular functions is presented along with its theoretical properties; empirical results suggest significant speedups over the minimum-norm-point algorithm while retaining high accuracy.
Proximal Methods for Hierarchical Sparse Coding
TLDR
The procedure has complexity linear, or close to linear, in the number of atoms, and allows the use of accelerated gradient techniques to solve the tree-structured sparse approximation problem at the same computational cost as traditional approaches based on the $\ell_1$-norm.
Complexity and algorithms for nonlinear optimization problems
TLDR
The focus here is on a complexity approach for designing and analyzing algorithms for nonlinear optimization problems providing optimal solutions with prespecified accuracy in the solution space.
Structural and algorithmic properties for parametric minimum cuts
TLDR
This work defines two conditions on parametrized arc capacities that are necessary and sufficient for (strictly) decreasing differences of the parametric cut function, and shows how to construct appropriate flow updates in linear time under these conditions.
...