Sublabel-Accurate Discretization of Nonconvex Free-Discontinuity Problems

@inproceedings{Moellenhoff2017SublabelAccurateDO,
  title     = {Sublabel-Accurate Discretization of Nonconvex Free-Discontinuity Problems},
  author    = {Thomas M{\"o}llenhoff and Daniel Cremers},
  booktitle = {2017 IEEE International Conference on Computer Vision (ICCV)},
  year      = {2017},
  pages     = {1192--1200}
}
  • Published 21 November 2016
  • Computer Science
In this work we show how sublabel-accurate multilabeling approaches [15, 18] can be derived by approximating a classical label-continuous convex relaxation of nonconvex free-discontinuity problems. This insight allows us to extend these sublabel-accurate approaches from total variation to general convex and nonconvex regularizations. Furthermore, it leads to a systematic approach to the discretization of continuous convex relaxations. We study the relationship to existing discretizations and to…
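For context, the label-continuous relaxation the abstract refers to follows the standard functional-lifting pattern (this is general background, not a formula taken from this page): a scalar problem over u : Ω → Γ = [γ₁, γ_L] is lifted via the super-level-set indicator of u, and the coarea formula makes the lifted energy convex in that indicator. A sketch:

```latex
% Generic scalar labeling problem over u : \Omega \to \Gamma = [\gamma_1, \gamma_L]
E(u) = \int_\Omega \rho\bigl(x, u(x)\bigr)\,dx + \lambda\,\mathrm{TV}(u)

% Lifting via the super-level-set indicator
\phi(x, t) = \mathbf{1}_{\{u(x) > t\}}

% Coarea formula: the total variation decomposes over the level sets of u,
% so the lifted energy is convex in \phi
\mathrm{TV}(u) = \int_{\Gamma} \mathrm{Per}\bigl(\{u > t\};\, \Omega\bigr)\,dt
```

Sublabel-accurate discretizations then sample Γ at only a few labels γ_i while representing the data term piecewise convexly between labels, rather than piecewise linearly.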

Citations

Sublabel-Accurate Multilabeling Meets Product Label Spaces
This paper presents a combination of two approaches designed to make liftings more scalable, namely product-space relaxations and sublabel-accurate discretizations, which significantly outperforms baseline methods and finds solutions with lower energies given the same amount of memory.

Sublabel-Accurate Convex Relaxation with Total Generalized Variation Regularization
The proposed formulation extends a recent sublabel-accurate relaxation for multi-label problems and thus allows for accurate solutions using only a small number of labels, significantly improving over previous approaches to lifting the total generalized variation.

Functional Liftings of Vectorial Variational Problems with Laplacian Regularization
This work proposes a functional-lifting-based convex relaxation of variational problems with Laplacian-based second-order regularization that encompasses the discretization-first sublabel-accurate continuous multilabeling approach as a special case.

Lifting methods for manifold-valued variational problems
This work provides a review of lifting methods in a refined framework based on a finite element discretization of the range, which extends the concept of sublabel-accurate lifting to manifolds and generalizes existing methods for total variation regularization to support general convex regularization.

Inverse Scale Space Iterations for Non-convex Variational Problems Using Functional Lifting
This work applies the classical Bregman iteration to a lifted version of the functional with sublabel-accurate discretization, and provides a condition on the subgradients of the regularizer under which this lifted iteration reduces to the standard Bregman iteration.

Efficient and Flexible Sublabel-Accurate Energy Minimization
This work proposes an efficient sublabel-accurate method that combines the best properties of continuous and discrete models, and demonstrates the flexibility of the approach with general pairwise smoothness terms, making it applicable to a wide range of regularizations.

Inverse Scale Space Iterations for Non-Convex Variational Problems: The Continuous and Discrete Case
Non-linear filtering approaches allow decompositions of images with respect to a nonclassical notion of scale, induced by the choice of a convex, absolutely one-homogeneous regularizer. …

Lifting Vectorial Variational Problems: A Natural Formulation Based on Geometric Measure Theory and Discrete Exterior Calculus
This work recalls that functionals with polyconvex Lagrangians can be reparametrized as convex one-homogeneous functionals on the graph of the function, and proposes a discretization of the resulting infinite-dimensional optimization problem using Whitney forms, which generalizes recent "sublabel-accurate" multilabeling approaches.

Fast Convex Relaxations using Graph Discretizations
The methodology is discussed in detail with examples in multi-label segmentation by minimal partitions and stereo estimation, demonstrating that the proposed graph discretization technique can reduce both runtime and memory consumption by up to a factor of 10 compared to classical pixelwise discretizations.

Composite Optimization by Nonconvex Majorization-Minimization
This work considers a natural class of nonconvex majorizers for these functions, shows that such majorizers still suffice for a globally convergent optimization scheme, and illustrates the behavior of the algorithm for depth super-resolution from raw time-of-flight data.

…

References

Showing 1–10 of 40 references
Sublabel-Accurate Relaxation of Nonconvex Energies
This work proposes a novel spatially continuous framework for convex relaxations based on functional lifting, which exhibits less grid bias, is easy to implement, and allows efficient primal-dual optimization on GPUs.

Sublabel-Accurate Convex Relaxation of Vectorial Multilabel Energies
This work proposes the first sublabel-accurate convex relaxation for vectorial multilabel problems, approximating the data term in a piecewise convex (rather than piecewise linear) manner; the resulting more faithful approximation of the original cost function provides a meaningful interpretation for the fractional solutions of the relaxed convex problem.

Convex Relaxation of Vectorial Problems with Coupled Regularization
The key idea is to consider a collection of hypersurfaces with a relaxation that takes into account the entire functional rather than treating the data term and the regularizers separately.

A convex representation for the vectorial Mumford-Shah functional
This work proposes the first tractable convex formulation of the vectorial Mumford-Shah functional, which allows computing high-quality solutions independent of the initialization, together with an efficient solution scheme that makes the overall optimization problem as tractable as in the scalar-valued case.

Continuous Multiclass Labeling Approaches and Algorithms
This work proposes a globally convergent Douglas-Rachford scheme, shows that a sequence of dual iterates can be recovered in order to provide a posteriori optimality bounds, and demonstrates competitive performance on synthetic and real-world images.

Tight Convex Relaxations for Vector-Valued Labeling
This paper proposes a reduction technique for the case that the label space is a continuous product space and the regularizer is separable, i.e., a sum of regularizers for each dimension of the label space.

A Convex Discrete-Continuous Approach for Markov Random Fields
An extension of the well-known LP relaxation for Markov random fields that explicitly allows continuous label spaces is proposed, along with an efficient scheme for handling L1 smoothness priors over discrete ordered label sets.

Total Variation Regularization for Functions with Values in a Manifold
This paper proposes the first algorithm for such variational problems that applies to arbitrary Riemannian manifolds, and demonstrates that the proposed framework can handle variational models that incorporate chromaticity values, normal fields, or camera trajectories.

Global Solutions of Variational Models with Convex Regularization
An algorithmic framework for computing global solutions of variational models with convex regularization terms that permit quite arbitrary data terms is proposed, together with a fast primal-dual algorithm which significantly outperforms existing algorithms.

A Convex Solution to Spatially-Regularized Correspondence Problems
A discretization of this surface formulation is proposed that gives rise to a convex minimization problem, and a globally optimal solution is computed using an efficient primal-dual algorithm.