Local high-order regularization on data manifolds

@article{Kim2015LocalHR,
  title={Local high-order regularization on data manifolds},
  author={Kwang In Kim and James Tompkin and Hanspeter Pfister and Christian Theobalt},
  journal={2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2015},
  pages={5473-5481}
}
  • K. I. Kim, J. Tompkin, H. Pfister, C. Theobalt
  • Published 7 June 2015
  • Computer Science, Mathematics
  • 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
The common graph Laplacian regularizer is well-established in semi-supervised learning and spectral dimensionality reduction. However, as a first-order regularizer, it can lead to degenerate functions in high-dimensional manifolds. The iterated graph Laplacian enables high-order regularization, but it has a high computational complexity and so cannot be applied to large problems. We introduce a new regularizer which is globally high order and so does not suffer from the degeneracy of the graph… 
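To make the two regularizers concrete, here is a minimal numpy sketch of graph-Laplacian semi-supervised regression; the Gaussian affinity, the toy data, and all function names are illustrative assumptions, not the paper's implementation. Setting order > 1 applies the iterated graph Laplacian mentioned above.

```python
import numpy as np
from scipy.spatial.distance import cdist

def graph_laplacian(X, sigma=0.5):
    # Dense Gaussian-affinity graph; L = D - W (unnormalized Laplacian).
    W = np.exp(-cdist(X, X, 'sqeuclidean') / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

def laplacian_ssl(X, labeled_idx, y_labeled, lam=1e-2, order=1):
    # Solve min_f  sum_{i in labeled} (f_i - y_i)^2 + lam * f^T L^order f.
    # order=1 is the common first-order regularizer; order>1 iterates L.
    n = X.shape[0]
    Lm = np.linalg.matrix_power(graph_laplacian(X), order)
    J = np.zeros((n, n))
    J[labeled_idx, labeled_idx] = 1.0      # data-fit term only on labeled points
    y = np.zeros(n)
    y[labeled_idx] = y_labeled
    return np.linalg.solve(J + lam * Lm, J @ y)

# Toy usage: 200 points on a noisy 1-D curve in the plane, 5 labels.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1.0, 200))
X = np.c_[t, np.sin(4.0 * np.pi * t)] + 0.01 * rng.normal(size=(200, 2))
idx = np.array([0, 50, 100, 150, 199])
f_first = laplacian_ssl(X, idx, t[idx], order=1)   # first-order solution
f_iter = laplacian_ssl(X, idx, t[idx], order=2)    # iterated-Laplacian solution
```

The matrix_power call is also where the cost shows up: L^m densifies quickly as m grows, which is the computational bottleneck that motivates the paper's local high-order alternative.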

Citations

Regularization on a rapidly varying manifold

TLDR
A Jerk-based manifold regularization (JR) is proposed for dense, oscillating manifolds and manifolds with inflection points; it approximates accurate and generic input-space geometrical constraints and outperforms existing state-of-the-art manifold regularization techniques by a significant margin.

The Mathematical Foundations of Manifold Learning

TLDR
A mathematical perspective on manifold learning is presented, delving into the intersection of kernel learning, spectral graph theory, and differential geometry, which forms the foundation for the widely-used technique of manifold regularization.

Criteria Sliders: Learning Continuous Database Criteria via Interactive Ranking

TLDR
This work learns low-dimensional continuous criteria via interactive ranking, so that the novice user need only describe the relative ordering of examples, and it actively suggests data points for the user to rank in a more informative way than existing work.

cvpaper.challenge in 2015 - A review of CVPR2015 and DeepSurvey

TLDR
This review covers the reading of all 602 conference papers presented at CVPR2015, the premier annual computer vision event held in June 2015, and proposes "DeepSurvey" as a mechanism embodying the entire process from reading all the papers, through the generation of ideas, to the writing of papers.

cvpaper.challenge in CVPR2015 -- A review of CVPR2015

TLDR
This challenge aims to simultaneously read papers and create documents in Japanese for easy understanding of top conference papers in the fields of computer vision, image processing, pattern recognition, and machine learning.

cvpaper.challenge in 2016: Futuristic Computer Vision through 1,600 Papers Survey

The paper gives futuristic challenges discussed in the cvpaper.challenge. In 2015 and 2016, we thoroughly studied 1,600+ papers in several conferences/journals such as CVPR/ICCV/ECCV/NIPS/PAMI/IJCV.

References

Showing 1-10 of 62 references

Semi-supervised Learning by Higher Order Regularization

TLDR
This paper addresses the problem of semi-supervised learning in the limit of infinite unlabeled points with a fixed set of labeled ones, using regularization based on an iterated Laplacian, which is equivalent to a higher-order Sobolev semi-norm.
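In symbols, the correspondence reads as follows; this is a schematic statement under the usual graph-to-manifold convergence assumptions, and the notation here is ours rather than the paper's:

```latex
% Order-m iterated-Laplacian regularizer (schematic; notation ours).
% With suitable normalization, as the unlabeled sample grows, the graph
% quantity on the left approaches the order-m Sobolev semi-norm on M.
\[
  S_m(f) \;=\; f^{\top} L^{m} f
  \;\longrightarrow\;
  \int_{\mathcal{M}} f \,\Delta_{\mathcal{M}}^{m} f \; dV
  \;=\; \lvert f \rvert_{H^{m}(\mathcal{M})}^{2}
\]
```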

Curvature-Aware Regularization on Riemannian Submanifolds

TLDR
This work presents a procedure for characterizing the extrinsic (as well as intrinsic) curvature of a manifold M described by a sampled point cloud in a high-dimensional Euclidean space, and uses this characterization for general diffusion and regularization on M, forming a new regularizer on the point cloud.

Semi-supervised Regression using Hessian energy with an application to semi-supervised dimensionality reduction

TLDR
This work proposes to use the second-order Hessian energy for semi-supervised regression which overcomes both the bias towards a constant and the lack of extrapolating power of the solution.
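Schematically (with our notation, not the paper's), the Hessian energy integrates the squared Frobenius norm of the second covariant derivative over the manifold; its null space contains geodesic-linear functions, which is what removes the constant bias and restores extrapolating power:

```latex
% Hessian energy (schematic, notation ours). Unlike the first-order
% Laplacian energy, its null space contains geodesic-linear functions,
% so minimizers can extrapolate linearly rather than flatten to a constant.
\[
  S_{\mathrm{Hess}}(f) \;=\; \int_{\mathcal{M}} \big\lVert \nabla^{2} f(x) \big\rVert_{F}^{2} \; dV(x)
\]
```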

Nonlinear dimensionality reduction by locally linear embedding.

TLDR
Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs that learns the global structure of nonlinear manifolds.
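As a usage sketch, scikit-learn ships an implementation of LLE (this is not the authors' original code, and the swiss-roll data and parameter values are illustrative):

```python
# LLE usage sketch via scikit-learn (illustrative parameters).
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

X, _ = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2,
                             method='standard', random_state=0)
Y = lle.fit_transform(X)   # (1000, 2) neighborhood-preserving embedding
```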

Laplacian Eigenmaps for Dimensionality Reduction and Data Representation

TLDR
This work proposes a geometrically motivated algorithm for representing the high-dimensional data that provides a computationally efficient approach to nonlinear dimensionality reduction that has locality-preserving properties and a natural connection to clustering.
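A comparable sketch: scikit-learn's SpectralEmbedding implements Laplacian eigenmaps (parameters are illustrative, and the same swiss-roll input as above lets the two embeddings be compared side by side):

```python
# Laplacian eigenmaps via scikit-learn's SpectralEmbedding (illustrative).
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import SpectralEmbedding

X, _ = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)
emb = SpectralEmbedding(n_components=2, affinity='nearest_neighbors',
                        n_neighbors=10, random_state=0)
Y = emb.fit_transform(X)   # locality-preserving 2-D coordinates
```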

A Geometric take on Metric Learning

TLDR
It is proved that, with appropriate changes, multi-metric learning corresponds to learning the structure of a Riemannian manifold, and it is shown that this structure gives a principled way to perform dimensionality reduction and regression according to the learned metrics.

Towards a theoretical foundation for Laplacian-based manifold methods

Statistical Analysis of Semi-Supervised Learning: The Limit of Infinite Unlabelled Data

TLDR
It is shown that the popular Laplacian Regularization method for Semi-Supervised Learning is actually not well-posed, and as the number of unlabeled points increases the solution degenerates to a noninformative function.

Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data

  • D. Donoho, C. Grimes
  • Computer Science, Mathematics
  • Proceedings of the National Academy of Sciences of the United States of America
  • 2003
TLDR
The Hessian-based locally linear embedding method for recovering the underlying parametrization of scattered data (m_i) lying on a manifold M embedded in high-dimensional Euclidean space is described, where the isometric coordinates can be recovered up to a linear isometry.
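scikit-learn also exposes this method (again, not the authors' code) as a variant of LocallyLinearEmbedding; note its requirement that n_neighbors exceed n_components * (n_components + 3) / 2:

```python
# Hessian eigenmaps (HLLE) via scikit-learn (illustrative parameters).
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

X, _ = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)
# method='hessian' requires n_neighbors > n_components * (n_components + 3) / 2.
hlle = LocallyLinearEmbedding(n_neighbors=10, n_components=2,
                              method='hessian', random_state=0)
Y = hlle.fit_transform(X)  # coordinates recovered up to a linear isometry
```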

Analyzing the Harmonic Structure in Graph-Based Learning

TLDR
It is shown that the variation of the target function across a cut can be upper and lower bounded by the ratio of its harmonic loss and the cut cost.
...