Regularization of Mixture Models for Robust Principal Graph Learning

@article{Bonnaire2021RegularizationOM,
  title={Regularization of Mixture Models for Robust Principal Graph Learning},
  author={Tony Bonnaire and Aur{\'e}lien Decelle and Nabila Aghanim},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2021},
  volume={PP},
  pages={1-1}
}
A regularized version of Mixture Models is proposed to learn a principal graph from a distribution of D-dimensional data points. In the particular case of manifold learning for ridge detection, we assume that the underlying structure can be modeled as a graph acting as a topological prior for the Gaussian clusters, turning the problem into a maximum a posteriori estimation. Parameters of the model are iteratively estimated through an Expectation-Maximization procedure, making the learning of…
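To make the setup concrete, the sketch below (hypothetical, not the authors' code) implements an EM loop for an isotropic Gaussian mixture whose component means are tied together by a quadratic penalty along the edges of a given prior graph; the exact prior and update rules used in the paper may differ.

```python
import numpy as np

def graph_regularized_gmm_em(X, mu, edges, lam=1.0, sigma2=1.0, n_iter=50):
    """Sketch of MAP-EM for an isotropic GMM whose K means are coupled by a
    quadratic (Laplacian) penalty along a prior graph.
    `edges` is a list of (i, j) index pairs over the K components."""
    n, d = X.shape
    K = mu.shape[0]

    # Graph Laplacian L = D - A of the prior graph over components.
    A = np.zeros((K, K))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    L = np.diag(A.sum(axis=1)) - A

    for _ in range(n_iter):
        # E-step: responsibilities under equal weights and a shared variance.
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)      # (n, K)
        logp = -0.5 * d2 / sigma2
        logp -= logp.max(axis=1, keepdims=True)
        R = np.exp(logp)
        R /= R.sum(axis=1, keepdims=True)

        # M-step: means solve (diag(N_k) + lam * sigma2 * L) mu = R^T X,
        # i.e. the usual weighted averages shrunk along graph edges.
        Nk = R.sum(axis=0)
        mu = np.linalg.solve(np.diag(Nk) + lam * sigma2 * L, R.T @ X)

        # Shared isotropic variance update (d degrees of freedom per point).
        sigma2 = (R * d2).sum() / (n * d)

    return mu, sigma2
```

The linear solve in the M-step couples neighbouring means, which is what lets the graph act as a topological prior on the cluster centres rather than leaving each Gaussian free to move independently.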

References

Showing 1-10 of 39 references

Regularization-free principal curve estimation

This paper introduces a new objective function, obtained by modifying the principal curve estimation approach, for which all critical points are principal curves and minima, thereby removing a fundamental issue in model selection for principal curve estimation.

Nonparametric Ridge Estimation

Ridge estimation is an extension of mode finding; it is useful for understanding the structure of a density and can be used to find hidden structure in point-cloud data.

Robust and Scalable Learning of Complex Intrinsic Dataset Geometry via ElPiGraph

ElPiGraph exploits and further develops the concept of elastic energy, the topological graph grammar approach, and a gradient-descent-like optimization of the graph topology; it is capable of approximating data point clouds via principal graph ensembles.

Laplacian Eigenmaps for Dimensionality Reduction and Data Representation

This work proposes a geometrically motivated algorithm for representing high-dimensional data, providing a computationally efficient approach to nonlinear dimensionality reduction with locality-preserving properties and a natural connection to clustering.
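A minimal sketch of the Laplacian Eigenmaps recipe, assuming a symmetric k-NN graph and the unnormalized graph Laplacian (other neighbourhood rules and kernel weights are possible):

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, n_neighbors=10):
    """Symmetric k-NN adjacency, unnormalized graph Laplacian, and the
    smallest non-trivial generalized eigenvectors of L y = lambda D y
    as the low-dimensional embedding."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    # Connect each point to its k nearest neighbours (excluding itself).
    nn = np.argsort(d2, axis=1)[:, 1:n_neighbors + 1]
    for i in range(n):
        W[i, nn[i]] = 1.0
    W = np.maximum(W, W.T)                # symmetrize the adjacency
    D = np.diag(W.sum(axis=1))
    L = D - W
    vals, vecs = eigh(L, D)               # generalized eigenproblem
    return vecs[:, 1:n_components + 1]    # drop the constant eigenvector
```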

Locally Defined Principal Curves and Surfaces

A novel theoretical understanding of principal curves and surfaces, practical algorithms as general purpose machine learning tools, and applications of these algorithms to several practical problems are presented.

SimplePPT: A Simple Principal Tree Algorithm

A principal tree model is proposed together with a new algorithm that automatically learns a tree structure from data; the method compares favorably with baselines and can discover a breast cancer progression path with multiple branches.
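As an illustration of the tree-structure step used by principal-tree methods of this kind, the sketch below (an assumption, not SimplePPT itself, which alternates soft assignments with such a step) connects the current centroids by a Euclidean minimum spanning tree:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def tree_over_centroids(mu):
    """Connect the current centroids by a Euclidean minimum spanning tree
    and return its edge list as (i, j) index pairs."""
    d2 = ((mu[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
    mst = minimum_spanning_tree(np.sqrt(d2))
    rows, cols = mst.nonzero()
    return list(zip(rows.tolist(), cols.tolist()))
```

Such an edge list could, for instance, play the role of the `edges` argument in the EM sketch given after the abstract above.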

Generalized Mode and Ridge Estimation

A method is proposed for studying the geometric structure of generalized densities using a modification of the mean shift algorithm and its variant, subspace constrained mean shift, which can be used to perform clustering and to calculate a measure of connectivity between clusters.
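For reference, a plain mean-shift iteration with a Gaussian kernel looks as follows (a sketch under the assumption of a fixed bandwidth; subspace constrained mean shift additionally projects each step onto the local Hessian eigen-directions so that iterates converge to ridges rather than modes):

```python
import numpy as np

def mean_shift(X, bandwidth=1.0, n_iter=100):
    """Move every point to the kernel-weighted average of the data,
    repeatedly, so that each point settles at a mode of the density."""
    Y = X.copy()
    for _ in range(n_iter):
        d2 = ((Y[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # (m, n)
        W = np.exp(-0.5 * d2 / bandwidth**2)                   # Gaussian kernel
        Y = (W @ X) / W.sum(axis=1, keepdims=True)             # shifted points
    return Y
```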

Principal Graph and Structure Learning Based on Reversed Graph Embedding

A novel principal graph and structure learning framework is developed that captures the local information of the underlying graph structure based on reversed graph embedding, together with a new learning algorithm that simultaneously learns a set of principal points and a graph structure from data.

Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering

The algorithm provides a computationally efficient approach to nonlinear dimensionality reduction that has locality preserving properties and a natural connection to clustering.

Geometric Inference for Probability Measures

Replacing compact subsets by measures, a notion of distance function to a probability distribution in ℝ^d is introduced and it is shown that it is possible to reconstruct offsets of sampled shapes with topological guarantees even in the presence of outliers.
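The standard empirical estimator of this distance-to-measure averages squared distances to the nearest fraction of the sample; a minimal sketch, assuming the mass parameter is mapped to k = ceil(m * n) nearest neighbours:

```python
import numpy as np
from scipy.spatial import cKDTree

def distance_to_measure(X, query, m=0.05):
    """Average the squared distances from each query point to its
    k = ceil(m * n) nearest sample points and take the square root;
    robust to outliers, unlike the distance to the nearest data point.
    `query` is an array of shape (q, d)."""
    X, query = np.asarray(X), np.asarray(query)
    k = max(1, int(np.ceil(m * X.shape[0])))
    dists, _ = cKDTree(X).query(query, k=k)
    dists = np.asarray(dists).reshape(query.shape[0], k)
    return np.sqrt((dists ** 2).mean(axis=1))
```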