Genetic Programming for Manifold Learning: Preserving Local Topology

@article{Lensen2021GeneticPF,
  title={Genetic Programming for Manifold Learning: Preserving Local Topology},
  author={Andrew Lensen and Bing Xue and Mengjie Zhang},
  journal={ArXiv},
  year={2021},
  volume={abs/2108.09914}
}
Manifold learning methods are an invaluable tool in today’s world of increasingly huge datasets. Manifold learning algorithms can discover a much lower-dimensional representation (embedding) of a high-dimensional dataset through non-linear transformations that preserve the most important structure of the original data. State-of-the-art manifold learning methods directly optimise an embedding without mapping between the original space and the discovered embedded space. This makes…
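The missing mapping is easy to demonstrate in practice. A minimal sketch, assuming scikit-learn, with t-SNE standing in for the "directly optimise an embedding" family the abstract refers to:

```python
# Illustrative only: t-SNE in scikit-learn optimises the embedding
# coordinates directly, so there is no function to apply to unseen points.
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE

X = load_digits().data                         # 1797 samples, 64 features
Z = TSNE(n_components=2, random_state=0).fit_transform(X)
print(Z.shape)                                 # (1797, 2): coordinates, not a model
print(hasattr(TSNE(), "transform"))            # False: no out-of-sample mapping
```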

References

Showing 1–10 of 67 references
Multi-objective genetic programming for manifold learning: balancing quality and dimensionality
This paper substantially extends previous work on manifold learning by introducing a multi-objective approach that automatically balances the competing objectives of manifold quality and dimensionality.
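As a toy illustration of the balancing act described here, a Pareto-dominance check over hypothetical (quality, dimensionality) pairs; the candidate values and the "higher quality, fewer dimensions" convention are assumptions for illustration, not the paper's actual objectives:

```python
# Hypothetical candidates: (manifold quality, embedding dimensionality).
# Higher quality is better; lower dimensionality is better.
candidates = [(0.90, 10), (0.85, 3), (0.80, 2), (0.70, 2), (0.88, 5)]

def dominates(a, b):
    """a Pareto-dominates b: no worse in both objectives, better in at least one."""
    return a[0] >= b[0] and a[1] <= b[1] and (a[0] > b[0] or a[1] < b[1])

# The Pareto front: candidates not dominated by any other candidate.
front = [c for c in candidates
         if not any(dominates(o, c) for o in candidates if o != c)]
print(sorted(front))  # [(0.80, 2), (0.85, 3), (0.88, 5), (0.90, 10)]
```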
Can Genetic Programming Do Manifold Learning Too?
A genetic programming approach to manifold learning called GP-MaL is proposed, which evolves functional mappings from a high-dimensional space to a lower-dimensional space in the form of interpretable trees and is competitive with existing manifold learning algorithms.
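A toy sketch of the idea, not GP-MaL's actual function set, fitness, or search: each output dimension is a small expression tree over input features, and a neighbourhood-overlap score stands in for fitness (the two hand-written trees below are assumptions for illustration; GP would evolve them):

```python
# Toy sketch of a GP-style functional mapping (not GP-MaL's actual design).
# Two fixed "trees" map 4-D inputs to a 2-D embedding; fitness is the
# average overlap between k-nearest-neighbour sets before and after.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neighbors import NearestNeighbors

X = load_iris().data  # 150 samples, 4 features

tree_0 = lambda x: x[:, 2] * x[:, 3] - x[:, 0]        # petal_len * petal_wid - sepal_len
tree_1 = lambda x: x[:, 1] + np.abs(x[:, 2] - x[:, 3])

Z = np.column_stack([tree_0(X), tree_1(X)])           # the 2-D embedding

def neighbour_overlap(A, B, k=10):
    """Mean fraction of shared k-nearest neighbours between two representations."""
    nn_a = NearestNeighbors(n_neighbors=k + 1).fit(A).kneighbors(A, return_distance=False)[:, 1:]
    nn_b = NearestNeighbors(n_neighbors=k + 1).fit(B).kneighbors(B, return_distance=False)[:, 1:]
    return np.mean([len(set(a) & set(b)) / k for a, b in zip(nn_a, nn_b)])

print(f"local-topology fitness: {neighbour_overlap(X, Z):.2f}")  # 1.0 = perfect
```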
Benchmarking Manifold Learning Methods on a Large Collection of Datasets
It is shown that GP-based methods can more effectively learn a manifold across a set of 155 different problems and deliver more separable embeddings than many established methods.
Representation Learning: A Review and New Perspectives
Recent work in the area of unsupervised feature learning and deep learning is reviewed, covering advances in probabilistic models, autoencoders, manifold learning, and deep networks.
Learning a Parametric Embedding by Preserving Local Structure
The paper presents a new unsupervised dimensionality reduction technique, parametric t-SNE, that learns a parametric mapping between the high-dimensional data space and the low-dimensional latent space, and evaluates its performance in experiments on three datasets.
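Parametric t-SNE trains a network directly on the t-SNE objective; that method is not in scikit-learn, so the sketch below uses an assumed two-stage shortcut (fit a plain t-SNE embedding, then regress a network onto it), purely to illustrate what a reusable parametric mapping buys:

```python
# Crude two-stage stand-in for parametric t-SNE (illustration only):
# 1) compute a non-parametric t-SNE embedding, 2) fit a network X -> Z.
# True parametric t-SNE trains the network on the t-SNE loss end-to-end.
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
from sklearn.neural_network import MLPRegressor

X = load_digits().data
Z = TSNE(n_components=2, random_state=0).fit_transform(X)

net = MLPRegressor(hidden_layer_sizes=(128,), max_iter=2000, random_state=0).fit(X, Z)
Z_new = net.predict(X[:5])   # the learned map now applies to unseen points
print(Z_new.shape)           # (5, 2)
```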
Nonlinear Dimensionality Reduction
The book summarizes established facts and ideas about well-known methods, as well as recent developments, in the topic of nonlinear dimensionality reduction.
Estimating the intrinsic dimension of datasets by a minimal neighborhood information
A new intrinsic dimension (ID) estimator is proposed that uses only the distances to the first and second nearest neighbours of each point in the sample, reducing the effects of curvature and density variation as well as the computational cost.
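The estimator is compact enough to sketch, assuming the maximum-likelihood form d = N / Σ ln(μ) with μ = r2/r1, since the ratio of the two neighbour distances follows a Pareto law whose exponent is the ID (the original paper also uses a linear-fit variant that discards the largest ratios):

```python
# Minimal sketch of a two-nearest-neighbour ID estimate, assuming the
# maximum-likelihood form d = N / sum(log(r2 / r1)).
import numpy as np
from sklearn.datasets import load_digits
from sklearn.neighbors import NearestNeighbors

def two_nn_id(X):
    # n_neighbors=3: each point's neighbour list includes itself at distance 0.
    dist, _ = NearestNeighbors(n_neighbors=3).fit(X).kneighbors(X)
    r1, r2 = dist[:, 1], dist[:, 2]   # first and second true neighbours
    keep = r1 > 0                     # guard against duplicate points
    mu = r2[keep] / r1[keep]          # Pareto-distributed ratios
    return keep.sum() / np.sum(np.log(mu))

print(f"estimated ID: {two_nn_id(load_digits().data):.1f}")
```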
UMAP: Uniform Manifold Approximation and Projection for Dimension Reduction
The UMAP algorithm is competitive with t-SNE for visualization quality, arguably preserves more of the global structure, and offers superior runtime performance.
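A usage sketch, assuming the umap-learn package is installed (parameter values are its illustrative defaults); note that the fitted model, unlike plain t-SNE above, can embed unseen points:

```python
# Usage sketch with the umap-learn package (pip install umap-learn).
import umap
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

X_train, X_test = train_test_split(load_digits().data, random_state=0)

reducer = umap.UMAP(n_neighbors=15, min_dist=0.1, n_components=2)
Z_train = reducer.fit_transform(X_train)
Z_test = reducer.transform(X_test)    # out-of-sample mapping for new points
print(Z_train.shape, Z_test.shape)
```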
A global geometric framework for nonlinear dimensionality reduction
An approach to dimensionality reduction is described that uses easily measured local metric information to learn the underlying global geometry of a data set; it efficiently computes a globally optimal solution and is guaranteed to converge asymptotically to the true structure.
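This entry describes Isomap; a usage sketch via scikit-learn, with the three classic steps noted in comments:

```python
# Usage sketch of Isomap via scikit-learn. Internally it (1) builds a
# k-nearest-neighbour graph, (2) approximates geodesic distances with
# graph shortest paths, and (3) embeds them with classical MDS.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=1000, random_state=0)
Z = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(Z.shape)  # (1000, 2): the "unrolled" swiss roll
```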
Nonlinear dimensionality reduction by locally linear embedding
Locally linear embedding (LLE) is introduced: an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs and learns the global structure of nonlinear manifolds.
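A matching usage sketch for LLE via scikit-learn:

```python
# Usage sketch of LLE via scikit-learn. Each point is expressed as a
# weighted combination of its neighbours; the embedding preserves those
# reconstruction weights in the low-dimensional space.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

X, _ = make_swiss_roll(n_samples=1000, random_state=0)
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, method="standard")
Z = lle.fit_transform(X)
print(Z.shape)  # (1000, 2)
```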