Geometric Methods for Feature Extraction and Dimensional Reduction - A Guided Tour

@inproceedings{Burges2005GeometricMF,
  title={Geometric Methods for Feature Extraction and Dimensional Reduction - A Guided Tour},
  author={Christopher J. C. Burges},
  booktitle={Data Mining and Knowledge Discovery Handbook},
  year={2005}
}

Abstract

We give a tutorial overview of several geometric methods for feature extraction and dimensional reduction. We divide the methods into projective methods and methods that model the manifold on which the data lies. For projective methods, we review projection pursuit, principal component analysis (PCA), kernel PCA, probabilistic PCA, and oriented PCA; and for the manifold methods, we review multidimensional scaling (MDS), landmark MDS, Isomap, locally linear embedding, Laplacian eigenmaps and spectral…
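
The page itself contains no code, but as a quick illustration of the simplest projective method the survey reviews, here is a minimal PCA sketch in Python/NumPy. This is our own illustration under the standard textbook formulation (center the data, eigendecompose the covariance, project onto the leading eigenvectors); the function name `pca` and the toy data are not taken from the paper.

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto its top-k principal components."""
    # Center the data: PCA diagonalizes the covariance of the centered data.
    Xc = X - X.mean(axis=0)
    # Sample covariance matrix (features x features).
    C = np.cov(Xc, rowvar=False)
    # C is symmetric, so eigh applies; it returns eigenvalues in ascending order.
    eigvals, eigvecs = np.linalg.eigh(C)
    # Columns of W are the k leading eigenvectors (largest variance first).
    W = eigvecs[:, np.argsort(eigvals)[::-1][:k]]
    return Xc @ W

# Toy usage: embed 100 points from R^5 into R^2.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca(X, k=2)
print(Z.shape)  # (100, 2)
```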


Citations

Publications citing this paper (showing 1-10 of 36):

  • Multiproject–multicenter evaluation of automatic brain tumor classification by magnetic resonance spectroscopy
  • Exon Structure Analysis via PCA and ICA of Short-Time Fourier Transform
  • Essays in high-dimensional nonlinear time series analysis
  • Original approach for reduction of high dimensionality in unsupervised learning
  • FPGA-Based Fully Parallel PCA-ANN for Spectrum Sensing

References

Publications referenced by this paper (showing 1-10 of 48):

  • Matrix analysis
  • Juha Karhunen. Principal component neural networks — Theory and applications. Pattern Analysis and Applications, 2005.
  • Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
  • Independent Component Analysis
  • Learning Segmentation by Random Walks
  • Nonlinear dimensionality reduction by locally linear embedding
  • Normalized Cuts and Image Segmentation
  • Asymptotics of Graphical Projection Pursuit
  • Some Notes on Applied Mathematics for Machine Learning