Spectral methods in machine learning and new strategies for very large datasets.

@article{Belabbas2009SpectralMI,
  title={Spectral methods in machine learning and new strategies for very large datasets},
  author={Mohamed-Ali Belabbas and Patrick J. Wolfe},
  journal={Proceedings of the National Academy of Sciences of the United States of America},
  year={2009},
  volume={106},
  number={2},
  pages={369--374}
}
Spectral methods are of fundamental importance in statistics and machine learning, because they underlie algorithms from classical principal components analysis to more recent approaches that exploit manifold structure. In most cases, the core technical problem can be reduced to computing a low-rank approximation to a positive-definite kernel. For the growing number of applications dealing with very large or high-dimensional datasets, however, the optimal approximation afforded by an exact…
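The core problem the abstract describes, a low-rank approximation to a positive-definite kernel from a subset of its columns, can be sketched with the Nyström method. The RBF kernel, uniform column sampling, and all parameter values below are illustrative assumptions for the sketch, not the sampling scheme proposed by the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Gaussian (RBF) kernel from pairwise squared distances.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def nystrom(X, k, gamma=0.1, seed=0):
    """Rank-k Nystrom approximation G_hat = C W^+ C^T of the kernel matrix.

    C holds k sampled columns of the full kernel; W is the k-by-k block
    of the kernel restricted to the sampled points.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=k, replace=False)  # uniform sampling (illustrative)
    C = rbf_kernel(X, X[idx], gamma)                 # n x k sampled columns
    W = C[idx]                                       # k x k landmark block
    return C @ np.linalg.pinv(W) @ C.T

# Compare against the exact kernel on synthetic data.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
G = rbf_kernel(X, X)
G_hat = nystrom(X, 50)
err = np.linalg.norm(G - G_hat) / np.linalg.norm(G)  # relative Frobenius error
```

Only the n-by-k matrix C and the k-by-k block W are ever formed from kernel evaluations, which is what makes the approach attractive when n is too large to materialize the full n-by-n kernel.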
This paper has highly influenced 10 other papers.
This paper has 126 citations.
This paper has been referenced on Twitter 3 times.


Citations

Publications citing this paper.
Showing 4 of 75 extracted citations.

Kernel K-Means Sampling for Nyström Approximation. IEEE Transactions on Image Processing, 2018.

A review of Nyström methods for large-scale machine learning. Information Fusion, 2015.

A kernelized maximal-figure-of-merit learning approach based on subspace distance minimization. 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2011.

Clusterability Analysis and Incremental Sampling for Nyström Extension Based Spectral Clustering. 2011 IEEE 11th International Conference on Data Mining, 2011.

Citations per Year

Semantic Scholar estimates that this publication has 127 citations based on the available data.

References

Publications referenced by this paper.
Showing 7 of 17 references.

Hessian eigenmaps: locally linear embedding techniques for high-dimensional data. Proceedings of the National Academy of Sciences of the United States of America, 2003.

R. A. Horn and C. R. Johnson. Matrix Analysis. Cambridge University Press, 1999.

Fast Low-Rank Approximation for Covariance Matrices. 2007 2nd IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, 2007.

Randomized algorithms for the low-rank approximation of matrices. Proceedings of the National Academy of Sciences of the United States of America, 2007.

Improved Approximation Algorithms for Large Matrices via Random Projections. 2006 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS'06), 2006.

Geometric diffusions as a tool for harmonic analysis and structure definition of data: diffusion maps. Proceedings of the National Academy of Sciences of the United States of America, 2005.

C. P. Robert and G. Casella. Monte Carlo Statistical Methods, 2nd Ed. Springer, New York, 2004.
