Learning a kernel matrix for nonlinear dimensionality reduction

@inproceedings{Weinberger2004LearningAK,
  title={Learning a kernel matrix for nonlinear dimensionality reduction},
  author={Kilian Q. Weinberger and Fei Sha and Lawrence K. Saul},
  booktitle={ICML},
  year={2004}
}
We investigate how to learn a kernel matrix for high dimensional data that lies on or near a low dimensional manifold. Noting that the kernel matrix implicitly maps the data into a nonlinear feature space, we show how to discover a mapping that "unfolds" the underlying manifold from which the data was sampled. The kernel matrix is constructed by maximizing the variance in feature space subject to local constraints that preserve the angles and distances between nearest neighbors. The main optimization involves an instance of semidefinite programming.
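The construction described in the abstract can be posed directly as a semidefinite program: maximize the trace of the kernel matrix (total variance in feature space) subject to positive semidefiniteness, a centering constraint, and preservation of local distances. The following is a minimal sketch of that idea, not the authors' implementation: it assumes cvxpy and scikit-learn are available, the helper names (unfold_kernel, embed) are hypothetical, and for brevity it constrains only distances between k-nearest-neighbor pairs rather than all pairs within each neighborhood, so angle preservation is only approximate.

import numpy as np
import cvxpy as cp
from sklearn.neighbors import kneighbors_graph

def unfold_kernel(X, n_neighbors=4):
    # Learn a kernel (Gram) matrix K that "unfolds" the manifold:
    # maximize total variance trace(K) while preserving distances
    # between nearest neighbors and centering the feature vectors.
    n = X.shape[0]
    G = kneighbors_graph(X, n_neighbors, mode="connectivity")
    G = G.maximum(G.T)  # symmetrize the neighborhood graph

    K = cp.Variable((n, n), PSD=True)   # kernel matrix, positive semidefinite
    constraints = [cp.sum(K) == 0]      # center the embedding in feature space
    rows, cols = G.nonzero()
    for i, j in zip(rows, cols):
        if i < j:
            d2 = float(np.sum((X[i] - X[j]) ** 2))
            # preserve the squared distance between neighbors in feature space
            constraints.append(K[i, i] - 2 * K[i, j] + K[j, j] == d2)

    prob = cp.Problem(cp.Maximize(cp.trace(K)), constraints)
    prob.solve()
    return K.value

def embed(K, dim=2):
    # Recover a low-dimensional embedding from the learned kernel
    # via its top eigenvectors, as in kernel PCA.
    w, V = np.linalg.eigh(K)
    order = np.argsort(w)[::-1][:dim]
    return V[:, order] * np.sqrt(np.maximum(w[order], 0.0))

# Illustrative usage: given an (n, d) array X sampled from a curved manifold,
# Y = embed(unfold_kernel(X, n_neighbors=5), dim=2) gives a 2D unfolding.

Because generic SDP solvers scale poorly with the number of points, a sketch like this is practical only for small data sets; the point here is just to make the optimization in the abstract concrete.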
Highly Influential: this paper has highly influenced 40 other papers.
Highly Cited: this paper has 711 citations (272 extracted) and 8 references.

Citations

Publications citing this paper: 272 extracted citations. Semantic Scholar estimates that this publication has 711 citations based on the available data.

[Figure: citations per year, 2005–2017.]
