# On Spectral Clustering: Analysis and an algorithm

@inproceedings{Ng2001OnSC, title={On Spectral Clustering: Analysis and an algorithm}, author={A. Ng and Michael I. Jordan and Yair Weiss}, booktitle={NIPS}, year={2001} }

Despite many empirical successes of spectral clustering methods (algorithms that cluster points using eigenvectors of matrices derived from the data), there are several unresolved issues. [...] Using tools from matrix perturbation theory, we analyze the algorithm and give conditions under which it can be expected to do well. We also show surprisingly good experimental results on a number of challenging clustering problems.
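The algorithm the abstract refers to is short enough to sketch end to end. Below is a minimal numpy version of the standard Ng–Jordan–Weiss pipeline (Gaussian affinity with zero diagonal, symmetric normalization, top-k eigenvectors, row renormalization, k-means); the function name, the kernel width `sigma`, and the deterministic farthest-point k-means initialization are illustrative choices, not details taken from the paper's experiments:

```python
import numpy as np

def njw_spectral_clustering(X, k, sigma=1.0, n_iter=50):
    # Step 1: Gaussian affinity matrix with zero diagonal.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    A = np.exp(-sq / (2.0 * sigma ** 2))
    np.fill_diagonal(A, 0.0)

    # Step 2: symmetric normalization L = D^{-1/2} A D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    L = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

    # Step 3: stack the k largest eigenvectors as columns
    # (np.linalg.eigh returns eigenvalues in ascending order).
    _, vecs = np.linalg.eigh(L)
    Y = vecs[:, -k:]

    # Step 4: renormalize each row of Y to unit length.
    Y = Y / np.linalg.norm(Y, axis=1, keepdims=True)

    # Step 5: k-means on the rows of Y (farthest-point init for
    # determinism; the original recipe just calls generic k-means).
    centers = Y[[0]]
    for _ in range(1, k):
        d2 = ((Y[:, None] - centers[None]) ** 2).sum(-1).min(axis=1)
        centers = np.vstack([centers, Y[d2.argmax()]])
    for _ in range(n_iter):
        labels = ((Y[:, None] - centers[None]) ** 2).sum(-1).argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = Y[labels == j].mean(axis=0)
    return labels
```

On well-separated data the rows of `Y` collapse to k tight, nearly orthogonal points on the unit sphere, which is exactly the structure the paper's perturbation analysis establishes and why the final k-means step is easy.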


## 8,504 Citations

A Note on Spectral Clustering Method Based on Normalized Cut Criterion

- Computer Science · 2009 Chinese Conference on Pattern Recognition
- 2009

A note is given on why the first k eigenvectors are chosen in the algorithm, and on the conditions on indicator vectors under which the clustering problem reduces to minimizing the objective function of the spectral clustering method based on the normalized-cut criterion.
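For reference, the normalized-cut objective this entry refers to is the standard Shi–Malik two-way criterion for a partition (A, B) of a weighted graph:

```latex
\mathrm{Ncut}(A,B)
  = \frac{\mathrm{cut}(A,B)}{\mathrm{vol}(A)}
  + \frac{\mathrm{cut}(A,B)}{\mathrm{vol}(B)},
\qquad
\mathrm{cut}(A,B) = \sum_{i \in A,\; j \in B} w_{ij},
\qquad
\mathrm{vol}(A) = \sum_{i \in A} d_i
```

Relaxing the discrete indicator vectors in this objective to real values is what leads to the generalized eigenvector problems that spectral methods solve.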

A tutorial on spectral clustering

- Computer Science · Stat. Comput.
- 2007

This tutorial describes different graph Laplacians and their basic properties, presents the most common spectral clustering algorithms, and derives those algorithms from scratch by several different approaches.
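The three graph Laplacians the tutorial compares can be written down in a few lines; a minimal numpy sketch for an arbitrary symmetric nonnegative weight matrix W (the function name is illustrative):

```python
import numpy as np

def graph_laplacians(W):
    """Return the three standard graph Laplacians for a symmetric
    nonnegative weight matrix W: unnormalized L = D - W, symmetric
    L_sym = D^{-1/2} L D^{-1/2}, and random-walk L_rw = D^{-1} L."""
    d = W.sum(axis=1)                      # vertex degrees
    L = np.diag(d) - W                     # unnormalized Laplacian
    d_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_sym = d_inv_sqrt @ L @ d_inv_sqrt    # symmetric normalization
    L_rw = np.diag(1.0 / d) @ L            # random-walk normalization
    return L, L_sym, L_rw
```

All three share the eigenvalue 0 (with the constant vector, suitably scaled, as eigenvector), and which normalization is used is exactly what separates the algorithm families the tutorial discusses.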

New Methods for Spectral Clustering.

- Computer Science
- 2004

This work proposes three new algorithmic components for improving the performance of spectral clustering by examining the eigenvectors, and one of them turns out to allow a robust automatic determination of the kernel radius σ.

Spectral clustering based on matrix perturbation theory

- Computer Science · Science in China Series F: Information Sciences
- 2007

This paper exposes some intrinsic characteristics of the spectral clustering method using tools from matrix perturbation theory, and shows that the eigenvectors of the weight matrix can be used directly to perform clustering.

Consistency of spectral clustering

- Computer Science
- 2008

It is proved that one of the two major classes of spectral clustering (normalized clustering) converges under very general conditions, while the other is consistent only under strong additional assumptions that are not always satisfied by real data.

A new spectral clustering algorithm for large training sets

- Computer Science · Proceedings of the 2003 International Conference on Machine Learning and Cybernetics (IEEE Cat. No.03EX693)
- 2003

A new algorithm for spectral clustering is presented that can handle numbers of samples impossible to cluster with current approaches; it is based on a "clustering of clusters" technique that combines k-means with spectral clustering.

A randomized algorithm for spectral clustering

- Computer Science, Mathematics · ESANN
- 2010

A bound is derived for choosing the correct number of eigenvectors in a randomized spectral algorithm that finds a clustering solution, and the efficacy of the algorithm is demonstrated in experiments on real-world graphs.

Spectral Clustering with Automatic Cluster-Number Identification via Finding Sparse Eigenvectors

- Computer Science · 2018 26th European Signal Processing Conference (EUSIPCO)
- 2018

Imposing sparsity on the eigenvectors of the graph Laplacian is proposed in order to obtain reasonable approximations of the so-called cluster-indicator vectors, from which both the clusters and the number of clusters are identified.

A self-adaptive spectral clustering algorithm

- Computer Science · 2008 27th Chinese Control Conference
- 2008

It is proved theoretically that the eigenvectors of the affinity matrix can be used directly to cluster the data points, and a self-adaptive spectral clustering algorithm based on the affinity matrix is proposed that is more effective than previous algorithms.

Fast Spectral Clustering via the Nyström Method

- Computer Science · ALT
- 2013

A fast spectral clustering algorithm with computational complexity linear in the number of data points is presented, making it directly applicable to large-scale datasets, and the conditions under which its performance is comparable to spectral clustering with the original graph Laplacian are discussed.
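The Nyström idea behind this kind of linear-in-n method is to approximate the full affinity matrix from a few sampled landmark columns. A hedged numpy sketch of the generic technique (the function name, landmark count `m`, and pseudoinverse-based reconstruction are standard textbook choices, not the cited paper's exact algorithm):

```python
import numpy as np

def nystrom_approx(X, m, sigma=1.0, seed=0):
    """Approximate the n x n Gaussian kernel matrix K using only the
    n x m block of kernel evaluations against m sampled landmarks:
    K ~= C @ pinv(W) @ C.T, with C = K[:, idx] and W = C[idx]."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), m, replace=False)
    sq = ((X[:, None, :] - X[idx][None, :, :]) ** 2).sum(-1)
    C = np.exp(-sq / (2.0 * sigma ** 2))    # n x m kernel block
    W = C[idx]                              # m x m landmark block
    return C @ np.linalg.pinv(W) @ C.T      # rank-<=m approximation
```

Only n·m kernel entries are ever computed, so for m fixed the cost grows linearly in n; when m equals n the reconstruction recovers the exact kernel matrix.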

## References

Showing 1–10 of 14 references.

Spectral Kernel Methods for Clustering

- Computer Science · NIPS
- 2001

This paper introduces new algorithms for unsupervised learning based on the use of a kernel matrix, and shows how the optimal solution can be approximated by slightly relaxing the corresponding optimization problem, and how this corresponds to using eigenvector information.

Spectral Partitioning: The More Eigenvectors, The Better

- Computer Science · 32nd Design Automation Conference
- 1995

This work maps each graph vertex to a vector in d-dimensional space, where d is the number of eigenvectors, such that these vectors constitute an instance of the vector partitioning problem.

Segmentation using eigenvectors: a unifying view

- Computer Science · Proceedings of the Seventh IEEE International Conference on Computer Vision
- 1999

A unified treatment of eigenvectors of block matrices based on eigendecompositions in the context of segmentation is given, and close connections between them are shown while highlighting their distinguishing features.

On clusterings-good, bad and spectral

- Computer Science · Proceedings 41st Annual Symposium on Foundations of Computer Science
- 2000

Two results regarding the quality of the clustering found by a popular spectral algorithm are presented: one provides worst-case guarantees, while the other shows that if a "good" clustering exists, then the spectral algorithm will find one close to it.

Spectral partitioning works: planar graphs and finite element meshes

- Computer Science, Mathematics · Proceedings of 37th Conference on Foundations of Computer Science
- 1996

It is proved that spectral partitioning techniques can be used to produce separators whose ratio of vertices removed to edges cut is O(√n) for bounded-degree planar graphs and two-dimensional meshes, and O(n^(1/d)) for well-shaped d-dimensional meshes.

Feature grouping by 'relocalisation' of eigenvectors of the proximity matrix

- Computer Science · BMVC
- 1990

We describe a widely applicable method of grouping or clustering image features (such as points, lines, corners, flow vectors and the like). It takes as input a "proximity matrix" H, a square…

Learning Segmentation by Random Walks

- Computer Science · NIPS
- 2000

This interpretation shows that spectral methods for clustering and segmentation have a probabilistic foundation and proves that the Normalized Cut method arises naturally from the framework.

Spectral Graph Theory

- Mathematics
- 1996

Eigenvalues and the Laplacian of a graph · Isoperimetric problems · Diameters and eigenvalues · Paths, flows, and routing · Eigenvalues and quasi-randomness · Expanders and explicit constructions · Eigenvalues…