Deep Clustering with Concrete K-Means

@article{Gao2020DeepCC,
  title={Deep Clustering with Concrete K-Means},
  author={Boyan Gao and Yongxin Yang and Henry Gouk and Timothy M. Hospedales},
  journal={ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  year={2020},
  pages={4252-4256}
}
We address the problem of simultaneously learning a k-means clustering and deep feature representation from unlabelled data, which is of interest due to the potential for deep k-means to outperform traditional two-step feature extraction and shallow clustering strategies. We achieve this by developing a gradient estimator for the non-differentiable k-means objective via the Gumbel-Softmax reparameterisation trick. In contrast to previous attempts at deep clustering, our concrete k-means model…
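
The abstract describes making the hard cluster-assignment step of k-means differentiable via the Gumbel-Softmax (Concrete) reparameterisation, so the feature encoder and the centroids can be trained jointly by gradient descent. Below is a minimal PyTorch sketch of that general idea, not the authors' implementation: the class name ConcreteKMeans, the encoder architecture, and hyperparameters such as tau are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ConcreteKMeans(nn.Module):
    def __init__(self, input_dim=784, embed_dim=10, n_clusters=10, tau=1.0):
        super().__init__()
        # Encoder producing the deep feature representation.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256), nn.ReLU(),
            nn.Linear(256, embed_dim),
        )
        # Learnable cluster centroids in the embedding space.
        self.centroids = nn.Parameter(torch.randn(n_clusters, embed_dim))
        self.tau = tau

    def forward(self, x):
        z = self.encoder(x)                          # (B, D) embeddings
        dist2 = torch.cdist(z, self.centroids) ** 2  # (B, K) squared distances
        logits = -dist2                              # closer centroid => larger logit
        # Gumbel-Softmax gives a differentiable, near-one-hot assignment;
        # hard=True uses the straight-through estimator in the backward pass.
        assign = F.gumbel_softmax(logits, tau=self.tau, hard=True)  # (B, K)
        # k-means objective: distance of each point to its assigned centroid.
        loss = (assign * dist2).sum(dim=1).mean()
        return loss

# Usage sketch: optimise encoder and centroids jointly on unlabelled data.
model = ConcreteKMeans()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 784)   # placeholder batch of unlabelled inputs
opt.zero_grad()
loss = model(x)
loss.backward()
opt.step()

The key design choice this illustrates is replacing the argmax cluster assignment with a relaxed sample, so gradients flow through the assignment back into both the centroids and the encoder.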
