Corpus ID: 236428742

Clustering by Maximizing Mutual Information Across Views

Kien Do, Truyen Tran, Svetha Venkatesh
We propose a novel framework for image clustering that incorporates joint representation learning and clustering. Our method consists of two heads that share the same backbone network: a “representation learning” head and a “clustering” head. The “representation learning” head captures fine-grained patterns of objects at the instance level, which serve as clues for the “clustering” head to extract coarse-grained information that separates objects into clusters. The whole model is trained in an end…
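The two-head design described in the abstract can be sketched minimally. This is an illustrative toy, not the paper's implementation: the random linear "backbone" stands in for a CNN, and the layer sizes and weight names (`W_backbone`, `W_repr`, `W_clus`) are hypothetical. It only shows the structural idea: one shared feature extractor feeding an instance-level embedding head and a softmax clustering head.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical shared backbone: one random linear layer standing in for a CNN.
W_backbone = rng.normal(size=(32, 16))

# "Representation learning" head: continuous instance-level embedding.
W_repr = rng.normal(size=(16, 8))
# "Clustering" head: soft assignment over a fixed number of clusters.
W_clus = rng.normal(size=(16, 4))

x = rng.normal(size=(5, 32))        # a batch of 5 "images" as flat vectors
h = np.tanh(x @ W_backbone)         # shared features from the backbone
z_repr = h @ W_repr                 # fine-grained instance representation
z_clus = softmax(h @ W_clus)        # coarse-grained cluster probabilities

print(z_repr.shape)                          # (5, 8)
print(z_clus.shape)                          # (5, 4)
print(np.allclose(z_clus.sum(axis=1), 1.0))  # True: rows are distributions
```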


Self-labelling via simultaneous clustering and representation learning
The proposed novel and principled learning formulation is able to self-label visual data so as to train highly competitive image representations without manual labels, and yields the first self-supervised AlexNet that outperforms the supervised Pascal VOC detection baseline.
Deep Comprehensive Correlation Mining for Image Clustering
A novel clustering framework, named Deep Comprehensive Correlation Mining (DCCM), for exploring and taking full advantage of various kinds of correlations behind unlabeled data from three aspects: instead of only using pairwise information, pseudo-label supervision is proposed to investigate category information and learn discriminative features.
Joint Unsupervised Learning of Deep Representations and Image Clusters
A recurrent framework for joint unsupervised learning of deep representations and image clusters that integrates the two processes into a single model with a unified weighted triplet loss function; optimized end-to-end, it obtains not only more powerful representations but also more precise image clusters.
Invariant Information Clustering for Unsupervised Image Classification and Segmentation
We present a novel clustering objective that learns a neural network classifier from scratch, given only unlabelled data samples. The model discovers clusters that accurately match semantic classes…
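The invariant-information idea behind this line of work is maximizing the mutual information between cluster assignments of two views of the same image. A minimal numpy sketch of that quantity, assuming soft assignment matrices `p` and `q` (one row per sample, one column per cluster) as inputs:

```python
import numpy as np

def mutual_information(p, q, eps=1e-12):
    """I(z, z') from paired soft cluster assignments p, q (n x k each):
    build the joint P by averaging outer products over the batch,
    symmetrize it, then sum P * log(P / (P_row * P_col))."""
    joint = p.T @ q / p.shape[0]           # k x k joint distribution
    joint = (joint + joint.T) / 2          # symmetrize
    pr = joint.sum(axis=1, keepdims=True)  # row marginal
    pc = joint.sum(axis=0, keepdims=True)  # column marginal
    return float((joint * (np.log(joint + eps)
                           - np.log(pr + eps)
                           - np.log(pc + eps))).sum())

# Identical one-hot assignments give maximal agreement: I = log k.
p = np.eye(3)[np.array([0, 1, 2, 0, 1, 2])]
mi = mutual_information(p, p)
print(round(mi, 4))  # ≈ log(3) ≈ 1.0986
```

Maximizing this objective rewards assignments that agree across views while the marginals stay spread out, which discourages collapsing all samples into one cluster.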
Deep Adaptive Image Clustering
Deep Adaptive Clustering (DAC) is proposed, which recasts the clustering problem into a binary pairwise-classification framework that judges whether pairs of images belong to the same cluster, overcoming the main challenge that ground-truth similarities are unknown in image clustering.
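The pairwise recasting can be illustrated with a small sketch. The function name `select_pairs` and the thresholds are hypothetical, not DAC's actual values; the point is the mechanism: cosine similarities between label features, with confident high-similarity pairs treated as same-cluster positives and confident low-similarity pairs as negatives, everything in between ignored.

```python
import numpy as np

def select_pairs(label_features, upper=0.95, lower=0.3):
    """Hypothetical DAC-style pair selection: compute cosine similarities
    between L2-normalized label features; pairs above `upper` become
    same-cluster positives, pairs below `lower` become negatives."""
    f = label_features / np.linalg.norm(label_features, axis=1, keepdims=True)
    sim = f @ f.T
    positives = sim > upper
    negatives = sim < lower
    return sim, positives, negatives

rng = np.random.default_rng(1)
feats = rng.normal(size=(6, 4))
sim, pos, neg = select_pairs(feats)
print(pos.diagonal().all())   # True: every sample is a positive with itself
```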
Deep Semantic Clustering by Partition Confidence Maximisation
This work introduces a novel deep clustering method named PartItion Confidence mAximisation (PICA), established on the idea of learning the most semantically plausible data separation, in which all clusters can be mapped to the ground-truth classes one-to-one by maximising the "global" partition confidence of the clustering solution.
Deep Robust Clustering by Contrastive Learning
Deep Robust Clustering (DRC) is proposed, a general framework that can turn maximizing mutual information into minimizing a contrastive loss by investigating the internal relationship between mutual information and contrastive learning.
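The mutual-information/contrastive connection mentioned here is commonly illustrated with an InfoNCE-style loss, whose minimization maximizes a lower bound on the mutual information between views. This sketch is a generic InfoNCE in numpy, not DRC's specific loss; the temperature value is an assumption.

```python
import numpy as np

def info_nce(z1, z2, temperature=0.5):
    """InfoNCE-style contrastive loss between two batches of view
    embeddings; row i of z1 and row i of z2 are the positive pair."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature            # n x n similarity matrix
    logits -= logits.max(axis=1, keepdims=True) # stabilize the softmax
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))   # positives on the diagonal

# Orthogonal embeddings paired with themselves: positives dominate.
z = np.eye(8)
print(round(info_nce(z, z), 4))  # log(1 + 7*e^-2) ≈ 0.6666
```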
Towards K-means-friendly Spaces: Simultaneous Deep Learning and Clustering
A joint dimensionality reduction (DR) and K-means clustering approach is proposed in which DR is accomplished by learning a deep neural network (DNN), exploiting the DNN's ability to approximate any nonlinear function.
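One half of that joint scheme, the K-means step run on learned features, can be sketched as follows. This is a plain K-means on a toy 2-D "feature space"; in the full method the DNN would also be updated to pull features toward their assigned centroids, which is omitted here.

```python
import numpy as np

def kmeans_on_features(h, k, iters=10, seed=0):
    """Plain K-means in a learned feature space h (n x d): alternate
    nearest-centroid assignment and centroid re-estimation."""
    rng = np.random.default_rng(seed)
    centroids = h[rng.choice(len(h), size=k, replace=False)].copy()
    for _ in range(iters):
        d = ((h[:, None, :] - centroids[None]) ** 2).sum(-1)  # squared dists
        labels = d.argmin(axis=1)                             # assign step
        for j in range(k):                                    # update step
            if (labels == j).any():
                centroids[j] = h[labels == j].mean(axis=0)
    return labels, centroids

# Two well-separated blobs standing in for DNN features.
rng = np.random.default_rng(3)
h = np.concatenate([rng.normal(0, 0.1, (10, 2)),
                    rng.normal(5, 0.1, (10, 2))])
labels, _ = kmeans_on_features(h, k=2)
print(len(set(labels[:10])) == 1 and len(set(labels[10:])) == 1)
```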
Online Deep Clustering for Unsupervised Representation Learning
This work proposes Online Deep Clustering (ODC), which performs clustering and network update simultaneously rather than alternately, and designs and maintains two dynamic memory modules: a samples memory to store samples' labels and features, and a centroids memory for centroids evolution.
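The two-memory idea can be sketched in a few lines. The class name, the moving-average update rule, and the momentum value below are all assumptions for illustration, not ODC's actual update equations; the sketch only shows the structure of per-sample label/feature storage plus online centroid evolution without full re-clustering.

```python
import numpy as np

class ClusterMemory:
    """Hypothetical sketch of ODC-style memories: a samples memory holding
    each sample's feature and pseudo-label, and a centroids memory updated
    online by exponential moving average."""
    def __init__(self, features, labels, k, momentum=0.5):
        self.features = features.copy()   # samples memory: features
        self.labels = labels.copy()       # samples memory: pseudo-labels
        self.centroids = np.stack([features[labels == j].mean(0)
                                   for j in range(k)])
        self.m = momentum

    def update(self, idx, new_feature):
        # Refresh one sample's feature, reassign it to the nearest
        # centroid, and nudge that centroid toward the new feature.
        self.features[idx] = new_feature
        d = ((self.centroids - new_feature) ** 2).sum(1)
        j = int(d.argmin())
        self.labels[idx] = j
        self.centroids[j] = (self.m * self.centroids[j]
                             + (1 - self.m) * new_feature)
        return j

feats = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
mem = ClusterMemory(feats, np.array([0, 0, 1, 1]), k=2)
print(mem.update(0, np.array([0.2, 0.1])))  # 0: stays in cluster 0
```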
An Analysis of Single-Layer Networks in Unsupervised Feature Learning
The results show that large numbers of hidden nodes and dense feature extraction are critical to achieving high performance; so critical, in fact, that when these parameters are pushed to their limits, a single layer of features achieves state-of-the-art performance on both CIFAR-10 and NORB.