Corpus ID: 235377159

Multi-Facet Clustering Variational Autoencoders

Fabian Falck, Haoting Zhang, Matthew Willetts, George Nicholson, Christopher Yau, Christopher C. Holmes
Work in deep clustering focuses on finding a single partition of data. However, high-dimensional data, such as images, typically feature multiple interesting characteristics one could cluster over. For example, images of objects against a background could be clustered over the shape of the object and separately by the colour of the background. In this paper, we introduce Multi-Facet Clustering Variational Autoencoders (MFCVAE), a novel class of variational autoencoders with a hierarchy of…
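The multi-facet idea described above (one discrete clustering per facet, so that e.g. object shape and background colour are clustered separately) can be sketched as a generative draw in a few lines of numpy. This is an illustrative toy under assumed settings, not the paper's implementation: the facet sizes, mixture weights, and the 0.1 scale are made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical facet specs: facet 0 could capture object shape (3 clusters),
# facet 1 background colour (4 clusters).  Each facet has its own
# mixture-of-Gaussians prior over a small latent sub-space.
facets = [
    {"pi": np.array([0.5, 0.3, 0.2]), "mu": rng.normal(size=(3, 2))},
    {"pi": np.array([0.25, 0.25, 0.25, 0.25]), "mu": rng.normal(size=(4, 2))},
]

def sample_multifacet_latent(facets, rng):
    """Sample one latent per facet: pick a cluster, then draw z around its mean."""
    zs, cs = [], []
    for f in facets:
        c = rng.choice(len(f["pi"]), p=f["pi"])    # cluster index for this facet
        z = rng.normal(loc=f["mu"][c], scale=0.1)  # Gaussian draw within that cluster
        cs.append(c)
        zs.append(z)
    return np.concatenate(zs), cs

z, cs = sample_multifacet_latent(facets, rng)
print(z.shape, cs)  # 4-dim latent (2 per facet) plus one cluster label per facet
```

The point of the structure is that each facet yields its own partition of the data, rather than one global clustering.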
I Don't Need u: Identifiable Non-Linear ICA Without Side Information
This work focuses on generative models which perform clustering in their latent space, a model structure which matches previous identifiable models, but with the learnt clustering providing a synthetic form of auxiliary information.


Disentangling to Cluster: Gaussian Mixture Variational Ladder Autoencoders
This work proposes a clustering algorithm, VLAC, that outperforms a Gaussian Mixture DGM in cluster accuracy over digit identity on the test set of SVHN and demonstrates learning clusters jointly over numerous layers of the hierarchy of latent variables for the data.
Towards K-means-friendly Spaces: Simultaneous Deep Learning and Clustering
A joint DR and K-means clustering approach is proposed, in which DR is accomplished by learning a deep neural network (DNN) while exploiting the DNN's ability to approximate any nonlinear function.
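The joint objective behind such DR-plus-K-means approaches can be sketched as follows. This is a hedged numpy toy, not the paper's method: a random linear map stands in for the DNN encoder, the reconstruction term is omitted, and `lam` is a hypothetical trade-off weight.

```python
import numpy as np

rng = np.random.default_rng(1)

X = rng.normal(size=(100, 10))  # toy data
W = rng.normal(size=(10, 3))    # "encoder" weights standing in for the DNN
M = rng.normal(size=(3, 4))     # k = 4 centroids, one per column, in the 3-d embedding

def joint_loss(X, W, M, lam=0.5):
    """Embed the data, assign each point to its nearest centroid, and
    penalise the squared distance to that centroid (the K-means term
    of the joint objective; reconstruction omitted for brevity)."""
    Z = X @ W                                                # embeddings f(X)
    d = ((Z[:, :, None] - M[None, :, :]) ** 2).sum(axis=1)   # squared dists to all centroids
    assign = d.argmin(axis=1)                                # hard cluster assignments
    kmeans_term = d[np.arange(len(Z)), assign].mean()
    return lam * kmeans_term, assign

loss, assign = joint_loss(X, W, M)
print(round(float(loss), 3), np.bincount(assign, minlength=4))
```

In the full approach, this term is minimised jointly with the network's reconstruction loss, alternating with centroid and assignment updates.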
Variational Deep Embedding: An Unsupervised and Generative Approach to Clustering
Variational Deep Embedding (VaDE) is proposed, a novel unsupervised generative clustering approach within the framework of the Variational Auto-Encoder (VAE), which can generate highly realistic samples for any specified cluster without using supervised information during training.
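A concrete piece of VaDE-style machinery is the soft cluster assignment induced by a Gaussian-mixture prior over the latent space. The sketch below is a generic mixture-responsibility computation in numpy with made-up parameters; it is an assumption-laden illustration, not code from VaDE.

```python
import numpy as np

def vade_responsibilities(z, pi, mu, sigma):
    """q(c|x) ~ p(c|z) proportional to pi_c * N(z; mu_c, diag(sigma_c^2)).
    Returns a vector summing to 1: a soft cluster assignment for latent z."""
    # log N(z; mu_c, diag(sigma_c^2)), up to the shared constant term
    log_norm = -0.5 * (((z - mu) / sigma) ** 2).sum(axis=1) - np.log(sigma).sum(axis=1)
    log_w = np.log(pi) + log_norm
    log_w -= log_w.max()          # subtract max for numerical stability
    w = np.exp(log_w)
    return w / w.sum()

# Hypothetical 2-cluster example in a 2-d latent space.
pi = np.array([0.5, 0.5])
mu = np.array([[0.0, 0.0], [5.0, 5.0]])
sigma = np.ones((2, 2))
r = vade_responsibilities(np.array([4.8, 5.1]), pi, mu, sigma)
print(r)  # the cluster centred at (5, 5) dominates
```

Generating samples "for any specified cluster" then amounts to drawing z from the chosen mixture component and decoding it.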
Deep Spectral Clustering Using Dual Autoencoder Network
A joint learning framework for discriminative embedding and spectral clustering is proposed, which significantly outperforms state-of-the-art clustering approaches and is more robust to noise.
Learning Latent Superstructures in Variational Autoencoders for Deep Multidimensional Clustering
This work investigates a variant of variational autoencoders with a superstructure of discrete latent variables on top of the latent features: a tree structure of multiple super latent variables that is automatically learned from data.
beta-VAE: Learning Basic Visual Concepts with a Constrained Variational Framework
Learning an interpretable factorised representation of the independent data generative factors of the world without supervision is an important precursor for the development of artificial…
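The "constrained variational framework" amounts to upweighting the KL term of the VAE objective by a factor beta > 1. A minimal numpy sketch using the standard closed-form KL between a diagonal Gaussian and N(0, I); beta = 4 is an arbitrary illustrative value, not a recommendation from the paper.

```python
import numpy as np

def kl_diag_gauss_to_std(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), closed form."""
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)

def beta_vae_loss(recon_err, mu, logvar, beta=4.0):
    """beta > 1 upweights the KL term, pressuring the approximate posterior
    toward the factorised prior and, empirically, toward disentangled latents."""
    return recon_err + beta * kl_diag_gauss_to_std(mu, logvar)

# When the posterior equals the prior (mu = 0, logvar = 0) the KL vanishes.
print(kl_diag_gauss_to_std(np.zeros(3), np.zeros(3)))  # 0.0
```

With beta = 1 this reduces to the standard (negative) ELBO; larger beta trades reconstruction quality for a more factorised representation.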
SpectralNet: Spectral Clustering using Deep Neural Networks
A deep learning approach to spectral clustering that overcomes the major limitations of scalability and generalization of the spectral embedding and applies VC dimension theory to derive a lower bound on the size of SpectralNet.
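For contrast with the learned map, the classical spectral embedding that SpectralNet approximates at scale can be written directly. A small numpy sketch assuming a Gaussian affinity with a hand-picked bandwidth `sigma`; this is the textbook computation, not SpectralNet itself.

```python
import numpy as np

def spectral_embedding(X, k, sigma=1.0):
    """Classical spectral embedding: Gaussian affinities -> symmetric
    normalised Laplacian -> eigenvectors of the k smallest eigenvalues.
    K-means on the rows of the result gives spectral clustering."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    W = np.exp(-d2 / (2 * sigma**2))                      # Gaussian affinity matrix
    np.fill_diagonal(W, 0.0)
    D = W.sum(axis=1)                                     # degrees
    # L = I - D^{-1/2} W D^{-1/2}
    L = np.eye(len(X)) - W / np.sqrt(D)[:, None] / np.sqrt(D)[None, :]
    vals, vecs = np.linalg.eigh(L)                        # ascending eigenvalues
    return vecs[:, :k]                                    # one k-dim row per point

X = np.random.default_rng(2).normal(size=(12, 3))
E = spectral_embedding(X, k=2)
print(E.shape)
```

The scalability limitation is visible here: the affinity matrix and eigendecomposition are O(n^2) and worse, which is what motivates learning the embedding map with a network instead.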
Nonparametric Variational Auto-Encoders for Hierarchical Representation Learning
This work proposes hierarchical non-parametric variational autoencoders, which combine tree-structured Bayesian nonparametric priors with VAEs to enable infinite flexibility of the latent representation space.
ClusterGAN : Latent Space Clustering in Generative Adversarial Networks
The results show a remarkable phenomenon that GANs can preserve latent space interpolation across categories, even though the discriminator is never exposed to such vectors.
Towards Hierarchical Discrete Variational Autoencoders
The Hierarchical Discrete Variational Autoencoder (HD-VAE) is introduced: a hierarchy of variational memory layers in which the Concrete/Gumbel-Softmax relaxation allows maximizing a surrogate of the Evidence Lower Bound by stochastic gradient ascent.
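The Concrete/Gumbel-Softmax relaxation mentioned here draws approximately one-hot samples that remain differentiable in the logits, which is what makes gradient ascent through discrete latents possible. A minimal numpy sketch; the temperatures below are arbitrary illustrative values.

```python
import numpy as np

def gumbel_softmax(logits, tau, rng):
    """Concrete / Gumbel-Softmax relaxation: a differentiable, approximately
    categorical sample from Categorical(softmax(logits)).  Smaller tau pushes
    the sample closer to a discrete one-hot vector."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0, 1) noise
    y = (logits + g) / tau
    y -= y.max()                                          # numerical stability
    e = np.exp(y)
    return e / e.sum()

rng = np.random.default_rng(0)
logits = np.log([0.2, 0.3, 0.5])
y_soft = gumbel_softmax(logits, tau=5.0, rng=rng)   # high temperature: smooth
y_hard = gumbel_softmax(logits, tau=0.01, rng=rng)  # low temperature: typically near one-hot
print(y_soft, y_hard)
```

In training one anneals tau downward, so early gradients are smooth while late samples behave almost like discrete choices.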