Corpus ID: 59158890

Deep Clustering with a Dynamic Autoencoder

@article{Mrabah2019DeepCW,
  title={Deep Clustering with a Dynamic Autoencoder},
  author={Nairouz Mrabah and Naimul Mefraz Khan and Riadh Ksantini},
  journal={ArXiv},
  year={2019},
  volume={abs/1901.07752}
}
In unsupervised learning, there is no obvious straightforward loss function that can capture the major factors of variation and similarity. Since natural systems have smooth dynamics, an opportunity is lost if an unsupervised loss function remains static during training. The absence of concrete supervision suggests that smooth, complex dynamics should be integrated as a substitute for classical static loss functions, to better make use of the gradual and uncertain knowledge…
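A minimal sketch of the general idea (not the authors' exact formulation): an autoencoder whose loss smoothly shifts from pure reconstruction toward a clustering objective as training progresses. The network sizes, the placeholder centroids, and the linear schedule below are illustrative assumptions.

```python
# Sketch (PyTorch): training loss that moves dynamically from reconstruction
# to clustering. Everything here is a toy stand-in, not the paper's method.
import torch
import torch.nn as nn

enc = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
dec = nn.Sequential(nn.Linear(10, 256), nn.ReLU(), nn.Linear(256, 784))
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
centroids = torch.randn(10, 10)      # placeholder cluster centers

x = torch.rand(64, 784)              # one toy batch
epochs = 100
for epoch in range(epochs):
    z = enc(x)
    recon = ((dec(z) - x) ** 2).mean()
    # clustering term: pull each embedding toward its nearest centroid
    d = torch.cdist(z, centroids)            # (batch, n_clusters)
    cluster = d.min(dim=1).values.mean()
    w = epoch / (epochs - 1)                 # 0 -> 1: the "dynamic" schedule
    loss = (1 - w) * recon + w * cluster
    opt.zero_grad(); loss.backward(); opt.step()
```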

Citations

Clustering with Deep Neural Networks – An Overview of Recent Methods
The application of clustering has always been an important method for problem-solving. As technology advances, Deep Learning in particular enables new methods of clustering. This paper…
Adversarial Deep Embedded Clustering: on a better trade-off between Feature Randomness and Feature Drift
ADEC (Adversarial Deep Embedded Clustering), a novel autoencoder-based clustering model, is proposed to address the dual problem of Feature Randomness and Feature Drift using adversarial training; its suitability for handling these problems is demonstrated empirically on benchmark real datasets.
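The entry names adversarial training as the regularizer; a hedged sketch of that general pattern (a critic distinguishing data from reconstructions while the autoencoder tries to fool it) might look as follows. The architectures and loss weights are illustrative, not ADEC's actual design.

```python
# Sketch (PyTorch): critic-regularized autoencoder training. Toy sizes/weights.
import torch
import torch.nn as nn

ae = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 784))
critic = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 1))
opt_ae = torch.optim.Adam(ae.parameters(), lr=1e-3)
opt_c = torch.optim.Adam(critic.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

x = torch.rand(32, 784)
for _ in range(10):
    # critic step: real data -> 1, reconstructions -> 0
    x_hat = ae(x).detach()
    loss_c = bce(critic(x), torch.ones(32, 1)) + bce(critic(x_hat), torch.zeros(32, 1))
    opt_c.zero_grad(); loss_c.backward(); opt_c.step()
    # autoencoder step: reconstruct while fooling the critic
    x_hat = ae(x)
    loss_ae = ((x_hat - x) ** 2).mean() + 0.1 * bce(critic(x_hat), torch.ones(32, 1))
    opt_ae.zero_grad(); loss_ae.backward(); opt_ae.step()
```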
Generative Deep-Neural-Network Mixture Modeling with Semi-Supervised MinMax+EM Learning
A novel statistical framework for a DNN mixture model using a single generative adversarial network is proposed, together with a novel data-likelihood term that relies on a well-regularized, constrained Gaussian mixture model in the latent space and a prior term on the DNN weights.
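As a rough illustration of a Gaussian-mixture data-likelihood term evaluated in a latent space, the snippet below computes a diagonal-GMM log-likelihood; all parameters are random placeholders, and none of this reflects the paper's specific MinMax+EM scheme.

```python
# Sketch: diagonal-GMM log-likelihood of latent codes. Placeholder parameters.
import torch

K, D, N = 5, 16, 64
z = torch.randn(N, D)                          # latent codes
pi = torch.softmax(torch.randn(K), dim=0)      # mixture weights
mu, sigma = torch.randn(K, D), torch.ones(K, D)

dist = torch.distributions.Normal(mu, sigma)   # K independent diagonal Gaussians
log_comp = dist.log_prob(z.unsqueeze(1)).sum(-1)          # (N, K) component log-densities
log_lik = torch.logsumexp(log_comp + pi.log(), dim=1).mean()
```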
Clustering Alexa Internet Data using Auto Encoder Network and Affinity Propagation
Non-linear mapping is one of the most popular solutions for clustering data with complex structures and distinct patterns. Autoencoder Networks (AENs) are widely used in clustering as they improve…
Similarity Metric for Millions of Unlabeled Face Images
Hadi Salman, J. Zhan · Computer Science · 2020 10th Annual Computing and Communication Workshop and Conference (CCWC), 2020
A Convolutional Neural Network (CNN) based similarity metric is presented that achieves state-of-the-art results and learns the priority features that differentiate two given input faces.
Application of artificially intelligent systems for the identification of discrete fossiliferous levels
This study shows Machine Learning analyses to be a valuable tool for processing spatial data in an efficient and quantitative manner, successfully identifying the presence of discrete fossiliferous levels in both Batallones-3 and Batallones-10.
Unsupervised Approach for Monitoring Satire on Social Media
An autoencoder-based clustering framework is proposed that effectively combines embedded feature learning and clustering assignments for the detection of satire in images taken from the popular photo-sharing platform Flickr.
Cluster Activation Mapping with Applications to Medical Imaging
Novel methodology is developed to generate CLuster Activation Mapping (CLAM), combining an unsupervised deep clustering framework with a modification of Score-CAM, an approach for discriminative localization in the supervised setting.

References

Showing 1-10 of 51 references
Deep Clustering via Joint Convolutional Autoencoder Embedding and Relative Entropy Minimization
A new clustering model, DEeP Embedded RegularIzed ClusTering (DEPICT), is proposed, which efficiently maps data into a discriminative embedding subspace and precisely predicts cluster assignments; experiments indicate DEPICT's superiority and faster running time in real-world clustering tasks where no labeled data is available for hyper-parameter tuning.
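The relative-entropy objective common to this family of methods can be sketched as a KL divergence between predicted soft assignments q and a sharpened target p; the sharpening rule below is the DEC-style heuristic, used here purely for illustration.

```python
# Sketch: KL(p || q) between soft cluster assignments and a sharpened target.
import torch

q = torch.softmax(torch.randn(128, 10), dim=1)     # predicted soft assignments
p = (q ** 2) / q.sum(dim=0)                        # emphasize confident points
p = p / p.sum(dim=1, keepdim=True)                 # renormalize rows
kl = (p * (p / q).log()).sum(dim=1).mean()         # loss to be minimized
```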
Towards K-means-friendly Spaces: Simultaneous Deep Learning and Clustering
A joint dimensionality reduction (DR) and K-means clustering approach is proposed, in which DR is accomplished by learning a deep neural network (DNN) while exploiting the DNN's ability to approximate any nonlinear function.
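A minimal sketch of such a joint reconstruction-plus-K-means objective (simplified to a single step; in practice one alternates network, assignment, and centroid updates):

```python
# Sketch: reconstruction loss plus a K-means penalty on the embeddings.
import torch
import torch.nn as nn

enc = nn.Linear(784, 10)
dec = nn.Linear(10, 784)
centroids = torch.randn(5, 10)                 # placeholder cluster centers

x = torch.rand(64, 784)
z = enc(x)
assign = torch.cdist(z, centroids).argmin(dim=1)   # hard k-means assignment
loss = (((dec(z) - x) ** 2).mean()
        + 0.5 * ((z - centroids[assign]) ** 2).sum(dim=1).mean())
loss.backward()
```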
Deep Discriminative Latent Space for Clustering
The high accuracy obtained by the proposed autoencoder, trained with a discriminative pairwise loss function during the pre-training phase, is demonstrated, as well as its rapid convergence even with small networks.
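A generic pairwise (contrastive) loss of the kind such pre-training might use pulls similar pairs together and pushes dissimilar ones beyond a margin; the pair labels and margin below are assumptions for illustration.

```python
# Sketch: pairwise contrastive loss on latent codes. Toy labels and margin.
import torch

z1, z2 = torch.randn(32, 10), torch.randn(32, 10)
same = torch.randint(0, 2, (32,)).float()       # 1 = similar pair, 0 = dissimilar
d = (z1 - z2).pow(2).sum(dim=1).sqrt()          # Euclidean distance per pair
margin = 1.0
loss = (same * d.pow(2) + (1 - same) * torch.clamp(margin - d, min=0).pow(2)).mean()
```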
Variational Deep Embedding: An Unsupervised and Generative Approach to Clustering
Variational Deep Embedding (VaDE), a novel unsupervised generative clustering approach within the framework of the Variational Auto-Encoder (VAE), is proposed; it is capable of generating highly realistic samples for any specified cluster without using supervised information during training.
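VaDE's generative story (choose a cluster, sample a latent code from that cluster's Gaussian, decode) can be sketched as below; the parameters are random placeholders and the decoder is a stand-in network.

```python
# Sketch: the cluster -> latent -> data generative process. Placeholder params.
import torch
import torch.nn as nn

K, D = 10, 20                        # clusters, latent dimension
pi = torch.full((K,), 1.0 / K)       # cluster prior
mu, log_var = torch.randn(K, D), torch.zeros(K, D)
decoder = nn.Sequential(nn.Linear(D, 256), nn.ReLU(), nn.Linear(256, 784), nn.Sigmoid())

c = torch.multinomial(pi, 1).item()                       # 1. choose a cluster
z = mu[c] + log_var[c].mul(0.5).exp() * torch.randn(D)    # 2. sample z ~ N(mu_c, var_c)
x = decoder(z)                                            # 3. decode to data space
```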
Adversarial Deep Embedded Clustering: on a better trade-off between Feature Randomness and Feature Drift
ADEC (Adversarial Deep Embedded Clustering), a novel autoencoder-based clustering model, is proposed to address the dual problem of Feature Randomness and Feature Drift using adversarial training; its suitability for handling these problems is demonstrated empirically on benchmark real datasets.
Deep Subspace Clustering Networks
The key idea is to introduce a novel self-expressive layer between the encoder and the decoder to mimic the "self-expressiveness" property that has proven effective in traditional subspace clustering.
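A minimal sketch of a self-expressive layer: a trainable coefficient matrix C reconstructs each latent code as a combination of the others (Z ≈ CZ), with the diagonal masked out so points cannot explain themselves. Sizes and penalty weights are illustrative.

```python
# Sketch: self-expression of latent codes with an L1 penalty on C.
import torch

N, D = 100, 16
Z = torch.randn(N, D)                          # encoder outputs for N samples
C = torch.zeros(N, N, requires_grad=True)      # self-expression coefficients

mask = 1 - torch.eye(N)                        # exclude the diagonal
Z_hat = (C * mask) @ Z
loss = ((Z_hat - Z) ** 2).sum() + 0.1 * (C * mask).abs().sum()
loss.backward()          # C's learned affinities later feed spectral clustering
```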
Representation Learning with Contrastive Predictive Coding
A universal unsupervised learning approach for extracting useful representations from high-dimensional data, called Contrastive Predictive Coding, is proposed and shown to learn representations that achieve strong performance across four distinct domains: speech, images, text, and reinforcement learning in 3D environments.
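The contrastive objective popularized by CPC reduces, in its simplest batch form, to an InfoNCE-style cross-entropy in which each anchor's matching row is the positive and the rest of the batch serves as negatives; the encoders and temperature below are illustrative.

```python
# Sketch: InfoNCE loss over a batch of anchor/positive representation pairs.
import torch
import torch.nn.functional as F

anchors = F.normalize(torch.randn(32, 128), dim=1)
pos = F.normalize(torch.randn(32, 128), dim=1)
logits = anchors @ pos.t() / 0.1                 # similarity / temperature
labels = torch.arange(32)                        # the diagonal pairs match
loss = F.cross_entropy(logits, labels)
```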
An Analysis of Single-Layer Networks in Unsupervised Feature Learning
The results show that large numbers of hidden nodes and dense feature extraction are critical to achieving high performance: so critical, in fact, that when these parameters are pushed to their limits, state-of-the-art performance is achieved on both CIFAR-10 and NORB using only a single layer of features.
Joint Unsupervised Learning of Deep Representations and Image Clusters
A recurrent framework for joint unsupervised learning of deep representations and image clusters is proposed; by integrating the two processes into a single model with a unified weighted triplet loss and optimizing it end-to-end, it obtains not only more powerful representations but also more precise image clusters.
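An illustrative weighted triplet loss of the kind used for such joint learning pulls an anchor toward a positive from its cluster and away from a negative; the per-triplet weights and margin are assumptions.

```python
# Sketch: weighted triplet margin loss over a batch of (anchor, pos, neg) triples.
import torch

a, p, n = torch.randn(32, 64), torch.randn(32, 64), torch.randn(32, 64)
w = torch.rand(32)                               # per-triplet weights
margin = 0.2
d_ap = (a - p).pow(2).sum(dim=1)                 # anchor-positive distance
d_an = (a - n).pow(2).sum(dim=1)                 # anchor-negative distance
loss = (w * torch.clamp(d_ap - d_an + margin, min=0)).mean()
```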
Understanding and Improving Interpolation in Autoencoders via an Adversarial Regularizer
A regularization procedure is proposed that encourages interpolated outputs to appear more realistic by fooling a critic network trained to recover the mixing coefficient from interpolated data.
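A hedged sketch of that adversarial interpolation regularizer: latents of two inputs are mixed with coefficient alpha, a critic tries to recover alpha from the decoded interpolant, and the autoencoder is trained to drive the critic's guess toward zero. Simplified from the paper's setup.

```python
# Sketch: critic regresses the mixing coefficient; the AE tries to hide it.
import torch
import torch.nn as nn

enc = nn.Linear(784, 32)
dec = nn.Linear(32, 784)
critic = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 1))

x1, x2 = torch.rand(16, 784), torch.rand(16, 784)
alpha = torch.rand(16, 1) * 0.5                      # mixing coefficient in [0, 0.5]
z_mix = alpha * enc(x1) + (1 - alpha) * enc(x2)
x_mix = dec(z_mix)
critic_loss = ((critic(x_mix) - alpha) ** 2).mean()  # critic: recover alpha
ae_loss = (critic(x_mix) ** 2).mean()                # AE: push critic's guess to 0
```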