• Corpus ID: 240070394

Learning Deep Representation with Energy-Based Self-Expressiveness for Subspace Clustering

@article{Li2021LearningDR,
  title={Learning Deep Representation with Energy-Based Self-Expressiveness for Subspace Clustering},
  author={Yanming Li and Changsheng Li and Shiye Wang and Ye Yuan and Guoren Wang},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.15037}
}
Deep subspace clustering has attracted increasing attention in recent years. Almost all existing works are required to load the whole training data into one batch for learning the self-expressive coefficients in the framework of deep learning. Although these methods achieve promising results, such a learning fashion severely prevents the use of deeper neural network architectures (e.g., ResNet), limiting the representation abilities of the models. In this paper, we propose… 
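The self-expressive coefficients mentioned in the abstract come from the classic self-expressiveness model: each sample is reconstructed as a combination of the other samples, and the coefficient matrix is then used as an affinity for spectral clustering. A minimal sketch of this idea (plain ridge-regularized least squares in NumPy, not the authors' energy-based method) looks like:

```python
import numpy as np

def self_expressive_coeffs(X, gamma=0.1):
    """Solve min_C ||X - C X||_F^2 + gamma ||C||_F^2 in closed form.
    The zero-diagonal constraint of sparse subspace clustering is
    relaxed here for simplicity.  X has shape (n_samples, n_features)."""
    n = X.shape[0]
    G = X @ X.T                                   # Gram matrix, (n, n)
    return np.linalg.solve(G + gamma * np.eye(n), G)

# Toy data: two orthogonal 1-D subspaces in R^3
rng = np.random.default_rng(0)
basis1 = np.array([1.0, 0.0, 0.0])
basis2 = np.array([0.0, 1.0, 1.0])
X = np.vstack([np.outer(rng.standard_normal(5), basis1),
               np.outer(rng.standard_normal(5), basis2)])

C = self_expressive_coeffs(X)
A = np.abs(C) + np.abs(C.T)   # symmetric affinity for spectral clustering
```

Because the two subspaces are orthogonal, `C` comes out block-diagonal: samples only express themselves through samples from the same subspace, which is exactly the structure spectral clustering exploits. The batch problem in the abstract arises because `C` is `n × n` over the whole dataset.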


References

SHOWING 1-10 OF 44 REFERENCES

LRSC: Learning Representations for Subspace Clustering

Motivated by the idea of meta-learning, this paper leverages external data by constructing many relevant tasks to guide the training of the encoder, and proposes a novel subspace clustering framework that learns precise sample representations.

Learning a Self-Expressive Network for Subspace Clustering

It is shown that the proposed Self-Expressive Network (SENet) can not only learn the self-expressive coefficients with desired properties on the training data, but also handle out-of-sample data, and can also be leveraged to perform subspace clustering on large-scale datasets.

Deep Subspace Clustering Networks

The key idea is to introduce a novel self-expressive layer between the encoder and the decoder to mimic the "self-expressiveness" property that has proven effective in traditional subspace clustering.
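The self-expressive layer described above is essentially a trainable `n × n` coefficient matrix applied to the encoder's latent codes. A minimal sketch (plain NumPy gradient descent, an illustration rather than the DSC-Net implementation) of training such a matrix on fixed latent codes:

```python
import numpy as np

# Hypothetical latent codes from an encoder: n=8 samples, d=4 dims.
rng = np.random.default_rng(1)
Z = rng.standard_normal((8, 4))

# Self-expressive coefficients: reconstruct each code from the others.
C = np.zeros((8, 8))
lam, lr = 0.1, 0.01
for _ in range(500):
    R = Z - C @ Z                     # reconstruction residual
    grad = -2 * R @ Z.T + 2 * lam * C # gradient of ||Z - CZ||^2 + lam||C||^2
    C -= lr * grad
    np.fill_diagonal(C, 0.0)          # forbid trivial self-reconstruction
```

In the actual network this loss term is optimized jointly with the autoencoder's reconstruction loss, so the encoder is pushed toward representations that are well explained by other samples in the same subspace.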

Latent Distribution Preserving Deep Subspace Clustering

A novel deep subspace clustering method based on a latent distribution-preserving autoencoder is proposed, which introduces a distribution consistency loss to guide the learning of distribution-preserving latent representation, and consequently enables strong capacity of characterizing the real-world data for subspace clustering.

Deep Subspace Clustering

A deep extension of sparse subspace clustering with L1-norm (DSC-L1), which can infer a new data affinity matrix by simultaneously satisfying the sparsity principle of SSC and the nonlinearity given by neural networks.

Self-Supervised Convolutional Subspace Clustering Network

An end-to-end trainable framework is proposed that combines the feature learning module and the self-expression module into a joint optimization framework, with a dual self-supervision that exploits the output of spectral clustering to supervise the training of both modules.

Multi-Scale Fusion Subspace Clustering Using Similarity Constraint

The Multi-Scale Fusion Subspace Clustering Using Similarity Constraint (SC-MSFSC) network is proposed, which learns a more discriminative self-expression coefficient matrix via a novel multi-scale fusion module and introduces a similarity constraint module to guide the fused self-expression coefficient matrix during training.

Structured AutoEncoders for Subspace Clustering

This work proposes a novel subspace clustering approach by introducing a new deep model—Structured AutoEncoder (StructAE), which learns a set of explicit transformations to progressively map input data points into nonlinear latent spaces while preserving the local and global subspace structure.

Deep Adversarial Subspace Clustering

To the best of the authors' knowledge, this is the first successful application of a GAN-like model to unsupervised subspace clustering, which also paves the way for deep learning to solve other unsupervised learning problems.

ClusterGAN : Latent Space Clustering in Generative Adversarial Networks

The results show a remarkable phenomenon that GANs can preserve latent space interpolation across categories, even though the discriminator is never exposed to such vectors.