Relations Among Some Low-Rank Subspace Recovery Models

@article{Zhang2014RelationsAS,
  title={Relations Among Some Low-Rank Subspace Recovery Models},
  author={Hongyang Zhang and Zhouchen Lin and Chao Zhang and Junbin Gao},
  journal={Neural Computation},
  year={2014},
  volume={27},
  pages={1915-1950}
}
Recovering intrinsic low-dimensional subspaces from data distributed on them is a key preprocessing step in many applications. In recent years, much work has modeled subspace recovery as low-rank minimization problems. We find that some representative models, such as robust principal component analysis (R-PCA), robust low-rank representation (R-LRR), and robust latent low-rank representation (R-LatLRR), are actually deeply connected. More specifically, we discover that once a solution to…

Citations

Publications citing this paper (a selection of the 15 citations):

A review on low-rank models in data analysis


Robust Adaptive Low-Rank and Sparse Embedding for Feature Representation

  • 2018 24th International Conference on Pattern Recognition (ICPR)
  • 2018

Robust Latent Subspace Learning for Image Classification

  • IEEE Transactions on Neural Networks and Learning Systems
  • 2018

Outlier-Robust Tensor PCA

  • 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2017

References

Publications referenced by this paper (a selection of the 53 references):

Robust Recovery of Subspace Structures by Low-Rank Representation

  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2013

A Multibody Factorization Method for Independently Moving Objects

  • International Journal of Computer Vision
  • 1998