# Robust Low-Rank Subspace Segmentation with Semidefinite Guarantees

```bibtex
@article{Ni2010RobustLS,
  title   = {Robust Low-Rank Subspace Segmentation with Semidefinite Guarantees},
  author  = {Yuzhao Ni and Ju Sun and Xiao-Tong Yuan and Shuicheng Yan and Loong Fah Cheong},
  journal = {2010 IEEE International Conference on Data Mining Workshops},
  year    = {2010},
  pages   = {1179-1188}
}
```
• Published 20 September 2010
• Computer Science
• 2010 IEEE International Conference on Data Mining Workshops
Recently, a line of research has proposed to employ Spectral Clustering (SC) to segment (group) high-dimensional structural data such as those (approximately) lying on subspaces. (Throughout the paper, we use segmentation, clustering, and grouping, and their verb forms, interchangeably. Following Liu et al. [2010], we use the term "subspace" to denote both linear subspaces and affine subspaces; there is a trivial conversion between linear subspaces and affine…)
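
The SC-based pipeline shared by this line of work (solve for a self-representation matrix Z, symmetrize |Z| + |Z|^T into an affinity, spectrally embed, then cluster) can be sketched as follows. This is a minimal NumPy illustration on toy data: the representation step is stubbed with ridge regression rather than any particular low-rank or sparse solver, and all names and parameters are illustrative.

```python
import numpy as np

def spectral_embed(W, k):
    """Rows of the k smallest eigenvectors of the normalized Laplacian
    L = I - D^{-1/2} W D^{-1/2}, row-normalized (Ng-Jordan-Weiss style)."""
    d = W.sum(axis=1)
    d_is = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L = np.eye(len(W)) - d_is[:, None] * W * d_is[None, :]
    _, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order
    Y = vecs[:, :k]
    return Y / np.maximum(np.linalg.norm(Y, axis=1, keepdims=True), 1e-12)

def kmeans(Y, k, iters=50):
    """Tiny Lloyd's k-means with deterministic farthest-point initialization."""
    C = [Y[0]]
    for _ in range(1, k):
        d2 = np.min([np.square(Y - c).sum(1) for c in C], axis=0)
        C.append(Y[np.argmax(d2)])
    C = np.stack(C)
    for _ in range(iters):
        labels = np.argmin(np.square(Y[:, None] - C[None]).sum(-1), axis=1)
        C = np.stack([Y[labels == j].mean(0) if np.any(labels == j) else C[j]
                      for j in range(k)])
    return labels

# Toy data: 20 points on two orthogonal 1-D subspaces in R^3.
rng = np.random.default_rng(1)
X = np.hstack([np.outer([1.0, 0, 0], rng.uniform(1, 2, 10)),
               np.outer([0, 1.0, 0], rng.uniform(1, 2, 10))])

# Self-representation step (stub: ridge regression instead of a low-rank solver).
G = X.T @ X
Z = np.linalg.solve(G + 1e-3 * np.eye(20), G)
W = np.abs(Z) + np.abs(Z).T        # symmetric affinity
labels = kmeans(spectral_embed(W, 2), 2)
```

On the two orthogonal toy subspaces the affinity is exactly block-diagonal, so the spectral embedding separates the two groups cleanly.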

## Citations

**Correlation Adaptive Subspace Segmentation by Trace Lasso** (2013 IEEE International Conference on Computer Vision, 2013)
The Correlation Adaptive Subspace Segmentation (CASS) method is proposed: a data-correlation-dependent method that simultaneously performs automatic data selection and groups correlated data together, and that can be regarded as adaptively balancing SSC and LSR.

**Latent Low-Rank Representation for Subspace Segmentation and Feature Extraction** (2011 International Conference on Computer Vision, 2011)
This paper proposes to construct the dictionary using both observed and unobserved, hidden data, and shows that the effects of the hidden data can be approximately recovered by solving a nuclear norm minimization problem, which is convex and can be solved efficiently.

**Membership Representation for Detecting Block-Diagonal Structure in Low-Rank or Sparse Subspace Clustering** (2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2015)
The proposed method shares the philosophy of the above subspace clustering methods in that it is a self-expressive system based on a Hadamard product with a membership matrix whose eigenvalues are normalized to lie between zero and one, and it shows competitive results.

**Relations Among Some Low-Rank Subspace Recovery Models**
It is found that some representative models, such as robust principal component analysis (R-PCA), robust low-rank representation (R-LRR), and robust latent low-rank representation (R-LatLRR), are actually deeply connected.

**Simultaneous Clustering and Model Selection: Algorithm, Theory and Applications** (IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018)
This paper addresses model selection and clustering jointly by recovering an ideal affinity tensor from an imperfect input, taking into account the relationships among the affinities induced by the cluster structures.

**Robust Multi-subspace Analysis Using Novel Column L0-norm Constrained Matrix Factorization** (arXiv, 2018)
Experimental results on both synthetic and real-world datasets demonstrate that, besides its superiority over traditional and state-of-the-art methods for subspace clustering, data reconstruction, and error correction, MFC0 also shows its uniqueness for multi-subspace basis learning and direct sparse representation.

## References

Showing 1-10 of 36 references.
**Analysis and Improvement of Low Rank Representation for Subspace Segmentation** (arXiv, 2011)
It is shown that LRR can be approximated as a factorization method that combines noise removal by column-sparse robust PCA with an improved version of LRR, called Robust Shape Interaction (RSI), which uses the corrected data rather than the noisy data as the dictionary.
**Sparse Subspace Clustering** (2009 IEEE Conference on Computer Vision and Pattern Recognition, 2009)
This work proposes a method based on sparse representation (SR) to cluster data drawn from multiple low-dimensional linear or affine subspaces embedded in a high-dimensional space, and applies the method to the problem of segmenting multiple motions in video.
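
The sparse self-expression at the core of SSC can be sketched as a small projected-ISTA solver for min_C 0.5*||X - XC||_F^2 + lam*||C||_1 subject to diag(C) = 0. The NumPy sketch below uses illustrative toy data and parameters, not the authors' optimization routine.

```python
import numpy as np

def ssc_coefficients(X, lam=0.1, iters=500):
    """Sparse self-expression via projected ISTA on
    min_C 0.5*||X - X C||_F^2 + lam*||C||_1  s.t.  diag(C) = 0."""
    n = X.shape[1]
    C = np.zeros((n, n))
    step = 1.0 / (np.linalg.norm(X, 2) ** 2)        # 1 / Lipschitz constant
    for _ in range(iters):
        G = C - step * (X.T @ (X @ C - X))           # gradient step
        C = np.sign(G) * np.maximum(np.abs(G) - step * lam, 0.0)  # soft-threshold
        np.fill_diagonal(C, 0.0)                     # forbid trivial self-representation
    return C

# Two orthogonal 1-D subspaces: each point should be expressed
# only by points from its own subspace.
rng = np.random.default_rng(0)
X = np.hstack([np.outer([1.0, 0, 0], rng.uniform(1, 2, 5)),
               np.outer([0, 1.0, 0], rng.uniform(1, 2, 5))])
C = ssc_coefficients(X)
```

Because soft-thresholding and zeroing the diagonal are both coordinatewise, their composition is the exact proximal step for the l1 penalty plus the diagonal constraint.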
**Robust Subspace Segmentation by Low-Rank Representation** (ICML, 2010)
Both theoretical and experimental results show that low-rank representation is a promising tool for subspace segmentation from corrupted data.
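
In the noiseless case (clean data from independent subspaces, with the data matrix itself as the dictionary), the LRR program min ||Z||_* s.t. X = XZ is known to have the closed-form minimizer Z* = V V^T, where X = U S V^T is the skinny SVD: the classical shape-interaction matrix. A quick NumPy check of this fact on toy data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Columns drawn from a union of two 2-D subspaces in R^6.
B1, B2 = rng.standard_normal((6, 2)), rng.standard_normal((6, 2))
X = np.hstack([B1 @ rng.standard_normal((2, 8)),
               B2 @ rng.standard_normal((2, 8))])

# Skinny SVD and the shape-interaction matrix Z = V V^T.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int((s > 1e-10 * s[0]).sum())       # numerical rank (4 for this toy X)
V = Vt[:r].T
Z = V @ V.T                             # symmetric, satisfies X = X Z
```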
**Estimation of Subspace Arrangements with Applications in Modeling and Segmenting Mixed Data** (SIAM Review, 2008)
This paper provides a comprehensive summary of important algebraic properties and statistical facts that are crucial for making the inference of subspace arrangements both efficient and robust, even when the given data are corrupted by noise or contaminated with outliers.
**A General Framework for Motion Segmentation: Independent, Articulated, Rigid, Non-rigid, Degenerate and Non-degenerate** (ECCV, 2006)
This work proposes a general framework for motion segmentation under affine projections which exploits two properties of trajectory data, geometric constraint and locality, and estimates a number of linear manifolds whose dimensions are unknown beforehand.
**A Singular Value Thresholding Algorithm for Matrix Completion** (SIAM Journal on Optimization, 2010)
This paper develops a simple, first-order, easy-to-implement algorithm that is extremely efficient at addressing problems in which the optimal solution has low rank, and develops a framework in which these algorithms can be understood in terms of well-known Lagrange multiplier algorithms.
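
The central step of this algorithm is the singular value shrinkage operator D_tau(Y) = U soft_tau(S) V^T, the proximal operator of tau times the nuclear norm; in NumPy:

```python
import numpy as np

def svt(Y, tau):
    """Singular value thresholding: the prox of tau * nuclear norm at Y.
    Shrinks every singular value by tau, clipping at zero."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

rng = np.random.default_rng(0)
Y = rng.standard_normal((8, 6))
X = svt(Y, tau=1.0)
```

Applying the operator leaves the singular vectors unchanged and only shrinks the spectrum, which is why it is so effective when the optimum has low rank.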
**Learning With ℓ1-Graph for Image Analysis** (IEEE Transactions on Image Processing, 2010)
Compared with the conventional k-nearest-neighbor graph and the ε-ball graph, the ℓ1-graph possesses three advantages: (1) greater robustness to data noise, (2) automatic sparsity, and (3) an adaptive neighborhood for each datum.
**Kernel k-means: Spectral Clustering and Normalized Cuts** (KDD, 2004)
The generality of the weighted kernel k-means objective function is shown, and the spectral clustering objective of normalized cut is derived as a special case, leading to a novel weighted kernel k-means algorithm that monotonically decreases the normalized cut.
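
Because every distance in kernel k-means expands in Gram-matrix entries, the algorithm never needs explicit feature vectors. A minimal NumPy sketch of the unweighted variant with a deterministic initialization (illustrative only, not the weighted algorithm derived in the paper):

```python
import numpy as np

def kernel_kmeans(K, k, iters=50):
    """Kernel k-means on a Gram matrix K. The squared feature-space distance
    to a cluster mean expands purely in kernel entries:
    ||phi(x_i) - m_c||^2 = K_ii - (2/|c|) sum_{j in c} K_ij
                           + (1/|c|^2) sum_{j,l in c} K_jl."""
    n = K.shape[0]
    labels = np.arange(n) % k                 # simple deterministic init
    for _ in range(iters):
        dist = np.full((n, k), np.inf)
        for c in range(k):
            idx = labels == c
            nc = idx.sum()
            if nc == 0:
                continue                      # leave empty cluster at inf
            dist[:, c] = (np.diag(K) - 2.0 * K[:, idx].sum(axis=1) / nc
                          + K[np.ix_(idx, idx)].sum() / nc**2)
        new = dist.argmin(axis=1)
        if np.array_equal(new, labels):
            break
        labels = new
    return labels

# With a linear kernel this reduces to ordinary k-means on two blobs.
rng = np.random.default_rng(0)
P = np.vstack([rng.normal(0.0, 0.1, (5, 2)), rng.normal(10.0, 0.1, (5, 2))])
labels = kernel_kmeans(P @ P.T, 2)
```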
**Robust Principal Component Analysis?** (JACM, 2011)
It is proved that, under suitable assumptions, it is possible to recover both the low-rank and the sparse components exactly by solving a very convenient convex program called Principal Component Pursuit, which among all feasible decompositions simply minimizes a weighted combination of the nuclear norm and the ℓ1 norm; this suggests a principled approach to robust principal component analysis.