Publications
Discriminative Transfer Subspace Learning via Low-Rank and Sparse Representation
TLDR
This paper addresses unsupervised domain transfer learning, in which no labels are available in the target domain. The proposed model is solved by the inexact augmented Lagrange multiplier method, and it avoids potentially negative transfer by using a sparse matrix to model the noise, making it more robust to different types of noise.
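The low-rank-plus-sparse modelling referred to in this summary can be illustrated with a generic robust-PCA-style solver based on the inexact augmented Lagrange multiplier method. The sketch below is not the paper's transfer-learning objective; it only shows how a data matrix is split into a low-rank part and a sparse noise part, with lam, rho, and the stopping tolerance being common default choices rather than values from the paper.

```python
import numpy as np

def rpca_ialm(D, lam=None, tol=1e-7, max_iter=500):
    """Minimal low-rank + sparse decomposition D ~ A + E via inexact ALM.

    A is recovered as low rank (clean structure), E as sparse (noise/outliers).
    """
    m, n = D.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    norm_two = np.linalg.norm(D, 2)
    Y = D / max(norm_two, np.abs(D).max() / lam)   # scaled dual variable
    E = np.zeros_like(D)
    mu, rho = 1.25 / norm_two, 1.5
    mu_max = mu * 1e7
    d_norm = np.linalg.norm(D, 'fro')

    for _ in range(max_iter):
        # Low-rank update: singular value thresholding of D - E + Y/mu
        U, s, Vt = np.linalg.svd(D - E + Y / mu, full_matrices=False)
        A = (U * np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # Sparse update: elementwise soft thresholding models the noise
        T = D - A + Y / mu
        E = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        # Dual ascent and penalty update
        R = D - A - E
        Y = Y + mu * R
        mu = min(mu * rho, mu_max)
        if np.linalg.norm(R, 'fro') / d_norm < tol:
            break
    return A, E
```

On synthetic data (a random low-rank matrix plus a few large corruptions), A recovers the clean structure while E isolates the corruptions, which is the sense in which a sparse matrix "models the noise".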
Robust Sparse Linear Discriminant Analysis
TLDR
A novel feature extraction method called robust sparse linear discriminant analysis (RSLDA) is proposed, and it achieves competitive performance compared with other state-of-the-art feature extraction methods.
Learning a Nonnegative Sparse Graph for Linear Regression
TLDR
A novel nonnegative sparse graph (NNSG) learning method is proposed, in which linear regression and graph learning are performed simultaneously to guarantee an overall optimum. Experiments show that NNSG obtains very high classification accuracy and greatly outperforms conventional graph-based semi-supervised learning (G-SSL) methods, especially conventional graph construction methods.
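As a rough illustration of the two ingredients named above, the sketch below builds a nonnegative sparse graph with positivity-constrained lasso coding and then runs harmonic-function label propagation on it. Note that this performs the two steps separately, whereas the paper's point is to learn the regression and the graph jointly; the alpha value and the propagation scheme are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

def nonnegative_sparse_graph(X, alpha=0.05):
    """Reconstruct each column of X (one sample) from the other samples with
    nonnegative, l1-regularised coefficients, yielding a sparse affinity graph."""
    n = X.shape[1]
    W = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        lasso = Lasso(alpha=alpha, positive=True, max_iter=5000)
        lasso.fit(X[:, others], X[:, i])
        W[i, others] = lasso.coef_
    return (W + W.T) / 2.0                      # symmetric nonnegative affinities

def propagate_labels(W, y, labeled_idx, n_classes):
    """Harmonic-function label propagation on the learned graph."""
    n = W.shape[0]
    Y = np.zeros((n, n_classes))
    Y[labeled_idx, y[labeled_idx]] = 1.0
    L = np.diag(W.sum(axis=1)) - W              # graph Laplacian
    unlabeled = np.setdiff1d(np.arange(n), labeled_idx)
    # Solve L_uu * F_u = W_ul * Y_l for the unlabeled predictions
    F_u = np.linalg.solve(
        L[np.ix_(unlabeled, unlabeled)] + 1e-6 * np.eye(len(unlabeled)),
        W[np.ix_(unlabeled, labeled_idx)] @ Y[labeled_idx])
    return unlabeled, F_u.argmax(axis=1)
```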
Low-Rank Embedding for Robust Image Feature Extraction
TLDR
A robust linear dimensionality reduction technique termed low-rank embedding (LRE) is proposed. It provides a robust image representation that uncovers the underlying relationships among images, reducing the negative influence of occlusion and corruption and thereby enhancing robustness in image feature extraction.
Low rank representation with adaptive distance penalty for semi-supervised subspace classification
TLDR
A novel low-rank representation with adaptive distance penalty (LRRADP) is proposed to construct an affinity graph that not only captures the global subspace structure of the whole data set but also effectively preserves the neighborhood relationships among samples.
Data Uncertainty in Face Recognition
TLDR
This paper reduces the uncertainty of the face representation by synthesizing virtual training samples and devises a representation approach based on selected useful training samples. The resulting face recognition method not only obtains high recognition accuracy but also has lower computational complexity than other state-of-the-art approaches.
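A minimal sketch of the two ideas mentioned above, under assumptions not taken from the paper: horizontal mirroring stands in for the virtual-sample synthesis (labels are duplicated for the mirrored copies), and the "useful" samples for a test face are taken to be its nearest neighbours, over which a ridge-regularized representation is computed and classified by class-wise reconstruction residual.

```python
import numpy as np

def augment_with_mirrors(X, img_shape):
    """Synthesize virtual training samples by horizontally mirroring each face.
    X: (n_samples, h*w) flattened grayscale faces."""
    h, w = img_shape
    mirrored = X.reshape(-1, h, w)[:, :, ::-1].reshape(X.shape[0], -1)
    return np.vstack([X, mirrored])             # duplicate y_train to match

def classify_by_representation(x, X_train, y_train, n_selected=50, reg=1e-2):
    """Represent a test face over the closest training samples, then assign it
    to the class with the smallest reconstruction residual."""
    # Select the most relevant training samples by Euclidean distance
    d = np.linalg.norm(X_train - x, axis=1)
    sel = np.argsort(d)[:n_selected]
    D, labels = X_train[sel].T, y_train[sel]    # dictionary columns are samples
    # Ridge-regularized coding of the test sample over the selected dictionary
    coef = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ x)
    # Class-wise reconstruction residuals decide the label
    best_cls, best_res = None, np.inf
    for c in np.unique(labels):
        mask = labels == c
        res = np.linalg.norm(x - D[:, mask] @ coef[mask])
        if res < best_res:
            best_cls, best_res = c, res
    return best_cls
```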
Robust Latent Subspace Learning for Image Classification
TLDR
To learn a robust latent subspace, a sparse error term is used to compensate for noise, suppressing its interference by weakening its response during regression, and an efficient optimization algorithm is designed to solve the resulting problem.
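The role of the sparse compensation term can be seen in a stripped-down regression model: alternating between a ridge update for the projection and elementwise soft thresholding for the error lets large, sparse noise be absorbed by E instead of distorting W. This is only a sketch of that mechanism, not the paper's latent-subspace formulation; lam and beta are illustrative values.

```python
import numpy as np

def robust_regression_sparse_error(X, Y, lam=0.1, beta=1.0, n_iter=50):
    """Alternating minimization of ||Y - X W - E||_F^2 + lam*||E||_1 + beta*||W||_F^2.

    The sparse term E absorbs gross noise so it barely influences W.
    X: (n, d) samples, Y: (n, c) targets (e.g., one-hot labels)."""
    d = X.shape[1]
    E = np.zeros_like(Y)
    I = np.eye(d)
    for _ in range(n_iter):
        # W-step: ridge regression toward the denoised targets Y - E
        W = np.linalg.solve(X.T @ X + beta * I, X.T @ (Y - E))
        # E-step: elementwise soft thresholding of the residual
        R = Y - X @ W
        E = np.sign(R) * np.maximum(np.abs(R) - lam / 2.0, 0.0)
    return W, E
```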
Low-rank representation with adaptive graph regularization
TLDR
A novel rank constraint is further introduced into the model, which encourages the learned graph to have very clear clustering structure. Experimental results show that the proposed graph learning method can significantly improve clustering performance.
Regularized Label Relaxation Linear Regression
TLDR
A novel regularized label relaxation linear regression (LR) method is proposed, which relaxes the strict binary label matrix into a slack variable matrix by introducing a nonnegative label relaxation matrix into LR. This provides more freedom to fit the labels and simultaneously enlarges the margins between different classes as much as possible.
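The label relaxation step described here follows the well-known epsilon-dragging construction, which can be sketched as an alternating scheme: a closed-form ridge update for the regression matrix and a nonnegative slack update for the relaxation matrix. The sketch below keeps only a ridge regularizer; the paper's additional regularization term is omitted.

```python
import numpy as np

def label_relaxation_regression(X, y, n_classes, lam=1.0, n_iter=30):
    """Linear regression with a relaxed (slack) label matrix.

    The binary targets Y are replaced by Y + B * M, where B holds +1 for the
    true class and -1 otherwise, and M >= 0 is a learned slack matrix that lets
    correct-class scores grow above 1 and wrong-class scores drop below 0,
    enlarging between-class margins.
    X: (n, d), y: integer labels in [0, n_classes)."""
    n, d = X.shape
    Y = np.zeros((n, n_classes))
    Y[np.arange(n), y] = 1.0
    B = np.where(Y > 0, 1.0, -1.0)
    M = np.zeros_like(Y)
    I = np.eye(d)
    for _ in range(n_iter):
        T = Y + B * M                              # relaxed targets
        # W-step: ridge regression toward the relaxed targets (closed form)
        W = np.linalg.solve(X.T @ X + lam * I, X.T @ T)
        # M-step: nonnegative slack, moving scores only in the allowed direction
        M = np.maximum(B * (X @ W - Y), 0.0)
    return W

# Prediction: class with the largest regression score
# y_pred = (X_test @ W).argmax(axis=1)
```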
Approximate Low-Rank Projection Learning for Feature Extraction
TLDR
This paper proposes to use two different matrices to approximate the low-rank projection in latent low-rank representation (LatLRR), so that the dimension of the obtained features can be reduced, which is more flexible than the original LatLRR.
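Assuming a projection matrix has already been learned, the benefit of approximating it with two thin matrices can be shown with a truncated-SVD split: features are then extracted in k dimensions rather than the full input dimension. This is only an after-the-fact factorization for illustration; the paper learns the two matrices directly within its objective.

```python
import numpy as np

def factor_projection(P, k):
    """Approximate a (nearly) low-rank d x d projection P by two thin matrices,
    P ~ A @ B with A: (d, k) and B: (k, d), so B @ x is a k-dimensional feature."""
    U, s, Vt = np.linalg.svd(P)
    A = U[:, :k] * s[:k]          # d x k
    B = Vt[:k]                    # k x d
    return A, B

# Given a learned projection P and data X (d x n):
# A, B = factor_projection(P, k=30)
# Z = B @ X    # k-dimensional features instead of the d-dimensional P @ X
```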
...