
Closed-Form Solutions to A Category of Nuclear Norm Minimization Problems

@article{Liu2010ClosedFormST,
  title={Closed-Form Solutions to A Category of Nuclear Norm Minimization Problems},
  author={Guangcan Liu and Ju Sun and Shuicheng Yan},
  journal={ArXiv},
  year={2010},
  volume={abs/1011.4829}
}
Utilizing the nuclear norm approximation to learn low-rank matrices, which arise frequently in machine learning and computer vision, is an efficient and effective strategy, and nuclear norm minimization problems have therefore been gaining much attention recently. In this paper we prove that the following Low-Rank Representation (LRR) \cite{icml_2010_lrr,lrr_extention} problem:
\begin{eqnarray*}
\min_{Z} \|Z\|_*, & \text{s.t.} & X = AZ,
\end{eqnarray*}
has a unique and closed-form solution, where…
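The abstract is truncated here, so the statement of the closed form is not shown; assuming it is the pseudoinverse solution Z* = A†X (the result the paper is known for), the following minimal numpy sketch checks that claim numerically. The matrix sizes and the sampled null-space perturbations are illustrative only, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rank-deficient dictionary A (rank 6), so X = AZ has infinitely many solutions.
A = rng.standard_normal((50, 6)) @ rng.standard_normal((6, 10))
X = A @ rng.standard_normal((10, 20))       # constructed so X = AZ is feasible

# Hypothesized closed form: Z* = pinv(A) @ X (Moore-Penrose pseudoinverse).
Z_star = np.linalg.pinv(A) @ X
assert np.allclose(A @ Z_star, X)           # feasibility: X = A Z*

def nuclear_norm(M):
    """Sum of singular values."""
    return np.linalg.svd(M, compute_uv=False).sum()

# Every feasible Z equals Z* + N with A N = 0; build N from the null space of A.
_, s, Vt = np.linalg.svd(A)
rank = int((s > 1e-10 * s[0]).sum())
null_basis = Vt[rank:]                      # rows span null(A)

for _ in range(5):
    N = null_basis.T @ rng.standard_normal((null_basis.shape[0], 20))
    Z_other = Z_star + N
    assert np.allclose(A @ Z_other, X)      # still feasible
    assert nuclear_norm(Z_star) <= nuclear_norm(Z_other) + 1e-8
print("Among sampled feasible solutions, Z* = pinv(A) X has the smallest nuclear norm.")
```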
Local Loss Optimization in Operator Models: A New Insight into Spectral Learning
TLDR
This paper revisits the spectral method for learning latent variable models defined in terms of observable operators, shows that the operators can be recovered by minimizing a loss defined on a finite subset of the domain, and proposes a regularized convex relaxation of this optimization.
Learning finite-state machines: statistical and algorithmic aspects
TLDR
This thesis gives the first application of this method to learning conditional distributions over pairs of aligned sequences, and proves that the method can learn the whole class of probabilistic automata, thus extending the class of models previously known to be learnable with this approach.

References

Robust Low-Rank Subspace Segmentation with Semidefinite Guarantees
TLDR
It advocates enforcing the symmetric positive semidefinite constraint explicitly during learning (Low-Rank Representation with Positive Semidefinite constraint, or LRR-PSD), and shows that the resulting problem can in fact be solved efficiently by an elegant dedicated scheme rather than by general-purpose SDP solvers, which usually scale poorly.
Latent Low-Rank Representation for subspace segmentation and feature extraction
TLDR
This paper proposes to construct the dictionary using both observed data and unobserved, hidden data, and shows that the effects of the hidden data can be approximately recovered by solving a nuclear norm minimization problem, which is convex and can be solved efficiently.
Robust Principal Component Analysis: Exact Recovery of Corrupted Low-Rank Matrices via Convex Optimization
TLDR
It is proved that most matrices A can be efficiently and exactly recovered from most error sign-and-support patterns by solving a simple convex program, for which a fast and provably convergent algorithm is given.
Robust Subspace Segmentation by Low-Rank Representation
TLDR
Both theoretical and experimental results show that low-rank representation is a promising tool for subspace segmentation from corrupted data.
Robust principal component analysis?
TLDR
It is proved that, under suitable assumptions, it is possible to recover both the low-rank and the sparse components exactly by solving a very convenient convex program called Principal Component Pursuit, which, among all feasible decompositions, simply minimizes a weighted combination of the nuclear norm and the ℓ1 norm; this suggests the possibility of a principled approach to robust principal component analysis.
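To make the convex program concrete, below is a minimal alternating-direction (inexact ALM) sketch of Principal Component Pursuit in numpy, minimizing ||L||_* + λ||S||_1 subject to M = L + S, with the λ = 1/√max(m, n) weighting used in the paper. The penalty-parameter heuristic, stopping rule, and toy data here are common simplifications, not the authors' reference implementation.

```python
import numpy as np

def shrink(M, tau):
    """Soft thresholding: proximal operator of the l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def svt(M, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def pcp(M, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Sketch of Principal Component Pursuit via inexact ALM:
    min ||L||_* + lam * ||S||_1  s.t.  M = L + S."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * m * n / (np.abs(M).sum() + 1e-12)
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(max_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)       # update low-rank part
        S = shrink(M - L + Y / mu, lam / mu)    # update sparse part
        R = M - L - S                           # constraint residual
        Y += mu * R                             # dual ascent step
        if np.linalg.norm(R) <= tol * np.linalg.norm(M):
            break
    return L, S

# Toy test: low-rank matrix plus sparse gross corruption.
rng = np.random.default_rng(0)
L0 = rng.standard_normal((60, 5)) @ rng.standard_normal((5, 60))
S0 = np.zeros((60, 60))
mask = rng.random((60, 60)) < 0.05
S0[mask] = 10 * rng.standard_normal(mask.sum())
L, S = pcp(L0 + S0)
print("relative error in L:", np.linalg.norm(L - L0) / np.linalg.norm(L0))
```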
A Multibody Factorization Method for Independently Moving Objects
TLDR
A new method is presented for separating and recovering the motion and shape of multiple independently moving objects in a sequence of images, by introducing a mathematical construct of object shapes, called the shape interaction matrix, which is invariant both to the object motions and to the selection of coordinate systems.
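As a concrete illustration: if X = UΣVᵀ is the skinny SVD of the data matrix, the shape interaction matrix is VVᵀ, which is block-diagonal (zero between columns of different objects) when the objects' motions span independent subspaces. The sketch below, with illustrative dimensions, verifies this; note also that VVᵀ coincides with the LRR closed form pinv(X)X when the dictionary A is taken to be X itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Columns of X drawn from two independent 2-D subspaces of R^20
# (8 columns per subspace; sizes are illustrative).
B1 = rng.standard_normal((20, 2)); B2 = rng.standard_normal((20, 2))
X = np.hstack([B1 @ rng.standard_normal((2, 8)),
               B2 @ rng.standard_normal((2, 8))])

# Skinny SVD X = U S V^T; the shape interaction matrix is V V^T.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int((s > 1e-10 * s[0]).sum())
V = Vt[:r].T
SIM = V @ V.T

# Entries linking columns of different subspaces vanish (up to round-off).
print("max |cross-subspace entry|:", np.abs(SIM[:8, 8:]).max())

# Connection to LRR with A = X: pinv(X) @ X = V V^T.
print("matches pinv(X) X:", np.allclose(SIM, np.linalg.pinv(X) @ X))
```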
Robust Principal Component Analysis: Exact Recovery of Corrupted Low-Rank Matrices
This paper has been withdrawn due to a critical error near equation (71). This error causes the entire argument of the paper to collapse. Emmanuel Candes of Stanford discovered the error, and has…