Robust Recovery of Subspace Structures by Low-Rank Representation
TLDR: It is shown that the convex program associated with LRR solves the subspace clustering problem in the following sense: when the data are clean, LRR exactly recovers the true subspace structures; when the data are contaminated by outliers, it is proved that under certain conditions LRR can exactly recover the row space of the original data.
A Geometric Analysis of Phase Retrieval
TLDR: It is proved that when the measurement vectors are generic, with high probability a natural least-squares formulation for GPR has the following benign geometric structure: (1) there are no spurious local minimizers, and all global minimizers are equal to the target signal up to a global phase, and (2) the objective function has a negative directional curvature around each saddle point.
Complete Dictionary Recovery Over the Sphere I: Overview and the Geometric Picture
TLDR: A geometric characterization of the objective landscape is provided, showing that the problem is highly structured with high probability: (1) there are no “spurious” local minimizers, and (2) around all saddle points the objective has a negative directional curvature.
Hierarchical spatio-temporal context modeling for action recognition
TLDR: This paper proposes to model spatio-temporal context information hierarchically, exploiting three levels of context in ascending order of abstraction, and employs the Multiple Kernel Learning (MKL) technique to prune the kernels and speed up algorithm evaluation.
When Are Nonconvex Problems Not Scary?
TLDR: This work describes a second-order trust-region algorithm that provably converges to a global minimizer efficiently, without special initializations, and discusses alternatives.
Finding a Sparse Vector in a Subspace: Linear Sparsity Using Alternating Directions
TLDR: This paper presents a relatively simple nonconvex approach based on alternating directions, which provably succeeds even when the fraction of nonzero entries is Ω(1), and is the first practical algorithm to achieve linear scaling under the planted sparse model.
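The alternating-directions idea can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the threshold `lam`, the iteration count, and the initialization are illustrative choices.

```python
import numpy as np

def soft_threshold(v, lam):
    """Elementwise soft-thresholding (proximal operator of the l1 norm)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def adm_sparse_vector(Y, lam=None, iters=200, seed=0):
    """Alternating-directions sketch for finding a sparse vector in a subspace.

    Y: (p, n) matrix with orthonormal columns spanning the subspace.
    Alternates a sparsity-promoting soft-thresholding step with a projection
    back onto the unit sphere of subspace coefficients.
    """
    p, n = Y.shape
    if lam is None:
        lam = 1.0 / np.sqrt(p)  # illustrative threshold choice
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    for _ in range(iters):
        x = soft_threshold(Y @ q, lam)  # encourage sparsity in the ambient vector
        q = Y.T @ x                     # best subspace coefficients for x
        q /= np.linalg.norm(q)          # renormalize to stay on the sphere
    return Y @ q  # candidate sparse unit vector lying in the subspace
```

The returned vector has unit norm and lies in the span of `Y`; under the planted sparse model, iterations of this kind drive it toward the sparse direction.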
Complete Dictionary Recovery Over the Sphere II: Recovery by Riemannian Trust-Region Method
TLDR: A Riemannian trust-region algorithm that provably converges to a local minimizer from arbitrary initializations is described, and it is shown that with high probability the nonconvex formulation has no “spurious” local minimizers and, around any saddle point, the objective function has a negative directional curvature.
Robust Low-Rank Subspace Segmentation with Semidefinite Guarantees
TLDR: It is advocated to enforce the symmetric positive semidefinite constraint explicitly during learning (Low-Rank Representation with Positive Semidefinite constraint, or LRR-PSD), and it is shown that this formulation can in fact be solved efficiently by a specialized scheme rather than by general-purpose SDP solvers, which usually scale poorly.
Randomized Locality Sensitive Vocabularies for Bag-of-Features Model
TLDR: This paper proposes the random locality sensitive vocabulary (RLSV) scheme, a simple scheme that generates and aggregates multiple visual vocabularies based on random projections, without clustering or training effort.