- Qing Qu, Ju Sun, John Wright
- NIPS
- 2014

We consider the problem of recovering the sparsest vector in a subspace S ⊆ ℝ^p with dim(S) = n. This problem can be considered a homogeneous variant of the sparse recovery problem, and finds applications in sparse dictionary learning, sparse PCA, and other problems in signal processing and machine learning. Simple convex heuristics for this problem… (More)
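A quick way to see the problem is a small numerical sketch: plant a sparse vector in a random subspace and try to find it with the classical linear-programming relaxation min ||Yw||_1 subject to (Yw)_i = 1, sweeping the constraint index i. This relaxation and all parameters below are illustrative choices, not taken from the paper.

```python
# Recover a planted sparse vector in a subspace via an LP relaxation.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
p, n, k = 100, 4, 3                      # ambient dim, subspace dim, sparsity

x0 = np.zeros(p)
x0[rng.choice(p, k, replace=False)] = rng.standard_normal(k)   # planted sparse vector
B = np.column_stack([x0] + [rng.standard_normal(p) for _ in range(n - 1)])
Y, _ = np.linalg.qr(B)                   # orthonormal basis of the subspace

def sparsest_via_lp(Y, i):
    """min ||Y w||_1 subject to (Y w)_i = 1, posed as an LP in (w, t)."""
    p, n = Y.shape
    c = np.concatenate([np.zeros(n), np.ones(p)])          # minimize sum(t)
    A_ub = np.block([[Y, -np.eye(p)], [-Y, -np.eye(p)]])   # -t <= Yw <= t
    b_ub = np.zeros(2 * p)
    A_eq = np.concatenate([Y[i], np.zeros(p)])[None, :]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] * n + [(0, None)] * p)
    return Y @ res.x[:n]

# Sweep the constraint index; keep the solution with the smallest l1/l2 ratio.
cands = [sparsest_via_lp(Y, i) for i in range(p)]
best = min(cands, key=lambda v: np.linalg.norm(v, 1) / np.linalg.norm(v))
corr = abs(best @ x0) / (np.linalg.norm(best) * np.linalg.norm(x0))
```

With the planted vector this sparse relative to the subspace dimension, the LP solution at a support coordinate is proportional to x0, and `corr` is close to 1.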

- Ju Sun, Qing Qu, John Wright
- ArXiv
- 2015

We consider the problem of recovering a complete (i.e., square and invertible) matrix A₀ from Y ∈ ℝ^{n×p} with Y = A₀X₀, provided X₀ is sufficiently sparse. This recovery problem is central to the theoretical understanding of dictionary learning, which seeks a sparse representation for a collection of input signals, and finds numerous applications in… (More)
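The reduction behind this setting can be sketched numerically: when A₀ is orthogonal, the rows of X₀ are sparse vectors in the row space of Y, so one column of A₀ can be found by minimizing ||qᵀY||₁ over the unit sphere. The plain Riemannian subgradient method below is an illustrative stand-in with made-up parameters, not the paper's algorithm.

```python
# Find one column of an orthogonal dictionary by minimizing ||q^T Y||_1
# over the sphere with projected (Riemannian) subgradient descent.
import numpy as np

rng = np.random.default_rng(1)
n, p, theta = 20, 2000, 0.1

A0, _ = np.linalg.qr(rng.standard_normal((n, n)))                # orthogonal dictionary
X0 = rng.standard_normal((n, p)) * (rng.random((n, p)) < theta)  # sparse coefficients
Y = A0 @ X0

q = rng.standard_normal(n)
q /= np.linalg.norm(q)
step = 0.3
for _ in range(300):
    g = Y @ np.sign(Y.T @ q) / p          # subgradient of (1/p)||q^T Y||_1
    g -= (g @ q) * q                      # project onto the sphere's tangent space
    q -= step * g
    q /= np.linalg.norm(q)                # retract back to the sphere
    step *= 0.97                          # geometrically decaying step size

corr = np.max(np.abs(A0.T @ q))           # alignment with the closest column of A0
```

Since qᵀY = (A₀ᵀq)ᵀX₀ is sparsest when A₀ᵀq is a standard basis vector, the iterate aligns with one dictionary column (up to sign).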

Pixel-wise classification, where each pixel is assigned to a predefined class, is one of the most important procedures in hyperspectral image (HSI) analysis. By representing a test pixel as a linear combination of a small subset of labeled pixels, a sparse representation classifier (SRC) gives rather plausible results compared with those of traditional… (More)
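The SRC pipeline described above can be sketched on synthetic data (a hypothetical stand-in for hyperspectral pixels; the small OMP coder and all sizes are illustrative assumptions): sparsely code the test sample over all labeled samples, then assign the class whose atoms give the smallest reconstruction residual.

```python
# Minimal sparse-representation classifier (SRC) sketch on synthetic data.
import numpy as np

rng = np.random.default_rng(2)
d, per_class, classes = 30, 15, 3

# Each class lives near its own low-dimensional subspace.
bases = [np.linalg.qr(rng.standard_normal((d, 3)))[0] for _ in range(classes)]
D = np.column_stack([B @ rng.standard_normal((3, per_class)) for B in bases])
D /= np.linalg.norm(D, axis=0)                      # unit-norm atoms
labels = np.repeat(np.arange(classes), per_class)

def omp(D, y, k):
    """Orthogonal matching pursuit: greedily pick k atoms, refit each time."""
    support, r = [], y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ r))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        r = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

y = bases[1] @ rng.standard_normal(3)               # test sample from class 1
x = omp(D, y, k=5)
resid = [np.linalg.norm(y - D[:, labels == c] @ x[labels == c])
         for c in range(classes)]
pred = int(np.argmin(resid))                        # class with smallest residual
```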

- Ju Sun, Qing Qu, John Wright
- ArXiv
- 2015

In this note, we focus on smooth nonconvex optimization problems that obey: (1) all local minimizers are also global; and (2) around any saddle point or local maximizer, the objective has a negative directional curvature. Concrete applications such as dictionary learning, generalized phase retrieval, and orthogonal tensor decomposition are known to induce… (More)
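Both landscape properties can be checked numerically on a toy function with the same structure, f(x) = (1/4)(||x||² − 1)²: every local minimizer (the whole unit sphere) is global, and the only other critical point, x = 0, has strictly negative curvature. This toy example is our own illustration, not taken from the note.

```python
# Verify the two landscape properties for f(x) = (1/4)(||x||^2 - 1)^2.
import numpy as np

def grad(x):
    return (x @ x - 1.0) * x              # gradient of f

def hess(x):
    n = len(x)
    return (x @ x - 1.0) * np.eye(n) + 2.0 * np.outer(x, x)   # Hessian of f

n = 5
lam_min = np.linalg.eigvalsh(hess(np.zeros(n)))[0]   # curvature at the bad critical point

rng = np.random.default_rng(3)
x = 0.01 * rng.standard_normal(n)         # random start near x = 0
for _ in range(2000):
    x -= 0.1 * grad(x)                    # plain gradient descent
fval = 0.25 * (x @ x - 1.0) ** 2          # global minimum value is 0
```

The strictly negative eigenvalue at x = 0 gives gradient descent a direction of escape, so a random start converges to the (globally optimal) unit sphere.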

- Ju Sun, Qing Qu, John Wright
- ISIT
- 2016

Can we recover a complex signal from its Fourier magnitudes? More generally, given a set of m measurements, y_k = |a_k^* x| for k = 1, …, m, is it possible to recover x ∈ ℂ^n (i.e., a length-n complex vector)? This generalized phase retrieval (GPR) problem is a fundamental task in various disciplines, and has been the subject of much recent investigation… (More)
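The GPR problem can be illustrated with a Wirtinger-flow-style sketch (spectral initialization plus gradient descent on the squared-magnitude residuals). This is a standard scheme used here only to illustrate the problem; it is not necessarily the algorithm analyzed in the paper, and the sizes and step size are illustrative.

```python
# Generalized phase retrieval from magnitudes y_k = |a_k^* x|.
import numpy as np

rng = np.random.default_rng(4)
n, m = 10, 200
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = np.abs(A @ x)                                   # magnitude-only measurements

# Spectral initialization: top eigenvector of (1/m) sum_k y_k^2 a_k a_k^*.
M = (A.conj().T * (y ** 2)) @ A / m
_, V = np.linalg.eigh(M)
z = V[:, -1] * np.sqrt(np.mean(y ** 2))             # scale to the data energy

for _ in range(800):
    r = np.abs(A @ z) ** 2 - y ** 2                 # residuals of |a_k^* z|^2
    g = A.conj().T @ (r * (A @ z)) / m              # Wirtinger gradient
    z -= 0.1 / np.mean(y ** 2) * g

phase = np.vdot(x, z)                               # best global phase alignment
phase /= abs(phase)
err = np.linalg.norm(z - phase * x) / np.linalg.norm(x)
```

Note the recovery is only up to a global phase, which is why the error is measured after aligning z with x.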

- Ju Sun, Qing Qu, John Wright
- ICML
- 2015

We consider the problem of recovering a complete (i.e., square and invertible) dictionary A₀ from Y = A₀X₀ with Y ∈ ℝ^{n×p}. This recovery setting is central to the theoretical understanding of dictionary learning. We give the first efficient algorithm that provably recovers A₀ when X₀ has O(n) nonzeros per column, under a suitable probability model… (More)

- Jian Jin, Qing Qu, Yuantao Gu
- ArXiv
- 2013

The newly proposed ℓ₁-norm-constrained zero-point attraction Least Mean Square algorithm (ZA-LMS) demonstrates excellent performance on exactly sparse system identification. However, ZA-LMS has less of an advantage over standard LMS when the system is only nearly sparse. Thus, in this paper, the near-sparse system is first modeled by the Generalized Gaussian Distribution… (More)
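The ZA-LMS update itself is a one-line modification of LMS: w ← w + μ·e·u − ρ·sign(w), where the sign term attracts small coefficients toward zero. A minimal sketch on an exactly sparse toy system follows; the filter length, step sizes, and noise level are illustrative, not the paper's.

```python
# Zero-attracting LMS (ZA-LMS) on an exactly sparse system-identification task.
import numpy as np

rng = np.random.default_rng(5)
n, T = 16, 5000
h = np.zeros(n)
h[[2, 7, 11]] = [1.0, -0.5, 0.8]               # sparse unknown system

mu, rho = 0.01, 1e-4                           # LMS step and zero-attractor strength
w = np.zeros(n)
x = rng.standard_normal(T + n)                 # white input signal
for t in range(T):
    u = x[t:t + n]                             # input regressor
    d = h @ u + 0.01 * rng.standard_normal()   # noisy system output
    e = d - w @ u                              # a-priori error
    w += mu * e * u - rho * np.sign(w)         # LMS step + zero attraction

mse = np.mean((w - h) ** 2)                    # coefficient estimation error
```

On an exactly sparse system like this one, the zero attractor shrinks the inactive taps toward zero, which is the regime where ZA-LMS beats plain LMS.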