In this paper we study the problem of recovering a low-rank positive semidefinite matrix from linear measurements. Our algorithm, which we call Procrustes Flow, starts from an initial estimate…
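Since the snippet cuts off before the algorithm, here is a hedged sketch of the factored-gradient-descent idea in the fully observed special case (minimizing ||UU^T − X*||_F^2 directly). The dimensions, random initialization, and step size are illustrative assumptions; Procrustes Flow itself handles general linear measurements and prescribes a particular initial estimate.

```python
import numpy as np

# Illustrative sketch, not the paper's algorithm: recover a rank-r PSD matrix
# X* = U* U*^T by gradient descent on the factor U, in the fully observed
# special case f(U) = ||U U^T - X*||_F^2. Procrustes Flow handles general
# linear measurements and uses a specific spectral initialization.
rng = np.random.default_rng(0)
d, r = 8, 2
U_star = rng.normal(size=(d, r)) / np.sqrt(d)
X_star = U_star @ U_star.T                 # ground-truth low-rank PSD matrix

U = 0.1 * rng.normal(size=(d, r))          # small random init (assumption)
eta = 0.05                                 # step size (assumption)
for _ in range(5000):
    grad = 4.0 * (U @ U.T - X_star) @ U    # gradient of f(U)
    U -= eta * grad

rel_err = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)
```

Note that U is only identified up to an orthogonal rotation (the "Procrustes" ambiguity), which is why the error is measured on UU^T rather than on U itself.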
We show that gradient descent converges to a local minimizer, almost surely with random initialization. This is proved by applying the Stable Manifold Theorem from dynamical systems theory.
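A toy instance of the stable-manifold mechanism (an invented example, not from the paper): for f(x, y) = (x² − 1)² + y², the only saddle sits at the origin, and its stable manifold under gradient descent is the measure-zero line x = 0, so a generic initialization reaches a minimizer at (±1, 0).

```python
import numpy as np

# Toy illustration of the stable-manifold argument (not from the paper):
# for f(x, y) = (x^2 - 1)^2 + y^2 the saddle at the origin attracts only
# initial points on the measure-zero line x = 0.
def grad(p):
    x, y = p
    return np.array([4.0 * x * (x**2 - 1.0), 2.0 * y])

def run_gd(p0, eta=0.05, steps=2000):
    p = np.array(p0, dtype=float)
    for _ in range(steps):
        p -= eta * grad(p)
    return p

on_manifold = run_gd([0.0, 1.0])   # starts on x = 0: converges to the saddle
generic = run_gd([0.3, -0.7])      # generic start: converges to a minimizer
```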
We propose a novel technique for analyzing adaptive sampling called the Simulator. Our approach differs from the existing methods by considering not how much information could be gathered by any…
We establish that first-order methods avoid saddle points for almost all initializations. Our results apply to a wide variety of first-order methods, including gradient descent, block coordinate…
We prove that the ordinary least-squares (OLS) estimator attains nearly minimax optimal performance for the identification of linear dynamical systems from a single observed trajectory. Our upper…
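The estimator itself is simple to state: regress successor states on current states along the trajectory. A minimal sketch, with the system matrix, noise level, and horizon as illustrative assumptions:

```python
import numpy as np

# OLS sketch for identifying x_{t+1} = A x_t + w_t from one trajectory.
# The system matrix, noise level, and horizon are illustrative assumptions.
rng = np.random.default_rng(0)
d, T = 3, 5000
A_true = np.array([[0.9, 0.1, 0.0],
                   [0.0, 0.8, 0.1],
                   [0.0, 0.0, 0.7]])

x = np.zeros((T + 1, d))
for t in range(T):
    x[t + 1] = A_true @ x[t] + 0.1 * rng.normal(size=d)   # process noise w_t

X_now, X_next = x[:-1], x[1:]
# A_hat^T solves the least-squares problem X_now @ B ~ X_next.
A_hat = np.linalg.lstsq(X_now, X_next, rcond=None)[0].T
est_err = np.linalg.norm(A_hat - A_true)
```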
We prove a query complexity lower bound on rank-one principal component analysis (PCA). We consider an oracle model where, given a symmetric matrix M ∈ ℝ^{d×d}, an algorithm is allowed to make T exact…
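This query model is the natural home of the power method, which spends exactly one matrix-vector product per iteration. A sketch on a diagonal matrix with known spectrum (an assumption made so the answer is easy to check, standing in for an arbitrary symmetric M):

```python
import numpy as np

# Power method in the matrix-vector query model: each iteration issues one
# exact product query w -> M w. The diagonal M with known spectrum is an
# illustrative assumption standing in for an arbitrary symmetric matrix.
rng = np.random.default_rng(0)
d = 50
M = np.diag(np.arange(1.0, d + 1.0))   # eigenvalues 1, 2, ..., 50

w = rng.normal(size=d)                 # random start
queries = 0
for _ in range(500):
    w = M @ w                          # one query per iteration
    w /= np.linalg.norm(w)
    queries += 1

top_eigenvalue_estimate = w @ (M @ w)  # Rayleigh quotient (one more query)
queries += 1
```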
A recommendation system confronts two opposing problems. In order to be practical, recommendation systems need to be fast and efficient, and to scale well in both computational complexity and memory cost…
A common problem in machine learning is to rank a set of n items based on pairwise comparisons. Here ranking refers to partitioning the items into sets of pre-specified sizes according to their…
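A simple baseline for this setting (a generic Borda-count-style procedure, not necessarily the paper's method): score each item by its empirical win rate over pairwise comparisons, sort, and cut the sorted order into groups of the pre-specified sizes. The comparison model and sample sizes below are illustrative assumptions.

```python
import numpy as np

# Borda-count-style baseline (illustrative, not necessarily the paper's
# method): rank n items from noisy pairwise comparisons, then partition
# the ranking into sets of pre-specified sizes.
rng = np.random.default_rng(0)
n, k = 8, 400                     # items, comparisons per pair (assumptions)
true_scores = np.linspace(0.0, 1.0, n)

wins = np.zeros(n)
counts = np.zeros(n)
for i in range(n):
    for j in range(i + 1, n):
        # Bradley-Terry-style comparison model (assumption).
        p_ij = 1.0 / (1.0 + np.exp(true_scores[j] - true_scores[i]))
        i_wins = int((rng.random(k) < p_ij).sum())
        wins[i] += i_wins
        wins[j] += k - i_wins
        counts[i] += k
        counts[j] += k

order = np.argsort(-wins / counts)          # empirical best first
sizes = [2, 3, 3]                           # pre-specified partition sizes
groups, start = [], 0
for s in sizes:
    groups.append(sorted(order[start:start + s].tolist()))
    start += s
```

With enough comparisons per pair, the win-rate ordering concentrates around the true score ordering, so the induced partition matches the one defined by the true scores.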