
- Yin Tat Lee, Aaron Sidford
- 2013 IEEE 54th Annual Symposium on Foundations of…
- 2013

In this paper we show how to accelerate randomized coordinate descent methods and achieve faster convergence rates without paying per-iteration costs in asymptotic running time. In particular, we…
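As context for the abstract above, a minimal sketch of plain (unaccelerated) randomized coordinate descent on a quadratic objective — the baseline the paper speeds up, not the paper's accelerated method — might look like:

```python
import numpy as np

def randomized_coordinate_descent(A, b, iters=5000, seed=0):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    by exactly minimizing over one randomly chosen coordinate per iteration."""
    rng = np.random.default_rng(seed)
    n = len(b)
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.integers(n)
        # Partial derivative along coordinate i: grad_i = (A x - b)_i.
        g = A[i] @ x - b[i]
        # Exact 1-D minimization: Newton step with the diagonal entry.
        x[i] -= g / A[i, i]
    return x

# Toy instance: the minimizer of f is the solution of A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = randomized_coordinate_descent(A, b)
```

Accelerated variants in the literature add a Nesterov-style momentum sequence on top of this coordinate update; this sketch shows only the per-coordinate step they share.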

- Yair Carmon, John C. Duchi, Oliver Hinder, Aaron Sidford
- SIAM Journal on Optimization
- 2018

We present an accelerated gradient method for non-convex optimization problems with Lipschitz continuous first and second derivatives. The method requires time O(ε^(-7/4) log(1/ε)) to find an…

- Roy Frostig, Rong Ge, Sham M. Kakade, Aaron Sidford
- ICML
- 2015

We develop a family of accelerated stochastic algorithms that minimize sums of convex functions. Our algorithms improve upon the fastest running time for empirical risk minimization (ERM), and in…

In this paper, we present a simple combinatorial algorithm that solves symmetric diagonally dominant (SDD) linear systems in nearly-linear time. It uses little of the machinery that previously…
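For contrast with the nearly-linear-time SDD solver described above, a classical baseline for such systems is Gauss-Seidel iteration, which is guaranteed to converge when A is strictly diagonally dominant. This is an illustrative textbook method, not the paper's algorithm:

```python
import numpy as np

def gauss_seidel(A, b, iters=100):
    """Solve A x = b by Gauss-Seidel iteration, sweeping coordinates in
    order and using the freshest values. Converges when A is strictly
    diagonally dominant."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(iters):
        for i in range(n):
            # Residual for row i, excluding the diagonal term, then solve for x[i].
            x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
    return x

# Strictly diagonally dominant (hence SDD) toy system.
A = np.array([[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
x = gauss_seidel(A, b)
```

Each Gauss-Seidel sweep costs O(nnz(A)), but the number of sweeps needed can grow with the condition number, which is the gap the nearly-linear-time solvers close.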

- Chi Jin, Sham M. Kakade, Cameron Musco, Praneeth Netrapalli, Aaron Sidford
- ArXiv
- 2015

In this paper we provide faster algorithms and improved sample complexities for approximating the top eigenvector of a matrix AᵀA. In particular we give the following results for computing an…
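The standard baseline for the problem in that abstract is power iteration on AᵀA; a minimal sketch (the paper's faster methods use shift-and-invert preconditioning instead) is:

```python
import numpy as np

def top_eigvec_power(A, iters=1000, seed=0):
    """Approximate the top eigenvector of A^T A by power iteration.
    Each step multiplies by A then A^T, avoiding forming A^T A explicitly."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = A.T @ (A @ v)     # one application of A^T A
        v /= np.linalg.norm(v)
    return v

# Toy instance: a random 20 x 5 matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
v = top_eigvec_power(A)
```

Power iteration converges at a rate governed by the ratio of the top two eigenvalues of AᵀA, which is exactly the dependence the shift-and-invert approach improves.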

- Yin Tat Lee, Aaron Sidford
- 2014 IEEE 55th Annual Symposium on Foundations of…
- 2014

In this paper, we present a new algorithm for solving linear programs that requires only Õ(√rank(A)·L) iterations, where A is the constraint matrix of a linear program with m constraints, n…

- Roy Frostig, Rong Ge, Sham M. Kakade, Aaron Sidford
- COLT
- 2015

In many estimation problems, e.g. linear and logistic regression, we wish to minimize an unknown objective given only unbiased samples of the objective function. Furthermore, we aim to achieve this…

- Yair Carmon, John C. Duchi, Oliver Hinder, Aaron Sidford
- ICML
- 2017

We develop and analyze a variant of Nesterov's accelerated gradient descent (AGD) for minimization of smooth non-convex functions. We prove that one of two cases occurs: either our AGD variant…

This work provides improved guarantees for streaming principal component analysis (PCA). Given A₁, …, Aₙ ∈ ℝ^(d×d) sampled independently from distributions satisfying E[Aᵢ] = Σ for Σ ⪰ 0, this work…

This work characterizes the benefits of averaging techniques widely used in conjunction with stochastic gradient descent (SGD). In particular, this work sharply analyzes: (1) mini-batching, a method…
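To make the two techniques named in that abstract concrete, here is a toy sketch of mini-batch SGD with tail averaging (averaging only the second half of the iterates) on a synthetic least-squares problem. The data and parameters are illustrative assumptions, not from the paper:

```python
import numpy as np

def minibatch_sgd_tail_average(X, y, lr=0.1, batch=10, epochs=200, seed=0):
    """Mini-batch SGD for least squares 0.5 ||X w - y||^2 / n, returning
    both the last iterate and the tail average (mean of the second half
    of the iterates, a common variance-reduction device)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    avg = np.zeros(d)
    count = 0
    total = epochs * ((n + batch - 1) // batch)
    step = 0
    for _ in range(epochs):
        idx = rng.permutation(n)
        for s in range(0, n, batch):
            j = idx[s:s + batch]
            # Mini-batch gradient of the least-squares objective.
            g = X[j].T @ (X[j] @ w - y[j]) / len(j)
            w -= lr * g
            step += 1
            if step > total // 2:  # tail-average the second half of the run
                count += 1
                avg += (w - avg) / count
    return w, avg

# Noiseless synthetic data, so both estimates should recover w_true.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))
w_true = rng.standard_normal(5)
y = X @ w_true
w, avg = minibatch_sgd_tail_average(X, y)
```

With noisy labels, the last iterate of constant-step SGD oscillates in a noise ball while the tail average damps that variance; the noiseless setup here just checks the mechanics.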