- Shai Shalev-Shwartz, Yoram Singer, Nathan Srebro, Andrew Cotter
- Math. Program.
- 2007

We describe and analyze a simple and effective iterative algorithm for solving the optimization problem cast by Support Vector Machines (SVM). Our method alternates between stochastic gradient descent steps and projection steps. We prove that the number of iterations required to obtain a solution of accuracy ε is Õ(1/ε). In contrast, previous…
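For concreteness, one iteration of such an alternating stochastic-gradient-and-projection scheme for the SVM objective might look like the following sketch. The 1/(λt) step size and the projection radius 1/√λ follow the Pegasos recipe; the function and variable names are illustrative, not taken from the paper's code.

```python
import numpy as np

def pegasos_step(w, x, y, lam, t):
    """One Pegasos-style iteration: a stochastic subgradient step on the
    lambda-regularized hinge loss, then projection onto the ball of
    radius 1/sqrt(lam). Names are illustrative assumptions."""
    eta = 1.0 / (lam * t)                  # step size schedule 1/(lambda * t)
    if y * np.dot(w, x) < 1:               # margin violated: subgradient includes -y*x
        w = (1 - eta * lam) * w + eta * y * x
    else:                                  # margin satisfied: only the regularizer acts
        w = (1 - eta * lam) * w
    radius = 1.0 / np.sqrt(lam)            # projection step: rescale into the ball
    norm = np.linalg.norm(w)
    if norm > radius:
        w = w * (radius / norm)
    return w
```

The projection is cheap because the feasible set is a Euclidean ball, so it reduces to a single rescaling of w.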

- Shai Shalev-Shwartz, Koby Crammer, Ofer Dekel, Yoram Singer
- NIPS
- 2003

We present a unified view for online classification, regression, and uniclass problems. This view leads to a single algorithmic framework for the three problems. We prove worst case loss bounds for various algorithms for both the realizable case and the non-realizable case. A conversion of our main online algorithm to the setting of batch learning is also…

We describe efficient algorithms for projecting a vector onto the ℓ1-ball. We present two methods for projection. The first performs exact projection in *O*(*n*) expected time, where *n* is the dimension of the space. The second works on vectors *k* of whose elements are perturbed outside the ℓ1-ball,…
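The exact projection described above is often implemented with the standard sort-and-threshold routine, sketched below. Note this variant runs in O(n log n) due to the sort; the paper's randomized pivoting approach is what achieves O(n) expected time. The function name and radius parameter are illustrative.

```python
import numpy as np

def project_l1_ball(v, z=1.0):
    """Euclidean projection of v onto the l1-ball of radius z via
    sort-and-threshold (O(n log n)); the O(n) expected-time variant
    replaces the sort with randomized pivoting."""
    if np.abs(v).sum() <= z:
        return v.copy()                    # already inside the ball
    u = np.sort(np.abs(v))[::-1]           # magnitudes in descending order
    css = np.cumsum(u)
    # largest index rho with u[rho] * (rho + 1) > css[rho] - z
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - z))[0][-1]
    theta = (css[rho] - z) / (rho + 1.0)   # shared soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)
```

The result keeps the signs of v, shrinks every coordinate by the same threshold θ, and zeroes out the small ones, which is what makes the projection useful for producing sparse iterates.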

- Shai Shalev-Shwartz
- Foundations and Trends in Machine Learning
- 2012

Online learning is a well-established learning paradigm with both theoretical and practical appeal. The goal of online learning is to make a sequence of accurate predictions given knowledge of the correct answers to previous prediction tasks and possibly additional available information. Online learning has been studied in several research fields…

Stochastic Gradient Descent (SGD) has become popular for solving large-scale supervised machine learning optimization problems such as SVM, due to its strong theoretical guarantees. While the closely related Dual Coordinate Ascent (DCA) method has been implemented in various software packages, it has so far lacked a good convergence analysis. This paper…
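To illustrate what a dual coordinate ascent pass looks like for the hinge-loss SVM this abstract refers to, here is a minimal sketch: each dual variable αᵢ ∈ [0, 1] gets a closed-form coordinate update while the primal vector is kept at w = (1/(λn)) Σᵢ αᵢ yᵢ xᵢ. This is the textbook linear-SVM update, not code from the paper.

```python
import numpy as np

def dca_epoch(X, y, alpha, w, lam):
    """One sequential pass of dual coordinate ascent for the linear SVM
    dual (hinge loss, lambda-regularized). Maintains the invariant
    w = (1 / (lam * n)) * sum_i alpha[i] * y[i] * X[i]."""
    n = len(y)
    for i in range(n):
        g = 1.0 - y[i] * np.dot(w, X[i])   # dual gradient along coordinate i
        old = alpha[i]
        # closed-form maximizer along coordinate i, clipped to the box [0, 1]
        alpha[i] = np.clip(old + (lam * n * g) / np.dot(X[i], X[i]), 0.0, 1.0)
        w += ((alpha[i] - old) / (lam * n)) * y[i] * X[i]  # keep w in sync
    return alpha, w
```

The stochastic variant analyzed in the SDCA line of work picks the coordinate i uniformly at random instead of sweeping in order.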

- Shai Shalev-Shwartz, Yoram Singer, Andrew Y. Ng
- ICML
- 2004

We describe and analyze an online algorithm for supervised learning of pseudo-metrics. The algorithm receives pairs of instances and predicts their similarity according to a pseudo-metric. The pseudo-metrics we use are quadratic forms parameterized by positive semi-definite matrices. The core of the algorithm is an update rule that is based on successive…
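The quadratic-form family mentioned above has a one-line definition, sketched below: d_A(x, x') = √((x − x')ᵀ A (x − x')) for a positive semi-definite A. The paper's online update rule for A is not reproduced here; this only shows why a rank-deficient A yields a pseudo-metric (distinct points can be at distance zero) rather than a metric.

```python
import numpy as np

def pseudo_metric(x, xp, A):
    """Distance under the quadratic form parameterized by a PSD matrix A:
    d_A(x, x') = sqrt((x - x')^T A (x - x')). With A = I this is the
    ordinary Euclidean distance."""
    d = x - xp
    return float(np.sqrt(d @ A @ d))
```

For example, with A = diag(4, 0) any two points differing only in the second coordinate are at distance 0, which is exactly the pseudo-metric (as opposed to metric) behavior.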

- Shai Shalev-Shwartz, Tong Zhang
- Journal of Machine Learning Research
- 2013

- John C. Duchi, Shai Shalev-Shwartz, Yoram Singer, Ambuj Tewari
- COLT
- 2010

We present a new method for regularized convex optimization and analyze it under both online and stochastic optimization settings. In addition to unifying previously known first-order algorithms, such as the projected gradient method, mirror descent, and forward-backward splitting, our method yields new analysis and algorithms. We also derive specific…
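As a point of reference for the algorithms the abstract unifies, one forward-backward splitting step for a composite objective f(w) + λ‖w‖₁ is sketched below: a gradient (forward) step on the smooth part f, then the ℓ1 proximal (backward) step, which is soft-thresholding. This is the generic textbook step, not the paper's unified method.

```python
import numpy as np

def forward_backward_step(w, grad_f, eta, lam):
    """One forward-backward splitting step for min_w f(w) + lam * ||w||_1,
    given grad_f = the gradient of f at w and step size eta."""
    v = w - eta * grad_f                                        # forward: gradient step on f
    return np.sign(v) * np.maximum(np.abs(v) - eta * lam, 0.0)  # backward: l1 prox (soft-threshold)
```

Swapping the ℓ1 prox for a Euclidean projection recovers the projected gradient method, which is the sense in which these first-order schemes sit in one family.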

- Shai Shalev-Shwartz, Tong Zhang
- Math. Program.
- 2014

We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge…