Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization

Abstract

Stochastic Gradient Descent (SGD) has become popular for solving large-scale supervised machine learning optimization problems such as SVM training, due to its strong theoretical guarantees. While the closely related Dual Coordinate Ascent (DCA) method has been implemented in various software packages, it has so far lacked a good convergence analysis. This paper…
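
As a rough illustration of the dual coordinate ascent idea described in the abstract, below is a minimal sketch of SDCA for the L2-regularized hinge loss (a linear SVM), which keeps a primal vector w consistent with the dual variables and exactly maximizes the dual over one randomly chosen coordinate per step. The function name sdca_hinge and the hyperparameters (lam, epochs) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sdca_hinge(X, y, lam=0.1, epochs=10, seed=0):
    """Minimal SDCA sketch for the L2-regularized hinge loss (linear SVM).

    Maintains the primal-dual link w = (1/(lam*n)) * sum_i alpha_i * y_i * x_i
    and, at each step, maximizes the dual objective exactly over one randomly
    chosen coordinate alpha_i (closed form for the hinge loss).
    Names and hyperparameters here are illustrative, not from the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)              # dual variables, each kept in [0, 1]
    w = np.zeros(d)                  # primal vector induced by alpha
    sq_norms = (X ** 2).sum(axis=1)  # precomputed ||x_i||^2
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w)
            # Exact coordinate-wise dual maximizer, clipped to [0, 1].
            step = (1.0 - margin) / (sq_norms[i] / (lam * n) + 1e-12)
            new_alpha = np.clip(alpha[i] + step, 0.0, 1.0)
            # Keep w consistent with the updated dual variable.
            w += (new_alpha - alpha[i]) * y[i] * X[i] / (lam * n)
            alpha[i] = new_alpha
    return w, alpha
```

For example, with labels y in {-1, +1} and the rows of X as feature vectors, sdca_hinge(X, y) returns an approximate SVM weight vector; each epoch costs one pass over the data, the same per-iteration cost as SGD.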



Statistics

[Chart: citations per year, 2012–2018]

287 Citations

Semantic Scholar estimates that this publication has 287 citations based on the available data.

