Lazy Sparse Stochastic Gradient Descent for Regularized Multinomial Logistic Regression

Abstract

Stochastic gradient descent efficiently estimates maximum likelihood logistic regression coefficients from sparse input data. Regularization with respect to a prior coefficient distribution destroys the sparsity of the gradient evaluated at a single example. Sparsity is restored by lazily shrinking a coefficient along the cumulative gradient of the prior…
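The abstract describes the lazy-shrinkage idea only at a high level, so a rough illustration may help. Below is a minimal sketch in Python of lazy regularization for the simpler binary case with a Gaussian (L2) prior and a constant learning rate; the function name lazy_sparse_sgd, its parameters, and the data layout are illustrative assumptions, not taken from the paper, which treats the multinomial model and the prior coefficient distribution more generally.

import math
from collections import defaultdict

def lazy_sparse_sgd(data, lam=0.1, eta=0.05, epochs=5):
    """Binary logistic regression via SGD with a lazily applied L2 penalty.

    `data` is a list of (features, label) pairs, where `features` is a
    sparse dict mapping feature index -> value and `label` is 0 or 1.
    The prior's shrinkage is deferred: each coefficient records the step
    at which it was last regularized, and the accumulated shrinkage for
    the skipped steps is applied only when the feature next appears.
    """
    w = defaultdict(float)        # coefficients, stored sparsely
    last_step = defaultdict(int)  # step at which each coefficient was last shrunk
    step = 0
    for _ in range(epochs):
        for features, y in data:
            step += 1
            # Catch up on the prior's gradient for the active features only:
            # k skipped steps of w_j <- w_j * (1 - eta * lam).
            for j in features:
                skipped = step - last_step[j]
                if skipped > 0:
                    w[j] *= (1.0 - eta * lam) ** skipped
                    last_step[j] = step
            # The likelihood gradient touches only the nonzero features.
            z = sum(w[j] * x for j, x in features.items())
            p = 1.0 / (1.0 + math.exp(-z))
            err = y - p
            for j, x in features.items():
                w[j] += eta * err * x
    # Flush the remaining shrinkage so every coefficient is up to date.
    for j in list(w):
        w[j] *= (1.0 - eta * lam) ** (step - last_step[j])
    return dict(w)

# Toy usage: two sparse examples over an implicitly large feature space.
if __name__ == "__main__":
    data = [({0: 1.0, 7: 2.0}, 1), ({3: 1.0, 7: 1.0}, 0)]
    print(lazy_sparse_sgd(data))

Because both the likelihood gradient and the deferred prior shrinkage touch only the features that actually occur in an example, the per-example cost stays proportional to the number of nonzero features rather than the full dimensionality.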

Topics

Statistics

[Figure: Citations per Year, 2015–2017]

Citation Velocity: 44

Averaging 44 citations per year over the last 3 years.

Cite this paper

@inproceedings{Carpenter2008LazySS,
  title  = {Lazy Sparse Stochastic Gradient Descent for Regularized Multinomial Logistic Regression},
  author = {Bob Carpenter},
  year   = {2008}
}