Selective block minimization for faster convergence of limited memory large-scale linear models

@inproceedings{Chang2011SelectiveBM,
  title={Selective block minimization for faster convergence of limited memory large-scale linear models},
  author={Kai-Wei Chang and Dan Roth},
  booktitle={KDD},
  year={2011}
}
As the size of data sets used to build classifiers steadily increases, training a linear model efficiently with limited memory becomes essential. Several techniques deal with this problem by loading blocks of data from disk one at a time, but usually take a considerable number of iterations to converge to a reasonable model. Even the best block minimization techniques [1] require many block loads, since they treat all training examples uniformly. As disk I/O is expensive, reducing the amount of…
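The block-minimization idea the abstract describes can be sketched as follows: split the data into disk-resident blocks, run a dual coordinate-descent solver (here, for an L1-loss linear SVM) on one block at a time, and, following the "selective" intuition, keep a small in-memory cache of informative examples across block loads. This is a minimal illustration, not the paper's algorithm: the function name `sbm_train` and the specific caching heuristic (retaining examples with free dual variables, 0 < α < C) are assumptions made for the sketch.

```python
import numpy as np

def sbm_train(X, y, C=1.0, n_blocks=4, outer_iters=10, cache_size=50, seed=0):
    """Illustrative selective-block-minimization sketch: dual coordinate
    descent for an L1-loss linear SVM, processing one block of examples
    at a time while caching informative examples between block loads.

    The cache rule (keep examples with 0 < alpha < C) is a stand-in
    heuristic, not the selection rule from the paper."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)            # primal weight vector, w = sum_i alpha_i y_i x_i
    alpha = np.zeros(n)        # dual variables
    blocks = np.array_split(rng.permutation(n), n_blocks)
    cache = np.array([], dtype=np.int64)          # in-memory indices
    qdiag = np.einsum('ij,ij->i', X, X)           # Q_ii = x_i . x_i
    for _ in range(outer_iters):
        for block in blocks:
            # "Load" the block from disk and merge with the cached examples.
            idx = np.unique(np.concatenate([block, cache]))
            for i in rng.permutation(idx):
                if qdiag[i] == 0.0:
                    continue
                g = y[i] * X[i].dot(w) - 1.0      # dual gradient at example i
                a_new = min(max(alpha[i] - g / qdiag[i], 0.0), C)
                w += (a_new - alpha[i]) * y[i] * X[i]
                alpha[i] = a_new
            # Selectively retain examples whose dual variables are free.
            free = idx[(alpha[idx] > 1e-12) & (alpha[idx] < C - 1e-12)]
            cache = free[:cache_size]
    return w
```

On a small linearly separable toy set (e.g. `np.sign(X @ sbm_train(X, y))`), a few outer passes recover a separating hyperplane; the point of the sketch is only that cached examples get revisited on every block load, while the rest of each block is touched only when its block is loaded.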
This paper has 40 citations.