Distributed Learning with Regularized Least Squares

@article{Lin2017DistributedLW,
  title={Distributed Learning with Regularized Least Squares},
  author={Shao-Bo Lin and Xin Guo and Ding-Xuan Zhou},
  journal={Journal of Machine Learning Research},
  year={2017},
  volume={18},
  pages={92:1-92:31}
}
We study distributed learning with the least squares regularization scheme in a reproducing kernel Hilbert space (RKHS). By a divide-and-conquer approach, the algorithm partitions a data set into disjoint subsets, applies the least squares regularization scheme to each subset to produce an output function, and then takes the average of the individual output functions as the final global estimator or predictor. We show, with error bounds and learning rates in expectation in both the L2-metric and the RKHS-metric, that this global output function is a good approximation to the estimator obtained by processing the whole data set on a single machine.
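
The scheme described in the abstract has a simple closed form on each subset: with kernel matrix K over the local data D_j, the local least squares regularization estimator is f_j = sum_i alpha_i K(x_i, .), where alpha = (K + lambda * |D_j| * I)^{-1} y, and the global predictor is the average of the f_j. Below is a minimal sketch of this divide-and-conquer estimator in NumPy, assuming a Gaussian kernel and equal-size splits; all function names and parameter values are illustrative, not from the paper.

    import numpy as np

    def gaussian_kernel(X, Z, sigma=1.0):
        # Pairwise Gaussian (RBF) kernel matrix between rows of X and Z.
        sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq / (2 * sigma ** 2))

    def krr_fit(X, y, lam, sigma=1.0):
        # Least squares regularization on one subset:
        # solve (K + lam * n * I) alpha = y, where n = |D_j|.
        # The factor n appears because the empirical risk is
        # averaged (1/n) over the subset before adding lam * ||f||_K^2.
        n = X.shape[0]
        K = gaussian_kernel(X, X, sigma)
        alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
        return X, alpha

    def krr_predict(model, X_test, sigma=1.0):
        # Evaluate f_j(x) = sum_i alpha_i K(x_i, x) on test points.
        X_train, alpha = model
        return gaussian_kernel(X_test, X_train, sigma) @ alpha

    def distributed_krr(X, y, m, lam, sigma=1.0):
        # Divide-and-conquer: fit the regularized scheme on m disjoint
        # subsets, then average the m local predictors.
        models = [krr_fit(Xj, yj, lam, sigma)
                  for Xj, yj in zip(np.array_split(X, m),
                                    np.array_split(y, m))]
        return lambda X_test: np.mean(
            [krr_predict(mod, X_test, sigma) for mod in models], axis=0)

A toy usage example on synthetic data (values chosen arbitrarily for illustration):

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(1000, 1))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(1000)
    f_bar = distributed_krr(X, y, m=10, lam=1e-3)
    X_test = np.linspace(-1, 1, 5)[:, None]
    print(f_bar(X_test))

Note the computational motivation for the divide-and-conquer step: each local fit solves an |D_j| x |D_j| linear system, so splitting n samples into m subsets replaces one O(n^3) solve with m solves of roughly O((n/m)^3) each, which can also run in parallel.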