Communication-efficient Sparse Regression

@article{Lee2017CommunicationefficientSR,
  title={Communication-efficient Sparse Regression},
  author={Jason D. Lee and Qiang Liu and Yuekai Sun and Jonathan E. Taylor},
  journal={Journal of Machine Learning Research},
  year={2017},
  volume={18},
  pages={5:1-5:30}
}
We devise a communication-efficient approach to distributed sparse regression in the high-dimensional setting. The key idea is to average "debiased" or "desparsified" lasso estimators. We show the approach converges at the same rate as the lasso as long as the dataset is not split across too many machines, and consistently estimates the support under weaker conditions than the lasso. On the computational side, we propose a new parallel and computationally-efficient algorithm to compute the…
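The averaging idea in the abstract can be sketched as follows: each machine fits a lasso on its local data, applies the debiasing correction, and the coordinator averages the debiased estimates. This is a minimal illustration, not the paper's algorithm; in particular, the debiasing matrix here is a plain pseudo-inverse of the local sample covariance, whereas the actual method estimates it via nodewise lasso, and all data, dimensions, and tuning values below are invented for the example.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_machines, n_per, p = 4, 200, 10          # toy sizes, chosen for illustration
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]                # sparse ground-truth coefficients

def debiased_lasso(X, y, alpha=0.1):
    """Lasso fit plus a one-step debiasing correction.

    M is a stand-in for an estimate of the inverse covariance; the
    paper uses a nodewise-lasso estimate in the high-dimensional case.
    """
    n = X.shape[0]
    theta = Lasso(alpha=alpha).fit(X, y).coef_
    Sigma_hat = X.T @ X / n
    M = np.linalg.pinv(Sigma_hat)
    # Debiased estimator: theta + M X^T (y - X theta) / n
    return theta + M @ X.T @ (y - X @ theta) / n

# Each "machine" holds an independent shard of the data.
local_estimates = []
for _ in range(n_machines):
    X = rng.standard_normal((n_per, p))
    y = X @ beta + 0.5 * rng.standard_normal(n_per)
    local_estimates.append(debiased_lasso(X, y))

# One round of communication: average the debiased estimates.
avg_estimate = np.mean(local_estimates, axis=0)
```

The averaged estimator is dense; in practice one would threshold it to recover a sparse support, which is part of what the paper analyzes.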

