Parallel and Communication Avoiding Least Angle Regression

@article{Das2021ParallelAC,
  title={Parallel and Communication Avoiding Least Angle Regression},
  author={S. Das and J. Demmel and K. Fountoulakis and L. Grigori and Michael W. Mahoney},
  journal={ArXiv},
  year={2021},
  volume={abs/1905.11340}
}
We are interested in parallelizing the Least Angle Regression (LARS) algorithm for fitting linear regression models to high-dimensional data. We consider two parallel and communication-avoiding versions of the basic LARS algorithm. The two algorithms apply to data with different layout patterns (one is appropriate for row-partitioned data, the other for column-partitioned data), and they have different asymptotic costs and practical performance. The first is bLARS, a block version of LARS that selects b columns per iteration and is suited to row-partitioned data; the second is Tournament-bLARS (T-bLARS), a tournament version in which processors run LARS computations in parallel to compete over which b columns to add, suited to column-partitioned data.
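
For orientation, below is a minimal sketch of the baseline (sequential) LARS fit that the paper parallelizes. It uses scikit-learn's Lars estimator purely for illustration; the paper's bLARS and T-bLARS variants are not part of that library, and the problem sizes here are arbitrary.

import numpy as np
from sklearn.linear_model import Lars

# High-dimensional setting: many more columns (features) than rows (samples).
rng = np.random.default_rng(0)
n, d = 100, 500
X = rng.standard_normal((n, d))
beta = np.zeros(d)
beta[:5] = rng.standard_normal(5)          # sparse ground-truth coefficients
y = X @ beta + 0.01 * rng.standard_normal(n)

# Classic LARS activates one column per iteration along the equiangular
# direction; per the abstract above, bLARS/T-bLARS instead pick b columns
# per iteration to reduce arithmetic, latency, and bandwidth costs.
model = Lars(n_nonzero_coefs=10).fit(X, y)
print("active columns:", np.flatnonzero(model.coef_))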
