# Fast, Provably Convergent IRLS Algorithm for p-norm Linear Regression

```bibtex
@article{Adil2019FastPC,
  title   = {Fast, Provably Convergent IRLS Algorithm for p-norm Linear Regression},
  author  = {Deeksha Adil and Richard Peng and Sushant Sachdeva},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1907.07167}
}
```

Linear regression in the $\ell_p$-norm is a canonical optimization problem that arises in several applications, including sparse recovery, semi-supervised learning, and signal processing. Generic convex optimization algorithms for solving $\ell_p$-regression are slow in practice. Iteratively Reweighted Least Squares (IRLS) is an easy-to-implement family of algorithms for solving these problems that has been studied for over 50 years. However, these algorithms often diverge for $p > 3$, and since the…
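To make the abstract concrete, here is a minimal sketch of the classical IRLS scheme the paper builds on: minimizing $\|Ax - b\|_p$ by repeatedly solving a weighted least-squares problem whose weights $w_i = |r_i|^{p-2}$ come from the current residual. This is the textbook iteration, not the accelerated, provably convergent variant of Adil, Peng, and Sachdeva; the function name and parameters are illustrative.

```python
import numpy as np

def irls_pnorm(A, b, p=4, iters=50, eps=1e-8):
    """Basic IRLS sketch for min_x ||Ax - b||_p (p > 2).

    Each iteration solves a weighted least-squares problem with
    weights w_i = |r_i|^(p-2) from the current residual r = Ax - b.
    Without the damping/acceleration of the paper, this plain scheme
    can diverge for p > 3, which is exactly the issue the paper fixes.
    """
    # Warm start from the ordinary (p = 2) least-squares solution.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        r = A @ x - b
        # Clamp tiny residuals so the weights stay finite.
        w = np.maximum(np.abs(r), eps) ** (p - 2)
        # Normal equations of the weighted problem: A^T W A x = A^T W b.
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)
    return x
```

On a square invertible system the iteration just reproduces the exact solution; the interesting (and potentially divergent) behavior appears on overdetermined systems with large $p$.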

#### Supplemental Code

GitHub repository (via Papers with Code): Fast IRLS code for solving p-norm regression problems.

#### Citations

##### Publications citing this paper.

Showing 1–6 of 6 citations.

## Acceleration with a Ball Optimization Oracle


## Improved Primal–Dual Interior-Point Method Using the Lawson-Norm for Inverse Problems


## MS BaCoN

#### References

##### Publications referenced by this paper.

Showing 1–10 of 62 references.

## Algorithms for 𝓁p-based semi-supervised learning on graphs


## Proceedings of the 49th Annual ACM SIGACT Symposium on Theory of Computing


## Laplacian_lp_graph_ssl
