# From safe screening rules to working sets for faster Lasso-type solvers

```bibtex
@article{Massias2017FromSS,
  title   = {From safe screening rules to working sets for faster Lasso-type solvers},
  author  = {Mathurin Massias and Alexandre Gramfort and Joseph Salmon},
  journal = {ArXiv},
  year    = {2017},
  volume  = {abs/1703.07285}
}
```

Convex sparsity-promoting regularizations are ubiquitous in modern statistical learning. By construction, they yield solutions with few non-zero coefficients, which correspond to saturated constraints in the dual optimization formulation. Working set (WS) strategies are generic optimization techniques that solve simpler problems considering only a subset of constraints, whose indices form the WS. Working set methods therefore involve two nested iterations: the outer loop …
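To make the working set idea concrete, here is a minimal NumPy sketch of a WS loop for the Lasso. This is an illustrative toy, not the paper's algorithm: the feature-scoring rule (magnitude of the dual constraint |X_j^T r|), the working set size `ws_size`, and the stopping tolerances are all assumptions made for the example.

```python
import numpy as np

def soft_threshold(x, t):
    # Soft-thresholding: the proximal operator of t * |.|_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(X, y, alpha, w, max_iter=200, tol=1e-8):
    # Cyclic coordinate descent on 0.5 * ||y - Xw||^2 + alpha * ||w||_1.
    lips = (X ** 2).sum(axis=0)  # per-coordinate Lipschitz constants
    R = y - X @ w                # residual, kept up to date
    for _ in range(max_iter):
        max_delta = 0.0
        for j in range(X.shape[1]):
            if lips[j] == 0.0:
                continue
            old = w[j]
            w[j] = soft_threshold(old + X[:, j] @ R / lips[j], alpha / lips[j])
            if w[j] != old:
                R += (old - w[j]) * X[:, j]
                max_delta = max(max_delta, abs(w[j] - old))
        if max_delta < tol:
            break
    return w

def working_set_lasso(X, y, alpha, ws_size=10, n_outer=20):
    # Outer loop: score features by how violated their dual constraint
    # |X_j^T r| is, then solve the Lasso restricted to the top-scoring
    # features (always keeping the current support in the working set).
    w = np.zeros(X.shape[1])
    for _ in range(n_outer):
        R = y - X @ w
        scores = np.abs(X.T @ R)
        support = np.flatnonzero(w)
        scores[support] = np.inf  # current support always stays in the WS
        ws = np.argsort(scores)[::-1][:max(ws_size, support.size)]
        w_ws = lasso_cd(X[:, ws], y, alpha, w[ws].copy())
        w[:] = 0.0
        w[ws] = w_ws
        # Stop once every excluded feature satisfies its dual constraint.
        R = y - X @ w
        if np.all(np.abs(X.T @ R) <= alpha * (1 + 1e-6)):
            break
    return w
```

Each inner solve only touches the `ws` columns of `X`, which is where the speed-up comes from when the solution is sparse: most coordinates never enter the working set at all.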



#### Citations

##### Publications citing this paper.

Showing 6 of 14 citations.

## Exploiting regularity in sparse Generalized Linear Models


Cites background and methods.

## Large scale Lasso with windowed active set for convolutional spike sorting


Cites background.

## Stable Safe Screening and Structured Dictionaries for Faster $\ell_1$ Regularization


Cites methods.

## A Fast, Principled Working Set Algorithm for Exploiting Piecewise Linear Structure in Convex Problems


Cites background.

## Celer: a Fast Solver for the Lasso with Dual Extrapolation


Cites methods.

## On high dimensional regression: computational and statistical perspectives (École Normale Supérieure)


Cites background and methods.

#### References

##### Publications referenced by this paper.

Showing 7 of 35 references.

## Blitz: A Principled Meta-Algorithm for Scaling Sparse Optimization


Highly influential.

## A Primer on Coordinate Descent Algorithms


Highly influential.

## Coordinate Descent Converges Faster with the Gauss-Southwell Rule Than Random Selection


Highly influential.

## Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization


Highly influential.

## On the Convergence of Block Coordinate Descent Type Methods


Highly influential.

## Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization


Highly influential.

## Pathwise Coordinate Optimization


Highly influential.