# sparsegl: An R Package for Estimating Sparse Group Lasso

@inproceedings{Liang2022sparseglAR, title={sparsegl: An R Package for Estimating Sparse Group Lasso}, author={Xiaoxuan Liang and Aaron Cohen and Anibal S{\'o}lon Heinsfeld and Franco Pestilli and Daniel J. McDonald}, year={2022} }

The sparse group lasso is a high-dimensional regression technique that is useful for problems whose predictors have a naturally grouped structure and where sparsity is encouraged at both the group and individual predictor level. In this paper we discuss a new R package for computing such regularized models. The intention is to provide highly optimized solution routines enabling analysis of very large datasets, especially in the context of sparse design matrices.
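The sparse group lasso penalty combines a lasso (ℓ1) term with a group lasso (per-group ℓ2 norm) term, which is what yields sparsity at both the group and the individual-predictor level. As an illustration of that two-level sparsity, here is a minimal Python sketch of the penalty's proximal operator, the core step in proximal-gradient solvers for this problem; the function name and interface are hypothetical and are not the sparsegl package's API.

```python
import numpy as np

def sgl_prox(beta, groups, lam, alpha, step=1.0):
    """Proximal operator of the sparse group lasso penalty
    lam * (alpha * ||beta||_1 + (1 - alpha) * sum_g sqrt(p_g) * ||beta_g||_2).
    Illustrative sketch only -- not sparsegl's internal routine."""
    # Elementwise soft-thresholding handles the l1 (within-group) part.
    out = np.sign(beta) * np.maximum(np.abs(beta) - step * alpha * lam, 0.0)
    # Group-wise soft-thresholding handles the l2 (group-level) part.
    for g in np.unique(groups):
        idx = groups == g
        w = np.sqrt(idx.sum())            # conventional sqrt(group size) weight
        norm = np.linalg.norm(out[idx])
        thresh = step * (1.0 - alpha) * lam * w
        if norm <= thresh:
            out[idx] = 0.0                # the entire group is zeroed out
        else:
            out[idx] *= 1.0 - thresh / norm  # the group shrinks toward zero
    return out

# Two groups of two predictors; both groups survive, but the small
# coefficients within each group are zeroed (within-group sparsity):
beta = np.array([3.0, 0.1, -2.0, 0.05])
groups = np.array([0, 0, 1, 1])
out = sgl_prox(beta, groups, lam=1.0, alpha=0.5)
```

With `alpha = 1` this reduces to the ordinary lasso soft-threshold, and with `alpha = 0` to the group lasso; intermediate values encourage sparsity both across and within groups, matching the behavior the abstract describes.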

## References

Showing 1-10 of 44 references.

### A Sparse-Group Lasso

- Computer Science
- 2013

A regularized model for linear regression with combined ℓ1 and ℓ2 penalties is introduced, and it is shown to produce the desired group-wise and within-group sparsity.

### The degrees of freedom of the Group Lasso for a General Design

- Computer Science, Mathematics (ICML)
- 2012

Concerns regression problems where covariates can be grouped in nonoverlapping blocks and only a few groups are assumed to be active, and studies the sensitivity of any group lasso solution to the observations.

### A fast unified algorithm for solving group-lasso penalized learning problems

- Computer Science (Stat. Comput.)
- 2015

A unified algorithm called groupwise-majorization-descent (GMD) is proposed for efficiently computing the solution paths of group-lasso penalized learning problems; it allows for general design matrices, without requiring the predictors to be group-wise orthonormal.

### Adaptive sparse group LASSO in quantile regression

- Computer Science (Adv. Data Anal. Classif.)
- 2021

This paper studies the introduction of the sparse group LASSO (SGL) to the quantile regression framework. Additionally, a more flexible version, an adaptive SGL, is proposed based on the adaptive idea,…

### Regression Shrinkage and Selection via the Lasso

- Computer Science
- 1996

A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.

### On the “degrees of freedom” of the lasso

- Computer Science
- 2004

The number of nonzero coefficients is an unbiased estimate of the degrees of freedom of the lasso, a conclusion that requires no special assumption on the predictors; the unbiased estimator is also shown to be asymptotically consistent.

### The Dantzig selector: Statistical estimation when P is much larger than n

- Computer Science
- 2007

Is it possible to estimate β reliably based on the noisy data y?

### Strong rules for discarding predictors in lasso‐type problems

- Computer Science (Journal of the Royal Statistical Society, Series B: Statistical Methodology)
- 2012

This work proposes strong rules for discarding predictors in lasso regression and related problems, that are very simple and yet screen out far more predictors than the SAFE rules, and derives conditions under which they are foolproof.

### Model selection and estimation in regression with grouped variables

- Mathematics
- 2006

We consider the problem of selecting grouped variables (factors) for accurate prediction in regression. Such a problem arises naturally in many practical situations with the multifactor…

### The biglasso Package: A Memory- and Computation-Efficient Solver for Lasso Model Fitting with Big Data in R

- Computer Science (R J.)
- 2020

An R package called biglasso tackles the challenge of fitting lasso-type models to ultrahigh-dimensional, multi-gigabyte data sets that cannot be accommodated by existing R packages; it is equipped with newly proposed, more efficient feature screening rules that substantially accelerate the computation.