Corpus ID: 251371706

sparsegl: An R Package for Estimating Sparse Group Lasso

Xiaoxuan Liang, Aaron Cohen, Anibal Sólon Heinsfeld, Franco Pestilli, Daniel J. McDonald
The sparse group lasso is a high-dimensional regression technique that is useful for problems whose predictors have a naturally grouped structure and where sparsity is encouraged at both the group and individual predictor level. In this paper we discuss a new R package for computing such regularized models. The intention is to provide highly optimized solution routines enabling analysis of very large datasets, especially in the context of sparse design matrices. 
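The sparse group lasso combines an elementwise ℓ1 penalty with a groupwise ℓ2 penalty, and path solvers of this kind are typically built around the penalty's proximal operator, which composes elementwise and groupwise soft-thresholding. A minimal NumPy sketch of that operator for a single group (the function name and signature are illustrative, not part of the sparsegl API):

```python
import numpy as np

def sgl_prox(z, lam, alpha, step=1.0):
    """Proximal operator of the sparse group lasso penalty for one group:
    step * lam * (alpha * ||b||_1 + (1 - alpha) * ||b||_2).

    Elementwise soft-thresholding followed by group soft-thresholding;
    this composition is the exact prox of the combined penalty.
    """
    # elementwise soft-threshold (the l1 part)
    u = np.sign(z) * np.maximum(np.abs(z) - step * lam * alpha, 0.0)
    # group soft-threshold (the l2 part): shrink the whole block toward zero
    norm = np.linalg.norm(u)
    if norm == 0.0:
        return u
    scale = max(0.0, 1.0 - step * lam * (1.0 - alpha) / norm)
    return scale * u
```

Applied group by group inside a proximal-gradient or block coordinate descent loop, this operator yields estimates that are sparse both across groups (whole blocks zeroed) and within active groups.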



A Sparse-Group Lasso

A regularized model for linear regression with ℓ1 and ℓ2 penalties is introduced, and it is shown to have the desired effect of group-wise and within-group sparsity.

The degrees of freedom of the Group Lasso for a General Design

This work concerns regression problems where covariates can be grouped in nonoverlapping blocks and only a few groups are assumed to be active, and studies the sensitivity of any group lasso solution to the observations.

A fast unified algorithm for solving group-lasso penalize learning problems

A unified algorithm called groupwise-majorization-descent (GMD) efficiently computes the solution paths of group-lasso penalized learning problems and allows for general design matrices, without requiring the predictors to be group-wise orthonormal.
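The core GMD idea for the group lasso is to majorize the quadratic loss on each group by its largest eigenvalue, take a gradient step, and then group soft-threshold. A small sketch of one such block update (variable names and the interface are mine, not from the GMD paper's software):

```python
import numpy as np

def gmd_update(beta_g, X_g, resid, lam):
    """One groupwise-majorization-descent-style step for group g of a
    group lasso (sketch of the GMD idea; not a full solver).
    Majorizing by the largest eigenvalue of X_g'X_g means no
    group-wise orthonormality of the design is needed.
    """
    gamma = np.linalg.eigvalsh(X_g.T @ X_g).max()  # majorization constant
    u = beta_g + (X_g.T @ resid) / gamma           # gradient step on group g
    norm = np.linalg.norm(u)
    if norm == 0.0:
        return u
    # group soft-threshold: zero the block or shrink it toward zero
    return max(0.0, 1.0 - lam / (gamma * norm)) * u
```

Cycling this update over groups (refreshing the residual after each block) descends the group lasso objective for any design matrix.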

Adaptive sparse group LASSO in quantile regression

This paper introduces the sparse group LASSO (SGL) into the quantile regression framework. Additionally, a more flexible version, an adaptive SGL, is proposed based on the adaptive-weighting idea.

Regression Shrinkage and Selection via the Lasso

A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
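For an orthonormal design, the lasso's constrained least-squares problem has a closed-form solution: elementwise soft-thresholding of the ordinary least-squares coefficients. A brief illustration (the helper name is mine):

```python
import numpy as np

def soft_threshold(z, lam):
    # closed-form lasso solution for an orthonormal design:
    # shrink each coefficient toward zero by lam, zeroing the small ones
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)
```

This same operator is the inner step of coordinate-descent lasso solvers for general designs, which is why small coefficients are set exactly to zero rather than merely shrunk.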

On the “degrees of freedom” of the lasso

The number of nonzero coefficients is an unbiased estimate of the degrees of freedom of the lasso, a conclusion that requires no special assumptions on the predictors; the unbiased estimator is also shown to be asymptotically consistent.

The Dantzig selector: Statistical estimation when P is much larger than n

Is it possible to estimate β reliably based on the noisy data y?

Strong rules for discarding predictors in lasso‐type problems

This work proposes strong rules for discarding predictors in lasso regression and related problems, that are very simple and yet screen out far more predictors than the SAFE rules, and derives conditions under which they are foolproof.
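The sequential strong rule screens predictor j at a new penalty value λ_new by comparing |x_j'r| (with r the residual at the previous λ) against 2·λ_new − λ_prev, and any discarded predictors are re-checked against the KKT conditions afterwards because the rule can occasionally be violated. A minimal sketch of the screen itself (function name is illustrative):

```python
import numpy as np

def strong_rule_keep(X, resid, lam_new, lam_prev):
    """Sequential strong rule screen: keep predictor j only if
    |x_j' r| >= 2*lam_new - lam_prev, where r is the residual at
    the previous lambda on the path. Predictors screened out here
    must still be verified against the KKT conditions, since the
    rule is a heuristic that can (rarely) discard an active variable.
    """
    scores = np.abs(X.T @ resid)
    return scores >= 2.0 * lam_new - lam_prev
```

In a path algorithm, the solver is run only on the kept set, which is typically far smaller than the full predictor set.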

Model selection and estimation in regression with grouped variables

Summary. We consider the problem of selecting grouped variables (factors) for accurate prediction in regression. Such a problem arises naturally in many practical situations, with the multifactor analysis-of-variance problem as the most important and well-known example.

The biglasso Package: A Memory- and Computation-Efficient Solver for Lasso Model Fitting with Big Data in R

An R package called biglasso tackles the challenge of fitting lasso-type models to ultrahigh-dimensional, multi-gigabyte data sets that cannot be accommodated by existing R packages; it is equipped with newly proposed, more efficient feature-screening rules that substantially accelerate the computation.