Corpus ID: 240353859

Asgl: A Python Package for Penalized Linear and Quantile Regression

@inproceedings{Civieta2021AsglAP,
  title={Asgl: A Python Package for Penalized Linear and Quantile Regression},
  author={Álvaro Méndez Civieta and M. Carmen Aguilera-Morillo and Rosa E. Lillo},
  year={2021}
}
asgl is a Python package that solves penalized linear regression and quantile regression models for simultaneous variable selection and prediction, in both high- and low-dimensional frameworks. It makes it very easy to set up and solve different types of lasso-based penalizations, among which the asgl (adaptive sparse group lasso, which gives the package its name) stands out. This package is built on top of cvxpy, a Python-embedded modeling language for convex optimization problems, and makes…
1 Citation


Explainable Global Error Weighted on Feature Importance: The xGEWFI metric to evaluate the error of data imputation and data augmentation

TLDR
This paper proposes a novel metric named "Explainable Global Error Weighted on Feature Importance" (xGEWFI), which is tested in a whole preprocessing method that detects the outliers and replaces them with a null value, aiming for an ethical AI.

References

SHOWING 1-10 OF 25 REFERENCES

A Sparse-Group Lasso

TLDR
A regularized model for linear regression with ℓ1 and ℓ2 penalties is introduced, and it is shown that it has the desired effect of group-wise and within-group sparsity.

Adaptive sparse group LASSO in quantile regression

This paper studies the introduction of the sparse group LASSO (SGL) to the quantile regression framework. Additionally, a more flexible version, an adaptive SGL, is proposed based on the adaptive idea…

Regression Shrinkage and Selection via the Lasso

TLDR
A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
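A sketch of the lasso as described above, using scikit-learn's coordinate-descent `Lasso` rather than the paper's original implementation (synthetic data; only the first two features are truly active):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=100)

# alpha scales the l1 penalty; larger alpha shrinks more coefficients to exactly 0
model = Lasso(alpha=0.1).fit(X, y)
print(model.coef_.round(2))
```

The constrained form in the abstract (residual sum of squares subject to an ℓ1 budget) and the penalized form solved here are equivalent via Lagrangian duality.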

Model selection and estimation in regression with grouped variables

Summary. We consider the problem of selecting grouped variables (factors) for accurate prediction in regression. Such a problem arises naturally in many practical situations with the multifactor…

CVXPY: A Python-Embedded Modeling Language for Convex Optimization

TLDR
CVXPY allows the user to express convex optimization problems in a natural syntax that follows the math, rather than in the restrictive standard form required by solvers.

Sparse Principal Component Analysis

TLDR
This work introduces a new method called sparse principal component analysis (SPCA) using the lasso (elastic net) to produce modified principal components with sparse loadings and shows that PCA can be formulated as a regression-type optimization problem.
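Sparse loadings in the spirit of SPCA can be sketched with scikit-learn's `SparsePCA`, which likewise uses an ℓ1 penalty to zero out loadings (this is an illustration of the idea, not the paper's exact elastic-net formulation; the data is synthetic with one shared factor on the first three features):

```python
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 6))
X[:, :3] += rng.normal(size=(80, 1)) * 3   # shared factor on first 3 features only

spca = SparsePCA(n_components=1, alpha=1.0, random_state=0).fit(X)
print(spca.components_.round(2))           # loadings; many are exactly zero
```

Unlike ordinary PCA, whose loadings are generically all nonzero, the penalty here leaves the component supported only on the features that actually carry the factor.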

A note on the group lasso and a sparse group lasso

TLDR
An efficient algorithm based on coordinate descent is derived for the resulting convex problem; it can be used to solve the general form of the group lasso, with non-orthonormal model matrices.

The Adaptive Lasso and Its Oracle Properties

TLDR
A new version of the lasso is proposed, called the adaptive lasso, where adaptive weights are used for penalizing different coefficients in the ℓ1 penalty, and the nonnegative garotte is shown to be consistent for variable selection.
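The adaptive lasso can be sketched with the standard two-stage reweighting trick: rescale each feature by a pilot-estimate weight, fit a plain lasso, and map the coefficients back (a sketch using scikit-learn, not the paper's code; pilot estimator, weight exponent, and data are illustrative):

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 2 * X[:, 0] + rng.normal(scale=0.1, size=200)

pilot = LinearRegression().fit(X, y).coef_   # consistent pilot estimate
w = 1.0 / np.abs(pilot)                      # adaptive weights (gamma = 1)
lasso = Lasso(alpha=0.05).fit(X / w, y)      # plain lasso on reweighted features
beta = lasso.coef_ / w                       # map back to the original scale
print(beta.round(2))
```

Large weights on noise features (whose pilot estimates are near zero) make their penalty heavy, while the truly active feature is penalized lightly, which is the mechanism behind the oracle property described above.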

Sparse Group Lasso for Regression on Land Climate Variables

TLDR
It is demonstrated that the sparse model provides better predictive performance than the state-of-the-art, is climatologically interpretable and robust in variable selection.

A two-stage sparse logistic regression for optimal gene selection in high-dimensional microarray data classification

TLDR
Experimental results based on four publicly available gene expression datasets have shown that the proposed sparse logistic regression method significantly outperforms three state-of-the-art methods in terms of classification accuracy, G-mean, area under the curve, and stability.