# Sign Consistency of the Generalized Elastic Net Estimator

@inproceedings{Zhu2021SignCO,
  title={Sign Consistency of the Generalized Elastic Net Estimator},
  author={Wencan Zhu and Eric Houngla Adjakossa and C{\'e}line L{\'e}vy-Leduc and Nils Tern{\`e}s},
  year={2021}
}

In this paper, we propose a novel variable selection approach in the framework of high-dimensional linear models where the columns of the design matrix are highly correlated. It consists in rewriting the initial high-dimensional linear model to remove the correlation between the columns of the design matrix, and in applying a generalized Elastic Net criterion, which can be seen as an extension of the generalized Lasso. The properties of our approach, called gEN (generalized Elastic Net), are…
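The abstract does not display the criterion itself. As a sketch only, reading gEN as the generalized Lasso with an added ridge term, the estimator would solve something of the form

$$\hat{\beta} \in \operatorname*{argmin}_{\beta \in \mathbb{R}^p} \; \|y - X\beta\|_2^2 + \lambda_1 \|G\beta\|_1 + \lambda_2 \|\beta\|_2^2,$$

where $G$ is a penalty matrix encoding structure among the coefficients: $G = I_p$ would recover the standard Elastic Net, and $\lambda_2 = 0$ the generalized Lasso of Tibshirani and Taylor. The exact criterion used in the paper may differ.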

## References

Showing 1–10 of 33 references.

On Model Selection Consistency of the Elastic Net When p >> n

- Mathematics
- 2008

We study the model selection property of the Elastic Net. In the classical settings when p (the number of predictors) and q (the number of predictors with non-zero coefficients in the true linear…

Preconditioning the Lasso for sign consistency

- Computer Science
- 2015

Results are provided showing that FX can satisfy the irrepresentable condition even when X fails to satisfy it, and a class of preconditioners is proposed to balance these costs and benefits.
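The preconditioning idea can be illustrated with the Puffer-style transform of Jia and Rohe (one concrete choice of F; this paper may use a different one): with the thin SVD X = UDVᵀ, setting F = UD⁻¹Uᵀ turns FX into UVᵀ, whose columns are exactly orthonormal, so the irrepresentable condition holds trivially.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5

# Highly correlated design: a common latent factor plus small noise.
z = rng.normal(size=(n, 1))
X = z + 0.1 * rng.normal(size=(n, p))

# Puffer-style preconditioner F = U D^{-1} U^T from the thin SVD X = U D V^T.
U, d, Vt = np.linalg.svd(X, full_matrices=False)
F = U @ np.diag(1.0 / d) @ U.T

FX = F @ X                 # algebraically equals U V^T
gram = FX.T @ FX           # orthonormal columns: gram is the identity
print(np.allclose(gram, np.eye(p)))  # True
```

After preconditioning, the lasso is run on (Fy, FX) instead of (y, X); the cost is a possible inflation of the noise, which is the trade-off the proposed class of preconditioners balances.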

Regularization and variable selection via the elastic net

- Computer Science
- 2005

It is shown that the elastic net often outperforms the lasso while enjoying a similar sparsity of representation, and an algorithm called LARS-EN is proposed for computing elastic net regularization paths efficiently, much as the LARS algorithm does for the lasso.
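For an orthonormal design, Zou and Hastie's (naive) elastic net has a closed form that makes the roles of the two penalties visible — a minimal sketch, with b = Xᵀy the OLS estimate:

```python
import numpy as np

def naive_elastic_net_orthonormal(b, lam1, lam2):
    """Closed-form naive elastic net for an orthonormal design,
    minimizing ||y - X beta||^2 + lam1*||beta||_1 + lam2*||beta||_2^2.
    lam1 soft-thresholds (sparsity); lam2 shrinks (stability under correlation)."""
    return np.sign(b) * np.maximum(np.abs(b) - lam1 / 2.0, 0.0) / (1.0 + lam2)

b = np.array([3.0, -1.5, 0.2])
print(naive_elastic_net_orthonormal(b, lam1=1.0, lam2=0.5))
```

Setting lam2=0 recovers the lasso's soft-thresholding rule, which is why the elastic net retains a similar sparsity of representation.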

Regression Shrinkage and Selection via the Lasso

- Computer Science
- 1996

A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.

Statistical Learning with Sparsity: The Lasso and Generalizations

- Computer Science
- 2015

Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data and extract useful and reproducible patterns from big datasets.

High dimensional ordinary least squares projection for screening variables

- Computer Science
- 2015

It is shown that HOLP has the sure screening property and gives consistent variable selection without the strong correlation assumption, and it has a low computational complexity.
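The HOLP estimator of Wang and Leng has a simple closed form — the minimum ℓ2-norm interpolant of the data when p > n — which makes the low computational complexity concrete. A sketch with simulated data (dimensions and signal sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, s = 40, 200, 3          # p >> n; the first s predictors are truly active
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:s] = 10.0
y = X @ beta + 0.1 * rng.normal(size=n)

# HOLP: beta_hat = X^T (X X^T)^{-1} y, the minimum-l2-norm solution of
# X beta = y. It needs only one n x n solve, cheap since n << p.
beta_hat = X.T @ np.linalg.solve(X @ X.T, y)
print(np.allclose(X @ beta_hat, y))  # True: beta_hat interpolates the data

# Screening: keep the d predictors with the largest |beta_hat|,
# then run a refined selection method on the reduced model.
d = 20
kept = np.argsort(-np.abs(beta_hat))[:d]
```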

On the Nonnegative Garrote Estimator

- Mathematics
- 2005

We study the nonnegative garrote estimator from three different aspects: computation, consistency and flexibility. We show that the nonnegative garrote estimate has a piecewise linear solution path.…
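For an orthonormal design, Breiman's nonnegative garrote has an explicit form — each OLS coefficient is rescaled by a nonnegative shrinkage factor — which a minimal sketch can show (the paper studies the general case):

```python
import numpy as np

def garrote_orthonormal(b_ols, lam):
    """Nonnegative garrote for an orthonormal design: scale each OLS
    coefficient b_j by c_j = (1 - lam / b_j^2)_+, so small coefficients
    are zeroed out while large ones are only slightly shrunk."""
    c = np.maximum(1.0 - lam / b_ols**2, 0.0)
    return c * b_ols

b = np.array([4.0, 1.0, -0.5])
print(garrote_orthonormal(b, lam=1.0))
```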

Variable Selection for Highly Correlated Predictors

- Computer Science
- 2017

A new Semi-standard PArtial Covariance (SPAC) which is able to reduce correlation effects from other predictors while incorporating the magnitude of coefficients and enjoys strong sign consistency in both finite-dimensional and high-dimensional settings under regularity conditions.

Adaptive estimation of a quadratic functional by model selection

- Mathematics
- 2000

We consider the problem of estimating ‖s‖² when s belongs to some separable Hilbert space H and one observes the Gaussian process Y(t) = ⟨s, t⟩ + σL(t) for all t ∈ H, where L is some Gaussian…

Supplement: Proofs and Technical Details for "The Solution Path of the Generalized Lasso"

- Mathematics
- 2013

In this document we give supplementary details to the paper "The Solution Path of the Generalized Lasso". We use the prefix "GL" when referring to equations, sections, etc. in the original paper,…