Corpus ID: 52953563

ET-Lasso: Efficient Tuning of Lasso for High-Dimensional Data

@article{Yang2018ETLassoET,
  title={ET-Lasso: Efficient Tuning of Lasso for High-Dimensional Data},
  author={Songshan Yang and Jiawei Wen and X. Zhan and D. Kifer},
  journal={ArXiv},
  year={2018},
  volume={abs/1810.04513}
}
The L1 regularization (Lasso) has proven to be a versatile tool for selecting relevant features and estimating model coefficients simultaneously. Despite its popularity, it is very challenging to guarantee the feature selection consistency of Lasso. One way to improve feature selection consistency is to select an ideal tuning parameter. Traditional tuning criteria mainly focus on minimizing the estimated prediction error or maximizing the posterior model probability, such as cross-validation…
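The traditional approach the abstract contrasts against (choosing the Lasso tuning parameter by minimizing cross-validated prediction error) can be sketched with scikit-learn's `LassoCV`. The synthetic data-generating setup below is purely illustrative and not taken from the paper; the sample size, dimension, and signal strength are assumptions for demonstration.

```python
import numpy as np
from sklearn.linear_model import LassoCV

# Illustrative high-dimensional setup: n = 100 samples, p = 200 features,
# with only the first 5 features truly relevant (assumed, not from the paper).
rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 2.0
y = X @ beta + 0.5 * rng.standard_normal(n)

# Cross-validation tuning: LassoCV fits a path of regularization strengths
# and picks the one minimizing the 5-fold cross-validated prediction error.
model = LassoCV(cv=5, random_state=0).fit(X, y)
selected = np.flatnonzero(model.coef_)
print("CV-chosen alpha:", model.alpha_)
print("Number of selected features:", selected.size)
```

As the abstract notes, tuning by prediction error tends not to guarantee selection consistency: in runs like this, the CV-chosen model typically recovers the true features but often includes spurious ones as well.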
