The solution path of the generalized lasso
This work derives an unbiased estimate of the degrees of freedom of the generalized lasso fit for an arbitrary penalty matrix D, which turns out to be quite intuitive in many applications.
Surprises in High-Dimensional Ridgeless Least Squares Interpolation
This paper recovers, in a precise quantitative way, several phenomena that have been observed in large-scale neural networks and kernel machines, including the "double descent" behavior of the prediction risk and the potential benefits of overparametrization.
The Lasso Problem and Uniqueness
- R. Tibshirani
- Computer Science, Mathematics
- 1 June 2012
The LARS algorithm is extended to cover the non-unique case, so that the path algorithm works for any predictor matrix, and a simple method, based on linear programming, is derived for computing the component-wise uncertainty in lasso solutions of any given problem instance.
Strong rules for discarding predictors in lasso‐type problems
- R. Tibshirani, J. Bien, R. Tibshirani
- Computer Science; Journal of the Royal Statistical Society, Series…
- 9 November 2010
This work proposes strong rules for discarding predictors in lasso regression and related problems, that are very simple and yet screen out far more predictors than the SAFE rules, and derives conditions under which they are foolproof.
Distribution-Free Predictive Inference for Regression
- Jing Lei, Max G'sell, A. Rinaldo, R. Tibshirani, L. Wasserman
- Computer Science, Mathematics; Journal of the American Statistical Association
- 14 April 2016
A general framework for distribution-free predictive inference in regression is developed using conformal inference, which allows the construction of a prediction band for the response variable using any estimator of the regression function, together with a model-free notion of variable importance, called leave-one-covariate-out (LOCO) inference.
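The split-sample version of conformal prediction described in that summary is simple enough to sketch. The function name `split_conformal` and the simulated data below are my own illustrative choices, not code from the paper; the key property is that any fitted regression estimator can play the role of `predict`.

```python
import numpy as np

def split_conformal(predict, X_cal, y_cal, X_new, alpha=0.1):
    """Prediction band with finite-sample marginal coverage >= 1 - alpha."""
    scores = np.abs(y_cal - predict(X_cal))            # calibration residuals
    k = int(np.ceil((len(scores) + 1) * (1 - alpha)))  # conformal quantile rank
    q = np.sort(scores)[min(k, len(scores)) - 1]
    pred = predict(X_new)
    return pred - q, pred + q                          # lower, upper band limits

# Toy usage: fit least squares on one split, calibrate on another.
rng = np.random.default_rng(0)
beta_true = rng.standard_normal(5)
X = rng.standard_normal((900, 5))
y = X @ beta_true + 0.5 * rng.standard_normal(900)
beta_hat = np.linalg.lstsq(X[:300], y[:300], rcond=None)[0]
predict = lambda Z: Z @ beta_hat
lo, hi = split_conformal(predict, X[300:600], y[300:600], X[600:])
coverage = np.mean((y[600:] >= lo) & (y[600:] <= hi))  # roughly 1 - alpha
```

The coverage guarantee holds for any `predict`, however badly it fits; a poor estimator simply yields wider bands.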
Adaptive piecewise polynomial estimation via trend filtering
- R. Tibshirani
- Mathematics, Computer Science
- 10 April 2013
Empirically, it is discovered that trend filtering estimates adapt to the local level of smoothness much better than smoothing splines, and further, they exhibit a remarkable similarity to locally adaptive regression splines.
A Significance Test for the Lasso
- R. Lockhart, Jonathan E. Taylor, R. Tibshirani, R. Tibshirani
- Computer Science; Annals of Statistics
- 30 January 2013
A simple test statistic based on lasso fitted values, called the covariance test statistic, is proposed; when the true model is linear, this statistic has an Exp(1) asymptotic distribution under the null hypothesis that all truly active variables are contained in the current lasso model.
Predictive inference with the jackknife+
This paper introduces the jackknife+, which is a novel method for constructing predictive confidence intervals. Whereas the jackknife outputs an interval centered at the predicted response of a test…
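The jackknife+ construction can be sketched concretely: rather than centering one interval at a single fitted value, it takes quantiles of the leave-one-out predictions shifted by the leave-one-out residuals. This is a hedged sketch under my own choices (the function name `jackknife_plus` and an ordinary least-squares base learner), not the paper's code.

```python
import numpy as np

def jackknife_plus(X, y, x_new, alpha=0.1):
    """Jackknife+ prediction interval at x_new with an OLS base learner."""
    n = len(y)
    lo_vals, hi_vals = [], []
    for i in range(n):
        mask = np.arange(n) != i
        beta = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]  # leave-one-out fit
        resid = abs(y[i] - X[i] @ beta)   # leave-one-out residual
        pred = x_new @ beta               # leave-one-out prediction at x_new
        lo_vals.append(pred - resid)
        hi_vals.append(pred + resid)
    # Quantiles of the shifted leave-one-out predictions -- the key
    # difference from the ordinary jackknife, which centers everything
    # at one full-data prediction.
    k_lo = max(int(np.floor(alpha * (n + 1))), 1)
    k_hi = min(int(np.ceil((1 - alpha) * (n + 1))), n)
    return np.sort(lo_vals)[k_lo - 1], np.sort(hi_vals)[k_hi - 1]

rng = np.random.default_rng(1)
X = rng.standard_normal((60, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.3 * rng.standard_normal(60)
lo, hi = jackknife_plus(X, y, rng.standard_normal(3))
```

Unlike the ordinary jackknife interval, this construction carries a finite-sample coverage guarantee (at level 1 - 2*alpha) without assumptions on the base learner.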
Exact Post-Selection Inference for Sequential Regression Procedures
ABSTRACT We propose new inference tools for forward stepwise regression, least angle regression, and the lasso. Assuming a Gaussian model for the observation vector y, we first describe a general…
Degrees of freedom in lasso problems
The degrees of freedom of the lasso fit is derived, placing no assumptions on the predictor matrix $X$ (and allowing an arbitrary penalty matrix $D$), and some intermediate results on the lasso and generalized lasso are established that may be interesting on their own.