A Cross Validation Framework for Signal Denoising with Applications to Trend Filtering, Dyadic CART and Beyond

@inproceedings{Chaudhuri2022ACV,
  title={A Cross Validation Framework for Signal Denoising with Applications to Trend Filtering, Dyadic CART and Beyond},
  author={Anamitra Chaudhuri and Sabyasachi Chatterjee},
  year={2022}
}
This paper formulates a general cross validation framework for signal denoising. The framework is then applied to nonparametric regression methods such as Trend Filtering and Dyadic CART, and the resulting cross validated versions are shown to attain nearly the same rates of convergence as their optimally tuned analogues. No previous theoretical analyses of cross validated versions of Trend Filtering or Dyadic CART were available. To illustrate the generality of the…
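The framework itself is not reproduced on this page, but the flavor of cross validating a signal denoiser can be sketched in a few lines of code. The sketch below is hypothetical and not the authors' procedure: it implements a 1D Dyadic CART fit by dynamic programming over recursive dyadic partitions, and picks the penalty by a toy two-fold scheme that fits on odd-indexed samples, predicts the even-indexed ones from a neighboring fitted value, and vice versa. The function names, the odd/even split, and the nearest-index prediction rule are all illustrative assumptions.

    import numpy as np

    def dyadic_cart_fit(y, lam):
        """Penalized least squares over recursive dyadic partitions
        (a 1D Dyadic CART sketch).  Assumes len(y) is a power of two;
        lam is the penalty paid per constant piece."""
        fit = np.empty(len(y))

        def solve(lo, hi):
            # Optimal penalized cost on y[lo:hi]; writes `fit` as a side effect.
            seg = y[lo:hi]
            keep_cost = float(np.sum((seg - seg.mean()) ** 2)) + lam
            if hi - lo == 1:
                fit[lo:hi] = seg
                return keep_cost
            mid = (lo + hi) // 2
            split_cost = solve(lo, mid) + solve(mid, hi)
            if keep_cost <= split_cost:
                fit[lo:hi] = seg.mean()   # do not split: one constant piece
                return keep_cost
            return split_cost             # keep the children's fits

        solve(0, len(y))
        return fit

    def cv_choose_lambda(y, lambdas):
        """Toy two-fold cross validation: fit on one parity class of indices
        and predict each held-out index with the fitted value at the next
        retained index to its right (clipped at the boundary)."""
        idx = np.arange(len(y))
        scores = []
        for lam in lambdas:
            err = 0.0
            for fold in (0, 1):
                train, test = idx[idx % 2 == fold], idx[idx % 2 != fold]
                fitted = dyadic_cart_fit(y[train], lam)
                pos = np.searchsorted(train, test).clip(max=len(train) - 1)
                err += float(np.sum((y[test] - fitted[pos]) ** 2))
            scores.append(err)
        return lambdas[int(np.argmin(scores))]

    # Example: a noisy piecewise constant signal of length 256.
    rng = np.random.default_rng(0)
    theta = np.repeat([0.0, 2.0, -1.0, 1.0], 64)
    y = theta + rng.normal(scale=0.5, size=256)
    best_lam = cv_choose_lambda(y, lambdas=[0.1, 0.5, 1.0, 2.0, 4.0])
    denoised = dyadic_cart_fit(y, best_lam)

Splitting by parity keeps the retained samples equally spaced, so every held-out point has a nearby fitted value to score against; this is only meant to convey how interpolation-style prediction lets one cross validate estimators defined on a fixed grid.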

References

Showing 1–10 of 59 references.

Adaptive risk bounds in univariate total variation denoising and trend filtering

We study trend filtering, a relatively recent method for univariate nonparametric regression. For a given positive integer $r$, the $r$-th order trend filtering estimator is defined as the minimizer…
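For reference, a standard penalized form of this definition can be sketched in LaTeX (conventions for the order of the difference operator vary across papers, so this is an illustration rather than a quotation):

    \hat{\theta} \in \operatorname*{argmin}_{\theta \in \mathbb{R}^n} \; \frac{1}{2}\sum_{i=1}^{n}(y_i - \theta_i)^2 \;+\; \lambda \,\big\| D^{(r)} \theta \big\|_1,

where $D^{(r)}$ denotes the $r$-th order discrete difference operator and $\lambda \ge 0$ is the tuning parameter; $r = 1$ recovers total variation denoising.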

Cross-validated local linear nonparametric regression

Local linear kernel methods have been shown to dominate local constant methods for the nonparametric estimation of regression functions. In this paper we study the theoretical properties of…

Risk Bounds for Quantile Trend Filtering

We study quantile trend filtering, a recently proposed method for nonparametric quantile regression, with the goal of generalizing existing risk bounds known for the usual trend filtering…

A Path Algorithm for the Fused Lasso Signal Approximator

A path algorithm is given for solving the Fused Lasso Signal Approximator that computes the solutions for all values of λ1 and λ2, together with an approximate algorithm that offers considerable speed advantages for a moderate trade-off in accuracy.
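For context on what the two tuning parameters control, a standard way to write the fused lasso signal approximator objective (sketched here, not quoted from the paper) is

    \hat{\beta} \in \operatorname*{argmin}_{\beta \in \mathbb{R}^n} \; \frac{1}{2}\sum_{i=1}^{n}(y_i - \beta_i)^2 \;+\; \lambda_1 \sum_{i=1}^{n} |\beta_i| \;+\; \lambda_2 \sum_{i=2}^{n} |\beta_i - \beta_{i-1}|,

so λ1 controls the sparsity of the coefficients and λ2 the sparsity of their successive differences.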

Leave-one-out cross-validation is risk consistent for lasso

This work gives the first definitive answer about the risk consistency of lasso when the smoothing parameter is chosen via cross-validation and shows that under some restrictions on the design matrix, the lasso estimator is still risk consistent with an empirically chosen tuning parameter.

Quantile Regression by Dyadic CART

The scope of these globally optimal regression tree based methodologies is extended to heavy tailed data, and the proposed QDCART estimator is shown to enjoy adaptively rate optimal estimation guarantees for piecewise constant and bounded variation function classes.

Adaptive piecewise polynomial estimation via trend filtering

Empirically, it is discovered that trend filtering estimates adapt to the local level of smoothness much better than smoothing splines, and further, they exhibit a remarkable similarity to locally adaptive regression splines.

Prediction error of cross-validated Lasso

A general upper bound on the prediction error of the Lasso is given when the tuning parameter is chosen using a variant of 2-fold cross-validation, based on a general principle that may extend to other kinds of cross-validation as well as to other penalized regression methods.
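As a deliberately simplified illustration of tuning the lasso with a 2-fold split (not the specific variant analyzed in that paper), one could write:

    import numpy as np
    from sklearn.linear_model import Lasso

    def two_fold_lasso_alpha(X, y, alphas, seed=0):
        """Pick a lasso penalty by a plain 2-fold split: fit on one half,
        score squared prediction error on the other half, and average the
        two directions.  A simplified sketch, not the paper's variant."""
        rng = np.random.default_rng(seed)
        perm = rng.permutation(len(y))
        halves = (perm[: len(y) // 2], perm[len(y) // 2:])
        scores = []
        for alpha in alphas:
            err = 0.0
            for a, b in (halves, halves[::-1]):
                model = Lasso(alpha=alpha).fit(X[a], y[a])
                err += np.mean((y[b] - model.predict(X[b])) ** 2)
            scores.append(err / 2)
        return alphas[int(np.argmin(scores))]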

MARS via LASSO

This work proposes and studies a natural LASSO variant of the MARS method, based on least squares estimation over a convex class of functions obtained by considering infinite-dimensional linear combinations of functions in the MARS basis and imposing a variation based complexity constraint.
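Schematically, and only as a loose sketch rather than the paper's precise definitions, the estimator has the form

    \hat{f} \in \operatorname*{argmin}\Big\{ \textstyle\sum_{i=1}^{n}\big(y_i - f(x_i)\big)^2 \;:\; f \text{ a (possibly infinite) linear combination of MARS-type products of hinge functions } (\pm(x_j - t))_+ \text{ with } V(f) \le V \Big\},

where $V(f)$ is a variation-based (ℓ1-type) complexity measure on the representation of $f$.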

Trend Filtering

This paper proposes a variation on Hodrick–Prescott (H-P) filtering, a widely used method for trend estimation; the proposed variant substitutes a sum of absolute values for the sum of squares used in H-P filtering to penalize variations in the estimated trend.
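The substitution can be written out explicitly; in a standard formulation (sketched here), H-P filtering penalizes squared second differences while the proposed variant penalizes their absolute values:

    \text{H-P:}\quad \min_{x \in \mathbb{R}^n} \; \frac{1}{2}\sum_{t=1}^{n}(y_t - x_t)^2 + \lambda \sum_{t=2}^{n-1} (x_{t-1} - 2x_t + x_{t+1})^2,

    \ell_1\ \text{variant:}\quad \min_{x \in \mathbb{R}^n} \; \frac{1}{2}\sum_{t=1}^{n}(y_t - x_t)^2 + \lambda \sum_{t=2}^{n-1} |x_{t-1} - 2x_t + x_{t+1}|.

The ℓ1 penalty yields estimated trends that are piecewise linear, with kinks at a sparse set of points.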