# Adaptive risk bounds in univariate total variation denoising and trend filtering

@article{Guntuboyina2017AdaptiveRB,
  title   = {Adaptive risk bounds in univariate total variation denoising and trend filtering},
  author  = {Adityanand Guntuboyina and Donovan Lieu and Sabyasachi Chatterjee and Bodhisattva Sen},
  journal = {arXiv: Statistics Theory},
  year    = {2017}
}

We study trend filtering, a relatively recent method for univariate nonparametric regression. For a given positive integer $r$, the $r$-th order trend filtering estimator is defined as the minimizer of the sum of squared errors when we constrain (or penalize) the sum of the absolute $r$-th order discrete derivatives of the fitted function at the design points. For $r=1$, the estimator reduces to total variation regularization which has received much attention in the statistics and image…
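The estimator described in the abstract can be written down concretely. The sketch below (not from the paper; an illustration using NumPy) builds the $r$-th order discrete difference operator and evaluates the penalized trend filtering criterion $\frac{1}{2}\|y-\theta\|^2 + \lambda \|D^{(r)}\theta\|_1$; actually *minimizing* this criterion requires a convex solver, which is omitted here.

```python
import numpy as np

def diff_matrix(n, r):
    """r-th order discrete difference operator D^(r), an (n - r) x n matrix,
    so that D @ theta gives the r-th order differences of theta."""
    D = np.eye(n)
    for _ in range(r):
        D = D[1:] - D[:-1]  # first differences, applied r times
    return D

def trend_filter_objective(theta, y, r, lam):
    """Penalized trend filtering criterion:
    0.5 * ||y - theta||^2  +  lam * ||D^(r) theta||_1."""
    D = diff_matrix(len(y), r)
    return 0.5 * np.sum((y - theta) ** 2) + lam * np.sum(np.abs(D @ theta))

# For r = 1 the penalty is the total variation sum_i |theta_{i+1} - theta_i|,
# so a piecewise constant fit with one jump pays lam * (jump size):
y = np.array([0.0, 0.1, 0.0, 1.0, 1.1, 1.0])
theta = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])  # piecewise constant candidate
val = trend_filter_objective(theta, y, r=1, lam=2.0)
```

For $r = 1$ the fitted values are piecewise constant; for general $r$ they are discrete splines of degree $r - 1$, which is what makes the estimator adapt to piecewise polynomial truths.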

#### 43 Citations

Prediction bounds for (higher order) total variation regularized least squares

- Mathematics
- 2019

We establish adaptive results for trend filtering: least squares estimation with a penalty on the total variation of $(k-1)^{\rm th}$ order differences. Our approach is based on combining a general…

Frame-constrained total variation regularization for white noise regression

- Mathematics
- 2018

Despite the popularity and practical success of total variation (TV) regularization for function estimation, surprisingly little is known about its theoretical performance in a statistical setting.…

Oracle inequalities for image denoising with total variation regularization

- Mathematics
- 2019

We derive oracle results for discrete image denoising with a total variation penalty. We consider the least squares estimator with a penalty on the $\ell^1$-norm of the total discrete derivative of…
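The penalty referenced in this abstract has a simple discrete form. As a minimal sketch (assuming the anisotropic variant, i.e. absolute horizontal plus vertical neighbor differences; the paper's exact definition may differ), the 2D total variation of an image can be computed as:

```python
import numpy as np

def anisotropic_tv(img):
    """l1-norm of the total discrete derivative of a 2D array:
    the sum of absolute differences between horizontally and
    vertically adjacent pixels."""
    dh = np.abs(np.diff(img, axis=1)).sum()  # horizontal neighbor differences
    dv = np.abs(np.diff(img, axis=0)).sum()  # vertical neighbor differences
    return dh + dv

def tv_denoising_objective(f, y, lam):
    """Penalized least squares criterion for image denoising:
    0.5 * ||y - f||_F^2  +  lam * TV(f)."""
    return 0.5 * np.sum((y - f) ** 2) + lam * anisotropic_tv(f)
```

A constant image has zero penalty, and images that are piecewise constant with few level sets pay little, which is the regime where the cited oracle results give fast rates.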

Adaptive Quantile Trend Filtering

- Mathematics
- 2020

We study quantile trend filtering, a recently proposed method for one-dimensional nonparametric quantile regression. We show that the penalized version of quantile trend filtering attains minimax…

Total Variation Regularized Least Squares

- 2021

We establish adaptive results for trend filtering: least squares estimation with a penalty on the total variation of (k − 1)th order differences. Our approach is based on combining a general oracle…

Logistic regression with total variation regularization

- Mathematics
- 2020

We study logistic regression with total variation penalty on the canonical parameter and show that the resulting estimator satisfies a sharp oracle inequality: the excess risk of the estimator is…

Adaptive Online Estimation of Piecewise Polynomial Trends

- Computer Science, Mathematics
- NeurIPS
- 2020

A polynomial time algorithm is designed that achieves the nearly optimal dynamic regret of $\tilde{O}(n^{\frac{1}{2k+3}}C_n^{\frac{2}{2k+3}})$, and the same policy is minimax optimal for several other non-parametric families of interest.

Multivariate extensions of isotonic regression and total variation denoising via entire monotonicity and Hardy–Krause variation

- Mathematics
- 2019

We consider the problem of nonparametric regression when the covariate is $d$-dimensional, where $d \geq 1$. In this paper we introduce and study two nonparametric least squares estimators (LSEs) in…

New Risk Bounds for 2D Total Variation Denoising

- Computer Science, Mathematics
- IEEE Transactions on Information Theory
- 2021

This paper rigorously shows that, when the truth is piecewise constant with few pieces, the ideally tuned TVD estimator performs better than in the worst case.

Approximate $\ell_{0}$-penalized estimation of piecewise-constant signals on graphs

- Mathematics
- The Annals of Statistics
- 2018

We study recovery of piecewise-constant signals on graphs by the estimator minimizing an $l_0$-edge-penalized objective. Although exact minimization of this objective may be computationally…

#### References

Showing 1–10 of 60 references

Adaptive piecewise polynomial estimation via trend filtering

- Mathematics
- 2014

We study trend filtering, a recently proposed tool of Kim et al. [SIAM Rev. 51 (2009) 339-360] for nonparametric regression. The trend filtering estimate is defined as the minimizer of a penalized…

On Spatial Adaptive Estimation of Nonparametric Regression

- Mathematics
- 2004

The paper is devoted to developing spatial adaptive estimates for restoring functions from noisy observations. We show that the traditional least squares (piecewise polynomial) estimate equipped with…

Sharp oracle inequalities for Least Squares estimators in shape restricted regression

- Mathematics
- 2015

The performance of Least Squares (LS) estimators is studied in isotonic, unimodal and convex regression. Our results have the form of sharp oracle inequalities that account for the model…

Approximate $\ell_{0}$-penalized estimation of piecewise-constant signals on graphs

- Mathematics
- The Annals of Statistics
- 2018

We study recovery of piecewise-constant signals on graphs by the estimator minimizing an $l_0$-edge-penalized objective. Although exact minimization of this objective may be computationally…

Minimax estimation via wavelet shrinkage

- Mathematics
- 1998

We attempt to recover an unknown function from noisy, sampled data. Using orthonormal bases of compactly supported wavelets, we develop a nonlinear method which works in the wavelet domain by simple…
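The "simple" wavelet-domain operation this abstract alludes to is coefficient thresholding. As an illustrative sketch (not the paper's construction; a single level of the Haar transform with soft thresholding of the detail coefficients):

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal Haar transform for an even-length signal."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation (coarse) coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail (fine) coefficients
    return a, d

def inv_haar_step(a, d):
    """Inverse of haar_step: exact reconstruction from (a, d)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(z, t):
    """Soft thresholding: shrink each coefficient toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def shrinkage_denoise(y, t):
    """Transform, soft-threshold the detail coefficients, transform back."""
    a, d = haar_step(y)
    return inv_haar_step(a, soft(d, t))
```

Smooth stretches of the signal produce small detail coefficients that the threshold zeroes out, which is the mechanism behind the near-minimax adaptivity of wavelet shrinkage.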

$l_0$-estimation of piecewise-constant signals on graphs

- Mathematics
- 2017

We study recovery of piecewise-constant signals over arbitrary graphs by the estimator minimizing an $l_0$-edge-penalized objective. Although exact minimization of this objective may be…

Approximate Recovery in Changepoint Problems, from $\ell_2$ Estimation Error Rates

- Mathematics
- 2016

In the 1-dimensional multiple changepoint detection problem, we prove that any procedure with a fast enough $\ell_2$ error rate, in terms of its estimation of the underlying piecewise constant mean…

A new perspective on least squares under convex constraint

- Mathematics
- 2014

Consider the problem of estimating the mean of a Gaussian random vector when the mean vector is assumed to be in a given convex set. The most natural solution is to take the Euclidean projection of…

Locally Adaptive Bandwidth Choice for Kernel Regression Estimators

- Mathematics
- 1993

Kernel estimators with a global bandwidth are commonly used to estimate regression functions. On the other hand, it is obvious that the choice of a local bandwidth can lead to better…

Splines in Higher Order TV Regularization

- Mathematics, Computer Science
- International Journal of Computer Vision
- 2006

It is shown in a strictly discrete setting that splines of degree m−1 also solve a minimization problem with quadratic data term and m-th order total variation (TV) regularization term.