Adaptive risk bounds in univariate total variation denoising and trend filtering

@article{Guntuboyina2017AdaptiveRB,
  title={Adaptive risk bounds in univariate total variation denoising and trend filtering},
  author={Adityanand Guntuboyina and Donovan Lieu and Sabyasachi Chatterjee and Bodhisattva Sen},
  journal={arXiv: Statistics Theory},
  year={2017}
}
We study trend filtering, a relatively recent method for univariate nonparametric regression. For a given positive integer $r$, the $r$-th order trend filtering estimator is defined as the minimizer of the sum of squared errors when we constrain (or penalize) the sum of the absolute $r$-th order discrete derivatives of the fitted function at the design points. For $r=1$, the estimator reduces to total variation regularization, which has received much attention in the statistics and image…
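In the notation of the abstract, the penalized form of the $r$-th order trend filtering estimator minimizes a least squares criterion plus an $\ell_1$ penalty on the $r$-th order discrete differences of the fitted values. A minimal NumPy sketch of that objective is below; the function names are illustrative, and actually solving the convex program requires a dedicated solver (e.g. ADMM or specialized dynamic programming), which is not shown.

```python
import numpy as np

def diff_operator(n, r):
    """Build the r-th order discrete difference operator D^(r)
    as an (n - r) x n matrix, by composing first differences."""
    D = np.eye(n)
    for _ in range(r):
        D = D[1:] - D[:-1]  # one more level of differencing
    return D

def tf_objective(theta, y, lam, r):
    """Penalized trend filtering criterion:
    0.5 * ||y - theta||_2^2 + lam * ||D^(r) theta||_1."""
    D = diff_operator(len(y), r)
    return 0.5 * np.sum((y - theta) ** 2) + lam * np.sum(np.abs(D @ theta))
```

For $r=1$ the penalty is the total variation of the fitted vector, matching the total variation regularization case mentioned in the abstract; for $r=2$ it vanishes exactly on linear sequences, which is why the estimator adapts to piecewise polynomial structure.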


Prediction bounds for (higher order) total variation regularized least squares
We establish adaptive results for trend filtering: least squares estimation with a penalty on the total variation of $(k-1)^{\rm th}$ order differences. Our approach is based on combining a general …
Frame-constrained total variation regularization for white noise regression
Despite the popularity and practical success of total variation (TV) regularization for function estimation, surprisingly little is known about its theoretical performance in a statistical setting. …
Oracle inequalities for image denoising with total variation regularization
We derive oracle results for discrete image denoising with a total variation penalty. We consider the least squares estimator with a penalty on the $\ell^1$-norm of the total discrete derivative of …
Adaptive Quantile Trend Filtering
We study quantile trend filtering, a recently proposed method for one-dimensional nonparametric quantile regression. We show that the penalized version of quantile trend filtering attains minimax …
Logistic regression with total variation regularization
We study logistic regression with total variation penalty on the canonical parameter and show that the resulting estimator satisfies a sharp oracle inequality: the excess risk of the estimator is …
Adaptive Online Estimation of Piecewise Polynomial Trends
A polynomial time algorithm is designed that achieves the nearly optimal dynamic regret of $\tilde{O}(n^{\frac{1}{2k+3}} C_n^{\frac{2}{2k+3}})$, and the same policy is minimax optimal for several other nonparametric families of interest.
Multivariate extensions of isotonic regression and total variation denoising via entire monotonicity and Hardy–Krause variation
We consider the problem of nonparametric regression when the covariate is $d$-dimensional, where $d \geq 1$. In this paper we introduce and study two nonparametric least squares estimators (LSEs) in …
New Risk Bounds for 2D Total Variation Denoising
This paper rigorously shows that, when the truth is piecewise constant with few pieces, the ideally tuned TVD estimator performs better than in the worst case.
Approximate $\ell_{0}$-penalized estimation of piecewise-constant signals on graphs
We study recovery of piecewise-constant signals on graphs by the estimator minimizing an $\ell_0$-edge-penalized objective. Although exact minimization of this objective may be computationally …

References

Showing 1–10 of 60 references
Adaptive piecewise polynomial estimation via trend filtering
We study trend filtering, a recently proposed tool of Kim et al. [SIAM Rev. 51 (2009) 339-360] for nonparametric regression. The trend filtering estimate is defined as the minimizer of a penalized …
On Spatial Adaptive Estimation of Nonparametric Regression
The paper is devoted to developing spatial adaptive estimates for restoring functions from noisy observations. We show that the traditional least square (piecewise polynomial) estimate equipped with …
Sharp oracle inequalities for Least Squares estimators in shape restricted regression
The performance of Least Squares (LS) estimators is studied in isotonic, unimodal and convex regression. Our results have the form of sharp oracle inequalities that account for the model …
Approximate $\ell_{0}$-penalized estimation of piecewise-constant signals on graphs
We study recovery of piecewise-constant signals on graphs by the estimator minimizing an $\ell_0$-edge-penalized objective. Although exact minimization of this objective may be computationally …
Minimax estimation via wavelet shrinkage
We attempt to recover an unknown function from noisy, sampled data. Using orthonormal bases of compactly supported wavelets, we develop a nonlinear method which works in the wavelet domain by simple …
$l_0$-estimation of piecewise-constant signals on graphs
We study recovery of piecewise-constant signals over arbitrary graphs by the estimator minimizing an $\ell_0$-edge-penalized objective. Although exact minimization of this objective may be …
Approximate Recovery in Changepoint Problems, from $\ell_2$ Estimation Error Rates
In the 1-dimensional multiple changepoint detection problem, we prove that any procedure with a fast enough $\ell_2$ error rate, in terms of its estimation of the underlying piecewise constant mean …
A new perspective on least squares under convex constraint
Consider the problem of estimating the mean of a Gaussian random vector when the mean vector is assumed to be in a given convex set. The most natural solution is to take the Euclidean projection of …
Locally Adaptive Bandwidth Choice for Kernel Regression Estimators
Kernel estimators with a global bandwidth are commonly used to estimate regression functions. On the other hand, it is obvious that the choice of a local bandwidth can lead to better …
Splines in Higher Order TV Regularization
It is shown in a strictly discrete setting that splines of degree $m-1$ also solve a minimization problem with a quadratic data term and $m$-th order total variation (TV) regularization term.