• Corpus ID: 88516998

Element-wise estimation error of a total variation regularized estimator for change point detection.

@article{Zhang2019ElementWise,
  title={Element-wise estimation error of a total variation regularized estimator for change point detection},
  author={Teng Zhang},
  journal={arXiv: Statistics Theory},
  year={2019}
}
  • Teng Zhang
  • Published 3 January 2019
  • Mathematics
  • arXiv: Statistics Theory
This work studies the total variation regularized $\ell_2$ estimator (fused lasso) in the setting of a change point detection problem. Compared with existing works, which focus on the sum of squared estimation errors, we give a bound on the element-wise estimation error. Our bound is nearly optimal in the sense that the implied sum of squared errors matches the best existing result, up to a logarithmic factor. This analysis of the element-wise estimation error allows a screening method that can… 
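The estimator in question minimizes $\frac{1}{2}\|\theta - y\|_2^2 + \lambda \sum_i |\theta_{i+1} - \theta_i|$, and an element-wise error bound justifies screening for change points by thresholding the jumps of the fitted signal. A minimal sketch of this idea, solving the fused lasso with a generic ADMM splitting (not the paper's analysis or algorithm; `lam`, `rho`, the noise level, and the screening threshold are all illustrative choices):

```python
import numpy as np

def fused_lasso_admm(y, lam, rho=1.0, n_iter=500):
    """Minimize 0.5*||theta - y||^2 + lam * sum_i |theta[i+1] - theta[i]| via ADMM."""
    n = len(y)
    # First-difference operator D: (D @ theta)[i] = theta[i+1] - theta[i]
    D = np.diff(np.eye(n), axis=0)
    A = np.eye(n) + rho * D.T @ D          # system matrix for the theta-update
    z = np.zeros(n - 1)                    # auxiliary variable for D @ theta
    u = np.zeros(n - 1)                    # scaled dual variable
    for _ in range(n_iter):
        theta = np.linalg.solve(A, y + rho * D.T @ (z - u))
        Dtheta = D @ theta
        # z-update is soft-thresholding of the differences
        z = np.sign(Dtheta + u) * np.maximum(np.abs(Dtheta + u) - lam / rho, 0.0)
        u = u + Dtheta - z
    return theta

# Synthetic piecewise-constant signal with two change points (indices 19 and 39)
rng = np.random.default_rng(0)
truth = np.concatenate([np.zeros(20), 3.0 * np.ones(20), 1.0 * np.ones(20)])
y = truth + 0.3 * rng.standard_normal(60)

theta_hat = fused_lasso_admm(y, lam=2.0)
# Screening step: declare a change point wherever the estimated jump is large
change_points = np.where(np.abs(np.diff(theta_hat)) > 1.0)[0]
```

Because the fitted signal is (nearly) piecewise constant, the thresholded differences isolate the jump locations; an element-wise error bound is exactly what guarantees the noise-only differences stay below such a threshold.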
Multivariate extensions of isotonic regression and total variation denoising via entire monotonicity and Hardy–Krause variation
We consider the problem of nonparametric regression when the covariate is $d$-dimensional, where $d \geq 1$. In this paper we introduce and study two nonparametric least squares estimators (LSEs) in
A review on minimax rates in change point detection and localisation.
This paper reviews recent developments in fundamental limits and optimal algorithms for change point analysis. We focus on minimax optimal rates in change point detection and localisation, in both


Multiple Change-Point Estimation With a Total Variation Penalty
We propose a new approach for dealing with the estimation of the location of change-points in one-dimensional piecewise constant signals observed in white noise. Our approach consists in reframing
On change point detection using the fused lasso method
In this paper we analyze the asymptotic properties of $\ell_1$ penalized maximum likelihood estimation of signals with piecewise constant mean values and/or variances. The focus is on segmentation of a
Approximate Recovery in Changepoint Problems, from $\ell_2$ Estimation Error Rates
In the 1-dimensional multiple changepoint detection problem, we prove that any procedure with a fast enough $\ell_2$ error rate, in terms of its estimation of the underlying piecewise constant mean
Adaptive risk bounds in univariate total variation denoising and trend filtering
We study trend filtering, a relatively recent method for univariate nonparametric regression. For a given positive integer $r$, the $r$-th order trend filtering estimator is defined as the minimizer
The group fused Lasso for multiple change-point detection
The group fused Lasso is presented for detection of multiple change-points shared by a set of co-occurring one-dimensional signals and fast algorithms are proposed to solve the resulting optimization problems.
Fast Newton methods for the group fused lasso
A specialized projected Newton method, combined with a primal active set approach, is developed to be substantially faster than existing methods on the group fused lasso, a convex model that approximates a multi-dimensional signal via an approximately piecewise-constant signal.
Consider the problem of estimating a step function in the presence of additive measurement noise. In the case that the number of jumps is known, the least-squares estimators for the locations of the
Consistencies and rates of convergence of jump-penalized least squares estimators
We study the asymptotics for jump-penalized least squares regression aiming at approximating a regression function by piecewise constant functions. Besides conventional consistency and convergence
The solution path of the generalized lasso
We present a path algorithm for the generalized lasso problem. This problem penalizes the $\ell_1$ norm of a matrix D times the coefficient vector, and has a wide range of applications, dictated by
Sparsity and smoothness via the fused lasso
The lasso penalizes a least squares regression by the sum of the absolute values ($\ell_1$ norm) of the coefficients. The form of this penalty encourages sparse solutions (with many coefficients