Minimax Rates for Statistical Inverse Problems Under General Source Conditions

@article{ding2018minimax,
  title={Minimax Rates for Statistical Inverse Problems Under General Source Conditions},
  author={Litao Ding and P. Math{\'e}},
  journal={Computational Methods in Applied Mathematics},
  year={2018},
  pages={603--608}
}
  • Litao Ding, P. Mathé
  • Published 2018
  • Mathematics, Computer Science
  • Computational Methods in Applied Mathematics
Abstract: We describe the minimax reconstruction rates in linear ill-posed equations in Hilbert space when smoothness is given in terms of general source sets. The underlying fundamental result, the minimax rate on ellipsoids, is proved similarly to the seminal study by D. L. Donoho, R. C. Liu, and B. MacGibbon [4]. These authors highlighted the special role of the truncated series estimator, and for such estimators the risk can be given explicitly. We provide several examples, indicating…
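The truncated series estimator mentioned in the abstract can be sketched in the Gaussian sequence model; note this is a minimal toy illustration (the signal, noise level, and coefficient decay below are invented, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian sequence model: y_k = theta_k + sigma * xi_k, xi_k ~ N(0, 1).
sigma = 0.05
theta = np.array([1.0 / k**2 for k in range(1, 101)])  # polynomially decaying signal
y = theta + sigma * rng.standard_normal(theta.size)

def truncated_series_estimate(y, N):
    """Keep the first N empirical coefficients and set the rest to zero."""
    est = np.zeros_like(y)
    est[:N] = y[:N]
    return est

def exact_risk(theta, sigma, N):
    """Mean squared error of the truncated series estimator:
    variance sigma^2 for each retained coefficient plus the squared
    bias from the discarded tail."""
    return N * sigma**2 + float(np.sum(theta[N:] ** 2))

# Because the risk is explicit, the best truncation level is found by
# direct minimization over N, balancing variance against bias.
risks = [exact_risk(theta, sigma, N) for N in range(1, theta.size + 1)]
best_N = int(np.argmin(risks)) + 1
```

The explicit bias-variance split in `exact_risk` is what makes truncated series estimators convenient for proving minimax rates on ellipsoids.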
Posterior Contraction in Bayesian Inverse Problems Under Gaussian Priors
We study Bayesian inference in statistical linear inverse problems with Gaussian noise and priors in a separable Hilbert space setting. We focus our interest on the posterior contraction rate in the…
Regularization of linear ill-posed problems involving multiplication operators
We study regularization of ill-posed equations involving multiplication operators when the multiplier function is positive almost everywhere and zero is an accumulation point of the range of this…
Designing truncated priors for direct and inverse Bayesian problems
Abstract: The Bayesian approach to inverse problems with functional unknowns has received significant attention in recent years. An important component of the developing theory is the study of the…
On the asymptotical regularization for linear inverse problems in presence of white noise
We interpret steady linear statistical inverse problems as artificial dynamic systems with white noise and introduce a stochastic differential equation (SDE) system where the inverse of the ending…
De-noising by thresholding operator adapted wavelets
Donoho and Johnstone [13] proposed a method for reconstructing an unknown smooth function $u$ from noisy data $u+\zeta$ by translating the empirical wavelet coefficients of $u+\zeta$ towards zero. It is shown that the approximation of $u$ obtained by thresholding the gamblet (operator adapted wavelet) coefficients of $u+\zeta$ is near minimax optimal (up to a multiplicative constant), and, with high probability, its energy norm is bounded by that of $u$ up to a constant depending on the amplitude of the noise.
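The coefficient-shrinkage idea behind this entry can be sketched with plain soft thresholding of noisy coefficients; this is a generic toy example with invented coefficients and threshold, not the gamblet construction of the paper:

```python
import numpy as np

def soft_threshold(coeffs, tau):
    """Translate each empirical coefficient towards zero by tau;
    coefficients with magnitude below tau are set exactly to zero."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - tau, 0.0)

rng = np.random.default_rng(1)
u_coeffs = np.array([4.0, -3.0, 0.0, 0.0, 2.5, 0.0])  # sparse "wavelet" coefficients of u
zeta = 0.1 * rng.standard_normal(u_coeffs.size)        # noise added to the data
denoised = soft_threshold(u_coeffs + zeta, tau=0.5)    # small noisy coefficients vanish
```

Coefficients that are purely noise fall below the threshold and are removed, while large coefficients survive with a bias of at most `tau`.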
Empirical risk minimization as parameter choice rule for general linear regularization methods
We consider the statistical inverse problem to recover $f$ from noisy measurements $Y = Tf + \sigma \xi$ where $\xi$ is Gaussian white noise and $T$ a compact operator between Hilbert spaces.
Statistical Inverse Estimation in Hilbert Scales
The recovery of signals from indirect measurements, blurred by random noise, is considered under the assumption that prior knowledge regarding the smoothness of the signal is available and the general problem is embedded in an abstract Hilbert scale.
Geometry of linear ill-posed problems in variable Hilbert scales, Inverse Problems 19, 789-803
The authors study the best possible accuracy of recovering the solution from linear ill-posed problems in variable Hilbert scales. A priori smoothness of the solution is expressed in terms of general…
Convergence Rates of General Regularization Methods for Statistical Inverse Problems and Applications
This paper introduces a unifying technique to study the mean square error of a large class of regularization methods (spectral methods) including the aforementioned estimators as well as many iterative methods, such as $\nu$-methods and the Landweber iteration.
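As one concrete member of this class of spectral methods, the Landweber iteration can be sketched as follows; the operator, step size, and iteration count below are an invented, noiseless toy setup, not from the paper:

```python
import numpy as np

# Landweber iteration f_{m+1} = f_m + mu * T^T (y - T f_m); it converges
# for step sizes 0 < mu < 2 / ||T||^2, and early stopping acts as
# regularization in the noisy case.
T = np.array([[1.0, 0.0],
              [0.0, 0.01]])       # severely ill-conditioned "compact" operator
f_true = np.array([1.0, 2.0])
y = T @ f_true                    # noiseless data, for illustration only

def landweber(T, y, mu, iterations):
    f = np.zeros(T.shape[1])
    for _ in range(iterations):
        f = f + mu * (T.T @ (y - T @ f))
    return f

# The component tied to the small singular value (0.01) converges very
# slowly -- its error contracts by (1 - mu * 0.01**2) per step -- which is
# exactly the ill-posedness that the stopping rule must balance.
f_hat = landweber(T, y, mu=1.0, iterations=100_000)
```

With noisy data one would stop the iteration far earlier; here the noiseless run simply shows convergence toward the true solution.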
Regularization of some linear ill-posed problems with discretized random noisy data
For linear statistical ill-posed problems in Hilbert spaces we introduce an adaptive procedure to recover the unknown solution from indirect, discrete and noisy data. This procedure is shown to be…
Minimax Risk Over Hyperrectangles, and Implications
Consider estimating the mean of a standard Gaussian shift when that mean is known to lie in an orthosymmetric quadratically convex set in $\ell_2$. The minimax risk among linear estimates is within 25%…
Review of rates of convergence and regularity conditions for inverse problems
The aim of this article is to review the different rates of convergence encountered in inverse problems, with both deterministic and stochastic noise. Indeed, in the literature, several regularity…
Non asymptotic minimax rates of testing in signal detection with heterogeneous variances
The aim of this paper is to establish non-asymptotic minimax rates of testing for goodness-of-fit hypotheses in a heteroscedastic setting. More precisely, we deal with sequences $(Y_j)_{j\in J}$ of…
Nonparametric statistical inverse problems
We explain some basic theoretical issues regarding nonparametric statistics applied to inverse problems. Simple examples are used to present classical concepts such as the white noise model, risk…
Minimax risk over hyperrectangles, and implications
  • Ann. Statist. 18
  • 1990