Minimax theory for a class of nonlinear statistical inverse problems

@article{Ray2016MinimaxTF,
  title={Minimax theory for a class of nonlinear statistical inverse problems},
  author={Kolyan Ray and Johannes Schmidt-Hieber},
  journal={Inverse Problems},
  year={2016},
  volume={32},
  pages={065003}
}
We study a class of statistical inverse problems with nonlinear pointwise operators motivated by concrete statistical applications. A two-step procedure is proposed, where the first step smoothes the data and inverts the nonlinearity. This reduces the initial nonlinear problem to a linear inverse problem with deterministic noise, which is then solved in a second step. The noise reduction step is based on wavelet thresholding and is shown to be minimax optimal (up to logarithmic factors) in a… 
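The two-step procedure can be illustrated with a short numerical sketch. All concrete choices below are illustrative assumptions, not details from the paper: the pointwise nonlinearity is taken to be squaring (inverted by a square root after smoothing), the linear operator K is a periodic convolution, the smoothing step uses soft wavelet thresholding with the universal threshold, and the linear inversion uses a Tikhonov-type spectral filter.

```python
# Minimal sketch of the two-step procedure, under illustrative assumptions
# NOT taken from the paper: nonlinearity h(x) = x^2, K = periodic convolution,
# Tikhonov spectral filter in step 2. Requires numpy and PyWavelets.
import numpy as np
import pywt

rng = np.random.default_rng(0)

# Synthetic data: Y_i = h((Kf)(x_i)) + sigma * eps_i
n = 1024
x = np.linspace(0.0, 1.0, n, endpoint=False)
f = np.sin(2 * np.pi * x) ** 2                      # unknown signal (>= 0)
kernel = np.exp(-0.5 * ((x - 0.5) / 0.05) ** 2)
kernel /= kernel.sum()
Kh = np.fft.fft(np.fft.ifftshift(kernel))           # Fourier symbol of K
Kf = np.real(np.fft.ifft(Kh * np.fft.fft(f)))
sigma = 0.02
y = Kf ** 2 + sigma * rng.standard_normal(n)        # nonlinear, noisy data

# Step 1a: smooth the data by soft wavelet thresholding
coeffs = pywt.wavedec(y, "db4", level=6)
thresh = sigma * np.sqrt(2 * np.log(n))             # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                        for c in coeffs[1:]]
y_smooth = pywt.waverec(coeffs, "db4")[:n]

# Step 1b: invert the pointwise nonlinearity h(x) = x^2
g = np.sqrt(np.clip(y_smooth, 0.0, None))           # estimate of Kf

# Step 2: solve the linear problem Kf = g, now with deterministic noise
alpha = 1e-3                                        # regularization parameter
f_hat = np.real(np.fft.ifft(np.conj(Kh) * np.fft.fft(g)
                            / (np.abs(Kh) ** 2 + alpha)))

print("relative L2 error:", np.linalg.norm(f_hat - f) / np.linalg.norm(f))
```

On this toy example the thresholding step suppresses the stochastic noise before the nonlinearity is inverted, so the remaining linear deconvolution faces only a small deterministic error, mirroring the reduction described in the abstract.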
Convergence analysis of Tikhonov regularization for non-linear statistical inverse learning problems
We study a non-linear statistical inverse learning problem, where we observe the noisy image of a quantity through a non-linear operator at some random design points. We consider the widely used…
Estimating the memory parameter for potentially non-linear and non-Gaussian time series with wavelets
It is proved that, under some mild assumptions, a newly designed memory estimator, named LRMW in this paper, is asymptotically consistent.
Spike and slab variational Bayes for high dimensional logistic regression
A mean-field spike and slab VB approximation of widely used Bayesian model selection priors is developed for sparse high-dimensional logistic regression, providing non-asymptotic theoretical guarantees for the VB posterior in both $\ell_2$ and prediction loss for a sparse truth, with optimal (minimax) convergence rates.
Asymptotic nonequivalence of density estimation and Gaussian white noise for small densities
It is well-known that density estimation on the unit interval is asymptotically equivalent to a Gaussian white noise experiment, provided the densities are sufficiently smooth and uniformly bounded…
The Le Cam distance between density estimation, Poisson processes and Gaussian white noise
It is well-known that density estimation on the unit interval is asymptotically equivalent to a Gaussian white noise experiment, provided the densities have Hölder smoothness larger than $1/2$ and…
The Le Cam distance between density estimation and the Gaussian white noise model in the case of small signals
Consider nonparametric density estimation where we observe $n$ i.i.d. copies of a random variable with density $f$ on the unit interval. It is well-known that estimation of the density $f$ is…
A regularity class for the roots of nonnegative functions
We investigate the regularity of the positive roots of a nonnegative function of one variable. A modified Hölder space $\mathcal{F}^\beta$ is introduced such that if $f \in \mathcal{F}^\beta$…
A Tutorial on Fisher Information
Bayes factors for research workers
  • A. Ly
  • Computer Science
  • 2018
This dissertation advocates the use of Bayes factors in empirical research to replace or complement standard null hypothesis tests based on p-values, and implements them in Jeffreys's Amazing Statistics Program (JASP), which is freely available and open-source.

References

Showing 1–10 of 40 references
Two-step regularization methods for linear inverse problems
In this paper we investigate reconstruction methods for the treatment of ill-posed inverse problems. These methods are based on a data estimation operator $S_\lambda$ followed by a classical regularization…
Consistency and rates of convergence of nonlinear Tikhonov regularization with random noise
We consider nonlinear inverse problems described by operator equations F(a) = u. Here a is an element of a Hilbert space H which we want to estimate, and u is an $L^2$-function. The given data consist… (a minimal sketch of this Tikhonov setup is given after the reference list).
Regularization by fractional filter methods and data smoothing
This paper is concerned with the regularization of linear ill-posed problems by a combination of data smoothing and fractional filter methods. For the data smoothing, a wavelet shrinkage denoising is…
Iteratively regularized Newton-type methods for general data misfit functionals and applications to Poisson data
The main focus of this paper is on inverse problems with Poisson data, where the natural data misfit functional is given by the Kullback–Leibler divergence; convergence and convergence rates as the noise level tends to 0 are proved.
Regularization of Inverse Problems
Preface. 1. Introduction: Examples of Inverse Problems. 2. Ill-Posed Linear Operator Equations. 3. Regularization Operators. 4. Continuous Regularization Methods. 5. Tikhonov Regularization. 6. …
Asymptotic equivalence of spectral density estimation and Gaussian white noise
Asymptotic equivalence, in the sense of Le Cam's deficiency $\Delta$-distance, to two Gaussian experiments with simpler structure is established, which represents the step from a Gaussian scale model to a location model and also has a counterpart in established inference methods, i.e. log-periodogram regression.
Enhancing linear regularization to treat large noise
For solving linear ill-posed problems with noisy data, regularization methods are required. In this paper we study regularization under general noise assumptions containing large noise and…
Asymptotic equivalence of functional linear regression and a white noise inverse problem
We consider the statistical experiment of functional linear regression (FLR). Furthermore, we introduce a white noise model where one observes an Itô process, which contains the covariance operator…
Nonparametric regression in exponential families
Most results in nonparametric regression theory are developed only for the case of additive noise. In such a setting many smoothing techniques including wavelet thresholding methods have been…
Asymptotic equivalence for nonparametric generalized linear models
We establish that a non-Gaussian nonparametric regression model is asymptotically equivalent to a regression model with Gaussian noise. The approximation is in the sense of Le Cam's…
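As a companion to the entry "Consistency and rates of convergence of nonlinear Tikhonov regularization with random noise" above, the following is a minimal sketch of nonlinear Tikhonov regularization, minimizing $\|F(a) - u\|^2 + \alpha \|a\|^2$ by gradient descent. The cubic pointwise operator, the initialization, and all parameter values are hypothetical choices for illustration, not details from that paper.

```python
# Hypothetical nonlinear Tikhonov sketch: F(a) = a^3 acting pointwise,
# solved by gradient descent on the penalized least-squares functional.
import numpy as np

def F(a):
    return a ** 3                 # assumed nonlinear forward operator

def dF_adjoint(a, r):
    return 3 * a ** 2 * r         # F'(a)^* applied to the residual r

def tikhonov(u, alpha=1e-2, lr=1e-2, steps=2000):
    """Minimize ||F(a) - u||^2 + alpha * ||a||^2 by gradient descent."""
    a = np.ones_like(u)           # start away from a = 0, where F'(a) = 0
    for _ in range(steps):
        r = F(a) - u
        a -= lr * 2.0 * (dF_adjoint(a, r) + alpha * a)
    return a

rng = np.random.default_rng(1)
a_true = np.linspace(0.5, 1.5, 200)
u_noisy = F(a_true) + 0.05 * rng.standard_normal(a_true.size)
a_hat = tikhonov(u_noisy)
print("max abs error:", np.abs(a_hat - a_true).max())
```

Initializing at a = 1 rather than 0 avoids the stationary point of the cubic operator, where the gradient of the data-misfit term vanishes identically.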