Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach. Part II: Theoretical Analysis

@article{Bortoli2020MaximumLE,
  title={Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach. Part II: Theoretical Analysis},
  author={Valentin De Bortoli and Alain Durmus and M. Pereyra and A. F. Vidal},
  journal={SIAM J. Imaging Sci.},
  year={2020},
  volume={13},
  pages={1990--2028}
}
This paper presents a detailed theoretical analysis of the three stochastic approximation proximal gradient algorithms proposed in our companion paper [49] to set regularization parameters by marginal maximum likelihood estimation. We prove the convergence of a more general stochastic approximation scheme that includes the three algorithms of [49] as special cases. This includes asymptotic and non-asymptotic convergence results with natural and easily verifiable conditions, as well as explicit…
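The scheme analysed in the paper alternates Markov chain Monte Carlo moves in the image x with a stochastic gradient update of the regularization parameter. As a rough illustration only, here is a minimal toy sketch of such a stochastic approximation proximal gradient loop, assuming a Gaussian likelihood and Gaussian prior (a deliberately simple case, not the paper's general nonsmooth setting; all values below are made up for the example):

```python
import numpy as np

# Toy denoising problem: y = x + noise, Gaussian prior of precision theta.
# Posterior: p(x | y, theta) ∝ exp(-||y - x||^2 / (2*sigma2) - theta*||x||^2 / 2)
rng = np.random.default_rng(0)
d = 10
sigma2 = 0.5
x_true = rng.normal(size=d)
y = x_true + rng.normal(scale=np.sqrt(sigma2), size=d)

def grad_log_post(x, theta):
    """Gradient of log p(x | y, theta) for the toy Gaussian model."""
    return (y - x) / sigma2 - theta * x

theta = 1.0                                # initial regularization parameter
x = y.copy()                               # start the Markov chain at the data
gamma = 0.01                               # Langevin step size
deltas = 1.0 / (np.arange(1, 2001) + 10)   # decreasing SA step sizes

for delta in deltas:
    # One unadjusted Langevin (ULA) step targeting p(x | y, theta_n)
    x = x + gamma * grad_log_post(x, theta) + np.sqrt(2 * gamma) * rng.normal(size=d)
    # Unbiased-in-the-limit estimate of d/dtheta log p(y | theta):
    # for a Gaussian prior, d/dtheta log p(x | theta) = d/(2*theta) - ||x||^2 / 2
    grad_theta = d / (2 * theta) - 0.5 * np.sum(x ** 2)
    # Projected stochastic approximation update of theta
    theta = np.clip(theta + delta * grad_theta, 1e-3, 1e3)
```

In this Gaussian case both the sampler and the θ-gradient are available in closed form, which is what makes the sketch short; the paper's contribution is precisely the convergence analysis when the sampler is an inexact MCMC kernel and the model is only assumed log-concave.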
Related Papers

Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach. Part I: Methodology and Experiments
A general empirical Bayesian method for setting regularisation parameters in imaging problems that are convex w.r.t. the unknown image, which uses the same basic operators as proximal optimisation algorithms, namely gradient and proximal operators, and is therefore straightforward to apply to problems that are currently solved by using proximal optimisation techniques.
Bayesian imaging using Plug & Play priors: when Langevin meets Tweedie
Detailed convergence guarantees are established for two algorithms, PnP-ULA (Plug & Play unadjusted Langevin algorithm) and PnP-SGD (Plug & Play stochastic gradient descent), for Monte Carlo sampling and MMSE inference; both approximately target a decision-theoretically optimal Bayesian model that is well posed.
On and beyond Total Variation regularisation in imaging: the role of space variance
The major contributions in the field of space-variant TV-type image reconstruction models are reviewed, focusing in particular on their Bayesian interpretation, which paves the way to new and unexplored research directions.
Künneth Theorems for Vietoris–Rips Homology
A Künneth theorem is proved for the Vietoris–Rips homology and cohomology of a semi-uniform space and for graphs, where it is shown that the Künneth theorem holds for graphs with respect to the strong graph product.

References

Showing 1–10 of 61 references
Maximum likelihood estimation of regularisation parameters in high-dimensional inverse problems: an empirical Bayesian approach
A general empirical Bayesian method for setting regularisation parameters in imaging problems that are convex w.r.t. the unknown image, which uses a stochastic proximal gradient algorithm that is driven by two proximal Markov chain Monte Carlo samplers.
Non-Asymptotic Analysis of Stochastic Approximation Algorithms for Machine Learning
This work provides a non-asymptotic analysis of the convergence of two well-known algorithms, stochastic gradient descent and a simple modification where iterates are averaged, suggesting that a learning rate proportional to the inverse of the number of iterations, while leading to the optimal convergence rate, is not robust to the lack of strong convexity or to the setting of the proportionality constant.
Efficient stochastic optimisation by unadjusted Langevin Monte Carlo
This paper addresses methodological and theoretical difficulties in using high-dimensional Markov chain Monte Carlo algorithms within a stochastic approximation scheme, by using unadjusted Langevin algorithms to construct the stochastic approximation.
On Perturbed Proximal Gradient Algorithms
A version of the proximal gradient algorithm in which the gradient is intractable and is approximated by Monte Carlo methods; conditions on the step size and the Monte Carlo batch size under which convergence is guaranteed are derived.
Theoretical guarantees for approximate sampling from smooth and log‐concave densities
Sampling from various kinds of distributions is an issue of paramount importance in statistics since it is often the key ingredient for constructing estimators, test procedures or confidence…
Maximum Likelihood Estimation of Regularisation Parameters
  • A. F. Vidal, M. Pereyra
  • Mathematics, Computer Science
  • 2018 25th IEEE International Conference on Image Processing (ICIP)
  • 2018
An empirical Bayesian method to estimate regularisation parameters in imaging inverse problems by using a stochastic proximal gradient algorithm driven by two proximal Markov chain Monte Carlo samplers, intimately combining modern optimisation and sampling techniques.
Convergence of Stochastic Proximal Gradient Algorithm
We prove novel convergence results for a stochastic proximal gradient algorithm suitable for solving a large class of convex optimization problems, where a convex objective function is given by the…
User-friendly guarantees for the Langevin Monte Carlo with inaccurate gradient
This paper analyzes several methods of approximate sampling based on discretizations of the Langevin diffusion, establishes guarantees on their error measured in the Wasserstein-2 distance, and provides an upper bound on the error of the first-order Langevin Monte Carlo (LMC) algorithm with optimized varying step size.
Efficient Bayesian Computation by Proximal Markov Chain Monte Carlo: When Langevin Meets Moreau
A new and highly efficient Markov chain Monte Carlo methodology to perform Bayesian computation for high-dimensional models that are log-concave and nonsmooth, a class of models that is central in imaging sciences.
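The core idea behind this proximal MCMC reference, combining Langevin dynamics with the Moreau–Yosida envelope, can be illustrated with a small sketch of a MYULA-type update. The toy target below (an ℓ1-regularised Gaussian likelihood) and all parameter values are assumptions chosen for the illustration, not taken from the paper:

```python
import numpy as np

# Minimal MYULA-style sketch for a nonsmooth target
#   pi(x) ∝ exp(-f(x) - g(x)),  f(x) = ||x - y||^2 / 2,  g = alpha*||x||_1,
# where the nonsmooth g is handled through its proximal operator.
rng = np.random.default_rng(1)
d = 5
y = rng.normal(size=d)
alpha = 0.5   # weight of the L1 term (assumed for this toy example)
lam = 0.1     # Moreau-Yosida smoothing parameter
gamma = 0.05  # Langevin step size

def grad_f(x):
    # Gradient of the smooth data-fit term f(x) = ||x - y||^2 / 2
    return x - y

def prox_g(x, t):
    # Proximal map of t * alpha * ||.||_1: soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - t * alpha, 0.0)

x = y.copy()
samples = []
for k in range(5000):
    # MYULA step: replace the subgradient of g by the Moreau-Yosida
    # gradient (x - prox_g(x, lam)) / lam, which is Lipschitz.
    drift = -grad_f(x) - (x - prox_g(x, lam)) / lam
    x = x + gamma * drift + np.sqrt(2 * gamma) * rng.normal(size=d)
    if k >= 1000:  # discard burn-in
        samples.append(x.copy())

post_mean = np.mean(samples, axis=0)  # Monte Carlo MMSE estimate
```

The smoothing step is what makes an unadjusted Langevin scheme applicable to nonsmooth log-concave models: the envelope's gradient is globally Lipschitz (bounded here by alpha/1), so the usual step-size conditions apply.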
A Convex Approach for Image Restoration with Exact Poisson-Gaussian Likelihood
This work proposes a convex optimization strategy for the reconstruction of images degraded by a linear operator and corrupted with mixed Poisson-Gaussian noise, and shows that in a variational framework the Shifted Poisson and Exponential approximations lead to very good restoration results.