# Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach. Part II: Theoretical Analysis

```bibtex
@article{Bortoli2020MaximumLE,
  title   = {Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach. Part II: Theoretical Analysis},
  author  = {Valentin De Bortoli and Alain Durmus and M. Pereyra and A. F. Vidal},
  journal = {SIAM J. Imaging Sci.},
  year    = {2020},
  volume  = {13},
  pages   = {1990-2028}
}
```

This paper presents a detailed theoretical analysis of the three stochastic approximation proximal gradient algorithms proposed in our companion paper [49] to set regularization parameters by marginal maximum likelihood estimation. We prove the convergence of a more general stochastic approximation scheme that includes the three algorithms of [49] as special cases. This includes asymptotic and non-asymptotic convergence results with natural and easily verifiable conditions, as well as explicit…
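As a toy illustration of the kind of scheme analysed here (not the paper's actual algorithms), the sketch below runs a one-dimensional stochastic-approximation loop: an unadjusted Langevin chain targets the posterior at the current regularization parameter, while the parameter is updated with a noisy gradient of the log marginal likelihood obtained through Fisher's identity. The model (scalar Gaussian prior and likelihood) and all step sizes are our own assumptions.

```python
import numpy as np

# Hypothetical 1-D toy problem: estimate the precision parameter theta of a
# Gaussian prior p(x | theta) ∝ exp(-theta * x^2 / 2) from one observation
# y = x + noise, by stochastic approximation on the marginal likelihood.

rng = np.random.default_rng(0)
y = 1.5                     # observed datum
sigma2 = 0.5                # known noise variance

theta = 1.0                 # initial regularization parameter
x = y                       # initial Markov chain state
gamma = 1e-2                # Langevin step size
for n in range(1, 5001):
    # One unadjusted Langevin step targeting p(x | y, theta)
    grad_log_post = -(x - y) / sigma2 - theta * x
    x = x + gamma * grad_log_post + np.sqrt(2 * gamma) * rng.standard_normal()
    # Stochastic approximation update of theta via Fisher's identity:
    # d/dtheta log p(y | theta) = E[1/(2 theta) - x^2 / 2 | y, theta]
    delta = 10.0 / (n + 100)             # decreasing step size
    theta = max(theta + delta * (0.5 / theta - 0.5 * x**2), 1e-3)

print(theta)
```

For this model the marginal maximum likelihood estimate solves 1/theta = y**2 - sigma2, i.e. theta is roughly 0.57, and the iterates settle in that neighbourhood.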

#### 4 Citations

Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach Part I: Methodology and Experiments

- Computer Science
- SIAM J. Imaging Sci.
- 2020

A general empirical Bayesian method for setting regularisation parameters in imaging problems that are convex w.r.t. the unknown image. The method uses the same basic operators as proximal optimisation algorithms, namely gradient and proximal operators, and is therefore straightforward to apply to problems that are currently solved using proximal optimisation techniques.

Bayesian imaging using Plug & Play priors: when Langevin meets Tweedie

- Computer Science, Mathematics
- ArXiv
- 2021

Detailed convergence guarantees are established for two algorithms, PnP-ULA (Plug & Play Unadjusted Langevin Algorithm) and PnP-SGD (Plug & Play Stochastic Gradient Descent), for Monte Carlo sampling and MMSE inference; both approximately target a decision-theoretically optimal Bayesian model that is well posed.

On and beyond Total Variation regularisation in imaging: the role of space variance

- Computer Science, Mathematics
- ArXiv
- 2021

The major contributions in the field of space-variant TV-type image reconstruction models are reviewed, focusing in particular on their Bayesian interpretation, which paves the way to exciting and unexplored research directions.

Künneth Theorems for Vietoris-Rips Homology

- Computer Science, Mathematics
- ArXiv
- 2020

A Künneth theorem is proved for the Vietoris-Rips homology and cohomology of a semi-uniform space, and it is shown that the Künneth theorem holds for graphs with respect to the strong graph product.

#### References

Showing 1-10 of 61 references

Maximum likelihood estimation of regularisation parameters in high-dimensional inverse problems: an empirical Bayesian approach

- Mathematics, Computer Science
- 2019

A general empirical Bayesian method for setting regularisation parameters in imaging problems that are convex w.r.t. the unknown image, which uses a stochastic proximal gradient algorithm that is driven by two proximal Markov chain Monte Carlo samplers.

Non-Asymptotic Analysis of Stochastic Approximation Algorithms for Machine Learning

- Computer Science, Mathematics
- NIPS
- 2011

This work provides a non-asymptotic analysis of the convergence of two well-known algorithms: stochastic gradient descent, and a simple modification in which the iterates are averaged. The results suggest that a learning rate proportional to the inverse of the number of iterations, while leading to the optimal convergence rate, is not robust to the lack of strong convexity or to the setting of the proportionality constant.
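The averaging modification (Polyak-Ruppert averaging) can be sketched in a few lines; the 1-D quadratic objective, step-size constant, and sample distribution below are illustrative choices of ours, not the paper's setting.

```python
import numpy as np

# Plain SGD with a 1/n step size on f(w) = E[(w - z)^2], z ~ N(2, 1),
# alongside the running average of its iterates (Polyak-Ruppert averaging).

rng = np.random.default_rng(1)
w = 0.0                             # SGD iterate
avg = 0.0                           # running average of the iterates
for n in range(1, 20001):
    z = 2.0 + rng.standard_normal()
    grad = 2 * (w - z)              # stochastic gradient of (w - z)^2
    w -= grad / (2 * n)             # learning rate proportional to 1/n
    avg += (w - avg) / n            # incremental mean of w_1, ..., w_n

print(w, avg)
```

Both `w` and `avg` approach the minimizer 2.0; the averaged sequence is the one whose rate is robust to misspecified constants in more general settings.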

Efficient stochastic optimisation by unadjusted Langevin Monte Carlo

- Computer Science, Mathematics
- Stat. Comput.
- 2021

This paper addresses methodological and theoretical difficulties in using high-dimensional Markov chain Monte Carlo algorithms within a stochastic approximation scheme by using unadjusted Langevin algorithms to construct the stochastic approximation.

On Perturbed Proximal Gradient Algorithms

- Mathematics, Computer Science
- J. Mach. Learn. Res.
- 2017

A version of the proximal gradient algorithm in which the intractable gradient is approximated by Monte Carlo methods; conditions on the step size and the Monte Carlo batch size under which convergence is guaranteed are derived.
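A minimal sketch of such a perturbed proximal gradient step, under our own toy assumptions (a 1-D lasso-type objective, a fixed step size, and a synthetic zero-mean perturbation whose variance shrinks as the batch size grows):

```python
import numpy as np

# Minimize f(w) + lam * |w| with f(w) = 0.5 * (w - 1)^2, replacing the exact
# gradient of f by a noisy estimate whose Monte Carlo batch size grows with n.

def soft_threshold(v, t):
    # Proximal operator of t * |.|
    return np.sign(v) * max(abs(v) - t, 0.0)

rng = np.random.default_rng(2)
lam, step = 0.3, 0.5
w = 0.0
for n in range(1, 2001):
    batch = n // 10 + 1                        # growing batch size
    noise = rng.standard_normal(batch).mean()  # vanishing-variance error
    grad = (w - 1.0) + noise                   # perturbed gradient of f
    w = soft_threshold(w - step * grad, step * lam)

print(w)
```

The exact solution is w = 1 - lam = 0.7, and the iterates concentrate around it as the gradient error vanishes.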

Theoretical guarantees for approximate sampling from smooth and log‐concave densities

- Mathematics
- 2014

Sampling from various kinds of distributions is an issue of paramount importance in statistics since it is often the key ingredient for constructing estimators, test procedures or confidence…

Maximum Likelihood Estimation of Regularisation Parameters

- Mathematics, Computer Science
- 2018 25th IEEE International Conference on Image Processing (ICIP)
- 2018

An empirical Bayesian method to estimate regularisation parameters in imaging inverse problems by using a stochastic proximal gradient algorithm driven by two proximal Markov chain Monte Carlo samplers, intimately combining modern optimisation and sampling techniques.

Convergence of Stochastic Proximal Gradient Algorithm

- Mathematics
- Applied Mathematics & Optimization
- 2019

We prove novel convergence results for a stochastic proximal gradient algorithm suitable for solving a large class of convex optimization problems, where a convex objective function is given by the…

User-friendly guarantees for the Langevin Monte Carlo with inaccurate gradient

- Mathematics, Computer Science
- ArXiv
- 2017

This paper analyzes several methods of approximate sampling based on discretizations of the Langevin diffusion, establishes guarantees on their error measured in the Wasserstein-2 distance, and provides an upper bound on the error of the first-order Langevin Monte Carlo (LMC) algorithm with optimized varying step size.
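The basic LMC recursion is short enough to sketch; the target (a standard Gaussian), the fixed deterministic gradient error, and the step size below are illustrative assumptions of ours, not the paper's optimized schedule.

```python
import numpy as np

# Unadjusted Langevin Monte Carlo targeting pi = N(0, 1), with a small fixed
# bias added to the gradient of log pi to mimic an inaccurate gradient.

rng = np.random.default_rng(3)
h = 0.05                      # step size
x = 0.0
samples = []
for k in range(20000):
    grad_log_pi = -x + 0.01   # exact gradient -x, plus a bias of 0.01
    x = x + h * grad_log_pi + np.sqrt(2 * h) * rng.standard_normal()
    if k >= 2000:             # discard burn-in
        samples.append(x)

samples = np.array(samples)
print(samples.mean(), samples.var())
```

The chain's empirical mean and variance land near 0 and 1; both the gradient bias and the discretization contribute a small, quantifiable Wasserstein error of the kind the paper bounds.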

Efficient Bayesian Computation by Proximal Markov Chain Monte Carlo: When Langevin Meets Moreau

- Computer Science, Mathematics
- SIAM J. Imaging Sci.
- 2018

A new and highly efficient Markov chain Monte Carlo methodology to perform Bayesian computation for high-dimensional models that are log-concave and nonsmooth, a class of models that is central in imaging sciences.
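The core idea, replacing the subgradient of a nonsmooth potential by the gradient of its Moreau envelope inside a Langevin step, can be sketched as follows for the scalar Laplace target; the step size and smoothing parameter are our own toy choices.

```python
import numpy as np

# MYULA-style sampler for the nonsmooth target pi(x) ∝ exp(-|x|): the
# subgradient of |x| is replaced by the gradient of its Moreau envelope,
# (x - prox_{lam|.|}(x)) / lam, which is Lipschitz and defined everywhere.

rng = np.random.default_rng(4)
h, lam = 0.05, 0.1
x = 0.0
samples = []
for k in range(40000):
    prox = np.sign(x) * max(abs(x) - lam, 0.0)   # prox of lam * |.|
    grad_envelope = (x - prox) / lam             # smooth surrogate gradient
    x = x - h * grad_envelope + np.sqrt(2 * h) * rng.standard_normal()
    if k >= 4000:                                # discard burn-in
        samples.append(x)

samples = np.array(samples)
print(samples.mean(), np.mean(np.abs(samples)))
```

For the Laplace target, E[x] = 0 and E[|x|] = 1, and the Moreau-Yosida bias is controlled by lam, which is the trade-off the proximal MCMC framework makes explicit.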

A Convex Approach for Image Restoration with Exact Poisson-Gaussian Likelihood

- Mathematics, Computer Science
- SIAM J. Imaging Sci.
- 2015

This work proposes a convex optimization strategy for the reconstruction of images degraded by a linear operator and corrupted by mixed Poisson-Gaussian noise, and shows that in a variational framework the Shifted Poisson and Exponential approximations lead to very good restoration results.