Corpus ID: 88524103

Inverse Problems and Data Assimilation.

@article{Stuart2018InversePA,
  title={Inverse Problems and Data Assimilation.},
  author={Andrew M. Stuart and Armeen Taeb},
  journal={arXiv: Methodology},
  year={2018}
}
These notes are designed with the aim of providing a clear and concise introduction to the subjects of Inverse Problems and Data Assimilation, and their inter-relations, together with citations to some relevant literature in this area. The first half of the notes is dedicated to studying the Bayesian framework for inverse problems. Techniques such as importance sampling and Markov Chain Monte Carlo (MCMC) methods are introduced; these methods have the desirable property that in the limit of an… 
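The importance sampling idea mentioned in the abstract can be illustrated with a minimal sketch for a toy scalar Bayesian inverse problem. Everything here (the identity forward map, the noise level `gamma`, the data value `y`) is an illustrative assumption, not taken from the notes; the linear-Gaussian setup is chosen only because the exact posterior is available as a check.

```python
import numpy as np

# Toy Bayesian inverse problem (illustrative assumptions, not from the notes):
# recover a scalar u from data y = u + eta, eta ~ N(0, gamma^2),
# with prior u ~ N(0, 1). The posterior is then Gaussian and known in
# closed form, which lets us check the importance sampling estimate.
rng = np.random.default_rng(0)
gamma, y = 0.5, 1.2

def log_likelihood(u):
    return -0.5 * ((y - u) / gamma) ** 2

# Importance sampling with the prior as proposal: the unnormalized
# weights are the likelihood values, normalized to sum to one.
u = rng.standard_normal(10_000)
w = np.exp(log_likelihood(u))
w /= w.sum()
posterior_mean_is = float(np.sum(w * u))

# Exact posterior mean for this linear-Gaussian problem: y / (1 + gamma^2).
exact_mean = y / (1.0 + gamma**2)
```

In the small-noise or high-dimensional regime the weights degenerate, which is exactly the failure mode that motivates the sample-size analyses cited below.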
Iterative Ensemble Kalman Methods: A Unified Perspective with Some New Variants
Kernel Methods for Bayesian Elliptic Inverse Problems on Manifolds
This paper investigates the formulation and implementation of Bayesian inverse problems for learning input parameters of partial differential equations (PDEs) defined on manifolds, and establishes an upper bound on the total variation distance between the true posterior and an approximate posterior defined with the kernel forward map.
A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors
Hierarchical models with gamma hyperpriors provide a flexible, sparsity-promoting framework that bridges L1 and L2 regularizations in Bayesian formulations of inverse problems. Despite the Bayesian…
Auto-differentiable Ensemble Kalman Filters
Numerical results show that AD-EnKFs outperform existing methods that use expectation-maximization or particle filters to merge data assimilation and machine learning and are easy to implement and require minimal tuning.
Bayesian Update with Importance Sampling: Required Sample Size
The required sample size for importance sampling, in terms of the χ²-divergence between target and proposal, is investigated, and the roles that dimension, noise level, and other model parameters play in approximating the Bayesian update with importance sampling are illustrated.
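The role of the χ²-divergence can be seen through the heuristic that the effective sample size of importance sampling scales like N / (1 + χ²-divergence). The sketch below checks this numerically for two Gaussians, where the divergence has a closed form; it is an illustration of the idea, not the paper's precise bound, and all parameter values are assumptions.

```python
import numpy as np

# Sketch (illustrative, not the paper's exact result): compare the
# empirical effective sample size of importance sampling against the
# heuristic prediction ESS ~ N / (1 + D_chi2(target || proposal)).
rng = np.random.default_rng(1)

# Target N(mu, 1), proposal N(0, 1): D_chi2 = exp(mu^2) - 1 in closed form.
mu = 1.0
d_chi2 = np.exp(mu**2) - 1.0

N = 50_000
x = rng.standard_normal(N)                 # samples from the proposal
logw = -0.5 * (x - mu) ** 2 + 0.5 * x**2   # log target - log proposal
w = np.exp(logw - logw.max())
ess = w.sum() ** 2 / (w * w).sum()         # empirical effective sample size

ess_pred = N / (1.0 + d_chi2)              # heuristic prediction
```

Since the divergence grows exponentially in the distance between target and proposal, the required sample size blows up quickly, matching the intuition that importance sampling degrades with dimension and small noise.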
Calibrate, emulate, sample
The EKS methodology provides a cheap solution to the design problem of where to place points in parameter space to efficiently train an emulator of the parameter-to-data map for the purposes of Bayesian inversion.
Graph-based Prior and Forward Models for Inverse Problems on Manifolds with Boundaries
Graphical Matérn-type Gaussian field priors are introduced that enable flexible modeling near the boundaries, representing boundary values by superposition of harmonic functions with appropriate Dirichlet boundary conditions.
Iterated Kalman Methodology For Inverse Problems
The application of the UKI to a novel stochastic dynamical system in which the parameter-to-data map is embedded yields improved inversion results, in comparison with the application of EKI to the same system.
Diffusive Optical Tomography in the Bayesian Framework
Many naturally occurring models in the sciences are well approximated by simplified models using multiscale techniques. In such settings it is natural to ask about the relationship between inverse…
Data-Driven Forward Discretizations for Bayesian Inversion
A framework for learning discretizations of expensive forward models in Bayesian inverse problems is suggested, and it is numerically shown that in a variety of inverse problems arising in mechanical engineering, signal processing, and the geosciences, the observations contain useful information to guide the choice of discretization.

References

Showing 1–10 of 112 references
The Bayesian Approach to Inverse Problems
These lecture notes highlight the mathematical and computational structure relating to the formulation of, and development of algorithms for, the Bayesian approach to inverse problems in…
Ergodicity and Accuracy of Optimal Particle Filters for Bayesian Data Assimilation
For particle filters and ensemble Kalman filters it is of practical importance to understand how and why data assimilation methods can be effective when used with a fixed small number of particles,…
Variational Characterizations of Local Entropy and Heat Regularization in Deep Learning
Variational characterizations that naturally suggest a two-step scheme for local-entropy and heat-regularized losses are introduced, based on the iterative shift of a probability density and the calculation of a best Gaussian approximation in Kullback–Leibler divergence.
Continuum Limits of Posteriors in Graph Bayesian Inverse Problems
A graph-based Bayesian inverse problem is introduced, and it is shown that the graph-posterior measures over functions in $M_n$ converge, in the large $n$ limit, to a posterior over functions in $M$ that solves a Bayesian inverse problem with known domain.
Data assimilation in the geosciences: An overview of methods, issues, and perspectives
We commonly refer to state-estimation theory in geosciences as data assimilation. This term encompasses the entire sequence of operations that, starting from the observations of a system, and from…
Importance Sampling and Necessary Sample Size: An Information Theory Approach
  • D. Sanz-Alonso
  • Mathematics, Computer Science
    SIAM/ASA J. Uncertain. Quantification
  • 2018
A general bound that needs to hold for importance sampling to be successful is derived, relating the f-divergence between the target and the proposal to the sample size; the bound is deduced from a new and simple information-theoretic paradigm for the study of importance sampling.
Adaptive importance sampling Monte Carlo simulation for general multivariate probability laws
  • R. Kawai
  • Mathematics, Computer Science
    J. Comput. Appl. Math.
  • 2017
A parametric adaptive importance sampling variance-reduction method for general multivariate probability laws is proposed, and the asymptotic normality of the estimators of the desired mean and of the importance sampling parameter, as the number of observations tends to infinity, is established.
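The adaptive idea can be sketched in a few lines: iteratively refit the proposal to the weighted samples so that later iterations draw from a better-matched distribution. This is a generic moment-matching sketch under assumed Gaussian target and proposal families, not the method of the paper above.

```python
import numpy as np

# Adaptive importance sampling sketch (illustrative): repeatedly move a
# Gaussian proposal's mean toward the self-normalized weighted sample
# mean, so the proposal tracks the target and the weights stabilize.
rng = np.random.default_rng(3)

def log_target(x):
    # Target density N(3, 1), treated as known only up to a constant.
    return -0.5 * (x - 3.0) ** 2

theta = 0.0                      # proposal mean, adapted over iterations
for _ in range(10):
    x = theta + rng.standard_normal(2_000)          # draw from N(theta, 1)
    logw = log_target(x) + 0.5 * (x - theta) ** 2   # log target - log proposal
    w = np.exp(logw - logw.max())
    w /= w.sum()
    theta = float(np.sum(w * x))  # moment-matching update of the mean
```

After a few iterations the proposal mean settles near the target mean, and the weights become nearly uniform, which is the variance-reduction effect the adaptive scheme aims for.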
Analysis of the Ensemble Kalman Filter for Inverse Problems
The goal of this paper is to analyze the EnKF when applied to inverse problems with fixed ensemble size, and to demonstrate that the conclusions of the analysis extend beyond the linear inverse problem setting.
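A minimal ensemble Kalman inversion (EKI) loop for a linear forward map shows the mechanics the analysis above addresses: ensemble empirical covariances replace derivatives in the Kalman gain. The dimensions, ensemble size, and iteration count below are illustrative assumptions, and the perturbed-observation variant shown is one common choice, not the paper's exact scheme.

```python
import numpy as np

# Minimal EKI sketch for a linear inverse problem y = A u + noise
# (illustrative; rows of U are ensemble members).
rng = np.random.default_rng(2)

d, p = 5, 3
A = rng.standard_normal((d, p))
u_true = np.array([1.0, -0.5, 2.0])
gamma = 0.1
y = A @ u_true + gamma * rng.standard_normal(d)
Gamma = gamma**2 * np.eye(d)

J = 200                                   # ensemble size
U = rng.standard_normal((J, p))           # initial ensemble from the prior

for _ in range(30):
    G = U @ A.T                           # forward map on each member
    du = U - U.mean(axis=0)
    dg = G - G.mean(axis=0)
    Cug = du.T @ dg / J                   # cross-covariance of (u, G(u))
    Cgg = dg.T @ dg / J                   # covariance of G(u)
    K = Cug @ np.linalg.inv(Cgg + Gamma)  # ensemble Kalman gain
    Y = y + gamma * rng.standard_normal((J, d))  # perturbed observations
    U = U + (Y - G) @ K.T                 # Kalman update of each member

u_est = U.mean(axis=0)
```

Note that no derivatives of the forward map are used: only evaluations `G = U @ A.T` enter, which is why the method extends directly to black-box nonlinear maps, the setting the fixed-ensemble-size analysis is concerned with.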
Bernstein–von Mises theorems for statistical inverse problems I: Schrödinger equation
The inverse problem of determining the unknown potential $f>0$ in the partial differential equation $$\frac{\Delta}{2} u - fu = 0 \text{ on } \mathcal{O} ~~\text{s.t. } u = g \text{ on } \partial…
Existence and Uniqueness for Four-Dimensional Variational Data Assimilation in Discrete Time
  • J. Bröcker
  • Computer Science, Mathematics
    SIAM J. Appl. Dyn. Syst.
  • 2017
Variational techniques for data assimilation, i.e., estimating orbits of dynamical models from observations, are revisited. It is shown that under mild hypotheses a solution to this variational…