Model Reduction of Linear Dynamical Systems via Balancing for Bayesian Inference

@article{Qian2022ModelRO,
  title={Model Reduction of Linear Dynamical Systems via Balancing for Bayesian Inference},
  author={Elizabeth Qian and Jemima M. Tabeart and Christopher A. Beattie and Serkan Gugercin and Jiahua Jiang and Peter R. Kramer and Akil C. Narayan},
  journal={Journal of Scientific Computing},
  year={2022},
  volume={91},
  pages={1-30}
}
We consider the Bayesian approach to the linear Gaussian inference problem of inferring the initial condition of a linear dynamical system from noisy output measurements taken after the initial time. In practical applications, the large dimension of the dynamical system state poses a computational obstacle to computing the exact posterior distribution. Model reduction offers a variety of computational tools that seek to reduce this computational burden. In particular, balanced truncation is a… 
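As a rough illustration of the linear-Gaussian setup described in the abstract, the sketch below computes the exact posterior for the initial condition of a small discrete-time linear system. The dynamics, prior, and noise covariances are illustrative assumptions, not taken from the paper; the point is that forming the posterior requires dense linear algebra in the full state dimension, which is the computational obstacle that model reduction is meant to ease.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's): x_{k+1} = A x_k,
# noisy outputs y_k = C x_k + eps_k at times k = 1..m,
# prior x0 ~ N(0, Gamma_pr), noise eps_k ~ N(0, Gamma_obs).

rng = np.random.default_rng(0)
d, p, m = 50, 3, 10                      # state dim, output dim, number of measurements

A = np.diag(np.linspace(0.5, 0.95, d))   # stable discrete-time dynamics (illustrative)
C = rng.standard_normal((p, d))
Gamma_pr = np.eye(d)                     # prior covariance
Gamma_obs = 0.01 * np.eye(p)             # observation-noise covariance

# Stack the forward maps G_k = C A^k into one observation operator G
G = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(1, m + 1)])
Gamma_obs_full = np.kron(np.eye(m), Gamma_obs)

# Simulate data from a "true" initial condition
x0_true = rng.standard_normal(d)
y = G @ x0_true + rng.multivariate_normal(np.zeros(m * p), Gamma_obs_full)

# Exact Gaussian posterior: Gamma_pos = (G^T R^{-1} G + Gamma_pr^{-1})^{-1},
# mu_pos = Gamma_pos G^T R^{-1} y   (zero prior mean)
Rinv_G = np.linalg.solve(Gamma_obs_full, G)
Gamma_pos = np.linalg.inv(G.T @ Rinv_G + np.linalg.inv(Gamma_pr))
mu_pos = Gamma_pos @ (G.T @ np.linalg.solve(Gamma_obs_full, y))

print("posterior mean error:", np.linalg.norm(mu_pos - x0_true))
```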

References

Showing 1-10 of 62 references

Likelihood-informed dimension reduction for nonlinear inverse problems

The intrinsic dimensionality of an inverse problem is affected by prior information, the accuracy and number of observations, and the smoothing properties of the forward operator. From a Bayesian …

Balanced Truncation Model Order Reduction For Quadratic-Bilinear Control Systems

This paper proposes algebraic Gramians for quadratic-bilinear (QB) systems based on the underlying Volterra series representation of QB systems and their Hilbert adjoint systems, and investigates the Lyapunov stability of the reduced-order systems.

Goal-Oriented Optimal Approximations of Bayesian Linear Inverse Problems

Optimal dimensionality reduction techniques, suitable for large-scale applications, are proposed for the solution of goal-oriented linear-Gaussian inverse problems in which the quantity of interest (QoI) is a function of the inversion parameters.

Balanced model order reduction for linear random dynamical systems driven by Lévy noise

This work considers balanced truncation (BT) and singular perturbation approximation (SPA) when the control is replaced by a noise term, derives error bounds for both BT and SPA, and provides numerical results for a specific example that support the theory.

The Bayesian Approach to Inverse Problems

These lecture notes highlight the mathematical and computational structure relating to the formulation of, and development of algorithms for, the Bayesian approach to inverse problems in …

Principal component analysis in linear systems: Controllability, observability, and model reduction

Kalman's minimal realization theory involves geometric objects (controllable, unobservable subspaces) which are subject to structural instability. Specifically, arbitrarily small perturbations in a …

Sampling of Bayesian posteriors with a non-Gaussian probabilistic learning on manifolds from a small dataset

A novel methodology based on manifold learning and manifold sampling is proposed for solving this computational statistics problem under the following assumptions: neither the prior model nor the likelihood function is Gaussian, and neither can be approximated by a Gaussian measure.

Model Reduction for Fluids, Using Balanced Proper Orthogonal Decomposition

C. Rowley, Int. J. Bifurc. Chaos, 2005
The method presented here is a variation of existing methods using empirical Gramians that allows one to compute balancing transformations directly, without separate reduction of the Gramians, and has computational cost similar to that of POD.
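For readers unfamiliar with balancing, the sketch below shows a standard square-root balanced truncation of a small random LTI system via its controllability and observability Gramians. It is not the empirical balanced-POD variant described in the cited paper; all matrices and dimensions are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, svd

# Sketch of square-root balanced truncation for a small stable LTI system
# x' = A x + B u, y = C x. Matrices below are random placeholders.

rng = np.random.default_rng(1)
n, m, p, r = 20, 2, 2, 4                 # state, input, output, reduced dims

A = rng.standard_normal((n, n))
A = A - (np.abs(np.linalg.eigvals(A).real).max() + 1.0) * np.eye(n)  # shift to make A stable
B = rng.standard_normal((n, m))
C = rng.standard_normal((p, n))

# Controllability and observability Gramians:
#   A P + P A^T + B B^T = 0,   A^T Q + Q A + C^T C = 0
P = solve_continuous_lyapunov(A, -B @ B.T)
Q = solve_continuous_lyapunov(A.T, -C.T @ C)

# Square-root method: factor P = U U^T, Q = L L^T, then the SVD of L^T U gives
# the Hankel singular values and the balancing transformation directly.
U = np.linalg.cholesky(P)
L = np.linalg.cholesky(Q)
Z, s, Yt = svd(L.T @ U)

S_inv_sqrt = np.diag(s[:r] ** -0.5)
T = U @ Yt.T[:, :r] @ S_inv_sqrt         # right projection basis
W = L @ Z[:, :r] @ S_inv_sqrt            # left projection basis, W^T T = I_r

A_r, B_r, C_r = W.T @ A @ T, W.T @ B, C @ T
print("Hankel singular values:", s[:r])
```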

Optimal Low-rank Approximations of Bayesian Linear Inverse Problems

Two fast approximations of the posterior mean are proposed and proved optimal with respect to a weighted Bayes risk under squared-error loss; the approximations are defined in terms of the Hessian of the negative log-likelihood and the prior precision.
...