Encoding prior knowledge in the structure of the likelihood
@article{Knollmller2018EncodingPK,
  title   = {Encoding prior knowledge in the structure of the likelihood},
  author  = {Jakob Knollm{\"u}ller and Torsten A. Ensslin},
  journal = {ArXiv},
  year    = {2018},
  volume  = {abs/1812.04403}
}
The inference of deep hierarchical models is problematic due to strong dependencies between the hierarchies. We investigate a specific transformation of the model parameters based on the multivariate distributional transform. This transformation is a special form of the reparametrization trick; it flattens the hierarchy and leads to a standard Gaussian prior on all resulting parameters. The transformation also transfers all the prior information into the structure of the likelihood, hereby…
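The core mechanism can be illustrated on a toy two-level hierarchy. The following is a minimal NumPy sketch, not code from the paper: a log-normal hyperprior on the signal standard deviation is assumed, and both latent parameters are reparametrized to carry standard Gaussian priors, so that the entire hierarchical structure appears only in the deterministic map that the likelihood evaluates.

```python
# Toy illustration (not from the paper's code) of flattening a hierarchy:
#   tau ~ LogNormal(0, 1)         unknown prior standard deviation
#   x   ~ Normal(0, tau^2)        latent signal
#   d   ~ Normal(x, sigma_n^2)    noisy measurement
# After the transform, both parameters xi = (xi_tau, xi_x) have standard
# Gaussian priors and the prior structure lives in the map xi -> x.
import numpy as np

rng = np.random.default_rng(0)
sigma_n = 0.1   # assumed noise level
d = 1.3         # assumed observed datum

def signal(xi):
    """Map standard Gaussian parameters xi = (xi_tau, xi_x) to the signal x."""
    xi_tau, xi_x = xi
    tau = np.exp(xi_tau)   # inverse transform of the log-normal hyperprior
    return tau * xi_x      # inverse transform of Normal(0, tau^2) given tau

def neg_log_posterior(xi):
    prior = 0.5 * np.sum(xi ** 2)                         # standard Gaussian prior
    likelihood = 0.5 * ((d - signal(xi)) / sigma_n) ** 2  # structured likelihood
    return prior + likelihood

# Prior samples are now trivial: draw white Gaussian xi and push it through.
xi = rng.standard_normal(2)
print("prior sample of x:", signal(xi))
print("negative log-posterior:", neg_log_posterior(xi))
```

Any gradient-based or variational scheme can then operate on the decoupled, standard Gaussian parameters xi instead of the strongly correlated hierarchical parameters.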
9 Citations
Probabilistic Autoencoder Using Fisher Information
- Computer Science · Entropy
- 2021
In this work, an extension of the autoencoder architecture, the FisherNet, is introduced; it is advantageous from a theoretical point of view, as it provides uncertainty quantification derived directly from the model and also accounts for uncertainty cross-correlations.
Metric Gaussian Variational Inference
- Computer Science · ArXiv
- 2019
The proposed Metric Gaussian Variational Inference (MGVI) is an iterative method that performs a series of Gaussian approximations to the posterior and achieves linear scaling by never storing the covariance explicitly.
Toward Bayesian Data Compression
- Computer Science · Annalen der Physik
- 2021
A Bayesian data compression algorithm that adapts to the specific measurement situation is derived in the context of signal reconstruction, and the applicability of BDC is demonstrated by applying it to synthetic data and radio astronomical data.
Bayesian parameter estimation of miss-specified models
- Computer Science
- 2018
This method relies on the simultaneous analysis of many data sets in order to overcome the problems caused by the degeneracy between model parameters and model error.
Charting nearby dust clouds using Gaia data only (Corrigendum)
- Physics · Astronomy & Astrophysics
- 2019
Aims. Highly resolved maps of the local Galactic dust are an important ingredient for sky emission models. Over almost the whole electromagnetic spectrum one can see imprints of dust, many of which…
Non-parametric Bayesian Causal Modeling of the SARS-CoV-2 Viral Load Distribution vs. Patient's Age
- Medicine
- 2021
A method to analyze viral-load distribution data as a function of the patients' age within a flexible, non-parametric, hierarchical, Bayesian, and causal model is developed and made freely available.
Information Field Theory and Artificial Intelligence
- Computer Science · Entropy
- 2022
This paper reformulates the process of inference in IFT in terms of GNN training, suggesting that IFT is well suited to address many problems in AI and ML research and application.
Studying Bioluminescence Flashes with the ANTARES Deep Sea Neutrino Telescope
- Physics
- 2021
A statistical method is proposed that allows for the reconstruction of the light emission of individual organisms, as well as their location and movement, and the first precise localizations of bioluminescent organisms using neutrino telescope data are revealed.
Unified radio interferometric calibration and imaging with joint uncertainty quantification
- Computer Science · Astronomy & Astrophysics
- 2019
An algorithm is presented that unifies cross-calibration, self-calibration, and imaging in the language of information field theory and uses Metric Gaussian Variational Inference as the underlying statistical method.
References
Showing 1-10 of 31 references
Auto-Encoding Variational Bayes
- Computer Science · ICLR
- 2014
A stochastic variational inference and learning algorithm is introduced that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case; its reparametrization trick is sketched below.
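The reparametrization trick introduced in this reference can be sketched in a few lines; the NumPy snippet below is illustrative only, with mu and log_var standing in for the outputs of an encoder network. The latent sample is written as a deterministic function of the variational parameters and an independent standard Gaussian draw, so gradients can pass through the sampling step.

```python
# Illustrative sketch of the reparametrization trick; mu and log_var are
# assumed fixed arrays here, in place of encoder-network outputs.
import numpy as np

rng = np.random.default_rng(1)

def sample_latent(mu, log_var):
    eps = rng.standard_normal(mu.shape)      # randomness isolated in eps ~ N(0, I)
    return mu + np.exp(0.5 * log_var) * eps  # z = mu + sigma * eps

mu = np.zeros(4)
log_var = np.full(4, -1.0)
print(sample_latent(mu, log_var))
```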
Variational Inference with Normalizing Flows
- Computer Science, Mathematics · ICML
- 2015
It is demonstrated that the theoretical advantage of posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provides a clear improvement in the performance and applicability of variational inference.
Variational Inference: A Review for Statisticians
- Computer Science · ArXiv
- 2016
Variational inference (VI), a method from machine learning that approximates probability densities through optimization, is reviewed and a variant that uses stochastic optimization to scale up to massive data is derived.
Density Estimation by Dual Ascent of the Log-Likelihood
- Computer Science
- 2010
A methodology is developed to assign, from an observed sample, a joint-probability distribution to a set of continuous variables, by mapping the original variables onto a jointly-Gaussian set.
Automatic Differentiation Variational Inference
- Computer Science · J. Mach. Learn. Res.
- 2017
Automatic differentiation variational inference (ADVI) is developed, where the scientist only provides a probabilistic model and a dataset, nothing else, and the algorithm automatically derives an efficient variational inference algorithm, freeing the scientist to refine and explore many models.
Bayesian reconstruction of the cosmological large-scale structure: methodology, inverse algorithms and numerical optimization
- Mathematics
- 2008
We address the inverse problem of cosmic large-scale structure reconstruction from a Bayesian perspective. For a linear data model, a number of known and novel reconstruction schemes, which differ in…
The Variational Gaussian Approximation Revisited
- Computer Science · Neural Computation
- 2009
The relationship between the Laplace and the variational approximation is discussed, and it is shown that for models with Gaussian priors and factorizing likelihoods, the number of variational parameters scales only linearly with the number of data points.
Gaussian Processes for Regression
- Computer Science, Mathematics · NIPS
- 1995
This paper investigates the use of Gaussian process priors over functions, which permit the predictive Bayesian analysis for fixed values of the hyperparameters to be carried out exactly using matrix operations; a minimal sketch follows below.
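To make the exact analysis via matrix operations concrete, here is a minimal Gaussian process regression sketch in NumPy; the RBF kernel, length scale, and noise level are illustrative assumptions, not taken from the referenced paper.

```python
# Minimal GP regression sketch: with a fixed kernel and noise level, the
# predictive mean and variance follow from plain matrix operations.
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

rng = np.random.default_rng(2)
x = np.linspace(0.0, 5.0, 20)                   # training inputs
y = np.sin(x) + 0.1 * rng.standard_normal(20)   # noisy observations
xs = np.linspace(0.0, 5.0, 100)                 # test inputs

K = rbf(x, x) + 0.1 ** 2 * np.eye(len(x))       # kernel matrix plus noise variance
Ks = rbf(xs, x)

mean = Ks @ np.linalg.solve(K, y)                    # predictive mean
cov = rbf(xs, xs) - Ks @ np.linalg.solve(K, Ks.T)    # predictive covariance
std = np.sqrt(np.diag(cov))
print(mean[:3], std[:3])
```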
Fixed-Form Variational Posterior Approximation through Stochastic Linear Regression
- Computer Science · ArXiv
- 2012
A general algorithm is presented for approximating nonstandard Bayesian posterior distributions by minimizing the Kullback-Leibler divergence of an approximating distribution to the intractable posterior distribution.
Reconstruction of signals with unknown spectra in information field theory with parameter uncertainty
- Physics · ArXiv
- 2010
The general problem of signal inference in the presence of unknown parameters within the framework of information field theory is formulated and a generic parameter-uncertainty renormalized estimation (PURE) technique is developed.