Corpus ID: 54557344

Encoding prior knowledge in the structure of the likelihood

Jakob Knollmüller, Torsten A. Ensslin
The inference of deep hierarchical models is problematic due to strong dependencies between the hierarchies. We investigate a specific transformation of the model parameters based on the multivariate distributional transform. This transformation, a special form of the reparametrization trick, flattens the hierarchy and leads to a standard Gaussian prior on all resulting parameters. The transformation also transfers all prior information into the structure of the likelihood, thereby…
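The reparametrization described in the abstract can be illustrated with a minimal sketch (my own toy example, not code from the paper): a two-level hierarchy is rewritten as a deterministic map applied to independent standard Gaussian variables, so the prior on all remaining parameters is standard normal and the hierarchy's structure moves into the transform (and hence into the likelihood).

```python
import numpy as np

# Toy hierarchy: sigma ~ LogNormal(0, 1), then x ~ N(0, sigma^2).
def hierarchical_sample(rng):
    sigma = rng.lognormal(mean=0.0, sigma=1.0)
    return rng.normal(0.0, sigma)

# Flattened form: xi1, xi2 ~ N(0, 1) with a deterministic transform.
# exp(xi1) is the inverse CDF of LogNormal(0,1) evaluated at Phi(xi1),
# and sigma * xi2 is the inverse CDF of N(0, sigma^2) at Phi(xi2).
def flattened_sample(rng):
    xi1, xi2 = np.random.default_rng().normal(size=2) if False else rng.normal(size=2)
    sigma = np.exp(xi1)
    return sigma * xi2

rng = np.random.default_rng(0)
a = np.array([hierarchical_sample(rng) for _ in range(100_000)])
rng = np.random.default_rng(1)
b = np.array([flattened_sample(rng) for _ in range(100_000)])
# Both parameterizations define the same marginal distribution over x;
# only the second has a standard Gaussian prior on all latent variables.
```

The individual draws differ (different random streams), but the two sample sets share the same distribution, which is the point of the transform.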


Probabilistic Autoencoder Using Fisher Information
In this work, an extension to the autoencoder architecture is introduced, the FisherNet, which is advantageous from a theoretical point of view, as it provides direct uncertainty quantification derived from the model and also accounts for uncertainty cross-correlations.
Metric Gaussian Variational Inference
The proposed Metric Gaussian Variational Inference (MGVI) is an iterative method that performs a series of Gaussian approximations to the posterior, achieving linear scaling by never storing the covariance explicitly.
Toward Bayesian Data Compression
A Bayesian data compression algorithm that adapts to the specific measurement situation is derived in the context of signal reconstruction, and the applicability of BDC is demonstrated by applying it to synthetic data and radio astronomical data.
Bayesian parameter estimation of miss-specified models
This method relies on the usage of many data sets in a simultaneous analysis in order to overcome the problems caused by the degeneracy between model parameters and model error.
Charting nearby dust clouds using Gaia data only (Corrigendum)
Aims. Highly resolved maps of the local Galactic dust are an important ingredient for sky emission models. Over almost the whole electromagnetic spectrum one can see imprints of dust, many of which…
Non-parametric Bayesian Causal Modeling of the SARS-CoV-2 Viral Load Distribution vs. Patient's Age
A method to analyze viral-load distribution data as a function of the patients' age within a flexible, non-parametric, hierarchical, Bayesian, and causal model is developed and made freely available.
Information Field Theory and Artificial Intelligence
This paper reformulates the process of inference in IFT in terms of GNN training, suggesting that IFT is well suited to address many problems in AI and ML research and application.
Studying Bioluminescence Flashes with the ANTARES Deep Sea Neutrino Telescope
A statistical method is proposed that allows for the reconstruction of the light emission of individual organisms, as well as their location and movement, and the first precise localizations of bioluminescent organisms using neutrino telescope data are revealed.
Unified radio interferometric calibration and imaging with joint uncertainty quantification
An algorithm is presented that unifies cross-calibration, self-calibration, and imaging in the language of information field theory and uses Metric Gaussian Variational Inference as the underlying statistical method.


Auto-Encoding Variational Bayes
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
Variational Inference with Normalizing Flows
It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provides a clear improvement in performance and applicability of variational inference.
Variational Inference: A Review for Statisticians
Variational inference (VI), a method from machine learning that approximates probability densities through optimization, is reviewed and a variant that uses stochastic optimization to scale up to massive data is derived.
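The optimization view of VI summarized above can be made concrete with a toy sketch (my own illustration, not taken from the review): fit q(x) = N(m, s²) to an unnormalized target log-density by stochastic gradient ascent on the ELBO, using the reparametrization x = m + s·ε with ε ~ N(0, 1).

```python
import numpy as np

# Assumed toy target posterior: N(2, 0.5^2). Only its score (gradient of
# the log-density) is needed for the reparametrized ELBO gradient.
def grad_log_target(x, mu=2.0, sigma=0.5):
    return -(x - mu) / sigma**2

rng = np.random.default_rng(1)
m, log_s = 0.0, 0.0          # variational parameters of q = N(m, s^2)
lr, n_mc = 0.01, 64          # step size and Monte Carlo sample size
for _ in range(3000):
    eps = rng.normal(size=n_mc)
    x = m + np.exp(log_s) * eps              # reparametrized samples from q
    g = grad_log_target(x)
    m += lr * g.mean()                       # stochastic d(ELBO)/dm
    # entropy term contributes +1 to d(ELBO)/d(log s)
    log_s += lr * ((g * np.exp(log_s) * eps).mean() + 1.0)

# After convergence, m ~ 2.0 and exp(log_s) ~ 0.5: q matches the target.
```

Because the target here is itself Gaussian, the fitted q recovers it exactly in expectation; for non-Gaussian targets the same loop finds the closest Gaussian in KL divergence.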
A methodology is developed to assign, from an observed sample, a joint-probability distribution to a set of continuous variables, by mapping the original variables onto a jointly-Gaussian set.
Automatic Differentiation Variational Inference
Automatic differentiation variational inference (ADVI) is developed, where the scientist provides only a probabilistic model and a dataset, nothing else, and the algorithm automatically derives an efficient variational inference algorithm, freeing the scientist to refine and explore many models.
Bayesian reconstruction of the cosmological large-scale structure: methodology, inverse algorithms and numerical optimization
We address the inverse problem of cosmic large-scale structure reconstruction from a Bayesian perspective. For a linear data model, a number of known and novel reconstruction schemes, which differ in…
The Variational Gaussian Approximation Revisited
The relationship between the Laplace and the variational approximation is discussed, and it is shown that for models with Gaussian priors and factorizing likelihoods, the number of variational parameters is actually 2N.
Gaussian Processes for Regression
This paper investigates the use of Gaussian process priors over functions, which permit the predictive Bayesian analysis for fixed values of hyperparameters to be carried out exactly using matrix operations.
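The "exact analysis using matrix operations" noted above amounts to a few linear solves against the kernel matrix. A minimal sketch (assumed RBF kernel, noise level, and toy data of my own choosing, not from the paper):

```python
import numpy as np

# Assumed squared-exponential (RBF) kernel with length scale ell.
def rbf(a, b, ell=1.0):
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

# Toy data: noisy observations of sin(x) on [0, 5].
x = np.linspace(0, 5, 20)
y = np.sin(x) + 0.1 * np.random.default_rng(0).normal(size=x.size)
xs = np.linspace(0, 5, 50)       # test inputs

noise = 0.1**2
K = rbf(x, x) + noise * np.eye(x.size)   # kernel matrix plus noise
Ks = rbf(xs, x)                          # cross-covariance test/train

alpha = np.linalg.solve(K, y)
mean = Ks @ alpha                                   # predictive mean
cov = rbf(xs, xs) - Ks @ np.linalg.solve(K, Ks.T)   # predictive covariance
```

For fixed hyperparameters (ell, noise) the posterior is exactly Gaussian, so these two matrix expressions give the full predictive distribution with no approximation.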
Fixed-Form Variational Posterior Approximation through Stochastic Linear Regression
A general algorithm for approximating nonstandard Bayesian posterior distributions that minimizes the Kullback-Leibler divergence of an approximating distribution to the intractable posterior distribution.
Reconstruction of signals with unknown spectra in information field theory with parameter uncertainty
The general problem of signal inference in the presence of unknown parameters within the framework of information field theory is formulated and a generic parameter-uncertainty renormalized estimation (PURE) technique is developed.