Corpus ID: 173990434

Bayesian Deconditional Kernel Mean Embeddings

@article{Hsu2019BayesianDK,
  title={Bayesian Deconditional Kernel Mean Embeddings},
  author={Kelvin Hsu and Fabio Tozeto Ramos},
  journal={ArXiv},
  year={2019},
  volume={abs/1906.00199}
}
Conditional kernel mean embeddings form an attractive nonparametric framework for representing conditional means of functions, describing the observation processes for many complex models. However, the recovery of the original underlying function of interest whose conditional mean was observed is a challenging inference task. We formalize deconditional kernel mean embeddings as a solution to this inverse problem, and show that it can be naturally viewed as a nonparametric Bayes' rule…
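
As a hedged sketch of the inverse problem (notation ours, not necessarily the paper's): let f be the latent function of interest, and suppose we only observe its conditional means

    g(x) = \mathbb{E}[f(Y) \mid X = x] = \langle f, \mu_{Y \mid X = x} \rangle_{\mathcal{H}_\ell},

where \mu_{Y \mid X = x} is the conditional mean embedding of P(Y \mid X = x) in the RKHS \mathcal{H}_\ell. Deconditioning is then the inverse problem of recovering an f consistent with the observed g, i.e., inverting the conditional expectation operator f \mapsto g.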
Citations

Dual IV: A Single Stage Instrumental Variable Regression
TLDR
A novel single-stage procedure for instrumental variable (IV) regression called DualIV, which simplifies traditional two-stage regression via a dual formulation and is closely related to the generalized method of moments (GMM) under specific assumptions.
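
For orientation, here is a minimal NumPy sketch of the classical linear two-stage least squares (2SLS) baseline that DualIV collapses into a single stage; the data-generating process and variable names are illustrative assumptions, not from the paper.

    import numpy as np

    # Illustrative two-stage least squares (2SLS): the classical baseline
    # that DualIV replaces with a single saddle-point objective. Data and
    # names are our own assumptions, not the paper's.
    rng = np.random.default_rng(0)
    n = 1000
    z = rng.normal(size=(n, 1))                      # instrument
    u = rng.normal(size=(n, 1))                      # unobserved confounder
    x = z + u + 0.1 * rng.normal(size=(n, 1))        # endogenous regressor
    y = 2.0 * x + u + 0.1 * rng.normal(size=(n, 1))  # outcome; true slope is 2

    Z = np.hstack([np.ones((n, 1)), z])
    X = np.hstack([np.ones((n, 1)), x])

    # Stage 1: project the endogenous regressors onto the instrument space.
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    # Stage 2: regress the outcome on the projected regressors.
    beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]
    print(beta[1])  # close to 2, whereas naive OLS of y on x is biased upward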
Deconditional Downscaling with Gaussian Processes
TLDR
This work introduces the conditional mean process (CMP), a new class of Gaussian processes describing conditional means, and demonstrates its proficiency on a synthetic and a real-world atmospheric field downscaling problem, showing substantial improvements over existing methods.
Dual Instrumental Variable Regression
TLDR
Inspired by problems in stochastic programming, it is shown that the two-stage procedure for nonlinear IV regression can be reformulated as a convex-concave saddle-point problem.

References

SHOWING 1-10 OF 23 REFERENCES
Bayesian Learning of Kernel Embeddings
TLDR
A new probabilistic model for kernel mean embeddings is proposed, the Bayesian Kernel Embedding model, combining a Gaussian process prior over the reproducing kernel Hilbert space containing the mean embedding with a conjugate likelihood function, thus yielding a closed-form posterior over the mean embedding.
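
To indicate the shape of the conjugacy at work (generic Gaussian process regression form, not the paper's exact operators): with a GP prior f \sim \mathcal{GP}(0, k) and Gaussian observations y_i = f(x_i) + \varepsilon_i, the posterior at a test point x is available in closed form,

    m(x) = k_x^\top (K + \sigma^2 I)^{-1} y, \qquad v(x) = k(x, x) - k_x^\top (K + \sigma^2 I)^{-1} k_x.

The Bayesian Kernel Embedding model obtains an analogous closed-form posterior, but over the mean embedding itself rather than over a regression function.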
Bayesian Learning of Conditional Kernel Mean Embeddings for Automatic Likelihood-Free Inference
TLDR
KELFI is presented, a holistic framework that automatically learns model hyperparameters to improve inference accuracy given a limited simulation budget, and demonstrates improved accuracy and efficiency on challenging inference problems in ecology.
Conditional mean embeddings as regressors
TLDR
This work demonstrates an equivalence between reproducing kernel Hilbert space (RKHS) embeddings of conditional distributions and vector-valued regressors, derives a sparse version of the embedding by considering alternative formulations, and establishes minimax convergence rates that are valid under milder and more intuitive assumptions.
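
The regression view admits a compact sketch (standard formulation, notation ours): the empirical conditional mean embedding minimizes a vector-valued ridge objective,

    \hat{\mu} = \arg\min_{\mu} \sum_{i=1}^{n} \| \ell(y_i, \cdot) - \mu(x_i) \|_{\mathcal{H}_\ell}^2 + \lambda \| \mu \|^2,

so sparse approximations and minimax rates can be inherited from vector-valued regression theory.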
K2-ABC: Approximate Bayesian Computation with Kernel Embeddings
TLDR
This paper proposes a fully nonparametric ABC paradigm which circumvents the need for manually selecting summary statistics, and uses maximum mean discrepancy (MMD) as a dissimilarity measure between the distributions over observed and simulated data.
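
As a concrete sketch of the dissimilarity involved, a biased empirical MMD^2 estimator with a Gaussian kernel fits in a few lines of NumPy; this is an illustrative implementation under our own naming, not the authors' code.

    import numpy as np

    def mmd2_biased(x, y, gamma=1.0):
        """Biased empirical MMD^2 between samples x and y under a Gaussian
        kernel k(a, b) = exp(-gamma * ||a - b||^2). An illustrative sketch of
        the dissimilarity used by K2-ABC, not the authors' code."""
        def gram(a, b):
            sq_dists = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
            return np.exp(-gamma * sq_dists)
        return gram(x, x).mean() + gram(y, y).mean() - 2.0 * gram(x, y).mean()

    # Toy check: near zero for samples from the same distribution, larger otherwise.
    rng = np.random.default_rng(0)
    a = rng.normal(size=(200, 2))
    b = rng.normal(loc=1.0, size=(200, 2))
    print(mmd2_biased(a, a[::-1]), mmd2_biased(a, b))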
Kernel Bayes' rule: Bayesian inference with positive definite kernels
TLDR
A kernel method for realizing Bayes' rule is proposed, based on representations of probabilities in reproducing kernel Hilbert spaces, with applications including likelihood-free Bayesian computation and filtering with a nonparametric state-space model.
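
At the population level, the rule can be sketched as follows (notation ours, details hedged): writing C^{\pi}_{XY} and C^{\pi}_{YY} for covariance operators taken under the joint distribution induced by the prior \pi, the posterior embedding of X given Y = y takes the form

    \mu_{X \mid y} = C^{\pi}_{XY} \left( C^{\pi}_{YY} \right)^{-1} \ell(y, \cdot),

and the paper's contribution includes regularized, finite-sample estimators that make these operator inverses well-posed.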
Kernel Mean Embedding of Distributions: A Review and Beyond
TLDR
A comprehensive review of existing work and recent advances in the Hilbert space embedding of distributions, together with a discussion of the most challenging issues and open problems that could lead to new research directions.
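
For reference, the central object of the review, in standard notation: the kernel mean embedding of a distribution P under a kernel k is

    \mu_P = \mathbb{E}_{X \sim P}[k(X, \cdot)] = \int k(x, \cdot) \, \mathrm{d}P(x),

and for a characteristic kernel the map P \mapsto \mu_P is injective, so that \mathrm{MMD}(P, Q) = \| \mu_P - \mu_Q \|_{\mathcal{H}} is a proper metric on distributions.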
Hilbert space embeddings of conditional distributions with applications to dynamical systems
TLDR
This paper derives a kernel estimate for the conditional embedding, shows its connection to ordinary embeddings, and develops a nonparametric method for modeling dynamical systems in which the belief state of the system is maintained as a conditional embedding.
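
The kernel estimate in question takes the standard form (notation ours): \hat{C}_{Y|X} = \hat{C}_{YX} (\hat{C}_{XX} + \lambda I)^{-1}, which at a conditioning point x yields

    \hat{\mu}_{Y \mid X = x} = \sum_{i=1}^{m} \beta_i(x) \, \ell(y_i, \cdot), \qquad \beta(x) = (K + m \lambda I)^{-1} k_x,

with K_{ij} = k(x_i, x_j) and (k_x)_i = k(x_i, x): a weighted combination of feature maps of the observed y_i, with weights that depend on x.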
Variational Inference: A Review for Statisticians
TLDR
Variational inference (VI), a method from machine learning that approximates probability densities through optimization, is reviewed, and a variant that uses stochastic optimization to scale up to massive data is derived.
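
The optimization at the heart of VI is maximization of the evidence lower bound (ELBO), in its standard form

    \log p(x) \geq \mathbb{E}_{q(z)}[\log p(x, z) - \log q(z)] =: \mathrm{ELBO}(q),

over a tractable family of densities q(z); maximizing the ELBO is equivalent to minimizing \mathrm{KL}(q(z) \,\|\, p(z \mid x)).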
Dimensionality Reduction for Supervised Learning with Reproducing Kernel Hilbert Spaces
TLDR
This work treats the problem of dimensionality reduction as that of finding a low-dimensional “effective subspace” of X which retains the statistical relationship between X and Y, and establishes a general nonparametric characterization of conditional independence using covariance operators on a reproducing kernel Hilbert space.
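
The characterization can be sketched as follows (our paraphrase of the standard result, stated informally): for a candidate projection U = B^\top X, the conditional covariance operators on the RKHS satisfy

    \Sigma_{YY \mid U} \succeq \Sigma_{YY \mid X},

with equality if and only if Y is conditionally independent of X given U; the effective subspace is then found by minimizing a trace criterion of \Sigma_{YY \mid U} over B.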
Hierarchical Implicit Models and Likelihood-Free Variational Inference
TLDR
Hierarchical implicit models (HIMs), which combine the idea of implicit densities with hierarchical Bayesian modeling, are introduced, thereby defining models via simulators of data with rich hidden structure; likelihood-free variational inference (LFVI), a scalable variational inference algorithm for HIMs, is also developed.