Computing functions of random variables via reproducing kernel Hilbert space representations

@article{scholkopf_computing_functions,
  title={Computing functions of random variables via reproducing kernel Hilbert space representations},
  author={Bernhard Sch{\"o}lkopf and Krikamol Muandet and Kenji Fukumizu and Stefan Harmeling and J. Peters},
  journal={Statistics and Computing},
}
We describe a method to perform functional operations on probability distributions of random variables. The method uses reproducing kernel Hilbert space representations of probability distributions, and it is applicable to all operations which can be applied to points drawn from the respective distributions. We refer to our approach as kernel probabilistic programming. We illustrate it on synthetic data and show how it can be used for nonparametric structural equation models, with an… 
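The core idea, applying an operation f directly to the expansion points of an empirical kernel mean embedding while reusing the sample weights, can be sketched as follows. This is a minimal illustration, not the paper's implementation; the Gaussian kernel, function names, and parameter values are all illustrative choices:

```python
import numpy as np

def gaussian_kernel(x, t, sigma=1.0):
    """Gaussian RBF kernel between scalar points (illustrative kernel choice)."""
    return np.exp(-(x - t) ** 2 / (2 * sigma ** 2))

def mean_embedding(points, sigma=1.0):
    """Empirical kernel mean embedding: mu(t) = (1/n) * sum_i k(x_i, t)."""
    def mu(t):
        return float(np.mean(gaussian_kernel(points, t, sigma)))
    return mu

# Samples from X ~ N(0, 1); an estimate of the embedding of f(X) = X^2
# is obtained by applying f pointwise to the samples, keeping the
# uniform weights unchanged.
rng = np.random.default_rng(0)
x_samples = rng.normal(0.0, 1.0, size=500)
f = lambda x: x ** 2

mu_fX = mean_embedding(f(x_samples))

# Evaluating the embedding at t approximates E[k(f(X), t)].
value = mu_fX(1.0)
```

The same recipe applies to any operation that can act on individual sample points, which is what makes the approach compositional.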
Continuum versus discrete networks, graph Laplacians, and reproducing kernel Hilbert spaces
A Measure-Theoretic Approach to Kernel Conditional Mean Embeddings
A new operator-free, measure-theoretic definition of the conditional mean embedding as a random variable taking values in a reproducing kernel Hilbert space is presented, together with a thorough analysis of its properties, including universal consistency.
Consistent Kernel Mean Estimation for Functions of Random Variables
It is shown that for any continuous function f, consistent estimators of the mean embedding of a random variable X lead to consistent estimators of the mean embedding of f(X); for Matérn kernels and sufficiently smooth functions, rates of convergence are provided.
Reproducing kernel Hilbert space semantics for probabilistic programs
Denotational semantics for a language of probabilistic arithmetic expressions is proposed; it is shown how to derive equivalent semantics based on reproducing kernel Hilbert spaces (RKHS) and how to compute them approximately, potentially with convergence guarantees.
Dimensionality Reduction of Complex Metastable Systems via Kernel Embeddings of Transition Manifolds
A novel kernel-based machine learning algorithm for identifying the low-dimensional geometry of the effective dynamics of high-dimensional multiscale stochastic systems is presented by embedding and learning this transition manifold in a reproducing kernel Hilbert space, exploiting the favorable properties of kernel embeddings.
Kernel Mean Embedding of Distributions: A Review and Beyond
A comprehensive review of existing work and recent advances in Hilbert space embeddings of distributions, with a discussion of the most challenging issues and open problems that could lead to new research directions.
A Kernel Mean Embedding Approach to Reducing Conservativeness in Stochastic Programming and Control
The reduced-set expansion method is used to discard sampled scenarios; by solving a distributional-distance-regularized optimization problem, such constraint removal yields improved optimality and decreased conservativeness.
Solving Chance-Constrained Optimization Under Nonparametric Uncertainty Through Hilbert Space Embedding
This article interprets chance-constrained optimization, via the notion of scenario approximation, as minimizing the distance between a desired distribution and the distribution of the constraint functions in a reproducing kernel Hilbert space, and provides a systematic way of constructing the desired distribution.
Variational Hilbert regression for terrain modeling and trajectory optimization
A novel regression methodology for terrain modeling is introduced that can approximate arbitrarily complex functions through a series of simple kernel calculations, using variational Bayesian inference.


A Hilbert Space Embedding for Distributions
We describe a technique for comparing distributions without the need for density estimation as an intermediate step. Our approach relies on mapping the distributions into a reproducing kernel Hilbert space…
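The distance this comparison yields is what is commonly known as the maximum mean discrepancy (MMD): the RKHS distance between the two mean embeddings. A minimal empirical sketch, assuming one-dimensional samples and a Gaussian kernel (both illustrative choices, not prescribed by the paper):

```python
import numpy as np

def mmd2_biased(X, Y, sigma=1.0):
    """Biased empirical squared MMD between 1-d samples X and Y
    under a Gaussian kernel: ||mu_X - mu_Y||^2 in the RKHS."""
    def gram(A, B):
        d2 = (A[:, None] - B[None, :]) ** 2
        return np.exp(-d2 / (2 * sigma ** 2))
    return gram(X, X).mean() + gram(Y, Y).mean() - 2 * gram(X, Y).mean()

rng = np.random.default_rng(0)
same = mmd2_biased(rng.normal(0, 1, 200), rng.normal(0, 1, 200))
diff = mmd2_biased(rng.normal(0, 1, 200), rng.normal(3, 1, 200))
```

For samples from the same distribution the statistic is close to zero, while shifted distributions give a clearly larger value; note that no density estimate is ever formed.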
Kernel Measures of Conditional Dependence
A new measure of conditional dependence of random variables, based on normalized cross-covariance operators on reproducing kernel Hilbert spaces, which has a straightforward empirical estimate with good convergence behaviour.
Hilbert space embeddings of conditional distributions with applications to dynamical systems
This paper derives a kernel estimate for the conditional embedding, shows its connection to ordinary embeddings, and develops a nonparametric method for modeling dynamical systems in which the belief state of the system is maintained as a conditional embedding.
Recovering Distributions from Gaussian RKHS Embeddings
This paper theoretically analyzes the properties of a consistent kernel mean estimator, represented as a weighted sum of feature vectors, and proves that the weighted average of a function in a Besov space, with weights and samples given by the kernel mean estimator, converges to the expectation of the function.
Probability Distributions of Algebraic Functions of Independent Random Variables
Fundamental methods are developed for the derivation of the probability density function and moments of rational algebraic functions of independent random variables. Laplace and Mellin integral…
Injective Hilbert Space Embeddings of Probability Measures
This work considers more broadly the problem of specifying characteristic kernels, defined as kernels for which the RKHS embedding of probability measures is injective, restricting attention to translation-invariant kernels on Euclidean space.
Kernel Mean Estimation and Stein Effect
Focusing on a subset of kernel mean estimators, this work proposes efficient shrinkage estimators that improve upon the standard empirical estimator by exploiting a well-known phenomenon in statistics called Stein's phenomenon.
Learning from Distributions via Support Measure Machines
A kernel-based discriminative learning framework on probability measures is presented that learns from a collection of probability distributions constructed to meaningfully represent the training data; a flexible SVM (Flex-SVM) is proposed that places a different kernel function on each training example.
Kernel Bayes' rule: Bayesian inference with positive definite kernels
A kernel method for realizing Bayes' rule is proposed, based on representations of probabilities in reproducing kernel Hilbert spaces, with applications including likelihood-free Bayesian computation and filtering with a nonparametric state-space model.
Random Features for Large-Scale Kernel Machines
Two sets of random features are explored, convergence bounds on their ability to approximate various radial basis kernels are provided, and it is shown that in large-scale classification and regression tasks, linear machine learning algorithms applied to these features outperform state-of-the-art large-scale kernel machines.
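The random Fourier feature construction can be sketched in a few lines: draw frequencies from the Fourier transform of the kernel and random phases, so that the inner product of the feature maps approximates the kernel. This is a minimal illustration for the Gaussian kernel; the function name and parameters are illustrative:

```python
import numpy as np

def random_fourier_features(X, n_features=5000, sigma=1.0, seed=0):
    """Map data X of shape (n, d) to random Fourier features z(X) such that
    z(x) . z(y) approximates the Gaussian kernel exp(-||x - y||^2 / (2 sigma^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies from the kernel's spectral density, phases uniform on [0, 2*pi).
    W = rng.normal(0.0, 1.0 / sigma, size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.normal(size=(2, 3))
Z = random_fourier_features(X)
approx = Z[0] @ Z[1]                                   # feature-space inner product
exact = np.exp(-np.sum((X[0] - X[1]) ** 2) / 2.0)      # exact Gaussian kernel
```

Once the data are mapped, any linear learner (e.g. linear regression or a linear SVM) on the features behaves approximately like its kernelized counterpart, at a fraction of the cost.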