Corpus ID: 238419303

Neural Estimation of Statistical Divergences

@inproceedings{Sreekumar2021NeuralEO,
  title={Neural Estimation of Statistical Divergences},
  author={Sreejith Sreekumar and Ziv Goldfeld},
  year={2021}
}
Statistical divergences (SDs), which quantify the dissimilarity between probability distributions, are a basic constituent of statistical inference and machine learning. A modern method for estimating those divergences relies on parametrizing an empirical variational form by a neural network (NN) and optimizing over parameter space. Such neural estimators are abundantly used in practice, but corresponding performance guarantees are partial and call for further exploration. We establish non… 
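The variational approach the abstract refers to can be made concrete for the KL divergence via the Donsker-Varadhan form, D_KL(P||Q) = sup_f E_P[f(X)] - log E_Q[e^{f(Y)}], with the supremum restricted to a neural network class and approximated by gradient ascent over the parameters. The Python sketch below is a minimal illustration under assumed choices (a one-hidden-layer ReLU network, Adam, toy Gaussian samples); it is not the paper's exact construction or analysis setting.

import torch
import torch.nn as nn

class ShallowNet(nn.Module):
    """One-hidden-layer ReLU network playing the role of the variational function f."""
    def __init__(self, dim, width=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, width), nn.ReLU(), nn.Linear(width, 1))

    def forward(self, x):
        return self.net(x).squeeze(-1)

def neural_kl_estimate(x_p, x_q, epochs=500, lr=1e-3):
    """Donsker-Varadhan neural estimate of D_KL(P||Q) from samples x_p ~ P, x_q ~ Q."""
    f = ShallowNet(x_p.shape[1])
    opt = torch.optim.Adam(f.parameters(), lr=lr)
    log_n = torch.log(torch.tensor(float(x_q.shape[0])))
    for _ in range(epochs):
        opt.zero_grad()
        # Empirical DV objective: mean of f over P-samples minus log of the empirical mean of exp(f) over Q-samples
        dv = f(x_p).mean() - (torch.logsumexp(f(x_q), dim=0) - log_n)
        (-dv).backward()  # gradient ascent on the DV objective
        opt.step()
    with torch.no_grad():
        return (f(x_p).mean() - (torch.logsumexp(f(x_q), dim=0) - log_n)).item()

# Toy usage (assumed distributions): P = N(0, I_2), Q = N((1,1), I_2), true KL = 1.0
x_p = torch.randn(2000, 2)
x_q = torch.randn(2000, 2) + 1.0
print(neural_kl_estimate(x_p, x_q))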
Neural Estimation and Optimization of Directed Information over Continuous Spaces
This work develops a new method for estimating and optimizing the directed information rate between two jointly stationary and ergodic stochastic processes, using a recurrent neural network (RNN)-based estimator that is optimized via gradient ascent over the RNN parameters.
On the Properties of Kullback-Leibler Divergence Between Multivariate Gaussian Distributions
Kullback-Leibler (KL) divergence is one of the most important divergence measures between probability distributions. In this paper, we investigate the properties of KL divergence between multivariate Gaussian distributions…
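For reference, the KL divergence between d-dimensional Gaussians N(μ0, Σ0) and N(μ1, Σ1) admits a standard closed form; this is a textbook identity stated here for context, not a result quoted from the cited paper:

\[
D_{\mathrm{KL}}\!\left(\mathcal{N}(\mu_0,\Sigma_0)\,\middle\|\,\mathcal{N}(\mu_1,\Sigma_1)\right)
= \frac{1}{2}\left(\operatorname{tr}\!\left(\Sigma_1^{-1}\Sigma_0\right)
+ (\mu_1-\mu_0)^{\top}\Sigma_1^{-1}(\mu_1-\mu_0)
- d + \ln\frac{\det\Sigma_1}{\det\Sigma_0}\right).
\]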