Publications
Noise-contrastive estimation: A new estimation principle for unnormalized statistical models
TLDR
A new estimation principle is presented: nonlinear logistic regression is performed to discriminate between the observed data and artificially generated noise, with the model log-density function used in the regression nonlinearity, which leads to a consistent (convergent) estimator of the parameters.
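A minimal numpy sketch of this principle on a toy problem (the Gaussian model, noise choice, and learning rate here are illustrative, not from the paper): the logit of the logistic regression is the difference between the model log-density, with a learned log-normalizer, and the known noise log-density.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: observed data from N(2, 1); the model is the unnormalized
# Gaussian exp(-(x - mu)^2 / 2) with a learned log-partition term c.
x = rng.normal(2.0, 1.0, size=5000)       # observed data
y = rng.standard_normal(5000)             # noise from N(0, 1), one noise sample per data point

def log_noise(u):                         # known noise log-density
    return -0.5 * u**2 - 0.5 * np.log(2 * np.pi)

mu, c = 0.0, 0.0                          # parameters: mean and log-normalizer
lr = 0.05
for _ in range(2000):
    gx = (-0.5 * (x - mu)**2 - c) - log_noise(x)   # G(x) = log p_model - log p_noise
    gy = (-0.5 * (y - mu)**2 - c) - log_noise(y)
    sx, sy = 1 / (1 + np.exp(-gx)), 1 / (1 + np.exp(-gy))
    # Ascend the NCE objective: logistic regression with G as the logit.
    dmu = np.mean((1 - sx) * (x - mu)) - np.mean(sy * (y - mu))
    dc  = np.mean((1 - sx) * (-1.0))   - np.mean(sy * (-1.0))
    mu, c = mu + lr * dmu, c + lr * dc

print(mu, c)   # mu -> ~2.0, c -> ~0.5 * log(2*pi) ~ 0.92
```

At convergence, mu recovers the data mean and c recovers the true log-partition 0.5·log(2π), illustrating that the principle estimates the normalizing constant along with the parameters.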
VEEGAN: Reducing Mode Collapse in GANs using Implicit Variational Learning
TLDR
VEEGAN is introduced, featuring a reconstructor network that reverses the action of the generator by mapping from data to noise; it resists mode collapse to a far greater extent than other recent GAN variants and produces more realistic samples.
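A minimal PyTorch sketch of the reconstructor term only (network sizes and names are illustrative; the full VEEGAN objective also trains a discriminator on joint data-code pairs):

```python
import torch
import torch.nn as nn

# F tries to invert G on noise; the penalty ||z - F(G(z))||^2 discourages G
# from collapsing many distinct codes z onto the same output x.
z_dim, x_dim = 8, 2
G = nn.Sequential(nn.Linear(z_dim, 64), nn.ReLU(), nn.Linear(64, x_dim))
F = nn.Sequential(nn.Linear(x_dim, 64), nn.ReLU(), nn.Linear(64, z_dim))
opt = torch.optim.Adam(list(G.parameters()) + list(F.parameters()), lr=1e-3)

for step in range(100):
    z = torch.randn(128, z_dim)
    recon = F(G(z))                    # noise -> data -> noise
    loss = ((z - recon) ** 2).mean()   # reconstruction term only; VEEGAN adds
    opt.zero_grad()                    # an adversarial term on (x, F(x)) vs (G(z), z)
    loss.backward()
    opt.step()
```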
Noise-Contrastive Estimation of Unnormalized Statistical Models, with Applications to Natural Image Statistics
TLDR
The basic idea is to perform nonlinear logistic regression to discriminate between the observed data and some artificially generated noise, and the new method is shown to strike a competitive trade-off in comparison to other estimation methods for unnormalized models.
Approximate Bayesian Computation
Just when you thought it was safe to go back into the water, I’m going to complicate things even further. The Nielsen-Wakeley-Hey [5, 3, 4] approach is very flexible and very powerful, but even it
Bayesian Optimization for Likelihood-Free Inference of Simulator-Based Statistical Models
TLDR
This paper proposes a strategy which combines probabilistic modeling of the discrepancy with optimization to facilitate likelihood-free inference and is shown to accelerate the inference through a reduction in the number of required simulations by several orders of magnitude.
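A rough sketch of the idea using a Gaussian process surrogate for the discrepancy and a lower-confidence-bound rule (the toy Gaussian simulator, grid acquisition, and kernel defaults are my own choices, not the paper's exact algorithm):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)
x_obs = rng.normal(3.0, 1.0, size=200)         # "observed" data, true theta = 3

def discrepancy(theta):                         # distance between simulated and
    x_sim = rng.normal(theta, 1.0, size=200)    # observed summary statistics
    return abs(x_sim.mean() - x_obs.mean())

thetas = list(rng.uniform(-5, 5, size=5))       # small initial design
ds = [discrepancy(t) for t in thetas]
grid = np.linspace(-5, 5, 401).reshape(-1, 1)

for _ in range(20):                             # BO loop: fit surrogate, pick next point
    gp = GaussianProcessRegressor(alpha=1e-6, normalize_y=True).fit(
        np.array(thetas).reshape(-1, 1), ds)
    mean, std = gp.predict(grid, return_std=True)
    nxt = float(grid[np.argmin(mean - 2 * std), 0])   # lower confidence bound
    thetas.append(nxt)
    ds.append(discrepancy(nxt))

print(thetas[int(np.argmin(ds))])               # should land near 3
```

The surrogate lets the loop spend simulations only where the discrepancy is plausibly small, which is where the claimed orders-of-magnitude savings come from.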
Fundamentals and Recent Developments in Approximate Bayesian Computation
TLDR
Approximate Bayesian computation refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible.
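The simplest member of this family, rejection ABC, in a few lines (the Gaussian simulator, uniform prior, and tolerance are chosen purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
x_obs = rng.normal(3.0, 1.0, size=100)        # observed data, unknown mean

# Rejection ABC: only simulation from the model is required, never a
# likelihood evaluation.
accepted = []
for _ in range(20000):
    theta = rng.uniform(-10, 10)              # draw from the prior
    x_sim = rng.normal(theta, 1.0, size=100)  # simulate from the model
    if abs(x_sim.mean() - x_obs.mean()) < 0.1:    # compare summary statistics
        accepted.append(theta)

print(np.mean(accepted), len(accepted))       # posterior mean estimate near 3
```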
Likelihood-Free Inference by Ratio Estimation
TLDR
An alternative inference approach is presented that is as easy to use as synthetic likelihood but not as restricted in its assumptions, and that, in a natural way, enables automatic selection of relevant summary statistics from a large set of candidates.
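A toy sketch of the underlying ratio-estimation trick (this is not LFIRE's full procedure, which targets the ratio of the likelihood to the marginal with automatic statistic selection): with balanced classes, the logit of a classifier trained to separate samples of p from samples of q estimates log p − log q.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
p = rng.normal(1.0, 1.0, size=(4000, 1))      # samples from p = N(1, 1)
q = rng.normal(0.0, 1.0, size=(4000, 1))      # samples from q = N(0, 1)

X = np.vstack([p, q])
y = np.r_[np.ones(len(p)), np.zeros(len(q))]
# Quadratic features so the classifier can represent the exact log-ratio.
clf = LogisticRegression().fit(np.hstack([X, X**2]), y)

query = np.array([[1.5, 1.5**2]])             # features (x, x^2) at x = 1.5
est = clf.decision_function(query)[0]         # estimated log p(x) - log q(x)
true = (-0.5 * (1.5 - 1.0)**2) - (-0.5 * 1.5**2)
print(est, true)                              # both close to 1.0
```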
Likelihood-free inference via classification
TLDR
This work finds that classification accuracy can be used to assess the discrepancy between simulated and observed data, and that the complete arsenal of classification methods thereby becomes available for the inference of intractable generative models.
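A small sketch of accuracy-as-discrepancy (the simulator and classifier are my own choices): held-out accuracy near 0.5 means the simulated and observed data are indistinguishable, so the parameter is plausible; accuracy near 1.0 means they are easy to tell apart.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
x_obs = rng.normal(3.0, 1.0, size=(200, 1))

def classifier_discrepancy(theta):
    # Cross-validated accuracy of a classifier separating simulated from
    # observed samples, used as the discrepancy measure.
    x_sim = rng.normal(theta, 1.0, size=(200, 1))
    X = np.vstack([x_obs, x_sim])
    y = np.r_[np.zeros(200), np.ones(200)]
    return cross_val_score(LogisticRegression(), X, y, cv=5).mean()

for theta in (0.0, 2.0, 3.0):
    print(theta, classifier_discrepancy(theta))   # decreases toward ~0.5 at theta = 3
```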
Bregman divergence as general framework to estimate unnormalized statistical models
We show that the Bregman divergence provides a rich framework to estimate unnormalized statistical models for continuous or discrete random variables, that is, models which do not integrate or sum to one.
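For reference, the separable form of the divergence between a data density p and an unnormalized model f, in notation chosen here for illustration (Φ strictly convex and differentiable; equality holds iff p = f almost everywhere, and particular choices of Φ recover estimators such as NCE):

```latex
D_\Phi(p \,\|\, f)
  = \int \Big[ \Phi\big(p(x)\big) - \Phi\big(f(x)\big)
      - \Phi'\big(f(x)\big)\big(p(x) - f(x)\big) \Big] \, dx \;\ge\; 0 .
```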
Direct Learning of Sparse Changes in Markov Networks by Density Ratio Estimation
TLDR
A new method for detecting changes in Markov network structure between two sets of samples is proposed, which directly learns the network structure change by estimating the ratio of Markov network models.
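A toy sketch of why the ratio trick localizes structure change (the paper uses a sparsity-penalized density-ratio objective; plain logistic regression on pairwise features is a stand-in here): the ratio of two pairwise Markov networks is itself a pairwise model in the changed interactions only, so the classifier's weights concentrate on the changed edge.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
# Two Gaussian Markov networks differing in a single edge (variables 1, 2):
P = np.array([[1.0, 0.3, 0.0],
              [0.3, 1.0, 0.4],
              [0.0, 0.4, 1.0]])            # precision matrix of model P
Q = P.copy(); Q[1, 2] = Q[2, 1] = 0.0      # edge (1, 2) removed in model Q

xp = rng.multivariate_normal(np.zeros(3), np.linalg.inv(P), size=5000)
xq = rng.multivariate_normal(np.zeros(3), np.linalg.inv(Q), size=5000)

def pairwise_features(x):                  # all x_i * x_j products: the log-ratio
    i, j = np.triu_indices(3)              # is linear in these features, with
    return x[:, i] * x[:, j]               # nonzero weight only on changed terms

X = np.vstack([pairwise_features(xp), pairwise_features(xq)])
y = np.r_[np.ones(5000), np.zeros(5000)]
clf = LogisticRegression(C=10.0).fit(X, y)
print(np.round(clf.coef_, 2))              # the x1*x2 weight (~ -0.4) stands out
```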
...