Michael U Gutmann
Noise-contrastive estimation: A new estimation principle for unnormalized statistical models
A new estimation principle is presented: nonlinear logistic regression is performed to discriminate between the observed data and artificially generated noise, with the model log-density function used in the regression nonlinearity, which leads to a consistent (convergent) estimator of the parameters.
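The principle can be sketched in a few lines. The following is a minimal illustration (not the paper's implementation), assuming a hypothetical 1-D unnormalized Gaussian model log p(x; mu, c) = -(x - mu)^2/2 + c, where c plays the role of the negative log normalizing constant and is estimated jointly with mu:

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed data from N(2, 1); noise from a known N(0, 2) distribution.
x_data = rng.normal(2.0, 1.0, size=5000)
x_noise = rng.normal(0.0, 2.0, size=5000)

def log_noise(x):
    # Log-density of the known noise distribution N(0, 2).
    return -0.5 * (x / 2.0) ** 2 - np.log(2.0 * np.sqrt(2 * np.pi))

def nce_loss_grad(theta, xd, xn):
    mu, c = theta
    log_model = lambda x: -0.5 * (x - mu) ** 2 + c
    # Logistic regression with logit G(x) = log p_model(x) - log p_noise(x).
    gd = log_model(xd) - log_noise(xd)
    gn = log_model(xn) - log_noise(xn)
    hd = 1.0 / (1.0 + np.exp(-gd))   # sigmoid(G) on data
    hn = 1.0 / (1.0 + np.exp(-gn))   # sigmoid(G) on noise
    loss = -(np.log(hd + 1e-12).mean() + np.log(1 - hn + 1e-12).mean())
    # Gradients w.r.t. (mu, c); dG/dmu = x - mu, dG/dc = 1.
    dmu = -((1 - hd) * (xd - mu)).mean() + (hn * (xn - mu)).mean()
    dc = -(1 - hd).mean() + hn.mean()
    return loss, np.array([dmu, dc])

theta = np.array([0.0, 0.0])
for _ in range(2000):
    _, g = nce_loss_grad(theta, x_data, x_noise)
    theta -= 0.05 * g
# theta[0] should be close to the true mean 2.0, and theta[1] close to
# -log(sqrt(2*pi)), the true negative log normalizing constant.
```

Because c is a free parameter, the model never needs to be normalized during estimation; consistency pushes c toward the correct negative log partition function.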
VEEGAN: Reducing Mode Collapse in GANs using Implicit Variational Learning
- Akash Srivastava, L. Valkov, Chris Russell, Michael U Gutmann, Charles Sutton
- Computer Science, NIPS
- 22 May 2017
VEEGAN is introduced: it features a reconstructor network that reverses the action of the generator by mapping from data to noise, resists mode collapse to a far greater extent than other recent GAN variants, and produces more realistic samples.
Noise-Contrastive Estimation of Unnormalized Statistical Models, with Applications to Natural Image Statistics
The basic idea is to perform nonlinear logistic regression to discriminate between the observed data and some artificially generated noise, and it is shown that the new method strikes a competitive trade-off in comparison to other estimation methods for unnormalized models.
Approximate Bayesian Computation
- Michael U Gutmann
- Computer Science
Just when you thought it was safe to go back into the water, I’m going to complicate things even further. The Nielsen-Wakely-Hey [5, 3, 4] approach is very flexible and very powerful, but even it…
Bayesian Optimization for Likelihood-Free Inference of Simulator-Based Statistical Models
This paper proposes a strategy that combines probabilistic modeling of the discrepancy with optimization to facilitate likelihood-free inference; the approach is shown to accelerate inference by reducing the number of required simulations by several orders of magnitude.
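The paper models the discrepancy with a Gaussian process; as a rough illustration of the underlying idea only (surrogate-assisted minimization of a simulation-based discrepancy), the sketch below fits a simple quadratic surrogate to a handful of discrepancy evaluations and minimizes it analytically instead of re-simulating over a dense grid. The simulator and summary statistic are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(theta, n=200):
    # Toy simulator: n draws from N(theta, 1).
    return rng.normal(theta, 1.0, size=n)

observed = simulate(2.0)          # pretend theta = 2.0 is unknown
s_obs = observed.mean()           # summary statistic of the observed data

def discrepancy(theta):
    # Squared distance between simulated and observed summaries (stochastic).
    return (simulate(theta).mean() - s_obs) ** 2

# Evaluate the discrepancy at only a few parameter values...
thetas = np.linspace(-4, 4, 9)
d_vals = np.array([discrepancy(t) for t in thetas])

# ...then fit a quadratic surrogate d(theta) ~ a*theta^2 + b*theta + c
# and minimize the surrogate instead of running more simulations.
a, b, c = np.polyfit(thetas, d_vals, 2)
theta_hat = -b / (2 * a)          # surrogate minimizer, near the true 2.0
```

The surrogate stands in for the expensive simulator, which is the source of the simulation savings; the paper's Gaussian-process model additionally quantifies uncertainty to decide where to simulate next.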
Fundamentals and Recent Developments in Approximate Bayesian Computation
- Jarno Lintusaari, Michael U Gutmann, Ritabrata Dutta, Samuel Kaski, J. Corander
- Computer Science, Biology, Systematic Biology
- 11 September 2016
Approximate Bayesian computation refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible.
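The sampling-only requirement is what makes the rejection variant of these algorithms so simple. A toy sketch with a hypothetical Gaussian simulator (not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n=100):
    # Simulator we can only sample from: n draws from N(theta, 1).
    return rng.normal(theta, 1.0, size=n)

observed = simulate(1.5)          # pretend the true theta is unknown
s_obs = observed.mean()           # summary statistic

# ABC rejection: draw theta from the prior, simulate, and keep draws
# whose simulated summary lies within epsilon of the observed one.
prior_draws = rng.uniform(-5, 5, size=20000)
eps = 0.1
accepted = [t for t in prior_draws
            if abs(simulate(t).mean() - s_obs) < eps]

posterior_mean = np.mean(accepted)   # close to the observed summary
```

Shrinking eps tightens the approximation to the true posterior but lowers the acceptance rate, which is the basic trade-off the more sophisticated ABC algorithms in the review try to mitigate.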
Likelihood-Free Inference by Ratio Estimation
- Owen Thomas, Ritabrata Dutta, J. Corander, Samuel Kaski, Michael U Gutmann
- Computer Science, Bayesian Analysis
- 30 November 2016
An alternative inference approach is presented that is as easy to use as synthetic likelihood but less restricted in its assumptions, and that naturally enables automatic selection of relevant summary statistics from a large set of candidates.
Likelihood-free inference via classification
- Michael U Gutmann, Ritabrata Dutta, Samuel Kaski, J. Corander
- Computer Science, Stat. Comput.
- 18 July 2014
This work finds that classification accuracy can be used to assess the discrepancy between simulated and observed data, making the complete arsenal of classification methods available for the inference of intractable generative models.
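As a toy illustration of this idea (with a hypothetical nearest-mean classifier standing in for the more powerful classifiers the paper allows), held-out classification accuracy between observed and simulated samples can serve as a discrepancy to minimize over the parameter: accuracy near 0.5 means the simulator output is indistinguishable from the data.

```python
import numpy as np

rng = np.random.default_rng(4)

x_obs = rng.normal(0.0, 1.0, size=4000)   # observed data (true theta = 0)

def simulate(theta, n=4000):
    # Toy simulator: n draws from N(theta, 1).
    return rng.normal(theta, 1.0, size=n)

def classification_discrepancy(x, y):
    """Held-out accuracy of a nearest-mean classifier separating x from y."""
    n = min(len(x), len(y)) // 2
    m_x, m_y = x[:n].mean(), y[:n].mean()             # "train" on one half
    test = np.concatenate([x[n:2 * n], y[n:2 * n]])   # evaluate on the other
    labels = np.concatenate([np.zeros(n), np.ones(n)])
    pred = (np.abs(test - m_y) < np.abs(test - m_x)).astype(float)
    return (pred == labels).mean()

# Point estimation via the discrepancy: pick the theta whose simulations
# are hardest to distinguish from the observed data.
grid = np.linspace(-2, 2, 9)
discs = [classification_discrepancy(x_obs, simulate(t)) for t in grid]
theta_hat = grid[int(np.argmin(discs))]   # should land near theta = 0
```

Any classifier with a held-out accuracy estimate could be dropped into `classification_discrepancy`, which is the sense in which the full arsenal of classification methods becomes available.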
Bregman divergence as general framework to estimate unnormalized statistical models
We show that the Bregman divergence provides a rich framework to estimate unnormalized statistical models for continuous or discrete random variables, that is, models which do not integrate or sum to…
Direct Learning of Sparse Changes in Markov Networks by Density Ratio Estimation
- Song Liu, J. Quinn, Michael U Gutmann, Taiji Suzuki, Masashi Sugiyama
- Computer Science, Neural Computation
- 25 April 2013
A new method for detecting changes in Markov network structure between two sets of samples is proposed, which directly learns the structure change by estimating the ratio of Markov network models.