Publications
Comparison of asymptotic variances of inhomogeneous Markov chains with application to Markov chain Monte Carlo methods
In this paper, we study the asymptotic variance of sample path averages for inhomogeneous Markov chains that evolve alternatingly according to two different π-reversible Markov transition kernels P …
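As a rough, self-contained illustration of the quantity studied in this paper (not the paper's comparison result), the sketch below simulates a chain that alternates between two π-reversible kernels, here two random-walk Metropolis kernels with arbitrary proposal scales targeting a standard normal, and estimates the asymptotic variance of the sample path average by batch means. The target, scales, and sample sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rw_metropolis_step(x, scale):
    # One random-walk Metropolis step targeting a standard normal.
    # Each such kernel is pi-reversible, so applying two of them in
    # alternation gives an inhomogeneous chain of the kind described above.
    prop = x + scale * rng.normal()
    log_accept = 0.5 * (x**2 - prop**2)  # log ratio of N(0,1) densities
    return prop if np.log(rng.uniform()) < log_accept else x

def alternating_chain(n, scales=(0.5, 3.0), x0=0.0):
    # Evolve the chain by applying the two kernels alternately.
    xs = np.empty(n)
    x = x0
    for t in range(n):
        x = rw_metropolis_step(x, scales[t % 2])
        xs[t] = x
    return xs

def batch_means_asymptotic_variance(xs, n_batches=50):
    # Batch-means estimate of the asymptotic variance of the sample average.
    batch_len = len(xs) // n_batches
    means = xs[: n_batches * batch_len].reshape(n_batches, batch_len).mean(axis=1)
    return batch_len * means.var(ddof=1)

xs = alternating_chain(100_000)
print("sample path average:      ", xs.mean())
print("estimated asymptotic var.:", batch_means_asymptotic_variance(xs))
```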
Efficient MCMC for Gibbs Random Fields using pre-computation
Bayesian inference of Gibbs random fields (GRFs) is often referred to as a doubly intractable problem, since the likelihood function is intractable. The exploration of the posterior distribution of …
Efficient Bayesian inference for exponential random graph models by correcting the pseudo-posterior distribution
We introduce a pseudo-posterior distribution that approximates the likelihood function in the posterior distribution of exponential random graph models and discuss the computational and statistical efficiency that results from this approach.
Bayesian Model Selection for Exponential Random Graph Models via Adjusted Pseudolikelihoods
This article specifies a method to adjust pseudolikelihoods to obtain a reasonable, yet tractable, approximation to the likelihood.
Informed sub-sampling MCMC: approximate Bayesian inference for large datasets
This paper introduces a framework for speeding up Bayesian inference conducted in the presence of large datasets.
On the use of Markov chain Monte Carlo methods for the sampling of mixture models: a statistical perspective
In this paper we study asymptotic properties of different data-augmentation-type Markov chain Monte Carlo algorithms sampling from mixture models comprising discrete as well as continuous random variables, and we discuss and compare different algorithms based on this scheme.
Online EM for functional data
A novel approach to perform unsupervised sequential learning for functional data is proposed.
Light and Widely Applicable MCMC: Approximate Bayesian Inference for Large Datasets
Light and Widely Applicable (LWA-) MCMC is a novel approximation of the Metropolis-Hastings kernel targeting a posterior distribution defined on a large number of observations. Inspired by …
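For context, the sketch below is a plain full-data random-walk Metropolis-Hastings sampler for the mean of a Gaussian model under a flat prior; the data, step size, and iteration counts are made up for illustration. Each iteration evaluates the likelihood over every observation, which is the per-step cost that approximations of the Metropolis-Hastings kernel such as LWA-MCMC are designed to avoid; the LWA-MCMC kernel itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setting: n observations from N(true_mu, 1); sample the
# posterior of mu under a flat prior with random-walk Metropolis-Hastings.
n = 100_000
true_mu = 2.0
data = rng.normal(true_mu, 1.0, size=n)

def log_posterior(mu):
    # Full-data Gaussian log-likelihood (flat prior): every evaluation
    # touches all n observations, the cost that motivates approximate
    # kernels for large datasets.
    return -0.5 * np.sum((data - mu) ** 2)

def metropolis_hastings(n_iter=5_000, step=0.01, mu0=0.0):
    mu, cur = mu0, log_posterior(mu0)
    samples = np.empty(n_iter)
    for t in range(n_iter):
        prop = mu + step * rng.normal()
        new = log_posterior(prop)
        if np.log(rng.uniform()) < new - cur:  # acceptance for a symmetric proposal
            mu, cur = prop, new
        samples[t] = mu
    return samples

samples = metropolis_hastings()
print("posterior mean estimate:", samples[1_000:].mean())  # discard burn-in
```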
Bayesian inference for misspecified exponential random graph models
Exponential Random Graph models are an important tool in network analysis for describing complicated dependency structures. However, Bayesian parameter estimation for these models is extremely …