Adam M. Johansen

The Auxiliary Particle Filter (APF) introduced by Pitt and Shephard (1999) is a very popular alternative to Sequential Importance Sampling and Resampling (SISR) algorithms for performing inference in state-space models. We propose a novel interpretation of the APF as an SISR algorithm. This interpretation allows us to present simple guidelines to ensure good …
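For context, the following is a minimal sketch of a plain SISR (bootstrap) particle filter applied to a hypothetical linear-Gaussian AR(1) state-space model. The model, parameter values and function name are illustrative assumptions only; this is not the APF construction or the guidelines developed in the paper.

```python
import numpy as np

def bootstrap_sisr(y, n_particles=1000, phi=0.9, sigma_x=1.0, sigma_y=1.0, seed=0):
    """Basic SISR (bootstrap) particle filter for a hypothetical AR(1) state-space model:
    x_t = phi * x_{t-1} + N(0, sigma_x^2),  y_t = x_t + N(0, sigma_y^2).
    Returns estimates of the filtering means E[x_t | y_{1:t}]."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma_x, n_particles)          # initial particle set
    means = []
    for obs in y:
        # Propagate particles through the state transition (proposal = prior dynamics).
        x = phi * x + rng.normal(0.0, sigma_x, n_particles)
        # Weight particles by the observation likelihood (up to a constant).
        logw = -0.5 * ((obs - x) / sigma_y) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))
        # Resample (multinomial) to obtain an equally weighted particle set.
        x = rng.choice(x, size=n_particles, p=w)
    return np.array(means)
```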
Statistical methods of inference typically require the likelihood function to be computable in a reasonable amount of time. The class of “likelihood-free” methods termed Approximate Bayesian Computation (ABC) eliminates this requirement, replacing evaluation of the likelihood with simulation from it. Likelihood-free methods have gained in …
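A minimal sketch of the basic ABC rejection idea is given below; the function names, the normal-mean example and the tolerance are illustrative assumptions, not the specific scheme studied in the paper.

```python
import numpy as np

def abc_rejection(y_obs, prior_sampler, simulator, summary, eps, n_draws=10000, seed=0):
    """Basic ABC rejection sampler: draw parameters from the prior, simulate data
    instead of evaluating the likelihood, and keep draws whose summary statistic
    falls within eps of the observed summary."""
    rng = np.random.default_rng(seed)
    s_obs = summary(y_obs)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler(rng)
        y_sim = simulator(theta, rng)
        if abs(summary(y_sim) - s_obs) <= eps:
            accepted.append(theta)
    return np.array(accepted)

# Illustrative use: infer the mean of a normal distribution with known variance.
post = abc_rejection(
    y_obs=np.random.default_rng(1).normal(2.0, 1.0, 50),
    prior_sampler=lambda rng: rng.normal(0.0, 5.0),
    simulator=lambda theta, rng: rng.normal(theta, 1.0, 50),
    summary=np.mean,
    eps=0.1,
)
```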
Sequential Monte Carlo methods are a very general class of Monte Carlo methods for sampling from sequences of distributions. Simple examples of these algorithms are used very widely in the tracking and signal processing literature. Recent developments illustrate that these techniques have much more general applicability, and can be applied very effectively …
This paper introduces a framework for simulating finite dimensional representations of (jump) diffusion sample paths over finite intervals, without discretisation error (exactly), in such a way that the sample path can be restored at any finite collection of time points. Within this framework we extend existing exact algorithms and introduce novel adaptive …
Following the Loss Distributional Approach (LDA), this article develops two procedures for simulation of an annual loss distribution for modeling of Operational Risk. First, we provide an overview of the typical compound-process LDA used widely in Operational Risk modeling, before expanding upon the current literature on evaluation and simulation of annual …
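As background, a compound-process annual loss distribution can be simulated directly by Monte Carlo as sketched below; the Poisson frequency, lognormal severity and parameter values are hypothetical assumptions and are not the procedures or calibration developed in the article.

```python
import numpy as np

def simulate_annual_losses(n_years=100000, freq_lambda=25.0, sev_mu=8.0, sev_sigma=2.0, seed=0):
    """Monte Carlo simulation of an annual loss distribution under a compound process:
    N ~ Poisson(freq_lambda) losses per year, each loss ~ lognormal(sev_mu, sev_sigma)."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(freq_lambda, n_years)
    annual = np.array([rng.lognormal(sev_mu, sev_sigma, n).sum() for n in counts])
    return annual

losses = simulate_annual_losses()
var_999 = np.quantile(losses, 0.999)  # e.g. a high quantile of the annual loss distribution
```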
We develop strategies for Bayesian modelling as well as model comparison, averaging and selection for compartmental models with particular emphasis on those which occur in the analysis of Positron Emission Tomography (PET) data. Both modelling and computational issues are considered. It is shown that an additive normal error structure does not describe …
Standard methods for maximum likelihood parameter estimation in latent variable models rely on the Expectation-Maximization algorithm and its Monte Carlo variants. Our approach is different and motivated by considerations similar to those underlying simulated annealing; that is, we build a sequence of artificial distributions whose support concentrates on the set of …
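A rough sketch of the underlying annealing idea follows: successively target distributions proportional to increasing powers of the likelihood, which concentrate on the maximisers. The random-walk Metropolis sampler, the temperature schedule and the Gaussian-location example are assumptions made for illustration; the paper itself develops a sequential Monte Carlo algorithm rather than this simple scheme.

```python
import numpy as np

def annealed_mle(log_like, theta0, gammas, step=0.2, iters_per_level=200, seed=0):
    """Simulated-annealing-style search for a maximum likelihood estimate: at each level,
    a random-walk Metropolis sampler targets a density proportional to
    exp(gamma * log_like(theta)), which concentrates on the maximisers as gamma grows."""
    rng = np.random.default_rng(seed)
    theta = theta0
    for gamma in gammas:
        for _ in range(iters_per_level):
            prop = theta + step * rng.normal()
            # Metropolis accept/reject for the tempered target.
            if np.log(rng.random()) < gamma * (log_like(prop) - log_like(theta)):
                theta = prop
    return theta

# Illustrative use: MLE of a normal location parameter (close to the sample mean).
data = np.random.default_rng(1).normal(3.0, 1.0, 100)
theta_hat = annealed_mle(lambda t: -0.5 * np.sum((data - t) ** 2),
                         theta0=0.0, gammas=[1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
```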
In this report, we propose an original approach to solving Fredholm equations of the second kind. We interpret the standard von Neumann expansion of the solution as an expectation with respect to a probability distribution defined on a union of subspaces of variable dimension. Based on this representation, it is possible to use trans-dimensional Markov Chain …
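To make the representation concrete, the sketch below estimates the solution pointwise by plain Monte Carlo over random-length paths, writing the von Neumann series for f(x) = g(x) + ∫ K(x, y) f(y) dy as an expectation; the function names, the user-supplied proposal density and the geometric stopping rule are illustrative assumptions, not the trans-dimensional samplers developed in the report.

```python
import numpy as np

def fredholm_mc(x0, g, K, sample_y, q_density, p_stop=0.3, n_paths=20000, seed=0):
    """Monte Carlo estimate of f(x0) for f(x) = g(x) + int K(x, y) f(y) dy via the
    von Neumann expansion f = sum_n K^n g.  sample_y(x, rng) draws the next point
    with density q_density(x, y); each step the path terminates with probability p_stop."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_paths):
        x, weight, estimate = x0, 1.0, g(x0)      # n = 0 term of the series
        while rng.random() > p_stop:              # continue with probability 1 - p_stop
            y = sample_y(x, rng)
            # Importance weight correcting for the proposal and the survival probability.
            weight *= K(x, y) / (q_density(x, y) * (1.0 - p_stop))
            estimate += weight * g(y)             # contribution of the n-th series term
            x = y
        total += estimate
    return total / n_paths
```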
Model comparison for the purposes of selection, averaging and validation is a problem found throughout statistics and related disciplines. Within the Bayesian paradigm, these problems all require the calculation of the posterior probabilities of models within a particular class. Substantial progress has been made in recent years, but there are numerous …
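For reference, once the marginal likelihoods (evidences) are available, posterior model probabilities follow from p(M_k | y) ∝ p(y | M_k) p(M_k); the sketch below computes them stably on the log scale. The function name and the numerical values in the example are hypothetical; the hard problem addressed in the paper is estimating the evidences themselves.

```python
import numpy as np

def posterior_model_probs(log_evidence, log_prior=None):
    """Posterior model probabilities from log marginal likelihoods (log evidences),
    using a log-sum-exp stabilisation; defaults to a uniform prior over models."""
    log_evidence = np.asarray(log_evidence, dtype=float)
    if log_prior is None:
        log_prior = np.full_like(log_evidence, -np.log(len(log_evidence)))
    log_post = log_evidence + log_prior
    log_post -= log_post.max()            # subtract the maximum for numerical stability
    probs = np.exp(log_post)
    return probs / probs.sum()

# Example with three candidate models and hypothetical log evidences.
print(posterior_model_probs([-104.2, -101.7, -108.9]))
```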