Mikael Kuusela

Variational Bayesian (VB) methods are typically only applied to models in the conjugate-exponential family using the variational Bayesian expectation maximisation (VB EM) algorithm or one of its variants. In this paper we present an efficient algorithm for applying VB to more general models. The method is based on specifying the functional form of the …
While variational Bayesian (VB) inference is typically done with the so-called VB EM algorithm, there are models where it cannot be applied because either the E-step or the M-step cannot be solved analytically. In 2007, Honkela et al. introduced a recipe for a gradient-based algorithm for VB inference that does not have such a restriction. In this paper, we …
We study a novel type of semi-supervised anomaly detection problem where the anomalies occur collectively among a background of normal data. Such a problem arises in experimental high energy physics when one is trying to discover deviations from known Standard Model physics. We solve the problem by first fitting a mixture of Gaussians to a labeled …
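As a concrete illustration of the fit-then-score idea in this abstract, below is a minimal Python sketch using scikit-learn's GaussianMixture; the synthetic data and all variable names are assumptions made for the example, not the paper's physics setup.

```python
# Hypothetical sketch of the mixture-of-Gaussians approach described above;
# data and names are illustrative assumptions, not the paper's events.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Stand-in "background" (normal) sample and a test sample that may contain
# a collective anomaly.
background = rng.normal(loc=0.0, scale=1.0, size=(5000, 2))
test = np.vstack([rng.normal(0.0, 1.0, (950, 2)),
                  rng.normal(3.0, 0.5, (50, 2))])   # small anomalous cluster

# Fit the background model on labeled normal data only.
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(background)

# A collective anomaly shows up as a drop in the average log-likelihood of
# the whole test sample under the background model, not as isolated outliers.
print("mean log-likelihood, background:", gmm.score(background))
print("mean log-likelihood, test:     ", gmm.score(test))
```

The point of the sketch is that the anomaly is assessed at the level of the whole sample's likelihood, matching the "collective" framing of the abstract, rather than by scoring individual observations.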
By Mikael Johan Kuusela. "If everybody tells you it's possible, then you are not dreaming big enough." — Bertrand Piccard and André Borschberg, while crossing the Pacific Ocean on a solar-powered aircraft. To my parents. Acknowledgements: I would first and foremost like to sincerely thank my advisor Victor Panaretos for the opportunity of carrying out this …
Variational methods for approximate inference in machine learning often adapt a parametric probability distribution to optimize a given objective function. This view is especially useful when applying variational Bayes (VB) to models outside the conjugate-exponential family. For them, variational Bayesian expectation maximization (VB EM) algorithms are not …
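To illustrate what adapting a parametric distribution to optimize an objective can look like in this fixed-form, gradient-based setting, here is a self-contained toy sketch of natural-gradient variational inference with a Gaussian approximation; the target density, the (mean, standard deviation) parameterization, and the step sizes are assumptions made for the example, not the method of the paper.

```python
# A minimal sketch of fixed-form VB with a natural-gradient update.
# The target log p(theta) = -theta**4 / 4 is an assumed toy density.
import numpy as np

rng = np.random.default_rng(0)

def grad_log_p(theta):
    """Gradient of the assumed unnormalized target log p(theta) = -theta**4 / 4."""
    return -theta**3

m, s = 1.0, 1.0          # variational parameters of q(theta) = N(m, s^2)
lr, n_mc = 0.05, 200     # step size and Monte Carlo sample count (assumed)

for _ in range(500):
    eps = rng.standard_normal(n_mc)
    theta = m + s * eps                      # reparameterization trick

    # Euclidean gradients of the ELBO = E_q[log p(theta)] + entropy(q);
    # the entropy 0.5*log(2*pi*e*s^2) contributes 1/s to the s-gradient.
    g_m = grad_log_p(theta).mean()
    g_s = (grad_log_p(theta) * eps).mean() + 1.0 / s

    # Fisher information of N(m, s^2) in (m, s) coordinates is
    # diag(1/s^2, 2/s^2), so the natural gradient rescales the Euclidean
    # gradient by the inverse Fisher matrix.
    m += lr * s**2 * g_m
    s += lr * (s**2 / 2.0) * g_s

print(f"fitted q: mean={m:.3f}, std={s:.3f}")
```

The rescaling by the inverse Fisher matrix is what distinguishes the natural gradient from the plain Euclidean gradient: steps are measured in the information geometry of the distribution family rather than in raw parameter space.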
We study a semi-supervised anomaly detection problem where anomalies lie among the normal data. Instead of analyzing individual observations, we identify anomalies collectively based on deviations from the distribution of the normal data. We first model the normal data using a mixture of Gaussians and then use a variant of the EM algorithm to fit a …
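For reference, a plain EM loop for a two-component one-dimensional Gaussian mixture looks like the following; the paper's algorithm is a variant of EM that additionally exploits labeled data, so this sketch shows only the base E- and M-step updates, with assumed data and initial values.

```python
# Illustrative EM loop for a two-component 1-D Gaussian mixture.
# Data and initial guesses are assumptions for the example.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 400), rng.normal(2, 1, 600)])

w, mu, sigma = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])

for _ in range(100):
    # E-step: posterior responsibility of each component for each point.
    dens = w * norm.pdf(x[:, None], mu, sigma)        # shape (n, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M-step: responsibility-weighted maximum-likelihood updates.
    n_k = resp.sum(axis=0)
    w = n_k / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / n_k
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)

print("weights:", w.round(3), "means:", mu.round(3), "stds:", sigma.round(3))
```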
Keywords: machine learning, Bayesian inference, variational Bayesian learning, information geometry, natural conjugate gradient, Gaussian mixture. Helsinki University of Technology, abstract of the bachelor's thesis. A typical problem in machine learning is that one wants to represent a given set of data using some parametric model. The parameters of …