Stephen J. Roberts

A Bayesian-based methodology is presented which automatically penalizes overcomplex models being fitted to unknown data. We show that, with a Gaussian mixture model, the approach is able to select an "optimal" number of components in the model and so partition data sets. The performance of the Bayesian method is compared to other methods of optimal model …
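The abstract does not spell out the penalty used, so the following is only a minimal sketch of the idea of penalizing over-complex mixtures: it compares Gaussian mixture models of increasing size using the Bayesian Information Criterion (an assumption standing in for the paper's Bayesian penalty) and keeps the best-scoring number of components.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic two-cluster data; the "true" number of components is 2.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, size=(200, 2)),
               rng.normal(+2.0, 0.5, size=(200, 2))])

# Fit mixtures of increasing size and penalize complexity with BIC
# (a stand-in here for the paper's Bayesian evidence-based penalty).
scores = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
          for k in range(1, 7)}
best_k = min(scores, key=scores.get)
print("BIC scores:", scores)
print("Selected number of components:", best_k)
```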
This review provides an introduction to the use of parametric modelling techniques for time series analysis, and in particular the application of autoregressive modelling to the analysis of physiological signals such as the human electroencephalogram. The concept of signal stationarity is considered and, in the light of this, both adaptive models and …
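As a minimal illustration of autoregressive modelling (not the adaptive schemes the review goes on to discuss), the sketch below fits AR coefficients to a simulated signal by least squares; the function name and the toy process are my own.

```python
import numpy as np

def fit_ar(x, order):
    """Least-squares fit of an AR(p) model x[t] = sum_i a_i * x[t-i] + e[t]."""
    X = np.column_stack([x[order - i - 1 : len(x) - i - 1] for i in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Example: second-order AR process with known coefficients.
rng = np.random.default_rng(1)
true_a = np.array([0.75, -0.5])
x = np.zeros(2000)
for t in range(2, len(x)):
    x[t] = true_a @ x[t - 2 : t][::-1] + rng.normal(scale=0.1)

print("Estimated AR coefficients:", fit_ar(x, order=2))
```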
We present a novel method of Bayesian image super-resolution in which marginalization is carried out over latent parameters such as geometric and photometric registration and the image point-spread function. Related Bayesian super-resolution approaches marginalize over the high-resolution image, necessitating the use of an unfavourable image prior, whereas …
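The paper's image model is not reproduced here; as a toy illustration of marginalizing a likelihood over a latent registration parameter (a 1-D integer shift of a signal, under a uniform prior), rather than fixing it, one can sum the likelihood over the prior support. The forward model and all names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
x_hr = np.sin(np.linspace(0, 4 * np.pi, 200))          # "high-resolution" signal
true_shift = 3
y = x_hr[true_shift : true_shift + 150] + rng.normal(scale=0.05, size=150)

def log_likelihood(shift, sigma=0.05):
    pred = x_hr[shift : shift + 150]
    return -0.5 * np.sum((y - pred) ** 2) / sigma**2

shifts = np.arange(0, 20)                               # uniform prior support
log_lik = np.array([log_likelihood(s) for s in shifts])
# Marginal likelihood: average the likelihood over the latent shift under the prior.
log_marginal = np.logaddexp.reduce(log_lik) - np.log(len(shifts))
posterior = np.exp(log_lik - np.logaddexp.reduce(log_lik))
print("posterior mode over the shift:", shifts[np.argmax(posterior)])
print("log marginal likelihood:", log_marginal)
```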
There has been growing interest in subspace data modeling over the past few years. Methods such as principal component analysis, factor analysis, and independent component analysis have gained in popularity and have found many applications in image modeling, signal processing, and data compression, to name just a few. As applications and computing power …
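A minimal sketch of one of the subspace methods named above (principal component analysis): centre the data, find the leading directions with an SVD, and reconstruct from the low-dimensional representation. The data and the chosen subspace dimension are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 10))
X[:, 0] += 3 * X[:, 1]                 # introduce correlated structure

Xc = X - X.mean(axis=0)                # centre the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                  # dimension of the subspace
Z = Xc @ Vt[:k].T                      # low-dimensional representation
X_hat = Z @ Vt[:k] + X.mean(axis=0)    # reconstruction from the subspace
print("relative reconstruction error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```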
Traditional feature extraction methods describe signals in terms of amplitude and frequency. This paper marks a paradigm shift and investigates four stochastic-complexity features. Their advantages are demonstrated on synthetic and physiological signals, the latter recorded during periods of Cheyne-Stokes respiration, anesthesia, sleep, and motor-cortex …
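The four features are not named in this snippet, so the sketch below uses an assumed example from the same family: a Lempel-Ziv phrase count computed on a median-binarized signal, which is low for a predictable signal and higher for an unpredictable one.

```python
import numpy as np

def lempel_ziv_complexity(bits: str) -> int:
    """Number of phrases in a simple LZ76-style parsing of a binary string."""
    i, c, n = 0, 0, len(bits)
    while i < n:
        length = 1
        # Extend the phrase while it still occurs earlier in the sequence.
        while i + length <= n and bits[i : i + length] in bits[: i + length - 1]:
            length += 1
        c += 1
        i += length
    return c

def binarize(x):
    return ''.join('1' if v > np.median(x) else '0' for v in x)

rng = np.random.default_rng(4)
regular = np.sin(np.linspace(0, 20 * np.pi, 1000))     # highly predictable
noisy = rng.normal(size=1000)                          # unpredictable

print("LZ complexity, sine :", lempel_ziv_complexity(binarize(regular)))
print("LZ complexity, noise:", lempel_ziv_complexity(binarize(noisy)))
```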
We describe a variational Bayes (VB) learning algorithm for generalized autoregressive (GAR) models. The noise is modeled as a mixture of Gaussians rather than the usual single Gaussian. This allows different data points to be associated with different noise levels and effectively provides robust estimation of AR coefficients. The VB framework is used to …
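The sketch below is not the paper's variational Bayes scheme; it is an EM stand-in for the underlying idea: an AR model whose noise is a two-component Gaussian mixture, so that outlying samples are down-weighted when the AR coefficients are re-estimated. All names and the toy data are assumptions.

```python
import numpy as np

def robust_ar_em(x, order=2, n_iter=50):
    """EM for an AR(order) model whose noise is a 2-component Gaussian mixture."""
    X = np.column_stack([x[order - i - 1 : len(x) - i - 1] for i in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)            # ordinary LS initialisation
    resid_var = np.var(y - X @ a)
    var = np.array([resid_var, 10.0 * resid_var])        # "clean" and "outlier" noise
    mix = np.array([0.9, 0.1])
    for _ in range(n_iter):
        e = y - X @ a
        # E-step: responsibility of each noise component for each residual.
        dens = mix / np.sqrt(2 * np.pi * var) * np.exp(-0.5 * e[:, None] ** 2 / var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted least squares for the AR coefficients ...
        w = (r / var).sum(axis=1)
        a = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        # ... then update the mixture weights and component variances.
        mix = r.mean(axis=0)
        var = (r * e[:, None] ** 2).sum(axis=0) / r.sum(axis=0)
    return a, mix, var

rng = np.random.default_rng(5)
true_a = np.array([0.75, -0.5])
x = np.zeros(3000)
for t in range(2, len(x)):
    noise = rng.normal(scale=3.0) if rng.random() < 0.05 else rng.normal(scale=0.1)
    x[t] = true_a @ x[t - 2 : t][::-1] + noise

a_hat, mix_hat, var_hat = robust_ar_em(x)
print("robust AR coefficient estimates:", a_hat)
```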
We present an overview of our research into brain-computer interfacing (BCI). This comprises an offline study of the effect of motor imagery on EEG and an online study that uses pattern classifiers incorporating parameter uncertainty and temporal information to discriminate between different cognitive tasks in real time.
The detection of novel or abnormal input vectors is of importance in many monitoring tasks, such as fault detection in complex systems and detection of abnormal patterns in medical diagnostics. We have developed a robust method for novelty detection, which aims to minimize the number of heuristically chosen thresholds in the novelty decision process. We …
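The paper's construction for avoiding heuristic thresholds is not reproduced here; as a minimal density-based novelty detector, one can model the "normal" data with a Gaussian mixture and flag test points of unusually low density. The quantile threshold below is itself a simplifying assumption.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(6)
normal_data = rng.normal(0.0, 1.0, size=(1000, 2))
gmm = GaussianMixture(n_components=3, random_state=0).fit(normal_data)

# Flag points whose log-density falls below the 1% tail of the training data.
threshold = np.quantile(gmm.score_samples(normal_data), 0.01)
test = np.array([[0.2, -0.1], [6.0, 6.0]])                 # normal vs. novel point
print("novel?", gmm.score_samples(test) < threshold)
```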
Extreme value theory (EVT) is a branch of statistics which concerns the distributions of data of unusually low or high value, i.e., in the tails of some distribution. These extremal points are important in many applications as they represent the outlying regions of normal events against which we may wish to define abnormal events. In the context of density …
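A toy EVT sketch, assuming a simple novelty score (distance from the mean of the normal data) and a block-maxima approach: fit a Gumbel distribution to the block maxima of the score and take the abnormality threshold from its tail, rather than picking one by hand. The score, block size, and tail probability are assumptions, not the paper's choices.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(7)
normal_data = rng.normal(0.0, 1.0, size=(5000, 2))
scores = np.linalg.norm(normal_data - normal_data.mean(axis=0), axis=1)

# Block maxima of the novelty score under normal conditions.
block = 50
maxima = scores[: len(scores) // block * block].reshape(-1, block).max(axis=1)
loc, scale = gumbel_r.fit(maxima)

# Threshold exceeded by a block maximum with probability 0.001 under normality.
threshold = gumbel_r.ppf(1 - 1e-3, loc=loc, scale=scale)
test_point = np.array([5.0, 5.0])
test_score = np.linalg.norm(test_point - normal_data.mean(axis=0))
print("EVT novelty threshold on the score:", threshold)
print("is the test point novel?", test_score > threshold)
```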