
- Stephen J. Roberts, Dirk Husmeier, Iead Rezek, William D. Penny
- IEEE Trans. Pattern Anal. Mach. Intell.
- 1998

A Bayesian-based methodology is presented which automatically penalizes overcomplex models being fitted to unknown data. We show that, with a Gaussian mixture model, the approach is able to select an “optimal” number of components in the model and so partition data sets. The performance of the Bayesian method is compared to other methods of optimal model…
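The paper's own Bayesian penalty is not reproduced in this snippet. As an illustrative stand-in, the Bayesian Information Criterion (BIC), an asymptotic approximation to the Bayesian model evidence, can be used to penalize overcomplex mixtures and select a component count; a minimal sketch, assuming scikit-learn is available:

```python
# Sketch of model-order selection for a Gaussian mixture using BIC,
# a standard asymptotic approximation to the Bayesian model evidence.
# (The paper derives its own Bayesian penalty; this is an illustration.)
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: two well-separated Gaussian clusters in 2-D.
X = np.vstack([rng.normal(-5.0, 1.0, (300, 2)),
               rng.normal(+5.0, 1.0, (300, 2))])

# Fit mixtures with 1..6 components and keep the lowest-BIC model.
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(1, 7)}
best_k = min(bics, key=bics.get)
print(best_k)
```

On data this well separated, the penalized criterion selects two components rather than the largest model tried.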

- J Pardey, S Roberts, L Tarassenko
- Medical engineering & physics
- 1996

This review provides an introduction to the use of parametric modelling techniques for time series analysis, and in particular the application of autoregressive modelling to the analysis of physiological signals such as the human electroencephalogram. The concept of signal stationarity is considered and, in the light of this, both adaptive models, and…
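The core of autoregressive modelling is predicting each sample as a linear combination of its predecessors. A minimal sketch (ordinary least squares on a synthetic AR(2) series; the adaptive, time-varying models the review also covers are omitted):

```python
# Minimal autoregressive (AR) modelling sketch: simulate an AR(2)
# process and recover its coefficients by ordinary least squares.
import numpy as np

rng = np.random.default_rng(1)
a_true = np.array([0.75, -0.5])      # true AR(2) coefficients
n = 5000
x = np.zeros(n)
for t in range(2, n):
    x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + rng.normal()

# Solve x[t] ~ a1*x[t-1] + a2*x[t-2] over all usable samples.
p = 2
Y = x[p:]
Phi = np.column_stack([x[p - 1:-1], x[p - 2:-2]])  # lag-1, lag-2 columns
a_hat, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
print(a_hat)
```

With a few thousand samples the least-squares estimate lands close to the true coefficients; for EEG-like nonstationary signals, the review's adaptive variants refit these coefficients over time.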

- Lyndsey C. Pickup, David P. Capel, Stephen J. Roberts, Andrew Zisserman
- Comput. J.
- 2009

We present a novel method of Bayesian image super-resolution in which marginalization is carried out over latent parameters such as geometric and photometric registration and the image point-spread function. Related Bayesian super-resolution approaches marginalize over the high-resolution image, necessitating the use of an unfavourable image prior, whereas…

- Rizwan Choudrey, Stephen J. Roberts
- Neural Computation
- 2003

There has been growing interest in subspace data modeling over the past few years. Methods such as principal component analysis, factor analysis, and independent component analysis have gained in popularity and have found many applications in image modeling, signal processing, and data compression, to name just a few. As applications and computing power…

- I A Rezek, S J Roberts
- IEEE transactions on bio-medical engineering
- 1998

Traditional feature extraction methods describe signals in terms of amplitude and frequency. This paper takes a paradigm shift and investigates four stochastic-complexity features. Their advantages are demonstrated on synthetic and physiological signals; the latter recorded during periods of Cheyne-Stokes respiration, anesthesia, sleep, and motor-cortex…
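The snippet does not name the four features, so they are not reproduced here. One widely used complexity measure for binarized physiological signals, offered purely as an illustrative stand-in, is Lempel-Ziv complexity: the number of distinct phrases in a sequential parsing of the signal.

```python
# Illustrative complexity feature (NOT the paper's four features):
# Lempel-Ziv complexity, the phrase count of a simple sequential parsing.
import numpy as np

def lempel_ziv_complexity(bits: str) -> int:
    """Count phrases in a greedy parsing: each phrase is extended
    until it has not been seen before, then recorded."""
    phrases, count, i, n = set(), 0, 0, len(bits)
    while i < n:
        j = i + 1
        while j <= n and bits[i:j] in phrases:
            j += 1
        phrases.add(bits[i:j])
        count += 1
        i = j
    return count

rng = np.random.default_rng(2)
periodic = "01" * 100
random_bits = "".join(str(b) for b in rng.integers(0, 2, 200))
print(lempel_ziv_complexity(periodic), lempel_ziv_complexity(random_bits))
```

A regular (e.g. periodic) signal parses into far fewer phrases than a random one of the same length, which is what makes such counts usable as features.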

This paper presents a method of independent component analysis which assesses the most probable number of source sequences from a larger number of observed sequences and estimates the unknown source sequences and mixing matrix. The estimation of the number of true sources is regarded as a model-order estimation problem and is tackled under a Bayesian…

- Stephen J. Roberts, William D. Penny
- IEEE Trans. Signal Processing
- 2002

We describe a variational Bayes (VB) learning algorithm for generalized autoregressive (GAR) models. The noise is modeled as a mixture of Gaussians rather than the usual single Gaussian. This allows different data points to be associated with different noise levels and effectively provides robust estimation of AR coefficients. The VB framework is used to…

- W D Penny, S J Roberts, E A Curran, M J Stokes
- IEEE transactions on rehabilitation engineering…
- 2000

We present an overview of our research into brain-computer interfacing (BCI). This comprises an offline study of the effect of motor imagery on EEG and an online study that uses pattern classifiers incorporating parameter uncertainty and temporal information to discriminate between different cognitive tasks in real-time.

- Stephen J. Roberts, Lionel Tarassenko
- Neural Computation
- 1994

The detection of novel or abnormal input vectors is of importance in many monitoring tasks, such as fault detection in complex systems and detection of abnormal patterns in medical diagnostics. We have developed a robust method for novelty detection, which aims to minimize the number of heuristically chosen thresholds in the novelty decision process. We…
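The paper's robust method is not detailed in this snippet; a minimal sketch of the underlying density-thresholding idea, with the threshold set from training-data quantiles rather than picked by hand (echoing the goal of minimizing heuristic thresholds):

```python
# Minimal novelty-detection sketch: score test points by (unnormalized)
# Gaussian log-density fitted to "normal" data; flag points scoring
# below a data-driven quantile of the training scores.
# (An illustration of the idea, not the paper's specific method.)
import numpy as np

rng = np.random.default_rng(3)
train = rng.normal(0.0, 1.0, (1000, 2))   # normal operating data
mu = train.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(train, rowvar=False))

def log_density(x):
    """Gaussian log-density up to a constant (enough for ranking)."""
    d = x - mu
    return -0.5 * np.einsum("...i,ij,...j->...", d, inv_cov, d)

# Threshold at the 1st percentile of training scores (data-driven,
# not a hand-picked cut-off).
threshold = np.percentile(log_density(train), 1.0)

inlier = np.array([0.1, -0.2])
outlier = np.array([8.0, 8.0])
print(log_density(inlier) > threshold, log_density(outlier) > threshold)
```

Points resembling the training data score above the threshold; points far outside it fall below and are flagged as novel.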

Extreme value theory (EVT) is a branch of statistics which concerns the distributions of data of unusually low or high value, i.e., in the tails of some distribution. These extremal points are important in many applications as they represent the outlying regions of normal events against which we may wish to define abnormal events. In the context of density…
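A standard way to model such tails, sketched here under the assumption that SciPy is available (the abstract does not specify the authors' estimator), is the peaks-over-threshold approach: exceedances over a high threshold are approximately generalized Pareto (GPD) distributed.

```python
# Peaks-over-threshold sketch from extreme value theory: exceedances
# over a high threshold are fitted with a generalized Pareto
# distribution. For exponential data the true GPD shape is 0 and the
# scale equals the exponential scale.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(4)
data = rng.exponential(scale=1.0, size=20000)

u = np.quantile(data, 0.95)           # high threshold
exceedances = data[data > u] - u      # peaks over the threshold

# Maximum-likelihood GPD fit to the exceedances (location fixed at 0).
shape, loc, scale = genpareto.fit(exceedances, floc=0)
print(shape, scale)
```

The fitted shape parameter governs how heavy the tail is (near 0 here, as expected for exponential data); a density threshold for "abnormal" events can then be read off the fitted tail rather than the sparse empirical extremes.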