Corpus ID: 14526499

Sharing Features among Dynamical Systems with Beta Processes

@inproceedings{Fox2009SharingFA,
  title={Sharing Features among Dynamical Systems with Beta Processes},
  author={Emily B. Fox and Erik B. Sudderth and Michael I. Jordan and Alan S. Willsky},
  booktitle={NIPS},
  year={2009}
}
We propose a Bayesian nonparametric approach to the problem of modeling related time series. Using a beta process prior, our approach is based on the discovery of a set of latent dynamical behaviors that are shared among multiple time series. The size of the set and the sharing pattern are both inferred from data. We develop an efficient Markov chain Monte Carlo inference method that is based on the Indian buffet process representation of the predictive distribution of the beta process. In… 
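The inference method described above rests on the Indian buffet process (IBP) representation of the beta process's predictive distribution. As a minimal illustrative sketch (not the paper's actual MCMC sampler), here is the IBP generative process for a binary feature-assignment matrix in Python; `sample_ibp` and its parameters are names invented for this illustration:

```python
import numpy as np

def sample_ibp(alpha, num_customers, rng=None):
    """Draw a binary feature matrix from the Indian buffet process.

    Customer i (1-indexed) takes each previously sampled dish k with
    probability m_k / i, where m_k is the number of earlier customers
    who took dish k, then samples Poisson(alpha / i) brand-new dishes.
    """
    rng = np.random.default_rng(rng)
    dish_counts = []  # dish_counts[k] = customers who have taken dish k
    rows = []         # rows[i] = set of dish indices taken by customer i
    for i in range(1, num_customers + 1):
        # Revisit existing dishes in proportion to their popularity.
        taken = {k for k, m_k in enumerate(dish_counts)
                 if rng.random() < m_k / i}
        # Try a Poisson-distributed number of new dishes.
        for _ in range(rng.poisson(alpha / i)):
            taken.add(len(dish_counts))
            dish_counts.append(0)
        for k in taken:
            dish_counts[k] += 1
        rows.append(taken)
    # Pack into a dense binary matrix (customers x dishes).
    Z = np.zeros((num_customers, len(dish_counts)), dtype=int)
    for i, taken in enumerate(rows):
        Z[i, list(taken)] = 1
    return Z
```

In the paper's setting, row i of Z would indicate which latent dynamical behaviors time series i exhibits; the Poisson(α/i) term lets the number of shared features grow with the data, which is what makes the prior nonparametric.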


Joint Modeling of Multiple Related Time Series via the Beta Process
This work proposes a Bayesian nonparametric approach to jointly modeling multiple related time series, uses the sum-product algorithm to efficiently compute Metropolis-Hastings acceptance probabilities, and explores new dynamical behaviors via birth and death proposals.
Joint Modeling of Multiple Time Series via the Beta Process with Application to Motion Capture Segmentation
A Bayesian nonparametric approach to jointly modeling multiple related time series, developed on time series produced by motion capture sensors on the joints of people performing exercise routines; it demonstrates promising results on unsupervised segmentation of human motion capture data.
Effective Split-Merge Monte Carlo Methods for Nonparametric Models of Sequential Data
This work develops new Markov chain Monte Carlo methods for the beta process hidden Markov model (BP-HMM), enabling discovery of shared activity patterns in large video and motion capture databases, and introduces split-merge moves based on sequential allocation that allow tractable analysis of hundreds of time series.
Bayesian Nonparametric Methods for Learning Markov Switching Processes
A Bayesian nonparametric approach to learning Markov switching processes requires fewer assumptions about the underlying dynamics, thereby allowing the data to drive the complexity of the inferred model.
Discovering shared and individual latent structure in multiple time series
A nonparametric Bayesian method for exploratory data analysis and feature construction in continuous time series, applied to tracking the physiological signals of premature infants; it yields clinically significant insights as well as useful features for supervised learning tasks.
Reconstruction and prediction of random dynamical systems under borrowing of strength
We propose a Bayesian nonparametric model based on Markov Chain Monte Carlo (MCMC) methods for the joint reconstruction and prediction of discrete time stochastic dynamical systems, based on
Nonparametric discovery of activity patterns from video collections
  • M. Hughes, Erik B. Sudderth. 2012 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2012.
A nonparametric framework based on the beta process for discovering temporal patterns within a heterogeneous video collection is proposed, adding data-driven MCMC moves to improve inference on realistic datasets and allowing global sharing of behavior transition parameters.
Streaming dynamic and distributed inference of latent geometric structures
This work develops new models and algorithms for learning the temporal dynamics of topic polytopes and related geometric objects that arise in topic-model-based inference, exploiting a connection between topic polytope evolution, the beta-Bernoulli process, and the Hungarian matching algorithm.
Statistical Model Aggregation via Parameter Matching
A general meta-modeling framework that learns shared global latent structures by identifying correspondences among local model parameterizations; it is model-independent and applicable to a wide range of model types.
Joint reconstruction and prediction of random dynamical systems under borrowing of strength.
We propose a Bayesian nonparametric model based on Markov Chain Monte Carlo methods for the joint reconstruction and prediction of discrete time stochastic dynamical systems based on m-multiple
...

References

Showing 1-10 of 39 references
Nonparametric Bayesian Learning of Switching Linear Dynamical Systems
This work develops a sampling algorithm that combines a truncated approximation to the Dirichlet process with efficient joint sampling of the mode and state sequences, learning an unknown number of persistent, smooth dynamical modes.
Gaussian Process Dynamical Models for Human Motion
This work marginalizes out the model parameters in closed form by using Gaussian process priors for both the dynamical and the observation mappings, resulting in a nonparametric model for dynamical systems that accounts for uncertainty in the model.
An HDP-HMM for systems with state persistence
A sampling algorithm is developed that employs a truncated approximation of the DP to jointly resample the full state sequence, greatly improving mixing rates; experiments demonstrate the advantages of the sticky extension and the utility of the HDP-HMM in real-world applications.
Reversible jump Markov chain Monte Carlo computation and Bayesian model determination
Markov chain Monte Carlo methods for Bayesian computation have until recently been restricted to problems where the joint distribution of all variables has a density with respect to some fixed
Infinite latent feature models and the Indian buffet process
We define a probability distribution over equivalence classes of binary matrices with a finite number of rows and an unbounded number of columns. This distribution is suitable for use as a prior in
The Infinite Hidden Markov Model
We show that it is possible to extend hidden Markov models to have a countably infinite number of hidden states. By using the theory of Dirichlet processes we can implicitly integrate out the
Hierarchical Beta Processes and the Indian Buffet Process
This work defines Bayesian hierarchies of beta processes, develops posterior inference algorithms for the Indian buffet process, and presents an application to document classification that explores a relationship between the hierarchical beta process and smoothed naive Bayes models.
A dynamic Bayesian network approach to figure tracking using learned dynamic models
A novel DBN-based switching linear dynamic system (SLDS) model is described, together with an approximate Viterbi inference technique for overcoming the intractability of exact inference in mixed-state DBNs, and its application to figure motion analysis is presented.
Hierarchical Dirichlet Processes
This work considers problems involving groups of data where each observation within a group is a draw from a mixture model and where it is desirable to share mixture components between groups; it proposes a hierarchical model in which the base measure for the child Dirichlet processes is itself distributed according to a Dirichlet process.
Convergence rates of the Gibbs sampler, the Metropolis algorithm and other single-site updating dynamics
Sampling from a Markov random field Π can be performed efficiently via Monte Carlo methods by simulating a Markov chain that converges weakly to Π. We consider a class of local updating dynamics
...