# Expectation Maximization and Complex Duration Distributions for Continuous Time Bayesian Networks

```bibtex
@article{Nodelman2005ExpectationMA,
  title   = {Expectation Maximization and Complex Duration Distributions for Continuous Time Bayesian Networks},
  author  = {Uri Nodelman and Christian R. Shelton and Daphne Koller},
  journal = {ArXiv},
  year    = {2005},
  volume  = {abs/1207.1402}
}
```

Continuous time Bayesian networks (CTBNs) describe structured stochastic processes with finitely many states that evolve over continuous time. A CTBN is a directed (possibly cyclic) dependency graph over a set of variables, each of which represents a finite state continuous time Markov process whose transition model is a function of its parents. We address the problem of learning the parameters and structure of a CTBN from partially observed data. We show how to apply expectation maximization…
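To make the generative semantics concrete, here is a minimal sketch (not code from the paper; the variables, rates, and names are illustrative assumptions) of a two-node CTBN in which the intensity matrix of a child variable B is selected by the current state of its parent A, and a trajectory is sampled Gillespie-style:

```python
import random

# Hypothetical two-node CTBN: binary parent A, binary child B.
# Intensity matrix for A (no parents): Q[s][s'] is the rate of moving
# s -> s'; the diagonal is the negative row sum.
Q_A = [[-0.5, 0.5],
       [1.0, -1.0]]

# Conditional intensity matrices for B, one per state of its parent A.
Q_B = {
    0: [[-0.1, 0.1], [0.2, -0.2]],   # A = 0: B switches slowly
    1: [[-2.0, 2.0], [4.0, -4.0]],   # A = 1: B switches quickly
}

def sample_trajectory(t_end, seed=0):
    """Sample one trajectory: the next event time is exponential with the
    sum of the variables' leaving rates; a coin weighted by those rates
    decides which variable transitions."""
    rng = random.Random(seed)
    t, a, b = 0.0, 0, 0
    events = [(0.0, a, b)]
    while True:
        rate_a = -Q_A[a][a]            # total rate of A leaving state a
        rate_b = -Q_B[a][b][b]         # B's rate depends on parent A
        dt = rng.expovariate(rate_a + rate_b)
        if t + dt > t_end:
            break
        t += dt
        if rng.random() < rate_a / (rate_a + rate_b):
            a = 1 - a                  # binary variable: flip state
        else:
            b = 1 - b
        events.append((t, a, b))
    return events

traj = sample_trajectory(10.0)
```

With fully observed trajectories like these, the sufficient statistics (time spent in each state and transition counts, per parent configuration) yield closed-form maximum-likelihood intensities; the EM approach addressed by the paper fills in those statistics in expectation when trajectories are only partially observed.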

## 82 Citations

### Expectation Propagation for Continuous Time Bayesian Networks

- Computer Science, Mathematics
- UAI, 2005

This work shows how CTBNs can be parameterized within the exponential family and uses that insight to develop a message-passing scheme over cluster graphs that supports general queries conditioned on evidence over continuous time intervals and at discrete time points.

### Propagation for Dynamic Continuous Time Chain Event Graphs

- Computer Science
- ArXiv, 2020

This work presents a tractable exact inference scheme, analogous to the scheme of Kjaerulff (1992) for discrete Dynamic Bayesian Networks (DBNs), which applies standard junction-tree inference to an "unrolled" model via an extension of the standard CEG propagation algorithm.

### Importance Sampling for Continuous Time Bayesian Networks

- Computer Science
- J. Mach. Learn. Res., 2010

This paper first presents an approximate inference algorithm based on importance sampling, then extends it to continuous-time particle filtering and smoothing algorithms, and compares them to other approximate algorithms: expectation propagation and Gibbs sampling.

### Extending Inference in Continuous Time Bayesian Networks by Liessman

- Computer Science
- 2013

This prospectus proposes to formalize both uncertain and negative evidence in the context of CTBNs and to extend existing inference algorithms to support these new evidence types; it also shows how methods for sensitivity analysis of Markov processes can be applied to CTBNs while exploiting the conditional independence structure of the network.

### Making Continuous Time Bayesian Networks More Flexible

- Mathematics
- PGM, 2018

This paper generalizes the recently proposed hypoexponential continuous time Bayesian networks by allowing any number of hypoexponential variables, i.e., variables whose time duration follows a hypoexponential distribution, to be included.

### Representing Hypoexponential Distributions in Continuous Time Bayesian Networks

- Computer Science
- IPMU, 2018

This work proposes an extension that models the transitioning time as a hypoexponential distribution by introducing an additional hidden variable into continuous time Bayesian networks, allowing CTBNs to obtain memory, which standard CTBNs lack.
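As a concrete illustration of that construction (a sketch with illustrative rates, not code from the cited paper), a hypoexponential duration is the sum of independent exponential stages with possibly distinct rates, which is exactly what chaining a hidden variable through extra intermediate states realizes:

```python
import random

def sample_hypoexponential(rates, rng):
    """Pass through each exponential stage in turn; the total duration is
    the sum of the per-stage holding times."""
    return sum(rng.expovariate(r) for r in rates)

rng = random.Random(1)
# Two stages with rates 2.0 and 5.0: expected mean is 1/2.0 + 1/5.0 = 0.7.
samples = [sample_hypoexponential([2.0, 5.0], rng) for _ in range(100000)]
mean = sum(samples) / len(samples)
```

A single exponential duration is memoryless; chaining stages this way is what gives the extended model its memory of how long a state has been occupied.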

### Continuous-Time Bayesian Networks with Clocks

- Computer Science
- ICML, 2020

This work introduces a set of node-wise clocks to construct a collection of graph-coupled semi-Markov chains, provides algorithms for parameter and structure inference that exploit local dependencies, and reports experiments on synthetic data and on a dataset generated with a benchmark tool for gene regulatory networks.

### Tutorial on Structured Continuous-Time Markov Processes

- Mathematics
- J. Artif. Intell. Res., 2014

This work introduces continuous-time Markov process representations and algorithms for filtering, smoothing, expected sufficient statistics calculations, and model estimation, and provides the first connection between decision diagrams and continuous-time Bayesian networks.

### Mean Field Variational Approximation for Continuous-Time Bayesian Networks

- Computer Science
- J. Mach. Learn. Res., 2010

A mean field variational approximation is introduced in which a product of inhomogeneous Markov processes is used to approximate a joint distribution over trajectories; this yields a globally consistent distribution that can be queried efficiently.

## References

Showing 1-10 of 28 references.

### Expectation Propagation for Continuous Time Bayesian Networks

- Computer Science, Mathematics
- UAI, 2005

This work shows how CTBNs can be parameterized within the exponential family and uses that insight to develop a message-passing scheme over cluster graphs that supports general queries conditioned on evidence over continuous time intervals and at discrete time points.

### Learning Continuous Time Bayesian Networks

- Computer Science
- UAI, 2003

It is shown that CTBNs can provide a better fit to continuous-time processes than DBNs with a fixed time granularity, and can tailor the parameters and dependency structure to the different time granularities of the evolution of different variables.

### Extending Continuous Time Bayesian Networks

- Computer Science
- AAAI, 2005

The first extension models arbitrary transition time distributions using Erlang-Coxian approximations while maintaining tractable learning; the second is a general method for reasoning about negative evidence, introducing updates that assert no observable events occur over an interval of time.

### Continuous Time Bayesian Networks

- Computer Science
- UAI, 2002

A probabilistic semantics for the language in terms of the generative model a CTBN defines over sequences of events is presented, and an algorithm for approximate inference which takes advantage of the structure within the process is provided.

### Learning Belief Networks in the Presence of Missing Values and Hidden Variables

- Computer Science
- ICML, 1997

This paper proposes a new method for learning network structure from incomplete data based on an extension of the Expectation-Maximization (EM) algorithm for model selection problems that performs search for the best structure inside the EM procedure.

### A model for reasoning about persistence and causation

- Philosophy, Computer Science
- 1989

A model of causal reasoning that accounts for knowledge concerning cause-and-effect relationships and knowledge concerning the tendency for propositions to persist or not as a function of time passing is described.

### An iterative method for solution of the likelihood equations for incomplete data from exponential families

- Mathematics
- 1976

The paper deals with the numerical solution of the likelihood equations for incomplete data from exponential families, that is for data being a function of exponential family data. Illustrative…

### Fitting Phase-type Distributions via the EM Algorithm

- Computer Science
- 1996

An extended EM algorithm is used to minimize the information divergence (relative entropy) in the density approximation case, and fits to Weibull, log-normal, and Erlang distributions are used as illustrations of the latter.

### On cox processes and credit risky securities

- Economics
- 1998

It is shown how to generalize a model of Jarrow, Lando and Turnbull (1997) to allow for stochastic transition intensities between rating categories and into default, thereby reducing the technical issues of modeling credit risk.

### Recursive valuation of defaultable securities and the timing of resolution of uncertainty

- Economics
- 1996

We derive the implications of default risk for valuation of securities in an abstract setting in which the fractional default recovery rate and the hazard rate for default may depend on the market…