Corpus ID: 12412451

Exact Bayesian Structure Discovery in Bayesian Networks

@article{Koivisto2004ExactBS,
  title={Exact Bayesian Structure Discovery in Bayesian Networks},
  author={M. Koivisto and K. Sood},
  journal={J. Mach. Learn. Res.},
  year={2004},
  volume={5},
  pages={549-573}
}
Learning a Bayesian network structure from data is a well-motivated but computationally hard task. We present an algorithm that computes the exact posterior probability of a subnetwork, e.g., a directed edge; a modified version of the algorithm finds one of the most probable network structures. This algorithm runs in time O(n 2^n + n^{k+1} C(m)), where n is the number of network variables, k is a constant maximum in-degree, and C(m) is the cost of computing a single local marginal conditional…
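Since the abstract only states the running time, the following minimal Python sketch illustrates the kind of dynamic programming over variable subsets that yields the n 2^n factor. It assumes an order-modular prior and a user-supplied local_score callable; both the function names and the interface are illustrative placeholders, not the paper's actual implementation.

from itertools import combinations

def exact_evidence(n, local_score, max_indegree=2):
    # Hypothetical local_score(v, parents): local marginal likelihood of
    # variable v given a parent set, times the parent-set prior weight.
    # Its cost plays the role of the C(m) term in the abstract.
    variables = range(n)

    # alpha[v][S] = sum of local scores of v over all parent sets drawn
    # from S with at most max_indegree members (naive enumeration here;
    # the paper computes these sums with a truncated Moebius transform).
    alpha = [dict() for _ in variables]
    for v in variables:
        others = [u for u in variables if u != v]
        for r in range(len(others) + 1):
            for S in combinations(others, r):
                S = frozenset(S)
                alpha[v][S] = sum(
                    local_score(v, pa)
                    for k in range(min(max_indegree, len(S)) + 1)
                    for pa in combinations(sorted(S), k))

    # Forward pass over subsets: g(S) sums, over all orderings of S and
    # all DAGs consistent with each ordering, the product of local scores.
    g = {frozenset(): 1.0}
    for r in range(1, n + 1):
        for S in combinations(variables, r):
            S = frozenset(S)
            g[S] = sum(alpha[v][S - {v}] * g[S - {v}] for v in S)

    return g[frozenset(variables)]   # evidence (normalizing constant)

For instance, with a constant local_score of 1.0 and max_indegree=1, exact_evidence(3, lambda v, pa: 1.0, 1) returns 36, i.e., it counts (ordering, DAG) pairs; to reach the stated bound, the alpha sums would instead be computed once per variable with the truncated Moebius transform rather than re-enumerated per subset.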
Advances in Exact Bayesian Structure Discovery in Bayesian Networks
TLDR
It is shown that the posterior probabilities for all the n (n - 1) potential edges can be computed in O(n 2^n) total time, by a forward-backward technique and fast Moebius transform algorithms, which are of independent interest.
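As context for the O(n 2^n) claim above, here is a minimal sketch (not taken from the cited paper) of the standard fast zeta transform over subsets and its Moebius inverse, the building blocks that such forward-backward edge-posterior computations rely on; the bitmask encoding and function names are assumptions made for illustration.

def fast_zeta_transform(f, n):
    # f is a list of length 2**n indexed by bitmask subsets of an
    # n-element ground set; returns g with g[S] = sum over T subset of S of f[T].
    # Runs in O(n * 2^n) additions.
    g = list(f)
    for i in range(n):
        bit = 1 << i
        for S in range(1 << n):
            if S & bit:
                g[S] += g[S ^ bit]
    return g

def fast_moebius_transform(g, n):
    # Inverse of the zeta transform: recovers f from its subset sums.
    f = list(g)
    for i in range(n):
        bit = 1 << i
        for S in range(1 << n):
            if S & bit:
                f[S] -= f[S ^ bit]
    return f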
New skeleton-based approaches for Bayesian structure learning of Bayesian networks
TLDR
This work proposes two novel skeleton-based approaches to approximate a Bayesian solution to the BN learning problem: a new stochastic search which tries to find directed acyclic graph (DAG) structures with a non-negligible score; and a new Markov chain Monte Carlo method over the DAG space.
Learning Bayesian networks with local structure, mixed variables, and exact algorithms
TLDR
It is shown that under modest restrictions on the possible branchings in the tree structure, it is feasible to find a structure that maximizes a Bayes score in a range of moderate-size problem instances, which enables global optimization of the Bayesian network structure, including the local structure.
Computing Posterior Probabilities of Structural Features in Bayesian Networks
TLDR
An algorithm is developed that can compute the exact posterior probability of a subnetwork in O(3^n) time, and the posterior probabilities of all n(n - 1) potential edges in O(3^n) total time; it also assumes a bounded in-degree but allows general structure priors.
Parallel globally optimal structure learning of Bayesian networks
TLDR
This paper presents a parallel algorithm for exact structure learning of a Bayesian network that is communication-efficient and work-optimal up to O((1/n)·2^n) processors, and presents experimental results that characterize run-time behavior with respect to the number of variables, the number of observations, and the bound on in-degree.
A Parallel Algorithm for Exact Bayesian Structure Discovery in Bayesian Networks
TLDR
This work presents a parallel algorithm capable of computing the exact posterior probabilities for all n(n - 1) edges with optimal parallel space efficiency and nearly optimal parallel time efficiency, and applies it to a biological data set for discovering the yeast pheromone response pathways.
A parallel algorithm for exact Bayesian network inference
TLDR
This paper presents a parallel algorithm for exact Bayesian inference that is work-optimal and communication-efficient, and demonstrates the applicability of the method by an implementation on the IBM Blue Gene/L, with experimental results that exhibit near perfect scaling.
Algorithms for Exact Structure Discovery in Bayesian Networks
TLDR
This thesis contributes to two areas of structure discovery in Bayesian networks: space–time tradeoffs and learning ancestor relations.
Finding Optimal Bayesian Network Given a Super-Structure
Classical approaches used to learn Bayesian network structure from data have disadvantages in terms of complexity and lower accuracy of their results. However, a recent empirical study has shown that…
Structural learning of bayesian networks using statistical constraints
Bayesian Networks are probabilistic graphical models that encode in a compact manner the conditional probabilistic relations over a set of random variables. In this thesis we address the NP-complete…

References

Showing 1-10 of 31 references
Being Bayesian About Network Structure. A Bayesian Approach to Structure Discovery in Bayesian Networks
TLDR
This paper shows how to efficiently compute a sum over the exponential number of networks that are consistent with a fixed order over network variables, and uses this result as the basis for an algorithm that approximates the Bayesian posterior of a feature.
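The fixed-order summation mentioned in this summary rests on a simple factorization; the sketch below (with an illustrative local_score placeholder and bounded in-degree, not the authors' code) shows how, for one given ordering, the sum over all consistent DAGs reduces to a product of per-variable sums over admissible parent sets.

from itertools import combinations

def sum_over_dags_for_order(order, local_score, max_indegree=2):
    # order: sequence of variable indices; local_score(v, parents) is a
    # placeholder for the local marginal likelihood times the parent prior.
    total = 1.0
    predecessors = []
    for v in order:
        # Every DAG consistent with the order picks v's parents from its
        # predecessors independently, so the sum factorizes per variable.
        per_variable = sum(
            local_score(v, pa)
            for k in range(min(max_indegree, len(predecessors)) + 1)
            for pa in combinations(predecessors, k))
        total *= per_variable
        predecessors.append(v)
    return total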
A Bayesian Approach to Learning Bayesian Networks with Local Structure
TLDR
A Bayesian approach to learning Bayesian networks that contain the more general decision-graph representations of the CPDs is investigated, and how to evaluate the posterior probability (that is, the Bayesian score) of such a network, given a database of observed cases, is described.
Learning Bayesian Networks: Search Methods and Experimental Results
TLDR
A metric for computing the relative posterior probability of a network structure given data, developed by Heckerman et al. (1994a,b,c), is described; the metric has a property useful for inferring causation from data.
Learning Bayesian Networks: The Combination of Knowledge and Statistical Data
TLDR
A methodology for assessing informative priors needed for learning Bayesian networks from a combination of prior knowledge and statistical data is developed, and how to compute the relative posterior probabilities of network structures given data is shown.
On Inclusion-Driven Learning of Bayesian Networks
TLDR
This paper introduces a condition for traversal operators, the inclusion boundary condition, which guarantees that the search strategy can avoid local maxima, and carries out a set of experiments that show empirically the benefit of striving for the inclusion order when learning Bayesian networks from data.
Optimal Structure Identification With Greedy Search
TLDR
This paper proves the so-called "Meek Conjecture", which shows that if a DAG H is an independence map of another DAG G, then there exists a finite sequence of edge additions and covered edge reversals in G such that H remains an independence map of G and after all modifications G = H.
Searching for Bayesian Network Structures in the Space of Restricted Acyclic Partially Directed Graphs
TLDR
This paper proposes a new local search method that uses a different search space, and which takes account of the concept of equivalence between network structures: restricted acyclic partially directed graphs (RPDAGs).
A Bayesian method for the induction of probabilistic networks from data
TLDR
This paper presents a Bayesian method for constructing probabilistic networks from databases, focusing on constructing Bayesian belief networks, and extends the basic method to handle missing data and hidden variables.
Computer-based probabilistic-network construction
TLDR
This dissertation demonstrates that nonparametric, efficient, computer-based algorithms for determining the important associations among variables in a domain are conceptually feasible, robust to noise, computationally efficient, theoretically sound, and that they generate models that can classify new cases accurately.
A Bayesian Approach to Causal Discovery
We examine the Bayesian approach to the discovery of causal DAG models and compare it to the constraint-based approach. Both approaches rely on the Causal Markov condition, but the two differ…