Corpus ID: 6504017

An Improved Admissible Heuristic for Learning Optimal Bayesian Networks

@inproceedings{Yuan2012AnIA,
  title={An Improved Admissible Heuristic for Learning Optimal Bayesian Networks},
  author={Changhe Yuan and Brandon M. Malone},
  booktitle={UAI},
  year={2012}
}
Recently, two search algorithms, A* and breadth-first branch and bound (BFBnB), were developed based on a simple admissible heuristic for learning Bayesian network structures that optimize a scoring function. The heuristic represents a relaxation of the learning problem in which each variable chooses its optimal parents independently. As a result, the relaxed solution may contain many directed cycles, and the heuristic may yield a loose bound. This paper introduces an improved admissible heuristic that tries to avoid…
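
To make the relaxation concrete, here is a minimal sketch (not the authors' implementation) of that simple heuristic: for the variables not yet covered by a search state, it sums the score of each variable's best parent set chosen from among all other variables, ignoring acyclicity. The function name and the best_scores data structure are assumptions for illustration; scores are treated as costs to be minimized (e.g., MDL).

def simple_heuristic(unassigned, best_scores, all_variables):
    # Relaxed lower bound: every unassigned variable independently picks its
    # best-scoring parent set from among ALL other variables, so the combined
    # "solution" may contain directed cycles and the bound can be loose.
    # best_scores is assumed to map each variable to
    # {frozenset(parent_set): local_score}.
    h = 0.0
    for x in unassigned:
        candidates = all_variables - {x}      # acyclicity constraint is ignored
        h += min(score
                 for parents, score in best_scores[x].items()
                 if parents <= candidates)    # best parent set for x alone
    return h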

Citations

An Improved Lower Bound for Bayesian Network Structure Learning

A new partition method based on information extracted from the potential optimal parent sets (POPS) of the variables of a data set can significantly improve the efficiency and scalability of heuristic search-based structure learning algorithms.

Learning Optimal Bayesian Networks: A Shortest Path Perspective

An A* search algorithm that learns an optimal Bayesian network structure by only searching the most promising part of the solution space and a heuristic function that reduces the amount of relaxation by avoiding directed cycles within some groups of variables.

Finding Optimal Bayesian Network Structures with Constraints Learned from Data

The observation is made that there is useful information implicit in the POPS, and solving the resulting constrained subproblems is shown to significantly improve the efficiency and scalability of heuristic search-based structure learning algorithms.

Tightening Bounds for Bayesian Network Structure Learning

Methods are introduced for tightening the bounds of a breadth-first branch and bound algorithm by using more informed variable groupings when creating the pattern databases and by using an anytime learning algorithm.

A Depth-First Branch and Bound Algorithm for Learning Optimal Bayesian Networks

This work presents a new depth-first branch and bound algorithm that finds increasingly better solutions and eventually converges to an optimal Bayesian network upon completion, and that in some cases proves the optimality of these solutions about 10 times faster.

Evaluating Anytime Algorithms for Learning Optimal Bayesian Networks

Several anytime heuristic search-based algorithms are adapted to learn Bayesian networks, showing that the anytime window A* algorithm usually finds higher-quality, often optimal, networks more quickly than other approaches, and that, surprisingly, networks with few parents per variable, while structurally simpler, are harder to learn.

On Pruning for Score-Based Bayesian Network Structure Learning

New non-trivial theoretical upper bounds for the BDeu score are derived that considerably improve on the state-of-the-art and are a promising addition to BNSL methods.

Advances in Bayesian Network Learning using Integer Programming

After relating this BN learning problem to set covering and the multidimensional 0-1 knapsack problem, the various steps taken to allow efficient solving of this IP are described.

Learning Bayesian Networks with Thousands of Variables

A novel algorithm that effectively explores the space of possible parent sets of a node using an approximated score function computed in constant time, together with an improvement of an existing ordering-based algorithm for structure optimization.

References

Showing 1-10 of 23 references

Ordering-Based Search: A Simple and Effective Algorithm for Learning Bayesian Networks

It is shown that ordering-based search outperforms the standard baseline, and is competitive with recent algorithms that are much harder to implement.

Improving the Scalability of Optimal Bayesian Network Learning with External-Memory Frontier Breadth-First Branch and Bound Search

This paper develops a memory-efficient heuristic search algorithm for learning the structure of a Bayesian network that leverages the layered structure of the search graph of this problem so that no more than two layers of the graph need to be stored in memory at a time.

A Branch-and-Bound Algorithm for MDL Learning Bayesian Networks

This paper presents an efficient depth-first branch-and-bound algorithm for learning Bayesian network structures, based on the minimum description length (MDL) principle, for a given (consistent) variable ordering.

Learning Optimal Bayesian Networks Using A* Search

Empirical results show that the A* search algorithm significantly improves the time and space efficiency of existing methods on a set of benchmark datasets.

Learning Bayesian Network Structure using LP Relaxations

This work proposes to solve the combinatorial problem of finding the highest-scoring Bayesian network structure from data by maintaining an outer-bound approximation to the polytope and iteratively tightening it by searching over a new class of valid constraints.

A Simple Approach for Finding the Globally Optimal Bayesian Network Structure

It is shown that it is possible to learn the best Bayesian network structure with over 30 variables, which covers many practically interesting cases and offers a possibility for efficient exploration of the best networks consistent with different variable orderings.

Efficient Structure Learning of Bayesian Networks using Constraints

  Cassio Polpo de Campos and Q. Ji, J. Mach. Learn. Res., 2011
A branch-and-bound algorithm is presented that integrates structural constraints with data in a way that guarantees global optimality, and the benefits of using these properties both with state-of-the-art methods and with the new algorithm, which can handle larger data sets than before, are demonstrated.

Finding optimal Bayesian networks by dynamic programming

This paper describes a "merely" exponential space/time algorithm for finding a Bayesian network that corresponds to a global maximum of a decomposable scoring function, such as BDeu or BIC.
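
As a rough illustration of that dynamic programming idea, the sketch below (an assumption-laden paraphrase, not the paper's code) applies the standard recurrence: the best score over a subset of variables is obtained by choosing one variable as the sink of that subnetwork and recursing on the remaining subset. Here best_parent_score(x, candidates) is a hypothetical helper returning the best local score of x with parents drawn from candidates, and the decomposable score (e.g., BDeu or BIC) is assumed to be maximized.

from functools import lru_cache

def optimal_network_score(variables, best_parent_score):
    # Dynamic programming over subsets: the best network over `subset` is
    # obtained by picking one variable x as the sink, letting it take its best
    # parents from the rest, and solving the rest recursively.
    variables = frozenset(variables)

    @lru_cache(maxsize=None)
    def best(subset):
        if not subset:
            return 0.0
        return max(best(subset - {x}) + best_parent_score(x, subset - {x})
                   for x in subset)

    return best(variables)   # exponential time and space in the number of variables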

Exact Bayesian Structure Discovery in Bayesian Networks

This work presents an algorithm that computes the exact posterior probability of a subnetwork, e.g., a directed edge, and shows that also in domains with a large number of variables, exact computation is feasible, given suitable a priori restrictions on the structures.

Bayesian network learning with cutting planes

The problem of learning the structure of Bayesian networks from complete discrete data with a limit on parent set size is considered, and the cutting-plane approach is shown to be a particularly fast method for exact BN learning.