• Corpus ID: 168170130

Learning Bayesian Networks with Low Rank Conditional Probability Tables

@article{Barik2019LearningBN,
  title={Learning Bayesian Networks with Low Rank Conditional Probability Tables},
  author={Adarsh Barik and Jean Honorio},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.12552}
}
In this paper, we provide a method to learn the directed structure of a Bayesian network using data. The data is accessed by making conditional probability queries to a black-box model. We introduce a notion of simplicity of representation of conditional probability tables for the nodes in the Bayesian network, which we call "low rankness". We connect this notion to the Fourier transform of real-valued set functions and propose a method which learns the exact directed structure of a "low…
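The abstract's link between low-rank CPTs and the Fourier transform of set functions can be illustrated generically. The sketch below is not the paper's algorithm and its function names are our own; it computes, by brute force, the Boolean Fourier (Walsh-Hadamard) coefficients of a real-valued set function. A function that depends on only a few variables concentrates all of its Fourier mass on subsets of those variables, which is the kind of structural simplicity such methods exploit.

```python
import itertools

def fourier_coefficients(f, n):
    """Brute-force Boolean Fourier (Walsh-Hadamard) coefficients of a
    real-valued set function f: subsets of {0, ..., n-1} -> R.

    Uses the parity characters chi_S(X) = (-1)**|S & X|, so
    hat_f(S) = 2**(-n) * sum_X f(X) * (-1)**|S & X|.
    """
    subsets = [frozenset(c) for r in range(n + 1)
               for c in itertools.combinations(range(n), r)]
    return {S: sum(f(X) * (-1) ** len(S & X) for X in subsets) / 2 ** n
            for S in subsets}

# A set function that depends only on whether element 0 is present:
# its Fourier support is confined to the subsets {} and {0}.
coeffs = fourier_coefficients(lambda X: 1.0 if 0 in X else 0.0, n=3)
```

For this indicator function the only nonzero coefficients are hat_f({}) = 0.5 and hat_f({0}) = -0.5; every subset touching variables 1 or 2 gets coefficient 0, mirroring how dependence on few variables shows up as sparsity in the Fourier domain.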

Figures and Tables from this paper

Citations

Efficient Bayesian network structure learning via local Markov boundary search
TLDR: The approach is information-theoretic and uses a local Markov boundary search procedure to recursively construct ancestral sets in the underlying graphical model, shedding light on the minimal assumptions required to efficiently learn the structure of directed graphical models from data.
On Low Rank Directed Acyclic Graphs
  • Computer Science
  • 2020
TLDR: It is demonstrated how to adapt existing methods for causal structure learning to take advantage of a low rank assumption on the (weighted) adjacency matrix of a DAG causal model, and several useful results are established relating interpretable graphical conditions to the low rank assumption.
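As a minimal sketch of the low rank assumption this entry refers to (not code from the cited work), the weighted adjacency matrix of a DAG can have rank far below the number of nodes, for example a two-layer graph whose between-layer weights factor through r latent dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)
d1, d2, r = 5, 5, 2  # two layers of nodes, target rank r

# Between-layer weights factor through r dimensions, so rank(A) <= r.
A = rng.normal(size=(d1, r)) @ rng.normal(size=(r, d2))

# Edges run only from layer 1 to layer 2, so W is acyclic by construction,
# and rank(W) = rank(A) <= r even though W is (d1+d2) x (d1+d2).
W = np.zeros((d1 + d2, d1 + d2))
W[:d1, d1:] = A
```

Methods built on this assumption can search over the two factors of W rather than over all of its entries, which is the source of the claimed efficiency gains.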
Attribute Relation Modeling for Pulmonary Nodule Malignancy Reasoning
TLDR: A hybrid machine learning framework is proposed, consisting of two relation-modeling modules, an Attribute Graph Network and a Bayesian Network, which exploit attributes and the correlations and dependencies among them to improve the classification performance of pulmonary nodules.

References

SHOWING 1-10 OF 33 REFERENCES
Bayesian Network Induction via Local Neighborhoods
TLDR: This work presents an efficient algorithm for learning Bayes networks from data by first identifying each node's Markov blanket and then connecting nodes in a maximally consistent way, and proves that under mild assumptions the approach requires time polynomial in the size of the data and the number of nodes.
Learning Bayesian Networks is NP-Complete
TLDR: It is shown that the search problem of identifying a Bayesian network (among those where each node has at most K parents) that has a relative posterior probability greater than a given constant is NP-complete when the BDe metric is used.
Exact Bayesian Structure Discovery in Bayesian Networks
TLDR: This work presents an algorithm that computes the exact posterior probability of a subnetwork, e.g., a directed edge, and shows that exact computation is feasible even in domains with a large number of variables, given suitable a priori restrictions on the structures.
Learning Bayesian Network Structure from Massive Datasets: The "Sparse Candidate" Algorithm
TLDR: An algorithm that achieves faster learning by restricting the search space: the parents of each variable are restricted to a small subset of candidates. The algorithm is evaluated on both synthetic and real-life data.
Learning Bayesian Network Structure using LP Relaxations
TLDR: This work proposes to solve the combinatorial problem of finding the highest-scoring Bayesian network structure from data by maintaining an outer-bound approximation to the polytope and iteratively tightening it by searching over a new class of valid constraints.
A Simple Approach for Finding the Globally Optimal Bayesian Network Structure
TLDR: It is shown that it is possible to learn the best Bayesian network structure with over 30 variables, which covers many practically interesting cases and offers the possibility of efficient exploration of the best networks consistent with different variable orderings.
Learning Bayesian networks from data: An information-theory based approach
Active Learning for Structure in Bayesian Networks
TLDR: Experimental results show that active learning can substantially reduce the number of observations required to determine the structure of a domain.
Computationally and statistically efficient learning of causal Bayes nets using path queries
TLDR: This paper proposes a polynomial-time algorithm for learning, with high probability, the exact correctly-oriented structure of the transitive reduction of any causal Bayesian network by using interventional path queries, and shows how to learn the transitive edges with logarithmic sample complexity as well.
Exact Bayesian structure learning from uncertain interventions
We show how to apply the dynamic programming algorithm of Koivisto and Sood [KS04, Koi06], which computes the exact posterior marginal edge probabilities p(Gij = 1|D) of a DAG G given data D, to the …