Staged trees and asymmetry-labeled DAGs
@article{Varando2021StagedTA,
  title   = {Staged trees and asymmetry-labeled DAGs},
  author  = {Gherardo Varando and Federico Carli and Manuele Leonelli},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2108.01994}
}
Bayesian networks are a widely used class of probabilistic graphical models capable of representing symmetric conditional independence between variables of interest using the topology of the underlying graph. For categorical variables, they can be seen as a special case of the much more general class of models called staged trees, which can represent any type of non-symmetric conditional independence. Here we formalize the relationship between these two models and introduce a minimal Bayesian…
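The distinction the abstract draws can be made concrete with a small sketch (not taken from the paper; the variables and probabilities are hypothetical). A Bayesian network independence Z ⊥ Y | X must hold for every value of X, whereas staged trees can also encode independences that hold only in some contexts, e.g. only when X = 0:

```python
# Hypothetical CPT for binary Z given binary (X, Y): (x, y) -> P(Z=1 | x, y).
cpt_z = {
    (0, 0): 0.3,  # when X = 0, P(Z=1) is the same for both values of Y ...
    (0, 1): 0.3,  # ... so Z is independent of Y in the context X = 0
    (1, 0): 0.2,  # when X = 1, P(Z=1) changes with Y,
    (1, 1): 0.9,  # so no independence holds in this context
}

def is_ci_in_context(cpt, x):
    """True iff P(Z=1 | X=x, Y=y) is constant in y (context-specific CI)."""
    probs = {p for (xv, y), p in cpt.items() if xv == x}
    return len(probs) == 1

print(is_ci_in_context(cpt_z, 0))  # True:  Z ⊥ Y in the context X = 0
print(is_ci_in_context(cpt_z, 1))  # False: Z depends on Y when X = 1
```

A Bayesian network over {X, Y, Z} must either assert Z ⊥ Y | X for all x or drop the independence entirely; a staged tree can represent exactly this asymmetric pattern.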
6 Citations
Structural Learning of Simple Staged Trees
- Computer Science, ArXiv
- 2022
Data-learned simple staged trees are shown to often outperform Bayesian networks in model fit, and the use of the coalesced graph to identify non-symmetric conditional independences is illustrated.
Learning and interpreting asymmetry-labeled DAGs: a case study on COVID-19 fear
- Computer Science, ArXiv
- 2023
Novel structural learning algorithms are introduced for asymmetry-labeled DAGs to allow for a straightforward interpretation of the underlying dependence structure of Bayesian networks.
A new class of generative classifiers based on staged tree models
- Computer Science, ArXiv
- 2020
Staged tree classifiers are introduced, which formally account for context-specific independence in generative models and are constructed by a partitioning of the vertices of an event tree from which conditional independence can be formally read.
Highly Efficient Structural Learning of Sparse Staged Trees
- Computer Science, PGM
- 2022
This work introduces the first scalable structural learning algorithm for staged trees, which searches over a space of models where only a small number of dependencies can be imposed.
The curved exponential family of a staged tree
- Mathematics, Electronic Journal of Statistics
- 2022
Staged tree models are a discrete generalization of Bayesian networks. We show that these form curved exponential families and derive their natural parameters, sufficient statistic, and…
Context-Specific Causal Discovery for Categorical Data Using Staged Trees
- Computer Science, ArXiv
- 2021
New causal discovery algorithms based on staged tree models, which can represent complex and non-symmetric causal effects, are introduced. A new distance, inspired by the widely used structural interventional distance, is also introduced to quantify the closeness between two staged trees in terms of their corresponding causal inference statements.
References
SHOWING 1-10 OF 57 REFERENCES
Structural Learning of Simple Staged Trees
- Computer Science, ArXiv
- 2022
Data-learned simple staged trees are shown to often outperform Bayesian networks in model fit, and the use of the coalesced graph to identify non-symmetric conditional independences is illustrated.
Labeled directed acyclic graphs: a generalization of context-specific independence in directed graphical models
- Computer Science, Mathematics, Data Mining and Knowledge Discovery
- 2014
A novel class of labeled directed acyclic graph (LDAG) models for finite sets of discrete variables is developed, together with a novel prior distribution over model structures that can appropriately penalize a model for its labeling complexity.
Structure Learning for Bayesian Networks over Labeled DAGs
- Computer Science, PGM
- 2018
The first constraint-based learning method for LDAGs is presented, which employs branch and bound for the especially computationally demanding task of local score calculation, after which exact DAG search can be used.
The role of local partial independence in learning of Bayesian networks
- Computer Science, Int. J. Approx. Reason.
- 2016
Conditional independence and chain event graphs
- Computer Science, Artif. Intell.
- 2008
A new class of generative classifiers based on staged tree models
- Computer Science, ArXiv
- 2020
Staged tree classifiers are introduced, which formally account for context-specific independence in generative models and are constructed by a partitioning of the vertices of an event tree from which conditional independence can be formally read.
Context-Specific Independence in Bayesian Networks
- Computer Science, UAI
- 1996
This paper proposes a formal notion of context-specific independence (CSI), based on regularities in the conditional probability tables (CPTs) at a node, and proposes a technique, analogous to (and based on) d-separation, for determining when such independence holds in a given network.
LEARNING STRUCTURED BAYESIAN NETWORKS: COMBINING ABSTRACTION HIERARCHIES AND TREE‐STRUCTURED CONDITIONAL PROBABILITY TABLES
- Computer Science, Comput. Intell.
- 2008
Tree-Abstraction-Based Search (TABS) is introduced, an approach for learning a data distribution by inducing the graph structure and parameters of a BN from training data that combines tree structure and attribute-value hierarchies to compactly represent conditional probability tables.
A Separation Theorem for Chain Event Graphs
- Mathematics, ArXiv
- 2015
This paper introduces a separation theorem for CEGs, analogous to the d-separation theorem for BNs, which likewise allows an analyst to identify the conditional independence structure of their model from the topology of the graph.
A Logical Approach to Context-Specific Independence
- Computer Science, WoLLIC
- 2016
This article defines an analogue of dependence logic suitable to express context-specific independence, studies its basic properties, and considers the problem of finding inference rules for deriving non-local CSI and CI statements that logically follow from the structure of an LDAG but are not explicitly encoded by it.