Context-specific independence in graphical log-linear models

@article{Nyman2016ContextspecificII,
  title={Context-specific independence in graphical log-linear models},
  author={Henrik J. Nyman and Johan Pensar and Timo Koski and Jukka Corander},
  journal={Computational Statistics},
  year={2016},
  volume={31},
  pages={1493--1512}
}
Log-linear models are popular workhorses for the analysis of contingency tables. A log-linear parameterization of an interaction model can be more expressive than a direct parameterization based on probabilities, leading to a powerful way of defining restrictions derived from marginal, conditional, and context-specific independence. However, parameter estimation is often simpler under a direct parameterization, provided that the model enjoys certain decomposability properties. Here we introduce a…
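The abstract above distinguishes marginal, conditional, and context-specific independence. As a hedged illustration of the last notion (all numbers and variable names below are made up for the example, not taken from the paper), the following Python sketch builds a small joint distribution over binary X, Y, Z in which X and Y are independent in the context Z = 0 but dependent in the context Z = 1:

```python
import itertools

# Illustrative joint distribution p[(x, y, z)] over three binary variables.
p = {}
# In context z = 0, X and Y are independent: p(x, y | z=0) = p(x | z=0) p(y | z=0).
for x, y in itertools.product([0, 1], repeat=2):
    p[(x, y, 0)] = 0.5 * (0.3 if x == 0 else 0.7) * (0.4 if y == 0 else 0.6)
# In context z = 1, X and Y interact (probabilities do not factorize).
p[(0, 0, 1)] = 0.5 * 0.40
p[(0, 1, 1)] = 0.5 * 0.10
p[(1, 0, 1)] = 0.5 * 0.10
p[(1, 1, 1)] = 0.5 * 0.40

def conditionally_independent(p, z, tol=1e-9):
    """Check p(x, y | z) == p(x | z) * p(y | z) in the single context Z = z."""
    pz = sum(v for (x, y, zz), v in p.items() if zz == z)
    for x, y in itertools.product([0, 1], repeat=2):
        pxy = p[(x, y, z)] / pz
        px = sum(p[(x, yy, z)] for yy in [0, 1]) / pz
        py = sum(p[(xx, y, z)] for xx in [0, 1]) / pz
        if abs(pxy - px * py) > tol:
            return False
    return True

print(conditionally_independent(p, 0))  # True: independence holds in context z = 0
print(conditionally_independent(p, 1))  # False: it fails in context z = 1
```

Ordinary conditional independence X ⊥ Y | Z would require the factorization to hold in every context of Z; context-specific independence, as studied in the paper, requires it only for particular values of the conditioning variables.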
Context-specific independencies in stratified chain regression graphical models
TLDR
This work proposes a chain graphical model able to represent context-specific independencies, i.e. conditional independencies that hold only for particular values of the variables in the conditioning set, through labeled arcs. It also provides the Markov properties needed to read marginal, conditional, and context-specific independencies from this new chain graph.
Context-specific independence in graphical models
TLDR
The models introduced in this thesis enable the graphical representation of context-specific independencies, i.e. conditional independencies that hold only in a subset of the outcome space of the conditioning variables.
Log-linear models independence structure comparison
TLDR
This work presents a measure for the direct comparison of the independence structures of log-linear models, inspired by the Hamming distance comparison method used in undirected graphical models, and can be efficiently computed in terms of the number of variables of the domain, and is proven to be a distance metric.
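The TLDR above says the measure is inspired by the Hamming-distance comparison used for undirected graphical models. As a minimal sketch of that underlying graph comparison (the edge sets and variable names are hypothetical; the paper's actual measure operates on log-linear independence structures rather than raw edge sets), the distance between two undirected graphs can be counted as the size of the symmetric difference of their edge sets:

```python
# Hedged sketch: Hamming-style distance between two undirected graphs,
# counted as the number of edges present in exactly one of the two graphs.
def hamming_distance(edges_a, edges_b):
    """Symmetric-difference size between two undirected edge sets."""
    norm = lambda e: frozenset(e)  # undirected: (u, v) is the same edge as (v, u)
    a = {norm(e) for e in edges_a}
    b = {norm(e) for e in edges_b}
    return len(a ^ b)

g1 = [("X", "Y"), ("Y", "Z")]
g2 = [("Y", "X"), ("X", "Z")]  # shares the X-Y edge with g1, written reversed
print(hamming_distance(g1, g2))  # 2: Y-Z is only in g1, X-Z is only in g2
```

This count is symmetric, zero exactly when the edge sets coincide, and satisfies the triangle inequality, which is the distance-metric property the TLDR mentions proving for the log-linear analogue.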
Structure learning of context-specific graphical models
TLDR
Numerical experiments show that the increased flexibility of context-specific structures can more accurately emulate the dependence structure among the variables and thereby improve the predictive accuracy of the models.
Hierarchical Aitchison-Silvey models for incomplete binary sample spaces
Structure Learning of Contextual Markov Networks using Marginal Pseudo-likelihood
TLDR
The marginal pseudo-likelihood is introduced as an analytically tractable criterion for general contextual Markov networks and is shown to yield a consistent structure estimator.
Context-Specific and Local Independence in Markovian Dependence Structures
TLDR
Context-specific independence is reviewed in different classes of Markovian probability models, both for static variables and for spatially or temporally organized ones, including Bayesian networks, Markov networks, and higher-order Markov chains.
Context-specific independencies in hierarchical multinomial marginal models
TLDR
This paper considers the hierarchical multinomial marginal models and provides several original results about the representation of context-specific independencies through these models.
Context-specific Independence in Innovation Study
TLDR
This work focuses on so-called context-specific independence, where conditional independence holds only in a subspace of the outcome space, and proposes a graphical representation of all the considered independencies that takes advantage of the chain graph model.
Blankets Joint Posterior score for learning irregular Markov network structures
TLDR
The Blankets Joint Posterior score is designed for computing the posterior probability of structures given data and can improve the learning process when the solution structure is irregular, which is a property present in many real-world networks.

References

Showing 1-10 of 27 references
Stratified Graphical Models - Context-Specific Independence in Graphical Models
TLDR
A method for Bayesian learning of stratified graphical models is developed by deriving an analytical expression for the marginal likelihood of data under a specific subclass of decomposable stratified models.
Split models for contingency tables
Labeled directed acyclic graphs: a generalization of context-specific independence in directed graphical models
TLDR
A novel class of labeled directed acyclic graph (LDAG) models is developed for finite sets of discrete variables, together with a novel prior distribution over model structures that can appropriately penalize a model for its labeling complexity.
Decomposable graphical Gaussian model determination
TLDR
A hyper inverse Wishart prior distribution is considered on the concentration matrix for each given graph, containing only the elements for which the corresponding element of the inverse is nonzero; this allows all computations to be performed locally, at the clique level, which is a clear advantage for the analysis of large and complex datasets.
Labelled Graphical Models
A class of log-linear models, referred to as labelled graphical models (LGMs), is introduced for multinomial distributions. These models generalize graphical models (GMs) by employing partial…
Statistical Inference in Context Specific Interaction Models for Contingency Tables
Abstract. Context-specific interaction models are a class of interaction models for contingency tables in which interaction terms are allowed to vanish in specific contexts given by the levels of…
Markov chain Monte Carlo model determination for hierarchical and graphical log-linear models
We use reversible jump Markov chain Monte Carlo methods (Green, 1995) to develop strategies for calculating posterior probabilities of hierarchical, graphical or decomposable log-linear models for…
Marginal and simultaneous predictive classification using stratified graphical models
TLDR
The theoretical results are extended to predictive classifiers that acknowledge feature dependencies, either through graphical models or through sparser alternatives defined as stratified graphical models, and show the potential to substantially improve classification accuracy compared with both standard discriminative classifiers and the predictive classifier based solely on conditionally independent features.
Improving Markov Chain Monte Carlo Model Search for Data Mining
TLDR
This paper is motivated by the application of MCMC model scoring procedures to data mining problems involving a large number of competing models and other relevant model choice aspects; the proposed MC3 algorithm provides an efficient general framework for computations with both directed acyclic graph (DAG) models and undirected decomposable graph (UDG) models.