• Corpus ID: 88519868

Synergy, suppression and immorality: forward differences of the entropy function

Joe Whittaker, Florian Martin and Yang Xiang
arXiv: Methodology
Conditional mutual information is important in the selection and interpretation of graphical models. Its empirical version is well known as a generalised likelihood ratio test and may be represented as a difference of entropies. We consider the forward difference expansion of the entropy function defined on all subsets of the variables under study. The elements of this expansion are invariant to permutation of their suffices and relate higher-order mutual informations to lower-order ones… 
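As a concrete illustration of the entropy-difference representation, the sketch below (illustrative Python, not the paper's code; the function name `H` and the XOR example are this note's own choices) computes marginal and conditional mutual information for three binary variables with Z = X xor Y, the textbook case of synergy: X and Y are marginally independent, yet carry one bit of information about each other once Z is known.

```python
import numpy as np

def H(p, axes):
    """Entropy in bits of the variables indexed by `axes`,
    under the joint pmf `p` (a numpy array, one axis per variable)."""
    other = tuple(i for i in range(p.ndim) if i not in axes)
    m = p.sum(axis=other)          # marginalise out the other variables
    m = m[m > 0]                   # convention: 0 * log 0 = 0
    return float(-(m * np.log2(m)).sum())

# XOR joint: X, Y iid fair coins, Z = X xor Y; each consistent triple has mass 1/4
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p[x, y, x ^ y] = 0.25

X, Y, Z = 0, 1, 2
# Mutual information as a difference of entropies
I_XY = H(p, (X,)) + H(p, (Y,)) - H(p, (X, Y))                              # 0 bits
I_XY_given_Z = H(p, (X, Z)) + H(p, (Y, Z)) - H(p, (Z,)) - H(p, (X, Y, Z))  # 1 bit
```

Conditioning on Z raises the mutual information from zero to one bit, the synergistic sign pattern discussed in the paper.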

The Inequality Between the Coefficient of Determination and the Sum of Squared Simple Correlation Coefficients

The inequality between the coefficient of determination and the sum of two squared simple correlation coefficients in a two-variable regression model is reexamined through two relative measures.
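The reversal of that inequality is what suppression looks like numerically. A minimal sketch under assumed correlation values (the numbers below are illustrative, not from the cited article), using the standard identity R² = r′R⁻¹r for standardized regressors:

```python
import numpy as np

# Assumed correlations for a two-regressor example:
# r(y, x1) = 0.5, r(y, x2) = 0.0, r(x1, x2) = 0.6
r_yx = np.array([0.5, 0.0])
R_xx = np.array([[1.0, 0.6],
                 [0.6, 1.0]])

# Coefficient of determination for standardized regressors:
# R^2 = r_yx' R_xx^{-1} r_yx
R2 = float(r_yx @ np.linalg.solve(R_xx, r_yx))
sum_sq = float((r_yx ** 2).sum())

# x2 is a classical suppressor: it is uncorrelated with y, but by
# removing irrelevant variance from x1 it raises R^2 above the sum
# of the squared simple correlations.
print(R2, sum_sq)   # 0.390625 > 0.25
```

Here R² = 0.25/(1 − 0.6²) ≈ 0.39, strictly greater than r₁² + r₂² = 0.25.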

Markov Fields and Log-Linear Interaction Models for Contingency Tables

We use a close connection between the theory of Markov fields and that of log-linear interaction models for contingency tables to define and investigate a new class of models for such tables.

Discrete Multivariate Analysis: Theory and Practice

Discrete Multivariate Analysis is a comprehensive text and general reference on the analysis of discrete multivariate data, particularly in the form of multidimensional tables, and contains a wealth of material on important topics.

Suppression Situations in Multiple Linear Regression

This article proposes alternative expressions for the two most prevalent definitions of suppression without resorting to standardized regression modeling.

A characterization of Markov equivalence classes for acyclic digraphs

Undirected graphs and acyclic digraphs (ADGs), as well as their mutual extension to chain graphs, are widely used to describe dependencies among variables in multivariate distributions.

Elements of Information Theory

The authors examine the role of entropy, inequalities and randomness in the design and construction of codes.

Suppressor Variables in Multiple Regression/Correlation

Since Horst (1941) first discussed and defined the suppressor variable in multiple regression/correlation, a number of more nearly precise definitions have been offered (Cohen and Cohen, 1975; Conger, …).

On Substantive Research Hypotheses, Conditional Independence Graphs and Graphical Chain Models

Graphs consisting of points, and lines or arrows as connections between selected pairs of points, are used to formulate hypotheses about relations between variables.

Equivalence and Synthesis of Causal Models

The canonical representation presented here yields an efficient algorithm for determining when two embedded causal models reflect the same dependency information, which leads to a model theoretic definition of causation in terms of statistical dependencies.
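The criterion underlying such equivalence algorithms, due to Verma and Pearl, is that two acyclic digraphs are Markov equivalent iff they share the same skeleton and the same immoralities (v-structures a → c ← b with a, b non-adjacent). A short sketch in Python (the dict-of-children graph encoding is this example's own convention, not the paper's representation):

```python
def skeleton(dag):
    """Undirected edge set of a DAG given as {node: set of children}."""
    return {frozenset((u, v)) for u, children in dag.items() for v in children}

def immoralities(dag):
    """All v-structures a -> c <- b with a and b non-adjacent."""
    skel = skeleton(dag)
    parents = {}
    for u, children in dag.items():
        for v in children:
            parents.setdefault(v, set()).add(u)
    return {(frozenset((a, b)), c)
            for c, ps in parents.items()
            for a in ps for b in ps
            if a < b and frozenset((a, b)) not in skel}

def markov_equivalent(d1, d2):
    """Verma-Pearl criterion: same skeleton and same immoralities."""
    return skeleton(d1) == skeleton(d2) and immoralities(d1) == immoralities(d2)

# The chain X -> Y -> Z and the fork X <- Y -> Z encode the same
# independencies; the collider X -> Y <- Z does not.
chain = {'X': {'Y'}, 'Y': {'Z'}, 'Z': set()}
fork = {'Y': {'X', 'Z'}, 'X': set(), 'Z': set()}
collider = {'X': {'Y'}, 'Z': {'Y'}, 'Y': set()}
```

The collider is the "immorality" of the main paper's title: it is the one orientation pattern that cannot be reversed without changing the dependency model.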

Suppressor Variables and the Semipartial Correlation Coefficient

Since Horst (1941) initially introduced the concept, a number of investigators have suggested alternative definitions that both include a broader class of situations and are more precise.