Corpus ID: 243848022

The Weighted Generalised Covariance Measure

@inproceedings{Scheidegger2021TheWG,
  title={The Weighted Generalised Covariance Measure},
  author={Cyrill Scheidegger and Julia H{\"o}rrmann and Peter B{\"u}hlmann},
  year={2021}
}
We introduce a new test for conditional independence which is based on what we call the weighted generalised covariance measure (WGCM). It is an extension of the recently introduced generalised covariance measure (GCM). To test the null hypothesis of X and Y being conditionally independent given Z, our test statistic is a weighted form of the sample covariance between the residuals of nonlinearly regressing X and Y on Z. We propose different variants of the test for both univariate and… 
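The statistic described in the abstract, a (weighted) sample covariance of the residual products, normalised so that it is approximately standard normal under the null, can be sketched as follows. This is a simplified illustration rather than the authors' implementation: `regress` is a caller-supplied placeholder for any nonlinear regression method, and combining the weighted statistics by a maximum is just one of several plausible variants.

```python
import numpy as np

def gcm_statistic(x, y, z, regress):
    """GCM-style statistic: normalised sample covariance of the residuals
    from regressing x and y on z. `regress(target, z)` must return fitted
    values; it stands in for any nonlinear regression method."""
    rx = x - regress(x, z)          # residuals of X on Z
    ry = y - regress(y, z)          # residuals of Y on Z
    r = rx * ry                     # residual products
    n = len(r)
    # Approximately N(0, 1) under conditional independence (H0).
    return np.sqrt(n) * r.mean() / r.std(ddof=0)

def wgcm_statistic(x, y, z, regress, weights):
    """Weighted variant: each weight function w(z) yields one normalised
    statistic; here they are combined by taking the maximum absolute value."""
    rx = x - regress(x, z)
    ry = y - regress(y, z)
    n = len(x)
    stats = []
    for w in weights:
        r = w(z) * rx * ry          # weighted residual products
        stats.append(np.sqrt(n) * r.mean() / r.std(ddof=0))
    return max(abs(s) for s in stats)
```

For example, with a simple linear fit as the regression method, `regress = lambda t, z: np.polyval(np.polyfit(z, t, 1), z)`, and data generated with X ⊥ Y | Z, the unweighted statistic stays close to its standard-normal null distribution.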
Evaluating Independence and Conditional Independence Measures
  • Jian Ma
  • Computer Science, Mathematics
  • 2022
TLDR
Most of the measures are found to work well on the simulated data, exhibiting the expected monotonicity across simulations, while only a few can be considered to work well when judged against domain knowledge.

References

SHOWING 1-10 OF 51 REFERENCES
The hardness of conditional independence testing and the generalised covariance measure
It is a common saying that testing for conditional independence, i.e., testing whether two random vectors $X$ and $Y$ are independent, given $Z$, is a hard statistical problem if $Z$ is a
Conditional Distance Correlation
TLDR
A nonparametric measure of conditional dependence for multivariate random variables of arbitrary dimension, shown in the authors' numerical simulations to be more powerful than some recently developed tests and able to identify two conditionally associated gene expressions that would otherwise go undetected.
A NONPARAMETRIC HELLINGER METRIC TEST FOR CONDITIONAL INDEPENDENCE
We propose a nonparametric test of conditional independence based on the weighted Hellinger distance between the two conditional densities, f(y|x,z) and f(y|x), which is identically zero under the
A Permutation-Based Kernel Conditional Independence Test
TLDR
A new kernel CI test is proposed that uses a single, learned permutation to convert the CI test problem into an easier two-sample test problem and has power competitive with state-of-the-art kernel CI tests.
The conditional permutation test for independence while controlling for confounders
TLDR
A new general method for testing the conditional independence of variables X and Y given a potentially high-dimensional random vector Z that may contain confounding factors; bounds on the type I error are established in terms of the error in approximating the conditional distribution of X|Z.
Conditional independence testing based on a nearest-neighbor estimator of conditional mutual information
TLDR
A fully non-parametric test for continuous data based on conditional mutual information combined with a local permutation scheme is presented, which also adapts efficiently to non-smooth distributions arising from strongly nonlinear dependencies.
Testing Conditional Independence Via Empirical Likelihood
Causal Discovery Using Regression-Based Conditional Independence Tests
TLDR
This paper proposes a regression-based CI test that relaxes the test of x ⊥ y | Z to simpler unconditional independence tests, proves that the regression functions f and g can be easily estimated by regression, and shows that existing causal learning algorithms can infer many more causal directions using the proposed method.
Testing conditional independence using maximal nonlinear conditional correlation
In this paper, the maximal nonlinear conditional correlation of two random vectors X and Y given another random vector Z, denoted by ρ₁(X, Y|Z), is defined as a measure of conditional association,