Identifying interesting relationships between pairs of variables in large data sets is increasingly important. Here, we present a measure of dependence for two-variable relationships: the maximal…

A measure of dependence is said to be equitable if it gives similar scores to equally noisy relationships of different types. Equitability is important in data exploration when the goal is to…

For high-dimensional data sets, it is common to evaluate a measure of dependence on every variable pair and retain the highest-scoring pairs for follow-up. If the statistic…
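The screening strategy described above (score every variable pair, keep the top scorers) can be sketched as follows. This is a minimal illustration, not the authors' method: absolute Pearson correlation stands in for the dependence measure, and the function and variable names (`top_pairs`, `data`) are hypothetical; any pairwise statistic such as MIC could be swapped in.

```python
import itertools
import numpy as np

def top_pairs(data, k=5):
    """Score every variable pair with a dependence measure and return
    the k highest-scoring pairs for follow-up.

    data: dict mapping variable name -> 1-D numpy array (equal lengths).
    |Pearson correlation| is used here as a stand-in statistic.
    """
    scores = []
    for a, b in itertools.combinations(sorted(data), 2):
        r = abs(np.corrcoef(data[a], data[b])[0, 1])
        scores.append(((a, b), r))
    # Rank all pairs by score, strongest first, and keep the top k.
    scores.sort(key=lambda t: t[1], reverse=True)
    return scores[:k]

# Example: a strongly related pair (x, y) and an unrelated variable z.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
data = {"x": x,
        "y": 2 * x + rng.normal(scale=0.1, size=200),
        "z": rng.normal(size=200)}
print(top_pairs(data, k=2))
```

With a number of variables p, this evaluates all p(p-1)/2 pairs, which is why the choice of statistic matters for high-dimensional data.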

BACKGROUND
During an influenza pandemic, a substantial proportion of transmission is thought to occur in households. We used data on influenza progression in individuals and their contacts collected…

How do we perceive the predictability of functions? We derive a rational measure of a function's predictability based on Gaussian process learning curves. Using this measure, we show that the…

In exploratory data analysis, we are often interested in identifying promising pairwise associations for further analysis while filtering out weaker, less interesting ones. This can be accomplished…

As data sets grow in dimensionality, non-parametric measures of dependence have seen increasing use in data exploration due to their ability to identify non-trivial relationships of all kinds. One…

The maximal information coefficient (MIC) is a tool for finding the strongest pairwise relationships in a data set with many variables [1]. MIC is useful because it gives similar scores to equally…
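To make the idea behind MIC concrete, here is a heavily simplified sketch: mutual information is computed over a grid overlaid on the scatterplot, normalized by log2 of the smaller grid dimension, and maximized over grid sizes up to the n^0.6 cell bound used by Reshef et al. Unlike the actual MIC statistic, this sketch only tries equal-frequency grids rather than optimizing grid-line placement, so it underestimates MIC; the function names (`grid_mi`, `mic_approx`) are hypothetical.

```python
import numpy as np

def grid_mi(x, y, nx, ny):
    """Mutual information (bits) of an nx-by-ny equal-frequency grid."""
    # Assign each point to an equal-frequency bin along each axis.
    xb = np.searchsorted(np.quantile(x, np.linspace(0, 1, nx + 1)[1:-1]), x)
    yb = np.searchsorted(np.quantile(y, np.linspace(0, 1, ny + 1)[1:-1]), y)
    joint = np.zeros((nx, ny))
    for i, j in zip(xb, yb):
        joint[i, j] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

def mic_approx(x, y, max_cells=None):
    """Crude MIC-style score: maximize grid MI / log2(min(nx, ny)) over
    grids with nx * ny below a bound, here n**0.6 as in Reshef et al.
    Equal-frequency grids only, so this is a lower bound on true MIC."""
    n = len(x)
    if max_cells is None:
        max_cells = int(n ** 0.6)
    best = 0.0
    for nx in range(2, max_cells // 2 + 1):
        for ny in range(2, max_cells // nx + 1):
            best = max(best, grid_mi(x, y, nx, ny) / np.log2(min(nx, ny)))
    return best
```

Because MI on an nx-by-ny grid is at most log2(min(nx, ny)), the normalized score lies in [0, 1], with strong relationships of any functional form scoring near 1.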

Applying information theory, we present a measure of dependence for three-variable relationships: the three-variable maximal information coefficient (3D-MIC). It is a kind of maximal…