Corpus ID: 53827957

Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing)

@inproceedings{Cover1991ElementsOI,
  title={Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing)},
  author={Thomas M. Cover and Joy A. Thomas},
  year={1991}
}

Functional Connectivity in Visual Areas from Total Correlation

TLDR
Analytical results show that pairwise Mutual Information cannot capture the effect of different intra-cortical inhibitory connections while the three-way Total Correlation can, and the presented analytical setting is useful for checking empirical estimators of Total Correlation.
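As a reference point for the TLDR above, a minimal sketch (in Python) of how three-way Total Correlation can be computed from a discrete joint distribution, using TC(X,Y,Z) = H(X) + H(Y) + H(Z) - H(X,Y,Z). The 2x2x2 table here is an arbitrary illustration, not the visual-cortex model or the empirical estimators discussed in the cited paper.

# Total correlation (multi-information) of three discrete variables
# from their joint probability table.
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array; zero entries are ignored."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def total_correlation(p_xyz):
    """TC(X,Y,Z) = H(X) + H(Y) + H(Z) - H(X,Y,Z) for a 3-way joint table."""
    h_x = entropy(p_xyz.sum(axis=(1, 2)))
    h_y = entropy(p_xyz.sum(axis=(0, 2)))
    h_z = entropy(p_xyz.sum(axis=(0, 1)))
    h_xyz = entropy(p_xyz.ravel())
    return h_x + h_y + h_z - h_xyz

p_xyz = np.random.dirichlet(np.ones(8)).reshape(2, 2, 2)  # random illustrative joint
print(total_correlation(p_xyz))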

Improving Bacterial Genome Assembly Using a Test of Strand Orientation

TLDR
A statistical test based on tetranucleotide frequency is proposed, which determines whether two segments from the same genome are of the same or opposite orientation, and identifies 31 potential misassemblies in the NCBI database, several of which are further supported by a reassembly of the read data.
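To illustrate the idea behind a tetranucleotide-frequency orientation check, here is a small hedged sketch: count overlapping 4-mers in two segments and compare one segment's profile against the other's forward and reverse-complement profiles. The scoring function and names are illustrative assumptions, not the statistical test defined in the cited paper.

# Compare tetranucleotide (4-mer) frequency profiles of two DNA segments
# to guess whether they are in the same or opposite orientation.
from collections import Counter

def revcomp(seq):
    """Reverse complement of a DNA string over the alphabet A, C, G, T."""
    comp = {"A": "T", "T": "A", "C": "G", "G": "C"}
    return "".join(comp[b] for b in reversed(seq))

def tetra_freqs(seq):
    """Relative frequencies of all overlapping 4-mers in a sequence."""
    counts = Counter(seq[i:i + 4] for i in range(len(seq) - 3))
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def same_orientation_score(seg_a, seg_b):
    """Positive when seg_a's profile is closer to seg_b than to seg_b's reverse complement."""
    a = tetra_freqs(seg_a)
    fwd = tetra_freqs(seg_b)
    rev = tetra_freqs(revcomp(seg_b))
    kmers = set(a) | set(fwd) | set(rev)
    d_fwd = sum((a.get(k, 0.0) - fwd.get(k, 0.0)) ** 2 for k in kmers)
    d_rev = sum((a.get(k, 0.0) - rev.get(k, 0.0)) ** 2 for k in kmers)
    return d_rev - d_fwd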

Synwalk - Community Detection via Random Walk Modelling

TLDR
This work builds on a solid theoretical basis and detects communities by synthesizing the random walk induced by the given network from a class of candidate random walks; results indicate that Synwalk performs robustly on networks with varying mixing parameters and degree distributions.

Bounding Information Leakage in Machine Learning

TLDR
This paper identifies and bounds the success rate of the worst-case membership inference attack, connecting it to the generalization error of the target model, and derives bounds on the mutual information between the sensitive attributes and model parameters.

The Role of Mutual Information in Variational Classifiers

TLDR
Bounds on the generalization error of classifiers relying on stochastic encodings trained with the cross-entropy loss are derived, providing an information-theoretic understanding of generalization in the so-called class of variational classifiers, which are regularized by a Kullback-Leibler (KL) divergence term.
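For orientation, a minimal sketch of the kind of training objective used by such variational classifiers: cross-entropy on predictions produced from a stochastic Gaussian encoding, plus a KL regularizer toward a standard-normal prior. The names (beta, mu, log_var) and the choice of prior are illustrative assumptions, not the cited paper's notation or bounds.

# Cross-entropy loss with a KL-divergence regularizer on a Gaussian encoding.
import numpy as np

def kl_gaussian_to_standard_normal(mu, log_var):
    """Closed-form KL( N(mu, diag(exp(log_var))) || N(0, I) ), per example."""
    return 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var, axis=-1)

def variational_classifier_loss(logits, labels, mu, log_var, beta=1e-3):
    """Mean over the batch of cross-entropy plus beta times the KL term."""
    logits = logits - logits.max(axis=-1, keepdims=True)           # numerically stable softmax
    log_probs = logits - np.log(np.exp(logits).sum(axis=-1, keepdims=True))
    ce = -log_probs[np.arange(len(labels)), labels]
    kl = kl_gaussian_to_standard_normal(mu, log_var)
    return float(np.mean(ce + beta * kl))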

Understanding the Behaviour of the Empirical Cross-Entropy Beyond the Training Distribution

TLDR
This paper studies, through the lens of information measures, how a particular architecture behaves when the true probability law of the samples is potentially different at training and testing times, finding that the testing gap between the empirical cross-entropy and its statistical expectation can be bounded by the mutual information.

Distributed Uniformity Testing

In the uniformity testing problem, we are given access to samples from some unknown distribution μ on a fixed domain {1, ..., n}, and our goal is to distinguish the case where μ is the uniform
...
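For context on the problem statement above, a hedged sketch of a standard collision-based uniformity tester over a domain {1, ..., n}: under the uniform distribution the expected collision rate among sample pairs is 1/n, and a noticeably larger rate suggests the distribution is far from uniform. The threshold is illustrative and this is the classical centralized test, not the distributed protocol analysed in the cited paper.

# Collision-based uniformity test: compare the empirical pairwise collision
# rate against the uniform baseline of 1/n.
from itertools import combinations

def collision_rate(samples):
    """Fraction of sample pairs that take the same value."""
    pairs = list(combinations(samples, 2))
    return sum(a == b for a, b in pairs) / len(pairs)

def looks_uniform(samples, n, slack=0.5):
    """Accept 'uniform' when the collision rate stays close to the uniform rate 1/n."""
    return collision_rate(samples) <= (1.0 + slack) / n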