Corpus ID: 168169844

Where is the Information in a Deep Neural Network?

@article{Achille2019WhereIT,
  title={Where is the Information in a Deep Neural Network?},
  author={A. Achille and Giovanni Paolini and Stefano Soatto},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.12213}
}
Whatever information a deep neural network has gleaned from training data is encoded in its weights. How this information affects the response of the network to future data remains largely an open question. Indeed, even defining and measuring information entails subtleties: a trained network is a deterministic map, so standard information measures can be degenerate. We measure information in a neural network via the optimal trade-off between accuracy of the response and complexity of…
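To make the trade-off mentioned in the abstract concrete, a minimal sketch in standard notation follows. It is not quoted from the paper; the symbols q(w | D) (a posterior over the weights after training on dataset D), p(w) (a prior over the weights), and the trade-off parameter β are assumptions based on the authors' related work on information in the weights:

% Hedged sketch of an accuracy-complexity trade-off over the weights (assumed notation).
\[
\mathcal{L}_\beta(q) \;=\;
  \underbrace{\mathbb{E}_{w \sim q(w \mid \mathcal{D})}\!\left[ H_{p,q}(\mathcal{D} \mid w) \right]}_{\text{accuracy: expected training loss}}
  \;+\; \beta \,
  \underbrace{\mathrm{KL}\!\left( q(w \mid \mathcal{D}) \,\middle\|\, p(w) \right)}_{\text{complexity: information in the weights}}
\]

Under an (assumed) Gaussian choice of q and p the KL term has a closed form, which is one way such a complexity measure can be estimated in practice; minimizing the Lagrangian over q as β varies traces out the accuracy-complexity trade-off.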
Quantifying the effect of representations on task complexity
Eternal Sunshine of the Spotless Net: Selective Forgetting in Deep Networks
Extracting Robust and Accurate Features via a Robust Information Bottleneck
The distance between the weights of the neural network is meaningful
A General Framework for Uncertainty Estimation in Deep Learning
A Diffusion Theory for Deep Learning Dynamics
Analyzing Deep Neural Network’s Transferability via Fréchet Distance

References

Showing 1-10 of 48 references
Opening the Black Box of Deep Neural Networks via Information
Deep learning and the information bottleneck principle
On the Information Bottleneck Theory of Deep Learning
Estimating Information Flow in Neural Networks
Emergence of Invariance and Disentanglement in Deep Representations
Adaptive Estimators Show Information Compression in Deep Neural Networks
Mutual Information, Fisher Information, and Population Coding
The Information Complexity of Learning Tasks, their Structure and their Distance