@article{Tishby2015DeepLA,
  title   = {Deep learning and the information bottleneck principle},
  author  = {Naftali Tishby and Noga Zaslavsky},
  journal = {2015 IEEE Information Theory Workshop (ITW)},
  year    = {2015},
  pages   = {1-5}
}

- Published in the 2015 IEEE Information Theory Workshop (ITW)
- DOI: 10.1109/ITW.2015.7133169

Deep Neural Networks (DNNs) are analyzed via the theoretical framework of the information bottleneck (IB) principle. We first show that any DNN can be quantified by the mutual information between the layers and the input and output variables. Using this representation we can calculate the optimal information theoretic limits of the DNN and obtain finite sample generalization bounds. The advantage of getting closer to the theoretical limit is quantifiable both by the generalization bound and by…
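The abstract's central claim — that a layer representation T can be quantified by the mutual information it shares with the input X and output Y — can be illustrated with a simple plug-in (histogram) estimator over discretized activations. This is a minimal sketch under that assumption, not the authors' implementation; the helper `mutual_information` and the toy data below are illustrative only:

```python
import numpy as np

def mutual_information(x_ids, t_ids):
    """Plug-in estimate of I(X;T) in bits from two discrete label arrays,
    using the identity I(X;T) = H(X) + H(T) - H(X,T)."""
    def entropy(ids):
        _, counts = np.unique(ids, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # Map each (x, t) pair to a single joint symbol, then reuse entropy().
    joint = np.stack([x_ids, t_ids], axis=1)
    _, joint_ids = np.unique(joint, axis=0, return_inverse=True)
    return entropy(x_ids) + entropy(t_ids) - entropy(joint_ids)

# Toy example: a discretized input X and a noisy "layer" T derived from it.
rng = np.random.default_rng(0)
x = rng.integers(0, 4, size=1000)               # discretized input symbols
t = (x + rng.integers(0, 2, size=1000)) % 4     # noisy layer representation
print(f"I(X;T) ≈ {mutual_information(x, t):.3f} bits")
```

In the IB picture, each layer's pair (I(X;T), I(T;Y)) places it on the information plane; the estimator above is the naive binning approach, which is only reliable when the discretized alphabet is small relative to the sample size.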

This paper has 165 citations.

