Deep learning and the information bottleneck principle

@inproceedings{tishby2015deep,
  title={Deep learning and the information bottleneck principle},
  author={Naftali Tishby and Noga Zaslavsky},
  booktitle={2015 IEEE Information Theory Workshop (ITW)},
  year={2015}
}
Deep Neural Networks (DNNs) are analyzed via the theoretical framework of the information bottleneck (IB) principle. We first show that any DNN can be quantified by the mutual information between the layers and the input and output variables. Using this representation we can calculate the optimal information theoretic limits of the DNN and obtain finite sample generalization bounds. The advantage of getting closer to the theoretical limit is quantifiable both by the generalization bound and by…
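To make the abstract's central quantity concrete — the mutual information I(X; T) between the input X and a layer's activations T — here is a minimal discretization-based (binning) estimator of the kind commonly used in follow-up analyses of this paper. This is an illustrative sketch, not code from the paper itself; the `mutual_information` helper and the toy variables are assumptions introduced for demonstration.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Estimate I(X; Y) in bits by discretizing both variables into
    equal-width bins and computing MI from the joint histogram.
    A simple plug-in estimator (illustrative, not the paper's code)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal over X bins
    py = pxy.sum(axis=0, keepdims=True)   # marginal over Y bins
    nz = pxy > 0                          # skip empty cells to avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# Toy illustration: a deterministic "activation" of the input shares much
# information with X, while an independent signal shares almost none.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
layer = np.tanh(2.0 * x)        # stand-in for a hidden activation T = f(X)
noise = rng.normal(size=5000)   # unrelated variable

print(mutual_information(x, layer), mutual_information(x, noise))
```

With estimates like this for every layer, a trained network can be plotted in the "information plane" (I(X; T) vs. I(T; Y)) that the IB analysis uses.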
This paper has 165 citations.

2 Figures & Tables
