Corpus ID: 6019591

Relevant sparse codes with variational information bottleneck

@inproceedings{Chalk2016RelevantSC,
  title={Relevant sparse codes with variational information bottleneck},
  author={Matthew Chalk and Olivier Marre and Ga{\vs}per Tka{\vc}ik},
  booktitle={NIPS},
  year={2016}
}
  • Mathematics, Computer Science
  • In many applications, it is desirable to extract only the relevant aspects of data. A principled way to do this is the information bottleneck (IB) method, where one seeks a code that maximizes information about a 'relevance' variable, Y, while constraining the information encoded about the original data, X. Unfortunately, the IB method is computationally demanding when data are high-dimensional and/or non-Gaussian. Here we propose an approximate variational scheme for maximizing a lower…
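The abstract describes the IB trade-off: maximize information the code Z carries about the relevance variable Y while limiting the information it retains about the data X. A minimal sketch of that objective for small discrete variables is below; the joint distribution, encoder, and trade-off parameter are hypothetical toy values, not taken from the paper.

```python
import math

def mutual_information(joint):
    """I(A;B) in nats for a joint distribution given as a dict {(a, b): p}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    mi = 0.0
    for (a, b), p in joint.items():
        if p > 0:
            mi += p * math.log(p / (pa[a] * pb[b]))
    return mi

# Toy joint distribution p(x, y): X in {0, 1, 2}, Y in {0, 1}.
p_xy = {(0, 0): 0.30, (0, 1): 0.05,
        (1, 0): 0.05, (1, 1): 0.30,
        (2, 0): 0.15, (2, 1): 0.15}

# A hypothetical stochastic encoder p(z | x) mapping X to a code Z in {0, 1}.
p_z_given_x = {0: {0: 0.9, 1: 0.1},
               1: {0: 0.1, 1: 0.9},
               2: {0: 0.5, 1: 0.5}}

# Induced joints p(x, z) and p(z, y); Z depends on Y only through X.
p_xz, p_zy = {}, {}
for (x, y), p in p_xy.items():
    for z, q in p_z_given_x[x].items():
        p_xz[(x, z)] = p_xz.get((x, z), 0.0) + p * q
        p_zy[(z, y)] = p_zy.get((z, y), 0.0) + p * q

beta = 5.0  # trade-off: larger beta preserves more information about Y
ib_objective = mutual_information(p_zy) - (1.0 / beta) * mutual_information(p_xz)
print(f"I(Z;Y) = {mutual_information(p_zy):.4f} nats")
print(f"I(X;Z) = {mutual_information(p_xz):.4f} nats")
print(f"IB objective (beta = {beta}): {ib_objective:.4f}")
```

Because Y–X–Z form a Markov chain, the data-processing inequality guarantees I(Z;Y) ≤ I(X;Z), so the objective rewards encoders that discard bits of X that are irrelevant to Y.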

    Citations

    Publications citing this paper (showing 1-10 of 46 citations):

    The Variational Deficiency Bottleneck


    Distributed Variational Representation Learning


    Gaussian Lower Bound for the Information Bottleneck Limit


    Extracting Robust and Accurate Features via a Robust Information Bottleneck


    Learnability for the Information Bottleneck


    Entropy and mutual information in models of deep neural networks



    CITATION STATISTICS

    • 2 highly influenced citations

    • Averaged 13 citations per year from 2018 through 2020
