Corpus ID: 219965876

Differentiable PAC-Bayes Objectives with Partially Aggregated Neural Networks

  • Felix Biggs, Benjamin Guedj
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
  • We make three related contributions motivated by the challenge of training stochastic neural networks, particularly in a PAC-Bayesian setting: (1) we show how averaging over an ensemble of stochastic neural networks enables a new class of partially-aggregated estimators; (2) we show that these lead to provably lower-variance gradient estimates for non-differentiable signed-output networks; (3) we reformulate a PAC-Bayesian bound for these networks to derive a directly optimisable, differentiable…
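The variance-reduction idea behind contribution (2) can be illustrated with a small sketch (an illustration under assumed settings, not the paper's construction): for a single signed-output unit sign(w · x) with Gaussian weights, naive Monte Carlo samples of the sign are non-differentiable and noisy, while the expectation over the weight distribution has a smooth closed form, erf(μ · x / (σ‖x‖√2)), that can be differentiated directly.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: one "signed-output" unit sign(w @ x)
# with Gaussian weights w ~ N(mu, sigma^2 I). Dimensions and
# values here are arbitrary, for illustration only.
d = 5
x = rng.normal(size=d)
mu = rng.normal(size=d)
sigma = 0.7

# Naive Monte Carlo estimate of E_w[sign(w @ x)]:
# each sample is a hard, non-differentiable +/-1.
n = 200_000
w = mu + sigma * rng.normal(size=(n, d))
mc = np.mean(np.sign(w @ x))

# Aggregated form: w @ x ~ N(mu @ x, sigma^2 ||x||^2), hence
# E[sign(w @ x)] = erf(mu @ x / (sigma * ||x|| * sqrt(2))),
# a smooth function of mu that admits exact gradients.
closed = math.erf((mu @ x) / (sigma * np.linalg.norm(x) * math.sqrt(2)))

print(abs(mc - closed) < 0.01)
```

The Monte Carlo average converges to the closed-form expectation, but only the latter is differentiable in μ, which is what makes a directly optimisable PAC-Bayes objective possible for such networks.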


