Corpus ID: 219965876

Differentiable PAC-Bayes Objectives with Partially Aggregated Neural Networks

@article{Biggs2020DifferentiablePO,
  title={Differentiable PAC-Bayes Objectives with Partially Aggregated Neural Networks},
  author={Felix Biggs and Benjamin Guedj},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.12228}
}
We make three related contributions motivated by the challenge of training stochastic neural networks, particularly in a PAC-Bayesian setting: (1) we show how averaging over an ensemble of stochastic neural networks enables a new class of \emph{partially-aggregated} estimators; (2) we show that these lead to provably lower-variance gradient estimates for non-differentiable signed-output networks; (3) we reformulate a PAC-Bayesian bound for these networks to derive a directly optimisable…
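The aggregation idea behind contribution (1) can be illustrated on a single sign-output unit: when a weight vector has a Gaussian distribution, the expectation of the unit's ±1 output has a closed form, so it can be computed exactly instead of estimated by sampling non-differentiable signs. The sketch below is illustrative only (function names and constants are mine, not the paper's) and checks the closed form against a Monte Carlo estimate, assuming an isotropic Gaussian over weights.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

def aggregated_sign_output(mu, sigma, x):
    # For w ~ N(mu, sigma^2 I), the pre-activation w @ x is Gaussian with
    # mean mu @ x and std sigma * ||x||, so
    # E[sign(w @ x)] = 2 * Phi(mu @ x / (sigma * ||x||)) - 1
    #               = erf(mu @ x / (sigma * ||x|| * sqrt(2))).
    z = (mu @ x) / (sigma * np.linalg.norm(x))
    return erf(z / sqrt(2))

def monte_carlo_sign_output(mu, sigma, x, n=200_000):
    # Naive estimator: sample weights and average the signed outputs.
    w = rng.normal(loc=mu, scale=sigma, size=(n, len(mu)))
    return float(np.mean(np.sign(w @ x)))

mu = np.array([0.5, -0.2, 0.1])
x = np.array([1.0, 2.0, -0.5])
exact = aggregated_sign_output(mu, 0.3, x)
mc = monte_carlo_sign_output(mu, 0.3, x)
print(exact, mc)  # the two estimates should agree closely
```

Because the aggregated expectation is an analytic, differentiable function of `mu` and `sigma`, it admits exact gradients where the sampled `sign` does not, which is the source of the variance reduction the abstract refers to.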
