Corpus ID: 88520616

Bayesian Dropout

@inproceedings{Herlau2015BayesianD,
  title={Bayesian Dropout},
  author={Tue Herlau and M. M{\o}rup and M. Schmidt},
  year={2015}
}
  • Tue Herlau, M. Mørup, M. Schmidt
  • Published 2015
  • Mathematics
  • Dropout has recently emerged as a powerful and simple method for training neural networks that prevents co-adaptation by stochastically omitting neurons. Dropout is currently not grounded in explicit modelling assumptions, which has so far precluded its adoption in Bayesian modelling. Using Bayesian entropic reasoning, we show that dropout can be interpreted as optimal inference under constraints. We demonstrate this on an analytically tractable regression model, providing a Bayesian interpretation…
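
The abstract above describes dropout as stochastically omitting neurons during training. As a point of reference for that mechanism only (not the paper's Bayesian, entropic formulation), below is a minimal sketch of standard inverted dropout in NumPy; the dropout function name, the drop rate p, and the example activations are illustrative assumptions.

    import numpy as np

    def dropout(h, p=0.5, rng=None, train=True):
        """Inverted dropout sketch: during training, zero each unit with
        probability p and rescale the survivors by 1/(1-p) so the expected
        activation matches the test-time (no-dropout) forward pass."""
        if not train or p == 0.0:
            return h
        rng = np.random.default_rng() if rng is None else rng
        mask = rng.random(h.shape) >= p   # keep each unit with probability 1-p
        return h * mask / (1.0 - p)

    # Illustrative hidden-layer activations; roughly half are zeroed per pass.
    h = np.array([0.2, -1.3, 0.7, 2.1, 0.05])
    print(dropout(h, p=0.5, rng=np.random.default_rng(0)))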

    Citations

    Unifying the Dropout Family Through Structured Shrinkage Priors (1 citation)

    References

    Publications referenced by this paper (showing 1-10 of 26 references):
    Fast dropout training (300 citations)
    Variational Bayesian Inference with Stochastic Search (294 citations)
    Approximate Riemannian Conjugate Gradient Learning for Fixed-Form Variational Bayes (90 citations)
    Stochastic variational inference (1460 citations)
    Learning with Marginalized Corrupted Features (131 citations)
    Risk of Bayesian Inference in Misspecified Models, and the Sandwich Covariance Matrix (96 citations)
    Riemann manifold Langevin and Hamiltonian Monte Carlo methods (1000 citations)
    Improving neural networks by preventing co-adaptation of feature detectors (4752 citations)
    Entropic Inference (14 citations)
    MCMC Using Hamiltonian Dynamics (1596 citations)