# Differential Bayesian Neural Nets

@article{Look2019DifferentialBN, title={Differential Bayesian Neural Nets}, author={Andreas Look and Melih Kandemir}, journal={ArXiv}, year={2019}, volume={abs/1912.00796} }

Neural Ordinary Differential Equations (N-ODEs) are a powerful building block for learning systems that extend residual networks to a continuous-time dynamical system. We propose a Bayesian version of N-ODEs that enables well-calibrated quantification of prediction uncertainty while maintaining the expressive power of their deterministic counterpart. We assign Bayesian Neural Nets (BNNs) to both the drift and the diffusion terms of a Stochastic Differential Equation (SDE) that models the…
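The drift/diffusion construction described in the abstract can be sketched with an Euler–Maruyama discretization. The two toy functions below stand in for the drift and diffusion networks, and sampling many paths yields a predictive distribution; all weights, names, and step counts here are illustrative assumptions, not the paper's implementation:

```python
import math
import random

random.seed(0)

def softplus(x):
    # numerically stable softplus, keeps the diffusion scale positive
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

def drift(h):
    # hypothetical one-unit "drift net" with a fixed toy weight
    return math.tanh(0.5 * h)

def diffusion(h):
    # hypothetical "diffusion net"; small positive noise scale
    return 0.1 * softplus(0.3 * h)

def euler_maruyama(h0, steps=100, t1=1.0):
    # dh = drift(h) dt + diffusion(h) dW, discretized on a uniform grid
    dt = t1 / steps
    h = h0
    for _ in range(steps):
        dW = random.gauss(0.0, math.sqrt(dt))
        h = h + drift(h) * dt + diffusion(h) * dW
    return h

# Monte Carlo over sample paths approximates the predictive distribution
samples = [euler_maruyama(1.0) for _ in range(1000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Because the noise enters through a state-dependent diffusion term, the spread of the sampled paths (here `var`) is itself learned rather than fixed, which is what enables the calibrated uncertainty the abstract refers to.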

## 5 Citations

Stochastic Differential Equations with Variational Wishart Diffusions

- Computer Science, Mathematics
- ICML 2020

A Bayesian non-parametric way of inferring stochastic differential equations for both regression tasks and continuous-time dynamical modelling and experimental evidence that modelling diffusion often improves performance is provided.

Learning Partially Known Stochastic Dynamics with Empirical PAC Bayes

- Computer Science, Mathematics
- AISTATS 2021

A novel scheme for fitting heavily parameterized non-linear stochastic differential equations (SDEs) with a prior assigned on the parameters of the SDE drift and diffusion functions to achieve a Bayesian model, which provides an improved model fit accompanied with favorable extrapolation properties when provided a partial description of the environment dynamics.

Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations

- Computer Science, Mathematics
- ArXiv 2021

This work demonstrates gradient-based stochastic variational inference in this infinite-parameter setting, producing arbitrarily flexible approximate posteriors, and derives a novel gradient estimator that approaches zero variance as the approximate posterior approaches the true posterior.

Bayesian Learning-Based Adaptive Control for Safety Critical Systems

- Engineering, Computer Science
- 2020 IEEE International Conference on Robotics and Automation (ICRA)

This work develops an adaptive control framework leveraging the theory of stochastic Control Lyapunov Functions (CLFs) and stochastic Control Barrier Functions (CBFs), along with tractable Bayesian model learning via Gaussian Processes or Bayesian neural networks, and demonstrates this architecture for high-speed terrestrial mobility, targeting potential applications in safety-critical high-speed Mars rover missions.

Bayesian Neural Ordinary Differential Equations

- Computer Science
- ArXiv 2020

This work demonstrates the successful integration of Neural ODEs with two methods of Bayesian inference, and shows probabilistic identification of model specification in partially described dynamical systems using universal ordinary differential equations.

## References

Showing 10 of 12 references.

Black-Box Variational Inference for Stochastic Differential Equations

- Computer Science, Mathematics
- ICML 2018

A standard mean-field variational approximation of the parameter posterior is used, and a recurrent neural network is introduced to approximate the posterior for the diffusion paths conditional on the parameters.

Neural Ordinary Differential Equations

- Computer Science, Mathematics
- NeurIPS 2018

This work shows how to scalably backpropagate through any ODE solver, without access to its internal operations, which allows end-to-end training of ODEs within larger models.
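The residual-network connection behind this reference can be illustrated with a forward Euler discretization: each Euler step has exactly the shape of a residual block. The one-unit vector field and its weight below are toy assumptions, and this sketch does not reproduce the paper's adjoint-based backpropagation:

```python
import math

def f(h, t, w=0.8):
    # hypothetical vector field: a single tanh unit (illustrative weight)
    return math.tanh(w * h + t)

def odeint_euler(h0, t0=0.0, t1=1.0, steps=4):
    # Each step h <- h + dt * f(h, t) looks like a residual block;
    # letting steps grow recovers the continuous-time ODE view.
    dt = (t1 - t0) / steps
    h, t = h0, t0
    for _ in range(steps):
        h = h + dt * f(h, t)
        t += dt
    return h

coarse = odeint_euler(1.0, steps=4)     # a 4-block "residual network"
fine = odeint_euler(1.0, steps=1024)    # near the continuous-time limit
```

Refining the step count changes the numerical accuracy but not the model class, which is why a Neural ODE can be trained with any off-the-shelf solver.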

Preconditioned Stochastic Gradient Langevin Dynamics for Deep Neural Networks

- Computer Science, Mathematics
- AAAI 2016

This work proposes combining adaptive preconditioners with Stochastic Gradient Langevin Dynamics, establishes theoretical properties on asymptotic convergence and predictive risk, and presents empirical results for logistic regression, feedforward neural nets, and convolutional neural nets demonstrating that the preconditioned SGLD method gives state-of-the-art performance.

Bayesian Learning via Stochastic Gradient Langevin Dynamics

- Mathematics, Computer Science
- ICML 2011

In this paper we propose a new framework for learning from large scale datasets based on iterative learning from small mini-batches. By adding the right amount of noise to a standard stochastic…
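The recipe this summary describes (a stochastic gradient step on mini-batches plus injected Gaussian noise of matched scale) can be sketched on a toy 1-D problem of inferring a Gaussian mean. All hyperparameters, priors, and helper names below are illustrative assumptions:

```python
import math
import random

random.seed(0)

# Toy setup: infer the mean of N(mu, 1) data under a N(0, 10) prior;
# the true mean 2.0 and all sizes are arbitrary choices for illustration.
data = [random.gauss(2.0, 1.0) for _ in range(200)]
N, minibatch = len(data), 20

def grad_log_prior(theta):
    return -theta / 10.0            # d/dtheta log N(theta | 0, 10)

def grad_log_lik(theta, x):
    return x - theta                # d/dtheta log N(x | theta, 1)

theta, eps = 0.0, 1e-3
samples = []
for step in range(2000):
    batch = random.sample(data, minibatch)
    # mini-batch gradient, rescaled by N / minibatch to be unbiased
    grad = grad_log_prior(theta) + (N / minibatch) * sum(
        grad_log_lik(theta, x) for x in batch)
    # SGLD update: half-step gradient plus Gaussian noise with variance eps
    theta += 0.5 * eps * grad + random.gauss(0.0, math.sqrt(eps))
    if step >= 1000:                # discard burn-in
        samples.append(theta)

post_mean = sum(samples) / len(samples)
```

With the noise variance tied to the step size, the iterates approximate samples from the posterior rather than converging to a point estimate, which is the key difference from plain SGD.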

Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks

- Computer Science, Mathematics
- ICML 2015

This work presents a novel scalable method for learning Bayesian neural networks, called probabilistic backpropagation (PBP), which works by computing a forward propagation of probabilities through the network and then doing a backward computation of gradients.

Neural SDE: Stabilizing Neural ODE Networks with Stochastic Noise

- Computer Science, Mathematics
- ArXiv 2019

It is demonstrated that the Neural SDE network can achieve better generalization than the Neural ODE and is more resistant to adversarial and non-adversarial input perturbations.

Deep learning with differential Gaussian process flows

- Computer Science, Mathematics
- AISTATS 2019

A novel deep learning paradigm of differential flows is proposed, in which a stochastic differential equation transforms inputs prior to a standard classification or regression function, demonstrating excellent results compared to deep Gaussian processes and Bayesian neural networks.

Applied Stochastic Differential Equations

- Computer Science
- 2019

The topic of this book is stochastic differential equations (SDEs), which are differential equations that produce a different “answer” or solution trajectory each time they are solved, and the emphasis is on applied rather than theoretical aspects of SDEs.

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

- Mathematics, Computer Science
- ICML 2016

A new theoretical framework is developed casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, which mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.
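The MC-dropout procedure this summary refers to (keep dropout active at prediction time and average many stochastic forward passes) can be sketched as follows; the toy one-layer network, its weights, and the dropout rate are hypothetical:

```python
import random
import statistics

random.seed(0)

def forward(x, weights, p_drop=0.5, mc_mode=True):
    # One hidden ReLU layer with dropout. mc_mode=True keeps dropout
    # active at prediction time, which is exactly what MC dropout does.
    hidden = []
    for w in weights:
        h = max(0.0, w * x)                     # ReLU unit
        if mc_mode and random.random() < p_drop:
            h = 0.0                             # unit dropped
        else:
            h = h / (1.0 - p_drop)              # inverted-dropout rescaling
        hidden.append(h)
    return sum(hidden) / len(hidden)

# hypothetical "trained" weights; in practice these come from training
weights = [random.gauss(0.0, 1.0) for _ in range(32)]

# T stochastic forward passes approximate the predictive posterior
preds = [forward(1.5, weights) for _ in range(200)]
pred_mean = statistics.mean(preds)
pred_std = statistics.stdev(preds)
```

The spread `pred_std` across the stochastic passes serves as the model-uncertainty estimate, obtained without any change to the training procedure.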