
Differential Bayesian Neural Nets

@article{Look2019DifferentialBN,
  title={Differential Bayesian Neural Nets},
  author={Andreas Look and Melih Kandemir},
  journal={ArXiv},
  year={2019},
  volume={abs/1912.00796}
}
Neural Ordinary Differential Equations (N-ODEs) are a powerful building block for learning systems, extending residual networks to continuous-time dynamical systems. We propose a Bayesian version of N-ODEs that enables well-calibrated quantification of prediction uncertainty while maintaining the expressive power of their deterministic counterpart. We assign Bayesian Neural Nets (BNNs) to both the drift and the diffusion terms of a Stochastic Differential Equation (SDE) that models the …
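The core idea above — one BNN for the drift and another for the diffusion of an SDE — can be sketched with a plain Euler–Maruyama simulation. Everything below (the toy MLPs, Gaussian weight sampling standing in for an approximate posterior, the softplus keeping the diffusion non-negative) is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-hidden-layer nets for drift f and diffusion g.
# In a Bayesian treatment the weights come from an (approximate)
# posterior; here we simply sample them from a fixed Gaussian prior.
def sample_weights(d_in, d_hidden, d_out):
    return {
        "W1": rng.normal(0.0, 0.3, (d_hidden, d_in)),
        "b1": np.zeros(d_hidden),
        "W2": rng.normal(0.0, 0.3, (d_out, d_hidden)),
        "b2": np.zeros(d_out),
    }

def mlp(x, w):
    h = np.tanh(w["W1"] @ x + w["b1"])
    return w["W2"] @ h + w["b2"]

def euler_maruyama(x0, drift_w, diff_w, dt=0.01, steps=100):
    """Simulate dx = f(x) dt + g(x) dW with neural drift f and diffusion g."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        dw = rng.normal(0.0, np.sqrt(dt), x.shape)
        g = np.log1p(np.exp(mlp(x, diff_w)))  # softplus: non-negative diffusion
        x = x + mlp(x, drift_w) * dt + g * dw
    return x

d = 2
# Each weight sample yields one trajectory; the spread across samples
# reflects both epistemic (weight) and diffusion uncertainty.
samples = np.stack([
    euler_maruyama(np.ones(d), sample_weights(d, 16, d), sample_weights(d, 16, d))
    for _ in range(50)
])
print(samples.mean(axis=0), samples.std(axis=0))
```

The per-dimension standard deviation across sampled trajectories is what a calibrated predictive interval would be built from.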
Citations

Stochastic Differential Equations with Variational Wishart Diffusions
Provides a Bayesian non-parametric way of inferring stochastic differential equations for both regression tasks and continuous-time dynamical modelling, with experimental evidence that modelling the diffusion often improves performance.
Learning Partially Known Stochastic Dynamics with Empirical PAC Bayes
A novel scheme for fitting heavily parameterized non-linear stochastic differential equations (SDEs), with a prior assigned to the parameters of the SDE drift and diffusion functions to obtain a Bayesian model; this yields an improved model fit together with favourable extrapolation properties when a partial description of the environment dynamics is provided.
Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations
Demonstrates gradient-based stochastic variational inference in this infinite-parameter setting, producing arbitrarily flexible approximate posteriors, and derives a novel gradient estimator whose variance approaches zero as the approximate posterior approaches the true posterior.
Bayesian Learning-Based Adaptive Control for Safety Critical Systems
Develops an adaptive control framework leveraging the theory of stochastic Control Lyapunov Functions (CLFs) and stochastic Control Barrier Functions (CBFs), together with tractable Bayesian model learning via Gaussian processes or Bayesian neural networks, and demonstrates the architecture for high-speed terrestrial mobility, targeting potential applications in safety-critical high-speed Mars rover missions.
Bayesian Neural Ordinary Differential Equations
Demonstrates the successful integration of Neural ODEs with two methods of Bayesian inference, and shows probabilistic identification of model specification in partially-described dynamical systems using universal ordinary differential equations.

References

Showing 1–10 of 12 references.
Black-Box Variational Inference for Stochastic Differential Equations
Uses a standard mean-field variational approximation of the parameter posterior and introduces a recurrent neural network to approximate the posterior over the diffusion paths conditional on the parameters.
Neural Ordinary Differential Equations
Shows how to scalably backpropagate through any ODE solver without access to its internal operations, which allows end-to-end training of ODEs within larger models.
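The continuous-depth view behind this reference can be made concrete: a fixed-step Euler discretisation of dh/dt = f(h, t) is exactly a residual network with one block per step. A minimal sketch (the toy tanh dynamics and shared weights are assumptions for illustration; the paper's actual contribution, adjoint-based backprop through black-box solvers, is not shown here):

```python
import numpy as np

def f(h, t, W):
    """Toy dynamics function: the 'layer' of the continuous-depth net."""
    return np.tanh(W @ h)

def odenet_forward(h0, W, t0=0.0, t1=1.0, steps=10):
    """Euler solve of dh/dt = f(h, t): each step is one residual block,
    h_{k+1} = h_k + dt * f(h_k, t_k)."""
    h = np.array(h0, dtype=float)
    dt = (t1 - t0) / steps
    for k in range(steps):
        h = h + dt * f(h, t0 + k * dt, W)
    return h

W = np.eye(3) * 0.5
out = odenet_forward(np.ones(3), W)
```

Shrinking `dt` (more steps) deepens the network without adding parameters, which is what makes the continuous-time formulation memory-efficient to train with the adjoint method.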
Preconditioned Stochastic Gradient Langevin Dynamics for Deep Neural Networks
Proposes combining adaptive preconditioners with Stochastic Gradient Langevin Dynamics (SGLD), gives theoretical properties on asymptotic convergence and predictive risk, and reports empirical results for logistic regression, feed-forward neural nets, and convolutional neural nets demonstrating that the preconditioned SGLD method gives state-of-the-art performance.
Bayesian Learning via Stochastic Gradient Langevin Dynamics
In this paper we propose a new framework for learning from large-scale datasets based on iterative learning from small mini-batches. By adding the right amount of noise to a standard stochastic gradient …
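The update rule this reference describes — a minibatch gradient step on the log-posterior plus Gaussian noise whose variance matches the step size — can be sketched on a toy conjugate-Gaussian model. The model, step size, and batch size below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: x_i ~ N(theta, 1) with prior theta ~ N(0, 1).
data = rng.normal(2.0, 1.0, 1000)
N, batch = len(data), 50

def grad_log_posterior(theta, xb):
    prior_grad = -theta                              # d/dtheta log N(theta; 0, 1)
    lik_grad = (N / len(xb)) * np.sum(xb - theta)    # minibatch term rescaled by N/n
    return prior_grad + lik_grad

theta, eps = 0.0, 1e-3
trace = []
for t in range(5000):
    xb = rng.choice(data, batch, replace=False)
    noise = rng.normal(0.0, np.sqrt(eps))            # injected Langevin noise
    theta = theta + 0.5 * eps * grad_log_posterior(theta, xb) + noise
    trace.append(theta)

# After burn-in the chain should hover near the data mean (~2.0);
# with a decaying step size the iterates approach true posterior samples.
print(np.mean(trace[1000:]))
```

With a fixed step size the chain is only approximately correct; the paper's annealing schedule for `eps` is what yields asymptotically exact posterior sampling.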
Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks
Presents a novel scalable method for learning Bayesian neural networks, called probabilistic backpropagation (PBP), which works by computing a forward propagation of probabilities through the network and then a backward computation of gradients.
Neural SDE: Stabilizing Neural ODE Networks with Stochastic Noise
Demonstrates that the Neural SDE network can achieve better generalization than the Neural ODE and is more resistant to adversarial and non-adversarial input perturbations.
Deep learning with differential Gaussian process flows
Proposes a novel deep learning paradigm of differential flows that learn a stochastic differential equation transformation of inputs prior to a standard classification or regression function, demonstrating excellent results compared to deep Gaussian processes and Bayesian neural networks.
Applied Stochastic Differential Equations
The topic of this book is stochastic differential equations (SDEs): differential equations that produce a different "answer", or solution trajectory, each time they are solved. The emphasis is on applied rather than theoretical aspects of SDEs.
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Develops a new theoretical framework casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, which mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.
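At prediction time, the dropout-as-inference idea in this reference reduces to keeping dropout active and averaging several stochastic forward passes; their spread approximates predictive uncertainty. A minimal numpy sketch with hypothetical random weights (no training shown):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical untrained two-layer net, for illustration only.
W1 = rng.normal(0.0, 0.5, (32, 4))
W2 = rng.normal(0.0, 0.5, (1, 32))
p_keep = 0.9

def forward(x, mc_dropout=True):
    h = np.tanh(W1 @ x)
    if mc_dropout:
        # Dropout stays ON at test time; inverted scaling keeps the mean.
        h = h * rng.binomial(1, p_keep, h.shape) / p_keep
    return (W2 @ h)[0]

x = np.ones(4)
preds = np.array([forward(x) for _ in range(200)])
mean, std = preds.mean(), preds.std()
```

The sample mean is the predictive estimate; the sample standard deviation is the (approximate) model uncertainty that a deterministic single pass cannot provide.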
An equilibrium characterization of the term structure
The paper derives a general form of the term structure of interest rates under the following assumptions: (A.1) the instantaneous (spot) interest rate follows a diffusion process; (A.2) …