Corpus ID: 247223125

Closed-form Continuous-time Neural Models

@inproceedings{Hasani2021ClosedformCN,
  title={Closed-form Continuous-time Neural Models},
  author={Ramin M. Hasani and Mathias Lechner and Alexander Amini and Lucas Liebenwein and Aaron Ray and Max Tschaikowski and Gerald Teschl and Daniela Rus},
  year={2021}
}
Continuous-time neural processes are performant sequential decision-makers built from differential equations (DEs). However, their expressive power when deployed on computers is bottlenecked by numerical DE solvers. This limitation has significantly slowed the scaling and understanding of numerous natural physical phenomena, such as the dynamics of nervous systems. Ideally, we would circumvent this bottleneck by solving the given dynamical system in closed form. This is known to be…
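The solver-versus-closed-form contrast in the abstract can be made concrete with a toy sketch in Python/NumPy. This is not the paper's model: the gating form, the helper names (f_net, euler_cell, closed_form_cell), and all shapes are illustrative assumptions. The point is only that the first cell must loop a numerical solver over many sub-steps per input, while the second evaluates a time-gated expression once.

```python
# Toy sketch (NumPy), not the paper's exact model: contrast a continuous-time
# cell that is unrolled by a numerical ODE solver with a gated update evaluated
# directly in closed form. f_net, euler_cell, closed_form_cell are assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def f_net(x, u, W, U, b):
    """Small learned nonlinearity f(x, u) shared by both cells."""
    return np.tanh(W @ x + U @ u + b)

def euler_cell(x, u, params, t=1.0, steps=20):
    """Integrate dx/dt = -f(x,u) * x + f(x,u) * A with explicit Euler sub-steps."""
    W, U, b, A = params
    dt = t / steps
    for _ in range(steps):                      # the solver loop is the bottleneck
        f = f_net(x, u, W, U, b)
        x = x + dt * (-f * x + f * A)
    return x

def closed_form_cell(x, u, params, t=1.0):
    """Solver-free update: a sigmoidal gate in t blends the old state with A."""
    W, U, b, A = params
    gate = sigmoid(-f_net(x, u, W, U, b) * t)
    return gate * x + (1.0 - gate) * A

rng = np.random.default_rng(0)
d = 4
params = (rng.normal(size=(d, d)) * 0.1, rng.normal(size=(d, d)) * 0.1,
          np.zeros(d), rng.normal(size=d))
x0, u = np.zeros(d), rng.normal(size=d)
print(euler_cell(x0, u, params))
print(closed_form_cell(x0, u, params))
```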


References

Showing 1-10 of 75 references
Liquid Time-constant Networks
This work introduces a new class of time-continuous recurrent neural network models that construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates, and demonstrates the approximation capability of Liquid Time-Constant Networks (LTCs) compared to modern RNNs.
Lipschitz Recurrent Neural Networks
This work proposes a recurrent unit that describes the hidden state's evolution in two parts: a well-understood linear component plus a Lipschitz nonlinearity. The resulting unit is more robust to input and parameter perturbations than other continuous-time RNNs.
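A minimal sketch of the two-part evolution described above, under assumptions: the linear drift is a toy stable matrix A, the Lipschitz nonlinearity is tanh, and one explicit Euler step stands in for the continuous-time dynamics. The paper's specific spectral parameterization of the weight matrices is not reproduced.

```python
# Minimal sketch (NumPy): linear drift A @ h plus a 1-Lipschitz nonlinearity.
# Toy shapes and the explicit Euler step are illustrative assumptions.
import numpy as np

def lipschitz_rnn_step(h, u, A, W, U, b, dt=0.1):
    return h + dt * (A @ h + np.tanh(W @ h + U @ u + b))

rng = np.random.default_rng(2)
d = 3
A = -np.eye(d)                    # toy stable linear component
W, U, b = rng.normal(size=(d, d)) * 0.1, rng.normal(size=(d, d)) * 0.1, np.zeros(d)
h = np.zeros(d)
for u in rng.normal(size=(4, d)):
    h = lipschitz_rnn_step(h, u, A, W, U, b)
print(h)
```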
Coupled Oscillatory Recurrent Neural Network (coRNN): An accurate and (gradient) stable architecture for learning long time dependencies
This work proposes a novel architecture for recurrent neural networks based on a time-discretization of a system of second-order ordinary differential equations, modeling networks of controlled nonlinear oscillators, and proves precise bounds on the gradients of the hidden states, leading to the mitigation of the exploding and vanishing gradient problem.
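A sketch of the oscillator network described above, with assumptions: the second-order ODE is split into a position y and a velocity z, and a simple semi-implicit Euler step stands in for the paper's exact discretization; the proven gradient bounds are not reproduced.

```python
# Sketch (NumPy) of a network of controlled, damped nonlinear oscillators:
# a second-order ODE written as two first-order updates for y and z.
import numpy as np

def cornn_step(y, z, u, W, Wz, V, b, gamma=1.0, eps=0.1, dt=0.05):
    acc = np.tanh(W @ y + Wz @ z + V @ u + b) - gamma * y - eps * z
    z_new = z + dt * acc          # velocity update
    y_new = y + dt * z_new        # position update (semi-implicit in z)
    return y_new, z_new

rng = np.random.default_rng(3)
d = 3
W = rng.normal(size=(d, d)) * 0.1
Wz = rng.normal(size=(d, d)) * 0.1
V = rng.normal(size=(d, d)) * 0.1
b = np.zeros(d)
y, z = np.zeros(d), np.zeros(d)
for u in rng.normal(size=(4, d)):
    y, z = cornn_step(y, z, u, W, Wz, V, b)
print(y)
```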
Learning Long-Term Dependencies in Irregularly-Sampled Time Series
This work designs a new algorithm based on the long short-term memory (LSTM) that separates its memory from its time-continuous state within the RNN, allowing it to respond to inputs arriving at arbitrary time-lags while ensuring a constant error propagation through the memory path.
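A schematic sketch of the separation described above, not the paper's exact cell: the LSTM memory c is only updated at observation times (keeping a constant error path), while the hidden state h is evolved over the elapsed time-lag between irregular samples. The exponential decay used for that evolution and the gate shapes are assumptions.

```python
# Schematic sketch (NumPy): discrete LSTM memory path plus a hidden state that
# evolves continuously over the irregular time-lag dt between observations.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def odelstm_step(h, c, u, dt, P, tau=1.0):
    h = h * np.exp(-dt / tau)                      # continuous-time evolution over the lag
    z = np.concatenate([h, u])
    i, f, o, g = (act(Wk @ z + bk) for (Wk, bk), act in
                  zip(P, [sigmoid, sigmoid, sigmoid, np.tanh]))
    c = f * c + i * g                              # discrete memory update (LSTM path)
    return o * np.tanh(c), c

rng = np.random.default_rng(4)
d, m = 3, 2
P = [(rng.normal(size=(d, d + m)) * 0.1, np.zeros(d)) for _ in range(4)]
h, c = np.zeros(d), np.zeros(d)
for u, dt in zip(rng.normal(size=(4, m)), [0.1, 1.3, 0.2, 2.0]):  # irregular lags
    h, c = odelstm_step(h, c, u, dt, P)
print(h)
```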
Neural Ordinary Differential Equations
This work shows how to scalably backpropagate through any ODE solver, without access to its internal operations, which allows end-to-end training of ODEs within larger models.
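For illustration, a short PyTorch sketch of end-to-end training through an ODE-defined layer. The paper's contribution is the adjoint method, which backpropagates through a black-box solver with constant memory; the stand-in below instead lets autograd differentiate an unrolled fixed-step Euler solve, which shows the end-to-end idea but not the adjoint itself.

```python
# Sketch (PyTorch): autograd through an unrolled fixed-step Euler solve of
# dz/dt = f(z); a simple stand-in for the paper's adjoint-based backprop.
import torch

f = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.Tanh(), torch.nn.Linear(16, 2))

def odeint_euler(f, z0, t0=0.0, t1=1.0, steps=50):
    z, dt = z0, (t1 - t0) / steps
    for _ in range(steps):
        z = z + dt * f(z)        # every step stays in the autograd graph
    return z

z0, target = torch.randn(8, 2), torch.zeros(8, 2)
loss = ((odeint_euler(f, z0) - target) ** 2).mean()
loss.backward()                  # gradients reach all parameters of the vector field
print(f[0].weight.grad.norm())
```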
Causal Navigation by Continuous-time Neural Networks
The results demonstrate that causal continuous-time deep models can perform robust navigation tasks where advanced recurrent models fail, learn complex causal control representations directly from raw visual inputs, and scale to a variety of tasks via imitation learning.
Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks
Backpropagation through the ODE solver allows each layer to adapt its internal time-step, enabling the network to learn task-relevant time-scales and exceed state-of-the-art performance among RNNs on permuted sequential MNIST.
Approximation of dynamical systems by continuous time recurrent neural networks
Dissecting Neural ODEs
This work "open the box" and offers a system-theoretic perspective, including state augmentation strategies and robustness, with the aim of clarifying the influence of several design choices on the underlying dynamics.
AntisymmetricRNN: A Dynamical System View on Recurrent Neural Networks
This paper draws connections between recurrent networks and ordinary differential equations and proposes a special form of recurrent networks called AntisymmetricRNN, able to capture long-term dependencies thanks to the stability property of its underlying differential equation.
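A sketch of the construction the summary refers to, under assumptions: the recurrent matrix is made antisymmetric as W - W^T, a small diffusion term gamma stabilizes the forward Euler step of size eps, and the paper's gated variant is omitted.

```python
# Sketch (NumPy): antisymmetric recurrent matrix (W - W.T) keeps the hidden
# dynamics near-oscillatory; gamma adds a small stabilizing diffusion term.
import numpy as np

def antisymmetric_rnn_step(h, u, W, V, b, gamma=0.01, eps=0.1):
    A = (W - W.T) - gamma * np.eye(len(h))   # antisymmetric part minus diffusion
    return h + eps * np.tanh(A @ h + V @ u + b)

rng = np.random.default_rng(5)
d = 3
W, V, b = rng.normal(size=(d, d)) * 0.1, rng.normal(size=(d, d)) * 0.1, np.zeros(d)
h = np.zeros(d)
for u in rng.normal(size=(6, d)):
    h = antisymmetric_rnn_step(h, u, W, V, b)
print(h)
```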
...