Corpus ID: 4944472

NAIS-Net: Stable Deep Networks from Non-Autonomous Differential Equations

@article{Ciccone2018NAISNetSD,
  title={NAIS-Net: Stable Deep Networks from Non-Autonomous Differential Equations},
  author={Marco Ciccone and Marco Gallieri and J. Masci and Christian Osendorfer and F. Gomez},
  journal={ArXiv},
  year={2018},
  volume={abs/1804.07209}
}
  • This paper introduces "Non-Autonomous Input-Output Stable Network" (NAIS-Net), a very deep architecture where each stacked processing block is derived from a time-invariant non-autonomous dynamical system. Non-autonomy is implemented by skip connections from the block input to each of the unrolled processing stages and allows stability to be enforced so that blocks can be unrolled adaptively to a pattern-dependent processing depth. We prove that the network is globally asymptotically stable so…
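The block construction described in the abstract — the block input re-injected via a skip connection at every unrolled stage, with a stability constraint on the state matrix — can be sketched minimally as below. This is a hedged illustration, not the paper's exact algorithm: the tanh activation, the step size, and the negative-definite parameterization A = -(RᵀR + εI) are assumptions chosen so the linearized dynamics are contractive; the paper's actual stability projections differ in detail.

```python
import numpy as np

def nais_block(u, R, B, b, eps=0.01, steps=5, h=0.1):
    """Unroll one NAIS-Net-style block (illustrative sketch).

    Non-autonomy: the block input u is re-injected (skip connection)
    at every unrolled stage, so the stage map is x_{k+1} = x_k + h*f(x_k, u).
    Stability sketch: A = -(R^T R + eps*I) is negative definite, one
    simple way to keep the state dynamics contractive (an assumption,
    not the paper's exact reprojection).
    """
    A = -(R.T @ R + eps * np.eye(R.shape[1]))
    x = np.zeros_like(u)
    for _ in range(steps):
        # Skip connection: B @ u appears at every unrolled stage.
        x = x + h * np.tanh(A @ x + B @ u + b)
    return x

# Hypothetical usage on a random 4-dimensional input.
rng = np.random.default_rng(0)
n = 4
R = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
b = np.zeros(n)
u = rng.standard_normal(n)
x = nais_block(u, R, B, b)
```

Because the stage map is time-invariant and the input enters at every stage, the block can in principle be unrolled for more or fewer steps at inference time (the pattern-dependent depth the abstract mentions) without the state diverging.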
    34 Citations
    • Continuous-in-Depth Neural Networks
    • Tustin neural networks: a class of recurrent nets for adaptive MPC of mechanical systems
    • Residual Networks Classify Inputs Based on Their Neural Transient Dynamics
    • ANODEV2: A Coupled Neural ODE Framework
    • IMEXnet: A Forward Stable Deep Neural Network
    • Towards nominal stability certification of deep learning-based controllers
    • SNODE: Spectral Discretization of Neural ODEs for System Identification
    • Neural Controlled Differential Equations for Irregular Time Series
    • Parameterized Neural Ordinary Differential Equations: Applications to Computational Physics Problems
    • Towards Adaptive Residual Network Training: A Neural-ODE Perspective

    References

    Showing 1-10 of 61 references
    • Highway and Residual Networks learn Unrolled Iterative Estimation
    • Multi-level Residual Networks from Dynamical Systems View
    • Stable Architectures for Deep Neural Networks (Highly Influential)
    • The Reversible Residual Network: Backpropagation Without Storing Activations
    • Maximum Principle Based Algorithms for Deep Learning
    • Input output stability of recurrent neural networks (Highly Influential)
    • FractalNet: Ultra-Deep Neural Networks without Residuals
    • On orthogonality and learning recurrent networks with long term dependencies
    • Adaptive Computation Time for Recurrent Neural Networks