# Non-Euclidean Contractivity of Recurrent Neural Networks

    @article{Davydov2021NonEuclideanCO,
      title={Non-Euclidean Contractivity of Recurrent Neural Networks},
      author={A. Davydov and Anton V. Proskurnikov and Francesco Bullo},
      journal={ArXiv},
      year={2021},
      volume={abs/2110.08298}
    }

Critical questions in dynamical neuroscience and machine learning are related to the study of recurrent neural networks and their stability, robustness, entrainment, and computational efficiency. These properties can all be established through the development of a comprehensive contractivity theory for neural networks. This paper makes three sets of contributions. First, regarding ℓ1/ℓ∞ logarithmic norms, we establish quasiconvexity with respect to positive diagonal weights, monotonicity…
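
For readers unfamiliar with the ℓ1/ℓ∞ logarithmic norms (matrix measures) mentioned in the abstract, here is a minimal NumPy sketch. The closed-form expressions are standard; the example matrix is an assumption chosen purely for illustration:

```python
import numpy as np

def mu_inf(A):
    """l-infinity logarithmic norm: mu_inf(A) = max_i ( a_ii + sum_{j != i} |a_ij| )."""
    A = np.asarray(A, dtype=float)
    return float(np.max(np.diag(A) + np.abs(A).sum(axis=1) - np.abs(np.diag(A))))

def mu_1(A):
    """l-1 logarithmic norm; equals mu_inf of the transpose (column version)."""
    return mu_inf(np.asarray(A).T)

A = np.array([[-3.0, 1.0],
              [0.5, -2.0]])
print(mu_inf(A))  # -1.5 = max(-3 + 1, -2 + 0.5)
print(mu_1(A))    # -1.0 = max(-3 + 0.5, -2 + 1)
```

A negative logarithmic norm of the Jacobian is the basic certificate of contraction in the corresponding norm, which is why these quantities are central to the paper's analysis.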

## References

Showing 1-10 of 42 references

New conditions for global stability of neural networks with application to linear and quadratic programming problems

- Mathematics
- 1995

In this paper, we present new conditions ensuring existence, uniqueness, and Global Asymptotic Stability (GAS) of the equilibrium point for a large class of neural networks. The results are…

Non-Euclidean Contraction Theory for Robust Nonlinear Stability

- Mathematics, Computer Science
- 2021

This work introduces weak pairings as a framework to study contractivity with respect to arbitrary norms, and establishes five equivalent characterizations for contraction, including the one-sided Lipschitz condition for the vector field as well as matrix measure and Demidovich conditions for the corresponding Jacobian.
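
The matrix-measure (Demidovich-type) condition mentioned in this reference can be illustrated numerically: if the ℓ∞ logarithmic norm of the Jacobian is uniformly negative, the system is contractive in the ℓ∞ norm. This sketch uses a Hopfield-style vector field x' = -x + W tanh(x) with weights scaled so that contraction holds by construction; the specific scaling and sampling are assumptions for the demo:

```python
import numpy as np

def mu_inf(A):
    # l-infinity matrix measure: max_i ( a_ii + sum_{j != i} |a_ij| )
    A = np.asarray(A, dtype=float)
    return float(np.max(np.diag(A) + np.abs(A).sum(axis=1) - np.abs(np.diag(A))))

rng = np.random.default_rng(0)
n = 4
W = rng.standard_normal((n, n))
W *= 0.5 / np.abs(W).sum(axis=1).max()   # scale so ||W||_inf = 0.5 (contractive by design)

# Jacobian of f(x) = -x + W tanh(x) is J(x) = -I + W diag(1 - tanh(x)^2).
# mu_inf(J(x)) <= -1 + ||W||_inf < 0, so the sampled rates must be positive.
samples = rng.standard_normal((200, n))
c = min(-mu_inf(-np.eye(n) + W @ np.diag(1.0 - np.tanh(x) ** 2)) for x in samples)
print(c > 0)  # True: numerical evidence of l-infinity contraction with rate at least c
```

Sampling, of course, is only evidence; the cited work gives conditions that hold for all states.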

Qualitative analysis of neural networks

- Mathematics
- 1989

Results from the qualitative theory of large-scale interconnected dynamical systems are surveyed and utilized to develop a qualitative theory for the Hopfield model of neural networks. Such networks…

Lipschitz Bounded Equilibrium Networks

- Computer Science, Engineering
- ArXiv, 2020

New parameterizations of equilibrium neural networks defined by implicit equations are introduced, and well-posedness (existence of solutions) is shown under less restrictive conditions on the network weights and more natural assumptions on the activation functions: that they are monotone and slope-restricted.

A Comprehensive Review of Stability Analysis of Continuous-Time Recurrent Neural Networks

- Mathematics, Computer Science
- IEEE Transactions on Neural Networks and Learning Systems, 2014

The purpose of this paper is to provide a comprehensive review of the research on stability of continuous-time recurrent neural networks, including Hopfield neural networks, Cohen-Grossberg neural networks, and related models.

A Contractive Approach to Separable Lyapunov Functions for Monotone Systems

- Computer Science, Mathematics
- Automatica, 2019

This paper proposes an algorithm based on sum-of-squares programming to compute separable Lyapunov functions for monotone systems that are also contractive, i.e., systems in which the distance between any pair of trajectories decreases exponentially.

On a class of globally stable neural circuits

- Mathematics
- 1994

The authors show that diagonal stability of the interconnection matrix leads to a simple proof of the existence, uniqueness, and global asymptotic stability of the equilibrium of a Hopfield-Tank…
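
The diagonal-stability condition referenced here (existence of a diagonal P ≻ 0 with PA + AᵀP ≺ 0) can be checked for small examples by a direct scan over the free diagonal entry. This is a sketch only, with an assumed 2×2 interconnection matrix; LMI solvers handle the general case:

```python
import numpy as np

A = np.array([[-2.0, 1.0],
              [1.0, -2.0]])  # example interconnection matrix (assumption for the demo)

# Search for a diagonal P = diag(1, p) > 0 making P A + A^T P negative definite.
found = None
for p in np.linspace(0.1, 10.0, 1000):
    P = np.diag([1.0, p])
    M = P @ A + A.T @ P
    if np.max(np.linalg.eigvalsh(M)) < 0:   # all eigenvalues negative
        found = p
        break
print(found is not None)  # True: A is diagonally stable
```

Normalizing the first entry of P to 1 is without loss of generality, since the condition is invariant under positive scaling of P.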

Global Asymptotic Stability of a Class of Dynamical Neural Networks

- Mathematics, Medicine
- Int. J. Neural Syst., 2003

The existence and uniqueness of the equilibrium are proved, and a quadratic-type Lyapunov function is given for the flow of a competitive neural system with fast and slow dynamic variables, thus establishing the global stability of the equilibrium point.

A note on the global stability of dynamical neural networks

- Mathematics
- 2002

It is shown that the additive diagonal stability condition on the interconnection matrix of a neural network, together with the assumption that the activation functions are nondecreasing, guarantees…

Necessary and sufficient condition for absolute stability of neural networks

- Mathematics
- 1994

The main result in this paper is that for a neural circuit of the Hopfield type with a symmetric connection matrix T, the negative semidefiniteness of T is a necessary and sufficient condition for…
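
The negative-semidefiniteness test from this last reference is straightforward to state numerically for a symmetric connection matrix T; a minimal sketch (the example matrices are assumptions for illustration):

```python
import numpy as np

def is_negative_semidefinite(T, tol=1e-10):
    """For symmetric T, checks the condition of the cited result:
    all eigenvalues of T are <= 0 (up to a numerical tolerance)."""
    T = np.asarray(T, dtype=float)
    assert np.allclose(T, T.T), "condition is stated for symmetric T"
    return bool(np.all(np.linalg.eigvalsh(T) <= tol))

print(is_negative_semidefinite(np.array([[-1.0, 0.5], [0.5, -1.0]])))  # True: eigenvalues -0.5, -1.5
print(is_negative_semidefinite(np.array([[1.0, 0.0], [0.0, -2.0]])))   # False: one eigenvalue is +1
```

`eigvalsh` exploits symmetry and returns real eigenvalues, which makes the semidefiniteness check exact up to floating-point tolerance.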