• Corpus ID: 239016165

Non-Euclidean Contractivity of Recurrent Neural Networks

@article{Davydov_Proskurnikov_Bullo,
  title={Non-Euclidean Contractivity of Recurrent Neural Networks},
  author={A. Davydov and Anton V. Proskurnikov and Francesco Bullo}
}
Critical questions in dynamical neuroscience and machine learning concern recurrent neural networks and their stability, robustness, entrainment, and computational efficiency. These properties can all be established through a comprehensive contractivity theory for neural networks. This paper makes three sets of contributions. First, regarding ℓ1/ℓ∞ logarithmic norms, we establish quasiconvexity with respect to positive diagonal weights, monotonicity…
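To make the ℓ1/ℓ∞ logarithmic norms in the abstract concrete, here is a minimal sketch (illustrative code, not from the paper) computing the standard closed-form expressions: μ∞(A) is the maximum over rows of a_ii + Σ_{j≠i}|a_ij|, and μ1(A) is the same quantity over columns. A negative logarithmic norm of the Jacobian certifies contractivity of x' = f(x) in the corresponding norm.

```python
import numpy as np

def log_norm_inf(A):
    """l-infinity logarithmic norm: max over rows i of a_ii + sum_{j != i} |a_ij|."""
    A = np.asarray(A, dtype=float)
    # Off-diagonal absolute values (diagonal zeroed out)
    off = np.abs(A) - np.diag(np.abs(np.diag(A)))
    return float(np.max(np.diag(A) + off.sum(axis=1)))

def log_norm_1(A):
    """l-1 logarithmic norm: max over columns j of a_jj + sum_{i != j} |a_ij|."""
    return log_norm_inf(np.asarray(A, dtype=float).T)

# A diagonally dominant Hurwitz matrix has negative log norms,
# certifying contraction of x' = A x in the l-inf and l-1 norms.
A = np.array([[-3.0, 1.0],
              [0.5, -2.0]])
print(log_norm_inf(A))  # -> -1.5  (rows: -3+1 = -2, -2+0.5 = -1.5)
print(log_norm_1(A))    # -> -1.0  (cols: -3+0.5 = -2.5, -2+1 = -1)
```

Unlike induced matrix norms, logarithmic norms can be negative, which is exactly what makes them useful as contraction certificates.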

New conditions for global stability of neural networks with application to linear and quadratic programming problems
In this paper, we present new conditions ensuring existence, uniqueness, and Global Asymptotic Stability (GAS) of the equilibrium point for a large class of neural networks.
Non-Euclidean Contraction Theory for Robust Nonlinear Stability
This work introduces weak pairings as a framework to study contractivity with respect to arbitrary norms, and establishes five equivalent characterizations for contraction, including the one-sided Lipschitz condition for the vector field as well as matrix measure and Demidovich conditions for the corresponding Jacobian.
Qualitative analysis of neural networks
Results from the qualitative theory of large-scale interconnected dynamical systems are surveyed and used to develop a qualitative theory for the Hopfield model of neural networks.
Lipschitz Bounded Equilibrium Networks
New parameterizations of equilibrium neural networks defined by implicit equations are introduced, and well-posedness (existence of solutions) is shown under less restrictive conditions on the network weights and more natural assumptions on the activation functions, namely that they are monotone and slope-restricted.
A Comprehensive Review of Stability Analysis of Continuous-Time Recurrent Neural Networks
The purpose of this paper is to provide a comprehensive review of the research on stability of continuous-time recurrent neural networks, including Hopfield neural networks, Cohen-Grossberg neural networks, and related models.
A Contractive Approach to Separable Lyapunov Functions for Monotone Systems
  • S. Coogan
  • Computer Science, Mathematics
  • 2019
This paper proposes an algorithm based on sum-of-squares programming to compute separable Lyapunov functions for monotone systems that are also contractive, that is, for which the distance between any pair of trajectories decreases exponentially.
On a class of globally stable neural circuits
The authors show that diagonal stability of the interconnection matrix leads to a simple proof of the existence, uniqueness, and global asymptotic stability of the equilibrium of a Hopfield-Tank neural circuit.
Global Asymptotic Stability of a Class of Dynamical Neural Networks
The existence and uniqueness of the equilibrium is proved, and a quadratic-type Lyapunov function is given for the flow of a competitive neural system with fast and slow dynamic variables, thus proving the global stability of the equilibrium point.
A note on the global stability of dynamical neural networks
It is shown that the additive diagonal stability condition on the interconnection matrix of a neural network, together with the assumption that the activation functions are nondecreasing, guarantees global stability.
Necessary and sufficient condition for absolute stability of neural networks
The main result in this paper is that, for a neural circuit of the Hopfield type with a symmetric connection matrix T, negative semidefiniteness of T is a necessary and sufficient condition for absolute stability.