Absolute stability of analytic neural networks: an approach based on finite trajectory length

@article{Forti2004AbsoluteSO,
  title={Absolute stability of analytic neural networks: an approach based on finite trajectory length},
  author={Mauro Forti and Alberto Tesi},
  journal={IEEE Transactions on Circuits and Systems I: Regular Papers},
  year={2004},
  volume={51},
  pages={2460--2469}
}
A neural network (NN) is said to be convergent (or completely stable) when each trajectory tends to an equilibrium point (a stationary state). A stronger property is that of absolute stability, which means that convergence holds for any choice of the neural network parameters, and any choice of the nonlinear functions, within specified and well-characterized sets. In particular, the property of absolute stability requires that the NN be convergent also when, for some parameter values, it…
This paper has 35 citations.

Citations

Publications citing this paper.

The Łojasiewicz Exponent at an Equilibrium Point of a Standard CNN is 1/2

International Journal of Bifurcation and Chaos • 2006

Stability of Analytic Neural Networks With Event-Triggered Synaptic Feedbacks

IEEE Transactions on Neural Networks and Learning Systems • 2016

Stability of Hopfield neural networks with event-triggered feedbacks

2014 International Joint Conference on Neural Networks (IJCNN) • 2014

Convergence of a Subclass of Cohen–Grossberg Neural Networks via the Łojasiewicz Inequality

IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) • 2008

A Two-Layer Recurrent Neural Network for Nonsmooth Convex Optimization Problems

IEEE Transactions on Neural Networks and Learning Systems • 2015

A Comprehensive Review of Stability Analysis of Continuous-Time Recurrent Neural Networks

IEEE Transactions on Neural Networks and Learning Systems • 2014

References

Publications referenced by this paper.

Neurons with graded response have collective computational properties like those of two-state neurons.

Proceedings of the National Academy of Sciences of the United States of America • 1984

Cellular neural networks: theory


A more rigorous proof of complete stability of cellular neural networks

C. W. Wu, L. O. Chua
IEEE Trans. Circuits Syst. I, vol. 44, pp. 370–371, Apr. 1997

Convergent activation dynamics in continuous time networks

Neural Networks • 1989

Qualitative analysis and synthesis of a class of neural networks

J. H. Li, A. N. Michel, W. Porod
IEEE Trans. Circuits Syst., vol. 35, pp. 976–986, Aug. 1988

Absolute stability of global pattern formation and parallel memory storage by competitive neural networks

IEEE Transactions on Systems, Man, and Cybernetics • 1983