In this letter, without assuming the boundedness of the activation functions, we discuss the dynamics of a class of delayed neural networks with discontinuous activation functions. A relaxed set of sufficient conditions is derived, guaranteeing the existence, uniqueness, and global stability of the equilibrium point. Convergence behaviors for both state …
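For orientation, a common model class in this line of work (the exact system studied in the letter may differ) is the delayed additive network

\[
\dot{x}(t) = -D x(t) + A f(x(t)) + B f(x(t-\tau)) + I,
\]

where $D$ is a positive diagonal matrix of self-inhibition rates, $A$ and $B$ are the feedback and delayed-feedback connection matrices, $\tau$ is the transmission delay, $I$ is a constant input, and the activation $f$ may be discontinuous, so solutions are understood in the Filippov sense.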
Principal component and minor component extraction provide powerful techniques in many information processing fields. A number of algorithms for principal and minor component (or subspace) extraction have been proposed, and they have different dynamical behaviors. In this paper, we give a rigorous stability analysis of these algorithms, obtaining a …
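As a concrete illustration of the kind of dynamical extraction rule analyzed here, the sketch below implements Oja's rule, a classical single-unit principal component extractor. It is only a representative example, not necessarily one of the specific algorithms whose stability is analyzed in the paper.

```python
import numpy as np

def oja_first_pc(X, lr=0.01, epochs=200, seed=0):
    """Estimate the first principal component of zero-mean data X (n_samples x n_features)
    with Oja's rule: w <- w + lr * y * (x - y * w), where y = w . x."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x                  # projection of the sample onto the current estimate
            w += lr * y * (x - y * w)  # Hebbian growth term minus a normalizing decay term
    return w / np.linalg.norm(w)

# Usage: compare against the leading eigenvector of the sample covariance.
X = np.random.default_rng(1).multivariate_normal([0, 0], [[3.0, 1.0], [1.0, 1.0]], size=500)
X -= X.mean(axis=0)
w = oja_first_pc(X)
eigvals, eigvecs = np.linalg.eigh(np.cov(X.T))
print(abs(w @ eigvecs[:, -1]))  # close to 1 when the estimate aligns with the top eigenvector
```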
In this paper, we investigate synchronization of an array of linearly coupled identical connected neural networks with delays. A variational method is used to investigate local synchronization, and global exponential stability is studied as well. We do not assume that the coupling matrix is symmetric or irreducible. The linear matrix inequality (LMI) approach is used to …
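A typical formulation of such a linearly coupled array (given here only as an orientation; the paper's precise model may differ) is

\[
\dot{x}_i(t) = -C x_i(t) + A f(x_i(t)) + B f(x_i(t-\tau)) + I + \sum_{j=1}^{N} g_{ij}\, \Gamma\, x_j(t), \qquad i = 1,\dots,N,
\]

where each node is an identical delayed neural network, $G = (g_{ij})$ is the coupling matrix (not assumed symmetric or irreducible), and $\Gamma$ specifies through which state variables the nodes are coupled; synchronization means $x_i(t) - x_j(t) \to 0$ for all $i, j$.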
In this paper, we discuss the dynamics of Cohen–Grossberg neural networks with discontinuous activation functions. We provide a relaxed set of sufficient conditions based on the concept of Lyapunov diagonal stability (LDS) for Cohen–Grossberg networks to be absolutely stable. Moreover, under certain conditions we prove that the system is exponentially stable …
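For context, the general Cohen–Grossberg network (sign conventions vary between papers) takes the form

\[
\dot{x}_i(t) = -a_i(x_i(t))\Big[\, b_i(x_i(t)) - \sum_{j=1}^{n} t_{ij}\, f_j(x_j(t)) + J_i \,\Big], \qquad i = 1,\dots,n,
\]

with amplification functions $a_i > 0$, self-signal functions $b_i$, interconnection matrix $T = (t_{ij})$, constant inputs $J_i$, and activations $f_j$ that here are allowed to be discontinuous; in absolute-stability results of this type the Lyapunov diagonal stability condition is typically placed on the interconnection matrix $T$.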
The purpose of this paper is to investigate neural network capability systematically. The main results are: 1) every Tauber-Wiener function is qualified as an activation function in the hidden layer of a three-layered neural network; 2) for a continuous function in S'(R^1) to be a Tauber-Wiener function, the necessary and sufficient condition is that it …
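As a small numerical illustration of such a three-layered structure (assuming tanh as the hidden-layer activation, a standard continuous, non-polynomial choice), the sketch below fits a single-hidden-layer network to a continuous target by least squares on the output weights, with the hidden weights fixed at random. This is only an illustrative sketch, not the paper's construction or proof.

```python
import numpy as np

def fit_three_layer_net(x, y, n_hidden=50, seed=0):
    """Fit a one-hidden-layer network g(t) = sum_k c_k * tanh(w_k * t + b_k).
    Hidden weights/biases are drawn at random; only the output weights c are fit
    by linear least squares."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(-4.0, 4.0, size=n_hidden)   # hidden-layer weights
    b = rng.uniform(-4.0, 4.0, size=n_hidden)   # hidden-layer biases
    H = np.tanh(np.outer(x, w) + b)             # hidden-layer outputs, shape (n_samples, n_hidden)
    c, *_ = np.linalg.lstsq(H, y, rcond=None)   # output-layer weights by least squares
    return lambda t: np.tanh(np.outer(t, w) + b) @ c

# Usage: approximate a continuous target on [0, 1] and report the maximum error on the grid.
x = np.linspace(0.0, 1.0, 200)
target = np.sin(2 * np.pi * x) + 0.5 * np.abs(x - 0.5)
net = fit_three_layer_net(x, target)
print(np.max(np.abs(net(x) - target)))  # small residual indicates a good uniform fit on the grid
```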