It is pointed out that the so-called momentum method, widely used in the neural-network literature as an acceleration of the backpropagation method, is a stationary version of the conjugate gradient method. Connections are also made with the continuous optimization method known as heavy ball with friction. In both cases, adaptive (dynamic) choices of the so…
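The classical momentum (heavy-ball) update the abstract refers to can be sketched on a toy quadratic; the step size and momentum coefficient below are illustrative, not taken from the paper:

```python
import numpy as np

def heavy_ball(A, b, x0, lr=0.1, beta=0.9, steps=200):
    """Momentum (heavy-ball) iteration on f(x) = 0.5 x^T A x - b^T x."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        grad = A @ x - b            # gradient of the quadratic
        v = beta * v - lr * grad    # momentum term retains the previous direction
        x = x + v
    return x

A = np.array([[3.0, 0.0], [0.0, 1.0]])
b = np.array([3.0, 1.0])
x_star = heavy_ball(A, b, np.zeros(2))
# the minimizer solves A x = b, i.e. x = [1, 1]
```

With `beta = 0`, the iteration reduces to plain gradient descent; a nonzero `beta` is the "stationary" choice of the momentum parameter, as opposed to the adaptive choices the paper discusses.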
In the most basic application of Ant Colony Optimization (ACO), a set of artificial ants finds the shortest path between a source and a destination. Ants deposit pheromone on the paths they take and prefer paths that carry more pheromone. Since shorter paths are traversed faster, more pheromone accumulates on them in a given time, attracting more ants…
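The positive-feedback mechanism described above can be sketched with two candidate paths; all constants (deposit rule, evaporation rate, ant count) are illustrative, not from the paper:

```python
import random

def run_aco(lengths=(1.0, 2.0), ants=500, evaporation=0.1, seed=0):
    """Minimal two-path ACO sketch: shorter path accrues pheromone faster."""
    rng = random.Random(seed)
    pher = [1.0, 1.0]                        # initial pheromone on each path
    for _ in range(ants):
        total = pher[0] + pher[1]
        # each ant picks a path with probability proportional to its pheromone
        choice = 0 if rng.random() < pher[0] / total else 1
        pher[choice] += 1.0 / lengths[choice]  # shorter path => larger deposit
        pher[0] *= (1 - evaporation)           # evaporation keeps levels bounded
        pher[1] *= (1 - evaporation)
    return pher

pher = run_aco()
# more pheromone ends up on the shorter path (index 0)
```

Depositing an amount inversely proportional to path length stands in for "shorter paths are traversed faster"; evaporation prevents unbounded accumulation and lets the colony forget poor early choices.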
This paper formalizes a general technique for combining different methods in the solution of large systems of nonlinear equations, using parallel asynchronous implementations on distributed-memory multiprocessor systems. Such combinations of methods, referred to as Team Algorithms, are evaluated as a way of obtaining the desirable properties of different methods…
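A minimal, synchronous stand-in for the team idea is to let two different methods take turns on the same nonlinear system and share the iterate; the system, the damping factor, and the alternation schedule below are illustrative, and a real Team Algorithm would exchange iterates asynchronously across processors:

```python
import numpy as np

def F(x):
    """Toy nonlinear system with a root at (1, 2)."""
    return np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])

def J(x):
    """Jacobian of F."""
    return np.array([[2 * x[0], 1.0], [1.0, 2 * x[1]]])

def team_solve(x0, iters=30):
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        if k % 2 == 0:
            # "team member" 1: Newton step (fast near the solution)
            x = x - np.linalg.solve(J(x), F(x))
        else:
            # "team member" 2: damped residual step (cheap, globally gentle)
            x = x - 0.1 * F(x)
    return x

x = team_solve([2.0, 1.0])
# converges to the root near (1, 2)
```

The point of combining methods is that each contributes a property the other lacks, here Newton's fast local convergence and the damped step's low per-iteration cost.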
A gradient system with a discontinuous right-hand side that solves an underdetermined system of linear equations in the L1 norm is presented. An upper-bound estimate for finite-time convergence to the solution set of the system of linear equations is shown by means of the Persidskii form of the gradient system and the corresponding nonsmooth diagonal-type…
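A gradient flow of this kind can be sketched by an Euler discretization of the sign-based (discontinuous) dynamics x' = -Aᵀ sign(A x - b), which descends the L1 residual ‖A x - b‖₁; the step size, iteration count, and example system below are illustrative, not from the paper:

```python
import numpy as np

def l1_flow(A, b, steps=2000, h=0.01):
    """Euler steps of x' = -A^T sign(A x - b), descending ||A x - b||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        r = A @ x - b
        x -= h * A.T @ np.sign(r)   # discontinuous right-hand side
    return x

A = np.array([[1.0, 2.0, 0.0],      # underdetermined: 2 equations, 3 unknowns
              [0.0, 1.0, 1.0]])
b = np.array([3.0, 2.0])
x = l1_flow(A, b)
residual = np.abs(A @ x - b).sum()  # L1 residual; small after the flow settles
```

The continuous-time system reaches the solution set in finite time; the fixed-step discretization instead chatters in a small neighborhood of it, with an error on the order of the step size.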
It is shown that the assumption of D-stability of the interconnection matrix, together with the standard assumptions on the activation functions, guarantees the existence of a unique equilibrium under a synchronous mode of operation as well as under a class of asynchronous modes. For the synchronous mode, these assumptions are also shown to imply local asymptotic…
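D-stability means that D·A remains Hurwitz (all eigenvalues in the open left half-plane) for every positive diagonal D. A symmetric negative-definite matrix is one standard example; the spot-check below only samples random D matrices and is an illustration, not a proof of D-stability:

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[-2.0, 0.5],
              [0.5, -2.0]])   # symmetric negative definite => D-stable

def hurwitz(M):
    """True if all eigenvalues of M have negative real part."""
    return bool(np.all(np.linalg.eigvals(M).real < 0))

# sample positive diagonal scalings D and check that D @ A stays Hurwitz
checks = [hurwitz(np.diag(rng.uniform(0.1, 10.0, size=2)) @ A)
          for _ in range(100)]
# every sampled D A is Hurwitz for this A
```

For symmetric negative-definite A, D·A is similar to D^{1/2} A D^{1/2}, which is again symmetric negative definite, so the sampled checks cannot fail for this particular A.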