
It is pointed out that the so-called momentum method, much used in the neural network literature as an acceleration of the backpropagation method, is a stationary version of the conjugate gradient method. Connections with the continuous optimization method known as heavy ball with friction are also made. In both cases, adaptive (dynamic) choices of the so…
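A minimal sketch of the method the abstract refers to, assuming a quadratic objective; the matrix, step size, and momentum constant below are illustrative, not from the paper:

```python
import numpy as np

# Classical momentum ("heavy ball") iteration for minimizing the
# quadratic f(x) = 0.5 x^T A x - b^T x. With a FIXED step size `lr`
# and momentum `beta`, this is a stationary version of conjugate
# gradient, which instead adapts both parameters at every step.
def heavy_ball(A, b, lr=0.1, beta=0.9, iters=500):
    x = np.zeros_like(b)
    v = np.zeros_like(b)
    for _ in range(iters):
        grad = A @ x - b           # gradient of the quadratic
        v = beta * v - lr * grad   # velocity accumulates past gradients
        x = x + v
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, 1.0])
x = heavy_ball(A, b)                     # should approach A^{-1} b
```

Conjugate gradient would recompute `lr` and `beta` from residue inner products at each iteration; freezing them, as here, is exactly the "stationary" reading of the momentum method.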

This paper formalizes a general technique to combine different methods in the solution of large systems of nonlinear equations using parallel asynchronous implementations on distributed-memory multiprocessor systems. Such combinations of methods, referred to as Team Algorithms, are evaluated as a way of obtaining desirable properties of different methods…
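A toy illustration of the "team" idea, simplified to a synchronous loop (the paper studies asynchronous distributed-memory implementations); the system, partition, and relaxation factor are illustrative:

```python
import numpy as np

# Two "team members" attack the same nonlinear system F(x) = 0,
# each updating its own block of variables with a different method.
def F(x):
    return np.array([x[0]**2 + x[1] - 2.0,   # equation handled by Newton
                     x[0] + x[1]**2 - 2.0])  # equation handled by relaxation

def team_solve(x, iters=50):
    for _ in range(iters):
        # member 1: one-dimensional Newton step on the first equation
        f0 = x[0]**2 + x[1] - 2.0
        x[0] -= f0 / (2.0 * x[0])
        # member 2: damped fixed-point (relaxation) step on the second
        x[1] += 0.3 * (2.0 - x[0] - x[1]**2)
    return x

x = team_solve(np.array([2.0, 2.0]))  # converges to the root (1, 1)
```

In an asynchronous implementation each member would iterate at its own pace on its own processor, reading possibly stale values of the other member's block.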

A gradient system with discontinuous right-hand side that solves an underdetermined system of linear equations in the L1 norm is presented. An upper bound estimate for finite-time convergence to a solution set of the system of linear equations is shown by means of the Persidskii form of the gradient system and the corresponding nonsmooth diagonal type…
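A crude Euler discretization conveys the flavor of such a flow (the continuous-time system converges in finite time; a fixed-step discretization only chatters near the solution set). The matrix, step size, and iteration count below are illustrative:

```python
import numpy as np

# Euler discretization of the discontinuous gradient flow
#   dx/dt = -A^T sign(A x - b),
# which drives the L1 residual ||A x - b||_1 of an underdetermined
# system toward zero; sign() is the discontinuous right-hand side.
def l1_gradient_flow(A, b, step=0.002, iters=2000):
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        r = A @ x - b
        x -= step * A.T @ np.sign(r)
    return x

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])   # 2 equations, 3 unknowns
b = np.array([1.0, 1.0])
x = l1_gradient_flow(A, b)        # residual shrinks to chattering level
```

The final residual is bounded by a multiple of the step size, which is the discrete analogue of the finite-time convergence estimate the abstract mentions.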

It is shown that the assumption of D-stability of the interconnection matrix, together with the standard assumptions on the activation functions, guarantees the existence of a unique equilibrium under a synchronous mode of operation as well as a class of asynchronous modes. For the synchronous mode, these assumptions are also shown to imply local asymptotic…
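A toy numerical check of the flavor of this result (the network model, interconnection matrix, and inputs below are illustrative, and contraction is used in place of the paper's D-stability argument): with a sufficiently small interconnection matrix, synchronous and asynchronous updates reach the same unique equilibrium.

```python
import numpy as np

# Discrete-time network x = tanh(W x + u); ||W||_inf = 0.5 < 1 makes
# the update a contraction, so the equilibrium is unique.
W = np.array([[0.2, -0.3], [0.1, 0.25]])
u = np.array([0.5, -0.4])

def sync_run(x, iters=200):
    for _ in range(iters):
        x = np.tanh(W @ x + u)        # all neurons update together
    return x

def async_run(x, iters=200):
    for k in range(iters):
        i = k % len(x)                # neurons update one at a time
        x[i] = np.tanh(W[i] @ x + u[i])
    return x

xs = sync_run(np.zeros(2))
xa = async_run(np.zeros(2))           # both reach the same fixed point
```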

This paper develops an algorithm that extracts explanatory rules from microarray data, which we treat as time series, using genetic programming (GP) and fuzzy logic. Reverse Polish notation (RPN) is used to describe the rules and to facilitate the GP approach. The algorithm also allows for the insertion of prior knowledge, making it possible to find sets of…
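A minimal sketch of why RPN suits this setting: a postfix rule is a flat token list, easy for GP to mutate and crossover, and evaluable with a single stack. The operator names and fuzzy min/max semantics here are illustrative, not the paper's:

```python
def eval_rpn(tokens, bindings):
    """Evaluate a postfix (RPN) fuzzy-rule expression with min/max logic."""
    stack = []
    for tok in tokens:
        if tok == "AND":
            b, a = stack.pop(), stack.pop()
            stack.append(min(a, b))      # fuzzy conjunction
        elif tok == "OR":
            b, a = stack.pop(), stack.pop()
            stack.append(max(a, b))      # fuzzy disjunction
        elif tok == "NOT":
            stack.append(1.0 - stack.pop())
        else:
            stack.append(bindings[tok])  # membership degree of a gene term
    return stack[0]

# "geneA_high AND (NOT geneB_low)" in postfix:
degree = eval_rpn(["geneA_high", "geneB_low", "NOT", "AND"],
                  {"geneA_high": 0.8, "geneB_low": 0.3})
# degree is min(0.8, 1 - 0.3) = 0.7
```

Because every prefix of a valid RPN string leaves a well-defined stack, GP subtree operations map to simple token-span swaps.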

The standard conjugate gradient (CG) method uses orthogonality of the residues to simplify the formulas for the parameters necessary for convergence. In adaptive filtering, the sample-by-sample update of the correlation matrix and the cross-correlation vector causes a loss of the residue orthogonality in a modified online algorithm, which, in turn,…
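A simplified illustration of the setting, not the paper's modified online algorithm: at each sample the correlation matrix R and cross-correlation vector p are updated, and a few warm-started CG iterations refine the filter weights w in R w = p. The data model, regularizer, and iteration counts are illustrative:

```python
import numpy as np

def cg_steps(R, p, w, iters=3):
    """A few standard CG iterations for R w = p, warm-started at w."""
    r = p - R @ w
    d = r.copy()
    for _ in range(iters):
        rr = r @ r
        if rr < 1e-20:                  # already converged
            break
        Rd = R @ d
        alpha = rr / (d @ Rd)
        w = w + alpha * d
        r = r - alpha * Rd
        beta = (r @ r) / rr             # formula relies on residue orthogonality
        d = r + beta * d
    return w

rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.3])          # unknown filter to identify
R = 1e-3 * np.eye(2)                    # small regularizer keeps R invertible
p = np.zeros(2)
w = np.zeros(2)
for _ in range(200):
    x = rng.standard_normal(2)
    R += np.outer(x, x)                 # sample-by-sample updates: these
    p += (true_w @ x) * x               # perturb R and p between CG steps
    w = cg_steps(R, p, w)
```

It is exactly the per-sample change of R and p between iterations that invalidates the residue orthogonality the `beta` formula assumes, motivating the modification the abstract describes.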

Gradient dynamical systems with discontinuous right-hand sides are designed using Persidskii-type nonsmooth Lyapunov functions to work as support vector machines (SVMs) for the discrimination of nonseparable classes. The gradient systems are obtained from an exact penalty method applied to the constrained quadratic optimization problems, which are…
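A loose discretized sketch of the underlying idea: replace the SVM margin constraints by an exact (L1) penalty and follow the resulting subgradient flow. The data, penalty weight, and step size below are illustrative, and the paper's continuous-time Persidskii analysis is not reproduced here:

```python
import numpy as np

# Subgradient steps on 0.5||w||^2 + C * sum max(0, 1 - y_i (w.x_i + b)),
# the exact-penalty form of the SVM constrained quadratic program.
def penalty_svm(X, y, C=10.0, step=0.01, iters=2000):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(iters):
        viol = y * (X @ w + b) < 1.0     # margin constraints active in penalty
        gw = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        gb = -C * y[viol].sum()
        w -= step * gw
        b -= step * gb
    return w, b

X = np.array([[2.0, 2.0], [3.0, 1.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = penalty_svm(X, y)
preds = np.sign(X @ w + b)               # recovers the training labels
```

With a large enough penalty weight C, minimizers of the penalized problem coincide with those of the constrained one, which is what makes the penalty "exact".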