
We present a new matrix-free method for the large-scale trust-region subproblem, assuming that the approximate Hessian is updated by the L-BFGS formula with m = 1 or 2. We determine via simple formulas the eigenvalues of these matrices and, at each iteration, we construct a positive definite matrix whose inverse can be expressed analytically, without using…

In this work, an efficient training algorithm for feedforward neural networks is presented. It is based on a scaled version of the conjugate gradient method suggested by Perry, which employs the spectral steplength of Barzilai and Borwein that contains second order information without estimating the Hessian matrix. The learning rate is automatically adapted…
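The Barzilai-Borwein spectral steplength mentioned above can be illustrated on its own with plain gradient descent (not the paper's scaled Perry conjugate-gradient scheme). A minimal sketch, using a hypothetical quadratic test problem:

```python
import numpy as np

def bb_gradient_descent(grad, x0, iters=100):
    """Gradient descent with the Barzilai-Borwein spectral steplength
    alpha_k = s^T s / s^T y, where s = x_k - x_{k-1} and y = g_k - g_{k-1}.
    The steplength carries second order (curvature) information without
    ever forming or estimating the Hessian."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-4                      # conservative first step
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                # safeguard: keep the steplength positive
            alpha = (s @ s) / sy
        x, g = x_new, g_new
    return x

# Quadratic test problem f(x) = 0.5 x^T A x - b^T x, minimizer A^{-1} b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bb_gradient_descent(lambda x: A @ x - b, np.zeros(2))
# x_star ≈ [0.2, 0.4], the solution of A x = b
```

For a quadratic, s^T s / s^T y is the inverse Rayleigh quotient of the Hessian along the last step, which is why the iteration adapts to local curvature at no extra cost.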

A new method for the computation of the global minimum of a continuously differentiable real-valued function f of n variables is presented. This method, which is composed of two parts, is based on the combinatorial topology concept of the degree of a mapping associated with an oriented polyhedron. In the first part, interval arithmetic is implemented for a…

We present a nearly-exact method for the large scale trust region sub-problem (TRS) based on the properties of the minimal-memory BFGS method. Our study is concentrated on the case where the initial BFGS matrix can be any scaled identity matrix. The proposed method is a variant of the Moré-Sorensen method that exploits the eigenstructure of the approximate…

We present a branch-and-prune algorithm for univariate optimization. Pruning is achieved by using first order information of the objective function by means of an interval evaluation of the derivative over the current interval. First order information aids in four ways. Firstly, to check monotonicity. Secondly, to determine optimal centers which, along with the…
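The monotonicity test above is easy to see in code: if an interval evaluation of f' over a box does not contain zero, an interior box cannot hold a minimizer and can be discarded. The sketch below is not the paper's algorithm; it is a minimal branch-and-prune illustration on an assumed test function f(x) = x^4 - 4x^2 + x, with naive interval arithmetic (no outward rounding):

```python
class I:
    """Minimal interval arithmetic (outward rounding omitted for brevity)."""
    def __init__(self, lo, hi=None):
        self.lo, self.hi = lo, lo if hi is None else hi
    def __add__(self, o):
        o = o if isinstance(o, I) else I(o)
        return I(self.lo + o.lo, self.hi + o.hi)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, I) else I(o)
        return I(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        o = o if isinstance(o, I) else I(o)
        p = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
        return I(min(p), max(p))
    __rmul__ = __mul__
    def contains(self, v):
        return self.lo <= v <= self.hi

def f(x):                 # objective; works on floats and on intervals
    return x * x * x * x - 4 * (x * x) + x

def df(x):                # derivative f'(x) = 4x^3 - 8x + 1
    return 4 * (x * x * x) - 8 * x + 1

def branch_and_prune(lo, hi, tol=1e-8):
    best = min(f(lo), f(hi))               # incumbent upper bound
    work, result = [(lo, hi)], []
    while work:
        a, b = work.pop()
        X = I(a, b)
        if f(X).lo > best:                 # bound pruning
            continue
        # monotonicity pruning: 0 not in F'(X) on an interior box
        # means the box contains no interior minimizer
        if not df(X).contains(0.0) and lo < a and b < hi:
            continue
        m = 0.5 * (a + b)
        best = min(best, f(m))
        if b - a < tol:
            result.append((a, b))
        else:
            work += [(a, m), (m, b)]
    # keep only boxes still consistent with the final incumbent
    return best, [(a, b) for a, b in result if f(I(a, b)).lo <= best]

fmin, boxes = branch_and_prune(-3.0, 3.0)
# fmin ≈ -5.4442, attained near x ≈ -1.473
```

The surviving boxes enclose the global minimizer; the two pruning rules together discard monotone regions and regions whose interval lower bound already exceeds the incumbent.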

A new evolutionary algorithm for the global optimization of multimodal functions is presented. The algorithm is essentially a parallel direct search method which maintains a population of individuals and utilizes an evolution operator to evolve them. This operator has two functions. Firstly, to explore the search space as much as possible, and secondly to…
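A population-based parallel direct search of this kind can be sketched with a differential-evolution-style operator (an assumption for illustration, not necessarily the paper's operator): mutation v = a + F(b - c) explores the space, while greedy parent-vs-trial selection exploits good regions.

```python
import math
import random

def differential_evolution(f, bounds, pop_size=30, F=0.7, CR=0.9,
                           gens=200, seed=0):
    """Differential-evolution-style sketch: each individual competes
    against a trial vector built from three other population members."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(p) for p in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            # crossover: each coordinate taken from the mutant with prob. CR,
            # mutants clamped back into the bound constraints
            trial = [xi if rng.random() > CR else
                     min(max(a[d] + F * (b[d] - c[d]), bounds[d][0]),
                         bounds[d][1])
                     for d, xi in enumerate(pop[i])]
            fc = f(trial)
            if fc <= cost[i]:          # greedy selection
                pop[i], cost[i] = trial, fc
    return pop[min(range(pop_size), key=cost.__getitem__)]

def rastrigin(x):
    """Multimodal test function; global minimum 0 at the origin."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)

best = differential_evolution(rastrigin, [(-5.12, 5.12)] * 2)
```

On the 2-D Rastrigin function, whose many local minima trap pure descent methods, the population-based search typically reaches the global basin within a few thousand function evaluations.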

We present a new method for computing verified enclosures for the global minimum value and all global minimum points of univariate functions subject to bound constraints. The method works within the branch and bound framework and incorporates inner and outer pruning steps by using first order information of the objective function by means of an interval…

Interval arithmetic methods can be used to find, with certainty, all solutions to nonlinear systems of equations. In such methods, the system is transformed into a linear interval system and a preconditioned interval Gauss-Seidel method may then be used to compute bounds on the solutions. In this work, a new heuristic for solving polynomial systems…
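The core contraction step of interval Gauss-Seidel can be sketched on an assumed point linear system (the paper's setting additionally involves preconditioning and interval coefficients). Each sweep substitutes the current enclosures into all but one equation, solves for the remaining variable, and intersects with the old enclosure:

```python
def isub(p, q):                     # interval subtraction
    return (p[0] - q[1], p[1] - q[0])

def iscale(c, p):                   # scalar times interval
    a, b = c * p[0], c * p[1]
    return (min(a, b), max(a, b))

def iintersect(p, q):               # intersection of overlapping intervals
    return (max(p[0], q[0]), min(p[1], q[1]))

def gauss_seidel_sweep(A, b, X):
    """One interval Gauss-Seidel sweep for the point system A x = b:
    X_i <- X_i  intersect  (b_i - sum_{j != i} a_ij X_j) / a_ii.
    Nonzero diagonal assumed; outward rounding omitted for brevity."""
    X = list(X)
    for i in range(len(X)):
        s = (b[i], b[i])
        for j in range(len(X)):
            if j != i:
                s = isub(s, iscale(A[i][j], X[j]))
        X[i] = iintersect(iscale(1.0 / A[i][i], s), X[i])
    return X

A = [[4.0, 1.0], [1.0, 3.0]]        # diagonally dominant, so sweeps contract
b = [1.0, 2.0]
X = [(-10.0, 10.0), (-10.0, 10.0)]  # initial box enclosing the solution
for _ in range(10):
    X = gauss_seidel_sweep(A, b, X)
# X now tightly encloses the exact solution x = (1/11, 7/11)
```

Because every update intersects with the previous enclosure, the solution is never lost, and for a diagonally dominant system the box width shrinks geometrically with each sweep.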

Brain tumour grading is a crucial step for determining treatment planning and patient management. The grade of a tumour is defined by pathologists after reviewing biopsies under the microscope, a procedure that has been proven highly subjective. In this work, we propose a computer-based system for the automatic classification of astrocytomas that can be…

We present an interval branch-and-prune algorithm for computing verified enclosures for the global minimum and all global minimizers of univariate functions subject to bound constraints. The algorithm works within the branch-and-bound framework and uses first order information of the objective function. In this context, we investigate valuable properties of…