D. G. Sotiropoulos

We present a new matrix-free method for the large-scale trust-region subproblem, assuming that the approximate Hessian is updated by the L-BFGS formula with m = 1 or 2. We determine via simple formulas the eigenvalues of these matrices and, at each iteration, we construct a positive definite matrix whose inverse can be expressed analytically, without using …
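
As a hedged illustration of the kind of closed-form eigenvalue expressions this involves, consider only the case m = 1 with initial matrix B_0 = θI (a sketch in standard BFGS notation, not a formula quoted from the paper): the updated matrix is

$$ B \;=\; \theta I \;-\; \theta\,\frac{s s^{\top}}{s^{\top} s} \;+\; \frac{y y^{\top}}{s^{\top} y}, $$

which acts as θI on the orthogonal complement of span{s, y}, so θ is an eigenvalue of multiplicity n − 2, while the two remaining eigenvalues are the roots of the quadratic

$$ \lambda^{2} \;-\; \Bigl(\theta + \tfrac{y^{\top} y}{s^{\top} y}\Bigr)\lambda \;+\; \theta\,\frac{s^{\top} y}{s^{\top} s} \;=\; 0. $$
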
In this work, an efficient training algorithm for feedforward neural networks is presented. It is based on a scaled version of the conjugate gradient method suggested by Perry, which employs the spectral steplength of Barzilai and Borwein that carries second-order information without estimating the Hessian matrix. The learning rate is automatically adapted …
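
The paper's scaled Perry conjugate gradient scheme is not reproduced here; the sketch below only illustrates, on a hypothetical quadratic test problem, the Barzilai–Borwein spectral steplength that the abstract refers to (all names and dimensions are invented for the example).

```python
import numpy as np

# Hypothetical strictly convex quadratic f(x) = 0.5 x^T A x - b^T x.
rng = np.random.default_rng(0)
Q = rng.standard_normal((20, 20))
A = Q @ Q.T + 20.0 * np.eye(20)
b = rng.standard_normal(20)
grad = lambda x: A @ x - b

x, g = np.zeros(20), grad(np.zeros(20))
alpha = 1e-3                        # initial steplength, before BB information exists
for _ in range(200):
    x_new = x - alpha * g           # plain gradient step with the current steplength
    g_new = grad(x_new)
    s, y = x_new - x, g_new - g
    # BB1 spectral steplength: alpha = (s^T s) / (s^T y).  It injects curvature
    # (second-order) information without forming or estimating the Hessian.
    if s @ y > 0:
        alpha = (s @ s) / (s @ y)
    x, g = x_new, g_new
    if np.linalg.norm(g) < 1e-8:
        break

print("final gradient norm:", np.linalg.norm(g))
```
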
Artificial neural networks have been widely used for knowledge extraction from biomedical datasets and play an important role in bio-data exploration and analysis. In this work, we propose a new curvilinear algorithm for training large neural networks which is based on the analysis of the eigenstructure of the memoryless BFGS matrices. The proposed …
We present a branch-and-prune algorithm for univariate optimization. Pruning is achieved by using first-order information of the objective function by means of an interval evaluation of the derivative over the current interval. First-order information serves four purposes: first, to check monotonicity; second, to determine optimal centers which, along with the …
We present a new method for computing verified enclosures for the global minimum value and all global minimum points of univariate functions subject to bound constraints. The method works within the branch-and-bound framework and incorporates inner and outer pruning steps by using first-order information of the objective function by means of an interval …
We present an interval branch-and-prune algorithm for computing verified enclosures for the global minimum and all global minimizers of univariate functions subject to bound constraints. The algorithm works within the branch-and-bound framework and uses first-order information of the objective function. In this context, we investigate valuable properties of …
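
A minimal sketch of the first-order pruning idea common to the three abstracts above, under invented choices: the objective f, its hand-coded interval derivative enclosure dF, and the crude bisection loop are all hypothetical stand-ins for the actual branch-and-prune machinery, and no outward rounding is performed, so this does not produce verified enclosures.

```python
# Hypothetical objective on [-3, 3]; f'(x) = 4x^3 - 8x + 1, and the enclosure in
# dF exploits the monotonicity of 4x^3 (increasing) and of -8x (decreasing).
def f(x):
    return x**4 - 4.0 * x**2 + x

def dF(a, b):
    """Interval enclosure of f' over [a, b] (natural extension, a <= b)."""
    return 4.0 * a**3 - 8.0 * b + 1.0, 4.0 * b**3 - 8.0 * a + 1.0

def branch_and_prune(a, b, tol=1e-6):
    best = min(f(a), f(b))                  # best upper bound on the minimum so far
    boxes, candidates = [(a, b)], []
    while boxes:
        lo, hi = boxes.pop()
        dlo, dhi = dF(lo, hi)
        if dlo > 0.0 or dhi < 0.0:
            # f is strictly monotone on [lo, hi]: no interior minimizer, so the
            # box is pruned after sampling its endpoints.
            best = min(best, f(lo), f(hi))
            continue
        if hi - lo < tol:
            candidates.append((lo, hi))     # small box where f' may change sign
            best = min(best, f(lo), f(hi))
            continue
        mid = 0.5 * (lo + hi)
        boxes += [(lo, mid), (mid, hi)]     # bisect and keep both halves
    return best, candidates

best, candidates = branch_and_prune(-3.0, 3.0)
print("upper bound on the global minimum:", best)
print("number of candidate boxes:", len(candidates))
```

(The midpoint/value test and the inner and outer pruning steps of the papers are omitted; only the monotonicity test driven by the interval derivative is shown.)
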
We present a new matrix-free method for the computation of negative curvature directions based on the eigenstructure of minimal-memory BFGS matrices. We determine via simple formulas the eigenvalues of these matrices and we compute the desirable eigenvectors in explicit form. Consequently, a negative curvature direction is computed in such a way that avoids …
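
As a hedged numerical illustration of the eigenstructure idea (a sketch in standard BFGS notation, not the paper's actual procedure): for a memoryless BFGS matrix built from a pair with s^T y < 0, the two eigenvalues other than the multiplicity-(n − 2) eigenvalue θ come from a quadratic, and the eigenvector of the negative one lies in span{s, y}; the numpy comparison only checks the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta = 8, 1.0                       # dimension and scaling chosen arbitrarily
s = rng.standard_normal(n)
y = rng.standard_normal(n)
if s @ y > 0:
    y = -y                              # force s^T y < 0, so B below is indefinite

sy, ss, yy = s @ y, s @ s, y @ y
B = theta * np.eye(n) - theta * np.outer(s, s) / ss + np.outer(y, y) / sy

# Eigenvalues other than theta (multiplicity n - 2) are the roots of
#   lambda^2 - (theta + y^T y / s^T y) * lambda + theta * s^T y / s^T s = 0.
c = yy / sy
disc = np.sqrt((theta + c) ** 2 - 4.0 * theta * sy / ss)
lam_min = 0.5 * ((theta + c) - disc)    # the negative root, since s^T y < 0

# Explicit eigenvector in span{s, y} for lam_min, so no factorization is needed.
d = (lam_min - theta - c) * s + y
d /= np.linalg.norm(d)

print("analytic smallest eigenvalue:", lam_min)
print("numpy smallest eigenvalue   :", np.linalg.eigh(B)[0][0])
print("curvature d^T B d           :", d @ B @ d)   # equals lam_min (negative)
```
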