A New Unconstrained Optimization Method for Imprecise Function and Gradient Values

@article{Vrahatis1996ANU,
  title={A New Unconstrained Optimization Method for Imprecise Function and Gradient Values},
  author={Michael N. Vrahatis and George S. Androulakis and George Manoussakis},
  journal={Journal of Mathematical Analysis and Applications},
  year={1996},
  volume={197},
  pages={586--607}
}
A new algorithm for unconstrained optimization is presented, based on a modified one-dimensional bisection method. The algorithm uses only the signs of function and gradient values, so it can be applied to problems with imprecise function and gradient values. It converges in one iteration on quadratic functions of n variables, it rapidly minimizes general functions, and it does not require evaluation or estimation of the matrix of second partial derivatives. The algorithm has…
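
To make the core idea concrete, the sketch below applies one-dimensional bisection to the sign of the directional derivative along a search direction. Function and gradient values enter only through their signs, so small inaccuracies that do not flip a sign cannot change the iterates. This is a minimal sketch of the one-dimensional building block under a bracketing assumption, not the authors' full n-dimensional algorithm; the names (`sign_bisection_line_min`, `grad`) are illustrative.

```python
import numpy as np

def sign_bisection_line_min(grad, x, d, t_lo=0.0, t_hi=1.0, tol=1e-8):
    """Bisection on the SIGN of the directional derivative grad(x + t*d) . d.
    Only signs of gradient values are used, so perturbations that do not
    flip a sign cannot change the iterates. Assumes the directional
    derivative is negative at t_lo and positive at t_hi (a bracket).
    Illustrative sketch; names and interface are assumptions.
    """
    s_lo = np.sign(grad(x + t_lo * d) @ d)
    while t_hi - t_lo > tol:
        t_mid = 0.5 * (t_lo + t_hi)
        if np.sign(grad(x + t_mid * d) @ d) == s_lo:
            t_lo = t_mid   # minimizer still to the right of t_mid
        else:
            t_hi = t_mid   # sign change detected: minimizer to the left
    return x + 0.5 * (t_lo + t_hi) * d
```

For a strictly convex quadratic, the directional derivative along d is linear in t, so it changes sign exactly once and the bracketing assumption is easy to satisfy.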

Global Optimization for Imprecise Problems

A new method for computing the global minimum of a continuously differentiable real-valued function f of n variables is presented; it can be successfully applied to problems with imprecise function and gradient values.

A Derivative Free Minimization Method For Noisy Functions

An unconstrained minimization method based on Powell's derivative-free method is presented; it can be successfully applied to problems with imprecise function values.
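
To illustrate the value-comparison-only idea behind such methods, here is a generic compass (pattern) search sketch: it probes each coordinate direction in both senses, accepts any improvement, and shrinks the step otherwise. It is not Powell's conjugate-direction method itself, and all names and parameters are illustrative.

```python
import numpy as np

def compass_search(f, x0, step=1.0, shrink=0.5, min_step=1e-6):
    """Generic compass (pattern) search: probe +/- each coordinate,
    accept any improvement, shrink the step otherwise. Uses only
    comparisons of function values, which is what lends derivative-free
    methods some tolerance to noise. Not Powell's method; illustrative.
    """
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    while step > min_step:
        improved = False
        for i in range(len(x)):
            for s in (1.0, -1.0):
                trial = x.copy()
                trial[i] += s * step
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:
            step *= shrink
    return x
```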

A New Preconditioned Inexact Line-Search Technique for Unconstrained Optimization

Numerical results show that the new inexact line-search and the new preconditioned conjugate gradient search directions are efficient for solving unconstrained nonlinear optimization problems in many situations.
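
A hedged sketch of the general technique named in the title follows: nonlinear conjugate gradients with a preconditioner applied to the gradient and an inexact (Armijo backtracking) line search. This is a textbook PR+ variant for illustration, not the paper's specific preconditioner or line-search rule; `pcg_minimize`, `M_inv`, and the constants are assumptions.

```python
import numpy as np

def pcg_minimize(f, grad, x0, M_inv, iters=100, c1=1e-4, tol=1e-6):
    """Preconditioned nonlinear CG (PR+) with Armijo backtracking.
    M_inv applies the inverse preconditioner to a vector. Textbook
    sketch, not the paper's preconditioner or line-search rule.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    z = M_inv(g)
    d = -z
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:            # safeguard: restart along steepest descent
            d = -z
        # Inexact (Armijo) line search: backtrack until sufficient decrease.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + c1 * t * (g @ d):
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        z_new = M_inv(g_new)
        beta = max(0.0, (g_new - g) @ z_new / (g @ z))  # preconditioned PR+
        d = -z_new + beta * d
        g, z = g_new, z_new
    return x
```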

A NON-MONOTONE CONIC METHOD FOR UNCONSTRAINED OPTIMIZATION

A new algorithm for finding the unconstrained minimum of a continuously differentiable function f(x) in n variables is presented, based on a conic model function, which does not involve the conjugacy matrix or the Hessian of the model function.

DIMENSION REDUCING METHODS FOR SYSTEMS OF NONLINEAR EQUATIONS AND UNCONSTRAINED OPTIMIZATION: A REVIEW

Although these methods reduce the problem to a simpler one, they converge quadratically and incorporate the advantages of nonlinear SOR and Newton's algorithms; since they do not directly perform function evaluations, they can be applied to problems with imprecise function values.

The non-monotone conic algorithm

A new algorithm for finding the unconstrained minimum of a continuously differentiable function f(x) in n variables is presented, based on a conic model function, which does not involve the conjugacy matrix or the Hessian of the model function.

New Sequential and Parallel Derivative-Free Algorithms for Unconstrained Minimization

It is proved that, under mild assumptions, a sufficient decrease condition holds for a nonsmooth function, and sequential and parallel derivative-free algorithms for finding a local minimum of smooth and nonsmooth functions of practical interest are presented.

Sign-methods for training with imprecise error function and gradient values

The proposed approach seems practically useful when training is affected by technology imperfections, limited precision in operations and data, hardware component variations and environmental changes that cause unpredictable deviations of parameter values from the designed configuration.
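
A minimal illustration of a sign-based update: only the sign of each error-gradient component drives the step, so bounded perturbations in gradient values that do not flip a component's sign leave training unchanged. This is a generic signSGD/Rprop-style sketch, not the paper's specific training algorithm; the name and step size are assumptions.

```python
import numpy as np

def sign_update(w, grad_w, step=1e-3):
    """One sign-based step: only the sign of each gradient component is
    used, so errors in the gradient that do not flip a component's sign
    leave the update unchanged. Generic sketch, not the paper's method.
    """
    return w - step * np.sign(grad_w)
```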
...

References

A Rapidly Convergent Descent Method for Minimization

A number of theorems are proved to show that it always converges and that it converges rapidly, and this method has been used to solve a system of one hundred non-linear simultaneous equations.

Numerical methods for unconstrained optimization and nonlinear equations

Covers Newton's method for nonlinear equations and unconstrained minimization, and methods for solving nonlinear least-squares problems with special structure.

Algorithm 666: Chabis: a mathematical software package for locating and evaluating roots of systems of nonlinear equations

The user interface to CHABIS is described and several details of its implementation are presented, as well as an example of its usage.

Asymptotic near optimality of the bisection method

It is proved that the answer to the question whether bisection is nearly optimal in the asymptotic worst-case sense is positive for the class F of functions having zeros of infinite multiplicity and information consisting of evaluations of continuous linear functionals.

Iterative solution of nonlinear equations in several variables

Contents include: convergence of minimization methods; an annotated list of basic reference books; bibliography; author index; subject index.

Bisection is optimal

We seek an approximation to a zero of a continuous function f: [a, b] → ℝ such that f(a) ≦ 0 and f(b) ≧ 0. It is known that the bisection algorithm makes optimal use of n function evaluations, i.e., …
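
For reference, the standard bisection scheme this optimality result analyzes can be sketched as follows: after n function evaluations the zero is bracketed within (b − a)/2ⁿ.

```python
def bisect(f, a, b, n):
    """Standard bisection under f(a) <= 0 <= f(b): each of the n
    function evaluations halves the bracket, so the zero is located
    to within (b - a) / 2**n.
    """
    for _ in range(n):
        m = 0.5 * (a + b)
        if f(m) <= 0:
            a = m
        else:
            b = m
    return 0.5 * (a + b)
```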

Testing Unconstrained Optimization Software

A relatively large but easy-to-use collection of test functions and designed guidelines for testing the reliability and robustness of unconstrained optimization software.

Numerical Recipes in C: The Art of Scientific Computing

This is the revised and greatly expanded Second Edition of the hugely popular Numerical Recipes, with over 100 new routines (now well over 300 in all), plus upgraded versions of many of the original routines.

Matrix Iterative Analysis

Contents include: matrix properties and concepts; nonnegative matrices; basic iterative methods and comparison theorems; successive overrelaxation iterative methods; semi-iterative methods; derivation and …