We consider the global optimization of a nonsmooth (nondifferentiable) nonconvex real function. We introduce a variable metric descent method adapted to nonsmooth situations, which is modified by the incorporation of suitable random perturbations. Convergence to a global minimum is established and a simple method for the generation of suitable perturbations …
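The following minimal Python sketch illustrates the general idea of combining a descent step with a decaying random perturbation; the step rule, perturbation schedule, acceptance test, and test function below are illustrative assumptions, not the paper's exact scheme or convergence conditions.

```python
# Minimal sketch: descent step plus decaying random perturbation for a
# nonsmooth objective. All parameters here are illustrative assumptions.
import numpy as np

def perturbed_descent(f, grad_est, x0, iters=200, step=0.1, sigma0=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = x0.copy()
    x_best, f_best = x0.copy(), f(x0)
    for k in range(1, iters + 1):
        d = -grad_est(x)                      # descent direction (e.g. a subgradient estimate)
        sigma = sigma0 / np.sqrt(k)           # decaying perturbation magnitude (assumption)
        trial = x + step * d + sigma * rng.standard_normal(x.shape)
        if f(trial) < f(x):                   # accept only improving perturbed steps
            x = trial
        if f(x) < f_best:
            x_best, f_best = x.copy(), f(x)
    return x_best, f_best

# Example: a nonsmooth, nonconvex test function
f = lambda x: np.abs(x[0]) + 0.5 * np.cos(3 * x[1]) + 0.05 * x @ x
grad_est = lambda x: np.array([np.sign(x[0]), -1.5 * np.sin(3 * x[1])]) + 0.1 * x
print(perturbed_descent(f, grad_est, np.array([2.0, -1.0])))
```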
A random perturbation of the generalized reduced gradient method for optimization under nonlinear differentiable constraints is proposed. In general, an iteration of this method proceeds in two phases. In the Restoration Phase, feasibility is restored by solving an auxiliary nonlinear problem, in general a nonlinear system …
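A hedged sketch of what a restoration phase can look like: starting from an infeasible trial point, feasibility with respect to constraints c(x) = 0 is recovered by a few Gauss-Newton corrections. The constraint, Jacobian, and tolerances below are illustrative assumptions, not the paper's formulation.

```python
# Sketch of a restoration phase: recover feasibility for c(x) = 0 by
# minimum-norm Gauss-Newton corrections (illustrative assumptions only).
import numpy as np

def restore_feasibility(c, jac, x, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        r = c(x)
        if np.linalg.norm(r) < tol:
            break
        J = jac(x)
        # minimum-norm correction: solve J dx = -r in the least-squares sense
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + dx
    return x

# Example constraint: points on the unit circle, x1^2 + x2^2 = 1
c = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0])
jac = lambda x: np.array([[2 * x[0], 2 * x[1]]])
print(restore_feasibility(c, jac, np.array([1.5, 0.5])))
```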
We present a random perturbation of the projected variable metric method for solving linearly constrained nonsmooth (i.e., nondifferentiable) nonconvex optimization problems, and we establish the convergence to a global minimum for a locally Lipschitz continuous objective function which may be nondifferentiable on a countable set of points. Numerical …
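For the linearly constrained case, the projection step can be illustrated as follows; this is a generic sketch of projecting a (sub)gradient onto the null space of the constraint matrix, with the variable metric and the random perturbation of the paper omitted.

```python
# Sketch of the projection used by projected methods for linear equality
# constraints A x = b: directions are projected onto the null space of A
# so that feasibility is preserved. Metric and perturbation are omitted.
import numpy as np

def null_space_projector(A):
    # P = I - A^T (A A^T)^{-1} A projects any vector onto {d : A d = 0}
    AAT_inv = np.linalg.inv(A @ A.T)
    return np.eye(A.shape[1]) - A.T @ AAT_inv @ A

A = np.array([[1.0, 1.0, 1.0]])       # single constraint x1 + x2 + x3 = const
P = null_space_projector(A)
g = np.array([3.0, -1.0, 2.0])        # (sub)gradient at the current point
d = -P @ g                            # projected descent direction; A @ d == 0
print(d, A @ d)
```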
In this paper, we propose an implementation of the stochastic perturbation of reduced gradient and bisection (SPRGB) method for optimizing a non-convex differentiable function subject to linear equality constraints and non-negativity bounds on the variables. In particular, at each iteration, we compute a search direction by the reduced gradient, and an optimal line …
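A hedged Python sketch of the two ingredients named above, a reduced gradient direction for constraints A x = b and a bisection line search on the directional derivative; the variable partition, tolerances, bound handling, and stochastic perturbation are assumptions or omitted here.

```python
# Sketch: reduced gradient direction for A x = b (basic/nonbasic split)
# and a bisection line search on phi'(t). Details are illustrative only.
import numpy as np

def reduced_gradient_direction(grad, A, basic, nonbasic):
    B, N = A[:, basic], A[:, nonbasic]
    lam = np.linalg.solve(B.T, grad[basic])
    r = grad[nonbasic] - N.T @ lam                   # reduced gradient
    d = np.zeros_like(grad)
    d[nonbasic] = -r
    d[basic] = -np.linalg.solve(B, N @ d[nonbasic])  # keep A d = 0
    return d

def bisection_line_search(dphi, t_max=1.0, tol=1e-8):
    # Find a step t in (0, t_max] where the directional derivative crosses zero.
    lo, hi = 0.0, t_max
    if dphi(hi) < 0:          # still descending at t_max: take the full step
        return t_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if dphi(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Tiny usage example: minimize f(x) = ||x - c||^2 subject to x1 + x2 + x3 = 1
c = np.array([1.0, 2.0, 3.0])
A = np.array([[1.0, 1.0, 1.0]])
x = np.array([1.0, 0.0, 0.0])                        # feasible start
grad = 2 * (x - c)
d = reduced_gradient_direction(grad, A, basic=[0], nonbasic=[1, 2])
dphi = lambda t: 2 * (x + t * d - c) @ d             # phi'(t) for this quadratic
t = bisection_line_search(dphi, t_max=1.0)
x_new = x + t * d
print(x_new, A @ x_new)                              # constraint still satisfied
```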