
We consider the global optimization of a nonsmooth (nondifferentiable) nonconvex real function. We introduce a variable metric descent method adapted to nonsmooth situations, which is modified by the incorporation of suitable random perturbations. Convergence to a global minimum is established and a simple method for the generation of suitable perturbations…

- Abdelkrim El Mouatasim
- J. Applied Mathematics
- 2010
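The perturbation idea described in the abstract above can be sketched generically. The following is an illustrative toy, not the paper's variable metric algorithm: the test function, step size, and decay schedule are all assumptions, and plain gradient descent stands in for the variable metric direction.

```python
import math
import random

def perturbed_descent(f, grad, x0, step=0.05, sigma0=0.5,
                      n_iter=2000, seed=0):
    """Gradient descent with decaying random perturbations.

    At each iteration a Gaussian perturbation with decreasing
    amplitude is added to the descent step, which lets the iterate
    escape shallow local minima; the best point visited is kept.
    """
    rng = random.Random(seed)
    x, best_x, best_f = x0, x0, f(x0)
    for k in range(1, n_iter + 1):
        sigma = sigma0 / math.sqrt(k)          # decaying amplitude
        x = x - step * grad(x) + sigma * rng.gauss(0.0, 1.0)
        fx = f(x)
        if fx < best_f:                        # track the best iterate
            best_x, best_f = x, fx
    return best_x, best_f

# Nonconvex test function with several local minima;
# the global minimum is at x = 0 with f(0) = 0.
f = lambda x: x**2 + 3.0 * math.sin(2.0 * x) ** 2
grad = lambda x: 2.0 * x + 6.0 * math.sin(4.0 * x)

x_star, f_star = perturbed_descent(f, grad, x0=2.5)
```

The decay schedule `sigma0 / sqrt(k)` is a common choice that balances early exploration against late convergence; the papers above establish convergence under precise conditions on the perturbation sequence.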

A random perturbation of the generalized reduced gradient method for optimization under nonlinear differentiable constraints is proposed. Generally speaking, a particular iteration of this method proceeds in two phases. In the Restoration Phase, feasibility is restored by means of the resolution of an auxiliary nonlinear problem, a generally nonlinear system…

- Abdelkrim El Mouatasim, Rachid Ellaia, A. Al-Hossain
- Optimization Letters
- 2012
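The Restoration Phase mentioned above can be illustrated in miniature. This is a toy with a single scalar constraint solved by Newton's method, purely an assumption for illustration; the papers deal with a general nonlinear constraint system.

```python
def restore_feasibility(h, h_prime, x, tol=1e-10, max_iter=50):
    """Restore feasibility for one nonlinear constraint h(x) = 0.

    After a tentative optimization step leaves the feasible set,
    a Newton iteration on the constraint drives h(x) back to zero.
    """
    for _ in range(max_iter):
        hx = h(x)
        if abs(hx) < tol:
            break
        x = x - hx / h_prime(x)  # Newton update on the constraint
    return x

# Constraint h(x) = x^2 - 2 = 0; restore from the infeasible point x = 3
x_feas = restore_feasibility(lambda x: x * x - 2.0, lambda x: 2.0 * x, 3.0)
```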

- Abdelkrim El Mouatasim, Rachid Ellaia, José Eduardo Souza de Cursi
- Applied Mathematics and Computation
- 2014

- Abdelkrim El Mouatasim, Mohamed Wakrim
- Signal, Image and Video Processing
- 2015

- Abdelkrim El Mouatasim, Rachid Ellaia, José Eduardo Souza de Cursi
- Applied Mathematics and Computer Science
- 2011

We present a random perturbation of the projected variable metric method for solving linearly constrained nonsmooth (i.e., nondifferentiable) nonconvex optimization problems, and we establish the convergence to a global minimum for a locally Lipschitz continuous objective function which may be nondifferentiable on a countable set of points. Numerical…

- Abdelkrim El Mouatasim
- Numerical Algorithms
- 2017
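The projection step in a projected method like the one above can be sketched for the simplest case of a single linear equality constraint. This is a minimal illustration under that assumption; the paper handles general linear constraints and a variable metric.

```python
def project_onto_hyperplane(x, a, b):
    """Euclidean projection of x onto the hyperplane {y : a . y = b}.

    The closed form is x - t * a with t = (a . x - b) / ||a||^2,
    i.e. move against the constraint normal until a . y = b holds.
    """
    dot = sum(ai * xi for ai, xi in zip(a, x))
    norm2 = sum(ai * ai for ai in a)
    t = (dot - b) / norm2
    return [xi - t * ai for ai, xi in zip(a, x)]

# Project (2, 2) onto {y : y1 + y2 = 2}, giving (1, 1)
p = project_onto_hyperplane([2.0, 2.0], [1.0, 1.0], 2.0)
```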

In this paper, we propose an implementation of the stochastic perturbation of reduced gradient and bisection (SPRGB) method for optimizing a nonconvex differentiable function subject to linear equality constraints and nonnegativity bounds on the variables. In particular, at each iteration we compute a search direction by the reduced gradient, and an optimal line…
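The bisection line search that gives SPRGB its name can be sketched as a root search on the directional derivative. This is a generic illustration, not the paper's code: `phi_prime` is the derivative of the line function phi(t) = f(x + t·d), and the bracket [a, b] is assumed to satisfy phi'(a) < 0 < phi'(b).

```python
def bisection_line_search(phi_prime, a=0.0, b=1.0, tol=1e-8, max_iter=100):
    """Bisection on phi'(t) to find an (approximately) optimal step t.

    Repeatedly halves the bracket [a, b], keeping the sign change
    of phi' inside it, until the bracket or |phi'| is below tol.
    """
    lo, hi = a, b
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if abs(phi_prime(mid)) < tol or (hi - lo) < tol:
            return mid
        if phi_prime(mid) < 0.0:   # still descending: move lo up
            lo = mid
        else:                      # ascending: move hi down
            hi = mid
    return 0.5 * (lo + hi)

# Example: phi(t) = (t - 0.3)^2, so phi'(t) = 2(t - 0.3); minimizer t = 0.3
t_star = bisection_line_search(lambda t: 2.0 * (t - 0.3))
```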

In this paper, the global optimization of a nonconvex objective function under linear and nonlinear differentiable constraints is studied. Reduced gradient and GRG descent methods with random perturbation are proposed, and the aim is to establish the global convergence of the algorithm. Some numerical examples are also given for problems of…
