A Global Optimization Algorithm (GOP) For Certain Classes of Nonconvex NLPs: I. Theory

Abstract

A large number of nonlinear optimization problems involve bilinear, quadratic and/or polynomial functions in their objective function and/or constraints. In this paper, a theoretical approach is proposed for the global optimization of constrained nonconvex NLP problems. The original nonconvex problem is decomposed into primal and relaxed dual subproblems by introducing new transformation variables, if necessary, and by partitioning the resulting variable set. The decomposition is designed to provide valid upper and lower bounds on the global optimum through the solutions of the primal and relaxed dual subproblems, respectively. New theoretical results are presented that enable the rigorous solution of the relaxed dual problem. The approach is used in the development of a Global OPtimization algorithm (GOP). The algorithm is proved to attain finite ε-convergence and ε-global optimality. An example problem is used to illustrate the GOP algorithm both computationally and geometrically. In an accompanying paper (Visweswaran and Floudas, 1990), the application of the theory and the GOP algorithm to various classes of optimization problems is presented, along with computational results of the approach.
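To make the decomposition concrete, the following is a minimal sketch of the upper/lower bounding iteration described in the abstract, applied to a made-up two-variable bilinear problem (minimize 0.5x + 0.3y - xy over the unit box). The primal step fixes y and minimizes over x, yielding a feasible point and hence an upper bound. For the lower bound, the paper's relaxed dual problem (built from Lagrange-function cuts over partitions of the variable set) is replaced here, purely for illustration, by a McCormick convex relaxation of the bilinear term; the problem data, variable names, and stopping tolerance are all assumptions, not taken from the paper.

```python
# Illustrative sketch only: a McCormick convex relaxation stands in for the
# paper's relaxed dual problem, to show the primal (upper bound) / relaxation
# (lower bound) iteration pattern. Assumed toy problem (not from the paper):
#     minimize  f(x, y) = 0.5*x + 0.3*y - x*y   over the box 0 <= x, y <= 1.
import numpy as np
from scipy.optimize import linprog

XL, XU, YL, YU = 0.0, 1.0, 0.0, 1.0
EPS = 1e-6

def primal_upper_bound(y_fix):
    """Fix y; f becomes linear in x, so its minimum over the box is at an endpoint."""
    slope = 0.5 - y_fix                      # d f / d x with y fixed
    x_opt = XL if slope >= 0 else XU
    return x_opt, 0.5 * x_opt + 0.3 * y_fix - x_opt * y_fix

def relaxed_lower_bound():
    """McCormick relaxation: replace the bilinear term x*y by w with w <= x and
    w <= y (the two over-estimators that matter here, since w enters with a minus sign)."""
    # variables z = (x, y, w); minimize 0.5x + 0.3y - w
    c = np.array([0.5, 0.3, -1.0])
    A_ub = np.array([[-1.0, 0.0, 1.0],       # w - x <= 0
                     [0.0, -1.0, 1.0]])      # w - y <= 0
    b_ub = np.zeros(2)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(XL, XU), (YL, YU), (XL * YL, XU * YU)],
                  method="highs")
    _, y, _ = res.x
    return res.fun, y                        # lower bound and the y to try next

y_k = 0.5                                    # arbitrary starting value for the fixed variable
best_ub = np.inf
for it in range(20):
    x_k, ub = primal_upper_bound(y_k)
    best_ub = min(best_ub, ub)
    lb, y_next = relaxed_lower_bound()
    print(f"iter {it}: UB = {best_ub:.4f}, LB = {lb:.4f}")
    if best_ub - lb <= EPS:                  # epsilon-convergence test, as in the GOP framework
        break
    y_k = y_next
```

On this toy instance the loop closes the gap in two iterations. In the GOP algorithm itself the lower-bounding (relaxed dual) problem is tightened at every iteration with new cuts, which is what yields the finite ε-convergence claimed in the abstract.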

