We present two new constraint qualifications (CQs) that are weaker than the recently introduced Relaxed Constant Positive Linear Dependence (RCPLD) constraint qualification. RCPLD is based on the assumption that many subsets of the gradients of the active constraints preserve positive linear dependence locally. A major open question was to identify the exact …
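For reference, here is a hedged sketch of the notion this abstract relies on (the standard definition; the paper's exact formulation may differ). A family of gradients is positively linearly dependent at x^* when a nontrivial combination vanishes, with signs unrestricted on the equality part and nonnegative on the inequality part:

\[
\sum_{i\in I}\lambda_i \nabla h_i(x^*) + \sum_{j\in J}\mu_j \nabla g_j(x^*) = 0,
\qquad (\lambda,\mu)\neq 0,\quad \mu_j\ge 0 \ \text{for all } j\in J,
\]

where the h_i are equality constraints and the g_j are inequality constraints active at x^*.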
In this work we introduce a relaxed version of the constant positive linear dependence constraint qualification (CPLD), which we call RCPLD. This development is inspired by a recent generalization, due to Minchenko and Stakhovski, of the constant rank constraint qualification, called RCR. We show that RCPLD is sufficient to ensure the convergence of an …
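To make the relaxation concrete, here is a hedged sketch of both conditions for the problem min f(x) subject to h(x) = 0, g(x) ≤ 0 (standard statements; the paper's precise wording may differ):

\[
\begin{aligned}
&\textbf{CPLD at } x^*: \text{for every } I\subseteq\{1,\dots,m\} \text{ and } J\subseteq A(x^*)=\{j : g_j(x^*)=0\},\\
&\quad \{\nabla h_i(x^*)\}_{i\in I}\cup\{\nabla g_j(x^*)\}_{j\in J} \text{ positively linearly dependent}\\
&\quad \Longrightarrow\ \{\nabla h_i(x)\}_{i\in I}\cup\{\nabla g_j(x)\}_{j\in J} \text{ linearly dependent for all } x \text{ near } x^*.\\[4pt]
&\textbf{RCPLD at } x^*: \{\nabla h_i(x)\}_{i=1}^{m} \text{ has constant rank near } x^*, \text{ and the implication}\\
&\quad \text{above is required only for } I \text{ a fixed index set such that } \{\nabla h_i(x^*)\}_{i\in I}\\
&\quad \text{is a basis of } \operatorname{span}\{\nabla h_i(x^*)\}_{i=1}^{m}.
\end{aligned}
\]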
In this work we present new weak conditions that ensure the validity of necessary second-order optimality conditions (SOCs) for nonlinear optimization. We prove that the weak and strong SOCs hold for all Lagrange multipliers under Abadie-type assumptions. We also prove the weak and strong SOCs for at least one Lagrange multiplier imposing the …
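For orientation, the SOCs in question have the following standard textbook form (a sketch; the contribution here is proving them under assumptions weaker than the usual CQs). With L(x, λ) the Lagrangian, the condition is

\[
d^{\top}\, \nabla^2_{xx} L(x^*,\lambda)\, d \ \ge\ 0,
\]

required for every d in the critical cone \(\{d : \nabla h_i(x^*)^{\top} d = 0,\ \nabla g_j(x^*)^{\top} d \le 0\ (j\in A(x^*)),\ \nabla f(x^*)^{\top} d \le 0\}\) for the strong SOC, and for every d in the subspace obtained by turning the inequalities on the active \(\nabla g_j\) into equalities for the weak SOC.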
In this paper we investigate how to efficiently apply approximate Karush-Kuhn-Tucker (AKKT) proximity measures as stopping criteria for optimization algorithms that do not generate approximations to the Lagrange multipliers, in particular genetic algorithms. We prove that for a wide range of constrained optimization problems the KKT error measure tends to …
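As an illustration of how such a stopping test can be computed when the algorithm supplies only a candidate point, here is a minimal sketch (the function name kkt_error and the exact residual are illustrative assumptions, not the paper's measure): multipliers for the nearly active constraints are recovered by nonnegative least squares on the stationarity equation, and a stationarity-plus-complementarity residual is reported.

    import numpy as np
    from scipy.optimize import nnls

    def kkt_error(x, grad_f, grads_g, g_vals, tol_act=1e-6):
        """Illustrative AKKT-style proximity measure for min f(x) s.t. g(x) <= 0.

        Multipliers for the nearly active constraints are estimated by
        nonnegative least squares; the returned value combines the
        stationarity and complementarity residuals.
        """
        gf = grad_f(x)
        active = [j for j, gj in enumerate(g_vals) if gj >= -tol_act]
        if not active:
            return float(np.linalg.norm(gf))
        # Columns of G are gradients of nearly active constraints; solve
        # min_{lam >= 0} ||gf + G lam|| via nnls(G, -gf).
        G = np.column_stack([grads_g[j](x) for j in active])
        lam, _ = nnls(G, -gf)
        stationarity = np.linalg.norm(gf + G @ lam)
        complementarity = max(abs(lam[k] * g_vals[j])
                              for k, j in enumerate(active))
        return float(stationarity + complementarity)

A point returned by, say, a genetic algorithm would then be accepted once kkt_error(x, ...) falls below a prescribed tolerance.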
We consider the minimization of a convex function on a compact polyhedron defined by linear equality constraints and nonnegative variables. We define the Levenberg-Marquardt (L-M) and central trajectories starting at the analytic center and using the same parameter, and show that they satisfy a primal-dual relationship, being close to each other for large …
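In one common parameterization (a sketch under assumed notation, with x^a the analytic center of the feasible polyhedron; the paper's exact regularization may differ), the two trajectories are

\[
x_C(\mu) = \arg\min_{Ax=b,\ x>0}\Big\{ f(x) - \mu \sum_{i=1}^{n}\ln x_i \Big\},
\qquad
x_{LM}(\mu) = \arg\min_{Ax=b,\ x\ge 0}\Big\{ f(x) + \tfrac{\mu}{2}\,\|x - x^a\|^2 \Big\},
\]

and both tend to x^a as \(\mu \to \infty\), which is the sense in which they share a starting point and a common parameter.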