# A Moving Balls Approximation Method for a Class of Smooth Constrained Minimization Problems

```bibtex
@article{Auslender2010AMB,
  title   = {A Moving Balls Approximation Method for a Class of Smooth Constrained Minimization Problems},
  author  = {Alfred Auslender and Ron Shefi and Marc Teboulle},
  journal = {SIAM J. Optim.},
  year    = {2010},
  volume  = {20},
  pages   = {3232-3259}
}
```

We introduce a new algorithm for a class of smooth constrained minimization problems: an iterative scheme that generates a sequence of feasible points by approximating the constraint set with a sequence of balls, and is accordingly called the Moving Balls Approximation algorithm (MBA). The computational simplicity of MBA, which uses only first-order information, makes it suitable for large-scale problems. Theoretical and computational properties of MBA in its primal and dual forms are…
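The iteration described in the abstract can be sketched for the special case of a single smooth constraint: at a feasible point, the descent lemma turns the model of the constraint into a ball, and minimizing the quadratic objective model over that ball is just a Euclidean projection. This is a minimal illustrative sketch, not the paper's general implementation; the toy problem and function names are assumptions.

```python
import numpy as np

def mba(f_grad, g_val, g_grad, L0, L1, x0, iters=100):
    """One-constraint sketch of a Moving Balls Approximation step:
    at each feasible x_k, the set {g <= 0} is replaced by a ball
    built from first-order data, and a strongly convex quadratic
    model of f is minimized over that ball."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Unconstrained minimizer of the quadratic objective model
        # f(x_k) + <grad f, y - x_k> + (L0/2)||y - x_k||^2.
        c0 = x - f_grad(x) / L0
        # Ball model of the constraint (descent lemma, constant L1):
        # g(x_k) + <grad g, y - x_k> + (L1/2)||y - x_k||^2 <= 0
        #   <=>  ||y - c1||^2 <= r2, with c1 and r2 as below.
        gg = g_grad(x)
        c1 = x - gg / L1
        r2 = gg @ gg / L1**2 - 2.0 * g_val(x) / L1
        # The subproblem reduces to projecting c0 onto that ball,
        # which keeps every iterate feasible.
        d = c0 - c1
        nd = np.linalg.norm(d)
        r = np.sqrt(max(r2, 0.0))
        x = c0 if nd <= r else c1 + (r / nd) * d
    return x

# Toy problem: min ||x - a||^2  s.t.  ||x||^2 - 1 <= 0, with a = (2, 2);
# the solution is the projection of a onto the unit ball.
a = np.array([2.0, 2.0])
x_star = mba(f_grad=lambda x: 2.0 * (x - a), g_val=lambda x: x @ x - 1.0,
             g_grad=lambda x: 2.0 * x, L0=2.0, L1=2.0, x0=np.zeros(2))
```

Because the ball sits inside the feasible set, the scheme is feasible by construction, which is the property the abstract emphasizes.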

## 19 Citations

### Level Constrained First Order Methods for Function Constrained Optimization

- Computer Science, Mathematics
- 2022

We present a new feasible proximal gradient method for constrained optimization where both the objective and constraint functions are given by the sum of a smooth, possibly nonconvex function and a…

### Convergence Rate Analysis of a Sequential Convex Programming Method with Line Search for a Class of Constrained Difference-of-Convex Optimization Problems

- Mathematics, Computer Science
- SIAM J. Optim.
- 2021

This paper analyzes the convergence rate of the sequence generated by the sequential convex programming method with monotone line search for a class of difference-of-convex (DC) optimization problems with multiple smooth inequality constraints. Under additional assumptions on the constraint functions, it also deduces the KL exponent of the extended objective function from that of its Lagrangian in the convex setting.
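The DC structure underlying this line of work can be illustrated by the basic DC-algorithm step (without the constraints or the line search studied in the paper): linearize the concave part at the current point and minimize the resulting convex surrogate. The toy problem below is an assumption for illustration only.

```python
import numpy as np

def dca(grad_h, solve_convex, x0, iters=50):
    """Basic DC-algorithm iteration for f = g - h with g, h convex:
    at x_k, replace -h by its linearization via a subgradient s_k of h,
    then minimize the convex surrogate g(x) - s_k * x exactly."""
    x = float(x0)
    for _ in range(iters):
        s = grad_h(x)          # subgradient of h at x_k
        x = solve_convex(s)    # argmin_x  g(x) - s * x
    return x

# Toy DC problem: f(x) = 0.5*x**2 - |x|, i.e. g(x) = 0.5*x**2, h(x) = |x|.
# The convex subproblem argmin 0.5*x**2 - s*x has the closed form x = s.
x_star = dca(grad_h=np.sign, solve_convex=lambda s: s, x0=0.3)
```

Each iterate is a minimizer of a convex upper model that touches f at x_k, so the objective values are nonincreasing; the cited papers strengthen this to convergence rates via KL exponents.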

### A very simple SQCQP method for a class of smooth convex constrained minimization problems with nice convergence results

- Mathematics, Computer Science
- Math. Program.
- 2013

A new and very simple algorithm, SSQM, is proposed for a class of smooth convex constrained minimization problems: an iterative scheme related to sequential quadratically constrained quadratic programming methods that uses only first-order information, making it suitable for large-scale problems.

### Retraction-based first-order feasible methods for difference-of-convex programs with smooth inequality and simple geometric constraints

- Computer Science, Mathematics
- 2021

This paper proposes first-order feasible methods for difference-of-convex (DC) programs with smooth inequality and simple geometric constraints, and shows that the extended objective of a large class of Euclidean norm regularized convex optimization problems is a KL function with exponent 1/2; consequently, the algorithm is locally linearly convergent when applied to these problems.

### The multiproximal linearization method for convex composite problems

- Computer Science, Mathematics
- Math. Program.
- 2020

This work shows through several numerical experiments how the use of multiple proximal terms can be decisive for problems with complex geometries.

### Majorization-Minimization Procedures and Convergence of SQP Methods for Semi-Algebraic and Tame Programs

- Mathematics
- Math. Oper. Res.
- 2016

This work studies general majorization-minimization procedures produced by families of strongly convex subproblems and establishes the convergence of sequences generated by these types of schemes to critical points, using techniques from semi-algebraic geometry and variational analysis.

### Convergence analysis on the sequential convex programming method with monotone line search for multiple constrained difference-of-convex model

- Mathematics
- 2020

In this paper, we study the sequential convex programming method with monotone line search (SCPls) in [34] for a class of difference-of-convex (DC) optimization problems with multiple smooth…

### Sequential Convex Programming Methods for A Class of Structured Nonlinear Programming

- Mathematics, Computer Science
- ArXiv
- 2012

A variant of the exact SCP method for SNLP is proposed in which a nonmonotone scheme and "local" Lipschitz constants of the associated functions are used, and a similar convergence result is established.

### An Extended Sequential Quadratically Constrained Quadratic Programming Algorithm for Nonlinear, Semidefinite, and Second-Order Cone Programming

- Computer Science, Mathematics
- J. Optim. Theory Appl.
- 2013

The purpose of this paper is to establish global convergence results without boundedness assumptions on any of the iterative sequences built by the algorithm, for nonlinear, semidefinite, and second-order cone programs.

### Ghost Penalties in Nonconvex Constrained Optimization: Diminishing Stepsizes and Iteration Complexity

- Computer Science
- Math. Oper. Res.
- 2021

It is shown that, by using directions obtained in an SQP-like fashion, convergence to generalized stationary points can be proved; the iteration complexity of general diminishing-stepsize methods for nonconvex constrained optimization problems is also considered.
