A general system for heuristic minimization of convex functions over non-convex sets

@article{Diamond2018AGS,
  title={A general system for heuristic minimization of convex functions over non-convex sets},
  author={Steven Diamond and Reza Takapoui and Stephen P. Boyd},
  journal={Optimization Methods and Software},
  year={2018},
  volume={33},
  pages={165--193}
}
We describe general heuristics to approximately solve a wide variety of problems with convex objective and decision variables from a non-convex set. The heuristics, which employ convex relaxations, convex restrictions, local neighbour search methods, and the alternating direction method of multipliers, require the solution of a modest number of convex problems, and are meant to apply to general problems, without much tuning. We describe an implementation of these methods in a package called… 
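One of the heuristics described, relax-and-round, can be illustrated on boolean least squares: minimize ||Ax − b||² over x ∈ {0,1}ⁿ by solving the convex relaxation over [0,1]ⁿ and then projecting onto the nonconvex set. Below is a minimal sketch in plain NumPy; the function name and the projected-gradient inner solver are illustrative choices, not the paper's implementation:

```python
import numpy as np

def relax_and_round(A, b, iters=500):
    """Heuristic for min ||Ax - b||^2 subject to x in {0,1}^n.

    Step 1: relax {0,1}^n to the box [0,1]^n and solve the convex
    relaxation by projected gradient descent.
    Step 2: project the relaxed solution onto {0,1}^n by rounding.
    """
    n = A.shape[1]
    x = np.full(n, 0.5)
    # Gradient of ||Ax - b||^2 is 2 A^T (Ax - b); Lipschitz constant
    # of the gradient is 2 * sigma_max(A)^2, so use step 1/L.
    L = 2.0 * np.linalg.norm(A, 2) ** 2
    for _ in range(iters):
        grad = 2.0 * A.T @ (A @ x - b)
        x = np.clip(x - grad / L, 0.0, 1.0)   # projection onto the box
    x_round = np.round(x)                     # projection onto {0,1}^n
    return x_round, float(np.sum((A @ x_round - b) ** 2))
```

For well-conditioned problems the rounded point is often near-optimal, but as with all the heuristics surveyed here, there is no optimality guarantee in general.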

Exterior-point Optimization for Nonconvex Learning

TLDR
It is demonstrated that the NExOS algorithm, in spite of being general purpose, outperforms specialized methods on several examples of well-known nonconvex learning problems involving sparse and low-rank optimization.

On Convergence of Heuristics Based on Douglas-Rachford Splitting and ADMM to Minimize Convex Functions over Nonconvex Sets

  • Shuvomoy Das Gupta
  • Computer Science
    2018 56th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
  • 2018
TLDR
This paper studies minimization problems with convex cost functions over nonconvex constraint sets, and establishes sufficient conditions under which the Douglas-Rachford splitting heuristic either converges to a point or has cluster points forming a nonempty compact connected set.

Operator splitting methods for convex optimization : analysis and implementation

TLDR
It is shown that the alternating direction method of multipliers (ADMM) can detect infeasible problems, and OSQP, a novel general-purpose solver for quadratic programs (QPs) based on ADMM, is presented; the proposed approach significantly outperforms a common approach based on convex relaxation of the original nonconvex problem.

Exterior-point Operator Splitting for Nonconvex Learning

TLDR
NExOS, a novel linearly convergent first-order algorithm tailored for constrained nonconvex learning problems, is presented and it is shown that in spite of being general purpose, NExOS is able to compute high quality solutions very quickly and is competitive with specialized algorithms.

Algorithms and software for projections onto intersections of convex and non-convex sets with applications to inverse problems

TLDR
Results show that the regularization of inverse problems in physical parameter estimation and image processing benefit from working with all available prior information and are not limited to one or two regularizers because of algorithmic, computational, or hyper-parameter selection issues.

Optimal representative sample weighting

TLDR
This work considers the problem of assigning weights to a set of samples or data records, with the goal of achieving a representative weighting, and describes the open-source implementation rsw and applies it to a skewed sample of the CDC BRFSS dataset.

A Relax-and-Round Approach to Complex Lattice Basis Reduction

TLDR
This work introduces a relaxed version of the problem that, while still nonconvex, has an easily identifiable family of solutions and constructs a subset of such solutions by performing a greedy search and applying a projection operator (element-wise rounding) to enforce the original constraint.

Alternating direction method of multipliers as a simple effective heuristic for mixed-integer nonlinear optimization

TLDR
A variation of the alternating direction method of multipliers (ADMM) is utilized as a simple heuristic for mixed-integer nonlinear optimization problems in structural optimization.

Multiblock ADMM Heuristics for Mixed-Binary Optimization on Classical and Quantum Computers

TLDR
A decomposition-based approach is proposed to extend the applicability of current methods for “quadratic plus convex” mixed-binary optimization (MBO) problems, so as to solve a broad class of real-world optimization problems.

Convex restrictions in physical design

TLDR
This work focuses on the specific case in which each physical design parameter is the ratio of two field variables, and shows in many practical cases there exist globally optimal designs whose design parameters are maximized or minimized at each point in the domain.
...

References

Showing 1–10 of 101 references

A simple effective heuristic for embedded mixed-integer quadratic programming

TLDR
A fast optimization algorithm for approximately minimizing convex quadratic functions over the intersection of affine and separable constraints (i.e., the Cartesian product of possibly nonconvex real sets) that is based on a variation of the alternating direction method of multipliers (ADMM).
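The ADMM variation described here alternates a convex x-update with a projection onto the nonconvex set. Below is a minimal sketch for a quadratic objective with integrality constraints, using the scaled-dual form of ADMM; the function name, penalty parameter, and iteration count are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def admm_heuristic(P, q, rho=1.0, iters=50):
    """ADMM heuristic for min (1/2) x^T P x + q^T x  s.t. x integer.

    x-update: minimize the quadratic plus the augmented-Lagrangian
              penalty (a linear solve with P + rho*I);
    z-update: project x + u onto the integers (rounding);
    u-update: accumulate the residual (scaled dual variable).
    """
    n = len(q)
    z = np.zeros(n)
    u = np.zeros(n)
    M = P + rho * np.eye(n)
    for _ in range(iters):
        x = np.linalg.solve(M, -q + rho * (z - u))
        z = np.round(x + u)       # projection onto the nonconvex set
        u = u + x - z
    return z
```

Because the constraint set is nonconvex, the iterates may oscillate for some problem data; in practice one keeps the best feasible z found across iterations.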

Variations and extension of the convex–concave procedure

TLDR
This work investigates the convex–concave procedure, a local heuristic that uses the tools of convex optimization to find local optima of difference-of-convex (DC) programming problems, and generalizes the algorithm to handle vector inequalities.
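As a concrete one-dimensional instance of the convex–concave procedure, take f(x) = x⁴ − x² with the DC decomposition g(x) = x⁴ (convex) and h(x) = x² (convex): each iteration linearizes h at the current point and minimizes the resulting convex surrogate, which here has a closed-form minimizer. A toy sketch (not from the paper):

```python
import numpy as np

def ccp_quartic(x0, iters=100):
    """Convex-concave procedure on f(x) = x**4 - x**2.

    With f = g - h, g(x) = x**4 and h(x) = x**2, each step replaces
    h by its linearization at x_k and minimizes the convex surrogate
        x**4 - (x_k**2 + 2*x_k*(x - x_k)).
    Setting the derivative 4x**3 - 2*x_k to zero gives the
    closed-form update x = (x_k / 2) ** (1/3) for x_k > 0.
    """
    x = x0
    for _ in range(iters):
        x = (x / 2.0) ** (1.0 / 3.0)
    return x
```

Starting from x0 = 1, the iterates contract toward the fixed point x = 1/√2 ≈ 0.7071, the positive local (and here global) minimizer of f; as with any CCP run, the limit depends on the starting point (x0 < 0 would yield −1/√2).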

Evaluation of Convex Optimization Techniques for the Weighted Graph-Matching Problem in Computer Vision

We present a novel approach to the weighted graph-matching problem in computer vision, based on a convex relaxation of the underlying combinatorial optimization problem. The approach always computes

Alternating direction method of multipliers for real and complex polynomial optimization models

TLDR
The main ingredient of the approach is to apply the classical alternating direction method of multipliers based on the augmented Lagrangian function to fully exploit the multi-block structure of the polynomial functions, even though the optimization model encountered is highly non-linear and non-convex.

A polyhedral branch-and-cut approach to global optimization

TLDR
This paper facilitates the reliable use of nonlinear convex relaxations in global optimization via a polyhedral branch-and-cut approach and proves that, if the convexity of a univariate or multivariate function is apparent by decomposing it into convex subexpressions, the relaxation constructor automatically exploits this convexity in a manner that is much superior to developing polyhedral outer approximators for the original function.

Semidefinite Programming

TLDR
A survey of the theory and applications of semidefinite programs and an introduction to primal-dual interior-point methods for their solution are given.

Convex Optimization

TLDR
A comprehensive introduction to the subject of convex optimization shows in detail how such problems can be solved numerically with great efficiency.

On convex relaxation of graph isomorphism

TLDR
This study proposes easy-to-check spectral properties that characterize a wide family of “friendly” graphs for which the convex relaxation is guaranteed to find the exact isomorphism or certify its nonexistence, and extends this result to approximately isomorphic graphs.

Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers

TLDR
It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.

On the linear convergence of the alternating direction method of multipliers

TLDR
This paper establishes the global R-linear convergence of the ADMM for minimizing the sum of any number of convex separable functions, assuming that a certain error bound condition holds true and the dual stepsize is sufficiently small.
...