Mixed-Projection Conic Optimization: A New Paradigm for Modeling Rank Constraints

@article{Bertsimas2020MixedProjectionCO,
  title={Mixed-Projection Conic Optimization: A New Paradigm for Modeling Rank Constraints},
  author={Dimitris Bertsimas and Ryan Cory-Wright and Jean Pauphilet},
  journal={ArXiv},
  year={2020},
  volume={abs/2009.10395}
}
We propose a framework for modeling and solving low-rank optimization problems to certifiable optimality. We introduce symmetric projection matrices that satisfy $Y^2=Y$, the matrix analog of binary variables that satisfy $z^2=z$, to model rank constraints. By leveraging regularization and strong duality, we prove that this modeling paradigm yields tractable convex optimization problems over the non-convex set of orthogonal projection matrices. Furthermore, we design outer-approximation… 
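
As a schematic illustration of the modeling idea in the abstract (a sketch, not the paper's full formulation): writing $\mathcal{Y}_n = \{Y \in S^n : Y^2 = Y\}$ for the set of orthogonal projection matrices, a rank constraint $\mathrm{Rank}(X) \le k$ on $X \in \mathbb{R}^{n \times m}$ can be encoded as

$$\exists\, Y \in \mathcal{Y}_n:\quad \mathrm{tr}(Y) \le k, \quad X = Y X,$$

in direct analogy with encoding sparsity of a vector via binary $z$ with $z^2 = z$ and $x_i = z_i x_i$ componentwise: $X = YX$ forces the columns of $X$ into the range of $Y$, so $\mathrm{Rank}(X) \le \mathrm{tr}(Y) \le k$.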

FrankWolfe.jl: A High-Performance and Flexible Toolbox for Frank–Wolfe Algorithms and Conditional Gradients

FrankWolfe.jl is an open-source implementation of several popular Frank–Wolfe and conditional gradients variants for first-order constrained optimization, allowing for easy extension and relying on few assumptions regarding the user-provided functions.
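
A minimal usage sketch in Julia, assuming the package's standard frank_wolfe entry point, its ProbabilitySimplexOracle linear minimization oracle, and an in-place gradient callback; the toy objective and parameter values below are illustrative only, and the exact return type and options should be checked against the package documentation:

using FrankWolfe, LinearAlgebra

# Toy problem: project a point p onto the probability simplex,
# i.e. minimize f(x) = 1/2 * ||x - p||^2 over {x >= 0, sum(x) = 1}.
n = 100
p = randn(n)
f(x) = 0.5 * norm(x .- p)^2
grad!(storage, x) = (storage .= x .- p)               # in-place gradient, as the package expects

lmo = FrankWolfe.ProbabilitySimplexOracle(1.0)        # linear minimization oracle over the simplex
x0 = collect(FrankWolfe.compute_extreme_point(lmo, zeros(n)))  # a vertex of the simplex, densified

# Run the vanilla Frank-Wolfe variant; the returned collection begins with the final iterate.
x_final = first(FrankWolfe.frank_wolfe(f, grad!, lmo, x0; max_iteration=500, verbose=true))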

A new perspective on low-rank optimization

This work combines the matrix perspective function with orthogonal projection matrices to develop a matrix perspective reformulation technique that reliably obtains strong relaxations for a variety of low-rank problems, including reduced rank regression, non-negative matrix factorization, and factor analysis.
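
One concrete instance of the technique, sketched here for a quadratic term (the notation is illustrative): if $\mathrm{Rank}(X) \le k$ is modeled with an orthogonal projection matrix $Y$ satisfying $X = YX$, a term $\mathrm{tr}(X^\top X)$ can be replaced by its matrix perspective $\mathrm{tr}(X^\top Y^\dagger X)$, which is semidefinite-representable through a generalized Schur complement:

$$\mathrm{tr}(X^\top Y^\dagger X) \;=\; \min_{\Theta}\ \Big\{\, \mathrm{tr}(\Theta) \ :\ \begin{pmatrix} Y & X \\ X^\top & \Theta \end{pmatrix} \succeq 0 \,\Big\},$$

valid whenever $Y \succeq 0$ and the columns of $X$ lie in the range of $Y$; relaxing $Y^2 = Y$ to $0 \preceq Y \preceq I$ with $\mathrm{tr}(Y) \le k$ then yields the strong convex relaxations referred to above.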

Sparse PCA With Multiple Components

This work designs tight semidefinite relaxations, proposes tractable second-order cone versions of these relaxations that supply high-quality upper bounds, and investigates the performance of these methods in spiked covariance settings.
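
For context, the multiple-component problem underlying this work can be stated schematically (symbols here are illustrative) as

$$\max_{U \in \mathbb{R}^{p \times r}} \ \mathrm{tr}(U^\top \Sigma U) \quad \text{s.t.} \quad U^\top U = I_r, \quad \|U_{\cdot j}\|_0 \le k_j,\ j = 1, \dots, r,$$

where $\Sigma$ is a sample covariance matrix, the columns of $U$ are the sparse principal components, and $k_j$ bounds the support size of the $j$-th component.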

Computational complexity of decomposing a symmetric matrix as a sum of positive semidefinite and diagonal matrices

It is proved that when the rank of the positive semidefinite matrix in the decomposition is bounded above by an absolute constant, the problem can be solved in polynomial time, and that many of these low-rank decomposition problems are complete for the existential theory of the reals.

Mixed integer linear optimization formulations for learning optimal binary classification trees

This paper proposes four mixed integer linear optimization (MILO) formulations for designing optimal binary classification trees and provides theoretical comparisons between these formulations and the strongest flow-based MILO formulation of Aghaei et al. (2021).

On the convex hull of convex quadratic optimization problems with indicators

The new theory presented here unifies several previously established results, and paves the way toward utilizing polyhedral methods to analyze the convex hull of mixed-integer nonlinear sets.

Compact extended formulations for low-rank functions with indicator variables

This paper proposes a new disjunctive representation of the sets under study, which leads to compact formulations with size exponential in the rank of the function, but polynomial in the number of variables, and shows how to project out the additional variables for the case of rank-one functions.

Sparse Plus Low Rank Matrix Decomposition: A Discrete Optimization Approach

This work introduces a novel formulation for SLR that directly models the underlying discreteness of the problem, and develops an alternating minimization heuristic to compute high-quality solutions along with a novel semidefinite relaxation that provides meaningful bounds for the solutions returned by the heuristic.
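
Schematically, and with symbols chosen here for illustration, the SLR problem in question seeks, for a given data matrix $D \in \mathbb{R}^{n \times m}$,

$$\min_{S, L \in \mathbb{R}^{n \times m}} \ \|D - S - L\|_F^2 \quad \text{s.t.} \quad \|S\|_0 \le s, \quad \mathrm{Rank}(L) \le r,$$

i.e. a decomposition into a sparse matrix $S$ and a low-rank matrix $L$; exact constraint and regularization choices vary across formulations.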

Ideal formulations for constrained convex optimization problems with indicator variables

This paper gives the convex hull description of the epigraph of the composition of a one-dimensional convex function and an affine function under arbitrary combinatorial constraints, and gives a short proof that, for a separable objective function, the perspective reformulation is ideal independently of the constraints of the problem.
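
For a single coordinate, the perspective reformulation works as follows (a standard sketch, with $f$ a one-dimensional convex function satisfying $f(0) = 0$ and $z \in \{0,1\}$ an indicator forcing $x = 0$ whenever $z = 0$): the epigraph constraint $t \ge f(x)$ is replaced by its perspective

$$t \ \ge\ z\, f(x/z), \qquad \text{with } z f(x/z) := 0 \text{ at } z = 0,\ x = 0;$$

for instance, $f(x) = x^2$ gives $t z \ge x^2$, a rotated second-order cone constraint.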

Exterior-point Optimization for Nonconvex Learning

It is demonstrated that the NExOS algorithm, in spite of being general purpose, outperforms specialized methods on several examples of well-known nonconvex learning problems involving sparse and low-rank optimization.

References

Showing 1-10 of 131 references.

Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization

It is shown that if a certain restricted isometry property holds for the linear transformation defining the constraints, the minimum-rank solution can be recovered by solving a convex optimization problem, namely, the minimization of the nuclear norm over the given affine space.
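
In symbols (a standard sketch), the affine rank minimization problem and its nuclear norm surrogate are

$$\min_{X} \ \mathrm{Rank}(X) \ \ \text{s.t.} \ \ \mathcal{A}(X) = b \qquad \text{and} \qquad \min_{X} \ \|X\|_* \ \ \text{s.t.} \ \ \mathcal{A}(X) = b,$$

where $\|X\|_*$ denotes the sum of the singular values of $X$; the result cited here shows that the two share the same minimum-rank solution when the linear map $\mathcal{A}$ satisfies a restricted isometry property.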

Solving Rank-Constrained Semidefinite Programs in Exact Arithmetic

  • Simone Naldi, J. Symb. Comput., 2016
This paper designs an exact algorithm for solving rank-constrained semidefinite programs, whose complexity is essentially quadratic in natural degree bounds associated with the given optimization problem: for subfamilies of the problem where the size of the feasible matrix is fixed, the complexity is polynomial in the number of variables.

A unified approach to mixed-integer optimization: Nonlinear formulations and scalable algorithms

This work proposes a unified framework to address a family of classical mixed-integer optimization problems, including network design, facility location, unit commitment, sparse portfolio selection, binary quadratic optimization and sparse learning problems, and establishes that a general-purpose numerical strategy, which combines cutting-plane, first-order and local search methods, solves these problems faster and at a larger scale than state-of-the-art mixed-integer linear or second-order cone methods.

Rank-one Convexification for Sparse Regression

These relaxations can be formulated as semidefinite optimization problems in an extended space and are stronger and more general than the state-of-the-art formulations, including the perspective reformulation and formulations with the reverse Huber penalty and the minimax concave penalty functions.

Outer approximation with conic certificates for mixed-integer convex problems

The robustness of Pajarito is demonstrated by solving diverse MI-conic problems involving mixtures of positive semidefinite, second-order, and exponential cones; the solver is competitive with CPLEX’s specialized MISOCP algorithm.

A Unified Approach to Mixed-Integer Optimization

We propose a unified framework to address a family of classical mixed-integer optimization problems with logically constrained decision variables, including network design, facility location, …

Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm

A low-rank factorization model is proposed and a nonlinear successive over-relaxation (SOR) algorithm is constructed that requires solving only a linear least-squares problem per iteration, improving the capacity to solve large-scale problems.
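
The factorization model being solved is, schematically (with $\Omega$ the set of observed entries of the data matrix $M$; details vary by implementation),

$$\min_{U \in \mathbb{R}^{m \times r},\ V \in \mathbb{R}^{r \times n},\ Z} \ \tfrac{1}{2}\|U V - Z\|_F^2 \quad \text{s.t.} \quad Z_{ij} = M_{ij} \ \ \text{for } (i,j) \in \Omega,$$

so that each block update in $U$, $V$, or $Z$ reduces to a linear least-squares or projection step, which the SOR scheme then accelerates.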

Regularization vs. Relaxation: A conic optimization perspective of statistical variable selection

This paper shows that a popular sparsity-inducing concave penalty function known as the Minimax Concave Penalty (MCP), and the reverse Huber penalty derived in a recent work by Pilanci, Wainwright and El Ghaoui, can both be derived as special cases of a lifted convex relaxation called the perspective relaxation.
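
For reference, the Minimax Concave Penalty mentioned here takes, in its usual scalar form with parameters $\lambda, \gamma > 0$,

$$p_{\lambda,\gamma}(\beta) \;=\; \begin{cases} \lambda |\beta| - \dfrac{\beta^2}{2\gamma}, & |\beta| \le \gamma \lambda, \\[4pt] \dfrac{\gamma \lambda^2}{2}, & |\beta| > \gamma \lambda, \end{cases}$$

interpolating between a linear penalty near the origin and a constant penalty for large coefficients.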

Sparse learning via Boolean relaxations

  • Mert Pilanci, Martin J. Wainwright, Laurent El Ghaoui, Mathematical Programming, 2015
Novel relaxations for cardinality-constrained learning problems, including least-squares regression as a special but important case, are proposed, and it is shown that randomization based on the relaxed solution offers a principled way to generate provably good feasible solutions.
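
A schematic version of the construction for ridge-regularized least squares (the ridge parameter $\rho > 0$ is an illustrative assumption here): writing $z \in \{0,1\}^n$ for the support indicators, the cardinality-constrained problem can be expressed as

$$\min_{z \in \{0,1\}^n,\ \mathbf{1}^\top z \le k} \ F(z), \qquad F(z) \;=\; \min_{w\,:\,w_i = z_i w_i} \ \tfrac{1}{2}\|y - X w\|_2^2 + \tfrac{\rho}{2}\|w\|_2^2,$$

and the inner minimum $F(z)$ admits a closed form that remains well defined and convex when $z$ is relaxed to $[0,1]^n$, which is the Boolean relaxation; randomized rounding of the relaxed $z$ then yields the provably good feasible solutions mentioned above.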

Enhancing RLT relaxations via a new class of semidefinite cuts

In this paper, we propose a mechanism to tighten Reformulation-Linearization Technique (RLT) based relaxations for solving nonconvex programming problems by importing concepts from semidefinite programming…
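
The basic RLT mechanism being tightened works roughly as follows (a sketch with box constraints $l \le x \le u$ and a matrix variable $X$ standing in for $x x^\top$): pairwise products of bound factors are linearized, e.g.

$$(x_i - l_i)(x_j - l_j) \ge 0 \ \Rightarrow\ X_{ij} - l_i x_j - l_j x_i + l_i l_j \ge 0,$$

and semidefinite cuts are derived from the valid condition

$$\begin{pmatrix} 1 & x^\top \\ x & X \end{pmatrix} \succeq 0.$$
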
...