Pseudo basic steps: bound improvement guarantees from Lagrangian decomposition in convex disjunctive programming

@article{Papageorgiou2018PseudoBS,
  title={Pseudo basic steps: bound improvement guarantees from Lagrangian decomposition in convex disjunctive programming},
  author={Dimitri J. Papageorgiou and Francisco Trespalacios},
  journal={EURO Journal on Computational Optimization},
  year={2018},
  volume={6},
  pages={55--83}
}
An elementary, but fundamental, operation in disjunctive programming is a basic step, which is the intersection of two disjunctions to form a new disjunction. Basic steps bring a disjunctive set in regular form closer to its disjunctive normal form and, in turn, produce relaxations that are at least as tight. An open question is: What are guaranteed bounds on the improvement from a basic step? In this paper, using properties of a convex disjunctive program’s hull reformulation and multipliers… 
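The basic step described above can be sketched in a few lines of code. Below is a minimal 1-D illustration with a hypothetical pair of interval disjunctions, chosen only to show that the convex hull of the intersected disjunction can be strictly tighter than intersecting the two original hulls:

```python
# A minimal 1-D sketch of a basic step: intersecting two disjunctions
# (each a union of intervals) disjunct-by-disjunct. The intervals are
# hypothetical, picked so the tightening is strict.

def basic_step(d1, d2):
    """Pairwise-intersect the disjuncts of two interval disjunctions."""
    out = []
    for (a1, b1) in d1:
        for (a2, b2) in d2:
            lo, hi = max(a1, a2), min(b1, b2)
            if lo <= hi:            # keep only nonempty intersections
                out.append((lo, hi))
    return out

def conv(d):
    """The convex hull of a union of intervals is a single interval."""
    return (min(a for a, _ in d), max(b for _, b in d))

D1 = [(0, 1), (4, 5)]               # x <= 1  or  x >= 4 (within [0, 5])
D2 = [(2, 3), (4, 5)]               # 2 <= x <= 3  or  4 <= x <= 5

# Relaxation in regular form: intersect the two hulls.
h1, h2 = conv(D1), conv(D2)
regular = (max(h1[0], h2[0]), min(h1[1], h2[1]))
print(regular)                          # (2, 5)

# After one basic step, the hull of the new disjunction is strictly tighter.
print(conv(basic_step(D1, D2)))         # (4, 5)
```

The quantitative question the paper addresses is how much such a step is guaranteed to improve the bound in general.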

Convex Mixed-Integer Nonlinear Programs Derived from Generalized Disjunctive Programming using Cones

Closed forms of several convex functions and their respective perspectives in conic sets are presented, allowing users to easily formulate conic GDP problems; the conic structure can then be exploited to solve these challenging MICP problems more efficiently.

P-split formulations: A class of intermediate formulations between big-M and convex hull for disjunctive constraints

We develop a class of mixed-integer formulations for disjunctive constraints, intermediate between the big-M and convex hull formulations in terms of relaxation strength. The main idea is to capture the…

Between steps: Intermediate relaxations between big-M and convex hull formulations

This work develops a class of relaxations between the big-M and convex hull formulations of disjunctions, drawing advantages from both, and proves that, under certain assumptions, the relaxations form a hierarchy starting from a big-M equivalent and converging to the convex hull.
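The two endpoints of this hierarchy can be contrasted on a toy disjunction of two boxes, (x, y) in [0,1]² or (x, y) in [2,3]². The test point below is a hypothetical choice that is feasible for the relaxed big-M formulation but cut off by the hull (disaggregated-variable) formulation; the grid search stands in for the LP one would normally solve:

```python
# A hedged sketch of the two endpoint relaxations for the disjunction
# (x, y) in [0,1]^2  or  (x, y) in [2,3]^2, over the box [0,3]^2.

M = 3.0   # a valid big-M constant given the bounds 0 <= x, y <= 3

def in_bigm(x, y, lam):
    """Relaxed big-M feasibility; lam in [0,1] selects the first box."""
    return (x <= 1 + M * (1 - lam) and y <= 1 + M * (1 - lam)
            and x >= 2 - M * lam and y >= 2 - M * lam
            and 0 <= x <= 3 and 0 <= y <= 3 and 0 <= lam <= 1)

def in_hull(x, y, steps=1000):
    """Hull (disaggregated-variable) feasibility, checked by a grid
    search over the convex multiplier lam."""
    for i in range(steps + 1):
        lam = i / steps             # weight on the box [0,1]^2
        mu = 1 - lam                # weight on the box [2,3]^2
        # x = x1 + x2 with 0 <= x1 <= lam and 2*mu <= x2 <= 3*mu
        x_ok = max(0, x - 3 * mu) <= min(lam, x - 2 * mu)
        y_ok = max(0, y - 3 * mu) <= min(lam, y - 2 * mu)
        if x_ok and y_ok:
            return True
    return False

print(in_bigm(0.5, 2.5, 0.5))       # True: feasible for the big-M relaxation
print(in_hull(0.5, 2.5))            # False: cut off by the hull relaxation
```

The intermediate relaxations studied in the paper interpolate between these two extremes.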

Partition-based formulations for mixed-integer optimization of trained ReLU neural networks

This paper introduces a class of mixed-integer formulations for trained ReLU neural networks by partitioning node inputs into a number of groups and forming the convex hull over the partitions via disjunctive programming.
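For context, a small sketch of the standard big-M MIP encoding of a single ReLU unit y = max(0, a), where a = w·x + b is assumed to lie in known bounds [L, U]; the bounds and activation values below are hypothetical. The paper's partition-based formulations refine exactly this kind of model:

```python
# Big-M MIP encoding of one ReLU unit (not the paper's partitioned
# formulation itself): with binary indicator z and pre-activation
# bounds [L, U], the constraints are
#   y >= 0,  y >= a,  y <= a - L*(1 - z),  y <= U*z

L, U = -4.0, 5.0   # assumed pre-activation bounds (hypothetical numbers)

def relu_feasible_y(a, z):
    """Interval of y values satisfying the big-M constraints for fixed z,
    or None if the constraints are infeasible."""
    lo = max(0.0, a)
    hi = min(a - L * (1 - z), U * z)
    return (lo, hi) if lo <= hi else None

# With the indicator set correctly, the only feasible y is max(0, a):
print(relu_feasible_y(3.0, 1))      # (3.0, 3.0): active unit, y = a
print(relu_feasible_y(-2.0, 0))     # (0.0, 0.0): inactive unit, y = 0
```

Partitioning the inputs of each node tightens the continuous relaxation of this encoding without going all the way to the full convex hull over every input.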

Satisfiability modulo theories for process systems engineering

Improvement of K-Means Algorithm for Accelerated Big Data Clustering

In view of several shortcomings of the traditional k-means algorithm, this paper improves and analyzes it in two respects.

References


A hierarchy of relaxations for nonlinear convex generalized disjunctive programming

Algorithmic Approach for Improved Mixed-Integer Reformulations of Convex Generalized Disjunctive Programs

This work proposes an algorithmic approach to improve mixed-integer models originally formulated as convex generalized disjunctive programs (GDPs), using a hybrid reformulation of GDP that seeks to combine the advantages of the two common GDP-to-MILP/MINLP transformations: the big-M and the hull reformulation.

Disjunctive Programming: Properties of the Convex Hull of Feasible Points

E. Balas, Discrete Applied Mathematics, 1998

Disjunctive programming and a hierarchy of relaxations for discrete optimization problems

We discuss a new conceptual framework for the convexification of discrete optimization problems, and a general technique for obtaining approximations to the convex hull of the feasible set. …

A hierarchy of relaxations for linear generalized disjunctive programming

On handling indicator constraints in mixed integer programming

It is argued that aggressive bound tightening is often overlooked in MIP, although it represents a significant building block for enhancing MIP technology when indicator constraints and disjunctive terms are present; a pair of computationally effective algorithmic approaches that exploit it are devised.
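Why bound tightening matters for indicator constraints can be seen in a toy calculation (the numbers below are assumed for illustration): for the implication "z = 1 implies x ≤ c", modeled as x ≤ c + M(1 − z), the smallest valid M is U − c, where U is the best known upper bound on x, so tightening U directly tightens the LP relaxation:

```python
# Toy illustration (assumed numbers) of bound tightening shrinking the
# big-M constant for the indicator constraint "z = 1 implies x <= c",
# modeled as  x <= c + M*(1 - z)  with the smallest valid M = U - c.

def relaxed_upper_bound(c, U, z):
    """Upper bound on x implied by the big-M row for a relaxed z in [0,1]."""
    M = U - c                    # smallest M that is still valid
    return c + M * (1 - z)

c = 1.0
print(relaxed_upper_bound(c, U=10.0, z=0.5))  # 5.5 with the loose bound x <= 10
print(relaxed_upper_bound(c, U=2.0, z=0.5))   # 1.5 after tightening to x <= 2
```

At fractional z, the tightened bound cuts off a much larger region, which is the effect the paper's algorithmic approaches exploit.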

On mathematical programming with indicator constraints

This paper significantly extends existing results that allow one to work in the original space of variables, in two different directions, for two relevant special cases in which the disjunctions corresponding to the logical implications have two terms.

Systematic modeling of discrete-continuous optimization models through generalized disjunctive programming

This work presents a modeling framework, generalized disjunctive programming (GDP), which represents problems in terms of Boolean and continuous variables, allowing the representation of constraints as algebraic equations, disjunctions and logic propositions.

A branch-and-cut method for 0-1 mixed convex programming

This work extends the disjunctive approach of Balas, Ceria, and Cornuéjols to develop a branch-and-cut method for solving 0-1 convex programming problems, and shows that cuts can be generated by solving a single convex program.