Optimizing over an ensemble of neural networks
@inproceedings{Wang2021OptimizingOA,
  title={Optimizing over an ensemble of neural networks},
  author={Keliang Wang and Leonardo Lozano and Carlos Henrique Cardonha and David Bergman},
  year={2021}
}
We study optimization problems where the objective function is modeled by feedforward neural networks with rectified linear unit (ReLU) activation. Recent literature has explored the use of a single neural network to model either uncertain or complex elements within an objective function. However, it is well known that ensembles of neural networks produce more stable predictions and generalize better than single networks, which motivates the investigation of…
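Both the single-network setting and the ensemble setting build on the standard big-M mixed-integer encoding of a trained ReLU unit. Below is a minimal sketch of that encoding in PuLP; the weights, bias, input box, and big-M bounds are all illustrative, not taken from the paper.

```python
# Minimal sketch: big-M MILP encoding of y = max(0, w.x + b) for one
# ReLU unit, the building block used to optimize over trained networks.
# All numbers (weights, bounds) are illustrative, not from the paper.
import pulp

w, b = [1.5, -2.0], 0.5   # trained weights/bias (illustrative)
L, U = -10.0, 10.0        # valid pre-activation bounds (big-M values)

prob = pulp.LpProblem("relu_unit", pulp.LpMaximize)
x = [pulp.LpVariable(f"x{i}", lowBound=-1, upBound=1) for i in range(2)]
y = pulp.LpVariable("y", lowBound=0)      # post-activation output, y >= 0
z = pulp.LpVariable("z", cat="Binary")    # z = 1 iff the unit is active

pre = pulp.lpSum(w[i] * x[i] for i in range(2)) + b
prob += y >= pre                # y dominates the pre-activation
prob += y <= pre - L * (1 - z)  # ties y to pre when z = 1
prob += y <= U * z              # forces y = 0 when z = 0
prob += y                       # objective: maximize the unit's output

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(y), [pulp.value(v) for v in x])
```

An ensemble objective would, under the natural formulation, average the output variables of several networks encoded this way; the paper's contribution concerns how to optimize such formulations efficiently.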
References
Showing 1-10 of 53 references
Output Range Analysis for Deep Feedforward Neural Networks
- NFM, 2018
Presents an efficient range estimation algorithm that iterates between an expensive global combinatorial search, formulated as a mixed-integer linear program, and a relatively inexpensive local optimization that repeatedly seeks a local optimum of the function represented by the NN.
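As a sketch of the alternation this summary describes: a cheap projected-gradient local phase improves an incumbent, and an expensive MILP phase either certifies the bound or returns a better starting point. Here milp_refute is a hypothetical oracle, since its encoding depends on the network; the scheme, not the paper's exact algorithm, is what is illustrated.

```python
import numpy as np

def local_ascent(f, grad_f, x, lo, hi, steps=200, lr=1e-2):
    # Inexpensive local phase: projected gradient ascent on the NN output.
    for _ in range(steps):
        x = np.clip(x + lr * grad_f(x), lo, hi)
    return x

def estimate_max(f, grad_f, milp_refute, x0, lo, hi, tol=1e-4):
    # Alternate cheap local search with an expensive global MILP check.
    x = local_ascent(f, grad_f, np.asarray(x0, dtype=float), lo, hi)
    best = f(x)
    while True:
        # Hypothetical MILP oracle: returns an input whose network output
        # exceeds best + tol, or None if the MILP proves none exists.
        witness = milp_refute(best + tol)
        if witness is None:
            return best          # certified estimate of the output range
        x = local_ascent(f, grad_f, witness, lo, hi)
        best = max(best, f(x))
```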
A Lagrangian propagator for artificial neural networks in constraint programming
- Constraints, 2015
Proposes a new network-level propagator based on a non-linear Lagrangian relaxation solved with a subgradient algorithm, capable of dramatically reducing the search-tree size on a thermal-aware dispatching problem on multicore CPUs.
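The network-level propagator itself is specific to the paper, but the subgradient scheme it relies on is standard. A generic sketch follows, where solve_relaxation(lmbda) is a hypothetical stand-in for the NN-specific Lagrangian subproblem, returning the relaxation's value and the violation of the relaxed constraints.

```python
import numpy as np

def subgradient_dual(solve_relaxation, n_mult, iters=100, step0=1.0):
    # Multipliers for the relaxed constraints; the dual of a maximization
    # problem yields upper bounds, so we keep the tightest (lowest) one.
    lmbda = np.zeros(n_mult)
    best = np.inf
    for k in range(1, iters + 1):
        value, violation = solve_relaxation(lmbda)  # hypothetical subproblem
        best = min(best, value)
        # Step along the subgradient (the constraint violation) with a
        # diminishing step size, projecting onto the nonnegative orthant.
        lmbda = np.maximum(0.0, lmbda + (step0 / k) * violation)
    return best, lmbda
```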
Deterministic Global Optimization with Artificial Neural Networks Embedded
- J. Optim. Theory Appl., 2019
Proposes a method for deterministic global optimization of problems with artificial neural networks embedded, based on McCormick relaxations in a reduced space that employ the convex and concave envelopes of the nonlinear activation function.
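For ReLU, the activation relevant to the surrounding paper, those envelopes take a particularly simple form: ReLU is convex, so its convex envelope on an interval is the function itself, and its concave envelope is the secant line through the endpoints. A minimal illustration with made-up bounds:

```python
def relu(x):
    return max(0.0, x)

def relu_envelopes(x, l, u):
    # Convex envelope of a convex function is the function itself.
    convex = relu(x)
    # Concave envelope: secant through (l, relu(l)) and (u, relu(u)).
    slope = (relu(u) - relu(l)) / (u - l)
    concave = relu(l) + slope * (x - l)
    return convex, concave

# The true function lies between the two envelopes on [l, u]:
lo, hi = relu_envelopes(0.3, -1.0, 2.0)
assert lo <= relu(0.3) <= hi
```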
Mixed-Integer Optimization with Constraint Learning
- ArXiv, 2021
Introduces a framework for mixed-integer optimization in which constraints learned from data by machine-learning models are embedded directly into the optimization formulation.
ReLU Networks as Surrogate Models in Mixed-Integer Linear Programs
- Comput. Chem. Eng., 2019
ENTMOOT: A Framework for Optimization over Ensemble Tree Models
- Comput. Chem. Eng., 2021
Partition-based formulations for mixed-integer optimization of trained ReLU neural networks
- NeurIPS, 2021
Introduces a class of mixed-integer formulations for trained ReLU neural networks, obtained by partitioning node inputs into groups and forming the convex hull over the partitions via disjunctive programming.
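A toy illustration of the partitioning step alone; the grouping rule (by weight magnitude) and the number of groups are assumptions for the example, and the disjunctive formulation built on top of the groups is not shown.

```python
import numpy as np

def partition_inputs(w, n_groups):
    # Group a neuron's inputs before building one disjunction per group.
    order = np.argsort(np.abs(w))           # rank inputs by |weight|
    return np.array_split(order, n_groups)  # contiguous, near-equal groups

w = np.array([0.1, -2.0, 0.7, 1.5, -0.3, 0.05])
for g, idx in enumerate(partition_inputs(w, 3)):
    print(f"group {g}: inputs {idx.tolist()}, weights {w[idx].tolist()}")
```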
Research and development of neural network ensembles: a survey
- Artificial Intelligence Review, 2016
Summarizes approaches to the development of neural network ensembles (NNE) and the latest studies on them, with detailed descriptions of methods for generating individual networks, for combining their conclusions, and for fusing NNE with granular computing.
Strong mixed-integer programming formulations for trained neural networks
- IPCO, 2019
Presents a generic framework for constructing sharp or ideal formulations for the maximum of d affine functions over arbitrary polyhedral input domains, and corroborates it computationally, showing that these formulations offer substantial improvements in solve time on verification tasks for image classification networks.
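For contrast with the sharp and ideal formulations the paper constructs, here is the standard big-M baseline for y = max_i (a_i x + b_i) in PuLP; the data and the value of M are illustrative, and the paper's point is that tighter relaxations of exactly this structure pay off at solve time.

```python
# Baseline big-M model of y = max_i (a_i x + b_i). Minimizing y over the
# feasible region computes the min-max point of the three affine pieces.
import pulp

a, b = [1.0, -1.0, 0.2], [0.0, 1.0, 0.5]  # three affine pieces (illustrative)
M = 10.0                                   # crude big-M (illustrative)

prob = pulp.LpProblem("max_affine", pulp.LpMinimize)
x = pulp.LpVariable("x", lowBound=-2, upBound=2)
y = pulp.LpVariable("y")
z = [pulp.LpVariable(f"z{i}", cat="Binary") for i in range(3)]

prob += pulp.lpSum(z) == 1                 # exactly one active piece
for i in range(3):
    prob += y >= a[i] * x + b[i]           # y dominates every piece
    prob += y <= a[i] * x + b[i] + M * (1 - z[i])  # tight on active piece
prob += y                                  # objective: minimize the max

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(x), pulp.value(y))
```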