# OptNet: Differentiable Optimization as a Layer in Neural Networks

    @inproceedings{Amos2017OptNetDO,
      title     = {OptNet: Differentiable Optimization as a Layer in Neural Networks},
      author    = {Brandon Amos and J. Z. Kolter},
      booktitle = {ICML},
      year      = {2017}
    }

This paper presents OptNet, a network architecture that integrates optimization problems (here, specifically in the form of quadratic programs) as individual layers in larger end-to-end trainable deep networks. These layers encode constraints and complex dependencies between the hidden states that traditional convolutional and fully-connected layers often cannot capture. In this paper, we explore the foundations for such an architecture: we show how techniques from sensitivity analysis, bilevel…
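To make the idea concrete, here is a minimal NumPy sketch (an illustration, not the paper's actual implementation) for the simplest case of an equality-constrained QP, min_z ½ zᵀQz + qᵀz subject to Az = b. The forward pass solves the KKT linear system; the backward pass implicitly differentiates that same system to obtain an exact gradient, shown here with respect to the linear term q. The function names `qp_layer_forward` and `qp_layer_backward_dq` are hypothetical.

```python
import numpy as np

def qp_layer_forward(Q, q, A, b):
    """Solve min_z 1/2 z'Qz + q'z s.t. Az = b via its KKT system.

    Returns the optimal z and the KKT matrix (saved for the backward pass).
    """
    n, m = Q.shape[0], A.shape[0]
    K = np.block([[Q, A.T],
                  [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-q, b]))
    return sol[:n], K

def qp_layer_backward_dq(K, n, dloss_dz):
    """Gradient of a scalar loss w.r.t. q by implicit differentiation.

    From K [z; nu] = [-q; b] it follows that
    dloss/dq = -(K^{-T} [dloss/dz; 0])[:n].
    """
    rhs = np.concatenate([dloss_dz, np.zeros(K.shape[0] - n)])
    return -np.linalg.solve(K.T, rhs)[:n]

# Toy problem: two variables, one equality constraint z0 + z1 = 1.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
q = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

z, K = qp_layer_forward(Q, q, A, b)          # z = [1/3, 2/3]
g = np.array([1.0, 2.0])                     # dloss/dz for loss = z0 + 2*z1
grad = qp_layer_backward_dq(K, 2, g)

# Finite-difference check: the implicit gradient should match.
eps = 1e-6
fd = np.array([
    (qp_layer_forward(Q, q + eps * e, A, b)[0] @ g
     - qp_layer_forward(Q, q - eps * e, A, b)[0] @ g) / (2 * eps)
    for e in np.eye(2)
])
print(np.allclose(grad, fd, atol=1e-5))  # prints True
```

Inequality constraints (the general OptNet case) require differentiating the full KKT conditions at the active set, but the mechanism is the same: one linear solve against the (transposed) KKT matrix per backward pass.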

#### Supplemental Code

A GitHub repository is available via Papers with Code.


#### 199 Citations

- Physarum Powered Differentiable Linear Programming Layers and Applications (Mathematics, Computer Science, 2020)
- Homogeneous Linear Inequality Constraints for Neural Network Activations (Mathematics, Computer Science, 2020)

#### References

##### Publications referenced by this paper

Showing 1-10 of 42 references:

- On Differentiating Parameterized Argmin and Argmax Problems with Application to Bi-level Optimization (Computer Science, Mathematics, 2016)
- On solving constrained optimization problems with neural networks: a penalty method approach (Computer Science, Medicine, 1993)
- A Bilevel Optimization Approach for Parameter Learning in Variational Models (Mathematics, Computer Science, 2013)