An efficient nonconvex reformulation of stagewise convex optimization problems
@article{Bunel2020AnEN,
  title={An efficient nonconvex reformulation of stagewise convex optimization problems},
  author={Rudy Bunel and Oliver Hinder and Srinadh Bhojanapalli and Krishnamurthy Dvijotham},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.14322}
}
Convex optimization problems with staged structure appear in several contexts, including optimal control, verification of deep neural networks, and isotonic regression. Off-the-shelf solvers can solve these problems but may scale poorly. We develop a nonconvex reformulation designed to exploit this staged structure. Our reformulation has only simple bound constraints, enabling solution via projected gradient methods and their accelerated variants. The method automatically generates a sequence…
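The abstract's claim that bound-constrained problems admit projected gradient methods is easy to make concrete. Below is a minimal numpy sketch of projected gradient descent over a box, applied to a generic convex quadratic; the objective, step size, and iteration count are illustrative placeholders, not the paper's actual staged reformulation.

```python
import numpy as np

# Minimal sketch of projected gradient descent under box (bound) constraints,
# the kind of solver the abstract says the reformulation enables. The
# quadratic objective below is a placeholder, not the staged problem itself.

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi] is a simple clip."""
    return np.clip(x, lo, hi)

def projected_gradient(grad_f, x0, lo, hi, step=1e-2, iters=500):
    x = project_box(x0, lo, hi)
    for _ in range(iters):
        x = project_box(x - step * grad_f(x), lo, hi)  # gradient step + project
    return x

# Example: minimize 0.5*||Ax - b||^2 over the box [0, 1]^n.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 5))
b = rng.standard_normal(10)
grad = lambda x: A.T @ (A @ x - b)  # gradient of 0.5*||Ax - b||^2
x_star = projected_gradient(grad, np.zeros(5), 0.0, 1.0)
```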
7 Citations
PRIMA: general and precise neural network certification via scalable convex hull approximations
- Computer Science, Proc. ACM Program. Lang.
- 2022
The results show that PRIMA is significantly more precise than the state-of-the-art, verifying robustness to input perturbations for up to 20%, 30%, and 34% more images than existing work on ReLU-, Sigmoid-, and Tanh-based networks, respectively.
PRIMA: Precise and General Neural Network Certification via Multi-Neuron Convex Relaxations
- Computer Science
- 2021
The results show that PRIMA is significantly more precise than the state-of-the-art, verifying robustness for up to 14%, 30%, and 34% more images than existing work on ReLU-, Sigmoid-, and Tanh-based networks, respectively.
Verifying Probabilistic Specifications with Functional Lagrangians
- Computer Science, ArXiv
- 2021
This work derives theoretical properties of the framework, which can handle arbitrary probabilistic specifications, and demonstrates empirically that it handles a diverse set of networks, including Bayesian neural networks with Gaussian posterior approximations and MC-dropout networks, and can verify specifications on adversarial robustness and out-of-distribution (OOD) detection.
Precise Multi-Neuron Abstractions for Neural Network Certification
- Computer Science, ArXiv
- 2021
The results show that PRIMA is significantly more precise than the state-of-the-art, verifying robustness for up to 16%, 30%, and 34% more images than prior work on ReLU-, Sigmoid-, and Tanh-based networks, respectively.
Make Sure You're Unsure: A Framework for Verifying Probabilistic Specifications
- Computer Science, NeurIPS
- 2021
This work introduces a general formulation of probabilistic specifications for neural networks and shows that an optimal choice of functional multipliers leads to exact verification (i.e., sound and complete verification); for specific forms of multipliers, it develops tractable practical verification algorithms.
Formal verification of neural networks for safety-critical tasks in deep reinforcement learning
- Computer Science, UAI
- 2021
A novel metric for evaluating models in safety-critical tasks, the violation rate, is introduced, providing a new formulation of safety properties that aims to ensure the agent always makes rational decisions.
The Convex Relaxation Barrier, Revisited: Tightened Single-Neuron Relaxations for Neural Network Verification
- Computer Science, NeurIPS
- 2020
This work improves the effectiveness of propagation- and linear-optimization-based neural network verification algorithms with a new tightened convex relaxation for ReLU neurons that considers the multivariate input space of the affine pre-activation function preceding the ReLU.
References
Showing 1-10 of 46 references
The non-convex Burer-Monteiro approach works on smooth semidefinite programs
- Computer Science, NIPS
- 2016
It is shown that the low-rank Burer-Monteiro formulation of SDPs in that class almost never has any spurious local optima; the class covers applications such as max-cut, community detection in the stochastic block model, robust PCA, phase retrieval, and synchronization of rotations.
Accelerated Proximal Gradient Methods for Nonconvex Programming
- Computer Science, Mathematics, NIPS
- 2015
This paper is the first to provide APG-type algorithms for general nonconvex and nonsmooth problems ensuring that every accumulation point is a critical point, and the convergence rates remain O(1/k^2) when the problems are convex.
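For readers unfamiliar with APG-type methods, the following is a minimal FISTA-style sketch of the classical convex template, shown on LASSO; the monitoring machinery the cited paper adds to handle nonconvex problems is omitted, and the problem instance is an illustrative assumption.

```python
import numpy as np

# Minimal accelerated proximal gradient (FISTA-style) sketch for a composite
# problem min_x f(x) + g(x), with smooth f and prox-friendly g. This is the
# classical convex template; the cited paper augments schemes like this so
# that every accumulation point is a critical point on nonconvex problems.

def apg(grad_f, prox_g, x0, step, iters=300):
    x_prev = x0.copy()
    y = x0.copy()
    t_prev = 1.0
    for _ in range(iters):
        x = prox_g(y - step * grad_f(y), step)       # proximal gradient step
        t = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_prev**2))
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)  # Nesterov extrapolation
        x_prev, t_prev = x, t
    return x_prev

# Example: LASSO, f = 0.5*||Ax - b||^2, g = lam*||x||_1 (soft-thresholding prox).
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 8)), rng.standard_normal(20), 0.1
soft = lambda z, tau: np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)
x_hat = apg(lambda x: A.T @ (A @ x - b),
            lambda v, s: soft(v, s * lam),
            np.zeros(8), step=1.0 / np.linalg.norm(A, 2)**2)
```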
Conic Optimization via Operator Splitting and Homogeneous Self-Dual Embedding
- Computer Science, Mathematics, J. Optim. Theory Appl.
- 2016
We introduce a first-order method for solving very large convex cone programs. The method uses an operator splitting method, the alternating direction method of multipliers, to solve the homogeneous…
Local Minima and Convergence in Low-Rank Semidefinite Programming
- Computer Science, Mathematics, Math. Program.
- 2005
The local minima of LRSDP_r are classified and the optimal convergence of a slight variant of the successful, yet experimental, algorithm of Burer and Monteiro is proved, which handles LRSDP_r via the nonconvex change of variables X = RR^T.
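The change of variables X = RR^T is the essence of the low-rank approach: it eliminates the semidefinite cone at the price of nonconvexity. A minimal sketch on the max-cut SDP, where diag(X) = 1 becomes unit-norm rows of R, handled below by projection; the rank, step size, and graph are illustrative choices, not the algorithmic details of Burer and Monteiro.

```python
import numpy as np

# Sketch of the Burer-Monteiro change of variables X = R @ R.T for the
# max-cut SDP: maximize <L, X> subject to diag(X) = 1, X psd. Factorizing X
# removes the psd constraint; diag(X) = 1 becomes unit-norm rows of R,
# enforced here by projection (row normalization). The graph, rank k, step
# size, and iteration count are illustrative placeholders.

def burer_monteiro_maxcut(L, k=3, step=1e-2, iters=2000, seed=0):
    n = L.shape[0]
    R = np.random.default_rng(seed).standard_normal((n, k))
    R /= np.linalg.norm(R, axis=1, keepdims=True)      # feasible start
    for _ in range(iters):
        R += step * 2.0 * (L @ R)                      # ascent on <L, R R^T>
        R /= np.linalg.norm(R, axis=1, keepdims=True)  # project diag(RR^T) = 1
    return R @ R.T                                     # recovered psd matrix X

# Example: Laplacian of a 4-cycle graph.
Adj = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], dtype=float)
L = np.diag(Adj.sum(1)) - Adj
X = burer_monteiro_maxcut(L)
```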
Convex Optimization: Algorithms and Complexity
- Computer Science, Found. Trends Mach. Learn.
- 2015
This monograph presents the main complexity theorems in convex optimization and their corresponding algorithms and provides a gentle introduction to structural optimization with FISTA, saddle-point mirror prox, Nemirovski's alternative to Nesterov's smoothing, and a concise description of interior point methods.
Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- Computer Science, Mathematics, Math. Program.
- 2021
It is proved that for strongly convex problems, O(1/t^2) is the best possible convergence rate, while it is known that gradient methods can have linear convergence on unconstrained problems.
A Convex Relaxation Barrier to Tight Robustness Verification of Neural Networks
- Computer Science, NeurIPS
- 2019
This paper unifies all existing LP-relaxed verifiers, to the best of the authors' knowledge, under a general convex relaxation framework, which works for neural networks with diverse architectures and nonlinearities and covers both primal and dual views of robustness verification.
Strong mixed-integer programming formulations for trained neural networks
- Computer Science, IPCO
- 2019
A generic framework is presented that provides a way to construct sharp or ideal formulations for the maximum of d affine functions over arbitrary polyhedral input domains; this is corroborated computationally, showing that these formulations offer substantial improvements in solve time on verification tasks for image classification networks.
A First-Order Primal-Dual Algorithm for Convex Problems with Applications to Imaging
- Mathematics, Computer Science, Journal of Mathematical Imaging and Vision
- 2010
A first-order primal-dual algorithm for non-smooth convex optimization problems with known saddle-point structure can achieve O(1/N^2) convergence on problems where the primal or the dual objective is uniformly convex, and it can show linear convergence, i.e. O(ω^N) for some ω ∈ (0,1), on smooth problems.
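A minimal sketch of the primal-dual iteration for min_x G(x) + F(Kx) may help make the saddle-point structure concrete; the instance below (G a squared norm, F a shifted l1 norm) is an illustrative assumption, not one of the paper's imaging problems.

```python
import numpy as np

# Sketch of the Chambolle-Pock first-order primal-dual iteration for
# min_x G(x) + F(K x). Illustrative instance: G(x) = (lam/2)*||x||^2 and
# F(z) = ||z - b||_1, whose conjugate F*(y) = <b, y> + indicator(|y| <= 1)
# has a closed-form prox (shift then clip).

def chambolle_pock(K, b, lam=0.1, iters=500):
    n, m = K.shape[1], K.shape[0]
    Lnorm = np.linalg.norm(K, 2)       # operator norm of K
    tau = sigma = 0.9 / Lnorm          # step sizes: tau * sigma * ||K||^2 < 1
    x = np.zeros(n); x_bar = x.copy(); y = np.zeros(m)
    for _ in range(iters):
        # dual step: prox of sigma * F* is a shift by sigma*b, then clip to [-1, 1]
        y = np.clip(y + sigma * (K @ x_bar) - sigma * b, -1.0, 1.0)
        # primal step: prox of tau * G for G = (lam/2)*||x||^2 is a rescaling
        x_new = (x - tau * (K.T @ y)) / (1.0 + tau * lam)
        x_bar = 2.0 * x_new - x        # extrapolation with theta = 1
        x = x_new
    return x

# Example usage on random problem data.
rng = np.random.default_rng(0)
K, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
x_hat = chambolle_pock(K, b)
```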
The conjugate gradient method for optimal control problems
- Mathematics
- 1967
This paper extends the conjugate gradient minimization method of Fletcher and Reeves to optimal control problems. The technique is directly applicable only to unconstrained problems; if terminal…
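For reference, a minimal sketch of the Fletcher-Reeves conjugate gradient update on a generic smooth unconstrained problem, the setting the snippet notes the technique directly applies to; the backtracking line search and quadratic test problem are illustrative simplifications, not the optimal-control extension.

```python
import numpy as np

# Sketch of the Fletcher-Reeves nonlinear conjugate gradient method on a
# smooth unconstrained problem. The Armijo backtracking line search and the
# quadratic example are illustrative placeholders.

def fletcher_reeves(f, grad_f, x0, iters=100):
    x = x0.copy()
    g = grad_f(x)
    d = -g                                       # start with steepest descent
    for _ in range(iters):
        if g @ d >= 0:                           # safeguard: restart when d is
            d = -g                               # not a descent direction
        t = 1.0                                  # Armijo backtracking search
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
        g_new = grad_f(x)
        if g_new @ g_new < 1e-12:                # gradient ~ 0: converged
            break
        beta = (g_new @ g_new) / (g @ g)         # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x

# Example: f(x) = 0.5 * x^T A x - b^T x with A positive definite.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = fletcher_reeves(lambda x: 0.5 * x @ A @ x - b @ x,
                        lambda x: A @ x - b, np.zeros(2))
```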