Formal Verification of Piece-Wise Linear Feed-Forward Neural Networks

@inproceedings{Ehlers2017FormalVO,
  title={Formal Verification of Piece-Wise Linear Feed-Forward Neural Networks},
  author={R{\"u}diger Ehlers},
  booktitle={ATVA},
  year={2017}
}
We present an approach for the verification of feed-forward neural networks in which all nodes have a piece-wise linear activation function. [...] We present a specialized verification algorithm that employs this approximation in a search process in which it infers additional node phases for the non-linear nodes in the network from partial node phase assignments, similar to unit propagation in classical SAT solving.
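The phase-inference idea in the abstract can be illustrated with a minimal sketch (a toy illustration, not the paper's actual algorithm): propagate interval bounds through an affine layer, then fix the phase of every ReLU whose pre-activation bounds do not straddle zero, the analogue of a unit-propagation step.

```python
# Toy illustration of phase inference (not the paper's actual algorithm):
# interval bounds through one affine layer, then fix the phase of every
# ReLU whose pre-activation bounds do not straddle zero.

def affine_interval(W, b, lo, hi):
    """Interval bounds of W x + b for x in the box [lo, hi] (plain lists)."""
    out_lo, out_hi = [], []
    for row, bias in zip(W, b):
        lo_acc = hi_acc = bias
        for w, l, h in zip(row, lo, hi):
            lo_acc += w * (l if w >= 0 else h)
            hi_acc += w * (h if w >= 0 else l)
        out_lo.append(lo_acc)
        out_hi.append(hi_acc)
    return out_lo, out_hi

def infer_phases(pre_lo, pre_hi):
    """'active' if provably >= 0, 'inactive' if provably <= 0, otherwise
    'unknown' -- the nodes a complete search would still have to decide."""
    return ["active" if l >= 0 else "inactive" if h <= 0 else "unknown"
            for l, h in zip(pre_lo, pre_hi)]

# Toy layer: 2 inputs in [0, 1] x [-1, 1] feeding 3 ReLU nodes.
W = [[1.0, 0.0], [-2.0, -1.0], [1.0, 1.0]]
b = [0.5, -1.0, 0.0]
lo, hi = affine_interval(W, b, [0.0, -1.0], [1.0, 1.0])
print(infer_phases(lo, hi))   # ['active', 'inactive', 'unknown']
```

Nodes classified as `unknown` are exactly those that a complete search procedure still has to branch on.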
Advances in verification of ReLU neural networks
TLDR
A solver for the verification of ReLU neural networks that combines mixed integer programming with specialized branching and approximation techniques is implemented, and is able to solve the verification problem for instances that do not have independent bounds for each input neuron.
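The mixed integer programming formulation mentioned here typically rests on the standard big-M encoding of a ReLU. The small self-contained check below (an illustration of the encoding, not the solver from the paper) confirms that the four constraints, together with a binary phase variable z, admit exactly y = max(0, x):

```python
# The standard big-M encoding of y = ReLU(x) with x in [l, u], l < 0 < u, and
# a binary phase variable z (an illustration of the encoding, not the solver
# from the paper):
#     y >= 0,   y >= x,   y <= x - l*(1 - z),   y <= u*z.

def feasible_y_interval(x, z, l, u):
    """Feasible range for y given a fixed x and phase bit z, or None."""
    y_lo = max(0.0, x)
    y_hi = min(x - l * (1 - z), u * z)
    return (y_lo, y_hi) if y_lo <= y_hi + 1e-9 else None

l, u = -2.0, 3.0
for x in [-2.0, -0.7, 0.0, 1.4, 3.0]:
    sols = [feasible_y_interval(x, z, l, u) for z in (0, 1)]
    sols = [s for s in sols if s is not None]
    # Across both phase choices, the only feasible y is exactly ReLU(x):
    assert all(abs(s[0] - max(0.0, x)) < 1e-9 and abs(s[1] - max(0.0, x)) < 1e-9
               for s in sols)
print("big-M encoding is exact on the sampled points")
```

Relaxing z to the interval [0, 1] yields the LP relaxation whose looseness the branching techniques are designed to fight.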
Branch and Bound for Piecewise Linear Neural Network Verification
TLDR
A family of algorithms based on Branch-and-Bound (BaB), which identifies new methods that combine the strengths of multiple existing approaches, accomplishing significant performance improvements over previous state of the art and introduces an effective branching strategy on ReLU non-linearities.
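Branch and bound itself is easy to sketch. The toy below (a generic input-splitting BaB on a one-dimensional function, not the specific algorithms or ReLU branching strategies from the paper) certifies a lower bound by recursively splitting the domain whenever the interval bound is too loose to decide:

```python
# A minimal branch-and-bound certifier (a generic input-splitting sketch, not
# the ReLU-branching algorithms of the paper): prove f(x) >= c on an interval
# by splitting whenever the termwise interval bound cannot decide.

def interval_f(xl, xh):
    """Termwise interval bounds of f(x) = x*x - x + 1 on [xl, xh].
    Bounding x*x and -x separately makes the bound loose on purpose,
    so that branching is actually exercised."""
    sq_lo = 0.0 if xl <= 0.0 <= xh else min(xl * xl, xh * xh)
    sq_hi = max(xl * xl, xh * xh)
    return sq_lo - xh + 1.0, sq_hi - xl + 1.0

def bab_verify(xl, xh, c, depth=0, max_depth=30):
    f_lo, f_hi = interval_f(xl, xh)
    if f_lo >= c:
        return True            # this subdomain is certified
    if f_hi < c or depth >= max_depth:
        return False           # genuine violation, or branching budget spent
    mid = (xl + xh) / 2        # branch: split the domain and recurse
    return (bab_verify(xl, mid, c, depth + 1, max_depth)
            and bab_verify(mid, xh, c, depth + 1, max_depth))

print(bab_verify(0.0, 1.0, 0.5))   # True: x*x - x + 1 >= 0.5 on [0, 1]
print(bab_verify(0.0, 1.0, 0.8))   # False: the true minimum is 0.75
```

In neural network verification the same loop branches on ReLU phases rather than input intervals, and the bounding step is an LP or a linear relaxation rather than interval arithmetic; the choice of what to branch on is exactly the strategy question the paper studies.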
Neural Network Branch-and-Bound for Neural Network Verification
TLDR
This work proposes a novel machine learning framework that can be used for designing an effective branching strategy as well as for computing better lower bounds, and learns two graph neural networks that both directly treat the network they want to verify as a graph input and perform forward-backward passes through the GNN layers.
Neural Network Branching for Neural Network Verification
TLDR
This work proposes a novel framework for designing an effective branching strategy for BaB, and learns a graph neural network (GNN) to imitate the strong branching heuristic behaviour.
Verification of Binarized Neural Networks
TLDR
The problem of formal verification of Binarized Neural Networks, which have recently been proposed as a power-efficient alternative to more traditional learning networks, is studied and it is proved that the problem of optimal factoring is NP-hard.
Reluplex: a calculus for reasoning about deep neural networks
TLDR
A novel, scalable, and efficient technique based on the simplex method is presented, extended to handle the non-convex Rectified Linear Unit (ReLU) activation function, a crucial ingredient in many modern neural networks.
Fast and Complete: Enabling Complete Neural Network Verification with Rapid and Massively Parallel Incomplete Verifiers
TLDR
This paper proposes to use the backward mode linear relaxation based perturbation analysis (LiRPA) to replace LP during the BaB process, which can be efficiently implemented on the typical machine learning accelerators such as GPUs and TPUs and demonstrates an order of magnitude speedup compared to existing LP-based approaches.
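The linear-relaxation idea behind such bound propagation can be seen in miniature on a one-hidden-layer ReLU network. The following is a hand-rolled backward-substitution sketch in the CROWN style (illustrative only; it is not the LiRPA tooling referenced here, and real implementations are batched for GPUs/TPUs):

```python
# A stripped-down backward linear relaxation (CROWN-style) for a one-hidden-
# layer ReLU network f(x) = w2 . relu(W1 x + b1) + b2 over an input box.
# Hand-rolled illustration of the relaxation idea only -- not the LiRPA
# tooling or GPU/TPU implementation referenced above.

def interval_affine(W, b, lo, hi):
    """Interval bounds of each output of W x + b for x in the box [lo, hi]."""
    out = []
    for row, bias in zip(W, b):
        l = bias + sum(w * (lo[i] if w >= 0 else hi[i]) for i, w in enumerate(row))
        u = bias + sum(w * (hi[i] if w >= 0 else lo[i]) for i, w in enumerate(row))
        out.append((l, u))
    return out

def crown_lower_bound(W1, b1, w2, b2, x_lo, x_hi):
    pre = interval_affine(W1, b1, x_lo, x_hi)
    # Per-neuron linear relaxation:  a_lo*z <= relu(z) <= a_up*z + c_up.
    relax = []
    for l, u in pre:
        if l >= 0:
            relax.append((1.0, 1.0, 0.0))          # provably active
        elif u <= 0:
            relax.append((0.0, 0.0, 0.0))          # provably inactive
        else:
            s = u / (u - l)
            a_lo = 1.0 if u >= -l else 0.0         # adaptive lower slope
            relax.append((a_lo, s, -s * l))
    # Backward pass: pick the lower relaxation where w2 >= 0, else the upper.
    d, const = [], b2
    for w, (a_lo, a_up, c_up) in zip(w2, relax):
        if w >= 0:
            d.append(w * a_lo)
        else:
            d.append(w * a_up)
            const += w * c_up
    # Substitute z = W1 x + b1, then concretize over the input box.
    e = [sum(d[j] * W1[j][i] for j in range(len(d))) for i in range(len(x_lo))]
    const += sum(dj * bj for dj, bj in zip(d, b1))
    return const + sum(ei * (x_lo[i] if ei >= 0 else x_hi[i]) for i, ei in enumerate(e))

# Toy network and a soundness spot-check against exact evaluation on a grid.
W1, b1 = [[1.0, -1.0], [0.5, 1.0]], [0.2, -0.3]
w2, b2 = [1.0, -2.0], 0.1
x_lo, x_hi = [-1.0, -1.0], [1.0, 1.0]
lb = crown_lower_bound(W1, b1, w2, b2, x_lo, x_hi)

def f(x):
    z = [bi + sum(w * xi for w, xi in zip(row, x)) for row, bi in zip(W1, b1)]
    return b2 + sum(w * max(0.0, zi) for w, zi in zip(w2, z))

pts = [(-1 + i * 0.25, -1 + j * 0.25) for i in range(9) for j in range(9)]
print(all(f(p) >= lb for p in pts))   # True: the bound never exceeds f
```

Because every step is a matrix operation over a batch of subproblems, this style of bounding parallelizes on accelerators far better than solving one LP per subproblem, which is the speedup the TLDR describes.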
Towards Scalable Complete Verification of Relu Neural Networks via Dependency-based Branching
TLDR
An efficient method that implements branching on the ReLU states on the basis of a notion of dependency between the nodes results in dividing the original verification problem into a set of sub-problems whose MILP formulations require fewer integrality constraints.
Pruning and Slicing Neural Networks using Formal Verification
  • O. Lahav, Guy Katz
  • 2021 Formal Methods in Computer Aided Design (FMCAD)
TLDR
This work presents a framework and a methodology for discovering redundancies in DNNs, i.e., for finding neurons that are not needed and can be removed in order to reduce the size of the DNN.
Enhancing Robustness Verification for Deep Neural Networks via Symbolic Propagation
TLDR
This work focuses on a variety of local robustness properties and a (δ, ε)-global robustness property of DNNs, and investigates novel strategies to combine the constraint-solving and abstraction-based approaches to work with these properties.
...

References

Showing 1-10 of 34 references
Reluplex: An Efficient SMT Solver for Verifying Deep Neural Networks
TLDR
Results show that the novel, scalable, and efficient technique presented can successfully prove properties of networks that are an order of magnitude larger than the largest networks verified using existing methods.
An Abstraction-Refinement Approach to Verification of Artificial Neural Networks
TLDR
A solution to verify the safety of artificial neural networks using abstractions to Boolean combinations of linear arithmetic constraints, with a proof that whenever the abstract MLP is declared to be safe, the same holds for the concrete one.
Challenging SMT solvers to verify neural networks
TLDR
An extensive evaluation of the current state-of-the-art SMT solvers is presented and their potential in the promising domain of MLP verification is assessed.
Towards Verification of Artificial Neural Networks
TLDR
This paper takes a typical control problem, namely the Cart Pole System, together with a model of its physical environment, and uses bounded model checking (BMC) to study safety verification of this system. It shows that extending the solver with special deduction routines can significantly reduce both memory consumption and computation time on such instances.
Safety Verification of Deep Neural Networks
TLDR
A novel automated verification framework for feed-forward multi-layer neural networks based on Satisfiability Modulo Theory (SMT) is developed, which defines safety for an individual decision in terms of invariance of the classification within a small neighbourhood of the original image.
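The invariance-based safety notion is easy to state operationally. The sketch below uses a toy linear classifier and a sampled grid of the L-infinity neighbourhood; unlike the exhaustive SMT-based search of the paper, it is only a falsification check, since sampling can miss violations:

```python
# A concrete reading of the safety notion above: the classification is
# invariant within a small L_inf neighbourhood. This sketch only samples a
# grid (a falsification check that can miss violations), unlike the
# exhaustive SMT-based search of the paper; the classifier is a toy.

def classify(x):
    """Toy linear 'network' over 2 features with 2 classes."""
    scores = [0.8 * x[0] - 0.2 * x[1], -0.5 * x[0] + 0.9 * x[1]]
    return max(range(2), key=lambda c: scores[c])

def locally_invariant(x, radius, steps=5):
    """True if classify() is constant on a sampled grid of the ball."""
    ref = classify(x)
    deltas = [-radius + 2 * radius * k / (steps - 1) for k in range(steps)]
    return all(classify([x[0] + dx, x[1] + dy]) == ref
               for dx in deltas for dy in deltas)

print(locally_invariant([1.0, 0.0], 0.1))   # True: far from the boundary
print(locally_invariant([1.0, 1.0], 0.2))   # False: a flip is found in the ball
```

A verifier in the paper's sense must prove invariance for every point of the neighbourhood, not just sampled ones, which is what makes the problem an SMT query rather than a test loop.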
Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)
TLDR
The "exponential linear unit" (ELU) which speeds up learning in deep neural networks and leads to higher classification accuracies and significantly better generalization performance than ReLUs and LReLUs on networks with more than 5 layers.
Accurate ICP-based floating-point reasoning
TLDR
This paper proposes techniques to mitigate aliasing during interval reasoning by including bitwise integer operations and cast operations between integer and floating-point arithmetic, and it outperforms solvers relying on bit-blasting.
An Extensible SAT-solver
TLDR
This article presents a small, complete, and efficient SAT-solver in the style of conflict-driven learning, as exemplified by Chaff, and includes among other things a mechanism for adding arbitrary boolean constraints.
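Unit propagation, the core mechanism of such conflict-driven solvers (and the analogy invoked in the abstract above), fits in a few lines. This toy version scans all clauses rather than using watched literals, and performs no learning:

```python
# Unit propagation in a few lines (a toy in the spirit of such solvers, with
# no watched literals and no conflict-driven learning). Literals are nonzero
# ints; -v is the negation of variable v.

def unit_propagate(clauses, assignment):
    assignment = dict(assignment)              # var -> bool
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            unassigned, satisfied = [], False
            for lit in clause:
                v = abs(lit)
                if v in assignment:
                    if assignment[v] == (lit > 0):
                        satisfied = True
                        break
                else:
                    unassigned.append(lit)
            if satisfied:
                continue
            if not unassigned:
                return "conflict", assignment  # every literal is false
            if len(unassigned) == 1:           # unit clause: forced literal
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
    return "ok", assignment

# Deciding x1 = True forces x3 (from -x1 v x3) and then x4 (from -x3 v x4).
cnf = [[1, 2], [-1, 3], [-3, 4]]
print(unit_propagate(cnf, {1: True}))   # ('ok', {1: True, 3: True, 4: True})
```

Real solvers make this loop fast with two-watched-literal schemes so that only clauses that might become unit are revisited.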
A Fast Linear-Arithmetic Solver for DPLL(T)
TLDR
A new Simplex-based linear arithmetic solver that can be integrated efficiently in the DPLL(T) framework by enabling fast backtracking, supporting a priori simplification to reduce the problem size, and providing an efficient form of theory propagation.
Locating Minimal Infeasible Constraint Sets in Linear Programs
TLDR
A formulation aid which analyzes infeasible LPs and identifies minimal sets of inconsistent constraints from among the perhaps very large set of constraints defining the problem, speeding the repair of the model.
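The classic deletion filter for locating a minimal infeasible subset can be demonstrated on a toy stand-in for an LP: interval constraints on a single variable, infeasible exactly when their intersection is empty. The paper's setting, general linear programs, only changes the feasibility oracle:

```python
# The classic "deletion filter" for a minimal infeasible constraint set: drop
# each constraint in turn and keep the drop whenever infeasibility survives.
# As a toy stand-in for an LP, constraints are intervals on one variable x,
# infeasible exactly when their intersection is empty.

def feasible(constraints):
    lo = max((l for l, _ in constraints), default=float("-inf"))
    hi = min((h for _, h in constraints), default=float("inf"))
    return lo <= hi

def deletion_filter(constraints):
    core = list(constraints)
    i = 0
    while i < len(core):
        trial = core[:i] + core[i + 1:]
        if not feasible(trial):
            core = trial       # constraint i is irrelevant to infeasibility
        else:
            i += 1             # constraint i is essential; keep it
    return core

cons = [(0, 10), (5, 8), (9, 12)]           # (5, 8) and (9, 12) clash
print(deletion_filter(cons))                # [(5, 8), (9, 12)]
```

The filter needs one feasibility check per constraint, and every constraint in the returned core is essential: removing any one of them restores feasibility.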
...