Towards Scalable Complete Verification of ReLU Neural Networks via Dependency-based Branching
Author
Panagiotis Kouvaros, Alessio Lomuscio
Abstract
We introduce an efficient method for the complete verification of ReLU-based feed-forward neural networks. The method implements branching on the ReLU states on the basis of a notion of dependency between the nodes. This divides the original verification problem into a set of sub-problems whose mixed-integer linear programming (MILP) formulations require fewer integrality constraints. We evaluate the method on all of the ReLU-based fully connected networks from the first competition for neural network verification (VNN-COMP). The experimental results show 145% performance gains over the present state-of-the-art in complete verification.
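As a brief, hedged illustration of the integrality constraints the abstract refers to (the notation below is ours, not taken from the paper): MILP-based complete verifiers of this kind typically encode each ReLU node $y = \max(\hat{x}, 0)$, with known pre-activation bounds $l \le \hat{x} \le u$ and $l < 0 < u$, using one binary variable $\delta$ in a big-M formulation:

\[
y \ge \hat{x}, \qquad y \ge 0, \qquad y \le \hat{x} - l\,(1 - \delta), \qquad y \le u\,\delta, \qquad \delta \in \{0, 1\}.
\]

Fixing a node's ReLU state during branching makes its constraints linear: $\delta = 1$ forces $y = \hat{x}$ (active) and $\delta = 0$ forces $y = 0$ (inactive). On our reading of the abstract, the dependency-based branching exploits the fact that one such decision can also determine the states of further, dependent nodes, which is why each sub-problem's MILP formulation requires fewer integrality constraints.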
Year of Conference
2021
Conference Name
International Joint Conference on Artificial Intelligence
Edition
30th
Publisher
International Joint Conferences on Artificial Intelligence Organization