License: Creative Commons Attribution 4.0 International license (CC BY 4.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.CONCUR.2023.26
URN: urn:nbn:de:0030-drops-190205
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2023/19020/


Isac, Omri ; Zohar, Yoni ; Barrett, Clark ; Katz, Guy

DNN Verification, Reachability, and the Exponential Function Problem

pdf-format:
LIPIcs-CONCUR-2023-26.pdf (0.7 MB)


Abstract

Deep neural networks (DNNs) are increasingly being deployed to perform safety-critical tasks. The opacity of DNNs, which prevents humans from reasoning about them, presents new safety and security challenges. To address these challenges, the verification community has begun developing techniques for rigorously analyzing DNNs, with numerous verification algorithms proposed in recent years. While a significant amount of work has gone into developing these verification algorithms, little work has been devoted to rigorously studying the computability and complexity of the underlying theoretical problems. Here, we seek to contribute to bridging this gap. We focus on two kinds of DNNs: those that employ piecewise-linear activation functions (e.g., ReLU), and those that employ piecewise-smooth activation functions (e.g., Sigmoids). We prove the following two theorems:
(i) the decidability of verifying DNNs with a particular set of piecewise-smooth activation functions, including Sigmoid and tanh, is equivalent to a well-known, open problem formulated by Tarski; and
(ii) the DNN verification problem for any quantifier-free linear arithmetic specification can be reduced to the DNN reachability problem, whose approximation is NP-complete. These results answer two fundamental questions about the computability and complexity of DNN verification, and the ways in which it is affected by the network's activation functions and error tolerance, and could help guide future efforts in developing DNN verification tools.
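
For readers unfamiliar with the two problems named in the abstract, the following minimal Python sketch contrasts a verification-style query over a small ReLU network with a crude, sampled approximation of its reachable output range. The network weights, the 0.6 threshold, and the grid-sampling procedure are illustrative assumptions and are not taken from the paper, which studies these problems theoretically.

import itertools

def relu(z):
    return max(0.0, z)

def N(x1, x2):
    # A tiny piecewise-linear network N : R^2 -> R with one hidden layer
    # of two ReLU neurons followed by a linear output.
    h1 = relu(1.0 * x1 - 1.0 * x2 + 0.5)
    h2 = relu(-0.5 * x1 + 2.0 * x2)
    return 0.7 * h1 - 1.2 * h2 + 0.1

# Verification-style query (existential): is there an input in the box
# [0,1] x [0,1] whose output exceeds 0.6? This is a quantifier-free
# linear arithmetic property over the network's inputs and outputs.
def violates(x1, x2):
    return N(x1, x2) > 0.6

# Reachability-style query: estimate the reachable output range of N over
# the same box by grid sampling. This is only a rough under-approximation;
# as the abstract notes, approximating reachability is NP-complete.
steps = 101
grid = [i / (steps - 1) for i in range(steps)]
outputs = [N(x1, x2) for x1, x2 in itertools.product(grid, grid)]
print("sampled output range:", (min(outputs), max(outputs)))
print("witness for the property found:",
      any(violates(x1, x2) for x1, x2 in itertools.product(grid, grid)))

The paper's reduction goes in the direction suggested by this pairing: deciding a quantifier-free linear arithmetic specification against a DNN can be recast as answering reachability queries about the network's outputs.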

BibTeX - Entry

@InProceedings{isac_et_al:LIPIcs.CONCUR.2023.26,
  author =	{Isac, Omri and Zohar, Yoni and Barrett, Clark and Katz, Guy},
  title =	{{DNN Verification, Reachability, and the Exponential Function Problem}},
  booktitle =	{34th International Conference on Concurrency Theory (CONCUR 2023)},
  pages =	{26:1--26:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-299-0},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{279},
  editor =	{P\'{e}rez, Guillermo A. and Raskin, Jean-Fran\c{c}ois},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/opus/volltexte/2023/19020},
  URN =		{urn:nbn:de:0030-drops-190205},
  doi =		{10.4230/LIPIcs.CONCUR.2023.26},
  annote =	{Keywords: Formal Verification, Computability Theory, Deep Neural Networks}
}

Keywords: Formal Verification, Computability Theory, Deep Neural Networks
Collection: 34th International Conference on Concurrency Theory (CONCUR 2023)
Issue Date: 2023
Date of publication: 07.09.2023

