Neural networks are increasingly used in safety-critical systems, which raises concerns about their deployment: can we be sure the network will never recommend an action that violates a set of safety constraints? Formal verification has recently been applied to prove whether an existing neural network is certified for some property (i.e., whether it satisfies the property for all possible inputs). However, formal verification can only prove that a network respects a property; it cannot fix a network that violates it. In this paper we focus on the automated synthesis of certified neural networks, that is, on how to automatically build a network that is guaranteed to respect some required properties. We exploit a Counterexample-Guided Inductive Synthesis (CEGIS) loop that alternates deep learning, formal verification, and a novel data generation technique that augments the training data, synthesizing certified networks in a fully automatic way. A proof-of-concept implementation of the framework demonstrates the feasibility of the approach.
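To make the loop structure concrete, here is a minimal sketch of a CEGIS-style synthesis loop. All names (`train`, `verify`, `generate_data`) and the toy 1-D threshold model are hypothetical illustrations, not the paper's implementation; in particular, `verify` merely samples for counterexamples as a stand-in for a real formal verifier, so "certified" here holds only up to the soundness of that stand-in.

```python
import random

# Assumed toy safety property: for all x in [0.7, 1.0], model(x) == 1.
PROPERTY_LO, PROPERTY_HI = 0.7, 1.0


def train(data):
    """Learning step (stand-in for deep learning): fit a 1-D threshold
    model that predicts 1 iff x >= theta, choosing the largest theta
    consistent with every positive example seen so far."""
    positives = [x for x, y in data if y == 1]
    theta = min(positives) if positives else 1.0
    return lambda x: 1 if x >= theta else 0


def verify(model, samples=10_000):
    """Verification step (stand-in for a formal verifier): search for a
    violating input; return a counterexample, or None if none is found.
    A real CEGIS loop would call an actual verification tool here."""
    for _ in range(samples):
        x = random.uniform(PROPERTY_LO, PROPERTY_HI)
        if model(x) != 1:
            return x
    return None


def generate_data(counterexample):
    """Data generation step: label the counterexample according to the
    property, so retraining pushes the model toward satisfying it."""
    return [(counterexample, 1)]


def cegis(initial_data, max_iters=20):
    """Alternate training, verification, and data augmentation until the
    verifier finds no counterexample, or the iteration budget runs out."""
    data = list(initial_data)
    for _ in range(max_iters):
        model = train(data)
        cex = verify(model)
        if cex is None:
            return model  # no violation found: accept the model
        data += generate_data(cex)
    raise RuntimeError("no certified model found within budget")


model = cegis([(0.9, 1), (0.2, 0)])
print(model(0.75))  # expected: 1, since 0.75 lies in the property region
```

Each iteration either terminates with a model the (stand-in) verifier accepts, or feeds a correctly labeled counterexample back into the training set, which is the essential feedback structure of a CEGIS loop.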