Rule-Based Testing of Neural Networks.

SE4SafeML 2023: Proceedings of the 1st International Workshop on Dependability and Trustworthiness of Safety-Critical Systems with Machine Learned Components (2023)

Abstract
Adequate testing of deep neural networks (DNNs) is challenging due to the lack of formal requirements and specifications of functionality. In this work, we aim to improve DNN testing by addressing this central challenge. The core idea is to drive testing of DNNs from rules abstracting the network behavior. These rules are automatically extracted from a trained model by monitoring its neuron values when running on a set of labeled data, and are validated on a separate test set. We show how these rules can be leveraged to improve fundamental testing activities, such as generating test oracles and supporting test coverage with semantic meaning.
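The abstract does not specify the exact rule format. As a minimal sketch of one plausible reading, the rules below are per-class value intervals over monitored neuron activations: they are extracted from labeled data, validated on a held-out set, and reused as a test oracle. All function names and the box-interval rule form are assumptions for illustration, not the authors' actual method.

```python
import numpy as np

def extract_rules(activations, labels, quantile=0.05):
    """For each class, derive per-neuron value intervals that hold for
    most training inputs of that class (a simple box-rule abstraction)."""
    rules = {}
    for cls in np.unique(labels):
        acts = activations[labels == cls]             # neuron values for this class
        lo = np.quantile(acts, quantile, axis=0)      # lower bound per neuron
        hi = np.quantile(acts, 1 - quantile, axis=0)  # upper bound per neuron
        rules[int(cls)] = (lo, hi)
    return rules

def rule_satisfied(rules, activation, predicted_class):
    """Oracle check: does the observed neuron vector fall inside the
    interval rule associated with the predicted class?"""
    lo, hi = rules[predicted_class]
    return bool(np.all((activation >= lo) & (activation <= hi)))

def validate_rules(rules, activations, labels):
    """Fraction of held-out samples whose activations satisfy the rule
    of their true class (rule precision on the validation set)."""
    hits = [rule_satisfied(rules, a, int(y)) for a, y in zip(activations, labels)]
    return float(np.mean(hits))

# Usage with synthetic activations standing in for a monitored hidden layer.
rng = np.random.default_rng(0)
train_acts = rng.normal(size=(200, 16))
train_lbls = rng.integers(0, 3, size=200)
rules = extract_rules(train_acts, train_lbls)
print(validate_rules(rules, train_acts, train_lbls))
```

A rule violation at test time can then flag a suspicious prediction (an oracle), while the fraction of rules exercised by a test suite gives a coverage measure with semantic meaning.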