Deep Validation: Toward Detecting Real-World Corner Cases for Deep Neural Networks

2019 49th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN), 2019

Abstract
The exceptional performance of deep neural networks (DNNs) encourages their deployment in safety- and dependability-critical systems. However, DNNs often exhibit erroneous behavior in real-world corner cases. Existing countermeasures center on improving testing and bug-fixing practices. Unfortunately, building a bug-free DNN-based system is currently almost impossible because of the black-box nature of DNNs, so anomaly detection is indispensable in practice. Motivated by the idea of data validation in traditional programs, we propose and implement Deep Validation, a novel framework for detecting real-world error-inducing corner cases in a DNN-based system at runtime. We model the specifications of DNNs using their training data and cast checking the validity of DNN inputs as a discrepancy-estimation problem. Deep Validation achieves excellent detection results against various corner-case scenarios across three popular datasets. Consequently, Deep Validation greatly complements existing efforts and is a crucial step toward building safe and dependable DNN-based systems.
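The abstract does not spell out how the discrepancy between a runtime input and the training data is estimated. The sketch below is only one plausible illustration of the general idea, not the paper's method: it models training-set features with a single Gaussian and flags inputs whose Mahalanobis distance exceeds a threshold calibrated on the training data. The feature representation, Gaussian assumption, and quantile threshold are all assumptions introduced here for illustration.

```python
# Illustrative sketch (NOT the paper's exact algorithm): input-validity checking
# framed as discrepancy estimation against the training data distribution.
import numpy as np

class DiscrepancyValidator:
    """Flags runtime inputs whose features deviate from the training distribution."""

    def __init__(self, quantile=0.99):
        # Fraction of training scores treated as "valid"; an assumed hyperparameter.
        self.quantile = quantile

    def fit(self, train_features):
        # Model the training data with a single Gaussian over feature vectors.
        self.mean = train_features.mean(axis=0)
        cov = np.cov(train_features, rowvar=False)
        # Regularize before inversion so the precision matrix is well defined.
        self.prec = np.linalg.pinv(cov + 1e-6 * np.eye(cov.shape[0]))
        # Calibrate the rejection threshold on the training scores themselves.
        scores = self._score(train_features)
        self.threshold = np.quantile(scores, self.quantile)
        return self

    def _score(self, features):
        # Squared Mahalanobis distance as the discrepancy measure.
        diff = features - self.mean
        return np.einsum("ij,jk,ik->i", diff, self.prec, diff)

    def is_valid(self, features):
        # Inputs scoring above the calibrated threshold are treated as corner cases.
        return self._score(np.atleast_2d(features)) <= self.threshold


# Usage with synthetic vectors standing in for DNN feature activations.
rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=(1000, 16))        # in-distribution features
validator = DiscrepancyValidator(quantile=0.99).fit(train)
print(validator.is_valid(rng.normal(0.0, 1.0, 16)))   # likely valid
print(validator.is_valid(rng.normal(6.0, 1.0, 16)))   # likely flagged as a corner case
```

In a real DNN-based system, the feature vectors would come from the model itself (for example, an intermediate layer's activations), and the discrepancy measure and threshold would be chosen per the paper's actual design.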
Keywords
neural networks,classification,safety,anomaly detection