Pros and Cons of Weight Pruning for Out-of-Distribution Detection: An Empirical Survey.

IJCNN (2023)

Abstract
Deep neural networks (DNNs) perform well on samples drawn from the training distribution. However, DNNs deployed in the real world are exposed to out-of-distribution (OOD) samples, i.e., samples from distributions that differ from the training distribution. OOD detection is indispensable for deployed DNNs because OOD samples can cause them to behave unexpectedly. This paper empirically explores the effectiveness of weight pruning of DNNs for OOD detection in a post-hoc setting (i.e., performing OOD detection on pretrained DNN models). We conduct experiments on image, text, and tabular datasets to thoroughly evaluate the OOD detection performance of weight-pruned DNNs. Our experimental results yield three novel findings: (i) Weight pruning improves OOD detection performance more significantly with a Mahalanobis distance-based detection approach, which performs OOD detection on DNN hidden representations using the Mahalanobis distance, than with logit-based detection approaches. (ii) Weight-pruned DNNs tend to extract global features of inputs, which improves OOD detection on samples that are highly dissimilar to the in-distribution samples. (iii) Weights that are useless for classification are often useful for OOD detection, so weight importance should not be quantified solely by the sensitivity of weights to classification error. On the basis of these findings, we advocate practical DNN weight-pruning techniques that enable weight-pruned DNNs to maintain both OOD detection and classification capabilities.
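The Mahalanobis distance-based detection mentioned in finding (i) can be illustrated with a minimal sketch. This is not the authors' code: it assumes hidden representations are already extracted as a NumPy array, fits per-class means with a shared (tied) covariance on in-distribution features, and scores a sample by its negative minimum class-conditional Mahalanobis distance (lower score = more OOD-like).

```python
# Hypothetical sketch of Mahalanobis distance-based OOD scoring on
# hidden representations; names and shapes are illustrative assumptions.
import numpy as np

def fit_mahalanobis(feats, labels):
    """Estimate per-class means and a shared precision matrix
    from in-distribution features (feats: [N, D], labels: [N])."""
    classes = np.unique(labels)
    means = {c: feats[labels == c].mean(axis=0) for c in classes}
    # Tied covariance: pool class-centered features across all classes.
    centered = np.vstack([feats[labels == c] - means[c] for c in classes])
    cov = centered.T @ centered / len(feats)
    precision = np.linalg.pinv(cov)  # pseudo-inverse for numerical stability
    return means, precision

def ood_score(x, means, precision):
    """Negative minimum Mahalanobis distance over classes.
    Lower values indicate the sample is more OOD-like."""
    dists = [(x - m) @ precision @ (x - m) for m in means.values()]
    return -min(dists)
```

In a post-hoc setting such as the paper's, `feats` would come from a hidden layer of a pretrained (possibly weight-pruned) DNN, and a threshold on the score separates in-distribution from OOD inputs.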
Keywords
Out-of-Distribution, Weight Pruning