Robustness to distribution shifts of compressed networks for edge devices
CoRR (2024)
Abstract
It is necessary to develop efficient DNNs that can be deployed on edge devices
with limited computational resources. However, compressed networks often
execute new tasks in a target domain that differs from the source domain on
which the original network was trained. It is therefore important to
investigate the robustness of compressed networks under two types of data
distribution shift: domain shifts and adversarial perturbations. In this study,
we find that compressed models are less robust to distribution shifts than
their original networks. Interestingly, larger networks lose more robustness
under compression than smaller ones, even when they are compressed to a size
similar to that of the smaller networks. Furthermore, compact networks obtained
by knowledge distillation are much more robust to distribution shifts than
pruned networks. Finally, post-training quantization is a reliable method for
preserving robustness to distribution shifts, and it yields models that
outperform both pruned and distilled models in terms of robustness.
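For readers unfamiliar with the technique the abstract highlights, below is a
minimal, hypothetical sketch of post-training quantization using PyTorch's
dynamic quantization API. It is not the paper's actual pipeline; the toy model,
layer choices, and input sizes are illustrative assumptions. The key property
it demonstrates is that quantization is applied after training, with no
retraining, and the quantized model remains a drop-in replacement at inference
time, so its accuracy can be compared against the fp32 model on shifted data.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a trained network destined for an edge device;
# the paper's actual architectures and datasets are not reproduced here.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
).eval()

# Dynamic post-training quantization: weights of the listed module types
# are stored as int8, and their activations are quantized on the fly at
# inference. No fine-tuning or calibration pass is required.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# The quantized model accepts the same inputs as the fp32 original, so the
# two can be evaluated side by side on in-domain and shifted test sets.
x = torch.randn(1, 3, 32, 32)
print(quantized(x).shape)  # torch.Size([1, 10])
```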