Bridging Knowledge Distillation Gap for Few-sample Unsupervised Semantic Segmentation

Ping Li, Junjie Chen, Chen Tang

Information Sciences (2024)

Abstract
Due to privacy and security concerns and the high cost of labeling images, unsupervised semantic segmentation with very few samples has become a promising direction, yet it remains largely unexplored. This inspires us to introduce the few-sample unsupervised semantic segmentation task, which is very challenging because a few unlabeled images are far from sufficient for generalizing the segmentation model. We address this problem from the knowledge distillation perspective by proposing a medium-sized auxiliary network as a bridge, which narrows the semantic knowledge gap between a large teacher network and a small student network. To this end, we develop the Knowledge Distillation Bridge (KDB) framework for few-sample unsupervised semantic segmentation. In particular, it consists of a teacher-auxiliary-student architecture and adopts block-wise distillation, which encourages the auxiliary to imitate the teacher and the student to imitate the auxiliary. In this way, the knowledge gap between the source feature distribution and the target one is reduced, allowing the student, with its smaller network, to be readily deployed in highly demanding environments. Meanwhile, each channel characterizes different semantics in the feature map, which motivates us to distill the decoder features in a channel-wise manner. Extensive experiments on two benchmarks, Pascal VOC2012 and Cityscapes, demonstrate the promising performance of the proposed method, which strikes a good balance between precision and speed, e.g., it achieves an inference speed of 230 fps on a 512×512 image.
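To make the two distillation terms concrete, below is a minimal PyTorch sketch of the two loss components the abstract describes: block-wise imitation between intermediate feature blocks of two networks, and channel-wise distillation of decoder features. The function names, the optional 1×1 adapter convolutions, the MSE choice for the block-wise term, the spatial-softmax KL form of the channel-wise term, and the temperature `tau` are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def blockwise_mse(feats_src, feats_tgt, adapters=None):
    """Block-wise imitation loss (assumed MSE form): match each
    intermediate feature block of a source network (e.g., auxiliary)
    to the corresponding block of a target network (e.g., teacher).
    `adapters` is an optional list of 1x1 convs that project source
    channels to the target's; spatial sizes are assumed aligned."""
    loss = 0.0
    for i, (fs, ft) in enumerate(zip(feats_src, feats_tgt)):
        if adapters is not None:
            fs = adapters[i](fs)
        # Target features are detached so gradients only update the
        # imitating (source) network.
        loss = loss + F.mse_loss(fs, ft.detach())
    return loss / len(feats_src)

def channelwise_kd(f_student, f_teacher, tau=1.0):
    """Channel-wise distillation (assumed spatial-softmax KL form):
    each channel's activation map is normalized into a distribution
    over spatial locations, and the student's distribution is matched
    to the teacher's via KL divergence."""
    b, c, h, w = f_teacher.shape
    p_t = F.softmax(f_teacher.detach().reshape(b, c, -1) / tau, dim=-1)
    log_p_s = F.log_softmax(f_student.reshape(b, c, -1) / tau, dim=-1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * tau ** 2
```

Following the abstract, `blockwise_mse` would be applied twice per step (auxiliary imitating teacher, student imitating auxiliary), while `channelwise_kd` would be applied to the decoder feature maps; how the terms are weighted and combined is left open here.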
Keywords
Unsupervised semantic segmentation, knowledge distillation, block-wise / channel-wise distillation, few-sample learning