ESDB: Expand The Shrinking Decision Boundary via One-to-Many Information Matching for Continual Learning with Small Memory

Kunchi Li, Hongyang Chen, Jun Wan, Shan Yu

IEEE Transactions on Circuits and Systems for Video Technology (2024)

Abstract
Rehearsal methods based on knowledge distillation (KD) have been widely used in continual learning (CL). However, under memory constraints, the few stored exemplars capture only limited variations of previously learned tasks, impeding the effectiveness of KD in retaining long-term knowledge. The decision boundaries learned by the typical KD strategy overfit the limited exemplars, leading to "shrunk boundaries" of the old classes. To tackle this problem, we propose a novel KD strategy, the One-to-Many Information Matching method (O2MIM), which generates interpolated data by mixing samples between old and new classes, disentangles the supervision information from them, and assigns supervision information to them in favor of the old classes. In this way, the supervision information from a single exemplar can be matched with information from multiple different interpolated images. Moreover, O2MIM uses a single trainable parameter to create an adaptive KD loss, enabling a flexible matching process with the designated supervision information. Consequently, O2MIM exploits the exemplar coreset more effectively, expanding the shrunk decision boundaries toward the new classes. Next, to incorporate new classes into our classification model, we apply an effective classification training strategy to train a debiased classifier. Combining it with O2MIM, we propose the method of Expanding the Shrinking Decision Boundaries (ESDB), which simultaneously transfers knowledge from the old model via O2MIM and learns new classes through the classification training strategy. Extensive experiments demonstrate that ESDB achieves state-of-the-art performance on diverse CL benchmarks. We also show that O2MIM can be combined with various label-mixing methods to improve overall performance in CL. The code is available at: https://github.com/CSTiger77/ESDB.
Keywords
Continual learning,One-to-many information matching,Expanding decision boundary,Mixed data
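The one-to-many matching idea described in the abstract can be sketched as a mixup-style interpolation between an old-class exemplar and a new-class sample, with the mixed supervision biased toward the old class. This is a minimal illustrative sketch only: the function name `o2m_mix`, the linear mixing rule, and the fixed bias `lam > 0.5` are assumptions, not the paper's exact formulation (which additionally uses an adaptive KD loss with a trainable parameter).

```python
import numpy as np

def o2m_mix(old_x, old_y, new_x, new_y, num_classes, lam=0.7):
    """Hypothetical sketch of one-to-many information matching.

    Interpolates an old-class exemplar with a new-class sample and
    assigns the mixed label in favor of the old class (lam > 0.5),
    so one exemplar's supervision is matched with many mixed images.
    """
    # Mixup-style interpolation of the inputs.
    mixed_x = lam * old_x + (1.0 - lam) * new_x
    # Soft target biased toward the old class, to push its shrunk
    # decision boundary back toward the new class.
    target = np.zeros(num_classes)
    target[old_y] += lam
    target[new_y] += 1.0 - lam
    return mixed_x, target
```

Pairing one stored exemplar with many different new-class samples (or varying `lam`) yields many distinct interpolated images whose supervision all traces back to that single exemplar, which is the "one-to-many" matching the abstract refers to.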