CEAT: Continual Expansion and Absorption Transformer for Non-Exemplar Class-Incremental Learning
arXiv (2024)
Abstract
In real-world applications, dynamic scenarios require models to learn new tasks continually without forgetting old knowledge. Experience-replay methods store a subset of the old images for joint training, but under stricter privacy protection, storing old images becomes infeasible, which leads to a more severe plasticity-stability dilemma and a stronger classifier bias. To meet these challenges, we propose a new architecture, the Continual Expansion and Absorption Transformer (CEAT). The model learns novel knowledge by extending expanded-fusion layers in parallel with the frozen previous parameters. After each task ends, the extended parameters are losslessly absorbed into the backbone, so the number of parameters remains constant. To improve the learning ability of the model, we design a novel prototype contrastive loss that reduces the overlap between old and new classes in the feature space. In addition, to address the classifier's bias towards new classes, we propose a novel approach that generates pseudo-features to correct the classifier. We evaluate our method on three standard Non-Exemplar Class-Incremental Learning (NECIL) benchmarks. Extensive experiments demonstrate that our model improves significantly over previous works, achieving gains of 5.38% and 4.92% on these benchmarks.
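
The expand-then-absorb mechanism lends itself to a compact illustration. Below is a minimal PyTorch sketch assuming the expansion branch is a low-rank linear layer running in parallel with a frozen base layer (the class name `ExpandedLinear` and the rank choice are hypothetical, not from the paper). Because both branches are linear maps over the same input/output space, the branch folds into the base weight exactly, so absorption is lossless and the parameter count stays constant.

```python
import torch
import torch.nn as nn

class ExpandedLinear(nn.Module):
    """Frozen base layer plus a trainable parallel expansion branch.

    A minimal sketch of the expand-then-absorb idea from the abstract;
    the paper's actual expanded-fusion layer may differ.
    """

    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)  # previous parameters stay frozen
        # Hypothetical low-rank parameterization of the expansion branch.
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))

    def forward(self, x):
        # base(x) + branch(x); B starts at zero, so the branch initially
        # contributes nothing and the old model's behavior is preserved.
        return self.base(x) + x @ self.A.T @ self.B.T

    @torch.no_grad()
    def absorb(self) -> nn.Linear:
        """Lossless absorption: fold the branch into the base weight.

        base(x) + x A^T B^T == x (W + B A)^T + b, so replacing W with
        W + B A reproduces the expanded model with no extra parameters.
        """
        self.base.weight += self.B @ self.A
        return self.base
```

After a task ends, calling `absorb()` returns an ordinary `nn.Linear` that reproduces the expanded model's outputs exactly, ready to be frozen and expanded again for the next task.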
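The abstract does not spell out the form of the prototype contrastive loss; one plausible reading is an InfoNCE-style objective over class prototypes, sketched below. The function name, the temperature `tau`, and the label convention are assumptions, not the paper's definitions.

```python
import torch
import torch.nn.functional as F

def prototype_contrastive_loss(feats, labels, old_protos, new_protos, tau=0.1):
    """Sketch of a prototype contrastive loss (assumed InfoNCE form).

    Each feature is pulled toward its own class prototype and pushed away
    from all other prototypes, including the frozen old-class ones, which
    reduces old/new overlap in the feature space. `labels` are assumed to
    index the current task's new classes starting from 0.
    """
    feats = F.normalize(feats, dim=1)
    protos = F.normalize(torch.cat([old_protos, new_protos], dim=0), dim=1)
    logits = feats @ protos.T / tau          # (batch, n_old + n_new)
    targets = labels + old_protos.size(0)    # shift past old-class prototypes
    return F.cross_entropy(logits, targets)
```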
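Pseudo-feature generation in NECIL is commonly implemented by perturbing stored class prototypes, which is one way to read the classifier-correction step here. The sketch below assumes that simple Gaussian scheme; the names `sample_pseudo_features`, `sigma`, and `n_per_class` are illustrative, and the paper's generator may be more elaborate.

```python
import torch

def sample_pseudo_features(old_protos, old_labels, n_per_class=16, sigma=0.1):
    """Sketch of pseudo-feature generation for classifier correction.

    Samples Gaussian perturbations around stored old-class prototypes.
    Mixing these pseudo-features into classifier training counters the
    bias toward new classes, since without exemplars the head would
    otherwise only ever see new-class features.
    """
    feats = old_protos.repeat_interleave(n_per_class, dim=0)
    feats = feats + sigma * torch.randn_like(feats)
    labels = old_labels.repeat_interleave(n_per_class)
    return feats, labels
```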