Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning
CVPR 2024
Abstract
Class-Incremental Learning (CIL) requires a learning system to continually
learn new classes without forgetting. Despite the strong performance of
Pre-Trained Models (PTMs) in CIL, a critical issue persists: learning new
classes often results in the overwriting of old ones. Excessive modification of
the network causes forgetting, while minimal adjustments lead to an inadequate
fit for new classes. Consequently, we need an efficient way to update the model without harming its former knowledge. In this paper, we
propose ExpAndable Subspace Ensemble (EASE) for PTM-based CIL. To enable model
updating without conflict, we train a distinct lightweight adapter module for
each new task, aiming to create task-specific subspaces. These adapters span a
high-dimensional feature space, enabling joint decision-making across multiple
subspaces. As data evolves, the expanding subspaces render the old class
classifiers incompatible with new-stage spaces. Correspondingly, we design a
semantic-guided prototype complement strategy that synthesizes old classes' new
features without using any old class instance. Extensive experiments on seven
benchmark datasets verify EASE's state-of-the-art performance. Code is
available at: https://github.com/sun-hailong/CVPR24-Ease
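To make the mechanism described above concrete, below is a minimal PyTorch sketch of the two ideas in the abstract: a frozen pre-trained backbone extended with one lightweight adapter per task, whose task-specific features are concatenated into an expanding subspace ensemble, and a semantic-guided prototype complement that synthesizes old-class prototypes in a new subspace from class-to-class similarities, without using any old-class instances. All names here (TaskAdapter, SubspaceEnsemble, complement_prototypes) are hypothetical illustrations under these assumptions, not the authors' released implementation.

```python
# Minimal sketch of the EASE idea (hypothetical names, not the official code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TaskAdapter(nn.Module):
    """Lightweight residual bottleneck adapter defining one task-specific subspace."""

    def __init__(self, dim: int, bottleneck: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x):
        # Frozen-backbone feature plus a small task-specific correction.
        return x + self.up(F.relu(self.down(x)))


class SubspaceEnsemble(nn.Module):
    """Frozen pre-trained backbone plus one adapter per incremental task."""

    def __init__(self, backbone: nn.Module, dim: int):
        super().__init__()
        self.backbone = backbone          # pre-trained model, kept frozen
        self.adapters = nn.ModuleList()   # grows by one adapter per new task
        self.dim = dim

    def add_task(self):
        # Create a fresh subspace for the incoming task; old adapters stay untouched.
        self.adapters.append(TaskAdapter(self.dim))

    def forward(self, x):
        feat = self.backbone(x)           # shared pre-trained representation
        # Joint decision-making: concatenate every task-specific subspace
        # into one high-dimensional feature vector.
        return torch.cat([adapter(feat) for adapter in self.adapters], dim=-1)


def complement_prototypes(old_protos_old_space, new_protos_old_space, new_protos_new_space):
    """Semantic-guided prototype complement (sketch).

    Old classes have no prototypes in a newly added subspace. Measure old-to-new
    class similarity in an old subspace (where both are available), then use those
    weights to mix new-class prototypes from the new subspace into synthetic
    old-class prototypes, without revisiting any old instance.
    """
    sim = F.softmax(old_protos_old_space @ new_protos_old_space.t(), dim=-1)
    return sim @ new_protos_new_space
```

Under these assumptions, each incremental stage would call add_task(), train only the new adapter on the new classes, and fill in the missing old-class entries of the prototype-based classifier with complement_prototypes before classifying in the concatenated space.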