Elastic Feature Consolidation for Cold Start Exemplar-free Incremental Learning
CoRR (2024)
Abstract
Exemplar-Free Class Incremental Learning (EFCIL) aims to learn from a
sequence of tasks without access to previous task data. In this paper,
we consider the challenging Cold Start scenario, in which insufficient data is
available in the first task to learn a high-quality backbone. This is
especially challenging for EFCIL since it requires high plasticity, which
results in feature drift that is difficult to compensate for in the
exemplar-free setting. To address this problem, we propose a simple and
effective approach that consolidates feature representations by regularizing
drift in directions highly relevant to previous tasks and employs prototypes to
reduce task-recency bias. Our method, called Elastic Feature Consolidation
(EFC), exploits a tractable second-order approximation of feature drift based
on an Empirical Feature Matrix (EFM). The EFM induces a pseudo-metric in
feature space, which we use to regularize feature drift in important directions
and to update Gaussian prototypes used in a novel asymmetric cross-entropy loss
that effectively balances prototype rehearsal with data from new tasks.
Experimental results on CIFAR-100, Tiny-ImageNet, ImageNet-Subset, and
ImageNet-1K demonstrate that Elastic Feature Consolidation is better able to
learn new tasks by maintaining model plasticity and significantly outperforms
the state-of-the-art.
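The abstract describes regularizing feature drift through a pseudo-metric induced by an Empirical Feature Matrix. The following is a minimal NumPy sketch of that idea, not the paper's implementation: it assumes the EFM is estimated as a probability-weighted average of outer products of per-class feature gradients, and that drift between old and new feature extractors is penalized quadratically under the (damped) EFM. The function names and the `damping` parameter are illustrative assumptions.

```python
import numpy as np

def empirical_feature_matrix(probs, jacobians):
    """Sketch of an Empirical Feature Matrix (assumed form): the average,
    over samples and classes, of probability-weighted outer products of
    per-class gradients with respect to the features.
    probs:     (N, C) softmax outputs
    jacobians: (N, C, D) gradients of class scores w.r.t. features
    Returns a symmetric positive semi-definite (D, D) matrix."""
    n = probs.shape[0]
    # weight each class-gradient outer product by its predicted probability
    return np.einsum('nc,ncd,nce->de', probs, jacobians, jacobians) / n

def efm_drift_penalty(f_new, f_old, efm, damping=0.1):
    """Quadratic drift penalty under the EFM pseudo-metric (illustrative):
    mean over the batch of (f_new - f_old)^T (EFM + damping*I) (f_new - f_old).
    Drift along directions important to previous tasks (large EFM eigenvalues)
    is penalized more than drift along unimportant directions."""
    d = f_new - f_old                          # (N, D) feature drift
    m = efm + damping * np.eye(efm.shape[0])   # damped pseudo-metric
    return float(np.mean(np.einsum('nd,de,ne->n', d, m, d)))
```

Because each weighted outer product is positive semi-definite, the penalty is always non-negative and vanishes only when the features do not drift (up to the damping term's null space), which is what makes it usable as a regularizer.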
Keywords
Computer vision, continual learning, class-incremental learning, exemplar-free, lifelong learning