Few-Shot Class Incremental Learning via Robust Transformer Approach

Information Sciences (2024)

Abstract
Few-Shot Class-Incremental Learning (FSCIL) extends the Class-Incremental Learning (CIL) problem to the setting where a model must cope with data scarcity while also addressing the Catastrophic Forgetting (CF) problem. The problem remains open because recent works are built upon Convolutional Neural Networks (CNNs), which perform sub-optimally compared to transformer approaches. Our paper presents the Robust Transformer Approach (ROBUSTA), built upon the Compact Convolutional Transformer (CCT). The issue of overfitting due to few samples is overcome with the notion of a stochastic classifier, where the classifier's weights are sampled from a distribution with mean and variance vectors, thus increasing the likelihood of correct classifications, and with a batch-norm layer that stabilizes the training process. The issue of CF is addressed via the idea of delta parameters, small task-specific trainable parameters, while the backbone network is kept frozen. A non-parametric approach is developed to infer the delta parameters for the model's predictions. A prototype rectification approach is applied to avoid biased prototype calculations caused by data scarcity. The advantage of ROBUSTA is demonstrated through a series of experiments on benchmark problems, where it outperforms prior arts by large margins without any data augmentation protocols.
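The stochastic classifier described in the abstract can be illustrated with a minimal NumPy sketch: classifier weights are drawn from a per-class Gaussian (mean and variance vectors) and classes are scored by cosine similarity. The function name, the cosine-similarity scoring choice, and all shapes are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_classify(features, weight_mu, weight_sigma, rng):
    """Score each feature against weights sampled from N(mu, sigma^2).

    features:     (batch, dim) feature vectors from a frozen backbone
    weight_mu:    (classes, dim) mean of the weight distribution
    weight_sigma: (classes, dim) std-dev of the weight distribution
    Returns cosine-similarity logits of shape (batch, classes).
    """
    # Sample one weight matrix via the reparameterization W = mu + sigma * eps
    w = weight_mu + weight_sigma * rng.standard_normal(weight_mu.shape)
    # L2-normalize weights and features so the dot product is a cosine score
    w = w / np.linalg.norm(w, axis=1, keepdims=True)
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    return f @ w.T
```

At test time one could average logits over several weight samples; during training the sampling acts as a regularizer against overfitting on few-shot classes.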
Key words
Continual Learning, Class Incremental Learning, Few-Shot Class-Incremental Learning, Few-Shot Learning