Continual relation extraction via linear mode connectivity and interval cross training

Knowledge-Based Systems (2023)

Abstract
Continual relation extraction techniques aim to meet the requirements of real-world applications, in which new data and relations emerge constantly. To strike a balance between continual learning ability and computational cost, many previous replay-based methods employ different strategies to maintain the model's continual learning ability with a small number of stored replay samples. However, catastrophic forgetting is prone to occur when only a limited number of previously stored replay samples are used. In this paper, we propose a continual learning method with three learning phases: preliminary learning, memory retention, and memory reconsolidation. In more detail, during the preliminary learning phase the model is first fine-tuned from the model of the previous task. During the memory retention phase, the method exploits the linear mode connectivity between the multi-task learning model and the continual learning model to retain as much knowledge of previous tasks as possible. In the memory reconsolidation phase, it uses the previously stored samples together with those of the current task for interval cross training to reconsolidate the learned knowledge. To evaluate the continual learning ability of models trained with the proposed method, we tested it on three well-studied benchmarks, and the experimental results show that our algorithm outperforms the compared algorithms in terms of both average accuracy and overall accuracy.
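The memory retention phase rests on the linear mode connectivity idea: two trained models can be joined by a straight line in weight space along which the loss stays low, so an interpolated point can balance old-task and new-task knowledge. The sketch below illustrates only this generic interpolation step with toy NumPy parameter dictionaries; the function name and the parameter sets are hypothetical and do not reproduce the paper's exact training procedure.

```python
import numpy as np

def interpolate_weights(theta_prev, theta_curr, alpha):
    """Return parameters on the straight line between two checkpoints:
    theta(alpha) = (1 - alpha) * theta_prev + alpha * theta_curr."""
    return {name: (1.0 - alpha) * theta_prev[name] + alpha * theta_curr[name]
            for name in theta_prev}

# Toy parameter dictionaries standing in for two model checkpoints
# (e.g. a model retaining previous tasks vs. one tuned on the current task).
theta_prev = {"w": np.array([1.0, 0.0]), "b": np.array([0.5])}
theta_curr = {"w": np.array([0.0, 1.0]), "b": np.array([-0.5])}

# The midpoint (alpha = 0.5) mixes both checkpoints equally; in practice
# alpha would be chosen so the interpolated model stays in a low-loss region.
midpoint = interpolate_weights(theta_prev, theta_curr, 0.5)
```

In a real setting the dictionaries would hold the full network's tensors (for instance a PyTorch `state_dict`), and the chosen interpolation point would then be evaluated on held-out data from both old and new tasks.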
Keywords
Continual learning, Relation extraction, Memory retention, Memory reconsolidation, Linear mode connectivity, Interval cross training