Multi-task fine-tuning for generating keyphrases in a scientific domain

2023 IX International Conference on Information Technology and Nanotechnology (ITNT), 2023

Abstract
Automatic selection of keyphrases (keywords) is a major challenge in finding and systematizing scholarly documents. This paper investigates the efficiency of using the titles of scientific papers as additional information for keyphrase generation. We propose an approach to multi-task fine-tuning of the BART model using control codes. It is shown that the suggested approach can improve the performance of BART on the task of keyphrase generation. In some cases, the presented model outperforms state-of-the-art models for keyphrase extraction. Moreover, the results demonstrate that multi-task fine-tuning also increases the performance of title generation.
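The abstract describes multi-task fine-tuning with control codes: a special prefix token tells the model which of the two tasks (keyphrase generation or title generation) to perform on the same input document. A minimal sketch of how such training pairs could be prepared is shown below; the control-code strings, separator, and helper function are illustrative assumptions, not the paper's exact setup.

```python
# Hypothetical control-code tokens marking the target task for the decoder.
KEYPHRASE_CODE = "<|KP|>"
TITLE_CODE = "<|TITLE|>"


def make_multitask_examples(abstract, title, keyphrases):
    """Build two (source, target) pairs from one paper:
    one for keyphrase generation, one for title generation.
    The control code is prepended to the encoder input so a single
    fine-tuned seq2seq model (e.g. BART) can serve both tasks."""
    return [
        (f"{KEYPHRASE_CODE} {abstract}", " ; ".join(keyphrases)),
        (f"{TITLE_CODE} {abstract}", title),
    ]


examples = make_multitask_examples(
    abstract="Automatic selection of keyphrases is a major challenge ...",
    title="Multi-task fine-tuning for generating keyphrases",
    keyphrases=["keyphrase generation", "multi-task learning", "BART"],
)
```

Pairs produced this way could then be fed to a standard seq2seq fine-tuning loop; at inference time, switching the prefix switches the task.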
Keywords
natural language processing,automatic summarization,text generation,scientific text,BART,multi-task learning,keyword extraction,keyphrase generation