MeetSum: Transforming Meeting Transcript Summarization using Transformers!

arXiv (2021)

Abstract
Creating abstractive summaries from meeting transcripts has proven to be challenging due to the limited amount of labeled data available for training neural network models. Meanwhile, Transformer-based architectures have proven to beat state-of-the-art models in summarizing news data. In this paper, we utilize a Transformer-based Pointer Generator Network to generate abstractive summaries for meeting transcripts. This model uses two LSTMs as an encoder and a decoder, a Pointer network that copies words from the input text, and a Generator network that produces out-of-vocabulary words (hence making the summary abstractive). In addition, a coverage mechanism is used to avoid repetition of words in the generated summary. First, we show that training the model on a news summarization dataset and testing it zero-shot on the meeting dataset produces better results than training it directly on the AMI meeting dataset. Second, we show that training this model first on out-of-domain data, such as the CNN-Dailymail dataset, and then fine-tuning it on the AMI meeting dataset improves its performance significantly. We evaluate our model on a test set from the AMI dataset and report the ROUGE-2 score of the generated summaries to compare with previous literature. We also report the Factual score of our summaries, a better benchmark for abstractive summaries because the ROUGE-2 score is limited to measuring word overlap. We show that our improved model outperforms previous models by at least 5 ROUGE-2 points, a substantial improvement. Finally, a qualitative analysis of the summaries generated by our model shows that they are human-readable and indeed capture most of the important information from the transcripts.
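The architecture the abstract describes is essentially a pointer-generator network with coverage in the style of See et al. (2017). The sketch below illustrates the two core pieces: mixing the Generator's vocabulary distribution with the Pointer's copy distribution, and the coverage penalty that discourages repetition. It is a minimal PyTorch illustration under assumed tensor names and shapes, not the authors' released code.

```python
# Minimal PyTorch sketch of the pointer-generator mixing step and the
# coverage penalty described in the abstract (following See et al., 2017).
# Tensor names and shapes are illustrative assumptions, not the authors'
# implementation.
import torch

def final_distribution(p_gen, vocab_dist, attn_dist, src_ids):
    """Blend the Generator's vocabulary distribution with the Pointer's
    copy distribution over source tokens.

    p_gen:      (batch, 1)        generation probability in [0, 1]
    vocab_dist: (batch, V_ext)    softmax over the fixed vocabulary,
                                  zero-padded to the extended vocabulary
    attn_dist:  (batch, src_len)  attention weights over source positions
    src_ids:    (batch, src_len)  source token ids in the extended vocabulary
    """
    gen_part = p_gen * vocab_dist
    # Scatter the remaining probability mass onto the ids of the source
    # tokens, so out-of-vocabulary source words can still be copied.
    copy_part = torch.zeros_like(vocab_dist).scatter_add(
        1, src_ids, (1.0 - p_gen) * attn_dist)
    return gen_part + copy_part

def coverage_loss(attn_dist, coverage):
    """Penalize attending again to source positions that are already
    covered; `coverage` is the running sum of past attention distributions."""
    return torch.sum(torch.minimum(attn_dist, coverage), dim=1).mean()
```

At each decoding step the coverage vector is updated as coverage += attn_dist, and in See et al. (2017) the coverage loss is added to the negative log-likelihood with weight 1.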
Keywords
transforming meeting transcript summarization