RBPSum: An extractive summarization approach using Bi-Stream Attention and Position Residual Connection.

IJCNN (2023)

Abstract
Extractive text summarization is a well-studied natural language processing task that aims to select the sentences that capture a document's critical information. For systems built on pre-trained language models, the large number of parameters can cause performance to degrade significantly when training data is insufficient, yet acquiring high-quality labeled data is time-consuming and laborious. Previous work has focused mainly on accuracy, and little attention has been paid to model robustness. Training a robust, high-quality model is therefore the concern of this work. We propose RBPSum, a robust extractive summarization model built on a pre-trained language model. Through experiments, we find that when the training data is limited, sentence position information plays a critical role in extractive summarization. In terms of ROUGE metrics, our model outperforms previous state-of-the-art approaches when using the entire training set, and achieves competitive results with around a third of it.
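The abstract does not describe the architecture in detail, but the role it assigns to sentence position information can be illustrated with a minimal sketch: sentence embeddings from an encoder have a positional encoding added back via a residual connection before each sentence is scored for extraction. Everything here (the sinusoidal encoding, the linear scorer, all function names) is an assumption for illustration, not the paper's actual method.

```python
import numpy as np

def position_encoding(num_sents, dim):
    # Standard sinusoidal positional encoding: one vector per sentence position.
    pos = np.arange(num_sents)[:, None]
    i = np.arange(dim)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / dim)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def score_sentences(sent_embs, w):
    # Hypothetical "position residual": add the positional encoding back onto
    # the encoder's sentence embeddings, then score each sentence linearly.
    h = sent_embs + position_encoding(*sent_embs.shape)
    return h @ w  # one extraction score per sentence

rng = np.random.default_rng(0)
embs = rng.standard_normal((5, 8))   # 5 sentences, toy 8-dim embeddings
w = rng.standard_normal(8)           # toy scoring weights
scores = score_sentences(embs, w)
top2 = np.argsort(-scores)[:2]       # extract the 2 highest-scoring sentences
print(top2.tolist())
```

The summary is then formed by taking the selected sentences in their original document order, which is the usual convention in extractive systems.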
Key words
Natural language processing, Pre-trained model, Text summarization