A Local Attention-based Neural Networks for Abstractive Text Summarization

Ngoc-Khuong Nguyen, Viet Ha Nguyen, Anh-Cuong Le

Asia Pacific Information Technology Conference (2023)

Abstract
Text summarization is the process of generating a concise summary of a given text. It is a popular research topic, and many studies have tackled it with sequence-to-sequence deep neural network models such as Long Short-Term Memory (LSTM) or Gated Recurrent Unit (GRU) networks. These models consist of two phases: encoding the input text and generating the summary. However, they can lose information during the encoding phase, particularly when deep layers are used, leading to inaccurate summaries. In this paper, we propose a bi-directional LSTM model with a Recurrent Residual Attention mechanism to address this issue. We evaluated our model on the Amazon Reviews dataset from the Stanford Network Analysis Project and found that it outperformed standard LSTM models as well as previous studies.
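The paper's implementation is not reproduced here, but the architecture the abstract describes can be sketched roughly as follows: a bi-directional LSTM encoder paired with an attention decoder that carries a residual connection around the attention context, so earlier encoder information is not lost over long decoding steps. This is a minimal PyTorch sketch under assumptions; the class names, dimensions, and the exact placement of the residual connection (approximating the "Recurrent Residual Attention" idea) are illustrative, not the authors' code.

# Minimal sketch (not the authors' released code) of a seq2seq summarizer
# with a BiLSTM encoder and an attention decoder that adds a residual
# connection around the attention context. All names, dimensions, and the
# residual placement are assumptions made for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMEncoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bi-directional LSTM: outputs concatenate forward/backward states.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)

    def forward(self, src):                       # src: (batch, src_len)
        outputs, _ = self.lstm(self.embed(src))
        return outputs                            # (batch, src_len, 2*hidden)

class ResidualAttentionDecoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm_cell = nn.LSTMCell(embed_dim + 2 * hidden_dim, hidden_dim)
        self.attn = nn.Linear(hidden_dim + 2 * hidden_dim, 1)
        self.out = nn.Linear(hidden_dim + 2 * hidden_dim, vocab_size)

    def step(self, y_prev, state, enc_out, ctx_prev):
        h, c = state
        # Additive attention scores over all encoder positions.
        src_len = enc_out.size(1)
        h_rep = h.unsqueeze(1).expand(-1, src_len, -1)
        scores = self.attn(torch.cat([h_rep, enc_out], dim=-1)).squeeze(-1)
        weights = F.softmax(scores, dim=-1)       # (batch, src_len)
        ctx = torch.bmm(weights.unsqueeze(1), enc_out).squeeze(1)
        # Residual connection: add the previous step's context so encoder
        # information persists across decoding steps (assumed mechanism).
        ctx = ctx + ctx_prev
        h, c = self.lstm_cell(torch.cat([self.embed(y_prev), ctx], dim=-1),
                              (h, c))
        logits = self.out(torch.cat([h, ctx], dim=-1))
        return logits, (h, c), ctx

# Usage: one decoding step on random token IDs.
enc = BiLSTMEncoder(vocab_size=10000)
dec = ResidualAttentionDecoder(vocab_size=10000)
src = torch.randint(0, 10000, (4, 20))
enc_out = enc(src)
h, c = torch.zeros(4, 256), torch.zeros(4, 256)
ctx = torch.zeros(4, 512)                         # initial residual context
y_prev = torch.zeros(4, dtype=torch.long)         # e.g. a start-of-sequence id
logits, (h, c), ctx = dec.step(y_prev, (h, c), enc_out, ctx)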