
Utilizing External Knowledge with Multi-granularity Attention for Review Reading Comprehension

2021 IEEE 33rd International Conference on Tools with Artificial Intelligence (ICTAI 2021)

Abstract
Machine Reading Comprehension (MRC) tasks are designed to enable machines to understand a given text and answer questions about it. Motivated by the recent success of MRC, the Review Reading Comprehension (RRC) task over product reviews was proposed, together with a corresponding dataset called ReviewRC. In this paper, to narrow the gap between human performance and RRC models, we first propose an enhanced post-training approach for BERT based on data augmentation. Specifically, we utilize external knowledge, including task-aware, domain-aware, and entity-relationship knowledge, to overcome the task and domain challenges of RRC. Prior knowledge is also leveraged to enrich the semantic representation with sentence-level information. We then put forward an end-to-end model named the Knowledge with Multi-granularity Attention Network (KMA-NET), in which multi-granularity attention is added to explore different levels of interactive information. Finally, experimental results on ReviewRC demonstrate that the proposed KMA-NET outperforms the baseline models.
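The abstract does not specify how the multi-granularity attention is computed; a common reading is that the question attends over the review at both the token level and the sentence level, and the two views are fused. The following is a minimal NumPy sketch under that assumption (the function names, mean-pooling of sentences, and concatenation-based fusion are illustrative choices, not the paper's definition):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(query, keys, values):
    # Scaled dot-product attention at one granularity level.
    scores = query @ keys.T / np.sqrt(keys.shape[-1])
    return softmax(scores, axis=-1) @ values

def multi_granularity_attention(question_tokens, review_tokens, sent_bounds):
    """Fuse token-level and sentence-level attention over a review.

    question_tokens: (Q, d) question token embeddings
    review_tokens:   (T, d) review token embeddings
    sent_bounds:     list of (start, end) token spans, one per review sentence
    """
    # Fine granularity: each question token attends over all review tokens.
    token_view = attention(question_tokens, review_tokens, review_tokens)

    # Coarse granularity: mean-pool each sentence, then attend over sentences
    # (this is where sentence-level information could enter the representation).
    sent_embs = np.stack([review_tokens[s:e].mean(axis=0) for s, e in sent_bounds])
    sent_view = attention(question_tokens, sent_embs, sent_embs)

    # Fuse the two views; concatenation is one simple fusion choice.
    return np.concatenate([token_view, sent_view], axis=-1)

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))    # 4 question tokens, embedding dim 8
r = rng.normal(size=(10, 8))   # 10 review tokens
fused = multi_granularity_attention(q, r, [(0, 5), (5, 10)])
print(fused.shape)  # (4, 16): token-level and sentence-level views concatenated
```

In a full model these embeddings would come from the post-trained BERT encoder, and the fused representation would feed the answer-span prediction layers.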
Key words
review reading comprehension, external knowledge, multi-granularity attention, end-to-end model