Memory-Aware Attentive Control for Community Question Answering With Knowledge-Based Dual Refinement

IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS(2023)

Abstract
Open-domain question answering systems enable a machine to automatically select and generate answers to questions posed by humans in natural language on the web. Previous approaches seek effective ways of extracting semantic features between a question and an answer, but the contribution of contextual information to semantic matching is still limited by short-term memory. As an alternative, we propose an internal knowledge-based end-to-end model, enhanced by an attentive memory network, for both answer selection and answer generation that takes full advantage of semantics and multiple facets (i.e., timescales, topics, and context). Specifically, we design a long-term memory to learn the top-$k$ fine-grained similarity representations, where two memory-aware mechanisms aggregate the series of word-level and sentence-level semantic similarities to support coarse contextual information. Furthermore, we propose a novel memory refinement mechanism with two-dimensional writing heads that offers an efficient approach to multiview selection of salient word pairs. In the training stage, we adopt transformer-based transfer learning to effectively pretrain the model. Experimentally, we compare against state-of-the-art approaches on four public datasets; the results show that the proposed model achieves competitive performance.
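The core idea of the abstract's long-term memory can be illustrated with a minimal sketch: for each question token, keep only its top-$k$ strongest similarities against the answer tokens, then pool them with attention weights. The function names (`topk_similarity_memory`, `attentive_aggregate`) and the use of plain cosine similarity with NumPy are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def topk_similarity_memory(q_emb, a_emb, k=3):
    """Illustrative top-k word-level similarity memory (hypothetical).

    q_emb: (m, d) question token embeddings
    a_emb: (n, d) answer token embeddings
    Returns an (m, k) memory holding each question token's k largest
    cosine similarities against the answer tokens, sorted descending.
    """
    # Normalize rows so the dot product is a cosine similarity
    qn = q_emb / np.linalg.norm(q_emb, axis=1, keepdims=True)
    an = a_emb / np.linalg.norm(a_emb, axis=1, keepdims=True)
    sim = qn @ an.T                      # (m, n) similarity matrix
    # Keep the k largest similarities per question token
    return -np.sort(-sim, axis=1)[:, :k]

def attentive_aggregate(topk):
    """Softmax-weighted pooling of the top-k similarities into one
    sentence-level matching score per question token."""
    w = np.exp(topk)
    w = w / w.sum(axis=1, keepdims=True)  # attention weights over the k slots
    return (w * topk).sum(axis=1)         # (m,) aggregated scores
```

In the paper's full model this aggregation is learned and operates at both word and sentence level; the sketch only shows the top-$k$ selection plus attentive pooling pattern.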
Key words
Semantics, Task analysis, Knowledge based systems, Transformers, Memory management, Context modeling, Bit error rate, Attention mechanism, distributed memories, information retrieval, knowledge-based systems, memory architecture