
Part-of-Speech and Position Attention Mechanism Based BLSTM for Question Answering System

2018 International Conference on Image and Video Processing, and Artificial Intelligence (2018)

Abstract
Attention-based bidirectional long short-term memory (BLSTM) networks have attracted increasing interest and are widely used in Natural Language Processing tasks. Motivated by the performance of the attention mechanism, various attentive models have been proposed to improve the effectiveness of question answering. However, few studies have focused on the impact of positional information on question answering, even though it has proved effective in information retrieval. In this paper, we assume that if a word appears in both the question sentence and the answer sentence, the words close to it should receive more attention, since they are more likely to contain valuable information for the question. Moreover, few studies have considered part-of-speech in question answering. We argue that words other than nouns, verbs, and pronouns tend to carry less useful information, so their positional impact can be neglected. Based on these two assumptions, we propose a part-of-speech and position attention mechanism based bidirectional long short-term memory network for question answering, abbreviated DPOS-ATT-BLSTM, which cooperates with the traditional attention mechanism to obtain attentive answer representations. We experiment on a Chinese medical dataset collected from http://www.xywy.com/ and http://www.haodf.com/, and conduct comparative experiments against methods based on the traditional attention mechanism. The experimental results demonstrate the good performance and efficiency of our proposed model.
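The two assumptions above can be illustrated with a minimal sketch: answer words near a word that also appears in the question receive larger positional weights, and only nouns, verbs, and pronouns spread positional influence. The function name, POS tag set, and exponential decay are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch of position/POS-based weighting for answer tokens.
# Tokens shared between question and answer (if tagged as a content POS)
# spread influence to their neighbours, decaying with distance.

CONTENT_POS = {"NOUN", "VERB", "PRON"}  # assumed set of content POS tags

def position_weights(question, answer, answer_pos, decay=0.5):
    """Return one normalized positional weight per answer token.

    question, answer: lists of tokens; answer_pos: POS tag per answer token.
    """
    shared = [i for i, (w, p) in enumerate(zip(answer, answer_pos))
              if w in set(question) and p in CONTENT_POS]
    weights = []
    for j in range(len(answer)):
        # sum of decayed contributions from every shared-word position
        weights.append(sum(decay ** abs(j - i) for i in shared))
    total = sum(weights) or 1.0  # avoid division by zero if no overlap
    return [w / total for w in weights]
```

In a full model, such weights would be combined with the attention scores produced by the BLSTM rather than used on their own.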
Keywords
question answering, bidirectional long short-term memory network, positional information, part-of-speech attention mechanism