Question Answering With Hierarchical Attention Networks

2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)(2019)

Abstract
We investigate hierarchical attention networks for the task of question answering. For this purpose, we propose two different approaches: in the first, a document vector representation is built hierarchically from word-to-sentence level which is then used to infer the right answer. In the second, pointer sum attention is utilized to directly infer an answer from the attention values of the word and sentence representations. We evaluate our approach on the Children's Book Test, a cloze-style question answering dataset, and analyze the generated attention distributions. Our results show that, although a hierarchical approach does not offer much improvement over a shallow baseline, it does indeed offer a large performance boost when combining word and sentence attention with pointer sum attention.
Keywords
hierarchical attention networks, recurrent neural networks, pointer sum attention, question answering