Direction-sensitive relation extraction using Bi-SDP attention model

Knowledge-Based Systems (2020)

Abstract
Relation extraction is a crucial task in natural language processing (NLP), playing a key role in question answering, web search, information retrieval, and related applications. Previous research on this task has verified the effectiveness of attention mechanisms, shortest dependency paths (SDP), and LSTMs. However, most of these methods focus on learning a semantic representation of the whole sentence, highlighting the importance of particular words, or pruning the sentence with the SDP. They ignore the loss of information these methods incur, such as the dependency relation of each word and the prepositions that indicate the direction of a relation. In addition, SDP-based approaches are prone to over-pruning. Based on these observations, this paper presents a framework with a Bi-directional SDP (Bi-SDP) attention mechanism to tackle these challenges. Bi-SDP is a novel representation of the SDP, comprising the original SDP and its reverse. The attention mechanism, built on Bi-SDP, applies parallel word-level attention to capture relational semantic words and directional words. Furthermore, we explore a novel pruning strategy that simultaneously reduces the length of the input instance and the number of RNN cells. Experiments are conducted on two datasets: the SemEval-2010 Task 8 dataset and the KBP37 dataset. Compared with previously published models, our method achieves competitive performance on the SemEval-2010 Task 8 dataset and outperforms existing models on the KBP37 dataset. Our experimental results also show that directional prepositions in sentences are useful for relation extraction and can improve performance on relations with an apparent physical direction.
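The abstract describes Bi-SDP as the original shortest dependency path combined with its reverse. A minimal sketch of one plausible reading of that construction follows; the concatenated form and the example token path are assumptions for illustration, not the paper's exact formulation.

```python
def build_bi_sdp(sdp):
    """Return a Bi-SDP-style sequence: the original shortest dependency
    path followed by its reverse (one plausible interpretation of the
    representation described in the abstract)."""
    return sdp + sdp[::-1]

# Hypothetical SDP between two entities, e.g. for the sentence
# "The box is in the drawer": box -> in -> drawer.
sdp = ["box", "in", "drawer"]
bi_sdp = build_bi_sdp(sdp)
# bi_sdp == ["box", "in", "drawer", "drawer", "in", "box"]
```

The reversed copy exposes the path in both directions, which is how a word-level attention layer could weight direction-indicating words (such as the preposition "in") differently depending on which entity is treated as the head.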
Keywords
Relation extraction, Shortest dependency path, Recurrent neural network, Self-attention