Multiple dependence representation of attention graph convolutional network relation extraction model

IET CYBER-PHYSICAL SYSTEMS: THEORY & APPLICATIONS (2023)

Abstract
Dependency analysis can help neural networks capture semantic features in sentences and thereby extract entity relations. Hard pruning and soft pruning strategies based on dependency-tree encodings have been proposed to balance beneficial additional information against adverse interference in extraction tasks. A new model based on graph convolutional networks is proposed; it uses several representations that describe the dependency tree from different perspectives and combines them to obtain a better sentence representation for relation classification. A newly defined module uses an attention mechanism to capture deeper semantic features from the context representation as the global semantic features of the input text, helping the model capture deeper sentence-level semantic information for the relation extraction task. To obtain more information about a given entity pair from the input sentence, the authors also model implicit co-references (mentions referring) to the entities. The model can thus extract, to the greatest extent possible, the semantic features relevant to the relationship between entities. Experimental results show that the model achieves good results on the SemEval-2010 Task 8 and KBP37 datasets.
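To make the two main ingredients named in the abstract concrete, the following is a minimal sketch, not the authors' implementation: a graph convolution over a dependency-tree adjacency matrix plus an attention-pooling step that builds a global sentence feature for relation classification. All class names, dimensions, and the normalisation choice are illustrative assumptions.

```python
# Sketch of (1) a GCN layer over dependency-tree adjacency and
# (2) attention pooling for a global sentence representation.
# Assumed shapes: h (batch, seq_len, hidden_dim), adj (batch, seq_len, seq_len).
import torch
import torch.nn as nn


class DependencyGCNLayer(nn.Module):
    """One graph convolution over a dependency adjacency matrix (with self-loops)."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.linear = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        degree = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        agg = torch.bmm(adj, h) / degree          # average over dependency neighbours
        return torch.relu(self.linear(agg))


class AttentionPooling(nn.Module):
    """Attention over token states, yielding a global semantic feature of the sentence."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, h: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        scores = self.score(h).squeeze(-1)                      # (batch, seq_len)
        scores = scores.masked_fill(mask == 0, float("-inf"))   # ignore padding tokens
        alpha = torch.softmax(scores, dim=-1)
        return torch.bmm(alpha.unsqueeze(1), h).squeeze(1)      # (batch, hidden_dim)
```

In the spirit of the paper, the pooled global feature would be concatenated with entity-span representations drawn from the GCN output before the final relation classifier; that combination step is omitted here.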
Keywords
complex networks, data analysis, information networks