Integrating Syntactic and Semantic Knowledge in AMR Parsing with Heterogeneous Graph Attention Network

ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2023

Abstract
Abstract Meaning Representation (AMR) parsing is the task of translating a sentence into an AMR semantic graph that captures the basic meaning of the sentence, and it has recently been empowered by pre-trained Transformer models. These models encode syntactic and semantic knowledge implicitly through self-supervised pre-training tasks. We argue that encoding syntactic and semantic knowledge explicitly is beneficial to AMR parsing and can improve data efficiency. Specifically, syntactic dependency and semantic role labeling (SRL) structures share similar sub-structures with AMR. In this work, we propose a novel linguistic knowledge-enhanced AMR parsing model, which augments the pre-trained Transformer with the syntactic dependency and semantic role labeling structures of sentences. By applying a heterogeneous graph attention network, we obtain syntactically and semantically augmented word representations, which are then integrated using an attentive integration layer and a gating mechanism. Experimental results show that our model achieves state-of-the-art performance on several benchmarks, especially in out-of-domain and low-resource scenarios.
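The gating mechanism described above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's implementation: it uses scalar gate parameters (`w_syn`, `w_sem`, `b`) instead of learned weight matrices, and fuses a syntactically augmented vector with a semantically augmented one element-wise.

```python
import math


def sigmoid(x):
    """Logistic function used to squash the gate score into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))


def gated_fusion(h_syn, h_sem, w_syn=0.0, w_sem=0.0, b=0.0):
    """Fuse a syntax-augmented and a semantics-augmented word vector.

    For each dimension i, a gate g = sigmoid(w_syn * h_syn[i] +
    w_sem * h_sem[i] + b) decides how much of each view to keep:
    fused[i] = g * h_syn[i] + (1 - g) * h_sem[i].
    (Scalar parameters stand in for the learned projections a real
    model would use.)
    """
    fused = []
    for hs, he in zip(h_syn, h_sem):
        g = sigmoid(w_syn * hs + w_sem * he + b)
        fused.append(g * hs + (1.0 - g) * he)
    return fused


# With zero-initialized gate parameters the gate is 0.5 everywhere,
# so the fusion is a simple average of the two views.
print(gated_fusion([1.0, 0.0], [0.0, 1.0]))  # [0.5, 0.5]
```

In the full model, the two input vectors would come from the heterogeneous graph attention network run over the dependency and SRL graphs, and the gate would be computed from learned linear projections of both vectors rather than fixed scalars.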
Keywords
AMR parsing,heterogeneous graph attention network,syntactic dependency,semantic role labeling