Synchronous Syntactic Attention For Transformer Neural Machine Translation

ACL-IJCNLP 2021: The 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing: Proceedings of the Student Research Workshop (2021)

Abstract
This paper proposes a novel attention mechanism for Transformer Neural Machine Translation, "Synchronous Syntactic Attention," inspired by synchronous dependency grammars. The mechanism synchronizes source-side and target-side syntactic self-attention by minimizing the difference between the target-side self-attention and the source-side self-attention mapped through the encoder-decoder attention matrix. Experiments show that the proposed method improves translation performance on WMT14 En-De, WMT16 En-Ro, and ASPEC Ja-En (by up to +0.38 BLEU points).
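To make the synchronization idea concrete, below is a minimal PyTorch sketch of such an auxiliary loss, assuming one plausible reading of the abstract: the source-side self-attention matrix S is projected into target-side positions through the cross-attention matrix C as C S C^T and compared against the target-side self-attention. The function name, tensor shapes, and the use of mean squared error are illustrative assumptions, not the paper's exact formulation.

    import torch
    import torch.nn.functional as F

    def synchronization_loss(tgt_self_attn, src_self_attn, cross_attn):
        # tgt_self_attn: (batch, tgt_len, tgt_len) target-side self-attention
        # src_self_attn: (batch, src_len, src_len) source-side self-attention
        # cross_attn:    (batch, tgt_len, src_len) encoder-decoder attention
        #
        # Map the source-side self-attention into target-side positions
        # through the cross-attention matrix (assumed mapping: C S C^T).
        mapped = cross_attn @ src_self_attn @ cross_attn.transpose(-2, -1)
        # Penalize the difference between the target-side self-attention
        # and the mapped source-side self-attention (MSE is one choice;
        # the paper's exact distance measure may differ).
        return F.mse_loss(tgt_self_attn, mapped)

In training, an auxiliary term of this kind would typically be added to the standard cross-entropy objective with a weighting coefficient, e.g. loss = ce_loss + lam * synchronization_loss(T, S, C).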
Keywords
synchronous syntactic attention, translation