Fine-Tuning Self-Supervised Multilingual Sequence-To-Sequence Models for Extremely Low-Resource NMT

2021 Moratuwa Engineering Research Conference (MERCon), 2021

Cited by 2 | Views 3
Abstract
Neural Machine Translation (NMT) tends to perform poorly in low-resource language settings due to the scarcity of parallel data. Instead of relying on inadequate parallel corpora, we can take advantage of monolingual data, which is available in abundance. Training a denoising self-supervised multilingual sequence-to-sequence model by noising the available large-scale monolingual corpora is one way to utilize...
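To make the described approach concrete, the following is a minimal sketch of fine-tuning a pre-trained multilingual denoising sequence-to-sequence model on a small parallel corpus, using the Hugging Face Transformers mBART-50 checkpoint as a stand-in for the self-supervised model. The model name, language pair (English to Sinhala), toy sentence pairs, and hyperparameters are illustrative assumptions, not values taken from the paper.

```python
import torch
from torch.utils.data import DataLoader
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

# Assumed pre-trained denoising multilingual seq2seq model and language pair.
model_name = "facebook/mbart-large-50"
src_lang, tgt_lang = "en_XX", "si_LK"

tokenizer = MBart50TokenizerFast.from_pretrained(
    model_name, src_lang=src_lang, tgt_lang=tgt_lang
)
model = MBartForConditionalGeneration.from_pretrained(model_name)

# Toy parallel data standing in for the scarce low-resource corpus.
pairs = [
    ("Good morning.", "සුබ උදෑසනක්."),
    ("Thank you very much.", "බොහොම ස්තූතියි."),
]

def collate(batch):
    # Tokenize source sentences and target sentences; text_target fills "labels".
    srcs, tgts = zip(*batch)
    return tokenizer(
        list(srcs), text_target=list(tgts),
        padding=True, truncation=True, max_length=128, return_tensors="pt",
    )

loader = DataLoader(pairs, batch_size=2, shuffle=True, collate_fn=collate)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

model.train()
for epoch in range(3):  # a few epochs only; a tiny corpus overfits quickly
    for batch in loader:
        loss = model(**batch).loss  # cross-entropy over target tokens
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

# Translate with the fine-tuned model; force decoding to start in the target language.
model.eval()
inputs = tokenizer("How are you?", return_tensors="pt")
out = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id[tgt_lang],
    max_length=64,
)
print(tokenizer.batch_decode(out, skip_special_tokens=True))
```

In practice, fine-tuning on a realistic low-resource corpus would use a proper dataset loader, evaluation on a held-out set (e.g., BLEU), and careful learning-rate and early-stopping choices; the loop above only illustrates the overall pre-train-then-fine-tune workflow.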
Keywords
neural machine translation,pre-trained models,fine-tuning,denoising autoencoder,low-resource languages