An Empirical Study of Unsupervised Neural Machine Translation: Analyzing NMT Output, Model's Behavior and Sentences' Contribution
CoRR (2023)
Abstract
Unsupervised Neural Machine Translation (UNMT) focuses on improving NMT
results under the assumption that no human-translated parallel data is
available, yet little work has so far highlighted its advantages over
supervised methods or analyzed its output in aspects other than translation
accuracy. We focus on three very diverse languages, French, Gujarati, and
Kazakh, and train bilingual NMT models, to and from English, with various
levels of supervision in high- and low-resource setups. We measure the quality
of the NMT output and compare the generated sequences' word order and semantic
similarity to the source and reference sentences. We also use Layer-wise
Relevance Propagation to evaluate the source and target sentences' contribution
to the result, extending the findings of previous works to the UNMT paradigm.