Faster and More Robust Low-Resource Nearest Neighbor Machine Translation

NLPCC (2) (2023)

Abstract
Transformer-based neural machine translation (NMT) models have achieved near-human performance on some languages, but they still suffer from poor interpretability and scalability. Many recent studies enhance a model's translation ability by building an external memory module and retrieving from it during decoding; however, this improves performance at the cost of robustness and decoding efficiency, especially on low-resource translation tasks. In this paper, we propose a confidence-based gating mechanism to optimize decoding efficiency: a sub-network estimates the confidence of the model's own translation and decides whether the current step needs to retrieve from the memory module. Reducing the number of retrievals improves translation speed while degrading translation quality as little as possible. In addition, we use a nonparametric, dynamic Monte Carlo-based algorithm to fuse retrieval probabilities with model predictions, improving the generalization and robustness of the model. Extensive experiments on different datasets demonstrate the effectiveness of our method.
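The abstract gives no implementation details, so the following is only a minimal Python sketch of the general idea behind confidence-gated kNN retrieval in NMT decoding. All names here (Datastore, confidence_gate, gated_step) and the fixed threshold and interpolation weight are illustrative assumptions; the paper trains a sub-network for the gate and uses a Monte Carlo procedure for the fusion weight.

```python
import numpy as np

class Datastore:
    """Toy key-value memory: decoder hidden states -> target token ids.
    A real kNN-MT datastore would use an ANN index (e.g. FAISS)."""

    def __init__(self, keys, values):
        self.keys = keys      # (N, d) array of context representations
        self.values = values  # (N,) array of target token ids

    def knn_distribution(self, query, k, vocab_size, temperature=10.0):
        # L2 distances to all stored keys (brute force, for clarity only).
        dists = np.linalg.norm(self.keys - query, axis=1)
        idx = np.argsort(dists)[:k]
        # Softmax over negative distances of the k nearest neighbors.
        weights = np.exp(-dists[idx] / temperature)
        weights /= weights.sum()
        # Scatter neighbor weights onto their target tokens.
        p_knn = np.zeros(vocab_size)
        for w, v in zip(weights, self.values[idx]):
            p_knn[v] += w
        return p_knn

def confidence_gate(p_model, tau=0.6):
    # Stand-in gate: trust the NMT model when its top probability is high.
    # The paper instead learns this decision with a trained sub-network;
    # a fixed threshold on the max probability is only a proxy.
    return p_model.max() >= tau

def gated_step(p_model, query, store, k=8, lam=0.3):
    """One decoding step: skip retrieval when the model is confident."""
    if confidence_gate(p_model):
        return p_model  # no datastore lookup -> faster decoding
    p_knn = store.knn_distribution(query, k, vocab_size=len(p_model))
    # Linear interpolation of retrieval and model distributions; the
    # paper replaces this fixed lam with a Monte Carlo-based estimate.
    return lam * p_knn + (1.0 - lam) * p_model
```

A quick usage sketch with random stand-in data: build a datastore of 1,000 hypothetical hidden states, then fuse one decoding step's distribution.

```python
rng = np.random.default_rng(0)
store = Datastore(rng.normal(size=(1000, 16)), rng.integers(0, 32, size=1000))
p_model = rng.dirichlet(np.ones(32))       # model's next-token distribution
p_final = gated_step(p_model, rng.normal(size=16), store)
```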
Key words
translation, low-resource