Slot Lost in Translation? Not Anymore: A Machine Translation Model for Virtual Assistants with Type-Independent Slot Transfer.

IWSSIP (2023)

Abstract
In this article, we present a machine translation model adapted to the domain of intelligent virtual assistants (IVA) that can be used to translate training and evaluation resources. Our model is capable of transferring natural language understanding (NLU) slots between the source and target language. Slots can be annotated with a simple XML-like format, and no specific slot types are expected, which makes our model task-independent. We evaluate translation quality with the BLEU metric and slot transfer with F1-score, both on a dataset prepared specifically for building this model and on the WMT dataset. We report an improvement of 17.21 BLEU in the IVA domain over the baseline M2M100 model, and a slot F1 of 65.47% for sentences with multiple slot types and 87.54% for sentences with a single slot type. To analyze the quality of our model, we trained NLU models from translated training resources and compared them with models trained on the original data. The results indicate that our model is particularly well suited to translating training resources for intent detection models. We released an English-to-Polish translation model, along with tools for model training and evaluation. We also created a corpus for the community to foster further research on multilingual IVA.
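The abstract does not specify the exact markup or scoring script, but a minimal sketch of what XML-like slot annotation and slot-transfer F1 scoring could look like follows. The tag names, example sentences, and helper functions (extract_slots, slot_f1) are illustrative assumptions, not the authors' released tooling.

```python
import re
from collections import Counter

# XML-like slot annotation: any tag name is accepted, so scoring stays type-independent.
SLOT_RE = re.compile(r"<(\w+)>(.*?)</\1>")

def extract_slots(sentence: str) -> Counter:
    """Return a multiset of (slot_type, slot_value) pairs found in the sentence."""
    return Counter((tag, value.strip()) for tag, value in SLOT_RE.findall(sentence))

def slot_f1(reference: str, hypothesis: str) -> float:
    """Micro F1 between reference and hypothesis slot annotations."""
    ref, hyp = extract_slots(reference), extract_slots(hypothesis)
    tp = sum((ref & hyp).values())          # slots transferred with matching type and value
    if tp == 0:
        return 0.0
    precision = tp / sum(hyp.values())
    recall = tp / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Hypothetical English source and Polish reference/hypothesis with one transferred slot.
src = "Set an alarm for <time>seven in the morning</time>"
ref = "Ustaw alarm na <time>siódmą rano</time>"
hyp = "Ustaw alarm na <time>siódmą rano</time>"
print(slot_f1(ref, hyp))  # 1.0 when the slot span survives translation intact
```

In this sketch a slot counts as correctly transferred only if both its type and its translated value match the reference, which is one plausible reading of how per-sentence slot F1 could be computed.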
Keywords
machine translation,virtual assistant,multilingual natural language understanding