Multi-task learning with graph attention networks for multi-domain task-oriented dialogue systems

Knowledge-Based Systems (2023)

Abstract
A task-oriented dialogue system (TOD) is an important application of artificial intelligence. In recent years, work on multi-domain TODs has attracted increasing research attention and made substantial progress. A central challenge for such dialogue systems is handling cross-domain slot sharing and dialogue act temporal planning. However, existing studies seldom consider a model's ability to reason over the dialogue history; moreover, existing methods overlook the structural information of the ontology schema, which makes them inadequate for multi-domain TODs. In this paper, we present a multi-task learning framework equipped with graph attention networks (GATs) to address these two challenges. In our method, we build a dialogue state GAT consisting of a dialogue context subgraph and an ontology schema subgraph to alleviate the cross-domain slot sharing issue. We further construct a GAT-enhanced memory network over the updated nodes of the ontology subgraph, filtering out irrelevant nodes to obtain the required dialogue states. For dialogue act temporal planning, a similar GAT and corresponding memory network are proposed to obtain fine-grained dialogue act representations. Moreover, we design an entity detection task to improve the soft gate, which determines whether generated tokens come from the vocabulary or the knowledge base. In the training phase, the four training tasks are combined and optimized jointly to facilitate response generation. Experimental results from automatic and human evaluations show that the proposed model outperforms state-of-the-art models on the MultiWOZ 2.0 and MultiWOZ 2.1 datasets.
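The abstract's core building block is graph attention: nodes (e.g., dialogue context tokens and ontology schema entries) attend to their neighbors with learned weights. The following is a minimal NumPy sketch of a single generic GAT layer, not the authors' actual architecture; the function name, dense-loop implementation, and random parameters are illustrative assumptions for clarity.

```python
import numpy as np

def gat_layer(h, adj, W, a, alpha=0.2):
    """Single generic graph-attention layer (illustrative sketch only).

    h:   (N, F)  node features
    adj: (N, N)  adjacency matrix with self-loops (nonzero = edge)
    W:   (F, F') shared linear transform
    a:   (2*F',) attention parameter vector
    """
    z = h @ W                      # transformed node features, shape (N, F')
    n = z.shape[0]
    e = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # attention logit e_ij = LeakyReLU(a^T [z_i || z_j])
            s = a @ np.concatenate([z[i], z[j]])
            e[i, j] = s if s > 0 else alpha * s
    e = np.where(adj > 0, e, -1e9)             # mask non-edges before softmax
    e = e - e.max(axis=1, keepdims=True)       # numerically stable softmax
    att = np.exp(e) / np.exp(e).sum(axis=1, keepdims=True)
    return att @ z                 # neighbor-aggregated features, shape (N, F')
```

In a multi-domain setting, connecting slot nodes that recur across domains lets attention propagate information between them, which is the intuition behind using a GAT to alleviate cross-domain slot sharing.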
Keywords
Task-oriented dialogue system, Natural response generation, Graph attention network, Memory network