Improving Persona Understanding for Persona-based Dialogue Generation with Diverse Knowledge Selection.

ICPR (2022)

Abstract
A significant goal of an open-domain dialogue system is to make chatbots generate persona-coherent responses given a context. To achieve this goal, some researchers have attempted to introduce persona information into neural dialogue models. However, these models describe persona traits excessively during the conversation and still suffer from generating boring and meaningless responses. In this paper, we divide the open-domain personalized dialogue generation task into two processes: persona recognition and persona fusion. In the persona recognition process, we use a pre-trained model to encode the personas and the conversation history independently, which benefits the subsequent fusion of persona information. We then design a dynamic persona fusion mechanism that effectively mines the relevance between the dialogue context and the persona information and dynamically predicts whether to incorporate persona features at each step of the dialogue. Our model outperforms baseline models by 1.23% in Acc., 0.36% in BLEU, 0.92% in F1, and 0.036% in Distinct. The experimental results on the ConvAI2 dataset show that the proposed model is superior to baseline approaches in generating more coherent and persona-consistent responses.
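The abstract does not give implementation details of the dynamic persona fusion mechanism, but the idea of scoring the relevance between the encoded context and the encoded persona, and gating how much persona information enters the decoder, can be sketched as below. This is a minimal illustration under assumed design choices (a bilinear relevance scorer, 768-dimensional encodings, module names such as `DynamicPersonaFusion`); it is not the authors' implementation.

```python
# Sketch of a dynamic persona fusion gate: score the relevance between the
# encoded dialogue context and the encoded persona, then scale the persona
# features by that score before fusing them with the context.
# All module names and dimensions are illustrative assumptions.
import torch
import torch.nn as nn


class DynamicPersonaFusion(nn.Module):
    """Gate that decides how much persona information to inject."""

    def __init__(self, hidden_size: int = 768):
        super().__init__()
        # Bilinear relevance score between context and persona encodings.
        self.relevance = nn.Bilinear(hidden_size, hidden_size, 1)
        # Projection for the fused representation fed to the decoder.
        self.fuse = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, context_vec: torch.Tensor, persona_vec: torch.Tensor):
        # gate in (0, 1): predicted probability that the persona is
        # relevant to the current dialogue context.
        gate = torch.sigmoid(self.relevance(context_vec, persona_vec))
        # Scale persona features by the gate before fusing with the context.
        gated_persona = gate * persona_vec
        fused = torch.tanh(
            self.fuse(torch.cat([context_vec, gated_persona], dim=-1))
        )
        return fused, gate


if __name__ == "__main__":
    fusion = DynamicPersonaFusion(hidden_size=768)
    context_vec = torch.randn(4, 768)  # batch of encoded dialogue contexts
    persona_vec = torch.randn(4, 768)  # batch of encoded persona profiles
    fused, gate = fusion(context_vec, persona_vec)
    print(fused.shape, gate.squeeze(-1))
```

In such a design, a gate near zero lets the model fall back to purely context-driven generation, which is one way to avoid over-describing persona traits when they are irrelevant to the current turn.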
Key words
dialogue generation, persona understanding, diverse knowledge selection, persona-based