Contrastive Learning with Dialogue Attributes for Neural Dialogue Generation

ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2023

Abstract
Designing an effective learning method remains a challenge in neural dialogue generation, as the training objective must closely approximate the dialogue properties that humans prefer. Conventional training approaches such as maximum likelihood estimation focus on modeling general syntactic patterns and may fail to capture intricate conversational characteristics. Contrastive dialogue learning offers an effective training schema by explicitly training a neural dialogue model on multiple positive and negative conversational pairs. However, constructing contrastive learning pairs is non-trivial, and multiple dialogue attributes have been found to be crucial in governing human judgments of conversations. This paper proposes to guide response generation with attribute-aware contrastive learning to improve the overall quality of generated responses, where contrastive learning samples are constructed according to several important dialogue attributes, each specializing in a different principle of conversation. Extensive experiments show that the proposed techniques are crucial to achieving superior model performance.
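The paper's exact objective is not reproduced on this page. As a minimal sketch of the general idea, the snippet below implements a standard InfoNCE-style contrastive loss over fixed response embeddings: the context is pulled toward a positive response and pushed away from negative responses (which, in the paper's setting, would be constructed per dialogue attribute). The vectors, function names, and temperature value are illustrative assumptions, not the authors' implementation.

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors (plain Python lists).
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def contrastive_loss(context, positive, negatives, temperature=0.1):
    """InfoNCE-style loss: softmax cross-entropy over similarity scores,
    with the positive response at index 0. Lower loss means the positive
    response is scored well above the negatives."""
    scores = [cosine(context, positive) / temperature]
    scores += [cosine(context, neg) / temperature for neg in negatives]
    m = max(scores)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    return -math.log(exps[0] / sum(exps))

# Toy embeddings (hypothetical): an on-topic response lies near the
# context, an off-topic/generic one points away from it.
ctx = [1.0, 0.0]
on_topic = [0.9, 0.1]
off_topic = [-1.0, 0.2]

# Treating the on-topic response as the positive yields a smaller loss
# than treating the off-topic one as the positive.
low = contrastive_loss(ctx, on_topic, [off_topic])
high = contrastive_loss(ctx, off_topic, [on_topic])
```

In practice the embeddings would come from the dialogue model itself, and one such loss term could be computed per attribute (e.g. coherence, specificity), with attribute-specific negatives, then summed into the training objective.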
Key words
Dialogue Generation, Contrastive Learning, Conversational Attributes, Adversarial Perturbations