
Enhanced Simple Question Answering with Contrastive Learning

Knowledge Science, Engineering and Management (KSEM), 2022

Abstract
Answering natural language questions over knowledge bases (KBQA) has attracted wide attention. Several techniques have been developed for answering simple questions, and they mostly rely on deep networks that cast relation prediction as classification. Contrastive learning has recently shown its power in improving classification performance, yet most prior techniques do not benefit from it. In light of this, we propose a novel approach to answering simple questions over knowledge bases. Our approach has two key features. (1) It leverages pre-trained transformers to achieve better performance on entity linking. (2) It employs a contrastive-learning-based model for relation prediction. We experimentally verify the performance of our approach and show that it achieves an accuracy of 83.54% on a typical benchmark dataset, beating existing state-of-the-art techniques; we also conduct an in-depth analysis to show the advantages of our technique, especially its sub-modules.
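The second component named in the abstract, a contrastive-learning-based relation predictor, can be illustrated with a minimal sketch. This is not the authors' implementation: the `ContrastiveRelationScorer` class, the `EmbeddingBag` placeholder encoders, and the in-batch InfoNCE objective are assumptions standing in for the paper's pre-trained transformer encoders and its specific contrastive loss; they only show the general idea of scoring question-relation pairs in a shared embedding space.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ContrastiveRelationScorer(nn.Module):
    """Scores (question, relation) pairs in a shared embedding space.

    Hypothetical sketch: the encoders below are simple placeholders;
    a real system would use pre-trained transformer encoders instead.
    """

    def __init__(self, vocab_size=30522, dim=256, temperature=0.07):
        super().__init__()
        # Placeholder encoders that average token embeddings.
        self.question_encoder = nn.EmbeddingBag(vocab_size, dim)
        self.relation_encoder = nn.EmbeddingBag(vocab_size, dim)
        self.temperature = temperature

    def forward(self, question_ids, relation_ids):
        # L2-normalized embeddings so the dot product is a cosine similarity.
        q = F.normalize(self.question_encoder(question_ids), dim=-1)
        r = F.normalize(self.relation_encoder(relation_ids), dim=-1)
        # Similarity of every question to every relation in the batch.
        logits = q @ r.t() / self.temperature
        # In-batch negatives: the i-th question's gold relation is row i.
        targets = torch.arange(logits.size(0), device=logits.device)
        return F.cross_entropy(logits, targets)


# Toy usage with random token ids for 4 questions and their gold relations.
model = ContrastiveRelationScorer()
questions = torch.randint(0, 30522, (4, 12))
relations = torch.randint(0, 30522, (4, 5))
loss = model(questions, relations)
loss.backward()
print(float(loss))
```

At inference time, under the same assumptions, one would embed the question once and rank all candidate relations of the linked entity by cosine similarity, predicting the highest-scoring relation.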
Keywords
Knowledge base, Question answering, Contrastive learning, Transfer learning, Pre-trained model