
Prerequisite Learning with Pre-trained Language and Graph Embedding Models

NLPCC (2021)

Abstract
Prerequisite learning is the task of automatically identifying prerequisite relations between concepts. This paper proposes a new prerequisite learning approach based on a pre-trained language model and a graph embedding model. In our approach, the pre-trained language model BERT is fine-tuned to encode latent features from concept descriptions; the graph embedding model Node2Vec is first pre-trained on the citation graph of concepts and then fine-tuned to generate discriminative features for prerequisite learning. The two models are jointly optimized to obtain latent features that capture both the textual and the structural information of concepts. Experiments on manually annotated datasets show that our proposed approach achieves better results than state-of-the-art prerequisite learning approaches.
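To make the described architecture concrete, below is a minimal PyTorch sketch of how a BERT text encoder and Node2Vec-initialized node embeddings could be combined and jointly optimized for concept-pair classification. All specifics beyond the abstract's description — the bert-base-uncased checkpoint, the 128-dimensional node embeddings, the concatenation-based fusion, and the linear classifier head — are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of a joint BERT + Node2Vec prerequisite classifier.
# Details beyond the abstract (dimensions, fusion by concatenation,
# the classifier head) are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import BertModel


class PrerequisiteClassifier(nn.Module):
    def __init__(self, num_concepts: int, node_dim: int = 128):
        super().__init__()
        # Textual encoder: pre-trained BERT, fine-tuned end to end on
        # concept descriptions.
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # Structural encoder: an embedding table meant to be initialized
        # from vectors pre-trained with Node2Vec on the concept citation
        # graph, then fine-tuned jointly with BERT.
        self.node_emb = nn.Embedding(num_concepts, node_dim)
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        # Pair features: pooled [CLS] vector of each concept's description
        # plus each concept's graph embedding, concatenated.
        self.classifier = nn.Linear(2 * hidden + 2 * node_dim, 2)

    def forward(self, ids_a, mask_a, ids_b, mask_b, node_a, node_b):
        # Encode the two concept descriptions with the shared BERT encoder.
        h_a = self.bert(input_ids=ids_a, attention_mask=mask_a).pooler_output
        h_b = self.bert(input_ids=ids_b, attention_mask=mask_b).pooler_output
        # Look up the graph embeddings of the two concepts.
        g_a = self.node_emb(node_a)
        g_b = self.node_emb(node_b)
        # Joint optimization: a single loss backpropagates into both encoders.
        return self.classifier(torch.cat([h_a, h_b, g_a, g_b], dim=-1))
```

In this sketch the Node2Vec pre-training would run separately (e.g. with the node2vec package or gensim), after which the learned vectors are copied into the embedding table with `model.node_emb.weight.data.copy_(...)` before joint fine-tuning begins.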
Keywords
learning, language, graph, pre-trained