
An Interdisciplinary Literature Classifier Based on Multi-task Multi-label Learning

Lianxi Wang, Zhuolin Chen, Nankai Lin, Xixuan Huang

2021 International Conference on Asian Language Processing (IALP)

Abstract
Interdisciplinary integration is one of the driving forces of scientific innovation and development. To improve the classification of interdisciplinary literature, this paper adopts a multi-task learning method to jointly learn interdisciplinary literature categories that share similar research topics. To address the imbalanced and intersecting category distribution of literature in the field of Library and Information Science, this paper proposes a multi-task learning framework for interdisciplinary literature classification. The framework is based on BERT and improves classification performance on minority categories by introducing a machine reading comprehension task that predicts the positions of keywords in titles and abstracts. The results show that the multi-task learning method is more effective than decision trees, support vector machines, convolutional neural networks, recurrent neural networks, and pre-trained models. In addition, compared with a cost-sensitive method, the proposed method is more helpful for minority classes, and its Macro-F1 value reaches 74.84%.
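The sketch below illustrates the general idea described in the abstract: a shared BERT encoder with a multi-label classification head over literature categories and an MRC-style span head that predicts keyword start/end positions in the title and abstract. It is a minimal sketch assuming PyTorch and Hugging Face Transformers; the checkpoint name, head structure, and loss weighting are illustrative assumptions, not the authors' exact setup.

```python
# Minimal multi-task sketch: shared BERT encoder, a multi-label category head,
# and an MRC-style keyword span head. Hyperparameters are illustrative.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast


class MultiTaskLiteratureClassifier(nn.Module):
    def __init__(self, num_labels: int, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Task 1: multi-label category classification from the pooled [CLS] vector.
        self.classifier = nn.Linear(hidden, num_labels)
        # Task 2: MRC-style keyword span prediction (start/end logits per token).
        self.span_head = nn.Linear(hidden, 2)

    def forward(self, input_ids, attention_mask,
                labels=None, start_positions=None, end_positions=None,
                span_loss_weight: float = 0.5):
        outputs = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls_logits = self.classifier(outputs.pooler_output)  # (batch, num_labels)
        start_logits, end_logits = self.span_head(outputs.last_hidden_state).split(1, dim=-1)
        start_logits, end_logits = start_logits.squeeze(-1), end_logits.squeeze(-1)

        loss = None
        if labels is not None:
            # Multi-label classification: one sigmoid/BCE term per category.
            loss = nn.BCEWithLogitsLoss()(cls_logits, labels.float())
        if start_positions is not None and end_positions is not None:
            ce = nn.CrossEntropyLoss()
            span_loss = (ce(start_logits, start_positions) + ce(end_logits, end_positions)) / 2
            loss = span_loss_weight * span_loss if loss is None else loss + span_loss_weight * span_loss
        return {"loss": loss, "logits": cls_logits,
                "start_logits": start_logits, "end_logits": end_logits}


if __name__ == "__main__":
    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = MultiTaskLiteratureClassifier(num_labels=5)
    enc = tokenizer("Title and abstract text of an interdisciplinary paper.",
                    return_tensors="pt", truncation=True)
    labels = torch.tensor([[1., 0., 1., 0., 0.]])        # toy multi-label target
    starts, ends = torch.tensor([1]), torch.tensor([3])  # toy keyword span positions
    out = model(enc["input_ids"], enc["attention_mask"],
                labels=labels, start_positions=starts, end_positions=ends)
    print(out["loss"])
```

Jointly minimizing the classification and span losses lets the shared encoder benefit from keyword supervision, which is the mechanism the paper credits for the gains on minority categories.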
Keywords
Interdisciplinary Literature Classification, Multi-task Learning, Pre-trained Model