
Chinese Text Classification Method Based on BERT Word Embedding

Ziniu Wang, Zhilin Huang, Jianling Gao

2020 5th International Conference on Mathematics and Artificial Intelligence (ICMAI 2020)

Abstract
In this paper, we enhance the semantic representation of words through the BERT pre-trained language model, which dynamically generates a semantic vector for each character according to its context; the character vectors are then fed as a character-level word-vector sequence into a capsule network (CapsNet). We built a BiGRU module into the capsule network for text feature extraction and introduced an attention mechanism to focus on key information. For our experiments we used Baidu's Chinese question-and-answer corpus, taking only the question types as classification labels. As comparative baselines, we evaluated the BERT network and the CapsNet separately. The experimental results show that the combined model outperforms either model used alone.
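The attention mechanism described above weights the BiGRU hidden states so the classifier can focus on informative characters. A minimal numpy sketch of such additive attention pooling is shown below; the weight matrix `W`, context vector `u`, and the dimensions are illustrative assumptions, not values from the paper.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, W, u):
    # H: (T, d) sequence of BiGRU hidden states (hypothetical values)
    # score for step t: u^T tanh(W^T h_t) -- additive attention
    M = np.tanh(H @ W)          # (T, d)
    scores = M @ u              # (T,)
    alpha = softmax(scores)     # attention weights, sum to 1
    ctx = alpha @ H             # (d,) weighted context vector
    return ctx, alpha

# toy example: 5 time steps, hidden size 8
rng = np.random.default_rng(0)
T, d = 5, 8
H = rng.normal(size=(T, d))
W = rng.normal(size=(d, d))   # learnable in a real model
u = rng.normal(size=d)        # learnable in a real model
ctx, alpha = attention_pool(H, W, u)
```

In the full model, `ctx` (or the re-weighted hidden states) would be passed on to the capsule layers for classification; here it simply illustrates how attention condenses a variable-length sequence into a fixed-size vector.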
Key words
Text Classification, BERT, CapsNet, Word Embedding, BiGRU, Attention Mechanism