Chinese named entity recognition based on adaptive transformer

Yan Yang, Guozhe Yin

MLNLP (2022)

Abstract
In recent years, lexical enhancement methods have been a focus of research on NER tasks. In Chinese NER research, the SoftLexicon model has achieved the best results on several Chinese datasets using lexical augmentation. This paper investigates SoftLexicon and finds that the model does not encode the input text characters, but directly uses the corresponding embedding vectors looked up from the pre-trained character embedding matrix. Secondly, Bi-LSTM is used in the sequence modeling layer, yet this method is inferior to the Transformer in contextual feature extraction, long-range feature capture, and parallel computation. The work in this paper therefore adopts the recent Adaptive Transformer to encode the model's input text before fusing it with lexical information. The character representations fused with lexical information are then sequence-modeled by the Adaptive Transformer and finally decoded by the tag decoding layer. Experiments are conducted on three Chinese datasets, and the study shows that the model performs better with the addition of a character encoding layer and sequence modeling by the Adaptive Transformer.
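For orientation, the following is a minimal PyTorch sketch of the pipeline described in the abstract (character encoding, lexicon-information fusion, sequence modeling, tag decoding). All module and parameter names (LexiconFusionNER, lex_feats, the dimensions) are hypothetical, a plain TransformerEncoderLayer stands in for the Adaptive Transformer, and a linear layer stands in for the tag decoder; this is not the authors' implementation.

```python
# Hypothetical sketch of the described architecture, not the paper's code.
import torch
import torch.nn as nn

class LexiconFusionNER(nn.Module):
    def __init__(self, vocab_size, lex_dim, d_model=128, num_tags=17, nhead=8):
        super().__init__()
        self.char_emb = nn.Embedding(vocab_size, d_model)    # pre-trained character vectors
        # Character encoding layer (stand-in for the Adaptive Transformer encoder).
        self.char_encoder = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # Concat-and-project fusion of character states with matched lexicon features.
        self.fuse = nn.Linear(d_model + lex_dim, d_model)
        # Sequence modeling layer (again a stand-in for the Adaptive Transformer).
        self.seq_encoder = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.Linear(d_model, num_tags)           # stand-in for a CRF tag decoder

    def forward(self, char_ids, lex_feats):
        # char_ids: (batch, seq_len); lex_feats: (batch, seq_len, lex_dim)
        x = self.char_encoder(self.char_emb(char_ids))        # encode input characters
        x = torch.relu(self.fuse(torch.cat([x, lex_feats], dim=-1)))  # fuse lexical information
        x = self.seq_encoder(x)                               # sequence modeling
        return self.decoder(x)                                # per-character tag scores
```

The Adaptive Transformer referred to in the abstract differs from the vanilla encoder layer used above (for example in its attention scaling and positional encoding), so the sketch only illustrates where each component sits in the pipeline.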
Keywords
Vocabulary enhancement, Chinese named entity recognition, character encoding, Transformer