To BERT or Not to BERT: Comparing Task-specific and Task-agnostic Semi-Supervised Approaches for Sequence Tagging.
Conference on Empirical Methods in Natural Language Processing (EMNLP 2020)
Keywords
Topic Modeling, Sequence-to-Sequence Learning, Part-of-Speech Tagging