Multi-Task Learning for Argumentation Mining in Low-Resource Settings
North American Chapter of the Association for Computational Linguistics (2018)
Abstract
We investigate whether and where multi-task learning (MTL) can improve performance on NLP problems related to argumentation mining (AM), in particular argument component identification. Our results show that MTL performs particularly well (and better than single-task learning) when little training data is available for the main task, a common scenario in AM. Our findings challenge previous assumptions that conceptualizations across AM datasets are divergent and that MTL is difficult for semantic or higher-level tasks.
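The MTL setup the abstract refers to is commonly realized as hard parameter sharing: a shared encoder updated by every task, plus one task-specific output head per task, trained on alternating batches. The toy sketch below illustrates only that sharing pattern; the scalar linear "encoder", the task names, and all dimensions are illustrative assumptions, not details from the paper (which uses neural sequence taggers over real AM corpora).

```python
import random

random.seed(0)
DIM = 4  # toy feature dimension (assumption, not from the paper)

def init_vec(dim):
    return [random.uniform(-0.1, 0.1) for _ in range(dim)]

# Shared encoder parameters: updated by gradients from every task.
shared_w = init_vec(DIM)

# One output head per task: only the active task's head is updated.
# Task names are hypothetical placeholders.
task_heads = {"main_am": init_vec(DIM), "aux_task": init_vec(DIM)}

def forward(x, task):
    # Shared representation h, then a task-specific linear score.
    h = [xi * wi for xi, wi in zip(x, shared_w)]
    return sum(hi * vi for hi, vi in zip(h, task_heads[task]))

def train_step(x, y, task, lr=0.1):
    # Squared-error toy loss; gradients flow into the shared encoder
    # and into the head of the currently active task only.
    pred = forward(x, task)
    err = pred - y
    h = [xi * wi for xi, wi in zip(x, shared_w)]
    grad_head = [err * h[i] for i in range(DIM)]
    grad_shared = [err * x[i] * task_heads[task][i] for i in range(DIM)]
    for i in range(DIM):
        task_heads[task][i] -= lr * grad_head[i]
        shared_w[i] -= lr * grad_shared[i]
    return err ** 2

# Tiny synthetic dataset: one example per task (purely illustrative).
data = {"main_am": ([1.0, 0.5, -0.2, 0.3], 1.0),
        "aux_task": ([0.2, -0.4, 0.9, 0.1], -1.0)}

def eval_loss():
    return sum((forward(x, t) - y) ** 2 for t, (x, y) in data.items())

loss_before = eval_loss()
# Alternate batches between the main and the auxiliary task.
for step in range(200):
    task = "main_am" if step % 2 == 0 else "aux_task"
    x, y = data[task]
    train_step(x, y, task)
loss_after = eval_loss()
```

The intuition matching the abstract's finding: when the main task has little data, the shared parameters still receive many updates from the auxiliary task, which regularizes the representation the main-task head builds on.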