Transfer Learning Through Knowledge-Infused Representations with Contextual Experts.

AIAI (2) (2023)

Abstract
In recent years, transfer learning in natural language processing has been dominated by very large models following a pretraining-finetuning approach. A problem with these models is that growing model size goes hand in hand with growing training costs. In this work, we instead evaluate the transfer learning capabilities of the recently introduced knowledge-infused representations. Previously, these infused representations have been shown to infuse experts' knowledge into a downstream model when the expert and the downstream model operate on the same task domain. We extend this by investigating the effects of different expert task configurations on the performance of the downstream model. Our results show that differing expert and downstream tasks do not degrade the downstream model's performance, indicating a desirable robustness to the addition of irrelevant information. At the same time, the ability to transport important information is retained, as we continue to see a significant performance improvement when adding two experts with differing tasks. Overall, these findings solidify the potential of knowledge-infused representations to generalize across different tasks and to recycle old computations for smaller new downstream models.
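To make the setup concrete, the sketch below shows one plausible reading of the architecture the abstract describes: pretrained expert encoders are frozen so their computations are recycled, and their representations are fused with a downstream encoder's output before classification. This is a minimal PyTorch sketch under assumptions not stated in the abstract; all names (`KnowledgeInfusedClassifier`, `downstream_encoder`, `experts`) are hypothetical, and the paper's actual infusion mechanism may differ (e.g. attention-based rather than concatenation-based fusion).

```python
import torch
import torch.nn as nn


class KnowledgeInfusedClassifier(nn.Module):
    """Hypothetical sketch: fuse frozen expert representations
    with a trainable downstream encoder via concatenation."""

    def __init__(self, downstream_encoder, experts, hidden_dim, num_classes):
        super().__init__()
        self.downstream_encoder = downstream_encoder
        self.experts = nn.ModuleList(experts)
        # Freeze the experts: their pretrained computations are
        # recycled rather than retrained (assumption based on the
        # abstract's "recycle old computations" claim).
        for expert in self.experts:
            for param in expert.parameters():
                param.requires_grad = False
        fused_dim = hidden_dim * (1 + len(experts))
        self.classifier = nn.Linear(fused_dim, num_classes)

    def forward(self, x):
        own = self.downstream_encoder(x)  # (batch, hidden_dim)
        with torch.no_grad():
            # Expert tasks may differ from the downstream task; the
            # abstract reports this does not hurt downstream performance.
            infused = [expert(x) for expert in self.experts]
        fused = torch.cat([own] + infused, dim=-1)  # (batch, fused_dim)
        return self.classifier(fused)


# Toy usage: simple MLP encoders stand in for pretrained models.
make_encoder = lambda: nn.Sequential(nn.Linear(64, 32), nn.ReLU())
model = KnowledgeInfusedClassifier(
    make_encoder(), [make_encoder(), make_encoder()],
    hidden_dim=32, num_classes=2,
)
logits = model(torch.randn(4, 64))  # -> shape (4, 2)
```

Concatenation is only one fusion choice; the key property the abstract highlights, robustness to experts trained on unrelated tasks, would be tested by swapping which tasks the expert encoders were pretrained on while holding the downstream task fixed.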
Keywords
contextual experts, representations, knowledge-infused