ESSR: Evolving Sparse Sharing Representation for Multi-task Learning

IEEE Transactions on Evolutionary Computation (2023)

Abstract
Multi-task learning uses knowledge transfer among tasks to improve the generalization performance of all tasks. In deep multi-task learning, knowledge transfer is often implemented by sharing all hidden features across tasks. A major shortcoming is that this can lead to negative knowledge transfer when task correlation is weak. To overcome this, this paper proposes an evolutionary method that learns sparse sharing representations adaptively. By embedding the neural network optimization into evolutionary multitasking, the proposed method finds an optimal combination of tasks and shared features. It can identify negatively correlated and redundant features and remove them from the hidden feature set, so that an optimal sparse sharing subnetwork is produced for each task. Experimental results show that the proposed method achieves better learning performance with a smaller inference model than other related methods.
Keywords
Multi-task learning, evolutionary multitasking optimization, knowledge transfer, sharing representation
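
The abstract describes selecting, per task, a sparse subset of a shared hidden-feature set via evolutionary search. The sketch below is only an illustrative toy of that idea, not the authors' ESSR algorithm: it evolves a binary mask over shared features for each task with a simple genetic algorithm, scoring masks by validation error plus a sparsity penalty. All names, data, and hyperparameters (n_features, pop_size, the bit-flip mutation scheme) are assumptions for illustration.

# Minimal sketch (assumed setup, NOT the paper's implementation): per-task binary
# masks over a shared feature set, searched with a tiny genetic algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-task data: both tasks draw on an overlapping subset of shared features.
n_samples, n_features = 200, 16
H = rng.normal(size=(n_samples, n_features))              # shared hidden features
w1 = np.zeros(n_features); w1[:6] = rng.normal(size=6)    # task 1 uses features 0-5
w2 = np.zeros(n_features); w2[4:10] = rng.normal(size=6)  # task 2 uses features 4-9
y = {1: H @ w1 + 0.1 * rng.normal(size=n_samples),
     2: H @ w2 + 0.1 * rng.normal(size=n_samples)}

def fitness(mask, task):
    """Validation MSE of a least-squares head trained on the masked features,
    plus a small penalty on the number of selected features (sparsity)."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        return np.inf
    Hs = H[:, cols]
    split = n_samples // 2
    w, *_ = np.linalg.lstsq(Hs[:split], y[task][:split], rcond=None)
    resid = Hs[split:] @ w - y[task][split:]
    return float(np.mean(resid ** 2)) + 1e-3 * cols.size

def evolve_mask(task, pop_size=20, generations=30, p_mut=0.1):
    """Evolve a binary feature mask for one task: truncation selection + bit-flip mutation."""
    pop = rng.integers(0, 2, size=(pop_size, n_features))
    for _ in range(generations):
        scores = np.array([fitness(ind, task) for ind in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]    # keep the best half
        children = parents.copy()
        flips = rng.random(children.shape) < p_mut            # random bit flips
        children = np.where(flips, 1 - children, children)
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind, task) for ind in pop])
    return pop[int(np.argmin(scores))]

for task in (1, 2):
    mask = evolve_mask(task)
    print(f"task {task}: selected shared features {np.flatnonzero(mask).tolist()}")

In this toy, each task typically recovers roughly its own informative feature subset, which mirrors the abstract's claim that pruning negatively correlated and redundant features yields a smaller per-task subnetwork; the paper additionally couples the searches across tasks through evolutionary multitasking, which this sketch does not model.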