Towards Evolutionary Multi-Task Convolutional Neural Architecture Search

IEEE Transactions on Evolutionary Computation (2023)

Abstract
Evolutionary neural architecture search (ENAS) methods have been successfully used to design convolutional neural network (CNN) architectures automatically. These methods achieve excellent performance when creating a specific neural architecture for a single task but are less efficient across multiple tasks. Existing ENAS frameworks repeat the search from scratch for each task, even though these tasks may be solved by similar CNN architectures. This work presents an evolutionary multi-task convolutional neural architecture search (MTNAS) framework that enables efficient architecture search in multi-task scenarios by exploiting architectural similarities. The proposed MTNAS constructs architectures for different tasks simultaneously through a knowledge-sharing mechanism among multiple search processes. Specifically, promising architectures found in one search process can be transferred and reused to generate high-quality architectures for the others. Furthermore, we devise an adaptive strategy that dynamically adjusts the frequency of knowledge transfer to alleviate potential negative transfer. Extensive experiments demonstrate that MTNAS outperforms state-of-the-art NAS methods or achieves comparable performance on different tasks while requiring roughly half the search cost.
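The abstract outlines the core loop: several evolutionary searches run in parallel (one per task), elite architectures migrate between tasks, and the transfer frequency adapts to curb negative transfer. The sketch below is a minimal illustration of that idea, not the paper's actual algorithm; the genome encoding (`LAYER_CHOICES`, `GENOME_LEN`), the mutation operator, and the interval-adaptation rule are simplified assumptions made for demonstration.

```python
import random

# Hypothetical architecture encoding: a fixed-length list of layer choices.
LAYER_CHOICES = ["conv3x3", "conv5x5", "sep_conv3x3", "maxpool", "skip"]
GENOME_LEN = 12

def random_genome():
    return [random.choice(LAYER_CHOICES) for _ in range(GENOME_LEN)]

def mutate(genome, rate=0.1):
    # Point mutation: resample each position with a small probability.
    return [random.choice(LAYER_CHOICES) if random.random() < rate else g
            for g in genome]

def evolve_multi_task(fitness_fns, pop_size=20, generations=50,
                      transfer_interval=5, top_k=3):
    """Run one evolutionary search per task, periodically sharing elite
    genomes across tasks. The per-task transfer interval is adapted:
    it shrinks when transferred genomes help and grows when they do not
    (a simple stand-in for mitigating negative transfer)."""
    pops = {t: [random_genome() for _ in range(pop_size)] for t in fitness_fns}
    interval = {t: transfer_interval for t in fitness_fns}
    for gen in range(generations):
        for task, fit in fitness_fns.items():
            scored = sorted(pops[task], key=fit, reverse=True)
            elites = scored[:top_k]
            if gen > 0 and gen % interval[task] == 0:
                # Knowledge transfer: borrow elites from the other tasks.
                donors = [g for t, p in pops.items() if t != task
                          for g in sorted(p, key=fitness_fns[t], reverse=True)[:top_k]]
                if donors and max(fit(d) for d in donors) > fit(scored[0]):
                    interval[task] = max(1, interval[task] - 1)  # transfer helped
                else:
                    interval[task] += 1                          # transfer did not help
                candidates = elites + donors
            else:
                candidates = elites
            # Next generation: mutated offspring of the candidate pool.
            pops[task] = [mutate(random.choice(candidates)) for _ in range(pop_size)]
    return {t: max(pops[t], key=fitness_fns[t]) for t in fitness_fns}

# Toy usage: fitness functions standing in for validation accuracy on two tasks.
tasks = {
    "cifar10": lambda g: g.count("sep_conv3x3") + 0.5 * g.count("conv3x3"),
    "svhn":    lambda g: g.count("conv3x3") + 0.5 * g.count("maxpool"),
}
best_per_task = evolve_multi_task(tasks)
```

In a real ENAS setting the fitness function would train (or partially train) the decoded CNN and return its validation accuracy, which is what makes per-task searches expensive and cross-task reuse attractive.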
Keywords
Convolutional neural network, evolutionary neural architecture search, evolutionary multi-task optimization, knowledge transfer