Towards optimisers that 'Keep Learning'
Proceedings of the 2023 Genetic and Evolutionary Computation Conference Companion (GECCO 2023 Companion), 2023
Abstract
We consider optimisation in the context of applying an optimiser to a continual stream of instances from one or more domains, and ask how such a system might 'keep learning': drawing on past experience to improve performance, and learning to both predict and react to instance and/or domain drift.
Keywords
Optimisation, continual learning, transfer learning