MEOM: Memory-Efficient Online Meta-recommender for Cold-Start Recommendation.

Yan Luo, Ruoqian Zhang

Asia-Pacific Web Conference (2022)

Abstract
Online recommender systems aim to provide timely recommendations by constantly updating the model with new interactions. However, existing methods require sufficient personal data and fail to accurately support online recommendation for cold-start users. Although the state-of-the-art method adopts meta-learning to solve this problem, it requires recalling previously seen data for each model update, which is impractical because memory consumption grows linearly over time. In this paper, we propose MEOM, a memory-efficient online meta-recommender that avoids explicit use of historical data while achieving high accuracy. The recommender adopts MAML as its meta-learner and specifically adapts it to online scenarios through effective regularization. An online regularization method is designed to summarize the time-varying model and historical task gradients, so that the overall model optimization direction can be captured to parameterize a meaningful regularizer that penalizes the next round of model updates. The regularizer then guides model updates with prior knowledge in a memory-efficient and accurate manner. In addition, to avoid task overfitting, an adaptive learning-rate strategy controls model adaptation with suitable learning rates at both the inner and outer levels. Experimental results on two real-world datasets show that our method significantly reduces memory consumption while maintaining accuracy.