Towards Efficient Communication and Secure Federated Recommendation System via Low-rank Training
WWW 2024 (2024)
Abstract
Federated Recommendation (FedRec) systems have emerged as a solution to
safeguard users' data in response to growing regulatory concerns. However, one
of the major challenges in these systems lies in the communication costs that
arise from the need to transmit neural network models between user devices and
a central server. Prior approaches to these challenges often lead to issues
such as computational overheads, model specificity constraints, and
compatibility issues with secure aggregation protocols. In response, we propose
a novel framework, called Correlated Low-rank Structure (CoLR), which leverages
the concept of adjusting lightweight trainable parameters while keeping most
parameters frozen. Our approach substantially reduces communication overheads
without introducing additional computational burdens. Critically, our framework
remains fully compatible with secure aggregation protocols, including the
robust use of Homomorphic Encryption. The approach resulted in a reduction of
up to 93.75% in payload size while maintaining recommendation performance
across datasets. Code for reproducing our
experiments can be found at https://github.com/NNHieu/CoLR-FedRec.
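The abstract describes training lightweight low-rank parameters while keeping most of the model frozen, so that clients transmit only small factor matrices instead of full embedding tables. A minimal sketch of that general idea is below; the variable names, dimensions, and rank are illustrative assumptions, not the authors' actual CoLR implementation.

```python
import numpy as np

# Illustrative sketch (not the paper's exact method): represent the update to
# a frozen embedding matrix as a product of two small low-rank factors.
# A client would train and transmit only B and A, not the full matrix.
rng = np.random.default_rng(0)
n_items, dim, rank = 1000, 64, 4          # assumed toy dimensions

W_frozen = rng.standard_normal((n_items, dim))  # frozen base embeddings
B = np.zeros((n_items, rank))                   # trainable low-rank factor
A = rng.standard_normal((rank, dim))            # trainable low-rank factor

# Effective embedding table seen by the model during training/inference.
W_effective = W_frozen + B @ A

# Compare upload payload: full matrix vs. the two low-rank factors.
full_params = n_items * dim
lowrank_params = n_items * rank + rank * dim
print(f"payload reduction: {1 - lowrank_params / full_params:.2%}")
```

With these toy dimensions the low-rank payload is a small fraction of the full matrix, which is the kind of communication saving the abstract reports; aggregating only the small factors is also what keeps the scheme compatible with secure aggregation.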