Position-aware compositional embeddings for compressed recommendation systems

Neurocomputing (2024)

Abstract
GNN recommendation models such as NGCF (Wang et al., 2019) and LightGCN (He et al., 2020) map each user/item to a unique embedding vector via an embedding table used as input features, and then refine these representations by aggregating multi-hop neighbors on the bipartite graph. In real-world deployments, the embedding layer consumes gigabytes of memory for hundreds of thousands of users and items, making it difficult to deploy due to a plethora of engineering challenges. Various methods exist to reduce the size of an embedding table; however, most hashing-based models fail to capture graph-structure information or to preserve topological distances between users/items in the compressed embedding space. To this end, we present Position-aware Compositional Embedding (PCE) as a low-memory alternative to the embedding layer. PCE constructs a unique embedding for each user/item by combining a fixed-size set of anchor nodes with the co-cluster the user/item is attached to on the graph. PCE incorporates global and co-cluster positions into the compositional embeddings, obtaining competitive representation capability under compression. Extensive experiments on three recommendation graphs demonstrate that PCE exceeds state-of-the-art compression techniques. In particular, compared with the complete embedding table schema, PCE incurs an average relative loss of only ∼5% in Recall@20 with 16x fewer parameters. Moreover, our model can be compressed by a further 2x while achieving even better accuracy.
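The memory saving comes from composing each entity's vector out of small shared tables rather than storing one row per user/item. The sketch below illustrates this idea in a minimal, hedged form: the anchor/cluster assignments in PCE would come from graph co-clustering, but here `anchor_of` and `cluster_of` are hypothetical hash-style mappings introduced purely for illustration, and the table sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

num_entities = 100_000   # users + items in a full embedding table
num_anchors = 512        # fixed-size anchor set (shared vectors)
num_clusters = 64        # co-clusters on the graph
dim = 32                 # embedding dimension

# Small shared tables replacing the per-entity table.
anchor_table = rng.normal(size=(num_anchors, dim)) * 0.01
cluster_table = rng.normal(size=(num_clusters, dim)) * 0.01

# Illustrative assignments only; PCE derives these from the graph structure.
def anchor_of(eid: int) -> int:
    return eid % num_anchors

def cluster_of(eid: int) -> int:
    return (eid // num_anchors) % num_clusters

def compose_embedding(eid: int) -> np.ndarray:
    """Compose an entity embedding from its anchor and co-cluster vectors."""
    return anchor_table[anchor_of(eid)] + cluster_table[cluster_of(eid)]

# Parameter counts: shared tables vs. one row per entity.
compressed_params = (num_anchors + num_clusters) * dim
full_params = num_entities * dim
print(f"compression ratio ~{full_params / compressed_params:.0f}x")
```

With these illustrative sizes the shared tables hold (512 + 64) × 32 parameters versus 100,000 × 32 for a full table, a roughly 174x reduction; the paper's reported 16x figure corresponds to its own anchor/cluster configuration.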
Keywords
Compositional embedding, Compressed embedding table, GNN-based recommendation system