Disentangled Self-Attention with Auto-Regressive Contrastive Learning for Neural Group Recommendation

Linyao Gao, Haonan Zhang, Luoyi Fu

Applied Sciences (2024)

Abstract
Group recommender systems aim to provide recommendations to a group of users as a whole rather than to individual users. Nonetheless, prevailing methodologies predominantly aggregate user preferences without adequately accounting for the unique individual intents that influence item selection. This oversight becomes particularly problematic for ephemeral groups formed by users with limited shared historical interactions, which exacerbates the data sparsity challenge. In this paper, we introduce a novel Disentangled Self-Attention Group Recommendation framework with an auto-regressive contrastive learning method, termed DAGA. This framework not only employs disentangled neural architectures to reconstruct the multi-head self-attention network but also incorporates modules for mutual information optimization via auto-regressive contrastive learning, better leveraging the contextual information of user–item and group–item historical interactions to obtain group representations and generate recommendations. Specifically, we develop a disentangled model comprising multiple components that individually assess and interpret the diverse intents of users and their impact on collective group preferences towards items. Building upon this model, we apply the principle of contrastive mutual information maximization to train our framework, aligning the group representations with the corresponding user representations derived from each factor of the disentangled model, thereby enriching the contextual understanding required to effectively address the challenges posed by ephemeral groups. Empirical evaluations on three real-world benchmark datasets substantiate the superior performance of our proposed framework over existing state-of-the-art group recommendation approaches.
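The abstract describes two ingredients: factor-specific attention pooling that aggregates member embeddings into per-intent group representations, and a contrastive (InfoNCE-style) objective that maximizes mutual information between a group representation and the representations of its members. The sketch below illustrates these two ideas in pure Python; the function names, the use of a learned "factor query" vector, and the InfoNCE formulation are illustrative assumptions, not the paper's actual implementation.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention_pool(member_vecs, factor_query):
    # Factor-specific attention: weight each group member by the
    # similarity of their embedding to this factor's query vector,
    # then return the weighted average as the group representation
    # for that latent intent factor (an assumed simplification of
    # the disentangled multi-head self-attention described above).
    weights = softmax([dot(u, factor_query) for u in member_vecs])
    dim = len(member_vecs[0])
    return [sum(w * u[d] for w, u in zip(weights, member_vecs))
            for d in range(dim)]

def info_nce(anchor, positive, negatives, tau=0.1):
    # InfoNCE loss: a lower bound on the mutual information between
    # the anchor (group representation) and the positive (a member's
    # representation under the same factor); negatives are drawn
    # from non-members. Lower loss = tighter alignment.
    pos = math.exp(dot(anchor, positive) / tau)
    neg = sum(math.exp(dot(anchor, n) / tau) for n in negatives)
    return -math.log(pos / (pos + neg))

# Toy usage: a two-member group, one latent factor.
members = [[1.0, 0.0], [0.0, 1.0]]
group_rep = attention_pool(members, factor_query=[1.0, 0.0])
loss_aligned = info_nce(group_rep, members[0], negatives=[[-1.0, 0.0]])
loss_misaligned = info_nce(group_rep, [-1.0, 0.0], negatives=[members[0]])
```

In a full model the factor queries, member embeddings, and temperature `tau` would be learned jointly, and the contrastive term would be summed over every factor of the disentangled model, as the abstract indicates.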
Key words
group recommendation, disentangled neural network, multi-head self-attention, contrastive learning