
GCRec: Graph-Augmented Capsule Network for Next-Item Recommendation


Abstract:

Next-item recommendation has become a hot research topic; it aims to predict a user's next action by modeling the user's behavior sequences. While previous efforts on this task have focused on capturing complex item transition patterns, we argue that they still suffer from three limitations: 1) they have difficulty explicitly capturing the impact of the inherent order of item transition patterns; 2) a simple and crude embedding is insufficient to yield satisfactory long-term user representations from limited training sequences; and 3) they are incapable of dynamically integrating long-term and short-term user interest modeling. In this work, we propose a novel solution named graph-augmented capsule network (GCRec), which exploits sequential user behaviors in a more fine-grained manner. Specifically, we employ a linear graph convolution module to learn informative long-term representations of users. Furthermore, we devise a user-specific capsule module and a position-aware gating module, both sensitive to the relative sequential order of recently interacted items, to capture sequential patterns at the union level and the point level. To aggregate long-term and short-term user interests into a representative vector, we design a dual-gating mechanism that decides the contribution ratio of each module given different contextual information. Through extensive experiments on four benchmarks, we validate the rationality and effectiveness of GCRec on the next-item recommendation task.
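
To make the dual-gating aggregation concrete, the following is a minimal sketch (not the authors' implementation) of one way a gated fusion of a long-term user vector (e.g., from a linear graph convolution module) and a short-term vector (e.g., from a capsule and position-aware gating module) could be written in PyTorch; the class name, parameters, and dimensions are illustrative assumptions.

    # Hedged sketch: a generic per-dimension gating fusion of long-term and
    # short-term user interest vectors. Names and shapes are assumptions,
    # not the paper's actual architecture.
    import torch
    import torch.nn as nn

    class DualGateFusion(nn.Module):
        def __init__(self, dim: int):
            super().__init__()
            # The gate is computed from the concatenation of both interest vectors.
            self.gate = nn.Linear(2 * dim, dim)

        def forward(self, long_term: torch.Tensor, short_term: torch.Tensor) -> torch.Tensor:
            # g in (0, 1) decides, per dimension, how much each interest contributes.
            g = torch.sigmoid(self.gate(torch.cat([long_term, short_term], dim=-1)))
            return g * long_term + (1.0 - g) * short_term

    # Usage: fuse a batch of user representations (batch_size=4, dim=64).
    fusion = DualGateFusion(dim=64)
    u_long = torch.randn(4, 64)   # e.g., output of a linear graph convolution module
    u_short = torch.randn(4, 64)  # e.g., output of capsule + position-aware modules
    u_final = fusion(u_long, u_short)
    print(u_final.shape)          # torch.Size([4, 64])

Such a gate lets the contribution ratio of each interest component vary with the input context, which is the role the abstract attributes to the dual-gating mechanism.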
Published in: IEEE Transactions on Neural Networks and Learning Systems (Volume: 34, Issue: 12, December 2023)
Page(s): 10164 - 10177
Date of Publication: 25 April 2022

PubMed ID: 35468064
