Corpus ID: 245828047

Introducing Self-Attention to Target Attentive Graph Neural Networks

@inproceedings{Mitheran2021IntroducingST,
  title={Introducing Self-Attention to Target Attentive Graph Neural Networks},
  author={Sai Mitheran and Abhinav Java and Surya Kant Sahu and Arshad Shaikh},
  year={2021}
}
Session-based recommendation systems suggest relevant items to users by modeling user behavior and preferences from short-term anonymous sessions. Existing methods leverage Graph Neural Networks (GNNs) that propagate and aggregate information from neighboring nodes, i.e., local message passing. Such graph-based architectures have representational limits, as a single sub-graph is prone to overfitting the sequential dependencies instead of accounting for complex transitions between items in… 
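To make the contrast in the abstract concrete, here is a minimal toy sketch (not from the paper; all names and the graph are invented for illustration) of the two mechanisms involved: one round of local message passing, where each item aggregates only its 1-hop neighbors in the session graph, versus self-attention, where every item can attend to every other item in the session.

```python
import numpy as np

rng = np.random.default_rng(0)
num_items, dim = 4, 8
emb = rng.normal(size=(num_items, dim))          # toy item embeddings
adj = np.array([[0, 1, 0, 0],                    # toy directed session graph
                [0, 0, 1, 1],
                [0, 0, 0, 1],
                [0, 0, 0, 0]], dtype=float)

def message_pass(emb, adj):
    """Local message passing: average the embeddings of 1-hop neighbors."""
    deg = adj.sum(axis=1, keepdims=True)
    norm = np.divide(adj, deg, out=np.zeros_like(adj), where=deg > 0)
    return norm @ emb                            # only neighbors contribute

def self_attention(emb):
    """Scaled dot-product self-attention: every item attends to all items."""
    scores = emb @ emb.T / np.sqrt(emb.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over all items
    return weights @ emb

local = message_pass(emb, adj)    # shape (4, 8); node 3 has no out-neighbors
globl = self_attention(emb)       # shape (4, 8); mixes information globally
```

Node 3 has no out-neighbors, so its locally aggregated message is all zeros, while under self-attention it still receives information from every other item; this is the kind of global interaction the paper's self-attention component adds.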


References (showing 1–10 of 19)
Graph Contextualized Self-Attention Network for Session-based Recommendation
A graph contextualized self-attention model (GC-SAN) is proposed, which utilizes both a graph neural network and a self-attention mechanism for session-based recommendation, and consistently outperforms state-of-the-art methods.
TAGNN: Target Attentive Graph Neural Networks for Session-based Recommendation
In TAGNN, target-aware attention adaptively activates different user interests with respect to varied target items; the learned interest representation vector varies with the target item, greatly improving the expressiveness of the model.
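The target-aware attention described above can be sketched as follows (a hypothetical toy example with invented names, not TAGNN's actual code): for each candidate target item, session items are weighted by their relevance to that target, yielding a target-specific session representation.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8
session = rng.normal(size=(5, d))    # embeddings of the 5 items in a session
targets = rng.normal(size=(3, d))    # embeddings of 3 candidate target items

# Attention of each target over all session items (scaled dot product).
scores = targets @ session.T / np.sqrt(d)            # shape (3, 5)
w = np.exp(scores - scores.max(axis=1, keepdims=True))
w /= w.sum(axis=1, keepdims=True)                    # softmax per target

# One session representation per candidate target item.
s_target = w @ session                               # shape (3, 8)
```

The key design point is that the session vector is not fixed: each candidate target induces its own weighting of the session history, so different targets "activate" different user interests.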
Session-based Recommendation with Graph Neural Networks
In the proposed method, session sequences are modeled as graph-structured data, and a GNN captures complex transitions between items that are difficult to reveal with previous conventional sequential methods.
Handling Information Loss of Graph Neural Networks for Session-based Recommendation
A lossless encoding scheme and a GRU-based edge-order-preserving aggregation layer, designed specifically to process the losslessly encoded graphs, are proposed; the resulting model outperforms state-of-the-art models on three public datasets.
DGTN: Dual-channel Graph Transition Network for Session-based Recommendation
A novel method, the Dual-channel Graph Transition Network (DGTN), models item transitions within not only the target session but also neighboring sessions; experiments demonstrate that DGTN outperforms other state-of-the-art methods.
Rethinking the Item Order in Session-based Recommendation with Graph Neural Networks
This paper designs a novel model that collaboratively considers the sequence order and the latent order in the session graph for a session-based recommender system, proposing a weighted attention graph layer and a readout function to learn item and session embeddings for next-item recommendation.
Neural Attentive Session-based Recommendation
A novel neural network framework, the Neural Attentive Recommendation Machine (NARM), is proposed to tackle session-based recommendation; it outperforms state-of-the-art baselines on both datasets and achieves a significant improvement on long sessions.
RepeatNet: A Repeat Aware Neural Recommendation Machine for Session-based Recommendation
Recurrent neural networks for session-based recommendation have recently attracted much attention because of their promising performance. Repeat consumption is a common phenomenon in many…
Gated Graph Sequence Neural Networks
This work studies feature learning techniques for graph-structured inputs and achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
A Collaborative Session-based Recommendation Approach with Parallel Memory Modules
This work proposes the Collaborative Session-based Recommendation Machine (CSRM), a novel hybrid framework that applies collaborative neighborhood information to session-based recommendation; experiments demonstrate its effectiveness compared to state-of-the-art session-based recommender systems.